Replace oneshot_webserver with fake curl (#10808)
This solves a lot of problems with waiting, leftover HTTP processes,
etc. The new fake curl reads the list of files to serve from
`fake-curls` and maps each line to a port.

To match the behavior of the oneshot webserver, it only serves each
port once; subsequent requests for the same port are answered with a
404 (so not exactly the same, but similar enough). The ports that have
been served are recorded in an `already-served` file.

The same file can be served multiple times, as long as it is present
multiple times in `fake-curls`.
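
As a rough illustration of the protocol described above, a minimal fake
curl could look like the sketch below. This is a hypothetical shell
sketch, not the implementation added by this commit; the assumed
invocation is `fake-curl -o OUTPUT http://0.0.0.0:PORT`.

  #!/bin/sh
  # Hypothetical sketch of the fake-curl protocol, not the real utility.
  # Assumed invocation: fake-curl -o OUTPUT http://0.0.0.0:PORT
  output=$2
  url=$3
  port=${url##*:}                       # port N selects line N of fake-curls
  if grep -qx "$port" already-served 2>/dev/null; then
    echo "download failed with code 404" >&2
    exit 1                              # each port is only served once
  fi
  file=$(sed -n "${port}p" fake-curls)  # look up the file for this port
  if [ -z "$file" ]; then
    echo "download failed with code 404" >&2
    exit 1                              # no such line in fake-curls
  fi
  echo "$port" >>already-served         # record the port as served
  cp "$file" "$output"

Under this reading, `echo foo.txt > fake-curls` followed by `PORT=1` in
the tests below serves `foo.txt` on the first request for port 1, and
appending further lines makes ports 2, 3, and so on available.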

Signed-off-by: Marek Kubica <marek@tarides.com>
Leonidas-from-XIV authored Aug 7, 2024
1 parent 1dd8015 commit 74dab89
Showing 12 changed files with 147 additions and 168 deletions.
2 changes: 1 addition & 1 deletion src/dune_pkg/fetch.ml
@@ -176,7 +176,7 @@ let fetch_curl ~unpack:unpack_flag ~checksum ~target (url : OpamUrl.t) =
let exn =
User_message.make
[ Pp.textf
"failed to unpackage archive downloaded from %s"
"failed to unpack archive downloaded from %s"
(OpamUrl.to_string url)
; Pp.text "reason:"
; msg
2 changes: 1 addition & 1 deletion test/blackbox-tests/test-cases/dune
@@ -8,7 +8,7 @@
../utils/dunepp.exe
../utils/melc_stdlib_prefix.exe
../utils/refmt.exe
../utils/webserver_oneshot.exe
../utils/curl
../utils/sherlodoc.exe
../utils/ocaml_index.exe)))

82 changes: 30 additions & 52 deletions test/blackbox-tests/test-cases/pkg/compute-checksums-when-missing.t
@@ -5,91 +5,72 @@ downloaded from non-local sources.
$ . ./helpers.sh
$ mkrepo

$ strip_transient() {
> sed -e "s#$PWD#<pwd>#" | \
> # strip the entire address as the port number may also appear in the hash
> sed -e "s#http://.*$PORT#<addr>#"
> }

A file that will comprise the package source:
$ echo "Hello, World!" > foo.txt

Run the server in the background:

We have some really slow tests here
$ export DUNE_WEBSERVER_TIMEOUT=10

$ webserver_oneshot --content-file foo.txt --port-file port.txt &
$ until test -f port.txt; do sleep 0.1; done
$ PORT=$(cat port.txt)
$ echo "Hello, World!" > foo.txt
$ echo foo.txt > fake-curls
$ PORT=1

$ mkpkg foo <<EOF
> url {
> src: "http://0.0.0.0:$PORT"
> }
> EOF

$ solve foo | strip_transient
Solution for dune.lock:
- foo.0.0.1
$ solve foo
Package "foo" has source archive which lacks a checksum.
The source archive will be downloaded from: <addr>
The source archive will be downloaded from: http://0.0.0.0:1
Dune will compute its own checksum for this source archive.

$ wait
Solution for dune.lock:
- foo.0.0.1

Replace the path in the lockfile as it would otherwise include the sandbox
path.
$ cat dune.lock/foo.pkg | strip_transient
$ cat dune.lock/foo.pkg
(version 0.0.1)

(source
(fetch
(url <addr>)
(url http://0.0.0.0:1)
(checksum md5=bea8252ff4e80f41719ea13cdf007273)))

(dev)

Now make sure we can gracefully handle the case when the archive is missing.
Recreate the foo package with a fake port number to signal that the file will
404:

$ rm port.txt
$ webserver_oneshot --content-file foo.txt --port-file port.txt --simulate-not-found &
$ until test -f port.txt; do sleep 0.1; done
$ PORT=$(cat port.txt)

Recreate the foo package as the port number will have changed:
$ PORT=9000
$ mkpkg foo <<EOF
> url {
> src: "http://0.0.0.0:$PORT"
> }
> EOF

$ solve foo 2>&1 | strip_transient
$ solve foo 2>&1
Package "foo" has source archive which lacks a checksum.
The source archive will be downloaded from: <addr>
The source archive will be downloaded from: http://0.0.0.0:9000
Dune will compute its own checksum for this source archive.
Warning: download failed with code 404
Solution for dune.lock:
- foo.0.0.1
$ cat dune.lock/foo.pkg | strip_transient
$ cat dune.lock/foo.pkg
(version 0.0.1)

(source
(fetch
(url <addr>)))
(url http://0.0.0.0:9000)))

(dev)

$ wait

Check that no checksum is computed for a local source file:

$ mkpkg foo <<EOF
> url {
> src: "$PWD/foo.txt"
> }
> EOF
$ solve foo 2>&1 | strip_transient
$ solve foo 2>&1
Solution for dune.lock:
- foo.0.0.1

@@ -101,19 +82,19 @@ Check that no checksum is computed for a local source directory:
> src: "$PWD/src"
> }
> EOF
$ solve foo 2>&1 | strip_transient
$ solve foo 2>&1
Solution for dune.lock:
- foo.0.0.1


Create 3 packages that all share the same source url with no checksum. Dune
will need to download each package's source archive to compute their hashes.
Test that dune only downloads the file a single time since the source url is
identical among the packgaes. The fact that the download only occurs once is
identical among the packages. The fact that the download only occurs once is
asserted by the fact that the webserver will only serve the file a single time.
$ webserver_oneshot --content-file foo.txt --port-file port.txt &
$ until test -f port.txt; do sleep 0.1; done
$ PORT=$(cat port.txt)

$ echo foo.txt >> fake-curls
$ PORT=2

$ mkpkg foo <<EOF
> url {
@@ -133,20 +114,17 @@ asserted by the fact that the webserver will only serve the file a single time.
> }
> EOF

$ solve foo bar baz | strip_transient
Solution for dune.lock:
- bar.0.0.1
- baz.0.0.1
- foo.0.0.1
$ solve foo bar baz
Package "bar" has source archive which lacks a checksum.
The source archive will be downloaded from: <addr>
The source archive will be downloaded from: http://0.0.0.0:2
Dune will compute its own checksum for this source archive.
Package "baz" has source archive which lacks a checksum.
The source archive will be downloaded from: <addr>
The source archive will be downloaded from: http://0.0.0.0:2
Dune will compute its own checksum for this source archive.
Package "foo" has source archive which lacks a checksum.
The source archive will be downloaded from: <addr>
The source archive will be downloaded from: http://0.0.0.0:2
Dune will compute its own checksum for this source archive.

$ wait

Solution for dune.lock:
- bar.0.0.1
- baz.0.0.1
- foo.0.0.1
3 changes: 2 additions & 1 deletion test/blackbox-tests/test-cases/pkg/dune
@@ -38,8 +38,9 @@
(applies_to pkg-deps))

(cram
(deps %{bin:webserver_oneshot})
(deps %{bin:curl})
(applies_to
unavailable-source-package
compute-checksums-when-missing
e2e
source-caching
10 changes: 4 additions & 6 deletions test/blackbox-tests/test-cases/pkg/e2e.t
@@ -22,10 +22,10 @@ Make a library:
$ tar -czf foo.tar.gz foo
$ rm -rf foo

Start a oneshot webserver so dune can download the packgae with http:
$ webserver_oneshot --content-file foo.tar.gz --port-file port.txt &
$ until test -f port.txt; do sleep 0.1; done
$ PORT=$(cat port.txt)
Configure our fake curl to serve the tarball

$ echo foo.tar.gz >> fake-curls
$ PORT=1

Make a package for the library:
$ mkpkg foo <<EOF
@@ -80,5 +80,3 @@ Lock, build, and run the executable in the project:

$ dune exec bar
Hello, World!

$ wait
25 changes: 10 additions & 15 deletions test/blackbox-tests/test-cases/pkg/extra-sources.t
@@ -69,12 +69,10 @@ First we need a project that will have the patch applied:
Then we start the oneshot server for both the source and the patch.
$ webserver_oneshot --content-file needs-patch.tar --port-file tarball-port.txt &
$ until test -f tarball-port.txt ; do sleep 0.1; done
$ SRC_PORT=$(cat tarball-port.txt)
$ webserver_oneshot --content-file required.patch --port-file required-patch-port.txt &
$ until test -f required-patch-port.txt ; do sleep 0.1; done
$ REQUIRED_PATCH_PORT=$(cat required-patch-port.txt)
$ echo needs-patch.tar > fake-curls
$ SRC_PORT=1
$ echo required.patch >> fake-curls
$ REQUIRED_PATCH_PORT=2
We now have the checksums as well as the port numbers, so we can define the
package.
@@ -124,15 +122,12 @@ correct, patched, message:
Set up a new version of the package which has multiple `extra-sources`, the
application order of them mattering:

$ webserver_oneshot --content-file needs-patch.tar --port-file tarball-port.txt &
$ until test -f tarball-port.txt ; do sleep 0.1; done
$ SRC_PORT=$(cat tarball-port.txt)
$ webserver_oneshot --content-file required.patch --port-file required-patch-port.txt &
$ until test -f required-patch-port.txt ; do sleep 0.1; done
$ REQUIRED_PATCH_PORT=$(cat required-patch-port.txt)
$ webserver_oneshot --content-file additional.patch --port-file additional-patch-port.txt &
$ until test -f additional-patch-port.txt ; do sleep 0.1; done
$ ADDITIONAL_PATCH_PORT=$(cat additional-patch-port.txt)
$ echo needs-patch.tar >> fake-curls
$ SRC_PORT=3
$ echo required.patch >> fake-curls
$ REQUIRED_PATCH_PORT=4
$ echo additional.patch >> fake-curls
$ ADDITIONAL_PATCH_PORT=5

$ mkpkg needs-patch 0.0.2 <<EOF
> build: ["dune" "build" "-p" name "-j" jobs]
7 changes: 3 additions & 4 deletions test/blackbox-tests/test-cases/pkg/source-caching.t
@@ -1,4 +1,4 @@
This test demonstratest that fetching package sources should be cached
This test demonstrates that fetching package sources should be cached

$ . ./helpers.sh

@@ -9,9 +9,8 @@ This test demonstratest that fetching package sources should be cached
$ mkdir $sources; touch $sources/dummy
$ tar -czf $tarball $sources
$ checksum=$(md5sum $tarball | awk '{ print $1 }')
$ webserver_oneshot --content-file $tarball --port-file port.txt &
$ until test -f port.txt ; do sleep 0.1; done
$ port=$(cat port.txt)
$ echo $tarball > fake-curls
$ port=1

$ makepkg() {
> make_lockpkg $1 <<EOF
18 changes: 5 additions & 13 deletions test/blackbox-tests/test-cases/pkg/tarball.t
@@ -14,29 +14,21 @@ Demonstrate that we should support tarballs with and without a root directory

$ make_lockdir

CR-rgrinberg: no idea why, but curl doesn't like it when we run the server
twice in one test (even on different ports)

$ webserver_oneshot --port-file port.txt \
> --content-file tarball1.tar.gz \
> --content-file tarball2.tar.gz &
$ until test -f port.txt; do sleep 0.1; done
$ port=$(cat port.txt)
$ echo tarball1.tar.gz > fake-curls
$ echo tarball2.tar.gz >> fake-curls

$ runtest() {
> make_lockpkg foo <<EOF
> (version 0.1.0)
> (source (fetch (url http://0.0.0.0:$port)))
> (source (fetch (url http://0.0.0.0:$1)))
> (build (run sh -c "find . | sort"))
> EOF
> build_pkg foo
> rm -rf _build
> }
$ runtest tarball1.tar.gz
$ runtest 1
.
./foo
$ runtest tarball2.tar.gz
$ runtest 2
.
./foo

$ wait
15 changes: 8 additions & 7 deletions test/blackbox-tests/test-cases/pkg/unavailable-package-source.t
@@ -34,10 +34,11 @@ Git
Hint: Check that this Git URL in the project configuration is correct:
"file://PWD/dummy"

Http
A bit annoying that this test can pass by accident if there's a server running
on 35000
$ runtest "(fetch (url \"https://0.0.0.0:35000\"))" 2>&1 | sed -ne '/Error:/,$ p'
Error: curl returned an invalid error code 7


HTTP

$ runtest "(fetch (url \"https://0.0.0.0:35000\"))" 2>&1 | sed -ne '/Error:/,$ p' | sed -e 's/".*"/"<PATH>"/'
Error:
failed to unpack archive downloaded from https://0.0.0.0:35000
reason:
unable to extract "<PATH>"
