
Merge Main into mw-1.40 #584

Merged
merged 17 commits into from
Feb 20, 2024
Changes from all commits
Commits
17 commits
15dd2ce
T350782 / #542 wrap-up: Fixes Docker Image loading, standarizes docke…
lorenjohnson Dec 22, 2023
ac8e423
Removes confusing WDQS URL default for test/helpers Page#open (#549)
lorenjohnson Dec 22, 2023
9c3e279
T350878: Covers lines 18 and 21 of Product Verification sheet (Proper…
lorenjohnson Jan 4, 2024
21088b5
T353520: Fixes flakey Property and Special:Property tests with mocha …
lorenjohnson Jan 7, 2024
135f5d5
Docker healthchecks instead of waitForUrls in test services setup (#558)
lorenjohnson Jan 7, 2024
27b4fcd
T349806 - Fix Quickstatements image (#564)
lorenjohnson Jan 8, 2024
27ca46d
T353520: Enables debugging in spec files (including Node), and prompt…
lorenjohnson Jan 8, 2024
2778b06
T353520: Further addresses Property and Special Property spec flakine…
lorenjohnson Jan 15, 2024
e490dfd
T346037: Remove Makefile (#561)
RickiJay-WMDE Jan 22, 2024
12979d0
T33704 - Page Object Updates (#566)
RickiJay-WMDE Feb 5, 2024
c343a4c
T351720 tag GHCR containers with version too (#578)
rti Feb 14, 2024
51d5324
Retry Fix (#577)
RickiJay-WMDE Feb 15, 2024
eb06277
T354644 bump version on main branch to not collide with release branc…
rti Feb 19, 2024
b7379bd
chore: remove tarball related functionality (#581)
rti Feb 19, 2024
eddcbd9
T350878 - Uncovered Tests (#574)
RickiJay-WMDE Feb 19, 2024
a2a01b4
chore: streamline mediawiki version number variables (#580)
rti Feb 19, 2024
97877df
Merge remote-tracking branch 'origin/main' into merge-main-mw-1.40
RickiJay-WMDE Feb 20, 2024
26 changes: 0 additions & 26 deletions .github/actions/upload-results/action.yml

This file was deleted.

37 changes: 27 additions & 10 deletions .github/workflows/build_and_test.yml
@@ -1,4 +1,4 @@
name: Build and Test
name: 🏗️🧪 Build and Test

on:
push:
@@ -44,13 +44,15 @@ jobs:
uses: actions/checkout@v4

- name: Build
run: ./build.sh --save-image --extract-tarball ${{ matrix.component }}
shell: bash
run: ./build.sh --save-image ${{ matrix.component }}

- name: Upload results
uses: ./.github/actions/upload-results
- name: Archive Docker artifact
uses: actions/upload-artifact@v3
with:
component: ${{ matrix.component }}
name: DockerImages
path: artifacts/${{ matrix.component }}*.docker.tar.gz
if-no-files-found: error

- name: Scan Image
uses: ./.github/actions/scan-image
@@ -223,8 +225,23 @@ jobs:
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}

- name: Store built docker image on GHCR
uses: wmde/tag-push-ghcr-action@v3
with:
image_name: ${{ matrix.docker_image }}
tag: ${{ github.run_id }}
- name: Push docker image to GHCR
shell: bash
run: |
set -x

# Get the docker tag (other than 'latest') of the image we are currently looking at, i.e. its version string.
version_tag=$(docker image ls | grep -e '^${{ matrix.docker_image }} ' | grep -v latest | awk '{print $2}' | head -n 1)

# We need to retag the container for GHCR because the current tag is targeting Docker Hub.
# We will push two tags to GHCR, a version tag and a tag representing the current CI pipeline run id.
# Let's create "from" and "to" container URLs first.
from_url=${{ matrix.docker_image }}:"$version_tag"
to_url_version=ghcr.io/${{ github.repository_owner }}/${{ matrix.docker_image }}:"$version_tag"
to_url_run_id=ghcr.io/${{ github.repository_owner }}/${{ matrix.docker_image }}:${{ github.run_id }}

# Retag and push
docker tag "$from_url" "$to_url_version"
docker tag "$from_url" "$to_url_run_id"
docker push "$to_url_version"
docker push "$to_url_run_id"
24 changes: 24 additions & 0 deletions .vscode/launch.json
@@ -0,0 +1,24 @@
{
// Use IntelliSense to learn about possible attributes.
// Hover to view descriptions of existing attributes.
// For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
"version": "0.2.0",
"configurations": [
{
"name": "Test Runner (Docker): Attach to Runnings Specs",
"type": "node",
"request": "attach",
"restart": true,
"port": 9229,
"address": "localhost",
"localRoot": "${workspaceFolder}/test",
"remoteRoot": "/usr/src/test",
"sourceMaps": true,
"autoAttachChildProcesses": true,
"skipFiles": [
"${workspaceFolder}/test/node_modules/**",
"<node_internals>/**"
]
}
]
}
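
Note that this launch configuration only attaches a debugger; the spec process inside the test container still needs to start Node with the inspector listening on 0.0.0.0:9229, and that port must be published to the host. A minimal sketch of how that might look (the `test-runner` service name and the command are assumptions for illustration, not taken from this repository):

```sh
# Hypothetical: run the containerized specs with the Node inspector exposed,
# then attach from VS Code using the configuration above.
docker compose run --rm \
  --publish 9229:9229 \
  -e NODE_OPTIONS="--inspect=0.0.0.0:9229" \
  test-runner npm test
```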
14 changes: 0 additions & 14 deletions Makefile

This file was deleted.

5 changes: 1 addition & 4 deletions README.md
@@ -2,7 +2,7 @@

## Repository overview

The wikibase release pipeline contains scripts used for building, testing and publishing Wikibase docker images and tarballs.
The wikibase release pipeline contains scripts used for building, testing and publishing Wikibase docker images.

It contains a set of build targets defined in the [Makefile](./Makefile) which can be executed in two different ways.

@@ -24,9 +24,6 @@ $ ./build.sh wikibase
# Build only the query service container and save the docker image to a tarball
$ ./build.sh --save-image wdqs

# Build the wdqs-frontend container and extract a standalone tarball from the webroot
$ ./build.sh --extract-tarball wdqs-frontend

# Build the wdqs container without using Dockers cache
$ ./build.sh --no-cache wdqs
```
27 changes: 3 additions & 24 deletions build.sh
@@ -14,7 +14,6 @@ set +o allexport


SAVE_IMAGE=false
EXTRACT_TARBALL=false
DOCKER_BUILD_CACHE_OPT=""

function save_image {
@@ -68,7 +67,7 @@ function setup_image_name_url_and_tag {


function build_wikibase {
setup_image_name_url_and_tag "$WIKIBASE_SUITE_WIKIBASE_IMAGE_URL" "$RELEASE_VERSION-$WMDE_RELEASE_VERSION"
setup_image_name_url_and_tag "$WIKIBASE_SUITE_WIKIBASE_IMAGE_URL" "$MEDIAWIKI_VERSION-$WMDE_RELEASE_VERSION"

docker build \
$DOCKER_BUILD_CACHE_OPT \
@@ -87,22 +86,14 @@ function build_wikibase {

save_image "$image_name" "$image_url" "$image_name_with_tag" "$image_url_with_tag"

if $EXTRACT_TARBALL; then
docker run --entrypoint="" --rm "$image_url_with_tag" \
tar cz -C /var/www --transform="s,^html,${image_name}," html \
> "artifacts/${image_name_with_tag//:/-}.tar.gz"
pushd artifacts
ln -sf "${image_name_with_tag//:/-}.tar.gz" "${image_name}.tar.gz"
popd
fi

setup_image_name_url_and_tag "$WIKIBASE_SUITE_WIKIBASE_BUNDLE_IMAGE_URL" "$RELEASE_VERSION-$WMDE_RELEASE_VERSION"
setup_image_name_url_and_tag "$WIKIBASE_SUITE_WIKIBASE_BUNDLE_IMAGE_URL" "$MEDIAWIKI_VERSION-$WMDE_RELEASE_VERSION"

docker build \
$DOCKER_BUILD_CACHE_OPT \
--build-arg COMPOSER_IMAGE_URL="$COMPOSER_IMAGE_URL" \
--build-arg WMDE_RELEASE_VERSION="$WMDE_RELEASE_VERSION" \
--build-arg RELEASE_VERSION="$RELEASE_VERSION" \
--build-arg MEDIAWIKI_VERSION="$MEDIAWIKI_VERSION" \
--build-arg WIKIBASE_SUITE_WIKIBASE_IMAGE_URL="$WIKIBASE_SUITE_WIKIBASE_IMAGE_URL" \
\
--build-arg BABEL_COMMIT="$BABEL_COMMIT" \
@@ -173,15 +164,6 @@ function build_wdqs-frontend {
build/WDQS-frontend/ -t "$image_url_with_tag" -t "$image_url"

save_image "$image_name" "$image_url" "$image_name_with_tag" "$image_url_with_tag"

if $EXTRACT_TARBALL; then
docker run --entrypoint="" --rm "$image_url_with_tag" \
tar cz -C /usr/share/nginx --transform="s,^html,${image_name}," html \
> "artifacts/${image_name_with_tag//:/-}.tar.gz"
pushd artifacts
ln -sf "${image_name_with_tag//:/-}.tar.gz" "${image_name}.tar.gz"
popd
fi
}


@@ -259,9 +241,6 @@ for arg in "$@"; do
-s|--save-image)
SAVE_IMAGE=true
;;
-t|--extract-tarball)
EXTRACT_TARBALL=true
;;
-n|--no-cache)
DOCKER_BUILD_CACHE_OPT="--no-cache"
;;
47 changes: 19 additions & 28 deletions build/QuickStatements/README.md
@@ -1,14 +1,13 @@
# Quickstatements docker image
# QuickStatements docker image

Quickstatements as seen at [https://github.com/magnusmanske/quickstatements](https://github.com/magnusmanske/quickstatements)
QuickStatements as seen at [https://github.com/magnusmanske/quickstatements](https://github.com/magnusmanske/quickstatements)

### Environment variables

Variable | Default | Description
-------------------------------------|--------------------------|------------
`WIKIBASE_SCHEME_AND_HOST` | NONE | Host and port of Wikibase instance as seen by Quick Statements
`WB_PUBLIC_SCHEME_HOST_AND_PORT` | NONE | Host and port of Wikibase as seen by the user's browser
`QS_PUBLIC_SCHEME_HOST_AND_PORT` | NONE | Host and port of Quick Statements as seen by the user's browser
`QS_PUBLIC_SCHEME_HOST_AND_PORT` | NONE | Host and port of QuickStatements as seen by the user's browser
`OAUTH_CONSUMER_KEY` | NONE | OAuth consumer key (obtained from Wikibase)
`OAUTH_CONSUMER_SECRET` | NONE | OAuth consumer secret (obtained from wikibase)
`PHP_TIMEZONE` | UTC | setting of php.ini date.timezone
@@ -23,44 +22,36 @@ Variable | Default | Description

Directory | Description
--------------------------------------------|-------------------------------------------------------------------------------
`/var/www/html/quickstatements` | Base quickstatements directory
`/var/www/html/quickstatements/public_html` | The Apache Root folder
`/var/www/html/quickstatements` | Base QuickStatements directory
`/var/www/html/quickstatements/public_html` | The Apache root folder
`/var/www/html/magnustools` | Base magnustools directory

File | Description
------------------------- | ------------------------------------------------------------------------------
`/templates/config.json` | Template for Quickstatements' config.json (substituted to `/var/www/html/quickstatements/public_html/config.json` at runtime)
`/templates/oauth.ini` | Template for Quickstatements' oauth.ini (substituted to `/var/www/html/quickstatements/oauth.ini` at runtime)
`/templates/config.json` | Template for QuickStatements' config.json (substituted to `/var/www/html/quickstatements/public_html/config.json` at runtime)
`/templates/oauth.ini` | Template for QuickStatements' oauth.ini (substituted to `/var/www/html/quickstatements/oauth.ini` at runtime)
`/templates/php.ini` | php config (default provided sets date.timezone to prevent php complaining substituted to `/usr/local/etc/php/conf.d/php.ini` at runtime)


### Set up quickstatements
In order for quickstatements to communicate with wikibase it needs to know where your instance is and how it can find it.
This must be done by setting the ENV variable `WIKIBASE_SCHEME_AND_HOST`. n.b. This should reflect how this container when running
sees the wikibase container. For example the example docker container alias like `wikibase.svc`.
### Set up QuickStatements
In order to authorize QuickStatements against Wikibase via OAuth, this container must be available on an address on the host machine that is also visible within the Docker network. Set `QS_PUBLIC_SCHEME_HOST_AND_PORT` to this address.

The user's browser will also be redirected to the Wikibase instance and finally back to quickstatements. The address
the user sees for the Wikibase may be different from how the running container sees it. For example: it may be running
on localhost on a specific port. e.g. http://localhost:8181. This should be passed to the quickstatements container as
`WB_PUBLIC_SCHEME_HOST_AND_PORT`.
Likewise, Wikibase needs to be able to access QuickStatements for the OAuth callback on a host-recognizable address, set using `WB_PUBLIC_SCHEME_HOST_AND_PORT`.

One must also know how this container will be visible to the user as well so it can ask the wikibase to redirect the
user back here. This should be passed as `QS_PUBLIC_SCHEME_HOST_AND_PORT`.
Note that Docker Engine doesn't provide such addresses, so you will likely need to set up a reverse proxy (such as nginx or haproxy) alongside either public DNS entries or a local DNS server with entries that route to these containers. See the Wikibase Suite example configuration for more guidance on how to set that up.

You can pass the consumer and secret token you got from the wikibase to this container as the environment variables
`OAUTH_CONSUMER_KEY` and `OAUTH_CONSUMER_SECRET`. If you don't, there are [extra-install scripts](../WikibaseBundle/extra-install/QuickStatements.sh) supplied in the Wikibase bundle that can automatically handle this.
You can pass the consumer and secret token you got from your Wikibase instance to this container using the environment variables
`OAUTH_CONSUMER_KEY` and `OAUTH_CONSUMER_SECRET`. Alternatively you can let the [extra-install scripts](../WikibaseBundle/extra-install/QuickStatements.sh) supplied in the Wikibase bundle handle this for you.

You can now test that it works by navigating to `QS_PUBLIC_SCHEME_HOST_AND_PORT` and logging in.
Test whether it works by navigating to `QS_PUBLIC_SCHEME_HOST_AND_PORT` and logging in.

You should be redirected to the wiki where you can authorize this Quickstatements to act on your behalf.
You should be redirected to the wiki, where you can authorize this QuickStatements to act on your behalf.

Finally you should be redirected back to Quickstatements and you should appear logged in.
Finally you should be redirected back to QuickStatements, and you should see yourself logged in.

Use Quickstatements as normal with the Run button. Currently "Run in background" is not supported by this image.
Use QuickStatements as you normally would, using the Run button. The "Run in background" option is not supported by this image.

#### Troubleshooting
If you see an error such as mw-oauth exception when trying to log in check that you have passed the right consumer token
and secret token to quickstatements.
If you see an error such as `mw-oauth exception` when trying to log in, check that you have passed the correct consumer token and secret token to QuickStatements.

If you have changed the value of $wgSecretKey $wgOAuthSecretKey since you made the consumer you'll need to make another new consumer or
reissue the secret token for the old one.
If you have changed the value of $wgSecretKey or $wgOAuthSecretKey since you made the consumer, you'll need to make a new consumer or reissue the secret token for the old one.
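
For illustration, the variables described above might be passed when starting the container roughly like this (the image reference, addresses, namespace values and tokens are placeholders for your own setup, not values from this repository):

```sh
# Hypothetical sketch: substitute your own addresses, tokens and namespace settings
docker run -d \
  -e WB_PUBLIC_SCHEME_HOST_AND_PORT="http://wikibase.example.com" \
  -e QS_PUBLIC_SCHEME_HOST_AND_PORT="http://quickstatements.example.com" \
  -e OAUTH_CONSUMER_KEY="<consumer key from Wikibase>" \
  -e OAUTH_CONSUMER_SECRET="<consumer secret from Wikibase>" \
  -e WB_PROPERTY_NAMESPACE=122 \
  -e WB_PROPERTY_PREFIX="Property:" \
  -e WB_ITEM_NAMESPACE=120 \
  -e WB_ITEM_PREFIX="Item:" \
  <quickstatements-image>
```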
2 changes: 1 addition & 1 deletion build/QuickStatements/config.json
@@ -9,7 +9,7 @@
"project":"${MW_SITE_NAME}" ,
"ini_file":"/quickstatements/data/oauth.ini" ,
"publicMwOAuthUrl":"${WB_PUBLIC_SCHEME_HOST_AND_PORT}/w/index.php?title=Special:OAuth" ,
"mwOAuthUrl":"${WIKIBASE_SCHEME_AND_HOST}/w/index.php?title=Special:OAuth" ,
"mwOAuthUrl":"${WB_PUBLIC_SCHEME_HOST_AND_PORT}/w/index.php?title=Special:OAuth" ,
"mwOAuthIW":"mw"
} ,
"server" : "${WB_PUBLIC_SCHEME_HOST_AND_PORT}" ,
2 changes: 1 addition & 1 deletion build/QuickStatements/entrypoint.sh
@@ -1,7 +1,7 @@
#!/usr/bin/env bash

# Test if required environment variables have been set
REQUIRED_VARIABLES=(QS_PUBLIC_SCHEME_HOST_AND_PORT WB_PUBLIC_SCHEME_HOST_AND_PORT WIKIBASE_SCHEME_AND_HOST WB_PROPERTY_NAMESPACE WB_PROPERTY_PREFIX WB_ITEM_NAMESPACE WB_ITEM_PREFIX)
REQUIRED_VARIABLES=(QS_PUBLIC_SCHEME_HOST_AND_PORT WB_PUBLIC_SCHEME_HOST_AND_PORT WB_PROPERTY_NAMESPACE WB_PROPERTY_PREFIX WB_ITEM_NAMESPACE WB_ITEM_PREFIX)
for i in "${REQUIRED_VARIABLES[@]}"; do
if ! [[ -v "$i" ]]; then
echo "$i is required but isn't set. You should pass it to docker. See: https://docs.docker.com/engine/reference/commandline/run/#set-environment-variables--e---env---env-file";
2 changes: 1 addition & 1 deletion build/WDQS/README.md
@@ -6,7 +6,7 @@ Wikibase specific Blazegraph image.

[WDQS](https://gerrit.wikimedia.org/r/admin/repos/wikidata/query/rdf) exposes by default some endpoints and methods that reveal internal details or functionality that might allow for abuse of the system. With the example docker configuration we are using the [WDQS-proxy](../WDQS-proxy/README.md) which filters out all long-running or unwanted requests.

When running WDQS in a setup without WDQS-proxy **please consider disabling these endpoints in some other way**. For more information on how this is tested in this pipeline see the test-cases in [queryservice.ts](../../../test/specs/repo/queryservice.ts)
When running WDQS in a setup without WDQS-proxy **please consider disabling these endpoints in some other way**. For more information on how this is tested in this pipeline see the test-cases in [queryservice.ts](../../test/specs/repo/queryservice.ts)

### Upgrading

6 changes: 3 additions & 3 deletions build/WikibaseBundle/Dockerfile
@@ -1,10 +1,10 @@
ARG COMPOSER_IMAGE_URL
ARG WMDE_RELEASE_VERSION
ARG RELEASE_VERSION
ARG MEDIAWIKI_VERSION
ARG WIKIBASE_SUITE_WIKIBASE_IMAGE_URL

# ###########################################################################
FROM ${WIKIBASE_SUITE_WIKIBASE_IMAGE_URL}:${RELEASE_VERSION}-${WMDE_RELEASE_VERSION} as wikibase
FROM ${WIKIBASE_SUITE_WIKIBASE_IMAGE_URL}:${MEDIAWIKI_VERSION}-${WMDE_RELEASE_VERSION} as wikibase

# ###########################################################################
# hadolint ignore=DL3006
@@ -64,7 +64,7 @@ RUN set -x; \
composer install --no-dev -vv -n

# ###########################################################################
FROM ${WIKIBASE_SUITE_WIKIBASE_IMAGE_URL}:${RELEASE_VERSION}-${WMDE_RELEASE_VERSION}
FROM ${WIKIBASE_SUITE_WIKIBASE_IMAGE_URL}:${MEDIAWIKI_VERSION}-${WMDE_RELEASE_VERSION}
LABEL org.opencontainers.image.source="https://github.com/wmde/wikibase-release-pipeline"
RUN apt-get update && \
DEBIAN_FRONTEND=noninteractive apt-get install \
4 changes: 4 additions & 0 deletions clean.sh
@@ -0,0 +1,4 @@
#!/usr/bin/env bash
rm -rf artifacts/*.tar.gz
rm -rf artifacts/*.log
rm -rf artifacts/*.env
2 changes: 1 addition & 1 deletion docs/diagrams/package.json
@@ -15,6 +15,6 @@
},
"scripts": {
"lint": "npx eslint .",
"fix": "npx eslint . --fix"
"lint:fix": "npx eslint . --fix"
}
}
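
With the script renamed, ESLint would be invoked through the usual npm script workflow, roughly as follows (assuming the commands are run from `docs/diagrams/`):

```sh
# Check for lint errors, then apply automatic fixes
npm run lint
npm run lint:fix
```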
21 changes: 4 additions & 17 deletions docs/topics/pipeline.md
Expand Up @@ -23,15 +23,11 @@ After triggering the pipeline a set of jobs will start running, which after a su

![Queuing the pipeline](../diagrams/output/overview.svg "Queuing the pipeline")

`DockerImages-lts`, `DockerImages-stable`, `DockerImages-next` - contains the release candidate docker images for each component that was built.
`DockerImages` - contains the release candidate docker images for each component that was built.

`TarBalls-lts`, `TarBalls-stable`, `TarBalls-next` - contains the release candidate tar archives for each component that was built.
`TestResults` - contains logs and screenshots from testing.

`Metadata-lts`, `Metadata-stable`, `Metadata-next` - contains artifacts describing what was built for each component that is included. Also contains the artifacts produced by the finishing `metadata` job that describe which versions were used when building / testing.

`TestResults-lts`, `TestResults-stable`, `TestResults-next` - contains logs and screenshots from testing.

`ScanResults-lts`, `ScanResults-stable`, `ScanResults-next` - contains logs from vulnerability testing.
`ScanResults` - contains logs from vulnerability testing.


## Running the pipeline locally
@@ -55,12 +51,6 @@ When building locally the artifacts are only stored in the docker daemons image
$ ./build.sh --save-image
```

If you want to also extract standalone tarballs, use the following command. This is basically also what the CI calls.

```sh
$ ./build.sh --save-image --extract-tarball
```

To rebuild without using Dockers cache, add the `--no-cache` option. Note that this will extend build times as all components need to be downloaded again (except Docker base images).

```sh
@@ -73,7 +63,7 @@ $ ./build.sh --no-cache
To remove any locally produced artifacts you can run the following commands.

```sh
make clean
./clean.sh
```

### Downloaded artifacts
@@ -108,6 +98,3 @@ MOCHA_OPTS_TIMEOUT=90000
```
WAIT_FOR_TIMEOUT=30000
```

#### Settings related to tarball publishing
See [publishing](publishing.md).