Upgrading Tron to py3.8 + patching it with the fix #934

Merged (19 commits) on Jan 31, 2024
9 changes: 4 additions & 5 deletions .github/workflows/ci.yml
@@ -8,13 +8,12 @@ jobs:
fail-fast: false
matrix:
toxenv:
- py36,docs
- cluster_itests
- py38,docs
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
with:
python-version: 3.6
python-version: 3.8
# GHA won't setup tox for us and we use tox-pip-extensions for venv-update
- run: pip install tox==3.2 tox-pip-extensions==1.3.0
# there are no pre-built wheels for bsddb3, so we need to install
@@ -32,12 +31,12 @@ jobs:
strategy:
fail-fast: false
matrix:
dist: [bionic]
dist: [bionic, jammy]
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
with:
python-version: 3.6
python-version: 3.8
# the container provided by GitHub doesn't include utilities
# needed for dpkg building, so we need to install `devscripts`
# to bring those in
13 changes: 6 additions & 7 deletions .pre-commit-config.yaml
@@ -1,6 +1,6 @@
---
default_language_version:
python: python3.6
python: python3.8
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v2.5.0
@@ -17,7 +17,7 @@ repos:
- id: pretty-format-json
args: [--autofix, --indent, '4', --no-sort-keys]
- repo: https://github.com/PyCQA/flake8
rev: 3.9.2
rev: 5.0.4
hooks:
- id: flake8
exclude: ^docs/source/conf.py$
@@ -27,10 +27,10 @@ repos:
- id: reorder-python-imports
args: [--py3-plus]
- repo: https://github.com/asottile/pyupgrade
rev: v2.0.2
rev: v3.3.1
hooks:
- id: pyupgrade
args: [--py36-plus]
args: [--py38-plus]
- repo: local
hooks:
- id: patch-enforce-autospec
@@ -41,8 +41,7 @@ repos:
language: script
files: ^tests/.*\.py$
- repo: http://github.com/psf/black
rev: 19.10b0
rev: 22.3.0
hooks:
- id: black
language_version: python3.6
args: [--target-version, py36]
args: [--target-version, py38]
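
With pyupgrade on --py38-plus and black targeting py38, the tooling no longer has to stay compatible with Python 3.6 syntax. As a purely illustrative sketch (not code from this repo, and not something these hooks generate automatically), these are the kinds of 3.8-only constructs the codebase can now rely on:

```python
# Illustrative only: constructs that require Python 3.8+, which the tooling
# above now targets. Not taken from the Tron codebase.
import sys


def head(items: list, /, default=None):  # positional-only parameters are 3.8+
    # Assignment expressions (the "walrus" operator) are 3.8+.
    if (count := len(items)) > 0:
        print(f"{count=}")  # the f-string '=' debug specifier is also 3.8+
        return items[0]
    return default


if __name__ == "__main__":
    assert sys.version_info >= (3, 8), "this snippet needs Python 3.8+"
    print(head([1, 2, 3]))
```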
31 changes: 14 additions & 17 deletions Makefile
nemacysts marked this conversation as resolved.
@@ -11,42 +11,43 @@ endif

NOOP = true
ifeq ($(PAASTA_ENV),YELP)
# This index must match the Ubuntu codename in the dockerfile.
DOCKER_PIP_INDEX_URL ?= http://169.254.255.254:20641/bionic/simple/
export PIP_INDEX_URL ?= http://169.254.255.254:20641/$*/simple/
Member: oh nice, this works! I was thinking that we'd need to do the $* bit inside the targets themselves :D

Contributor (author): yupp, same! I thought at first that I would need to add it to the targets themselves :p

(A sketch of what the $* stem works out to per distro follows this Makefile diff.)
export NPM_CONFIG_REGISTRY ?= https://npm.yelpcorp.com/
ADD_MISSING_DEPS_MAYBE:=-diff --unchanged-line-format= --old-line-format= --new-line-format='%L' ./requirements.txt ./yelp_package/extra_requirements_yelp.txt >> ./requirements.txt
else
DOCKER_PIP_INDEX_URL ?= https://pypi.python.org/simple
export PIP_INDEX_URL ?= https://pypi.python.org/simple
export NPM_CONFIG_REGISTRY ?= https://registry.npmjs.org
ADD_MISSING_DEPS_MAYBE:=$(NOOP)
endif

.PHONY : all clean tests docs dev cluster_itests
.PHONY : all clean tests docs dev

usage:
@echo "make test - Run tests"
@echo "make deb_bionic - Generate bionic deb package"
@echo "make itest_bionic - Run tests and integration checks"
@echo "make _itest_bionic - Run only integration checks"
@echo "make deb_jammy - Generate bionic deb package"
@echo "make itest_jammy - Run tests and integration checks"
@echo "make _itest_jammy - Run only integration checks"
@echo "make release - Prepare debian info for new release"
@echo "make clean - Get rid of scratch and byte files"
@echo "make dev - Get a local copy of trond running in debug mode in the foreground"

docker_%:
@echo "Building docker image for $*"
[ -d dist ] || mkdir -p dist
cd ./yelp_package/$* && docker build --build-arg PIP_INDEX_URL=${DOCKER_PIP_INDEX_URL} --build-arg NPM_CONFIG_REGISTRY=${NPM_CONFIG_REGISTRY} -t tron-builder-$* .
cd ./yelp_package/$* && docker build --build-arg PIP_INDEX_URL=${PIP_INDEX_URL} --build-arg NPM_CONFIG_REGISTRY=${NPM_CONFIG_REGISTRY} -t tron-builder-$* .

deb_%: clean docker_% coffee_%
@echo "Building deb for $*"
# backup these files so we can temp modify them
cp requirements.txt requirements.txt.old
$(ADD_MISSING_DEPS_MAYBE)
$(DOCKER_RUN) -e PIP_INDEX_URL=${DOCKER_PIP_INDEX_URL} tron-builder-$* /bin/bash -c ' \
$(DOCKER_RUN) -e PIP_INDEX_URL=${PIP_INDEX_URL} tron-builder-$* /bin/bash -c ' \
dpkg-buildpackage -d && \
mv ../*.deb dist/ && \
rm -rf debian/tron && \
chown -R $(UID):$(GID) dist debian \
rm -rf debian/tron \
'
# restore the backed up files
mv requirements.txt.old requirements.txt
@@ -56,15 +57,14 @@ coffee_%: docker_%
$(DOCKER_RUN) tron-builder-$* /bin/bash -c ' \
rm -rf tronweb/js/cs && \
mkdir -p tronweb/js/cs && \
coffee -o tronweb/js/cs/ -c tronweb/coffee/ && \
chown -R $(UID):$(GID) tronweb/js/cs/ \
coffee -o tronweb/js/cs/ -c tronweb/coffee/ \
'

test:
tox -e py36
tox -e py38

test_in_docker_%: docker_%
$(DOCKER_RUN) tron-builder-$* python3.6 -m tox -vv -e py36
$(DOCKER_RUN) tron-builder-$* python3.8 -m tox -vv -e py38

tox_%:
tox -e $*
@@ -78,17 +78,14 @@ debitest_%: deb_% _itest_%
itest_%: debitest_%
@echo "itest $* OK"

cluster_itests:
tox -e cluster_itests

dev:
SSH_AUTH_SOCK=$(SSH_AUTH_SOCK) .tox/py36/bin/trond --debug --working-dir=dev -l logging.conf --host=0.0.0.0
SSH_AUTH_SOCK=$(SSH_AUTH_SOCK) .tox/py38/bin/trond --debug --working-dir=dev -l logging.conf --host=0.0.0.0

example_cluster:
tox -e example-cluster

yelpy:
.tox/py36/bin/pip install -r yelp_package/extra_requirements_yelp.txt
.tox/py38/bin/pip install -r yelp_package/extra_requirements_yelp.txt

LAST_COMMIT_MSG = $(shell git log -1 --pretty=%B | sed -e 's/[\x27\x22]/\\\x27/g')
release:
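
For context on the review thread above: PIP_INDEX_URL is defined with ?= (a recursively expanded variable), so the $* inside it is only expanded when the variable is referenced from a pattern rule's recipe (docker_%, deb_%, ...), where $* is the matched stem, bionic or jammy. Below is a rough Python illustration of the per-distro lookup this amounts to; the helper function is hypothetical, only the URLs mirror the Makefile.

```python
# A rough Python illustration of what the Makefile's $* stem substitution
# computes for targets like `deb_bionic` or `deb_jammy`. The URLs mirror the
# Makefile; the helper itself is hypothetical.
INTERNAL_INDEX_TEMPLATE = "http://169.254.255.254:20641/{dist}/simple/"
PUBLIC_INDEX = "https://pypi.python.org/simple"


def pip_index_url(target: str, in_yelp_env: bool) -> str:
    """Return the pip index URL a deb_%/itest_% style target would use."""
    dist = target.split("_", 1)[1]  # the $* stem: "bionic" or "jammy"
    return INTERNAL_INDEX_TEMPLATE.format(dist=dist) if in_yelp_env else PUBLIC_INDEX


if __name__ == "__main__":
    print(pip_index_url("deb_jammy", in_yelp_env=True))
    print(pip_index_url("deb_bionic", in_yelp_env=False))
```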
73 changes: 56 additions & 17 deletions bin/tronctl
@@ -47,24 +47,40 @@ COMMAND_HELP = (
"job name, job run id, or action id",
"Start the selected job, job run, or action. Creates a new job run if starting a job.",
),
("rerun", "job run id", "Start a new job run with the same start time command context as the given job run.",),
(
"rerun",
"job run id",
"Start a new job run with the same start time command context as the given job run.",
),
(
"retry",
"action id",
"Re-run a job action within an existing job run. Uses latest code/config except the command by default. Add --use-latest-command to use the latest command.",
),
("recover", "action id", "Ask Tron to start tracking an UNKNOWN action run again"),
("cancel", "job run id", "Cancel the selected job run."),
("backfill", "job name", "Start job runs for a particular date range",),
(
"backfill",
"job name",
"Start job runs for a particular date range",
),
("disable", "job name", "Disable selected job and cancel any outstanding runs"),
("enable", "job name", "Enable the selected job and schedule the next run"),
("fail", "job run or action id", "Mark an UNKNOWN job or action as failed. Does not publish action triggers.",),
(
"fail",
"job run or action id",
"Mark an UNKNOWN job or action as failed. Does not publish action triggers.",
),
(
"success",
"job run or action id",
"Mark an UNKNOWN job or action as having succeeded. Will publish action triggers.",
),
("skip", "action id", "Skip a failed action, unblocks dependent actions. Does *not* publish action triggers.",),
(
"skip",
"action id",
"Skip a failed action, unblocks dependent actions. Does *not* publish action triggers.",
),
(
"skip-and-publish",
"action id",
@@ -87,8 +103,7 @@ def parse_date(date_string):

def parse_cli():
parser = cmd_utils.build_option_parser()
subparsers = parser.add_subparsers(dest="command", title="commands", help="Tronctl command to run")
subparsers.required = True # add_subparsers only supports required arg from py37
subparsers = parser.add_subparsers(dest="command", title="commands", help="Tronctl command to run", required=True)

cmd_parsers = {}
for cmd_name, id_help_text, desc in COMMAND_HELP:
@@ -100,14 +115,20 @@ def parse_cli():

# start
cmd_parsers["start"].add_argument(
"--run-date", type=parse_date, dest="run_date", help="What the run-date should be set to",
"--run-date",
type=parse_date,
dest="run_date",
help="What the run-date should be set to",
)

# backfill
backfill_parser = cmd_parsers["backfill"]
mutex_dates_group = backfill_parser.add_mutually_exclusive_group(required=True)
mutex_dates_group.add_argument(
"--start-date", type=parse_date, dest="start_date", help="First run-date to backfill",
"--start-date",
type=parse_date,
dest="start_date",
help="First run-date to backfill",
)
backfill_parser.add_argument(
"--end-date",
@@ -215,14 +236,16 @@ def request(url: str, data: Dict[str, Any], headers=None, method=None) -> bool:
def event_publish(args):
for event in args.id:
yield request(
urljoin(args.server, "/api/events"), dict(command="publish", event=event),
urljoin(args.server, "/api/events"),
dict(command="publish", event=event),
)


def event_discard(args):
for event in args.id:
yield request(
urljoin(args.server, "/api/events"), dict(command="discard", event=event),
urljoin(args.server, "/api/events"),
dict(command="discard", event=event),
)


Expand All @@ -236,7 +259,10 @@ def _get_triggers_for_action(server: str, action_identifier: str) -> Optional[Tu
return None

trigger_response = client.request(
uri=urljoin(server, f"/api/jobs/{namespace}.{job_name}/{run_number}/{action_name}",),
uri=urljoin(
server,
f"/api/jobs/{namespace}.{job_name}/{run_number}/{action_name}",
),
)
if trigger_response.error:
print(f"Unable to fetch downstream triggers for {action_identifier}: {trigger_response.error}")
@@ -256,17 +282,28 @@ def control_objects(args: argparse.Namespace):
url_index = tron_client.index()
for identifier in args.id:
try:
tron_id = client.get_object_type_from_identifier(url_index, identifier,)
tron_id = client.get_object_type_from_identifier(
url_index,
identifier,
)
except ValueError as e:
possibilities = list(tron_jobs_completer(prefix="", client=tron_client),)
suggestions = suggest_possibilities(word=identifier, possibilities=possibilities,)
possibilities = list(
tron_jobs_completer(prefix="", client=tron_client),
)
suggestions = suggest_possibilities(
word=identifier,
possibilities=possibilities,
)
raise SystemExit(f"Error: {e}{suggestions}")

if args.command == "skip-and-publish":
# this command is more of a pseudo-command - skip and publish are handled in two different resources
# and changing the API would be painful, so instead we call skip + publish separately from the client
# (i.e., this file) to implement this functionality
if request(url=urljoin(args.server, tron_id.url), data={"command": "skip"},):
if request(
url=urljoin(args.server, tron_id.url),
data={"command": "skip"},
):
# a single action can have 0..N triggers to publish and these can be arbitrarily named, so we need to
# query the API and figure out what triggers exist
triggers = _get_triggers_for_action(server=args.server, action_identifier=identifier)
@@ -283,7 +320,8 @@ def control_objects(args: argparse.Namespace):
# around the full set of args everywhere to do so
for trigger in triggers:
yield request(
url=urljoin(args.server, "/api/events"), data={"command": "publish", "event": trigger},
url=urljoin(args.server, "/api/events"),
data={"command": "publish", "event": trigger},
)
else:
print(f"Unable to skip {identifier}.")
@@ -414,7 +452,8 @@ def main():
sys.exit(ExitCode.fail)
except RequestError as err:
print(
f"Error connecting to the tron server ({args.server}): {err}", file=sys.stderr,
f"Error connecting to the tron server ({args.server}): {err}",
file=sys.stderr,
)
sys.exit(ExitCode.fail)

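
One cleanup in parse_cli() worth calling out: argparse's add_subparsers() has accepted required=True directly since Python 3.7, so the old subparsers.required = True workaround for 3.6 can go away now that the floor is 3.8. A minimal standalone sketch of the pattern (hypothetical command and arguments, not Tron's real parser):

```python
# Minimal sketch of the argparse pattern the parse_cli() change relies on:
# since Python 3.7, add_subparsers() takes required=True directly, so the
# `subparsers.required = True` workaround needed on 3.6 can be dropped.
import argparse


def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(prog="tronctl-sketch")
    # A missing subcommand now produces a proper argparse error.
    subparsers = parser.add_subparsers(dest="command", title="commands", required=True)
    start = subparsers.add_parser("start", help="start a job run")
    start.add_argument("id", help="job name, job run id, or action id")
    return parser


if __name__ == "__main__":
    args = build_parser().parse_args(["start", "MASTER.example_job"])
    print(args.command, args.id)
```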