This file provides guidance to AI coding agents when working with code in this repository.
This is autotick-bot (conda-forge-tick), the automated maintenance bot for the conda-forge ecosystem. It creates PRs to update packages, runs migrations, and maintains the conda-forge dependency graph across thousands of feedstocks.
```bash
# Using the lockfile
conda-lock install conda-lock.yml
```
```bash
# Install in editable mode
pip install -e .
```

```bash
# Run all tests (requires Docker for container tests)
pytest -v

# Run tests in parallel
pytest -v -n 3

# Run a single test
pytest -v tests/test_file.py::test_function

# Skip MongoDB tests
pytest -v -m "not mongodb"

# To enable container-based tests, first build the test image:
docker build -t conda-forge-tick:test .
```

```bash
# General help
conda-forge-tick --help

# Debug mode (enables debug logging, disables multiprocessing)
conda-forge-tick --debug <command>

# Online mode (fetches graph data from GitHub, useful for local testing)
conda-forge-tick --online <command>

# Disable containers (for debugging, but note the security implications)
conda-forge-tick --no-containers <command>

# Example: update upstream versions for a single package
conda-forge-tick --debug --online update-upstream-versions numpy
```

Pre-commit handles linting (ruff, mypy, typos):

```bash
pre-commit run --all-files
```

CLI Entry Points (`conda_forge_tick/cli.py`, `conda_forge_tick/container_cli.py`):

- `conda-forge-tick`: Main CLI for bot operations
- `conda-forge-tick-container`: CLI for containerized operations
Key Modules:

- `auto_tick.py`: Main bot job; creates PRs for migrations and version updates
- `make_graph.py`: Builds the conda-forge dependency graph
- `make_migrators.py`: Initializes migration objects
- `update_upstream_versions.py`: Fetches latest versions from upstream sources
- `update_prs.py`: Updates PR statuses from GitHub
- `feedstock_parser.py`: Parses feedstock metadata
Base class: `Migrator` in `core.py`. Migrators handle automated changes:

- `version.py`: Version updates (special: uses the `CondaMetaYAML` parser)
- `migration_yaml.py`: CFEP-09 YAML migrations from conda-forge-pinning
- `arch.py`, `cross_compile.py`: Architecture migrations
- Custom migrators for specific ecosystem changes (libboost, numpy2, etc.)
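The filter/migrate split used by the migrators can be illustrated with a self-contained sketch. The class names, method signatures, and the `libfoo` migration below are hypothetical simplifications, not the real API from `core.py`:

```python
# Sketch of the migrator pattern: filter() decides which graph nodes to
# skip, migrate() rewrites the ones that remain. Names and signatures
# are simplified assumptions, not the real conda_forge_tick API.
class MigratorSketch:
    def filter(self, attrs: dict) -> bool:
        """Return True if this feedstock should be skipped."""
        return attrs.get("archived", False)

    def migrate(self, attrs: dict) -> dict:
        raise NotImplementedError


class BumpLibfooSketch(MigratorSketch):
    """Toy migration: bump a hypothetical 'libfoo' pin wherever it appears."""

    def filter(self, attrs: dict) -> bool:
        # also skip feedstocks that do not depend on libfoo at all
        return super().filter(attrs) or "libfoo" not in attrs.get("requirements", [])

    def migrate(self, attrs: dict) -> dict:
        new = dict(attrs)
        new["requirements"] = [
            "libfoo >=2" if req == "libfoo" else req for req in attrs["requirements"]
        ]
        return new


node = {"requirements": ["python", "libfoo"], "archived": False}
migrator = BumpLibfooSketch()
if not migrator.filter(node):
    node = migrator.migrate(node)
print(node["requirements"])  # ['python', 'libfoo >=2']
```

The real migrators operate on feedstock checkouts and full node attributes rather than plain dicts, but the skip-then-rewrite flow is the same.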
The bot uses the cf-graph-countyfair repository as its database. Key structures:

- `graph.json`: NetworkX dependency graph
- `node_attrs/`: Package metadata (one JSON file per package, sharded paths)
- `versions/`: Upstream version information
- `pr_json/`: PR status tracking
- `pr_info/`, `version_pr_info/`: Migration/version PR metadata
Pydantic models in `conda_forge_tick/models/` document the schema.
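The dependency-graph idea can be pictured with a toy adjacency mapping. This is illustrative only: the real graph.json is a serialized NetworkX graph with much richer node metadata, and the edge direction here is an assumption:

```python
# Toy dependency graph: package -> list of packages it depends on.
deps = {
    "numpy": ["python"],
    "scipy": ["python", "numpy"],
}

# Invert the mapping to answer the bot's key question:
# "if X gets a new version, which packages depend on it?"
rdeps: dict[str, list[str]] = {}
for pkg, requires in deps.items():
    for req in requires:
        rdeps.setdefault(req, []).append(pkg)

print(rdeps["numpy"])  # ['scipy']
```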
Data is loaded lazily via the `LazyJson` class. Backends are configured via `CF_TICK_GRAPH_DATA_BACKENDS`:

- `file`: Local filesystem (default; requires a cf-graph-countyfair clone)
- `github`: Read-only access via GitHub raw URLs (good for debugging)
- `mongodb`: MongoDB database
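The lazy-loading idea behind `LazyJson` can be sketched as follows. This is a minimal stand-in, not the real class: the actual implementation also handles writes, locking, and the multiple backends listed above:

```python
import json
import tempfile
from pathlib import Path


class LazyJsonSketch:
    """Illustrative lazy JSON loader: data is read from disk only on first access."""

    def __init__(self, path):
        self.path = Path(path)
        self._data = None  # nothing loaded yet

    @property
    def data(self):
        if self._data is None:  # load lazily on first access
            self._data = json.loads(self.path.read_text())
        return self._data

    def __getitem__(self, key):
        return self.data[key]


# Usage: write a small node_attrs-like file and access it lazily.
path = Path(tempfile.mkdtemp()) / "demo_node.json"
path.write_text(json.dumps({"name": "numpy", "version": "2.0.0"}))
node = LazyJsonSketch(path)   # no file I/O happens here
print(node["version"])        # the file is only read now
```

Deferring the read matters because the graph has tens of thousands of nodes; most bot jobs only touch a small subset of them.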
`CondaMetaYAML` in `recipe_parser/` handles Jinja2-templated YAML recipes:

- Preserves comments (important for conda selectors)
- Handles duplicate keys with different selectors via `__###conda-selector###__` tokens
- Extracts Jinja2 variables for version migration
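A toy version of the token trick: keys that carry a `# [selector]` comment are renamed so that both variants survive a round-trip through a YAML parser, which would otherwise drop one of the duplicate keys. The regex and exact key format here are simplifications of what the real parser does:

```python
import re

SELECTOR_TOKEN = "__###conda-selector###__"

def tokenize_selectors(recipe_text: str) -> str:
    """Rewrite 'key: value  # [selector]' lines so duplicate keys become unique."""
    out_lines = []
    for line in recipe_text.splitlines():
        m = re.match(r"^(\s*)([\w.-]+):(.*?)#\s*\[(.+)\]\s*$", line)
        if m:
            indent, key, value, selector = m.groups()
            # encode the selector into the key so duplicates stay distinct
            out_lines.append(f"{indent}{key}{SELECTOR_TOKEN}{selector}:{value.rstrip()}")
        else:
            out_lines.append(line)
    return "\n".join(out_lines)

recipe = """\
build:
  skip: true  # [win]
  skip: false  # [not win]
"""
print(tokenize_selectors(recipe))
```

After tokenization the two `skip` entries parse as separate keys; writing the recipe back out reverses the transformation so the original selectors reappear.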
See `conda_forge_tick/settings.py` for the full list. Key ones:

- `CF_TICK_GRAPH_DATA_BACKENDS`: Colon-separated backend list
- `CF_TICK_GRAPH_DATA_USE_FILE_CACHE`: Enable/disable local caching
- `MONGODB_CONNECTION_STRING`: MongoDB connection string
- `BOT_TOKEN`: GitHub token for bot operations
- `CF_FEEDSTOCK_OPS_IN_CONTAINER`: Set to `"true"` when running in a container
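For example, the colon-separated backend list could be consumed like this (an illustrative sketch; the real parsing lives in `conda_forge_tick/settings.py` and may differ):

```python
import os

def get_graph_data_backends() -> list[str]:
    # Fall back to the documented default ("file") when the variable is unset.
    raw = os.environ.get("CF_TICK_GRAPH_DATA_BACKENDS", "file")
    return [backend for backend in raw.split(":") if backend]

os.environ["CF_TICK_GRAPH_DATA_BACKENDS"] = "file:mongodb"
print(get_graph_data_backends())  # ['file', 'mongodb']
```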
The bot runs as multiple parallel cron jobs via GitHub Actions:
- `bot-bot.yml`: Main job making PRs
- `bot-feedstocks.yml`: Updates the feedstock list
- `bot-versions.yml`: Fetches upstream versions
- `bot-prs.yml`: Updates PR statuses
- `bot-make-graph.yml`: Builds the dependency graph
- `bot-make-migrators.yml`: Creates migration objects
- `bot-pypi-mapping.yml`: PyPI-to-conda-forge mapping
The integration tests live in `tests_integration/` and test the full bot pipeline against real GitHub repositories using staging accounts.
The integration tests require three GitHub entities that mimic production:
- Conda-forge org (`GITHUB_ACCOUNT_CONDA_FORGE_ORG`): Contains the test feedstocks
- Bot user (`GITHUB_ACCOUNT_BOT_USER`): Creates forks and PRs
- Regro org (`GITHUB_ACCOUNT_REGRO_ORG`): Contains a test cf-graph-countyfair repository
Default staging accounts are conda-forge-bot-staging, regro-cf-autotick-bot-staging, and regro-staging. You can use your own accounts by setting environment variables.
- Initialize git submodules (test feedstock resources are stored as submodules):

  ```bash
  git submodule update --init --recursive
  ```

- Create a `.env` file with the required environment variables:

  ```bash
  export BOT_TOKEN='<github-classic-pat>'
  export TEST_SETUP_TOKEN='<github-classic-pat>' # typically same as BOT_TOKEN
  export GITHUB_ACCOUNT_CONDA_FORGE_ORG='your-conda-forge-staging-org'
  export GITHUB_ACCOUNT_BOT_USER='your-bot-user'
  export GITHUB_ACCOUNT_REGRO_ORG='your-regro-staging-org'
  export PROXY_DEBUG_LOGGING='true' # optional, for debugging
  ```

  The GitHub tokens require the scopes `repo`, `workflow`, and `delete_repo`.
- Set up mitmproxy certificates (required for the HTTP proxy that intercepts requests):

  ```bash
  cd tests_integration
  ./mitmproxy_setup_wizard.sh
  ```

  On macOS: add the generated certificate to Keychain Access and set it to "Always Trust".
  On Linux: copy it to `/usr/local/share/ca-certificates/` and run `update-ca-certificates`.
- Build the Docker test image (required for container-based tests):

  ```bash
  docker build -t conda-forge-tick:test .
  ```

Important: Integration tests take a long time to execute (5+ minutes per test). To avoid repeated runs:
- Persist stdout/stderr to a file and grep for errors
- Run tests in the background while working on other tasks
```bash
# Source your environment variables
source .env

# Run from the repository root, skipping container tests (default)
# Recommended: redirect output to a file for later analysis
pytest -s -v --dist=no tests_integration -k "False" > /tmp/integration_test.log 2>&1 &
tail -f /tmp/integration_test.log  # follow output in another terminal

# Or run interactively if needed
pytest -s -v --dist=no tests_integration -k "False"

# Run only container tests (requires the Docker image built with the test tag)
pytest -s -v --dist=no tests_integration -k "True"

# Run a specific test scenario
pytest -s -v --dist=no tests_integration -k "test_scenario[0]"
```

Test cases are defined in `tests_integration/lib/_definitions/<feedstock>/__init__.py`. Each test case implements:

- `get_router()`: Defines mock HTTP responses via a FastAPI router
- `prepare(helper)`: Sets up test state (e.g., overwrites feedstock contents)
- `validate(helper)`: Asserts expected outcomes (e.g., a PR was created with the correct changes)
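The three-hook shape of a test case can be sketched like this. The scenario, names, and payloads below are hypothetical; the real `get_router()` returns a FastAPI `APIRouter`, stubbed here with a plain dict to keep the sketch dependency-free:

```python
# Hypothetical test-case sketch: mock data in, bot run in the middle,
# assertions out. A dict of route -> payload stands in for the real
# FastAPI APIRouter.
VERSION_PAYLOAD = {"info": {"version": "1.2.3"}}  # assumed mock upstream response

def get_router():
    # real code builds a fastapi.APIRouter with mocked endpoints
    return {"/pypi/demo-package/json": VERSION_PAYLOAD}

def prepare(helper):
    # set up test state, e.g. reset the feedstock contents to a known version
    helper["feedstock_version"] = "1.0.0"

def validate(helper):
    # assert the expected outcome, e.g. a version-bump PR was opened
    assert helper["pr_title"] == "demo-package v1.2.3"

helper = {}
prepare(helper)
helper["pr_title"] = "demo-package v1.2.3"  # stand-in for the actual bot run
validate(helper)
print("scenario passed")
```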
Tests run the full bot pipeline in sequence:
1. `gather-all-feedstocks`
2. `make-graph --update-nodes-and-edges`
3. `make-graph`
4. `update-upstream-versions`
5. `make-migrators`
6. `auto-tick`
7. Repeat `make-migrators` and `auto-tick` for state propagation
Each step deploys to the staging cf-graph-countyfair repo.