Workflows (Nox and Poetry)
==========================

The following sections provide information on how to use the excellent Hypermodern-Python project foundation proposed by Claudio Jolowicz.

This project uses Sphinx, relying on docstrings in NumPy style, which are enforced by flake8-docstrings and darglint. Use Nox to conveniently build the documentation inside the :file:`docs/_build` folder:

To tweak or add nox sessions, alter the :file:`noxfile.py` inside this project's root directory.

Build the documentation while rebuilding only those files that changed:

nox -s docs

Rebuild the entire documentation from scratch:

nox -s docs_rebuild

Builds the documentation from scratch, serves it locally on port 8000, opens your default browser on the main page (:file:`docs/_build/index.html`), and live-rebuilds any pages that change (when saved to disk).

Invaluable when creating the documentation!

nox -s docs_live

This project uses Nox to conveniently run both unit tests and doctests:

To tweak or add nox testing sessions, alter the :file:`noxfile.py` inside this project's root directory.

Unit tests reside in :file:`tests/` inside the root directory of this project. Make sure to provide docstrings (since they are enforced, heh!) and to add new test modules to :file:`docs/source/unittests.rst`.

Run all unit tests using Nox:

nox -s tests
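A unit test in :file:`tests/` might look like this minimal sketch (the ``slugify`` helper is hypothetical and stands in for the code under test, which you would normally import from your package):

```python
"""Unit tests for a hypothetical slugify helper."""


def slugify(text):
    """Return a lowercase, dash-separated version of *text*.

    Stand-in for the code under test; normally you would import it.
    """
    return "-".join(text.lower().split())


def test_slugify_joins_words_with_dashes():
    """Check that whitespace is collapsed into single dashes."""
    assert slugify("Hello Nox World") == "hello-nox-world"
```

Note the docstrings on both the helper and the test function: flake8-docstrings will flag their absence.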

Unit tests can be marked by adding a ``@pytest.mark.MARKER`` decorator, as for example in :file:`tests/test_connectivity.py`:

@pytest.mark.con
def test_wikipedia_connectivity(request_random_wiki_article):
    """Try reaching the wikipedia site to get a random article."""
    answer = request_random_wiki_article
    print(answer)
    assert "Error" not in answer

These markers can be run explicitly by passing the ``-m MARKER`` option to the nox session, as in:

nox -s tests -- -m MARKER

This template supports the following markers by default:

  • con -- marks internet connection attempts
  • e2e -- marks end-to-end tests
  • slow -- marks particularly slow tests
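To avoid ``PytestUnknownMarkWarning`` warnings, markers like these are usually also registered with pytest. A sketch of such a registration, assuming pytest reads its configuration from :file:`pyproject.toml`:

```toml
[tool.pytest.ini_options]
markers = [
    "con: marks internet connection attempts",
    "e2e: marks end-to-end tests",
    "slow: marks particularly slow tests",
]
```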

These markers are excluded from the default ``nox -s tests`` session (which is also invoked by just calling ``nox``). They are thus also excluded from the Tests CI workflow in :file:`.github/workflows/tests.yml`. To modify this behavior, or to exclude additional markers, modify the ``"not e2e and not con and not slow"`` line inside :file:`noxfile.py`:

@nox.session(python="3.10")
def tests(session):
    """Run test suite."""
    args = session.posargs or [
        "--cov",
        "-m",
        "not e2e and not con and not slow",
        # append excluded markers as "and not ..."
    ]
    session.run("poetry", "install", "--no-dev", external=True)
    install_with_constraints(
        session,
        "coverage[toml]",
        "pytest",
        "pytest-cov",
        "pytest-mock",
    )
    session.run("pytest", *args)
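The ``session.posargs or [...]`` idiom above means that any arguments you pass after ``--`` replace the defaults entirely, rather than being appended to them. A minimal sketch of that fallback behavior:

```python
def resolve_args(posargs):
    """Return the CLI args: user-supplied ones win, else the defaults."""
    return posargs or ["--cov", "-m", "not e2e and not con and not slow"]


# With no extra arguments, the defaults are used:
print(resolve_args([]))
# With arguments after `--`, they replace the defaults entirely:
print(resolve_args(["-m", "con"]))
```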

So to test one of them run e.g.:

nox -s tests -- -m con

Personally, I love doctests. I think they are the most natural form of testing, since you achieve two things at once with them: enforced tests and pretty, copy-pastable examples inside your documentation.

Run all doctests using Nox:

nox -s xdoctests
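A minimal sketch of such a doctest (the function here is hypothetical; xdoctest collects the example from the docstring and runs it, so the example doubles as documentation):

```python
def fahrenheit_to_celsius(temp):
    """Convert a Fahrenheit temperature to Celsius.

    Example
    -------
    >>> fahrenheit_to_celsius(212.0)
    100.0
    """
    return (temp - 32.0) * 5.0 / 9.0


if __name__ == "__main__":
    # The stdlib doctest module runs the same example locally:
    import doctest

    doctest.testmod()
```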

After new code is added and all tests pass, the following is the usual workflow:

  1. Run Black to format your code: ``nox -s black``
  2. Stage your changes using ``git add``
  3. Run the pre-commit session to test, lint and format your package: ``nox -s pre-commit``
  4. Stage again to reflect changes made by pre-commit: ``git add``
  5. Commit your changes using ``git commit -m "MY MESSAGE"``

This project template provides two major forms of automated publishing:

  1. Development 'release' publishes on TestPyPI
  2. Stable release publishes on PyPI

Pseudo-release a (potentially unstable) development version of your package by pushing or merging a pull request to your remote develop branch. This automatically triggers the TestPyPI workflow in :file:`.github/workflows/test-pypi.yml`, which publishes a development version on TestPyPI.

To enable your repo to interact with your TestPyPI account, you need to create an API token named TEST_PYPI_TOKEN in your TestPyPI account settings and declare it as a secret in your remote GitHub repo.

Assuming you have successfully generated and declared your secret TestPyPI API token, the following workflow is proposed for creating a new (unstable) development release:

  1. Add all changes to your local develop branch
  2. Run the full test and lint suite using nox.
  3. Commit and push your changes to the remote develop branch.
  4. The TestPyPI workflow in :file:`.github/workflows/test-pypi.yml` automatically publishes the package with Poetry, using a dev versioning scheme.
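The exact dev versioning scheme is defined in the workflow file itself; a common pattern is to append a PEP 440 developmental-release suffix derived from a run-specific number. A sketch of that idea (an assumption for illustration; check :file:`test-pypi.yml` for the real rule):

```python
def dev_version(base_version, run_number):
    """Append a PEP 440 developmental-release suffix to a base version."""
    return f"{base_version}.dev{run_number}"


print(dev_version("1.2.3", 42))  # → 1.2.3.dev42
```

Each CI run then publishes a unique, sortable pre-release version to TestPyPI without touching the base version committed in :file:`pyproject.toml`.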

Release a stable version of your package by creating a release of your main/master branch via the GitHub website. This triggers the GitHub workflow called PyPI residing in :file:`.github/workflows/pypi.yml`, which automatically creates a release on PyPI.

To enable your repo to interact with your PyPI account, you need to create an API token named PYPI_TOKEN in your PyPI account settings and declare it as a secret in your remote GitHub repo.

Assuming you have successfully generated and declared your secret PyPI API token, the following workflow is proposed for creating a new release:

  1. Bump the package version on your local develop branch using ``poetry version major|minor|patch``, following Semantic Versioning.

  2. Run the full test and lint suite using nox.

  3. Commit and push your changes to the remote develop branch.

  4. Create a pull request from your remote develop branch to the remote main/master branch via your remote repo's GitHub webpage.

  5. Merge the pull request on your remote repo using the GitHub webpage.

  6. Create a release using the remote repo's webpage.

    Note that the Release Drafter workflow in :file:`.github/workflows/release-drafter.yml` automatically creates a release draft listing all your changes.

  7. The PyPI workflow in :file:`.github/workflows/pypi.yml` automatically publishes the package using Poetry.

Project dependencies are managed using Poetry.

Adding third party dependencies is done by using the poetry add command.

Add a required third party package to your package by using poetry:

poetry add PACKAGE

Add additional developer dependencies by using:

poetry add --dev PACKAGE

You can also constrain versions or install from alternative sources, for example:

poetry add package^1.0
poetry add "package>=1.0"
poetry add fogdb@latest
poetry add git+https://github.com/tZ3ma/fogdb.git
poetry add git+https://github.com/tZ3ma/fogdb.git#develop
poetry add ./my-package/
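The caret constraint in ``poetry add package^1.0`` pins the package below the next breaking release: the leftmost non-zero version part may not change. A sketch of that rule for a full three-part version (an illustration of standard SemVer caret semantics, not Poetry's actual resolver code):

```python
def caret_bounds(version):
    """Return (lower, upper) bounds implied by a SemVer caret constraint.

    ^1.2.3 allows >=1.2.3,<2.0.0 -- the leftmost non-zero part is fixed.
    """
    major, minor, patch = (int(p) for p in version.split("."))
    if major > 0:
        upper = f"{major + 1}.0.0"
    elif minor > 0:
        upper = f"0.{minor + 1}.0"
    else:
        upper = f"0.0.{patch + 1}"
    return (version, upper)


print(caret_bounds("1.2.3"))  # → ('1.2.3', '2.0.0')
print(caret_bounds("0.2.3"))  # → ('0.2.3', '0.3.0')
```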

To add a local path dependency in editable (develop) mode, modify the :file:`pyproject.toml` file inside this project's root directory:

[tool.poetry.dependencies]
my-package = {path = "../my/path", develop = true}

If the package(s) you want to install provide extras, you can specify them when adding the package by using one of the following lines:

poetry add requests[security,socks]
poetry add "requests[security,socks]~=2.22.0"
poetry add "git+https://github.com/pallets/flask.git@1.1.1[dotenv,dev]"

Updating third party dependencies is done by using the poetry update command.

Update all project dependencies by using:

poetry update

Update specific dependencies by using:

poetry update package1 package2

Bumping your package's version is done by using the ``poetry version semver`` command, where semver is one of Poetry's supported Semantic Versioning specifiers.

To bump your package's version use one of the following poetry commands:

poetry version patch
poetry version minor
poetry version major
poetry version prepatch
poetry version preminor
poetry version premajor
poetry version prerelease
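To illustrate what the plain specifiers do to a ``MAJOR.MINOR.PATCH`` version, a minimal sketch (for illustration only; Poetry implements this itself):

```python
def bump(version, part):
    """Bump a MAJOR.MINOR.PATCH version string per Semantic Versioning."""
    major, minor, patch = (int(p) for p in version.split("."))
    if part == "major":
        return f"{major + 1}.0.0"
    if part == "minor":
        return f"{major}.{minor + 1}.0"
    if part == "patch":
        return f"{major}.{minor}.{patch + 1}"
    raise ValueError(f"unknown part: {part}")


print(bump("1.2.3", "minor"))  # → 1.3.0
```

Note that bumping a higher part resets the lower ones to zero, which is why ``poetry version major`` on 1.2.3 yields 2.0.0.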

Removing third party dependencies is done by using the poetry remove command.

Remove a required third party package from your package by using poetry:

poetry remove PACKAGE