CONTRIBUTING.md
About this document

This document is intended to take effect on August 1st, 2022; until then, it is a work in progress.

When you make edits to this document, make sure you update the table of contents. There is a handy VS Code extension for this called Markdown All in One.

Table of contents

Writing issues

Writing good issues is key, both for communicating to the wider group of users what is needed or broken, and for helping the developer working on the issue reach the Definition of Done (DoD) and beyond.

For the developer writing the issue, it is good practice to include a screenshot, some example data, or a drawing of what changed. Since issues are linked into CHANGELOG.md, this habit propagates well-written issues over to PyPI and beyond.

Formulating a DoD is good practice. Take a moment to do this properly.

Code contribution workflow

We use GitHub Flow. In the ideal situation, this is what you do:

1. Create a branch off of main and name it according to the feature you are working on

If you are working based on a GitHub issue (which you should be), it is good practice to create your branch from the issue in GitHub. This will automatically give the branch a descriptive name, and link it to the issue.

> git checkout main                                # everything starts with main
> git pull                                         # make sure you have the latest
> git branch my_new_feature                        # name your new branch
> git checkout my_new_feature                      # check it out
> git push --set-upstream origin my_new_feature    # publish it

2. ✨ Do your thing ✨

  • 🧪 Write your tests
  • ⌨️ Write your code
  • 📒 Add documentation to the readme if necessary
  • 🚦 It's good practice to add test data to the migration_repo_template, both to maintain a set of examples for new users and to maintain integration test coverage

3. Prepare for merging

3.1. 🩹 Check for vulnerabilities

Run

nox -rs safety

and update any packages with a vulnerability.

3.2. 🧐 Check and format your code

The following command runs Flake8 with plugins on your code. It:

  • Uses black to format the code. The line length is set to 99 characters.
  • Uses isort to sort your imports. This makes merging much easier.

pre-commit run --all-files

3.3. 🧪 Run the entire test suite

This is crucial for making sure nothing else has broken during your work:

nox -rs tests -- https://okapi-LATEST_BUGFEST_URI TENANT_ID USERNAME PASSWORD

3.4. Make sure the code can run

> cd src
> poetry run python3 -m folio_migration_tools -h

should output

usage: __main__.py [-h] [--okapi_password OKAPI_PASSWORD] [--base_folder_path BASE_FOLDER_PATH] configuration_path task_name

positional arguments:
  configuration_path    Path to configuration file
  task_name             Task name. Use one of: BatchPoster, BibsTransformer, HoldingsCsvTransformer, HoldingsMarcTransformer, ItemsTransformer, LoansMigrator,
                        RequestsMigrator, UserTransformer

optional arguments:
  -h, --help            show this help message and exit
  --okapi_password OKAPI_PASSWORD
                        password for the tenant in the configuration file
  --base_folder_path BASE_FOLDER_PATH
                        path to the base folder for this library. Built on migration_repo_template

3.5. Create a pull request in GitHub

3.6. 🧑‍🤝‍🧑 Code review

3.7. After a successful code review, merge the branch into main

Use the closes keyword in your merge commit message to automatically close any issue(s) that your merged changes resolve.

closes #123, closes #456, closes #789

Create release

Create the release on GitHub.

Choose your version, and tag the release

Create release notes and change log using gren

Once released, create release notes using gren:

gren release --override

and create the change log, also using gren:

gren changelog --override

Publish package to pypi

1. Bump the version in pyproject.toml

Open pyproject.toml and set the new version number:

version = "1.5.1"

2. Build the package

poetry build

Make sure one of the built artifacts matches the version number you chose above.

3. Push the release to pypi

Run

poetry publish --username $PYPI_USERNAME --password $PYPI_PASSWORD

and follow the instructions

4. Finalize the release

Commit the updated pyproject.toml and push it back to main:

(main) > git add pyproject.toml
(main) > git commit -m "version VERSION_NUMBER"
(main) > git push

Python Coding standards and practices

What to install

> pipx install pre-commit  (and run pre-commit install)
> pipx install isort
> pipx install nox
> pipx install poetry
> pipx install twine 
> poetry shell
> poetry install
> npm install github-release-notes -g

Important settings

  • Set black max-line-length to 99
  • Use black in conjunction with isort. Make sure to set the --force-single-line-imports parameter

Setting up Visual Studio Code

Here is one example of the Python settings to use in VS Code:

"[python]": {
    "editor.codeActionsOnSave": {
        "source.organizeImports": true
    },
    "editor.wordBasedSuggestions": false
},
"editor.rulers": [99],
"python.formatting.blackArgs": ["--line-length=99"],
"python.formatting.provider": "black",
"python.languageServer": "Pylance",
"python.linting.flake8Args": [
    "--max-line-length=99",
    "--ignore=E203,W503",
    "--select=B,B9,BLK,C,E,F,I,S,W"
],
"python.linting.flake8Enabled": true,
"python.linting.mypyEnabled": true,
"python.linting.pylintEnabled": true,
"python.sortImports.args": [
    "--profile",
    "black",
    "--force-single-line-imports"
    ],

https://cereblanco.medium.com/setup-black-and-isort-in-vscode-514804590bf9

Testing

Running tests

Running tests against a FOLIO environment

Pytest. Run the test suite against the latest bugfest release. Example call:

 nox -rs tests -- https://okapi-LATEST_BUGFEST_URI TENANT_ID USERNAME PASSWORD

Running unit tests

If you configure VS Code properly (for example by using the VS Code settings in this repository), you can run or debug your tests from within the IDE. Just right-click the green triangle next to the test method and choose either Run Test or Debug Test.

Running will just run the test for you, while debugging lets you step through the code and inspect the values of the various objects. Make sure you add a breakpoint in the right place. The following screenshot shows how the value of the schema variable is visible in the Variables pane in VS Code.

Writing tests

Naming

Tests are written and maintained in the tests folder in the repository. Test files should be named after the class/file they are testing, and the tests themselves are named after the methods being tested. So, if you are testing a method named condition_trim_period in the conditions.py file, your test file should be named test_conditions.py and the test method should be named test_condition_trim_period.
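Following that convention, a test for condition_trim_period might look like the sketch below. The Conditions stand-in and its trimming behaviour are assumptions for illustration only; check conditions.py for the real class and signature.

```python
# tests/test_conditions.py -- sketch only. The Conditions class below is
# a simplified stand-in, NOT the real class from conditions.py.


class Conditions:
    def condition_trim_period(self, value: str) -> str:
        """Remove trailing periods from a field value (assumed behaviour)."""
        return value.rstrip(".")


def test_condition_trim_period():
    conditions = Conditions()
    assert conditions.condition_trim_period("Smith, John.") == "Smith, John"
    assert conditions.condition_trim_period("No period") == "No period"
```

The one-test-file-per-source-file pairing makes it easy to find existing coverage for the method you are about to change.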

Unit tests or integration-like tests?

The test suite contains both tests that need a connection to a FOLIO tenant to run and a growing number of unit tests that can run without any actual FOLIO tenant. The latter is preferable, so try to write unit tests, mocking the behaviour of a FOLIO tenant.

The exception to this is the test suite in test_rules_mapper_bibs.py, which needs to be rewritten long-term but will remain in its current form for now. So if you want to test the tools against real-world data and a tenant, this is the place to do it.

Test libraries used

We rely on pytest in conjunction with unittest.mock. There are numerous introductions to both libraries available online.
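As a minimal sketch of how the two libraries combine: pytest discovers any function named test_*, and unittest.mock.Mock stands in for a live FOLIO client. The fetch_instance helper and the API path here are invented for the example.

```python
from unittest.mock import Mock


def fetch_instance(folio_client, instance_id: str) -> dict:
    """Hypothetical helper: look up one instance record via the client."""
    return folio_client.folio_get(f"/instance-storage/instances/{instance_id}")


def test_fetch_instance_returns_record():
    # Mock replaces a live FOLIO tenant: we script the return value
    # and afterwards verify how the client was called.
    mock_client = Mock()
    mock_client.folio_get.return_value = {"id": "abc", "title": "A title"}

    record = fetch_instance(mock_client, "abc")

    assert record["title"] == "A title"
    mock_client.folio_get.assert_called_once_with("/instance-storage/instances/abc")
```

No pytest import is needed for a plain test like this; pytest picks it up by its test_ name.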

Test data

In the past we have used OAI-PMH-formatted MARC records. For historical reasons this is no longer needed, and going forward, MARC records should be as close to their original form as possible. One could argue for keeping all MARC records in JSON or .mrk for readability and searchability, but this would risk losing important nuances.

Test records should be placed in the tests/test_data folder.

Testing infrastructure

There is a folder in src/ named test_infrastructure. It contains classes and mocks that are, or could be, shared across a wider set of tests. This way, the behaviour of the FolioClient.folio_get_all() method can be standardized, for example, and more complexity can be added to these mocks as we introduce new tests.
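A shared mock along those lines might look like the sketch below. The class name, the records_by_path idea, and the folio_get_all signature shown are assumptions modeled loosely on FolioClient; verify against the real client before relying on them.

```python
# Sketch of a reusable mock for src/test_infrastructure/. The signature
# of folio_get_all below is an ASSUMPTION, not the real FolioClient API.


class MockedFolioClient:
    """Stands in for FolioClient in unit tests: serves canned records
    per API path instead of calling a live Okapi instance."""

    def __init__(self, records_by_path: dict):
        self.records_by_path = records_by_path

    def folio_get_all(self, path: str, key: str = "", query: str = "", limit: int = 10):
        # Yield records one by one, mimicking the paging/generator
        # behaviour of the real client.
        yield from self.records_by_path.get(path, [])


# Usage in a test:
mock_folio = MockedFolioClient({"/locations": [{"id": "loc-1", "code": "MAIN"}]})
locations = list(mock_folio.folio_get_all("/locations", "locations"))
```

Centralizing this in test_infrastructure means every test mocks the client the same way, and richer behaviour (paging, errors) can be added in one place.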

Code coverage

Your ambition should be to increase code coverage with every new commit. Coverage does not have to mean covering every single outcome or side effect of a method; start by testing and verifying that the "happy path" works as expected.

By ensuring we have at least "happy path" test coverage, the threshold for writing a test when a bug is discovered gets significantly lowered.
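For example, once happy-path coverage exists, pinning down a newly discovered bug can be as small as the sketch below. The normalize_barcode helper and the whitespace bug are invented for illustration.

```python
# Regression-test pattern: a bug report becomes a failing test first,
# then the fix makes it pass and guards against reintroduction.
# normalize_barcode and its behaviour are invented for this sketch.


def normalize_barcode(raw: str) -> str:
    # Hypothetical fix: barcodes with surrounding whitespace used to
    # slip through and break matching against FOLIO items.
    return raw.strip()


def test_normalize_barcode_strips_whitespace():
    assert normalize_barcode(" 36007001 \n") == "36007001"
    # The happy path still works unchanged.
    assert normalize_barcode("36007001") == "36007001"
```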

Running an end-to-end transformation

migration_repo_template contains a bash script called run_test_data_suite.sh that lets you run the transformers against the latest bugfest environment:

> bash run_test_data_suite.sh -pwd

When doing larger changes to the code base, it is a good idea to see that all of this works.

Contributing to the documentation

Documentation is hosted on Read the docs

Writing

The documentation is built with Sphinx, using the MyST parser to enable Markdown. Some documentation can be found here:

Publishing

There is a GitHub hook that automatically builds the documentation when pushing to main.

Building

To build the documentation locally, run

nox -rs docs

The documentation should now be built in the docs/_build folder. Open the index.html file in a browser to see how it looks. Note that the styling on Read the Docs will look different, but the resulting HTML should be the same.