diff --git a/.devcontainer/README.md b/.devcontainer/README.md
deleted file mode 100644
index 36c1399..0000000
--- a/.devcontainer/README.md
+++ /dev/null
@@ -1,40 +0,0 @@
-# Dev Container Configuration
-
-This directory contains the configuration files for setting up a development container.
-These configurations are compatible with **GitHub Codespaces**, **Visual Studio Code**,
-and **JetBrains IDEs**, and provide a pre-configured environment with all necessary
-dependencies for development.
-
-## GitHub Codespaces
-
-To launch a dev container using GitHub Codespaces:
-
-1. Navigate to the repository's main page.
-2. Click the **"Code"** button.
-3. Select the **"Codespaces"** tab.
-4. Click the **"+"** button to create a new codespace.
-
-The container will be initialized automatically using the configurations in this
-directory.
-
-[GitHub Codespaces Documentation](https://docs.github.com/en/codespaces/developing-in-a-codespace/creating-a-codespace-for-a-repository)
-
-## Visual Studio Code
-
-To use the dev container in VS Code:
-
-1. Open the root folder of the repository in Visual Studio Code.
-2. A prompt will appear asking if you want to reopen the folder in a dev container.
-3. Confirm by selecting **"Reopen in Container"**.
-
-[VS Code Dev Containers Guide](https://code.visualstudio.com/docs/devcontainers/tutorial)
-
-## JetBrains IDEs
-
-To open the dev container in a JetBrains IDE (e.g., IntelliJ IDEA, PyCharm):
-
-1. Open the `.devcontainer/devcontainer.json` file in your IDE.
-2. Click the Docker icon that appears in the UI.
-3. Follow the prompts to create and open the dev container.
-
-[JetBrains Dev Container Integration Guide](https://www.jetbrains.com/help/idea/connect-to-devcontainer.html)
diff --git a/.devcontainer/devcontainer.json b/.devcontainer/devcontainer.json
deleted file mode 100644
index a2652e3..0000000
--- a/.devcontainer/devcontainer.json
+++ /dev/null
@@ -1,78 +0,0 @@
-{
- "image": "mcr.microsoft.com/vscode/devcontainers/python:3.11",
- "postCreateCommand": "sh ./.devcontainer/setup.sh",
- "customizations": {
- "vscode": {
- "settings": {
- "todo-tree.regex.enableMultiLine": true,
- "editor.rulers": [
- {
- "column": 72,
- "color": "#4a4f63"
- },
- {
- "column": 88,
- "color": "#7a8ad1"
- }
- ],
- "autoDocstring.docstringFormat": "google-notypes",
- "autoDocstring.startOnNewLine": true,
- "better-comments.tags": [
- {
- "tag": "!!",
- "color": "#F6FF33",
- "strikethrough": false,
- "backgroundColor": "transparent"
- },
- {
- "tag": "#!",
- "color": "#3498DB",
- "strikethrough": false,
- "backgroundColor": "transparent"
- },
- {
- "tag": "TODO",
- "color": "#FF8C00",
- "strikethrough": false,
- "backgroundColor": "transparent"
- },
- {
- "tag": "//",
- "color": "#68FF33",
- "strikethrough": false,
- "backgroundColor": "transparent"
- },
- {
- "tag": "**",
- "color": "#FF33EC",
- "strikethrough": false,
- "backgroundColor": "transparent"
- }
- ],
- "[python]": {
- "editor.formatOnType": true
- },
- "python.defaultInterpreterPath": "/home/jovyan/envs/env/bin/python",
- "[jsonc]": {
- "editor.defaultFormatter": "vscode.json-language-features"
- },
- "[json]": {
- "editor.defaultFormatter": "esbenp.prettier-vscode"
- },
- "prettier.printWidth": 88,
- "prettier.proseWrap": "always"
- },
- "extensions": [
- "njpwerner.autodocstring",
- "aaron-bond.better-comments",
- "esbenp.prettier-vscode",
- "ms-python.python",
- "ms-python.debugpy",
- "Gruntfuggly.todo-tree"
- ]
- }
- },
- "features": {
- "ghcr.io/devcontainers/features/github-cli:1": {}
- }
-}
diff --git a/.devcontainer/setup.sh b/.devcontainer/setup.sh
deleted file mode 100644
index 658115b..0000000
--- a/.devcontainer/setup.sh
+++ /dev/null
@@ -1,17 +0,0 @@
-#!/bin/bash
-
-set -e
-
-echo "Installing uv..."
-curl -LsSf https://astral.sh/uv/install.sh | sh > /dev/null 2>&1
-echo "✅ uv installed."
-
-echo "Installing system dependencies..."
-sudo apt-get update > /dev/null 2>&1
-sudo apt-get install -y build-essential > /dev/null 2>&1
-echo "✅ System dependencies installed."
-
-echo "Installing Python dependencies with Makefile..."
-make install > /dev/null 2>&1
-
-echo "✅ Devcontainer setup complete."
\ No newline at end of file
diff --git a/.github/CODE_OF_CONDUCT.md b/.github/CODE_OF_CONDUCT.md
index 9c6a518..76bedcf 100644
--- a/.github/CODE_OF_CONDUCT.md
+++ b/.github/CODE_OF_CONDUCT.md
@@ -25,8 +25,7 @@ include:
Examples of unacceptable behavior include:
-- The use of sexualized language or imagery, and sexual attention or advances of any
- kind
+- The use of sexualized language or imagery, and sexual attention or advances of any kind
- Trolling, insulting or derogatory comments, and personal or political attacks
- Public or private harassment
- Publishing others' private information, such as a physical or email address, without
@@ -58,8 +57,8 @@ Instances of abusive, harassing, or otherwise unacceptable behavior may be repor
directly to the community leaders responsible for enforcement. All complaints will be
reviewed and investigated promptly and fairly.
-All community leaders are obligated to respect the privacy and security of the
-individual reporting any incident.
+All community leaders are obligated to respect the privacy and security of the individual
+reporting any incident.
## Enforcement Guidelines
@@ -81,9 +80,9 @@ inappropriate. A public apology may be requested.
**Consequence**: A warning with consequences for continued behavior. No interaction with
the people involved, including unsolicited interaction with those enforcing the Code of
-Conduct, for a specified period of time. This includes avoiding interactions in
-community spaces as well as external channels like social media. Violating these terms
-may lead to a temporary or permanent ban.
+Conduct, for a specified period of time. This includes avoiding interactions in community
+spaces as well as external channels like social media. Violating these terms may lead to
+a temporary or permanent ban.
### 3. Temporary Ban
diff --git a/.github/CONTRIBUTING.md b/.github/CONTRIBUTING.md
index fa54ff5..0f0cf75 100644
--- a/.github/CONTRIBUTING.md
+++ b/.github/CONTRIBUTING.md
@@ -27,8 +27,8 @@ You can contribute by:
Start by opening an issue to describe your proposed change or the problem you
encountered. This helps maintainers review and guide the work before coding begins.
-> For minor changes, such as typo fixes, you may skip this step and submit a pull
-> request directly.
+> For minor changes, such as typo fixes, you may skip this step and submit a pull request
+> directly.
### Step 2: Make Your Changes
@@ -75,7 +75,7 @@ To set up locally:
3. Clone the repository and run:
```bash
-make install
+make setup
```
> If using `uv`, a compatible virtual environment will be created automatically.
diff --git a/.github/ISSUE_TEMPLATE/bug-report.yml b/.github/ISSUE_TEMPLATE/bug-report.yml
index 3733f1e..a96d278 100644
--- a/.github/ISSUE_TEMPLATE/bug-report.yml
+++ b/.github/ISSUE_TEMPLATE/bug-report.yml
@@ -14,8 +14,7 @@ body:
attributes:
label: "System Information"
description: "Please provide details about your system configuration."
- placeholder:
- "python-project-template version:\nPlatform/OS:\nPython version:"
+ placeholder: "python-project-template version:\nPlatform/OS:\nPython version:"
validations:
required: true
diff --git a/.github/actions/build-mkdocs/action.yml b/.github/actions/build-mkdocs/action.yml
index fb2ddd0..847a095 100644
--- a/.github/actions/build-mkdocs/action.yml
+++ b/.github/actions/build-mkdocs/action.yml
@@ -18,15 +18,8 @@ runs:
- name: Setup gh-pages Branch
shell: bash
run: |
- # Store current branch
- CURRENT_BRANCH=$(git branch --show-current)
-
- # Check if gh-pages branch exists
- if git ls-remote --heads origin gh-pages | grep -q gh-pages; then
- echo "gh-pages branch exists, fetching..."
- git fetch origin gh-pages:gh-pages
- else
- echo "gh-pages branch doesn't exist, creating..."
+ if ! git ls-remote --heads origin gh-pages | grep -q gh-pages; then
+ CURRENT_BRANCH=$(git branch --show-current)
git checkout --orphan gh-pages
git reset --hard
git commit --allow-empty -m "Initial gh-pages commit"
@@ -34,14 +27,8 @@ runs:
git checkout "$CURRENT_BRANCH"
fi
- - name: Build and Deploy Docs with Mike
+ - name: Build and Deploy Docs
shell: bash
run: |
- echo "Deploying Docs Version: ${{ inputs.docs-version }}"
uv run mike deploy --push --update-aliases "${{ inputs.docs-version }}" latest
-
- - name: Set Default Version to Docs
- shell: bash
- run: |
- echo "Setting 'Latest' as Default Version"
uv run mike set-default --push latest
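The simplified step above only bootstraps the orphan `gh-pages` branch when it does not already exist. The sequence can be exercised outside CI in a throwaway repository — the snippet below is a sketch with illustrative paths and committer identity, and it omits the `git push origin gh-pages` the action performs:

```shell
# Reproduce the one-time gh-pages bootstrap in a scratch repository.
repo=$(mktemp -d)
cd "$repo"
git init -q -b main
git -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty -m "init"

CURRENT_BRANCH=$(git branch --show-current)
git checkout -q --orphan gh-pages   # new branch with no parent history
git reset -q --hard                 # clear anything carried over in the index
git -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty \
  -m "Initial gh-pages commit"
git checkout -q "$CURRENT_BRANCH"   # return to the original branch
```

Because `CURRENT_BRANCH` is captured before the orphan checkout, the working branch is restored at the end, which is why the refactor moved the assignment inside the `if` block.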
diff --git a/.github/actions/create-release/action.yml b/.github/actions/create-release/action.yml
index 3d91904..4afd27d 100644
--- a/.github/actions/create-release/action.yml
+++ b/.github/actions/create-release/action.yml
@@ -26,52 +26,24 @@ runs:
run: |
if gh release view "${{ inputs.version }}" >/dev/null 2>&1; then
echo "exists=true" >> $GITHUB_OUTPUT
- echo "Release ${{ inputs.version }} already exists"
else
echo "exists=false" >> $GITHUB_OUTPUT
- echo "Release ${{ inputs.version }} doesn't exist"
fi
env:
GH_TOKEN: ${{ inputs.token }}
- - name: Generate Release Notes
+ - name: Create Release
if: steps.check_release.outputs.exists == 'false'
shell: bash
run: |
- echo "Generating release notes..."
LAST_TAG=$(git describe --tags --abbrev=0 2>/dev/null || echo "")
+ RANGE="${LAST_TAG:+$LAST_TAG..HEAD}"
+ COMMITS=$(git log ${RANGE:---max-count=10} --pretty=format:"- %s" --no-merges)
- if [ -z "$LAST_TAG" ]; then
- echo "No previous tags found, using last 10 commits..."
- COMMITS=$(git log --pretty=format:"- %s" --no-merges -10)
- else
- echo "Previous tag found: $LAST_TAG"
- COMMITS=$(git log "${LAST_TAG}..HEAD" --pretty=format:"- %s" --no-merges)
- fi
-
- cat > release-notes.md << EOF
- ## Changes
-
- $COMMITS
- EOF
-
- echo "Release notes generated:"
- cat release-notes.md
-
- - name: Create Release
- if: steps.check_release.outputs.exists == 'false'
- shell: bash
- run: |
- echo "Creating release ${{ inputs.version }}..."
gh release create "${{ inputs.version }}" \
--title "${{ inputs.version }}" \
- --notes-file release-notes.md
- echo "Release ${{ inputs.version }} created successfully"
+ --notes "## Changes
+
+ $COMMITS"
env:
GH_TOKEN: ${{ inputs.token }}
-
- - name: Skip Release
- if: steps.check_release.outputs.exists == 'true'
- shell: bash
- run: |
- echo "Skipping release creation - ${{ inputs.version }} already exists"
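The condensed release-notes logic leans on two shell parameter expansions: `${VAR:+alt}` yields `alt` only when `VAR` is non-empty, and `${VAR:-default}` falls back to `default` when `VAR` is empty. A minimal sketch of the combined effect, using a hypothetical tag value:

```shell
# No previous tag: RANGE stays empty, so git log falls back to --max-count=10.
LAST_TAG=""
RANGE="${LAST_TAG:+$LAST_TAG..HEAD}"
echo "git log ${RANGE:---max-count=10}"   # → git log --max-count=10

# A previous tag exists: RANGE limits the log to commits since that tag.
LAST_TAG="v1.2.0"
RANGE="${LAST_TAG:+$LAST_TAG..HEAD}"
echo "git log ${RANGE:---max-count=10}"   # → git log v1.2.0..HEAD
```

Note that `$RANGE` is deliberately left unquoted in the `git log` call so that an empty value disappears rather than becoming an empty argument.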
diff --git a/.github/actions/setup-python-env/action.yml b/.github/actions/setup-python-env/action.yml
index 2a553f6..ef503ec 100644
--- a/.github/actions/setup-python-env/action.yml
+++ b/.github/actions/setup-python-env/action.yml
@@ -25,39 +25,20 @@ runs:
enable-cache: true
- name: Install dependencies with uv
- id: install-deps
shell: bash
run: |
- # Check if we should install all extras
- if [ -z "${{ inputs.uv-group }}" ] && [ -z "${{ inputs.uv-extra }}" ]; then
- echo "Installing all extras (default when no group or extra specified)..."
- uv sync --all-extras
- elif [ "${{ inputs.uv-extra }}" = "--all-extras" ]; then
- echo "Installing all extras (explicitly requested)..."
- if [ -n "${{ inputs.uv-group }}" ]; then
- echo "Note: Installing all extras overrides the specified group: ${{ inputs.uv-group }}"
- fi
- uv sync --all-extras
+ ARGS=""
+ if [ "${{ inputs.uv-extra }}" = "--all-extras" ] || [ -z "${{ inputs.uv-group }}${{ inputs.uv-extra }}" ]; then
+ ARGS="--all-extras"
else
- echo "Installing with group: ${{ inputs.uv-group }}, and extra: ${{ inputs.uv-extra }}..."
- if [ -n "${{ inputs.uv-group }}" ] && [ -n "${{ inputs.uv-extra }}" ]; then
- uv sync --group ${{ inputs.uv-group }} --extra ${{ inputs.uv-extra }}
- elif [ -n "${{ inputs.uv-group }}" ]; then
- uv sync --group ${{ inputs.uv-group }}
- elif [ -n "${{ inputs.uv-extra }}" ]; then
- uv sync --extra ${{ inputs.uv-extra }}
- else
- uv sync
- fi
+ [ -n "${{ inputs.uv-group }}" ] && ARGS="$ARGS --group ${{ inputs.uv-group }}"
+ [ -n "${{ inputs.uv-extra }}" ] && ARGS="$ARGS --extra ${{ inputs.uv-extra }}"
fi
+ uv sync $ARGS
- name: Verify uv and environment
- id: verify
shell: bash
run: |
- echo "uv version:"
uv --version
- echo "Virtual environments:"
uv venv list
- echo "Python version:"
uv run python --version
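The flattened argument-building logic above can be checked locally by factoring it into a plain function; `build_sync_args` below is a hypothetical helper mirroring the action's inputs, not part of the action itself:

```shell
# Mirror of the action's input handling: empty inputs or an explicit
# --all-extras both resolve to installing every extra.
build_sync_args() {
  group="$1"; extra="$2"; args=""
  if [ "$extra" = "--all-extras" ] || [ -z "$group$extra" ]; then
    args=" --all-extras"
  else
    [ -n "$group" ] && args="$args --group $group"
    [ -n "$extra" ] && args="$args --extra $extra"
  fi
  echo "uv sync$args"
}

build_sync_args "" ""             # → uv sync --all-extras
build_sync_args pipeline ""       # → uv sync --group pipeline
build_sync_args pipeline docs     # → uv sync --group pipeline --extra docs
```

This preserves the original behavior, including the edge case where `--all-extras` overrides a specified group.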
diff --git a/.github/actions/test-code/action.yml b/.github/actions/test-code/action.yml
new file mode 100644
index 0000000..7ebb725
--- /dev/null
+++ b/.github/actions/test-code/action.yml
@@ -0,0 +1,18 @@
+name: Test Code
+description: Run Python tests with Pytest
+
+inputs:
+  src-tests-folder:
+    description: "Directory where the tests are located"
+    required: true
+    default: "tests"
+
+runs:
+  using: composite
+  steps:
+    - name: Run tests with Pytest
+      shell: bash
+      run: |
+        if [ -d "${{ inputs.src-tests-folder }}" ] && [ -n "$(find "${{ inputs.src-tests-folder }}" -name 'test_*.py')" ]; then
+          uv run pytest "${{ inputs.src-tests-folder }}"
+        fi
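The guard in the new action skips Pytest entirely when no `test_*.py` files exist, which keeps the workflow green for freshly generated projects that have no tests yet. The check in isolation, with illustrative directory names:

```shell
# True only when the directory exists and contains at least one test file.
has_tests() {
  [ -d "$1" ] && [ -n "$(find "$1" -name 'test_*.py' 2>/dev/null)" ]
}

demo=$(mktemp -d)
mkdir -p "$demo/tests"
touch "$demo/tests/test_example.py"

has_tests "$demo/tests" && echo "would run: uv run pytest $demo/tests"
has_tests "$demo/missing" || echo "no tests found; skipping pytest"
```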
diff --git a/.github/dependabot.yml b/.github/dependabot.yml
index da59a8e..6e638c6 100644
--- a/.github/dependabot.yml
+++ b/.github/dependabot.yml
@@ -17,4 +17,3 @@ updates:
labels:
- "dependencies"
- "python"
-
diff --git a/.github/workflows/workflow.yml b/.github/workflows/workflow.yml
index 706cfcc..72dff19 100644
--- a/.github/workflows/workflow.yml
+++ b/.github/workflows/workflow.yml
@@ -8,10 +8,11 @@ on:
env:
SRC_PYTHON_VERSION: "3.11"
+ TEST_PATH: "tests"
jobs:
- setup:
- name: Setup Code
+ setup-test:
+ name: Setup and Test
runs-on: ubuntu-latest
steps:
@@ -25,11 +26,16 @@ jobs:
uv-group: "pipeline"
uv-extra: "--all-extras"
+ - name: Run Tests
+ uses: ./.github/actions/test-code
+ with:
+ src-tests-folder: ${{ env.TEST_PATH }}
+
build-deploy-docs:
if: github.ref == 'refs/heads/main'
name: Build MkDocs Documentation
runs-on: ubuntu-latest
- needs: setup
+ needs: setup-test
permissions:
contents: write
diff --git a/AGENTS.md b/AGENTS.md
index 8c67fba..041d778 100644
--- a/AGENTS.md
+++ b/AGENTS.md
@@ -1,59 +1,40 @@
# Agent Instructions
-This project is a **Cookiecutter template** used to generate new Python projects with a
-professional, ready-to-use structure. It integrates best practices for code quality,
-testing, security, documentation, and CI/CD. Because it is highly dynamic, it includes
-many optional settings and Jinja2 template blocks.
+This is a **Cookiecutter template** for generating Python projects with integrated best
+practices for code quality, testing, security, documentation, and CI/CD.
-## Getting Context
+## Before You Start
-Before interacting with the template, make sure to:
+1. Review `README.md` for template overview and features
+2. Check `AGENTS.md` in generated projects for project-specific instructions
+3. Understand this is a **dynamic generator**, not a static project
-1. **Review the README**: The `README.md` provides a high-level overview of the
- template, its features, CI/CD workflow, and instructions to generate new projects.
- Key features include:
+## Key Files
- - Linting & type checking (Ruff & Mypy)
- - Security scanning (Bandit)
- - Code complexity analysis (Complexipy)
- - Testing (Pytest)
- - Auto documentation (MkDocs + GitHub Pages)
- - Preconfigured GitHub Actions for CI/CD
+- **cookiecutter.json**: Template variables (metadata, Python version, optional features)
+- **hooks/post_gen_project.py**: Removes disabled optional folders post-generation
+- **Makefile**: Workflow commands (`setup`, `lint`, `code-check`, `test`, `pipeline`)
+- **.github/actions/**: Reusable GitHub Actions for CI/CD
-2. **Review `AGENTS.md` in the generated project**: After generating a new project, the
- template copies an `AGENTS.md` file to the project itself
- (`{{cookiecutter.__package_slug}}/AGENTS.md`). This file provides project-specific
- instructions for agents.
+## Template Syntax
-## Working with the Template
+- Files use **Jinja2 syntax**: `{{ cookiecutter.project_name }}`
+- Preserve template delimiters and conditional blocks when modifying
+- Invalid syntax breaks project generation
-- **Jinja2 blocks**: Because this is a Cookiecutter template, many files contain Jinja2
- template syntax. Agents should expect template variables like
- `{{ cookiecutter.project_name }}` or conditional blocks.
+## Validation Workflow
-- **Testing functionality**:
+Run `make pipeline` to execute:
- - Create new projects inside the `workspaces/` directory.
- - Run the Makefile targets to validate functionality:
+- Linting (Ruff, isort)
+- Type checking (Mypy)
+- Complexity analysis (Complexipy)
+- Security scanning (Bandit)
+- Tests (Pytest)
- - `make install` → installs dependencies
- - `make pipeline` → runs linting, type checking, security analysis, complexity
- checks, and tests
- - `make all` → full workflow including documentation preview
+## Critical Rules
- - Ensure the environment is isolated (e.g., via `venv`) and that
- [`uv`](https://github.com/astral-sh/uv) is installed to handle grouped dependency
- installations.
-
-## Summary
-
-Agents interacting with this template should:
-
-- Understand it is a **dynamic project generator**, not a single static project.
-- Always use the generated `workspaces/` directory for testing.
-- Follow the Makefile workflow to validate and test features.
-- Keep track of optional components and ensure `post_gen_project.py` handles cleanup
- properly.
-
-By following these instructions, agents will be able to safely generate, test, and
-maintain projects derived from this Python template.
+1. This generates projects; each instance differs based on `cookiecutter.json` choices
+2. Optional features: `.devcontainer/`, `.vscode/`, `notebooks/`, `prompts/`
+3. Always validate changes with `make pipeline`
+4. Never break Jinja2 template syntax
diff --git a/Makefile b/Makefile
index 2c43880..37d7a58 100644
--- a/Makefile
+++ b/Makefile
@@ -1,12 +1,13 @@
.PHONY: setup \
clean-cache-temp-files \
- lint code-check \
+ lint code-check test \
doc \
pipeline all
.DEFAULT_GOAL := all
-SRC_PROJECT_HOOKS ?= hooks
+PATH_PROJECT_ROOT ?= .
+TEST_PATH ?= tests
setup:
@echo "Installing dependencies..."
@@ -23,23 +24,27 @@ clean-cache-temp-files:
lint:
@echo "Running lint checks..."
- @uv run isort $(SRC_PROJECT_HOOKS)/
- @uv run ruff check --fix $(SRC_PROJECT_HOOKS)/
- @uv run ruff format $(SRC_PROJECT_HOOKS)/
+ @uv run isort $(PATH_PROJECT_ROOT)
+ @uv run ruff check --fix $(PATH_PROJECT_ROOT)
+ @uv run ruff format $(PATH_PROJECT_ROOT)
@echo "✅ Linting complete."
code-check:
@echo "Running static code checks..."
- @uv run mypy $(SRC_PROJECT_HOOKS)/
- @uv run complexipy -f $(SRC_PROJECT_HOOKS)/
- @uv run bandit -r $(SRC_PROJECT_HOOKS)/
+ @uv run mypy $(PATH_PROJECT_ROOT)
+ @uv run complexipy -f $(PATH_PROJECT_ROOT)
@echo "✅ Code and security checks complete."
+test:
+ @echo "Running tests..."
+ @uv run pytest $(TEST_PATH) -v
+ @echo "✅ Tests complete."
+
doc:
@echo "Serving documentation..."
@uv run mkdocs serve
-pipeline: clean-cache-temp-files lint code-check
+pipeline: clean-cache-temp-files lint code-check test
@echo "✅ Pipeline complete."
all: setup pipeline doc
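Both new variables use `?=`, so they only take the defaults when not already set and can be overridden per invocation (for example `make test TEST_PATH=tests/unit`). A toy Makefile — not the project's — demonstrating the override semantics:

```shell
# Write a minimal Makefile mimicking the ?= defaults used above.
printf 'TEST_PATH ?= tests\ntest:\n\t@echo "pytest $(TEST_PATH)"\n' > /tmp/demo.mk

make -f /tmp/demo.mk test                       # → pytest tests
make -f /tmp/demo.mk test TEST_PATH=tests/unit  # → pytest tests/unit
```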
diff --git a/README.md b/README.md
index 9977947..4a4d25e 100644
--- a/README.md
+++ b/README.md
@@ -15,68 +15,76 @@
# Python Project Template
Python Project Template provides a ready-to-use structure for Python projects,
-integrating best practices for code quality, testing, security, documentation, and CI/CD.
-It helps developers start new projects quickly with a maintainable and professional
-foundation.
+integrating best practices for code quality, testing, and more. It helps developers
+start new projects quickly with a maintainable and professional foundation.
+
+> **Warning**: This template is configured for Linux x86_64 systems. For other platforms,
+> you may need to adjust the `environments` and `required-environments` settings in
+> `pyproject.toml`.
## Features
-- **Linting & Type Checking**: Ruff and Mypy for clean, consistent code.
-- **Security Scanning**: Bandit detects potential vulnerabilities.
-- **Code Complexity Analysis**: Complexipy identifies complex functions and modules.
-- **Testing Suite**: Reliable unit testing with Pytest.
-- **Auto Documentation**: MkDocs + GitHub Pages for automated docs.
-- **CI/CD**: GitHub Actions automates linting, testing, and documentation deployment.
+- **Linting & Type Checking**: [Ruff](https://docs.astral.sh/ruff/) and
+ [Mypy](https://www.mypy-lang.org/) for clean, consistent code.
+- **Security Scanning**: [Bandit](https://bandit.readthedocs.io/en/latest/) detects
+ potential vulnerabilities.
+- **Code Complexity Analysis**: [Complexipy](https://rohaquinlop.github.io/complexipy/)
+ identifies complex functions and modules.
+- **Testing Suite**: Reliable unit testing with
+ [Pytest](https://docs.pytest.org/en/stable/).
+- **Auto Documentation**: [MkDocs](https://www.mkdocs.org/) +
+ [GitHub Pages](https://docs.github.com/en/pages) for automated docs.
+- **CI/CD**: [GitHub Actions](https://docs.github.com/en/actions) automates linting,
+ testing, and documentation deployment.
And more.
## Getting Started
Before starting, ensure that you have required Python installed and a virtual environment
-set up. It is recommended to create an isolated environment (e.g., using `venv`) to
-manage dependencies cleanly. Additionally, ensure that
-[`uv`](https://github.com/astral-sh/uv) is installed in your environment to handle
-grouped dependency installations.
+set up. It is recommended to create an isolated environment to manage dependencies
+cleanly. Additionally, ensure that [`uv`](https://github.com/astral-sh/uv) is installed
+in your environment to handle grouped dependency installations.
1. Generate Your Project
- Use Cookiecutter to create a new project from the template:
+ Use Cookiecutter to create a new project from the template:
- ```bash
- cookiecutter https://github.com/danibcorr/python-project-template.git
- ```
+ ```bash
+ cookiecutter https://github.com/danibcorr/python-project-template.git
+ ```
- Follow the prompts to configure project metadata, package name, and other options.
+ Follow the prompts to configure project metadata, package name, and other options.
2. Install Dependencies
- Activate your virtual environment and install all dependencies using the included
- `Makefile`:
+ Activate your virtual environment and install all dependencies using the included
+ `Makefile`:
- ```bash
- make install
- ```
+ ```bash
+ make setup
+ ```
- This installs development, testing, and documentation tools as defined in
- `pyproject.toml`.
+ This installs development, testing, and documentation tools as defined in
+ `pyproject.toml`.
3. Run the Pipeline
- Execute the quality pipeline, which includes linting, type checking, security
- analysis, complexity checks, and test execution:
+ Execute the quality pipeline, which includes linting, type checking, complexity
+ checks, and test execution:
- ```bash
- make pipeline
- ```
+ ```bash
+ make pipeline
+ ```
4. Run the Full Workflow (Optional)
- To perform a complete setup including dependency installation, full quality checks,
- and local documentation preview:
+ To perform a complete setup including dependency installation, full quality checks,
+ and local documentation preview:
- ```bash
- make all
- ```
+ ```bash
+ make all
+ ```
- This ensures that the project environment is fully prepared for development and
- validation.
+ This ensures that the project environment is fully prepared for development and
+ validation.
diff --git a/docs/assets/css/custom.css b/docs/assets/css/custom.css
new file mode 100644
index 0000000..5aa8180
--- /dev/null
+++ b/docs/assets/css/custom.css
@@ -0,0 +1,16 @@
+body {
+ text-align: justify;
+}
+
+.mermaid {
+ display: flex;
+ justify-content: center;
+}
+
+.md-typeset__table {
+ min-width: 100%;
+}
+
+.md-typeset table:not([class]) {
+ display: table;
+}
diff --git a/docs/content/ci-cd.md b/docs/content/ci-cd.md
index e5693c9..c2db9f4 100644
--- a/docs/content/ci-cd.md
+++ b/docs/content/ci-cd.md
@@ -1,72 +1,63 @@
# Continuous Integration and Continuous Deployment with GitHub Actions
-## Overview of the Automation Strategy
-
-The continuous integration and continuous deployment (CI/CD) process is implemented
-through GitHub Actions using a workflow definition located at
-`.github/workflows/workflow.yml`. This workflow is designed to automate, in a unified and
-reproducible manner, the validation, documentation generation, and release management
-phases of the project lifecycle. By integrating these activities into a single automated
-pipeline, the repository ensures consistency between code changes, documentation updates,
-and published releases, while reducing manual intervention and the risk of human error.
-
-## Workflow Architecture and Execution Logic
-
-The workflow is composed of several logically separated jobs, each responsible for a
-specific stage of the CI/CD process. These jobs are executed according to clearly defined
-triggering conditions, primarily based on repository events and branch context. The
-structure reflects a progressive validation model in which foundational checks are
-performed first, followed by documentation deployment and release creation when the code
-reaches a stable state.
-
-## Environment Setup and Dependency Initialization
-
-The initial job, referred to as the setup phase, is executed on every push and on every
-pull request. Its primary purpose is to establish a controlled and reproducible execution
-environment. During this phase, a Python runtime environment is provisioned, ensuring
-that subsequent jobs operate under consistent interpreter conditions. All dependencies
-defined under the `pipeline` group are then installed, which guarantees that the tools
-required for validation, documentation generation, and automation are available before
-any further processing occurs. This step functions as a foundational safeguard, as
-failures at this stage prevent downstream jobs from executing under incomplete or
-misconfigured conditions.
-
-## Documentation Build and Deployment on the Main Branch
-
-When changes are merged into the `main` branch, the workflow activates an additional job
-dedicated to documentation management. In this phase, the project documentation is built
-using MkDocs, a static site generator specifically designed for technical documentation.
-The resulting site is then automatically deployed to GitHub Pages, ensuring that the
-published documentation always reflects the current state of the main codebase.
-
-Version management of the documentation is handled through `mike`, which enables the
-coexistence of multiple documentation versions corresponding to different project
-releases. This approach allows users to consult documentation aligned with specific
-versions of the software, thereby improving traceability and long-term maintainability.
-The deployment process is fully automated and does not require manual approval once the
+## Automation Strategy Overview
+
+Continuous integration and continuous deployment are implemented through GitHub Actions
+using a workflow definition located at `.github/workflows/workflow.yml`. This workflow
+serves as the central automation mechanism of the project and orchestrates validation,
+documentation generation, and release management within a unified pipeline. By
+integrating these processes, the repository maintains consistency between code changes,
+documentation, and released artifacts, while reducing manual intervention and operational
+errors.
+
+## Workflow Architecture and Execution Model
+
+The workflow is organized into distinct jobs, each corresponding to a specific stage of
+the CI/CD lifecycle. Job execution is controlled by repository events, such as pushes and
+pull requests, and by branch-specific conditions. The overall design follows a
+progressive execution model in which preliminary checks are completed first, establishing
+a stable baseline before documentation deployment and release publication are triggered.
+This structure ensures that only validated code advances to user-facing outputs.
+
+## Environment Initialization and Dependency Setup
+
+The initial job is executed on every push and pull request and is responsible for
+preparing a reproducible execution environment. A Python runtime is provisioned to ensure
+consistent interpreter behavior across all stages of the pipeline. Subsequently, all
+dependencies defined in the `pipeline` group are installed, guaranteeing the availability
+of the tools required for validation, documentation building, and automation tasks. Any
+failure during this phase interrupts the workflow, preventing downstream jobs from
+running under incomplete or unreliable conditions.
+
+## Documentation Build and Deployment
+
+When changes are integrated into the `main` branch, the workflow triggers a dedicated
+documentation job. In this stage, MkDocs is used to generate a static documentation site
+from the project sources, which is then automatically deployed to GitHub Pages. This
+process ensures that the published documentation accurately reflects the current state of
+the main codebase.
+
+Documentation versioning is handled through `mike`, which allows multiple versions of the
+documentation to coexist. Each version corresponds to a specific software release,
+enabling users to consult documentation aligned with the version they are using. The
+entire build and deployment process is automated and requires no manual approval once the
workflow conditions are satisfied.
-## Automated Release Creation and Versioning
-
-Also restricted to the `main` branch, the release creation job formalizes the delivery of
-a new software version. During this stage, the workflow programmatically determines the
-current project version and uses this information to create a new GitHub release. Release
-notes are generated automatically, typically by aggregating relevant commit messages or
-changes since the previous release. This ensures that each release is consistently
-documented and that users have immediate access to a structured summary of modifications,
-enhancements, and fixes.
+## Automated Release Management
-By integrating release creation into the CI/CD pipeline, the workflow enforces a direct
-relationship between the main branch state and published artifacts, reducing
-discrepancies between source code and released versions.
+Release creation is also restricted to the `main` branch and represents the final stage
+of the CI/CD pipeline. During this phase, the workflow determines the current project
+version and creates a corresponding GitHub release. Release notes are generated
+automatically based on changes since the previous release, providing users with a concise
+and structured summary of updates. Integrating release management into the workflow
+enforces a direct relationship between the repository state and published versions,
+reducing discrepancies between source code and distributed artifacts.
-## GitHub Pages Configuration and Activation
+## GitHub Pages Configuration Requirements
-For the automated documentation deployment to function correctly, GitHub Pages must be
-configured to use GitHub Actions as its source. This configuration is performed within
-the repository settings by navigating to the Pages section and selecting GitHub Actions
-as the publication source. Once this setting is applied, every push to the `main` branch
-triggers the documentation build and deployment process defined in the workflow. As a
-result, the documentation site is continuously updated without requiring any additional
-manual steps, ensuring alignment between the repository content and its public technical
+For documentation deployment to function correctly, GitHub Pages must be configured to
+use GitHub Actions as its publication source. This setting is applied in the repository
+Pages configuration. Once enabled, every push to the `main` branch automatically triggers
+the documentation build and deployment defined in the workflow, ensuring continuous
+alignment between the repository content and its publicly available technical
documentation.
diff --git a/docs/content/cookiecutter.md b/docs/content/cookiecutter.md
index 441ea54..3bc67d5 100644
--- a/docs/content/cookiecutter.md
+++ b/docs/content/cookiecutter.md
@@ -10,12 +10,12 @@ enforces best practices and organizational standards in project structure.
## Configuration Variables
-The customization process is driven by a configuration file, typically named
-`cookiecutter.json`, which defines the variables that the template expects. Each variable
-corresponds to a specific aspect of the project, such as naming conventions, folder
-structures, or optional features. During project generation, Cookiecutter interactively
-prompts the user to provide values for these variables, thereby producing a project
-tailored to the user’s specifications. The main configuration variables include:
+The customization process is driven by a configuration file, `cookiecutter.json`, which
+defines the variables that the template expects. Each variable corresponds to a specific
+aspect of the project, such as naming conventions, folder structures, or optional
+features. During project generation, Cookiecutter interactively prompts the user to
+provide values for these variables, thereby producing a project tailored to the user’s
+specifications. The main configuration variables include:
| Variable | Description | Example |
| -------------------------- | ----------------------------------------------- | ------------------------------ |
@@ -27,6 +27,7 @@ tailored to the user’s specifications. The main configuration variables includ
| `add_notebooks_folder` | Option to include a notebooks folder | `yes` / `no` |
| `add_prompts_folder` | Option to include a prompts folder | `yes` / `no` |
| `add_dev_container_folder` | Option to include a Dev Container configuration | `yes` / `no` |
+| `add_vscode_folder` | Option to include VS Code configuration | `yes` / `no` |
These variables enable fine-grained control over the structure and functionality of the
generated project, accommodating different workflows and development environments.
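+
+As an illustration, these variables can also be supplied non-interactively on
+the command line (the template URL is a placeholder, not the real repository):
+
+```bash
+cookiecutter https://github.com/<org>/<template> --no-input \
+    add_notebooks_folder=yes \
+    add_dev_container_folder=yes \
+    add_vscode_folder=no
+```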
diff --git a/docs/content/makefile-commands.md b/docs/content/makefile-commands.md
index c319e75..c512fbe 100644
--- a/docs/content/makefile-commands.md
+++ b/docs/content/makefile-commands.md
@@ -42,7 +42,6 @@ readability and maintainability. It integrates several tools:
- **isort**: Organizes imports alphabetically and categorically.
- **Ruff**: A high-performance linter that detects and fixes style issues.
-- **nbqa**: Applies linting to Jupyter notebooks.
Typical automatic fixes include removing unused imports, correcting spacing and
formatting, and reorganizing imports according to PEP 8 conventions. This command is
@@ -83,7 +82,7 @@ during refactoring or when optimizing the codebase for maintainability.
make check-dead-code
```
-## Running Unit Tests: `make tests`
+## Running Unit Tests: `make test`
This command executes all unit tests using Pytest. It supports structured test execution
via `pytest-order` and can generate coverage reports if configured. The `tests/`
@@ -91,7 +90,7 @@ directory is scanned recursively, ensuring that all modules are tested thoroughl
command should be run frequently after code modifications to verify correctness.
```bash
-make tests
+make test
```
## Documentation Management: `make doc`
@@ -106,6 +105,22 @@ local edits and published content.
make doc
```
+## Pre-Commit Validation: `make pre-commit`
+
+The `pre-commit` command runs a subset of the pipeline specifically designed for
+pre-commit hooks. It performs the following steps in sequence:
+
+1. Cleans caches and temporary files.
+2. Runs linting and formatting.
+3. Performs static analysis, complexity measurement, and security scanning.
+
+This command is automatically executed by Git pre-commit hooks to validate code before
+committing.
+
+```bash
+make pre-commit
+```
+
## Complete Code Quality Pipeline: `make pipeline`
The `pipeline` command orchestrates the full sequence of code quality checks and testing.
diff --git a/docs/content/pre-commit-hooks.md b/docs/content/pre-commit-hooks.md
index 2d10efa..2aff7d3 100644
--- a/docs/content/pre-commit-hooks.md
+++ b/docs/content/pre-commit-hooks.md
@@ -12,10 +12,10 @@ changes meet established coding standards and pass essential checks automaticall
## Functionality
When a developer attempts to commit changes, the pre-commit system automatically triggers
-the full development pipeline by executing:
+the pre-commit validation pipeline by executing:
```bash
-make pipeline
+make pre-commit
```
This command performs a comprehensive sequence of actions, including:
@@ -23,10 +23,11 @@ This command performs a comprehensive sequence of actions, including:
- Linting and formatting checks to ensure consistent code style.
- Type validation using Mypy to detect mismatches and type-related errors.
- Security analysis with Bandit to identify potential vulnerabilities in the code.
-- Execution of all unit tests to confirm that functionality remains correct.
+- Complexity analysis with Complexipy to identify overly complex code.
If any of these checks fail, the commit is blocked, thereby preventing problematic code
-from being added to the repository.
+from being added to the repository. Note that tests are not executed during pre-commit to
+keep the validation fast, but they are run in the full `make pipeline` command.
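+
+Assuming a standard `pre-commit` setup, the hook is installed once per clone so
+that it runs automatically on every subsequent commit:
+
+```bash
+pre-commit install
+```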
## Advantages
diff --git a/docs/content/pyproject-configuration.md b/docs/content/pyproject-configuration.md
index 03546c0..a9b984f 100644
--- a/docs/content/pyproject-configuration.md
+++ b/docs/content/pyproject-configuration.md
@@ -10,11 +10,22 @@ interoperability between tools. By leveraging `pyproject.toml`, the project achi
modular dependency management, consistent tool behavior, and simplified setup for both
development and production environments.
+## Build System: `[build-system]` Section
+
+Defines the build backend used for packaging:
+
+```toml
+[build-system]
+requires = ["uv_build>=0.9.18,<0.10.0"]
+build-backend = "uv_build"
+```
+
+This configuration uses `uv_build` as the build backend, enabling fast and reliable
+package building.
+
## Project Metadata: `[project]` Section
-The `[project]` section defines fundamental information about the project. This includes
-the project name, version, description, author information, and the minimum supported
-Python version. An example configuration is:
+The `[project]` section defines fundamental information about the project:
```toml
[project]
@@ -22,12 +33,56 @@ name = "my-project"
version = "0.0.1"
description = "Project description"
authors = [{name = "Your Name", email = "your@email.com"}]
+license = {file = "LICENSE"}
+readme = "README.md"
requires-python = ">=3.11"
+classifiers = [
+ "Intended Audience :: Developers",
+ "Intended Audience :: Science/Research",
+ "Topic :: Scientific/Engineering",
+ "Topic :: Software Development",
+ "License :: OSI Approved :: MIT License",
+ "Programming Language :: Python :: 3",
+ "Programming Language :: Python :: 3 :: Only",
+]
+keywords = ["python"]
```
These fields ensure that packaging tools, dependency managers, and documentation
-generators can automatically retrieve key project information. This metadata also
-standardizes versioning and distribution across different environments.
+generators can automatically retrieve key project information. The `classifiers` help
+categorize the project on PyPI, while `keywords` improve discoverability.
+
+## UV Configuration: `[tool.uv]` Section
+
+Configures the UV package manager behavior:
+
+```toml
+[tool.uv]
+default-groups = "all"
+cache-keys = [{ file = "pyproject.toml" }, { git = { commit = true } }]
+environments = [
+ "sys_platform == 'linux'",
+]
+required-environments = [
+ "sys_platform == 'linux' and platform_machine == 'x86_64'"
+]
+```
+
+- `default-groups = "all"`: Installs all dependency groups by default.
+- `cache-keys`: Defines cache invalidation triggers based on file changes and git
+ commits.
+- `environments`: Specifies supported platforms (Linux).
+- `required-environments`: Restricts to Linux x86_64 architecture.
+
+### UV Build Backend: `[tool.uv.build-backend]` Section
+
+```toml
+[tool.uv.build-backend]
+module-name = "my_module"
+module-root = ""
+```
+
+Configures the module name and root directory for the build backend.
## Dependency Management: `[dependency-groups]` Section
@@ -40,15 +95,14 @@ workflows.
The `pipeline` group includes tools essential for code quality, testing, security, and
automation:
-- `pytest` and `pytest-order`: Frameworks for running unit tests and controlling test
- execution order.
-- `ruff`: A high-performance linter and automatic formatter.
-- `mypy`: Static type checker to detect type inconsistencies.
- `bandit`: Security analysis tool for detecting common vulnerabilities.
 - `complexipy`: Measures cognitive complexity to identify potentially
   unmaintainable code.
+- `mypy`: Static type checker to detect type inconsistencies.
+- `pytest` and `pytest-order`: Frameworks for running unit tests and controlling test
+ execution order.
+- `ruff`: A high-performance linter and automatic formatter.
- `isort`: Sorts imports according to standard conventions.
-- `nbqa`: Extends linting capabilities to Jupyter notebooks.
- `deadcode`: Detects unused or unreachable code.
- `pre-commit`: Manages Git pre-commit hooks for automated checks.
@@ -58,23 +112,39 @@ The `documentation` group contains tools for building and maintaining project
documentation:
- `mkdocs` and `mkdocs-material`: Static site generator and material design theme.
-- `mkdocstrings`: Generates API documentation directly from Python docstrings.
+- `mkdocstrings[python]`: Generates API documentation directly from Python docstrings.
+- `mkdocs-git-revision-date-localized-plugin`: Displays last update dates for pages.
+- `mkdocs-git-authors-plugin`: Shows contributors for each page.
+- `mkdocs-enumerate-headings-plugin`: Automatically numbers headings.
- `mkdocs-jupyter`: Integrates Jupyter notebooks into documentation.
+- `mkdocs-awesome-nav`: Enhances navigation capabilities.
- `mike`: Provides versioned documentation management.
-- Additional plugins: Enhance navigation, metadata handling, and site functionality.
## Tool-Specific Configuration
### Ruff: `[tool.ruff]`
-The Ruff linter is configured to enforce code style, detect errors, and simplify code
-where possible:
+The Ruff linter is configured to enforce code style, detect errors, and simplify code:
```toml
[tool.ruff]
line-length = 88
indent-width = 4
+exclude = [".venv", "build", "dist", ...]
+extend-exclude = ["*.ipynb"]
+```
+
+- `line-length = 88`: Maximum line length following Black's convention.
+- `indent-width = 4`: Standard Python indentation.
+- `exclude`: Directories to ignore during linting.
+- `extend-exclude`: Additional patterns to exclude (Jupyter notebooks).
+
+#### Ruff Lint Rules: `[tool.ruff.lint]`
+
+```toml
+[tool.ruff.lint]
select = ["E", "F", "UP", "B", "SIM"]
+ignore = ["E203"]
```
- `E`: Enforces PEP 8 style guidelines.
@@ -82,6 +152,30 @@ select = ["E", "F", "UP", "B", "SIM"]
- `UP`: Applies modern Python syntax checks and upgrades.
- `B`: Detects common programming bugs.
- `SIM`: Suggests code simplifications to improve readability.
+- `ignore = ["E203"]`: Ignores whitespace before ':' (conflicts with Black).
+
+#### Ruff Docstring Style: `[tool.ruff.lint.pydocstyle]`
+
+```toml
+[tool.ruff.lint.pydocstyle]
+convention = "google"
+```
+
+Enforces Google-style docstring conventions.
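+
+For reference, a docstring following this convention looks like the following
+(the function itself is an illustrative example):
+
+```python
+def add(a: int, b: int) -> int:
+    """Add two integers.
+
+    Args:
+        a: First operand.
+        b: Second operand.
+
+    Returns:
+        The sum of the two operands.
+    """
+    return a + b
+```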
+
+#### Ruff Formatter: `[tool.ruff.format]`
+
+```toml
+[tool.ruff.format]
+quote-style = "double"
+indent-style = "space"
+line-ending = "auto"
+docstring-code-format = true
+docstring-code-line-length = 88
+```
+
+Configures the formatter to use double quotes and space indentation, and to
+format code blocks within docstrings.
### Mypy: `[tool.mypy]`
@@ -91,21 +185,54 @@ Mypy performs static type checking, ensuring type safety and consistency:
[tool.mypy]
check_untyped_defs = true
ignore_missing_imports = true
+exclude = ["^(build|dist|venv)", ".venv/"]
+cache_dir = "/dev/null"
```
- `check_untyped_defs = true`: Validates functions even if type annotations are missing.
- `ignore_missing_imports = true`: Avoids errors for third-party modules without type
- stubs, preventing unnecessary interruptions.
+ stubs.
+- `exclude`: Directories to skip during type checking.
+- `cache_dir = "/dev/null"`: Disables caching to avoid stale cache issues.
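+
+As an example of what `check_untyped_defs` enables, Mypy reports the type error
+below even though the function carries no annotations (illustrative snippet):
+
+```python
+def greet(name):
+    # Checked despite the missing annotations:
+    # error: Unsupported operand types for + ("str" and "int")
+    return "Hello, " + len(name)
+```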
### isort: `[tool.isort]`
isort enforces a standardized import order, improving code readability and
-maintainability. Imports are grouped as follows:
+maintainability:
+
+```toml
+[tool.isort]
+profile = "black"
+known_first_party = ["my_module"]
+sections = ["FUTURE", "STDLIB", "THIRDPARTY", "FIRSTPARTY", "LOCALFOLDER"]
+import_heading_stdlib = "Standard libraries"
+import_heading_thirdparty = "3pps"
+import_heading_firstparty = "Own modules"
+import_heading_localfolder = "Own modules"
+line_length = 88
+include_trailing_comma = true
+combine_as_imports = true
+skip_glob = [".venv/*", ".uv-cache/*"]
+```
+
+Imports are grouped as follows:
-1. **STDLIB**: Python standard library modules.
-2. **THIRDPARTY**: External dependencies installed via package managers.
-3. **FIRSTPARTY**: Modules developed within the current project.
-4. **LOCALFOLDER**: Relative imports for local submodules or packages.
+1. **FUTURE**: Future imports (`from __future__ import ...`).
+2. **STDLIB**: Python standard library modules.
+3. **THIRDPARTY**: External dependencies installed via package managers.
+4. **FIRSTPARTY**: Modules developed within the current project.
+5. **LOCALFOLDER**: Relative imports for local submodules or packages.
+
+The configuration uses Black-compatible formatting with custom section headings for
+better organization.
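+
+In practice, isort rewrites import blocks into this shape, using the configured
+section headings (module names are illustrative; `requests` stands in for any
+third-party dependency):
+
+```python
+# Standard libraries
+import os
+from pathlib import Path
+
+# 3pps
+import requests
+
+# Own modules
+from my_module import core
+```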
+
+### deadcode: `[tool.deadcode]`
+
+Detects unused code in the project:
+
+```toml
+[tool.deadcode]
+exclude = [".venv", ".uv-cache", "tests"]
+```
-This grouping ensures that import statements remain organized and easy to navigate,
-facilitating collaboration across multiple developers and modules.
+Excludes virtual environments and test directories from dead code analysis.
diff --git a/docs/index.md b/docs/index.md
index 9f48e68..d80e699 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -3,80 +3,81 @@