Merged
23 changes: 20 additions & 3 deletions .github/workflows/tests.yml
@@ -6,13 +6,26 @@ on:
    branches: [ main ]

jobs:
  tests:
  linters:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        just-trigger:
          - "fmt"
          - "lint"
          - "tests"
          - "mypy"
    steps:
      - uses: actions/checkout@v4
      - uses: extractions/setup-just@v2
      - uses: astral-sh/setup-uv@v4
      - run: uv python install 3.8
      - run: uv sync --all-extras --dev
      - run: just ${{ matrix.just-trigger }}

  tests:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version:
          - "3.8"
          - "3.9"
@@ -26,4 +39,8 @@ jobs:
      - uses: astral-sh/setup-uv@v4
      - run: uv python install ${{ matrix.python-version }}
      - run: uv sync --all-extras --dev
      - run: just ${{ matrix.just-trigger }}
      - run: just tests
      - uses: coverallsapp/github-action@master
        with:
          github-token: ${{ secrets.github_token }}
          path-to-lcov: tests.lcov
1 change: 1 addition & 0 deletions .gitignore
@@ -44,6 +44,7 @@ htmlcov/
.nox/
.coverage
.coverage.*
*.lcov
.cache
nosetests.xml
coverage.xml
2 changes: 1 addition & 1 deletion LICENSE
@@ -1,6 +1,6 @@
MIT License

Copyright (c) 2024 KODE LLC
Copyright (c) 2025 KODE LLC

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
259 changes: 207 additions & 52 deletions README.md
@@ -5,19 +5,11 @@ Python 3 library to write simple asynchronous health checks (probes).
[![Ruff](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/astral-sh/ruff/main/assets/badge/v2.json)](https://github.com/astral-sh/ruff)
[![PyPI](https://img.shields.io/pypi/v/probirka.svg)](https://pypi.python.org/pypi/probirka)
[![PyPI](https://img.shields.io/pypi/dm/probirka.svg)](https://pypi.python.org/pypi/probirka)
[![Coverage Status](https://coveralls.io/repos/github/appKODE/probirka/badge.svg?branch=polish-docs)](https://coveralls.io/github/appKODE/probirka?branch=polish-docs)

## Overview

Probirka is a Python library designed to facilitate the creation of simple asynchronous health checks, also known as probes. It allows you to define custom probes to monitor the health of various services and components in your application.

## Features

- Simple API for defining asynchronous health checks
- Support for custom probes
- Integration with FastAPI and aiohttp
- Ability to add custom information to health check results
- Grouping of probes for selective execution
- Timeout support for probes
Probirka is a lightweight and flexible Python library for implementing asynchronous health checks in your applications. It provides a simple yet powerful API for monitoring the health of various components and services, making it ideal for microservices architectures, containerized applications, and distributed systems.

## Installation

@@ -27,97 +19,260 @@ Install Probirka using pip:
pip install probirka
```

## Usage
## Quick Start

Here is a simple example of how to use Probirka to create an asynchronous health check:
Here is a simple example of how to use Probirka to create health checks using decorators:

```python
import asyncio
from probirka import Probe, HealthCheck
from probirka import Probirka

# Create a Probirka instance
probirka = Probirka()

# Add some custom information
probirka.add_info("version", "1.0.0")
probirka.add_info("environment", "production")

# Define health checks using decorators
@probirka.add(name="database") # This probe will always run
async def check_database():
    # Simulate a database check
    await asyncio.sleep(1)
    return True

@probirka.add(groups=["cache"]) # This probe will only run when cache group is requested
async def check_cache():
    # Simulate a cache check
    await asyncio.sleep(1)
    return False # Simulate a failed check

class DatabaseProbe(Probe):
    async def check(self):
@probirka.add(groups=["external"]) # This probe will only run when external group is requested
def check_external_service():
    # Synchronous check example
    return True

async def main():
    # Run only required probes (without groups)
    basic_results = await probirka.run()
    print("Basic check results:", basic_results)

    # Run required probes + cache group probes
    cache_results = await probirka.run(with_groups=["cache"])
    print("Cache check results:", cache_results)

    # Run required probes + multiple groups
    full_results = await probirka.run(with_groups=["cache", "external"])
    print("Full check results:", full_results)

if __name__ == "__main__":
    asyncio.run(main())
```

Alternatively, you can create custom probes by inheriting from the `ProbeBase` class:

```python
from probirka import Probirka, ProbeBase
import asyncio

class DatabaseProbe(ProbeBase):
    async def _check(self):
        # Simulate a database check
        await asyncio.sleep(1)
        return True

class CacheProbe(Probe):
    async def check(self):
class CacheProbe(ProbeBase):
    async def _check(self):
        # Simulate a cache check
        await asyncio.sleep(1)
        return False # Simulate a failed check

async def main():
    health_check = HealthCheck(probes=[DatabaseProbe(), CacheProbe()])
    health_check.add_info("version", "1.0.0")
    health_check.add_info("environment", "production")
    results = await health_check.run()
    probirka = Probirka(probes=[DatabaseProbe(), CacheProbe()])
    probirka.add_info("version", "1.0.0")
    probirka.add_info("environment", "production")
    results = await probirka.run()
    print(results)

if __name__ == "__main__":
    asyncio.run(main())
```

This example defines two probes, `DatabaseProbe` and `CacheProbe`, and runs them as part of a health check. The `CacheProbe` simulates a failed check. Additionally, it adds custom user data using the `add_info` method.
## Advanced Usage

### Creating Custom Probes

Example output:
You can create custom probes by inheriting from the `ProbeBase` class:

```python
HealthCheckResult(
    ok=False,
    info={'version': '1.0.0', 'environment': 'production'},
    started_at=datetime.datetime(2023, 10, 10, 10, 0, 0),
    total_elapsed=datetime.timedelta(seconds=1),
    checks=[
        ProbeResult(name='DatabaseProbe', ok=True, elapsed=datetime.timedelta(seconds=1)),
        ProbeResult(name='CacheProbe', ok=False, elapsed=datetime.timedelta(seconds=1))
    ]
)
from probirka import ProbeBase
import asyncio

class CustomProbe(ProbeBase):
    def __init__(self, name="CustomProbe"):
        super().__init__(name=name)

    async def _check(self):
        # Implement your health check logic here
        return True
```

## Integration with FastAPI
### Adding Metadata to Probes

You can integrate Probirka with FastAPI as follows:
You can add metadata to your probes:

```python
from probirka import ProbeBase
import asyncio

class DatabaseProbe(ProbeBase):
    async def _check(self):
        await asyncio.sleep(1)
        self.add_info("connection_pool_size", 10)
        self.add_info("active_connections", 5)
        return True
```

The added information will be included in the probe results and can be accessed through the `info` field of each probe result. This is useful for providing additional context about the probe's state or performance metrics.

### Grouping Probes

Probes can be organized into required and optional groups. Probes without groups are always executed, while probes with groups are only executed when explicitly requested:

```python
import asyncio
from probirka import Probirka

# Create a Probirka instance
probirka = Probirka()

# Required probe (will always run)
@probirka.add(name="database")
async def check_database():
    await asyncio.sleep(1)
    return True

# Optional probes (will only run when their groups are requested)
@probirka.add(groups=["cache"])
async def check_cache():
    await asyncio.sleep(1)
    return True

@probirka.add(groups=["external"])
async def check_external_service():
    return True

async def main():
    # Run only required probes (database)
    basic_results = await probirka.run()
    print("Basic check results:", basic_results)

    # Run required probes + cache group
    cache_results = await probirka.run(with_groups=["cache"])
    print("Cache check results:", cache_results)

    # Run required probes + multiple groups
    full_results = await probirka.run(with_groups=["cache", "external"])
    print("Full check results:", full_results)

if __name__ == "__main__":
    asyncio.run(main())
```

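The selection rule described above — group-less probes always run, grouped probes run only when requested — can be sketched in a few lines of plain Python. This is a simplified model for illustration, not probirka's actual implementation; `ProbeSpec` and `select_probes` are hypothetical names:

```python
from dataclasses import dataclass, field
from typing import List, Sequence

@dataclass
class ProbeSpec:
    # Hypothetical stand-in for a registered probe and its groups
    name: str
    groups: List[str] = field(default_factory=list)

def select_probes(probes: Sequence[ProbeSpec], with_groups: Sequence[str] = ()) -> List[ProbeSpec]:
    # Probes with no groups are required and always selected; grouped
    # probes are selected only if one of their groups was requested.
    requested = set(with_groups)
    return [p for p in probes if not p.groups or requested & set(p.groups)]

probes = [
    ProbeSpec("database"),
    ProbeSpec("cache", groups=["cache"]),
    ProbeSpec("external", groups=["external"]),
]

print([p.name for p in select_probes(probes)])             # ['database']
print([p.name for p in select_probes(probes, ["cache"])])  # ['database', 'cache']
```
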
### Setting Timeouts

You can set timeouts for individual probes:

```python
from probirka import ProbeBase
import asyncio

class SlowProbe(ProbeBase):
    async def _check(self):
        await asyncio.sleep(2)  # This will timeout
        return True

probe = SlowProbe(timeout=1.0)  # 1 second timeout
```

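A per-probe timeout like this is the standard `asyncio.wait_for` pattern; the sketch below shows the general mechanism using only the standard library (an illustration of the concept, not probirka's actual code):

```python
import asyncio

async def run_with_timeout(check, timeout):
    # Run a check coroutine, treating a timeout as a failed probe.
    try:
        return await asyncio.wait_for(check(), timeout=timeout)
    except asyncio.TimeoutError:
        return False

async def slow_check():
    await asyncio.sleep(0.5)  # Slower than the timeout below
    return True

print(asyncio.run(run_with_timeout(slow_check, timeout=0.1)))  # False
```
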
### Caching Results

```python
from typing import Optional
from probirka import Probirka, ProbeBase
import asyncio

# Create a Probirka instance with global caching settings
probirka = Probirka(success_ttl=60, failed_ttl=10) # Cache successful results for 60s, failed for 10s

# Add a probe with custom caching settings
@probirka.add(success_ttl=300) # Cache successful results for 5 minutes
async def check_database():
    # Simulate a database check
    await asyncio.sleep(1)
    return True

# Or create a custom probe with caching
class DatabaseProbe(ProbeBase):
    def __init__(self, success_ttl: Optional[int] = None, failed_ttl: Optional[int] = None):
        super().__init__(success_ttl=success_ttl, failed_ttl=failed_ttl)

    async def _check(self) -> bool:
        # Simulate a database check
        await asyncio.sleep(1)
        return True
```

The caching mechanism works as follows:
- If `success_ttl` is set, successful results will be cached for the specified number of seconds
- If `failed_ttl` is set, failed results will be cached for the specified number of seconds
- If both are set to `None` (default), no caching will be performed
- Global settings on the `Probirka` instance can be overridden by individual probe settings

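The TTL behavior described above can be modeled with `time.monotonic` — a simplified sketch of the idea, not probirka's implementation (`CachedResult` is a hypothetical name):

```python
import time

class CachedResult:
    # Minimal model of success/failure TTL caching for a probe result.
    def __init__(self, success_ttl=None, failed_ttl=None):
        self.success_ttl = success_ttl
        self.failed_ttl = failed_ttl
        self._value = None
        self._expires_at = None

    def get(self):
        # Return the cached result if it has not expired, else None.
        if self._expires_at is not None and time.monotonic() < self._expires_at:
            return self._value
        return None

    def set(self, ok):
        # Pick the TTL based on the outcome; no TTL means no caching.
        ttl = self.success_ttl if ok else self.failed_ttl
        self._value = ok
        self._expires_at = time.monotonic() + ttl if ttl is not None else None

cache = CachedResult(success_ttl=60, failed_ttl=10)
cache.set(True)
print(cache.get())  # True — served from cache for the next 60 seconds
```
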
## Integration Examples

### FastAPI Integration

```python
from fastapi import FastAPI
from probirka import Probirka, make_fastapi_endpoint

app = FastAPI()

probirka_instance = Probirka()
fastapi_endpoint = make_fastapi_endpoint(probirka_instance)

app.add_api_route("/run", fastapi_endpoint)
# Define some health checks
@probirka_instance.add(name="api")
async def check_api():
    return True


# Create and add the endpoint
fastapi_endpoint = make_fastapi_endpoint(probirka_instance)
app.add_route("/health", fastapi_endpoint)

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="0.0.0.0", port=8000)
```

## Integration with aiohttp

You can integrate Probirka with aiohttp as follows:
### aiohttp Integration

```python
from aiohttp import web
from probirka import Probirka, make_aiohttp_endpoint

app = web.Application()

probirka_instance = Probirka()
aiohttp_endpoint = make_aiohttp_endpoint(probirka_instance)

app.router.add_get('/run', aiohttp_endpoint)
# Define some health checks
@probirka_instance.add(name="api")
async def check_api():
    return True

# Create and add the endpoint
aiohttp_endpoint = make_aiohttp_endpoint(probirka_instance)
app.router.add_get('/health', aiohttp_endpoint)

if __name__ == '__main__':
    web.run_app(app)
```

## Contributing

Contributions are welcome! Please open an issue or submit a pull request on GitHub.

## License

This project is licensed under the MIT License.
2 changes: 1 addition & 1 deletion justfile
@@ -17,4 +17,4 @@ fix:
    uv run ruff check --fix --unsafe-fixes {{ SOURCE_PATH }}

tests:
    uv run pytest tests/
    uv run pytest --cov=probirka --cov-report lcov:tests.lcov tests/