feat: crowdsec plugin
Morriz committed Feb 22, 2024
1 parent 7bdb461 commit 658d638
Showing 18 changed files with 372 additions and 47 deletions.
3 changes: 2 additions & 1 deletion .env.sample
@@ -14,4 +14,5 @@ LETSENCRYPT_EMAIL=admin@example.com
# Uncomment next line for prod certs:
LETSENCRYPT_STAGING=1

# API_KEY=your_api_key
# Main api key for itsUP
API_KEY=
1 change: 1 addition & 0 deletions .gitignore
@@ -5,6 +5,7 @@ __pycache__
.venv
certs/*
db.yml
proxy/docker-compose.yml
proxy/nginx/*.conf
proxy/traefik/*.yml
upstream
12 changes: 12 additions & 0 deletions .vscode/launch.json
@@ -16,6 +16,18 @@
"PYTHONPATH": "${workspaceRoot}"
}
},
{
"name": "Python: write artifacts",
"type": "debugpy",
"request": "launch",
"program": "bin/write-artifacts.py",
"console": "integratedTerminal",
"justMyCode": false,
"envFile": "${workspaceFolder}/.env",
"env": {
"PYTHONPATH": "${workspaceRoot}"
}
},
{
"name": "Python Debugger: Current File",
"type": "debugpy",
5 changes: 4 additions & 1 deletion .vscode/settings.json
@@ -41,5 +41,8 @@
"files.exclude": {
"**/.history": true
},
"python.testing.unittestEnabled": true
"python.testing.unittestEnabled": true,
"[dotenv]": {
"editor.defaultFormatter": "foxundermoon.shell-format"
}
}
41 changes: 31 additions & 10 deletions README.md
@@ -4,24 +4,25 @@

## Fully managed docker compose infra

Single machine multi docker compose architecture with a low cpu/storage footprint and near-zero<sup>*</sup> downtime.
Single machine multi docker compose architecture with a low cpu/storage footprint and near-zero<sup>\*</sup> downtime.

It runs two nginx proxies in series (proxy -> terminate) to be able to:

- terminate SSL/TLS
- do SSL/TLS passthrough
- target many upstream endpoints

**Advantages:**

- shared docker network: encrypted intra-node communication (over a shared network named `proxynet`)
- near-zero-downtime*
- near-zero-downtime\*

*<sup>*</sup>Near-zero-downtime?*
_<sup>\*</sup>Near-zero-downtime?_

Well, all (stateless) nodes that get rotated do not incur downtime, yet nginx needs a reload signal. During low traffic that allows for graceful handling of outstanding http sessions and no downtime, but it may be problematic if nginx has to wait for a large number of open sessions. In that case a timeout will force the last open sessions to be terminated.
This is a very reliable and cheap way to achieve near-zero downtime.

*But what about stateful services?*
_But what about stateful services?_

It is surely possible to deploy stateful services, but those MUST NOT be targeted with the `entrypoint: xxx` prop: entrypoint services are rolled over with `docker rollout` by the automation and thus MUST be stateless. Updating stateful services is up to you, but it's a breeze compared to local installs, as you can just use docker compose commands.

@@ -46,7 +47,7 @@ Install everything and start the proxy and api so that we can receive incoming c
1. `bin/install.sh`: installs all project deps.
2. `bin/start-all.sh`: starts the proxy and the api server.
3. `bin/apply.py`: applies all of `db.yml`.
4. 4. `bin/api-logs.sh`: tail the output of the api server.
4. `bin/api-logs.sh`: tail the output of the api server.

### Adding an upstream service

@@ -55,9 +56,28 @@ Install everything and start the proxy and api so that we can receive incoming c

### Adding a passthrough endpoint

1. Edit `db.yml` and add your service(s), which now need `name`, `domain` and `passthrough: true`.
1. Add a project without `entrypoint:` and one service, which needs `name`, `domain` and `passthrough: true`.
2. Run `bin/apply.py` to roll out the changes.
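
The two steps above can be sketched in `db.yml`; the domain is hypothetical and the layout follows the `db.yml.sample` shipped in this commit:

```yaml
projects:
  - description: Home Assistant passthrough
    domain: home.example.com
    services:
      - name: home-assistant
        domain: home.example.com
        passthrough: true
```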

### Adding a local (host) endpoint

1. Add a project without `entrypoint:` and one service, which needs only `name` and `domain`.
2. Run `bin/apply.py` to roll out the changes.
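
A minimal sketch for such a host endpoint (names hypothetical), per the two required props above:

```yaml
projects:
  - description: app running on the docker host itself
    domain: host-app.example.com
    services:
      - name: host-app
        domain: host-app.example.com
```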

### Plugins

You can enable and configure plugins in `db.yml`. Right now we support the following:

#### CrowdSec

[CrowdSec](https://www.crowdsec.net) can run as a container via the [crowdsec-bouncer-traefik-plugin](https://github.com/maxlerebourg/crowdsec-bouncer-traefik-plugin). First set `enabled: true`, run `bin/write-artifacts.py`, and bring up the stack (or just the `crowdsec` container). Then generate a bouncer api key:

```sh
docker compose exec crowdsec cscli bouncers add crowdsecBouncer
```

Put the resulting api key in the plugin configuration in `db.yml` and apply with `bin/apply.py`.
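
The key printed by `cscli` goes under the plugin entry in `db.yml`, assuming the layout from `db.yml.sample` in this commit (the key value is a placeholder):

```yaml
plugins:
  crowdsec:
    enabled: true
    version: v1.2.0
    apikey: <key printed by cscli>
```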

### Api & OpenApi spec

The API allows OpenAPI-compatible clients to manage this stack (ChatGPT works wonders).
@@ -71,13 +91,15 @@ Exception: Only github webhook endpoints (check for annotation `@app.hooks.regis
#### Webhooks

Webhooks are used for the following:

1. to receive updates to this repo, which will result in a `git pull` and `bin/apply.py` to update any changes in the code. The provided project with `name: itsUP` is used for that, so DON'T delete it if you care about automated updates to this repo.
2. to receive incoming github webhooks (or GET requests to `/update-upstream?project=bla&service=dida`) that result in rolling up of a project or specific service only.

One GitHub webhook listening to `workflow_job`s is provided, which needs:

- the hook you register in the GitHub project to end with `/hook?project=bla&service=dida` (`service` is optional), and its `github_secret` set to the `API_KEY` from `.env`.
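
For illustration, the resulting hook URL (the itsUP host is hypothetical; the path and query params follow the description above):

```shell
# hypothetical itsUP host; path and query params as described above
base="https://itsup.example.com"
hook_url="$base/hook?project=bla&service=dida"
echo "$hook_url"
# the GET variant from the previous section triggers the same rollout:
# curl "$base/update-upstream?project=bla&service=dida"
```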

I mainly use GitHub workflows and created webhooks for my individual projects, so I can just manage all webhooks in one place.
I mainly use GitHub workflows and created webhooks for my individual projects, so I can just manage all webhooks in one place.

## Dev/ops tools

@@ -95,7 +117,6 @@ I don't want to switch folders/terminals all the time and want to keep history o
### Scripts

- `bin/update-certs.py`: pull certs and reload the proxy if any certs were created or updated. You could run this in a crontab every week if you want to stay up to date.
- `bin/write-artifacts.py`: after updating `db.yml` yo ucan run this script to check new artifacts.
- `bin/write-artifacts.py`: after updating `db.yml` you can run this script to generate new artifacts.
- `bin/validate-db.py`: after manually editing `db.yml` please run this (it is also run from `bin/write-artifacts.py`)
- `bin/requirements-update.sh`: You may want to update requirements once in a while ;)

17 changes: 17 additions & 0 deletions db.yml.sample
@@ -1,3 +1,20 @@
plugins:
crowdsec:
enabled: false
version: v1.2.0
# apikey:
options:
logLevel: INFO
updateIntervalSeconds: 60
defaultDecisionSeconds: 60
httpTimeoutSeconds: 10
crowdsecCapiMachineId: login
crowdsecCapiPassword: password
crowdsecCapiScenarios:
- crowdsecurity/http-path-traversal-probing
- crowdsecurity/http-xss-probing
- crowdsecurity/http-generic-bf

projects:
- description: Home Assistant passthrough
domain: home.example.com
42 changes: 31 additions & 11 deletions lib/data.py
@@ -1,12 +1,13 @@
from atexit import register
from logging import debug, info
from typing import Callable, Dict, List
from typing import Any, Callable, Dict, List, Union, cast

import yaml

from lib.models import Env, Project, Service
from lib.models import Env, Plugin, PluginRegistry, Project, Service


def get_db() -> Dict[str, List[Dict[str, str]]]:
def get_db() -> Dict[str, List[Dict[str, Any]] | Dict[str, Any]]:
"""Get the db"""
with open("db.yml", encoding="utf-8") as f:
return yaml.safe_load(f)
@@ -16,18 +17,43 @@ def validate_db() -> None:
"""Validate db.yml contents"""
debug("Validating db.yml")
db = get_db()
plugins_raw = cast(Dict[str, Any], db["plugins"])
for name, plugin in plugins_raw.items():
p = {"name": name, **plugin}
Plugin.model_validate(p)
for project in db["projects"]:
Project.model_validate(project)


def get_plugin_registry() -> PluginRegistry:
"""Get plugin registry."""
debug("Getting plugin registry")
db = get_db()
plugins_raw = cast(Dict[str, Any], db["plugins"])
return PluginRegistry(**plugins_raw)


def get_plugins(filter: Callable[[Plugin], bool] | None = None) -> List[Plugin]:
"""Get all plugins. Optionally filter the results."""
debug("Getting plugins" + (f" with filter {filter}" if filter else ""))
registry = get_plugin_registry()
plugins = []
for name, p in registry:
plugin = p.model_copy(update={"name": name})
if not filter or filter(plugin):
plugins.append(plugin)
return plugins
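
As a stdlib-only illustration of the name-plus-config iteration and optional-filter pattern used by `get_plugins` above (these are not the project's actual pydantic models):

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Optional

@dataclass
class Plugin:
    name: str
    enabled: bool = False

def get_plugins(registry: Dict[str, Plugin],
                filter: Optional[Callable[[Plugin], bool]] = None) -> List[Plugin]:
    # keep a plugin when no filter is given, or when the filter accepts it
    return [p for p in registry.values() if not filter or filter(p)]

registry = {
    "crowdsec": Plugin(name="crowdsec", enabled=True),
    "dummy": Plugin(name="dummy", enabled=False),
}
print([p.name for p in get_plugins(registry, lambda p: p.enabled)])  # → ['crowdsec']
```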


def get_projects(filter: Callable[[Project, Service], bool] | None = None) -> List[Project]:
"""Get all projects. Optionally filter the results."""
debug("Getting projects" + (f" with filter {filter}" if filter else ""))
db = get_db()
projects_raw = cast(List[Dict[str, Any]], db["projects"])
ret = []
for p_json in db["projects"]:
for project in projects_raw:
services = []
p = Project(**p_json)
p = Project(**project)
for s in p.services.copy():
if not filter or filter(p, s):
services.append(s)
@@ -83,7 +109,6 @@ def get_service(project: str | Project, service: str, throw: bool = True) -> Ser
"""Get a project's service by name"""
debug(f"Getting service {service} in project {project.name if isinstance(project, Project) else project}")
p = get_project(project, throw) if isinstance(project, str) else project
assert p is not None
for item in p.services:
if item.name == service:
return item
@@ -98,17 +123,14 @@ def get_env(project: str | Project, service: str) -> Dict[str, str]:
"""Get a project's env by name"""
debug(f"Getting env for service {service} in project {project.name if isinstance(project, Project) else project}")
service = get_service(project, service)
assert service is not None
return service.env


def upsert_env(project: str | Project, service: str, env: Env) -> None:
"""Upsert the env of a service"""
p = get_project(project) if isinstance(project, str) else project
debug(f"Upserting env for service {service} in project {p.name}: {env.model_dump_json()}")
assert p is not None
s = get_service(p, service)
assert s is not None
s.env = Env(**(s.env.model_dump() | env.model_dump()))
upsert_service(project, s)

@@ -117,9 +139,7 @@ def upsert_service(project: str | Project, service: Service) -> None:
"""Upsert a service"""
p = get_project(project) if isinstance(project, str) else project
debug(f"Upserting service {service.name} in project {p.name}: {service}")
assert p is not None
for i, s in enumerate(p.services):
assert s is not None
if s.name == service.name:
p.services[i] = service
break
2 changes: 1 addition & 1 deletion lib/data_test.py
@@ -35,7 +35,7 @@ def test_write_projects(self, mock_open: Mock, mock_yaml: Mock) -> None:
mock_open.assert_called_once_with("db.yml", "w", encoding="utf-8")

# Assert that the mock functions were called correctly
mock_yaml.dump.assert_called_once_with(test_db, mock_open())
mock_yaml.dump.assert_called_once_with({"projects": test_db["projects"]}, mock_open())

# Get projects with filter
@mock.patch(
21 changes: 21 additions & 0 deletions lib/models.py
@@ -8,6 +8,27 @@ class Env(BaseModel):
model_config = ConfigDict(extra="allow")


class Plugin(BaseModel):
"""Plugin model"""

enabled: bool = False
"""Whether or not the plugin is enabled"""
apikey: str | None = None
"""The API key to use for the plugin, if the plugin requires one"""
name: str | None = None
"""The name of the plugin"""
description: str | None = None
"""A description of the plugin"""
options: Dict[str, Any] = {}
"""A dictionary of options to pass to the plugin"""


class PluginRegistry(BaseModel):
"""Plugin registry"""

crowdsec: Plugin


class Service(BaseModel):
"""Service model"""

40 changes: 32 additions & 8 deletions lib/proxy.py
@@ -5,7 +5,7 @@
from dotenv import load_dotenv
from jinja2 import Template

from lib.data import get_project, get_projects
from lib.data import get_plugin_registry, get_project, get_projects
from lib.utils import run_command

load_dotenv()
@@ -82,9 +82,12 @@ def write_routers() -> None:
t = f.read()
tpl_routers_web = Template(t)
domain = os.environ.get("TRAEFIK_DOMAIN")
traefik_admin = os.environ.get("TRAEFIK_ADMIN")
routers_web = tpl_routers_web.render(
projects=projects, traefik_rule=f"Host(`{domain}`)", traefik_admin=traefik_admin
projects=projects,
traefik_rule=f"Host(`{domain}`)",
traefik_admin=os.environ.get("TRAEFIK_ADMIN"),
plugin_registry=get_plugin_registry(),
trusted_ips_cidrs=os.environ.get("TRUSTED_IPS_CIDRS").split(","),
)
with open("proxy/traefik/routers-web.yml", "w", encoding="utf-8") as f:
f.write(routers_web)
@@ -94,28 +97,49 @@ def write_routers() -> None:
routers_tcp = tpl_routers_tcp.render(projects=projects, traefik_rule=f"HostSNI(`{domain}`)")
with open("proxy/traefik/routers-tcp.yml", "w", encoding="utf-8") as f:
f.write(routers_tcp)


def write_config() -> None:
with open("proxy/tpl/config-tcp.yml.j2", encoding="utf-8") as f:
t = f.read()
tpl_config_tcp = Template(t)
trusted_ips_cidr = os.environ.get("TRUSTED_IPS_CIDRS").split(",")
config_tcp = tpl_config_tcp.render(trusted_ips_cidr=trusted_ips_cidr)
trusted_ips_cidrs = os.environ.get("TRUSTED_IPS_CIDRS").split(",")
config_tcp = tpl_config_tcp.render(trusted_ips_cidrs=trusted_ips_cidrs)
with open("proxy/traefik/config-tcp.yml", "w", encoding="utf-8") as f:
f.write(config_tcp)
with open("proxy/tpl/config-web.yml.j2", encoding="utf-8") as f:
t = f.read()
tpl_config_web = Template(t)
le_email = os.environ.get("LETSENCRYPT_EMAIL")
le_staging = os.environ.get("LETSENCRYPT_STAGING")
config_web = tpl_config_web.render(trusted_ips_cidr=trusted_ips_cidr, le_email=le_email, le_staging=le_staging)
plugin_registry = get_plugin_registry()
has_plugins = any(plugin.enabled for _, plugin in plugin_registry)
config_web = tpl_config_web.render(
trusted_ips_cidrs=trusted_ips_cidrs,
le_email=os.environ.get("LETSENCRYPT_EMAIL"),
le_staging=bool(os.environ.get("LETSENCRYPT_STAGING")),
plugin_registry=plugin_registry,
has_plugins=has_plugins,
)
with open("proxy/traefik/config-web.yml", "w", encoding="utf-8") as f:
f.write(config_web)


def write_compose() -> None:
plugin_registry = get_plugin_registry()
with open("proxy/tpl/docker-compose.yml.j2", encoding="utf-8") as f:
t = f.read()
tpl_compose = Template(t)
compose = tpl_compose.render(plugin_registry=plugin_registry)
with open("proxy/docker-compose.yml", "w", encoding="utf-8") as f:
f.write(compose)


def write_proxies() -> None:
write_maps()
write_proxy()
write_terminate()
write_routers()
write_config()
write_compose()


def update_proxy(
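
Each `write_*` helper above follows the same render-template-then-write pattern; a stdlib-only sketch using `string.Template` instead of Jinja2 (template text and context are made up):

```python
from string import Template

def render_artifact(tpl_text: str, context: dict) -> str:
    # substitute $placeholders; the real code uses Jinja2, which adds loops/conditionals
    return Template(tpl_text).substitute(context)

tpl = "domain: $domain\nplugins_enabled: $has_plugins\n"
out = render_artifact(tpl, {"domain": "example.com", "has_plugins": True})
print(out)
```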