3.0.2
### Changelog:
* Feature(backend): Setup filename for ANSIBLE_STRING plugin.
* Feature(backend): Send `object_name` of models in hooks.
* Fix(backend): Allow to configure rpc backend and broker using environment variables in ``dockerrun``.
* Fix(package): Do not store compressed files.
* Chore(deps): Update rtd, prod and tests dependencies for backend.
* Docs: Add more information about architecture, plugins and centrifugo settings.

See merge request polemarch/ce!307
onegreyonewhite committed Apr 21, 2023
2 parents 4e12a13 + 39cced2 commit 13ddb7c
Showing 12 changed files with 128 additions and 89 deletions.
39 changes: 36 additions & 3 deletions doc/config.rst
@@ -20,6 +20,35 @@ Polemarch nodes to maintain reliability or speedup things. It will give you
understanding of services, which are included into Polemarch and how to distribute them
between the nodes to reach your goal.

Project architecture
--------------------

Polemarch was created to adapt to any work environment. Almost every service can be easily replaced by another
without losing any functionality. The application architecture consists of the following elements:

- **Database** - any database type and version supported by Django can be used. The code is written to be vendor
  agnostic in order to support as many backends as possible. The database stores project settings, task schedules and
  templates, execution history, authorization data, etc. Database performance is the key performance limitation of the entire Polemarch.

- **Cache** service is used to store session data, service locks, etc. Here, too, Polemarch supports everything Django can.
  For small and medium clusters we mostly recommend Redis.

- **MQ** (rpc engine) is required to notify the celery worker about new task execution requests.
  In most cases Redis can process up to 1000 executions/min. For more complex and high-load installations,
  a distributed RabbitMQ cluster is recommended. Where technically possible,
  AWS SQS and its compatible counterparts from other cloud providers are also supported.

- **Centrifugo** (optional) is used for interactive user notifications. The service notifies clients
  that the data structure a user is currently viewing has been updated or changed, so the interface knows
  when to refresh. This reduces the load on the database, because without this service
  the interface polls the API periodically on a timer.

- **Project storage** is currently a directory in the filesystem where Polemarch clones or unarchives project files for further executions.
  The storage must be readable by the web server and writable by the celery worker; it can be a directory mounted from shared storage.

Knowing which services the Polemarch application consists of, you can build any service architecture
suitable for your circumstances and infrastructure.

.. _cluster:

Polemarch clustering overview
@@ -151,7 +180,7 @@ If you want to use LDAP protocol, you should create next settings in section ``[
ldap-default-domain is an optional argument, that is aimed to make user authorization easier
(without input of domain name).

ldap-auth_format is an optional argument, that is aimed to customize LDAP authorization.
ldap-auth_format is an optional argument, that is aimed to customize LDAP authorization request.
Default value: cn=<username>,<domain>

So in this case authorization logic will be the following:
@@ -334,7 +363,6 @@ session_timeout, static_files_url or pagination limit.

* **session_timeout** - Session life-cycle time. ``Default: 2w`` (two weeks).
* **rest_page_limit** - Default limit of objects in API list. ``Default: 1000``.
* **public_openapi** - Allow to have access to OpenAPI schema from public. ``Default: false``.
* **history_metrics_window** - Timeframe in seconds of collecting execution history statuses. ``Default: 1min``.

.. note:: You can find more Web options in :ref:`vstutils:web`.
@@ -353,7 +381,8 @@ When user change some data, other clients get notification on ``subscriptions_update``
with model label and primary key. Without the service all GUI-clients get page data
every 5 seconds (by default). Centrifugo server v3 is supported.

* **address** - Centrifugo server address.
* **address** - Centrifugo API address. For example, ``http://localhost:8000/api``.
* **public_address** - Centrifugo server address. By default, **address** without the ``/api`` prefix is used (http -> ws, https -> wss). A relative path, like ``/centrifugo``, can also be used.
* **api_key** - API key for clients.
* **token_hmac_secret_key** - API key for jwt-token generation.
* **timeout** - Connection timeout.
@@ -364,6 +393,10 @@ every 5 seconds (by default). Centrifugo server v3 is supported.
``token_hmac_secret_key`` is used for jwt-token generation (based on
session expiration time). Token will be used for Centrifugo-JS client.

.. note::
    ``api_key`` and ``token_hmac_secret_key`` must match the values in Centrifugo's ``config.json``.
    Read more in the `official Centrifugo documentation <https://centrifugal.dev/docs/3/getting-started/quickstart>`_.
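
To make the token side concrete, here is an illustrative sketch (not taken from the Polemarch codebase):
a Centrifugo v3 connection token is an HMAC-signed JWT whose ``sub`` claim identifies the user and whose
``exp`` claim limits its lifetime. With PyJWT it could be generated roughly like this:

.. sourcecode:: python

    # Illustrative only: Polemarch builds such tokens internally and ties `exp`
    # to the session expiration time; the values below are made up.
    import time
    import jwt  # pip install pyjwt

    token_hmac_secret_key = 'same value as token_hmac_secret_key in Centrifugo config.json'

    connection_token = jwt.encode(
        {
            'sub': '42',                        # user identifier
            'exp': int(time.time()) + 60 * 60,  # here: one hour
        },
        token_hmac_secret_key,
        algorithm='HS256',
    )
    print(connection_token)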


.. _git:

15 changes: 13 additions & 2 deletions doc/gui.rst
@@ -662,7 +662,7 @@ As you can see, compared to `POLEMARCH_DB` inventory, this one is state managed.

.. image:: new_screenshots/inventory_state_ansible_string.png

These types of inventory stores an extension of file, its body and specifies either file should be executable or not.
This type of inventory stores the file extension, body and filename, and specifies whether the file should be executable or not.
Let's edit the state. Click :guilabel:`Edit` button:

.. image:: new_screenshots/inventory_state_edit_ansible_string.png
@@ -671,7 +671,10 @@ After saving:

.. image:: new_screenshots/inventory_state_ansible_string_2.png

Now inventory is ready for using..
Now the inventory is ready for use.

This inventory type stores a ready-made file that is placed into the project directory at the time the task is run.
It is helpful when you need to use one of the dynamic Ansible inventory plugins.


ANSIBLE_FILE inventory
@@ -696,6 +699,14 @@ executed.

Done. Inventory is ready for use.

This inventory type uses a file that already exists in the project directory at the time the task is run.
It is helpful when you have a GitOps infrastructure where the project files are the single source of truth.

.. note::
    Specifying an inventory as a plain path string when describing a launch will be deprecated in future releases
    and removed from the interface.
    This inventory type is a direct replacement for such strings.


Import inventory
----------------
53 changes: 11 additions & 42 deletions doc/installation.rst
@@ -284,67 +284,36 @@ Settings
Main section
~~~~~~~~~~~~

* **POLEMARCH_DEBUG** - status of debug mode. Default value: `false`.
* **DEBUG** - status of debug mode. Default value: `false`.

* **POLEMARCH_LOG_LEVEL** - log level. Default value: `WARNING`.
* **DJANGO_LOG_LEVEL** - log level. Default value: `WARNING`.

* **TIMEZONE** - timezone. Default value: `UTC`.

Database section
~~~~~~~~~~~~~~~~

You can set Database environment variables in two ways:
Set up the database connection via ``django-environ``: :ref:`environ:environ-env-db-url`.

1. Using ``django-environ``: :ref:`environ:environ-env-db-url`.

For example for mysql, **DATABASE_URL** = ``'mysql://user:password@host:port/dbname'``.
Read more about ``django-environ`` in the :doc:`official django-environ documentation <environ:types>`.

2. Or you can specify every variable, but this way is deprecated and we won't support it in the next release.

If you not set **POLEMARCH_DB_HOST**, default database would be SQLite3, path to database file: `/db.sqlite3`.
If you set **POLEMARCH_DB_HOST**, Polemarch would be use MYSQL with next variables:

* **POLEMARCH_DB_TYPE** - name of database type. Support: `mysql` and `postgres` database. Needed only with **POLEMARCH_DB_HOST** option.

* **POLEMARCH_DB_NAME** - name of database.

* **POLEMARCH_DB_USER** - user connected to database.

* **POLEMARCH_DB_PASSWORD** - password for connection to database.

* **POLEMARCH_DB_HOST** - host for connection to database.

* **POLEMARCH_DB_PORT** - port for connection to database.

Database. Options section
~~~~~~~~~~~~~~~~~~~~~~~~~

.. note:: If you use :ref:`environ:environ-env-db-url`, you can't use **DB_INIT_CMD**.

* **DB_INIT_CMD** - command to start your database
For example, for MySQL: **DATABASE_URL** = ``'mysql://user:password@host:port/dbname'``.
Read more about ``django-environ`` in the :doc:`official django-environ documentation <environ:types>`.

Cache
~~~~~

For cache environment variables you can also use ``django-environ`` - :ref:`environ:environ-env-cache-url`.

For example for redis, **CACHE_URL** = ``redis://host:port/dbname``.

Or you can specify variable **CACHE_LOCATION**, but this way is deprecated and we won't support it in the next release.

* **CACHE_LOCATION** - path to cache, if you use `/tmp/polemarch_django_cache` path, then cache engine would be `FileBasedCache`,
else `MemcacheCache`. Default value: ``/tmp/polemarch_django_cache``.
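
As a rough illustration (not Polemarch's actual settings code), ``django-environ`` turns the ``DATABASE_URL``
and ``CACHE_URL`` values shown above into Django ``DATABASES``/``CACHES`` dictionaries:

.. sourcecode:: python

    # Minimal sketch of how django-environ resolves these URLs;
    # the example URLs and host names are made up.
    import environ

    env = environ.Env()

    DATABASES = {'default': env.db_url('DATABASE_URL', default='mysql://user:password@host:3306/dbname')}
    CACHES = {'default': env.cache_url('CACHE_URL', default='redis://redis-server:6379/0')}

    print(DATABASES['default']['ENGINE'])   # -> django.db.backends.mysql
    print(CACHES['default']['BACKEND'])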


RPC section
~~~~~~~~~~~

* **RPC_ENGINE** - connection to rpc service. If not set, not used.
* **POLEMARCH_RPC_ENGINE** - connection string for the rpc service. If not set, a temporary directory is used as the broker.

* **POLEMARCH_RPC_RESULT_BACKEND** - connection string for the rpc results backend. Defaults to the engine connection.

* **RPC_HEARTBEAT** - Timeout for RPC. Default value: `5`.
* **POLEMARCH_RPC_HEARTBEAT** - Timeout for RPC. Default value: `5`.

* **RPC_CONCURRENCY** - Number of concurrently tasks. Default value: `4`.
* **POLEMARCH_RPC_CONCURRENCY** - Number of concurrently executed tasks. Default value: `4`.

Web section
~~~~~~~~~~~
@@ -376,7 +345,7 @@ Run Polemarch with Memcache and RabbitMQ and SQLite3. Polemarch log-level=INFO,

.. sourcecode:: bash

docker run -d --name polemarch --restart always -v /opt/polemarch/projects:/projects -v /opt/polemarch/hooks:/hooks --env RPC_ENGINE=amqp://polemarch:polemarch@rabbitmq-server:5672/polemarch --env CACHE_URL=memcache://memcached-server:11211/ --env POLEMARCH_LOG_LEVEL=INFO --env SECRET_KEY=mysecretkey -p 8080:8080 vstconsulting/polemarch
docker run -d --name polemarch --restart always -v /opt/polemarch/projects:/projects -v /opt/polemarch/hooks:/hooks --env POLEMARCH_RPC_ENGINE=amqp://polemarch:polemarch@rabbitmq-server:5672/polemarch --env CACHE_URL=memcache://memcached-server:11211/ --env POLEMARCH_LOG_LEVEL=INFO --env SECRET_KEY=mysecretkey -p 8080:8080 vstconsulting/polemarch


Also, you can use a `.env` file with all the variables you want to pass when running docker:
2 changes: 1 addition & 1 deletion polemarch/__init__.py
@@ -31,6 +31,6 @@
"VST_ROOT_URLCONF": os.getenv("VST_ROOT_URLCONF", 'vstutils.urls'),
}

__version__ = "3.0.1"
__version__ = "3.0.2"

prepare_environment(**default_settings)
20 changes: 15 additions & 5 deletions polemarch/main/hooks/base.py
@@ -3,6 +3,13 @@


class BaseHook:
    __slots__ = (
        'when_types',
        'hook_object',
        'when',
        'conf',
    )

    def __init__(self, hook_object, when_types=None, **kwargs):
        self.when_types = when_types or []
        self.hook_object = hook_object
@@ -31,13 +38,16 @@ def modify_message(self, message):
    def execute(self, recipient, when, message): # nocv
        raise NotImplementedError

    def execute_many(self, recipients, when, message) -> str:
        return '\n'.join(map(lambda r: self.execute(r, when, message), recipients))

    def send(self, message, when: str) -> str:
        self.when = when
        filtered = filter(lambda r: r, self.conf['recipients'])
        execute = self.execute
        message = self.modify_message(message)
        mapping = map(lambda r: execute(r, when, message), filtered)
        return '\n'.join(mapping)
        return self.execute_many(
            recipients=filter(lambda r: r, self.conf['recipients']),
            when=when,
            message=self.modify_message(message),
        )

    def on_execution(self, message):
        return self.send(message, when='on_execution')
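
For context, here is a standalone sketch of the fan-out pattern this refactoring introduces (the names mirror
``BaseHook``, but this is an illustration rather than the module's real code): ``send()`` shapes the message once,
while ``execute_many()`` maps ``execute()`` over the recipients and joins the results.

.. sourcecode:: python

    # Standalone illustration of the send()/execute_many()/execute() split.
    class MiniHook:
        __slots__ = ('conf',)

        def __init__(self, recipients):
            self.conf = {'recipients': recipients}

        def modify_message(self, message):
            return message

        def execute(self, recipient, when, message) -> str:
            return f'[{when}] {recipient}: {message}'

        def execute_many(self, recipients, when, message) -> str:
            return '\n'.join(self.execute(r, when, message) for r in recipients)

        def send(self, message, when: str) -> str:
            return self.execute_many(
                recipients=filter(None, self.conf['recipients']),
                when=when,
                message=self.modify_message(message),
            )

    print(MiniHook(['alice', '', 'bob']).send({'status': 'OK'}, when='on_execution'))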
2 changes: 2 additions & 0 deletions polemarch/main/hooks/http.py
@@ -9,6 +9,8 @@


class Backend(BaseHook):
    __slots__ = ()

    def execute(self, url, when, message) -> str: # pylint: disable=arguments-renamed
        data = dict(type=when, payload=message)
        try:
1 change: 1 addition & 0 deletions polemarch/main/hooks/script.py
@@ -12,6 +12,7 @@


class Backend(BaseHook):
    __slots__ = ()

    def execute(self, script, when, file) -> str: # pylint: disable=arguments-renamed
        try:
18 changes: 11 additions & 7 deletions polemarch/main/models/__init__.py
@@ -52,18 +52,22 @@ def send_hook(when: Text, target: Any) -> None:
@raise_context()
def send_user_hook(when: Text, instance: Any) -> None:
    send_hook(
        when, OrderedDict(
            user_id=instance.id,
            username=instance.username,
            admin=instance.is_staff
        )
        when, {
            'user_id': instance.id,
            'username': instance.username,
            'admin': instance.is_staff,
        }
    )


@raise_context()
def send_polemarch_models(when: Text, instance: Any, **kwargs) -> None:
    target = OrderedDict(id=instance.id, name=instance.name, **kwargs)
    send_hook(when, target)
    send_hook(when, {
        'id': instance.id,
        'name': instance.name,
        'object_name': instance._meta.verbose_name, # pylint: disable=protected-access
        **kwargs,
    })


def raise_linked_error(exception_class=ValidationError, **kwargs):
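
To make the new ``object_name`` field concrete, a hypothetical hook message after this change could look as follows
(the id, name and the ``on_object_add`` label are invented for illustration); the HTTP hook backend shown above then
posts it wrapped as ``{'type': when, 'payload': message}``:

.. sourcecode:: python

    # Hypothetical payload built by send_polemarch_models(); values are made up.
    message = {
        'id': 7,
        'name': 'demo-project',
        'object_name': 'project',  # taken from instance._meta.verbose_name
    }

    # The HTTP hook backend (polemarch/main/hooks/http.py) wraps it like this:
    request_body = {'type': 'on_object_add', 'payload': message}
    print(request_body)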
17 changes: 11 additions & 6 deletions polemarch/plugins/inventory/ansible.py
@@ -14,13 +14,18 @@
from django.db import transaction
from rest_framework import fields as drffields
from vstutils.api import fields as vstfields
from vstutils.api.validators import RegularExpressionValidator
from .base import BasePlugin
from ...main.constants import HiddenVariablesEnum, CYPHER
from ...main.utils import AnsibleInventoryParser
from ...main.models.hosts import Host, Group
from ...main.models.vars import AbstractVarsQuerySet


class FilenameValidator(RegularExpressionValidator):
    regexp = re.compile(r"^([\d\w\-_\.])*$", re.MULTILINE)


class InventoryDumper(Dumper):
    """
    Yaml Dumper class for PyYAML
@@ -183,6 +188,7 @@ class AnsibleString(BaseAnsiblePlugin):

    serializer_fields = {
        'body': vstfields.FileInStringField(),
        'filename': vstfields.CharField(required=False, allow_blank=True, default='', validators=[FilenameValidator]),
        'extension': vstfields.AutoCompletionField(autocomplete=('yaml', 'ini', 'json'), default='yaml'),
        'executable': drffields.BooleanField(default=False),
    }
@@ -191,13 +197,14 @@
    }
    defaults = {
        'body': '',
        'filename': '',
        'extension': 'yaml',
        'executable': False,
    }

    def render_inventory(self, instance, execution_dir) -> Tuple[Path, list]:
        state_data = instance.inventory_state.data
        filename = f'inventory_{uuid1()}'
        filename = state_data.get('filename') or f'inventory_{uuid1()}'

        if state_data['extension']:
            filename += f'.{state_data["extension"]}'
@@ -213,14 +220,12 @@ def render_inventory(self, instance, execution_dir) -> Tuple[Path, list]:
    def import_inventory(cls, instance, data):
        loaded = orjson.loads(data['file']) # pylint: disable=no-member
        media_type = loaded['mediaType'] or ''
        extension = mimetypes.guess_extension(media_type, strict=False) or ''
        if extension != '':
            extension = extension.replace('.', '', 1)
        elif '.' in loaded['name']:
            extension = loaded['name'].rsplit('.', maxsplit=1)[-1]
        path_name = Path(loaded['name'])
        extension = (mimetypes.guess_extension(media_type, strict=False) or path_name.suffix)[1:]
        body = base64.b64decode(loaded['content']).decode('utf-8')
        instance.update_inventory_state(data={
            'body': body,
            'filename': path_name.stem,
            'extension': extension,
            'executable': body.startswith('#!/'),
        })
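
Two details of this plugin change can be checked in isolation (a standalone sketch with made-up inputs, not the
plugin's own code): which names the ``FilenameValidator`` pattern accepts, and how the rewritten import derives the
extension and the stored ``filename`` from the media type and the uploaded file name.

.. sourcecode:: python

    import mimetypes
    import re
    from pathlib import Path

    # Same pattern as FilenameValidator.regexp.
    pattern = re.compile(r"^([\d\w\-_\.])*$", re.MULTILINE)
    for name in ('hosts', 'my-inventory.aws_ec2', 'sub/dir', 'spa ce'):
        print(name, bool(pattern.match(name)))  # the last two are rejected

    # Extension/stem derivation as in the new import_inventory() body.
    media_type = ''                              # empty when the media type is unknown
    path_name = Path('inventory.aws_ec2.yml')
    extension = (mimetypes.guess_extension(media_type, strict=False) or path_name.suffix)[1:]
    print(extension, path_name.stem)             # -> yml inventory.aws_ec2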
2 changes: 1 addition & 1 deletion requirements-doc.txt
@@ -1,4 +1,4 @@
# Docs
-rrequirements.txt
vstutils[doc]~=5.4.0
vstutils[doc]~=5.4.2
sphinxcontrib-openapi~=0.7.0
2 changes: 1 addition & 1 deletion requirements.txt
@@ -1,5 +1,5 @@
# Main
vstutils[rpc,prod]~=5.4.0
vstutils[rpc,prod]~=5.4.2
markdown2~=2.4.8

# Repo types
