diff --git a/.gitignore b/.gitignore index 23ae5f8a..19759a66 100644 --- a/.gitignore +++ b/.gitignore @@ -106,3 +106,5 @@ ENV/ # Intellij IDEa / PyCharm / etc. .idea + +docs/api diff --git a/HISTORY.md b/HISTORY.md index 43eac2c7..914317ee 100644 --- a/HISTORY.md +++ b/HISTORY.md @@ -17,3 +17,10 @@ History * Completed the Poetry Support * added actions for documentation + +0.0.3 (2021-05-24) +------------------ + +* renamed classes and removed Minos prefix +* Integration of Command and CommandReply with Handler + diff --git a/Makefile b/Makefile index b52b142f..5f0e6217 100644 --- a/Makefile +++ b/Makefile @@ -70,11 +70,8 @@ reformat: ## check code coverage quickly with the default Python poetry run isort --recursive minos tests docs: ## generate Sphinx HTML documentation, including API docs - rm -f docs/minos_microservice_network.rst - rm -f docs/modules.rst - poetry run sphinx-apidoc -o docs/api minos - poetry run $(MAKE) -C docs clean - poetry run $(MAKE) -C docs html + rm -rf docs/api + poetry run $(MAKE) -C docs clean html servedocs: docs ## compile the docs watching for changes watchmedo shell-command -p '*.rst' -c '$(MAKE) -C docs html' -R -D . diff --git a/docs/authors.rst b/docs/authors.rst index e122f914..bcfd9cb0 100644 --- a/docs/authors.rst +++ b/docs/authors.rst @@ -1 +1 @@ -.. include:: ../AUTHORS.rst +.. include:: ../AUTHORS.md diff --git a/docs/conf.py b/docs/conf.py index 74f694f0..8e353769 100755 --- a/docs/conf.py +++ b/docs/conf.py @@ -19,6 +19,7 @@ # import os import sys + sys.path.insert(0, os.path.abspath('..')) from minos import networks @@ -31,7 +32,12 @@ # Add any Sphinx extension module names here, as strings. They can be # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom ones. -extensions = ['sphinx.ext.autodoc', 'sphinx.ext.viewcode'] +extensions = [ + "sphinxcontrib.apidoc", + 'sphinx.ext.autodoc', + "sphinx_autodoc_typehints", + 'sphinx.ext.viewcode', +] # Add any paths that contain templates here, relative to this directory. templates_path = ['_templates'] @@ -158,5 +164,24 @@ 'Miscellaneous'), ] +## "apidoc" extension +apidoc_module_dir = "../minos" +apidoc_output_dir = "api" +apidoc_separate_modules = True +autodoc_default_options = { + "inherited-members": True, + "special-members": "__init__", + "undoc-members": True, +} + +apidoc_toc_file = False +apidoc_module_first = True +apidoc_extra_args = [ + "--force", + "--implicit-namespaces", +] +## "autodoc typehints" extension +set_type_checking_flag = True +typehints_fully_qualified = True diff --git a/docs/history.rst b/docs/history.rst index 25064996..fcd2eb2d 100644 --- a/docs/history.rst +++ b/docs/history.rst @@ -1 +1 @@ -.. include:: ../HISTORY.rst +.. include:: ../HISTORY.md diff --git a/docs/index.rst b/docs/index.rst index ec34085e..8cfd21f2 100644 --- a/docs/index.rst +++ b/docs/index.rst @@ -7,7 +7,7 @@ Welcome to Minos Microservice Networks's documentation! readme usage - api/modules + modules authors history diff --git a/docs/modules.rst b/docs/modules.rst new file mode 100644 index 00000000..9201e6d9 --- /dev/null +++ b/docs/modules.rst @@ -0,0 +1,8 @@ +======= +Modules +======= + +.. toctree:: + :maxdepth: 2 + + api/minos diff --git a/docs/readme.rst b/docs/readme.rst index 72a33558..bdff72a8 100644 --- a/docs/readme.rst +++ b/docs/readme.rst @@ -1 +1 @@ -.. include:: ../README.rst +.. 
include:: ../README.md diff --git a/minos/networks/__init__.py b/minos/networks/__init__.py index 940925ca..f34bf1f4 100644 --- a/minos/networks/__init__.py +++ b/minos/networks/__init__.py @@ -5,40 +5,43 @@ Minos framework can not be copied and/or distributed without the express permission of Clariteia SL. """ -__version__ = "0.0.2" +__version__ = "0.0.3" -from .broker import ( - MinosCommandBroker, - MinosEventBroker, - MinosQueueDispatcher, - MinosQueueService, +from .brokers import ( + CommandBroker, + CommandReplyBroker, + EventBroker, + Producer, + ProducerService, ) from .exceptions import ( MinosNetworkException, MinosPreviousVersionSnapshotException, MinosSnapshotException, ) -from .handler import ( - MinosCommandHandlerDispatcher, - MinosCommandHandlerServer, - MinosCommandPeriodicService, - MinosCommandReplyHandlerDispatcher, - MinosCommandReplyHandlerServer, - MinosCommandReplyPeriodicService, - MinosCommandReplyServerService, - MinosCommandServerService, - MinosEventHandlerDispatcher, - MinosEventHandlerServer, - MinosEventPeriodicService, - MinosEventServerService, - MinosHandlerSetup, +from .handlers import ( + CommandConsumer, + CommandConsumerService, + CommandHandler, + CommandHandlerService, + CommandReplyConsumer, + CommandReplyConsumerService, + CommandReplyHandler, + CommandReplyHandlerService, + Consumer, + EventConsumer, + EventConsumerService, + EventHandler, + EventHandlerService, + Handler, + HandlerSetup, ) -from .rest_interface import ( - REST, - RestInterfaceHandler, +from .rest import ( + RestBuilder, + RestService, ) from .snapshots import ( - MinosSnapshotDispatcher, - MinosSnapshotEntry, - MinosSnapshotService, + SnapshotBuilder, + SnapshotEntry, + SnapshotService, ) diff --git a/minos/networks/broker/__init__.py b/minos/networks/brokers/__init__.py similarity index 58% rename from minos/networks/broker/__init__.py rename to minos/networks/brokers/__init__.py index e229ea70..7e7e7b72 100644 --- a/minos/networks/broker/__init__.py +++ b/minos/networks/brokers/__init__.py @@ -5,15 +5,21 @@ Minos framework can not be copied and/or distributed without the express permission of Clariteia SL. 
""" -from .commands import ( - MinosCommandBroker, +from .abc import ( + Broker, +) +from .command_replies import ( + CommandReplyBroker, ) -from .dispatchers import ( - MinosQueueDispatcher, +from .commands import ( + CommandBroker, ) from .events import ( - MinosEventBroker, + EventBroker, +) +from .producers import ( + Producer, ) from .services import ( - MinosQueueService, + ProducerService, ) diff --git a/minos/networks/broker/abc.py b/minos/networks/brokers/abc.py similarity index 82% rename from minos/networks/broker/abc.py rename to minos/networks/brokers/abc.py index b2c01d9e..07170c1c 100644 --- a/minos/networks/broker/abc.py +++ b/minos/networks/brokers/abc.py @@ -13,15 +13,16 @@ ) from typing import ( NoReturn, + Optional, ) from minos.common import ( - MinosBaseBroker, + MinosBroker, PostgreSqlMinosDatabase, ) -class MinosBrokerSetup(PostgreSqlMinosDatabase): +class BrokerSetup(PostgreSqlMinosDatabase): """Minos Broker Setup Class""" async def _setup(self) -> NoReturn: @@ -31,14 +32,14 @@ async def _create_broker_table(self) -> NoReturn: await self.submit_query(_CREATE_TABLE_QUERY) -class MinosBroker(MinosBaseBroker, MinosBrokerSetup, ABC): +class Broker(MinosBroker, BrokerSetup, ABC): """Minos Broker Class.""" ACTION: str - def __init__(self, topic: str, *args, **kwargs): - MinosBaseBroker.__init__(self, topic) - MinosBrokerSetup.__init__(self, *args, **kwargs) + def __init__(self, topic: Optional[str] = None, *args, **kwargs): + super().__init__(*args, **kwargs) + self.topic = topic async def _send_bytes(self, topic: str, raw: bytes) -> int: params = (topic, raw, 0, self.ACTION, datetime.now(), datetime.now()) diff --git a/minos/networks/brokers/command_replies.py b/minos/networks/brokers/command_replies.py new file mode 100644 index 00000000..f2e59665 --- /dev/null +++ b/minos/networks/brokers/command_replies.py @@ -0,0 +1,64 @@ +""" +Copyright (C) 2021 Clariteia SL + +This file is part of minos framework. + +Minos framework can not be copied and/or distributed without the express permission of Clariteia SL. +""" +from __future__ import ( + annotations, +) + +from typing import ( + Optional, +) + +from minos.common import ( + CommandReply, + MinosConfig, + MinosModel, +) + +from .abc import ( + Broker, +) + + +class CommandReplyBroker(Broker): + """Minos Command Broker Class.""" + + ACTION = "commandReply" + + def __init__(self, *args, saga_id: str, task_id: str, **kwargs): + super().__init__(*args, **kwargs) + self.saga_id = saga_id + self.task_id = task_id + + @classmethod + def _from_config(cls, *args, config: MinosConfig, **kwargs) -> CommandReplyBroker: + return cls(*args, **config.saga.queue._asdict(), **kwargs) + + async def send( + self, + items: list[MinosModel], + topic: Optional[str] = None, + saga_id: Optional[str] = None, + task_id: Optional[str] = None, + **kwargs + ) -> int: + """Send a list of ``Aggregate`` instances. + + :param items: A list of aggregates. + :param topic: Topic in which the message will be published. + :param saga_id: Saga identifier. + :param task_id: Saga execution identifier. + :return: This method does not return anything. 
+ """ + if topic is None: + topic = self.topic + if saga_id is None: + saga_id = self.saga_id + if task_id is None: + task_id = self.task_id + command_reply = CommandReply(topic=topic, items=items, saga_id=saga_id, task_id=task_id) + return await self._send_bytes(command_reply.topic, command_reply.avro_bytes) diff --git a/minos/networks/broker/commands.py b/minos/networks/brokers/commands.py similarity index 50% rename from minos/networks/broker/commands.py rename to minos/networks/brokers/commands.py index 7d80d943..9874a459 100644 --- a/minos/networks/broker/commands.py +++ b/minos/networks/brokers/commands.py @@ -14,17 +14,17 @@ ) from minos.common import ( - Aggregate, Command, MinosConfig, + MinosModel, ) from .abc import ( - MinosBroker, + Broker, ) -class MinosCommandBroker(MinosBroker): +class CommandBroker(Broker): """Minos Command Broker Class.""" ACTION = "command" @@ -36,25 +36,34 @@ def __init__(self, *args, saga_id: str, task_id: str, reply_on: str, **kwargs): self.task_id = task_id @classmethod - def from_config(cls, *args, config: MinosConfig = None, **kwargs) -> Optional[MinosCommandBroker]: - """Build a new repository from config. - :param args: Additional positional arguments. - :param config: Config instance. If `None` is provided, default config is chosen. - :param kwargs: Additional named arguments. - :return: A `MinosRepository` instance. - """ - if config is None: - config = MinosConfig.get_default() - if config is None: - return None - # noinspection PyProtectedMember + def _from_config(cls, *args, config: MinosConfig, **kwargs) -> CommandBroker: return cls(*args, **config.commands.queue._asdict(), **kwargs) - async def send(self, items: list[Aggregate]) -> int: + async def send( + self, + items: list[MinosModel], + topic: Optional[str] = None, + saga_id: Optional[str] = None, + task_id: Optional[str] = None, + reply_on: Optional[str] = None, + **kwargs + ) -> int: """Send a list of ``Aggregate`` instances. :param items: A list of aggregates. + :param topic: Topic in which the message will be published. + :param saga_id: Saga identifier. + :param task_id: Saga execution identifier. + :param reply_on: Topic name in which the reply will be published. :return: This method does not return anything. """ - command = Command(self.topic, items, self.saga_id, self.task_id, self.reply_on) + if topic is None: + topic = self.topic + if saga_id is None: + saga_id = self.saga_id + if task_id is None: + task_id = self.task_id + if reply_on is None: + reply_on = self.reply_on + command = Command(topic, items, saga_id, task_id, reply_on) return await self._send_bytes(command.topic, command.avro_bytes) diff --git a/minos/networks/broker/events.py b/minos/networks/brokers/events.py similarity index 50% rename from minos/networks/broker/events.py rename to minos/networks/brokers/events.py index b19b587a..0665755c 100644 --- a/minos/networks/broker/events.py +++ b/minos/networks/brokers/events.py @@ -20,35 +20,27 @@ ) from .abc import ( - MinosBroker, + Broker, ) -class MinosEventBroker(MinosBroker): +class EventBroker(Broker): """Minos Event broker class.""" ACTION = "event" @classmethod - def from_config(cls, *args, config: MinosConfig = None, **kwargs) -> Optional[MinosEventBroker]: - """Build a new repository from config. - :param args: Additional positional arguments. - :param config: Config instance. If `None` is provided, default config is chosen. - :param kwargs: Additional named arguments. - :return: A `MinosRepository` instance. 
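The hunks above rework the broker `send` API so that the topic and the saga identifiers can be supplied per call instead of only at construction time. A minimal usage sketch, assuming `from_config` (inherited from `minos.common`) forwards extra keyword arguments to the `_from_config` hooks shown here; `config`, the topic names and the `tickets` payload are placeholders:

```python
from minos.networks import CommandReplyBroker, EventBroker


async def publish(config, tickets):
    # Brokers are built through ``from_config`` and keep an optional default topic.
    event_broker = EventBroker.from_config(config=config)
    await event_broker.setup()
    # ``send`` now takes the topic per call instead of requiring it in the constructor.
    await event_broker.send(tickets, topic="TicketAdded")
    await event_broker.destroy()

    reply_broker = CommandReplyBroker.from_config(config=config, saga_id="saga-1", task_id="task-1")
    await reply_broker.setup()
    # Per-call keyword arguments override the defaults captured at construction time.
    await reply_broker.send(tickets, topic="AddTicketReply", task_id="task-2")
    await reply_broker.destroy()
```

As before, `send` only enqueues the serialized message in the broker table; the renamed `Producer` / `ProducerService` pair is what drains that queue and publishes to Kafka.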
- """ - if config is None: - config = MinosConfig.get_default() - if config is None: - return None - # noinspection PyProtectedMember + def _from_config(cls, *args, config: MinosConfig, **kwargs) -> EventBroker: return cls(*args, **config.events.queue._asdict(), **kwargs) - async def send(self, items: list[Aggregate]) -> int: + async def send(self, items: list[Aggregate], topic: Optional[str] = None, **kwargs) -> int: """Send a list of ``Aggregate`` instances. :param items: A list of aggregates. + :param topic: Topic in which the message will be published. :return: This method does not return anything. """ - event = Event(self.topic, items) + if topic is None: + topic = self.topic + event = Event(topic, items) return await self._send_bytes(event.topic, event.avro_bytes) diff --git a/minos/networks/broker/dispatchers.py b/minos/networks/brokers/producers.py similarity index 82% rename from minos/networks/broker/dispatchers.py rename to minos/networks/brokers/producers.py index 77e2f8ba..56bf8d29 100644 --- a/minos/networks/broker/dispatchers.py +++ b/minos/networks/brokers/producers.py @@ -13,7 +13,6 @@ AsyncIterator, NamedTuple, NoReturn, - Optional, ) from aiokafka import ( @@ -22,15 +21,14 @@ from minos.common import ( MinosConfig, - MinosConfigException, ) from .abc import ( - MinosBrokerSetup, + BrokerSetup, ) -class MinosQueueDispatcher(MinosBrokerSetup): +class Producer(BrokerSetup): """Minos Queue Dispatcher Class.""" # noinspection PyUnresolvedReferences @@ -42,18 +40,7 @@ def __init__(self, *args, queue: NamedTuple, broker, **kwargs): self.broker = broker @classmethod - def from_config(cls, *args, config: MinosConfig = None, **kwargs) -> Optional[MinosQueueDispatcher]: - """Build a new repository from config. - :param args: Additional positional arguments. - :param config: Config instance. If `None` is provided, default config is chosen. - :param kwargs: Additional named arguments. - :return: A `MinosRepository` instance. - """ - if config is None: - config = MinosConfig.get_default() - if config is None: - raise MinosConfigException("The config object must be setup.") - # noinspection PyProtectedMember + def _from_config(cls, *args, config: MinosConfig, **kwargs) -> Producer: return cls(*args, **config.events._asdict(), **kwargs) async def dispatch(self) -> NoReturn: diff --git a/minos/networks/broker/services.py b/minos/networks/brokers/services.py similarity index 88% rename from minos/networks/broker/services.py rename to minos/networks/brokers/services.py index c861c060..8e070eb3 100644 --- a/minos/networks/broker/services.py +++ b/minos/networks/brokers/services.py @@ -17,17 +17,17 @@ MinosConfig, ) -from .dispatchers import ( - MinosQueueDispatcher, +from .producers import ( + Producer, ) -class MinosQueueService(PeriodicService): +class ProducerService(PeriodicService): """Minos QueueDispatcherService class.""" def __init__(self, config: MinosConfig = None, **kwargs): super().__init__(**kwargs) - self.dispatcher = MinosQueueDispatcher.from_config(config=config) + self.dispatcher = Producer.from_config(config=config) async def start(self) -> None: """Method to be called at the startup by the internal ``aiomisc`` loigc. diff --git a/minos/networks/handler/__init__.py b/minos/networks/handler/__init__.py deleted file mode 100644 index 3e8f8c70..00000000 --- a/minos/networks/handler/__init__.py +++ /dev/null @@ -1,28 +0,0 @@ -""" -Copyright (C) 2021 Clariteia SL - -This file is part of minos framework. 
- -Minos framework can not be copied and/or distributed without the express permission of Clariteia SL. -""" -from .abc import ( - MinosHandlerSetup, -) -from .command import ( - MinosCommandHandlerDispatcher, - MinosCommandHandlerServer, - MinosCommandPeriodicService, - MinosCommandServerService, -) -from .command_reply import ( - MinosCommandReplyHandlerDispatcher, - MinosCommandReplyHandlerServer, - MinosCommandReplyPeriodicService, - MinosCommandReplyServerService, -) -from .event import ( - MinosEventHandlerDispatcher, - MinosEventHandlerServer, - MinosEventPeriodicService, - MinosEventServerService, -) diff --git a/minos/networks/handler/abc.py b/minos/networks/handler/abc.py deleted file mode 100644 index 82f1261d..00000000 --- a/minos/networks/handler/abc.py +++ /dev/null @@ -1,60 +0,0 @@ -""" -Copyright (C) 2021 Clariteia SL - -This file is part of minos framework. - -Minos framework can not be copied and/or distributed without the express permission of Clariteia SL. -""" -from typing import ( - NoReturn, -) - -import aiopg - -from minos.common import ( - MinosSetup, -) - - -class MinosHandlerSetup(MinosSetup): - """Minos Broker Setup Class""" - - def __init__( - self, - table_name: str, - *args, - host: str = None, - port: int = None, - database: str = None, - user: str = None, - password: str = None, - **kwargs - ): - super().__init__(*args, **kwargs) - - self.host = host - self.port = port - self.database = database - self.user = user - self.password = password - self.table_name = table_name - - async def _setup(self) -> NoReturn: - await self._create_event_queue_table() - - async def _create_event_queue_table(self) -> NoReturn: - async with self._connection() as connect: - async with connect.cursor() as cur: - await cur.execute( - 'CREATE TABLE IF NOT EXISTS "%s" (' - '"id" BIGSERIAL NOT NULL PRIMARY KEY, ' - '"topic" VARCHAR(255) NOT NULL, ' - '"partition_id" INTEGER,' - '"binary_data" BYTEA NOT NULL, ' - '"creation_date" TIMESTAMP NOT NULL);' % (self.table_name) - ) - - def _connection(self): - return aiopg.connect( - host=self.host, port=self.port, dbname=self.database, user=self.user, password=self.password, - ) diff --git a/minos/networks/handler/command/dispatcher.py b/minos/networks/handler/command/dispatcher.py deleted file mode 100644 index bab3cfdb..00000000 --- a/minos/networks/handler/command/dispatcher.py +++ /dev/null @@ -1,35 +0,0 @@ -# Copyright (C) 2020 Clariteia SL -# -# This file is part of minos framework. -# -# Minos framework can not be copied and/or distributed without the express -# permission of Clariteia SL. - -from typing import ( - Any, -) - -from minos.common import ( - Command, - MinosConfig, -) - -from ..dispatcher import ( - MinosHandlerDispatcher, -) - - -class MinosCommandHandlerDispatcher(MinosHandlerDispatcher): - - TABLE = "command_queue" - - def __init__(self, *, config: MinosConfig, **kwargs: Any): - super().__init__(table_name=self.TABLE, config=config.commands, **kwargs) - self._broker_group_name = f"event_{config.service.name}" - - def _is_valid_instance(self, value: bytes): - try: - instance = Command.from_avro_bytes(value) - return True, instance - except: # noqa E722 - return False, None diff --git a/minos/networks/handler/command_reply/__init__.py b/minos/networks/handler/command_reply/__init__.py deleted file mode 100644 index 8bba3430..00000000 --- a/minos/networks/handler/command_reply/__init__.py +++ /dev/null @@ -1,18 +0,0 @@ -""" -Copyright (C) 2021 Clariteia SL - -This file is part of minos framework. 
- -Minos framework can not be copied and/or distributed without the express permission of Clariteia SL. -""" - -from .dispatcher import ( - MinosCommandReplyHandlerDispatcher, -) -from .server import ( - MinosCommandReplyHandlerServer, -) -from .services import ( - MinosCommandReplyPeriodicService, - MinosCommandReplyServerService, -) diff --git a/minos/networks/handler/command_reply/dispatcher.py b/minos/networks/handler/command_reply/dispatcher.py deleted file mode 100644 index de5e7dd9..00000000 --- a/minos/networks/handler/command_reply/dispatcher.py +++ /dev/null @@ -1,34 +0,0 @@ -# Copyright (C) 2020 Clariteia SL -# -# This file is part of minos framework. -# -# Minos framework can not be copied and/or distributed without the express -# permission of Clariteia SL. -from typing import ( - Any, -) - -from minos.common import ( - CommandReply, - MinosConfig, -) - -from ..dispatcher import ( - MinosHandlerDispatcher, -) - - -class MinosCommandReplyHandlerDispatcher(MinosHandlerDispatcher): - - TABLE = "command_reply_queue" - - def __init__(self, *, config: MinosConfig, **kwargs: Any): - super().__init__(table_name=self.TABLE, config=config.saga, **kwargs) - self._broker_group_name = f"event_{config.service.name}" - - def _is_valid_instance(self, value: bytes): - try: - instance = CommandReply.from_avro_bytes(value) - return True, instance - except: # noqa E722 - return False, None diff --git a/minos/networks/handlers/__init__.py b/minos/networks/handlers/__init__.py new file mode 100644 index 00000000..82fd3b8c --- /dev/null +++ b/minos/networks/handlers/__init__.py @@ -0,0 +1,33 @@ +""" +Copyright (C) 2021 Clariteia SL + +This file is part of minos framework. + +Minos framework can not be copied and/or distributed without the express permission of Clariteia SL. +""" +from .abc import ( + Consumer, + Handler, + HandlerSetup, +) +from .command_replies import ( + CommandReplyConsumer, + CommandReplyConsumerService, + CommandReplyHandler, + CommandReplyHandlerService, +) +from .commands import ( + CommandConsumer, + CommandConsumerService, + CommandHandler, + CommandHandlerService, +) +from .entries import ( + HandlerEntry, +) +from .events import ( + EventConsumer, + EventConsumerService, + EventHandler, + EventHandlerService, +) diff --git a/minos/networks/handlers/abc/__init__.py b/minos/networks/handlers/abc/__init__.py new file mode 100644 index 00000000..65bf2a2e --- /dev/null +++ b/minos/networks/handlers/abc/__init__.py @@ -0,0 +1,16 @@ +""" +Copyright (C) 2021 Clariteia SL + +This file is part of minos framework. + +Minos framework can not be copied and/or distributed without the express permission of Clariteia SL. 
+""" +from .consumers import ( + Consumer, +) +from .handlers import ( + Handler, +) +from .setups import ( + HandlerSetup, +) diff --git a/minos/networks/handler/server.py b/minos/networks/handlers/abc/consumers.py similarity index 53% rename from minos/networks/handler/server.py rename to minos/networks/handlers/abc/consumers.py index d5b4a5a7..3d884d65 100644 --- a/minos/networks/handler/server.py +++ b/minos/networks/handlers/abc/consumers.py @@ -10,18 +10,15 @@ import asyncio import datetime -import typing as t from abc import ( abstractmethod, ) from typing import ( Any, - AsyncIterator, - NamedTuple, + NoReturn, Optional, ) -import aiopg from aiokafka import ( AIOKafkaConsumer, ) @@ -33,12 +30,12 @@ MinosConfig, ) -from .abc import ( - MinosHandlerSetup, +from .setups import ( + HandlerSetup, ) -class MinosHandlerServer(MinosHandlerSetup): +class Consumer(HandlerSetup): """ Handler Server @@ -46,64 +43,56 @@ class MinosHandlerServer(MinosHandlerSetup): """ - __slots__ = "_tasks", "_db_dsn", "_handlers", "_topics", "_broker_group_name" + __slots__ = "_tasks", "_handler", "_topics", "_table_name", "_broker_group_name", "_kafka_conn_data" - def __init__(self, *, table_name: str, config: NamedTuple, **kwargs: Any): + def __init__(self, *, table_name: str, config, consumer: Optional[Any] = None, **kwargs: Any): super().__init__(table_name=table_name, **kwargs, **config.queue._asdict()) - self._tasks = set() # type: t.Set[asyncio.Task] - self._db_dsn = ( - f"dbname={config.queue.database} user={config.queue.user} " - f"password={config.queue.password} host={config.queue.host}" - ) + self._tasks = set() # type: set[asyncio.Task] self._handler = {item.name: {"controller": item.controller, "action": item.action} for item in config.items} self._topics = list(self._handler.keys()) self._table_name = table_name + self._broker_group_name = None + self._kafka_conn_data = None + self.__consumer = consumer @classmethod - def from_config(cls, *args, config: MinosConfig = None, **kwargs) -> Optional[MinosHandlerServer]: - """Build a new repository from config. - :param args: Additional positional arguments. - :param config: Config instance. If `None` is provided, default config is chosen. - :param kwargs: Additional named arguments. - :return: A `MinosRepository` instance. - """ - if config is None: - config = MinosConfig.get_default() - if config is None: - return None - # noinspection PyProtectedMember + def _from_config(cls, *args, config: MinosConfig, **kwargs) -> Consumer: return cls(*args, config=config, **kwargs) - async def queue_add(self, topic: str, partition: int, binary: bytes): - """Insert row to event_queue table. - - Retrieves number of affected rows and row ID. + async def _setup(self) -> NoReturn: + await super()._setup() + await self._consumer.start() - Args: - topic: Kafka topic. Example: "TicketAdded" - partition: Kafka partition number. - binary: Event Model in bytes. + @property + def _consumer(self) -> AIOKafkaConsumer: + if self.__consumer is None: # pragma: no cover + self.__consumer = AIOKafkaConsumer( + *self._topics, group_id=self._broker_group_name, bootstrap_servers=self._kafka_conn_data, + ) + return self.__consumer - Returns: - Affected rows and queue ID. + async def _destroy(self) -> NoReturn: + await self._consumer.stop() + await super()._destroy() - Example: 1, 12 + async def dispatch(self) -> NoReturn: + """Perform a dispatching step. - Raises: - Exception: An error occurred inserting record. + :return: This method does not return anything. 
""" + await self.handle_message(self._consumer) - async with aiopg.create_pool(self._db_dsn) as pool: - async with pool.acquire() as connect: - async with connect.cursor() as cur: - await cur.execute( - _INSERT_QUERY, (AsIs(self._table_name), topic, partition, binary, datetime.datetime.now(),), - ) + async def handle_message(self, consumer: Any) -> NoReturn: + """Message consumer. - queue_id = await cur.fetchone() - affected_rows = cur.rowcount + It consumes the messages and sends them for processing. - return affected_rows, queue_id[0] + Args: + consumer: Kafka Consumer instance (at the moment only Kafka consumer is supported). + """ + + async for msg in consumer: + await self.handle_single_message(msg) async def handle_single_message(self, msg): """Handle Kafka messages. @@ -121,34 +110,36 @@ async def handle_single_message(self, msg): # check if the event binary string is well formatted if not self._is_valid_instance(msg.value): return - affected_rows, id = await self.queue_add(msg.topic, msg.partition, msg.value) - return affected_rows, id + + return await self.queue_add(msg.topic, msg.partition, msg.value) @abstractmethod def _is_valid_instance(self, value: bytes): # pragma: no cover raise Exception("Method not implemented") - async def handle_message(self, consumer: AsyncIterator): - """Message consumer. + async def queue_add(self, topic: str, partition: int, binary: bytes) -> int: + """Insert row to event_queue table. - It consumes the messages and sends them for processing. + Retrieves number of affected rows and row ID. Args: - consumer: Kafka Consumer instance (at the moment only Kafka consumer is supported). - """ + topic: Kafka topic. Example: "TicketAdded" + partition: Kafka partition number. + binary: Event Model in bytes. - async for msg in consumer: - await self.handle_single_message(msg) + Returns: + Queue ID. - @staticmethod - async def kafka_consumer(topics: list, group_name: str, conn: str): - # start the Service Event Consumer for Kafka - consumer = AIOKafkaConsumer(group_id=group_name, auto_offset_reset="latest", bootstrap_servers=conn,) + Example: 12 - await consumer.start() - consumer.subscribe(topics) + Raises: + Exception: An error occurred inserting record. 
+ """ + queue_id = await self.submit_query_and_fetchone( + _INSERT_QUERY, (AsIs(self._table_name), topic, partition, binary, datetime.datetime.now()), + ) - return consumer + return queue_id[0] _INSERT_QUERY = """ diff --git a/minos/networks/handler/dispatcher.py b/minos/networks/handlers/abc/handlers.py similarity index 51% rename from minos/networks/handler/dispatcher.py rename to minos/networks/handlers/abc/handlers.py index 360e09ad..fe3e951f 100644 --- a/minos/networks/handler/dispatcher.py +++ b/minos/networks/handlers/abc/handlers.py @@ -12,46 +12,46 @@ from abc import ( abstractmethod, ) +from datetime import ( + datetime, +) from typing import ( Any, Callable, NamedTuple, NoReturn, - Optional, ) -import aiopg - from minos.common import ( MinosConfig, + MinosModel, import_module, ) from minos.common.logs import ( log, ) -from ..exceptions import ( +from ...exceptions import ( MinosNetworkException, ) -from .abc import ( - MinosHandlerSetup, +from ..entries import ( + HandlerEntry, +) +from .setups import ( + HandlerSetup, ) -class MinosHandlerDispatcher(MinosHandlerSetup): +class Handler(HandlerSetup): """ Event Handler """ - __slots__ = "_db_dsn", "_handlers", "_event_items", "_topics", "_conf" + __slots__ = "_handlers", "_event_items", "_topics", "_conf" def __init__(self, *, table_name: str, config: NamedTuple, **kwargs: Any): super().__init__(table_name=table_name, **kwargs, **config.queue._asdict()) - self._db_dsn = ( - f"dbname={config.queue.database} user={config.queue.user} " - f"password={config.queue.password} host={config.queue.host}" - ) self._handlers = {item.name: {"controller": item.controller, "action": item.action} for item in config.items} self._event_items = config.items self._topics = list(self._handlers.keys()) @@ -59,20 +59,49 @@ def __init__(self, *, table_name: str, config: NamedTuple, **kwargs: Any): self._table_name = table_name @classmethod - def from_config(cls, *args, config: MinosConfig = None, **kwargs) -> Optional[MinosHandlerDispatcher]: - """Build a new repository from config. - :param args: Additional positional arguments. - :param config: Config instance. If `None` is provided, default config is chosen. - :param kwargs: Additional named arguments. - :return: A `MinosRepository` instance. - """ - if config is None: - config = MinosConfig.get_default() - if config is None: - return None - # noinspection PyProtectedMember + def _from_config(cls, *args, config: MinosConfig, **kwargs) -> Handler: return cls(*args, config=config, **kwargs) + async def dispatch(self) -> NoReturn: + """Event Queue Checker and dispatcher. + + It is in charge of querying the database and calling the action according to the topic. + + 1. Get periodically 10 records (or as many as defined in config > queue > records). + 2. Instantiate the action (asynchronous) by passing it the model. + 3. If the invoked function terminates successfully, remove the event from the database. + + Raises: + Exception: An error occurred inserting record. + """ + iterable = self.submit_query_and_iter( + "SELECT * FROM %s ORDER BY creation_date ASC LIMIT %d;" % (self._table_name, self._conf.queue.records), + ) + async for row in iterable: + try: + await self.dispatch_one(row) + except Exception as exc: + log.warning(exc) + continue + await self.submit_query("DELETE FROM %s WHERE id=%d;" % (self._table_name, row[0])) + + async def dispatch_one(self, row: tuple[int, str, int, bytes, datetime]) -> NoReturn: + """Dispatch one row. + + :param row: Row to be dispatched. 
+ :return: This method does not return anything. + """ + id = row[0] + topic = row[1] + callback = self.get_event_handler(row[1]) + partition_id = row[2] + data = self._build_data(row[3]) + created_at = row[4] + + entry = HandlerEntry(id, topic, callback, partition_id, data, created_at) + + await self._dispatch_one(entry) + def get_event_handler(self, topic: str) -> Callable: """Get Event instance to call. @@ -102,44 +131,10 @@ def get_event_handler(self, topic: str) -> Callable: f"topic {topic} have no controller/action configured, " f"please review th configuration file" ) - async def queue_checker(self) -> NoReturn: - """Event Queue Checker and dispatcher. - - It is in charge of querying the database and calling the action according to the topic. - - 1. Get periodically 10 records (or as many as defined in config > queue > records). - 2. Instantiate the action (asynchronous) by passing it the model. - 3. If the invoked function terminates successfully, remove the event from the database. - - Raises: - Exception: An error occurred inserting record. - """ - db_dsn = ( - f"dbname={self._conf.queue.database} user={self._conf.queue.user} " - f"password={self._conf.queue.password} host={self._conf.queue.host}" - ) - async with aiopg.create_pool(db_dsn) as pool: - async with pool.acquire() as connect: - async with connect.cursor() as cur: - await cur.execute( - "SELECT * FROM %s ORDER BY creation_date ASC LIMIT %d;" - % (self._table_name, self._conf.queue.records), - ) - async for row in cur: - call_ok = False - try: - reply_on = self.get_event_handler(topic=row[1]) - valid_instance, instance = self._is_valid_instance(row[3]) - if not valid_instance: - return - await reply_on(row[1], instance) - call_ok = True - finally: - if call_ok: - # Delete from database If the event was sent successfully to Kafka. - async with connect.cursor() as cur2: - await cur2.execute("DELETE FROM %s WHERE id=%d;" % (self._table_name, row[0])) + @abstractmethod + def _build_data(self, value: bytes) -> MinosModel: + raise NotImplementedError @abstractmethod - def _is_valid_instance(self, value: bytes): # pragma: no cover - raise Exception("Method not implemented") + async def _dispatch_one(self, row: HandlerEntry) -> NoReturn: + raise NotImplementedError diff --git a/minos/networks/handlers/abc/setups.py b/minos/networks/handlers/abc/setups.py new file mode 100644 index 00000000..18d98e73 --- /dev/null +++ b/minos/networks/handlers/abc/setups.py @@ -0,0 +1,35 @@ +""" +Copyright (C) 2021 Clariteia SL + +This file is part of minos framework. + +Minos framework can not be copied and/or distributed without the express permission of Clariteia SL. 
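With this refactor a concrete handler only has to provide two hooks: `_build_data` deserializes the payload stored by the consumer, and `_dispatch_one` processes a single `HandlerEntry`. A sketch of that contract (the `NoteHandler` name is hypothetical; its body mirrors what the concrete event handler later in this patch does):

```python
from minos.common import Event, MinosConfig
from minos.networks import Handler
from minos.networks.handlers import HandlerEntry


class NoteHandler(Handler):
    """Hypothetical handler, shown only to illustrate the new extension points."""

    TABLE = "event_queue"

    def __init__(self, *, config: MinosConfig, **kwargs):
        super().__init__(table_name=self.TABLE, config=config.events, **kwargs)

    def _build_data(self, value: bytes) -> Event:
        # Rebuild the model that the consumer stored as raw avro bytes.
        return Event.from_avro_bytes(value)

    async def _dispatch_one(self, entry: HandlerEntry) -> None:
        # ``dispatch`` resolves the configured controller action into ``entry.callback``.
        await entry.callback(entry.topic, entry.data)
```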
+""" +from typing import ( + NoReturn, +) + +from minos.common import ( + PostgreSqlMinosDatabase, +) + + +class HandlerSetup(PostgreSqlMinosDatabase): + """Minos Broker Setup Class""" + + def __init__(self, table_name: str, *args, **kwargs): + super().__init__(*args, **kwargs) + self.table_name = table_name + + async def _setup(self) -> NoReturn: + await self._create_event_queue_table() + + async def _create_event_queue_table(self) -> NoReturn: + await self.submit_query( + 'CREATE TABLE IF NOT EXISTS "%s" (' + '"id" BIGSERIAL NOT NULL PRIMARY KEY, ' + '"topic" VARCHAR(255) NOT NULL, ' + '"partition_id" INTEGER,' + '"binary_data" BYTEA NOT NULL, ' + '"creation_date" TIMESTAMP NOT NULL);' % (self.table_name) + ) diff --git a/minos/networks/handler/event/__init__.py b/minos/networks/handlers/command_replies/__init__.py similarity index 55% rename from minos/networks/handler/event/__init__.py rename to minos/networks/handlers/command_replies/__init__.py index 0b8f5274..a1a8eced 100644 --- a/minos/networks/handler/event/__init__.py +++ b/minos/networks/handlers/command_replies/__init__.py @@ -6,13 +6,13 @@ Minos framework can not be copied and/or distributed without the express permission of Clariteia SL. """ -from .dispatcher import ( - MinosEventHandlerDispatcher, +from .consumers import ( + CommandReplyConsumer, ) -from .server import ( - MinosEventHandlerServer, +from .handlers import ( + CommandReplyHandler, ) from .services import ( - MinosEventPeriodicService, - MinosEventServerService, + CommandReplyConsumerService, + CommandReplyHandlerService, ) diff --git a/minos/networks/handler/command_reply/server.py b/minos/networks/handlers/command_replies/consumers.py similarity index 87% rename from minos/networks/handler/command_reply/server.py rename to minos/networks/handlers/command_replies/consumers.py index cf03a502..0d5855dd 100644 --- a/minos/networks/handler/command_reply/server.py +++ b/minos/networks/handlers/command_replies/consumers.py @@ -14,12 +14,13 @@ MinosConfig, ) -from ..server import ( - MinosHandlerServer, +from ..abc import ( + Consumer, ) -class MinosCommandReplyHandlerServer(MinosHandlerServer): +class CommandReplyConsumer(Consumer): + """Command Reply consumer class.""" TABLE = "command_reply_queue" diff --git a/minos/networks/handlers/command_replies/handlers.py b/minos/networks/handlers/command_replies/handlers.py new file mode 100644 index 00000000..49d004fb --- /dev/null +++ b/minos/networks/handlers/command_replies/handlers.py @@ -0,0 +1,49 @@ +# Copyright (C) 2020 Clariteia SL +# +# This file is part of minos framework. +# +# Minos framework can not be copied and/or distributed without the express +# permission of Clariteia SL. 
+from typing import ( + Any, + NoReturn, +) + +from dependency_injector.wiring import ( + Provide, +) + +from minos.common import ( + CommandReply, + MinosConfig, + MinosSagaManager, +) + +from ..abc import ( + Handler, +) +from ..entries import ( + HandlerEntry, +) + + +class CommandReplyHandler(Handler): + """Command Reply Handler class.""" + + TABLE = "command_reply_queue" + + saga_manager: MinosSagaManager = Provide["saga_manager"] + + def __init__(self, *, config: MinosConfig, saga_manager: MinosSagaManager = None, **kwargs: Any): + super().__init__(table_name=self.TABLE, config=config.saga, **kwargs) + + self._broker_group_name = f"event_{config.service.name}" + + if saga_manager is not None: + self.saga_manager = saga_manager + + def _build_data(self, value: bytes) -> CommandReply: + return CommandReply.from_avro_bytes(value) + + async def _dispatch_one(self, row: HandlerEntry) -> NoReturn: + self.saga_manager.run(reply=row.data) diff --git a/minos/networks/handler/event/services.py b/minos/networks/handlers/command_replies/services.py similarity index 58% rename from minos/networks/handler/event/services.py rename to minos/networks/handlers/command_replies/services.py index 772daa31..1c828750 100644 --- a/minos/networks/handler/event/services.py +++ b/minos/networks/handlers/command_replies/services.py @@ -19,21 +19,20 @@ MinosConfig, ) -from .dispatcher import ( - MinosEventHandlerDispatcher, +from .consumers import ( + CommandReplyConsumer, ) -from .server import ( - MinosEventHandlerServer, +from .handlers import ( + CommandReplyHandler, ) -class MinosEventServerService(Service): +class CommandReplyConsumerService(Service): """Minos QueueDispatcherService class.""" def __init__(self, config: MinosConfig = None, **kwargs): super().__init__(**kwargs) - self.dispatcher = MinosEventHandlerServer.from_config(config=config) - self.consumer = None + self.dispatcher = CommandReplyConsumer.from_config(config=config) async def start(self) -> None: """Method to be called at the startup by the internal ``aiomisc`` loigc. @@ -41,35 +40,44 @@ async def start(self) -> None: :return: This method does not return anything. """ await self.dispatcher.setup() - - self.consumer = await self.dispatcher.kafka_consumer( - self.dispatcher._topics, self.dispatcher._broker_group_name, self.dispatcher._kafka_conn_data - ) - await self.dispatcher.handle_message(self.consumer) + await self.dispatcher.dispatch() async def stop(self, exception: Exception = None) -> Any: - if self.consumer is not None: - await self.consumer.stop() + """Stop the service execution. + :param exception: Optional exception that stopped the execution. + :return: This method does not return anything. + """ + await self.dispatcher.destroy() -class MinosEventPeriodicService(PeriodicService): + +class CommandReplyHandlerService(PeriodicService): """Minos QueueDispatcherService class.""" def __init__(self, config: MinosConfig = None, **kwargs): super().__init__(**kwargs) - self.dispatcher = MinosEventHandlerDispatcher.from_config(config=config) + self.dispatcher = CommandReplyHandler.from_config(config=config) async def start(self) -> None: """Method to be called at the startup by the internal ``aiomisc`` loigc. :return: This method does not return anything. """ - await super().start() await self.dispatcher.setup() + await super().start() async def callback(self) -> None: """Method to be called periodically by the internal ``aiomisc`` logic. :return:This method does not return anything. 
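The reply handler resolves its saga manager from the `saga_manager` dependency-injector provider, but it can also be injected explicitly. A short sketch in which `config` and `my_saga_manager` are placeholders, again assuming `from_config` forwards extra keyword arguments:

```python
from minos.networks import CommandReplyHandler


async def process_pending_replies(config, my_saga_manager):
    handler = CommandReplyHandler.from_config(config=config, saga_manager=my_saga_manager)
    await handler.setup()     # creates the command_reply_queue table if it is missing
    await handler.dispatch()  # calls saga_manager.run(reply=...) for each queued CommandReply
    await handler.destroy()
```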
""" - await self.dispatcher.queue_checker() + await self.dispatcher.dispatch() + + async def stop(self, err: Exception = None) -> None: + """Stop the service execution. + + :param err: Optional exception that stopped the execution. + :return: This method does not return anything. + """ + await super().stop(err) + await self.dispatcher.destroy() diff --git a/minos/networks/handler/command/__init__.py b/minos/networks/handlers/commands/__init__.py similarity index 54% rename from minos/networks/handler/command/__init__.py rename to minos/networks/handlers/commands/__init__.py index d54f41b7..41087e65 100644 --- a/minos/networks/handler/command/__init__.py +++ b/minos/networks/handlers/commands/__init__.py @@ -6,13 +6,13 @@ Minos framework can not be copied and/or distributed without the express permission of Clariteia SL. """ -from .dispatcher import ( - MinosCommandHandlerDispatcher, +from .consumers import ( + CommandConsumer, ) -from .server import ( - MinosCommandHandlerServer, +from .handlers import ( + CommandHandler, ) from .services import ( - MinosCommandPeriodicService, - MinosCommandServerService, + CommandConsumerService, + CommandHandlerService, ) diff --git a/minos/networks/handler/command/server.py b/minos/networks/handlers/commands/consumers.py similarity index 88% rename from minos/networks/handler/command/server.py rename to minos/networks/handlers/commands/consumers.py index 96224089..7072e3d8 100644 --- a/minos/networks/handler/command/server.py +++ b/minos/networks/handlers/commands/consumers.py @@ -14,12 +14,13 @@ MinosConfig, ) -from ..server import ( - MinosHandlerServer, +from ..abc import ( + Consumer, ) -class MinosCommandHandlerServer(MinosHandlerServer): +class CommandConsumer(Consumer): + """Command Consumer class.""" TABLE = "command_queue" diff --git a/minos/networks/handlers/commands/handlers.py b/minos/networks/handlers/commands/handlers.py new file mode 100644 index 00000000..30b42b4a --- /dev/null +++ b/minos/networks/handlers/commands/handlers.py @@ -0,0 +1,56 @@ +# Copyright (C) 2020 Clariteia SL +# +# This file is part of minos framework. +# +# Minos framework can not be copied and/or distributed without the express +# permission of Clariteia SL. 
+ +from typing import ( + Any, + NoReturn, +) + +from dependency_injector.wiring import ( + Provide, +) + +from minos.common import ( + Command, + MinosBroker, + MinosConfig, +) + +from ..abc import ( + Handler, +) +from ..entries import ( + HandlerEntry, +) + + +class CommandHandler(Handler): + """Command Handler class.""" + + TABLE = "command_queue" + + broker: MinosBroker = Provide["command_reply_broker"] + + def __init__(self, *, config: MinosConfig, broker: MinosBroker = None, **kwargs: Any): + super().__init__(table_name=self.TABLE, config=config.commands, **kwargs) + + self._broker_group_name = f"event_{config.service.name}" + + if broker is not None: + self.broker = broker + + def _build_data(self, value: bytes) -> Command: + return Command.from_avro_bytes(value) + + async def _dispatch_one(self, row: HandlerEntry) -> NoReturn: + command: Command = row.data + definition_id = command.saga_id + execution_id = command.task_id + + response = await row.callback(row.topic, command) + + await self.broker.send(response, topic=f"{definition_id}Reply", saga_id=definition_id, task_id=execution_id) diff --git a/minos/networks/handler/command/services.py b/minos/networks/handlers/commands/services.py similarity index 59% rename from minos/networks/handler/command/services.py rename to minos/networks/handlers/commands/services.py index c8795122..674aac6c 100644 --- a/minos/networks/handler/command/services.py +++ b/minos/networks/handlers/commands/services.py @@ -19,21 +19,20 @@ MinosConfig, ) -from .dispatcher import ( - MinosCommandHandlerDispatcher, +from .consumers import ( + CommandConsumer, ) -from .server import ( - MinosCommandHandlerServer, +from .handlers import ( + CommandHandler, ) -class MinosCommandServerService(Service): +class CommandConsumerService(Service): """Minos QueueDispatcherService class.""" def __init__(self, config: MinosConfig = None, **kwargs): super().__init__(**kwargs) - self.dispatcher = MinosCommandHandlerServer.from_config(config=config) - self.consumer = None + self.dispatcher = CommandConsumer.from_config(config=config) async def start(self) -> None: """Method to be called at the startup by the internal ``aiomisc`` loigc. @@ -41,35 +40,45 @@ async def start(self) -> None: :return: This method does not return anything. """ await self.dispatcher.setup() - - self.consumer = await self.dispatcher.kafka_consumer( - self.dispatcher._topics, self.dispatcher._broker_group_name, self.dispatcher._kafka_conn_data - ) - await self.dispatcher.handle_message(self.consumer) + await self.dispatcher.dispatch() async def stop(self, exception: Exception = None) -> Any: - if self.consumer is not None: - await self.consumer.stop() + + """Stop the service execution. + + :param exception: Optional exception that stopped the execution. + :return: This method does not return anything. + """ + await self.dispatcher.destroy() -class MinosCommandPeriodicService(PeriodicService): +class CommandHandlerService(PeriodicService): """Minos QueueDispatcherService class.""" def __init__(self, config: MinosConfig = None, **kwargs): super().__init__(**kwargs) - self.dispatcher = MinosCommandHandlerDispatcher.from_config(config=config) + self.dispatcher = CommandHandler.from_config(config=config) async def start(self) -> None: """Method to be called at the startup by the internal ``aiomisc`` loigc. :return: This method does not return anything. 
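`CommandHandler._dispatch_one` awaits the configured controller action with `(topic, command)` and publishes whatever it returns to `f"{command.saga_id}Reply"` through the injected `command_reply_broker`. The controller contract is only implied by that call site, so the following is an inferred sketch with a hypothetical action name:

```python
from minos.common import Command


async def add_ticket(topic: str, command: Command) -> list:
    """Hypothetical controller action wired in the ``commands`` section of the config."""
    # ``command.items`` carries the models sent by the caller; a real action would create
    # or update aggregates here. The returned list becomes the ``items`` of the
    # ``CommandReply`` that ``CommandHandler`` publishes to the reply topic.
    return list(command.items)
```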
""" - await super().start() await self.dispatcher.setup() + await super().start() async def callback(self) -> None: """Method to be called periodically by the internal ``aiomisc`` logic. :return:This method does not return anything. """ - await self.dispatcher.queue_checker() + await self.dispatcher.dispatch() + + async def stop(self, err: Exception = None) -> None: + """Stop the service execution. + + :param err: Optional exception that stopped the execution. + :return: This method does not return anything. + """ + await super().stop(err) + await self.dispatcher.destroy() diff --git a/minos/networks/handlers/entries.py b/minos/networks/handlers/entries.py new file mode 100644 index 00000000..91bab34f --- /dev/null +++ b/minos/networks/handlers/entries.py @@ -0,0 +1,25 @@ +""" +Copyright (C) 2021 Clariteia SL + +This file is part of minos framework. + +Minos framework can not be copied and/or distributed without the express permission of Clariteia SL. +""" +from collections import ( + namedtuple, +) +from datetime import ( + datetime, +) +from typing import ( + Callable, +) + +from minos.common import ( + MinosModel, +) + +HandlerEntry = namedtuple( + "HandlerEntry", + {"id": int, "topic": str, "callback": Callable, "partition_id": int, "data": MinosModel, "created_at": datetime}, +) diff --git a/minos/networks/handlers/events/__init__.py b/minos/networks/handlers/events/__init__.py new file mode 100644 index 00000000..842d7a08 --- /dev/null +++ b/minos/networks/handlers/events/__init__.py @@ -0,0 +1,18 @@ +""" +Copyright (C) 2021 Clariteia SL + +This file is part of minos framework. + +Minos framework can not be copied and/or distributed without the express permission of Clariteia SL. +""" + +from .consumers import ( + EventConsumer, +) +from .handlers import ( + EventHandler, +) +from .services import ( + EventConsumerService, + EventHandlerService, +) diff --git a/minos/networks/handler/event/server.py b/minos/networks/handlers/events/consumers.py similarity index 88% rename from minos/networks/handler/event/server.py rename to minos/networks/handlers/events/consumers.py index 9ab8aee4..ee0f8682 100644 --- a/minos/networks/handler/event/server.py +++ b/minos/networks/handlers/events/consumers.py @@ -14,12 +14,13 @@ MinosConfig, ) -from ..server import ( - MinosHandlerServer, +from ..abc import ( + Consumer, ) -class MinosEventHandlerServer(MinosHandlerServer): +class EventConsumer(Consumer): + """Event Consumer class.""" TABLE = "event_queue" diff --git a/minos/networks/handler/event/dispatcher.py b/minos/networks/handlers/events/handlers.py similarity index 58% rename from minos/networks/handler/event/dispatcher.py rename to minos/networks/handlers/events/handlers.py index e94a36de..ff449cd6 100644 --- a/minos/networks/handler/event/dispatcher.py +++ b/minos/networks/handlers/events/handlers.py @@ -4,9 +4,9 @@ # # Minos framework can not be copied and/or distributed without the express # permission of Clariteia SL. 
- from typing import ( Any, + NoReturn, ) from minos.common import ( @@ -14,12 +14,16 @@ MinosConfig, ) -from ..dispatcher import ( - MinosHandlerDispatcher, +from ..abc import ( + Handler, +) +from ..entries import ( + HandlerEntry, ) -class MinosEventHandlerDispatcher(MinosHandlerDispatcher): +class EventHandler(Handler): + """Event Handler class.""" TABLE = "event_queue" @@ -27,9 +31,8 @@ def __init__(self, *, config: MinosConfig, **kwargs: Any): super().__init__(table_name=self.TABLE, config=config.events, **kwargs) self._broker_group_name = f"event_{config.service.name}" - def _is_valid_instance(self, value: bytes): - try: - instance = Event.from_avro_bytes(value) - return True, instance - except: # noqa E722 - return False, None + def _build_data(self, value: bytes) -> Event: + return Event.from_avro_bytes(value) + + async def _dispatch_one(self, row: HandlerEntry) -> NoReturn: + await row.callback(row.topic, row.data) diff --git a/minos/networks/handler/command_reply/services.py b/minos/networks/handlers/events/services.py similarity index 60% rename from minos/networks/handler/command_reply/services.py rename to minos/networks/handlers/events/services.py index 7f7c4e47..65e9992c 100644 --- a/minos/networks/handler/command_reply/services.py +++ b/minos/networks/handlers/events/services.py @@ -19,21 +19,20 @@ MinosConfig, ) -from .dispatcher import ( - MinosCommandReplyHandlerDispatcher, +from .consumers import ( + EventConsumer, ) -from .server import ( - MinosCommandReplyHandlerServer, +from .handlers import ( + EventHandler, ) -class MinosCommandReplyServerService(Service): +class EventConsumerService(Service): """Minos QueueDispatcherService class.""" def __init__(self, config: MinosConfig = None, **kwargs): super().__init__(**kwargs) - self.dispatcher = MinosCommandReplyHandlerServer.from_config(config=config) - self.consumer = None + self.dispatcher = EventConsumer.from_config(config=config) async def start(self) -> None: """Method to be called at the startup by the internal ``aiomisc`` loigc. @@ -41,35 +40,44 @@ async def start(self) -> None: :return: This method does not return anything. """ await self.dispatcher.setup() - - self.consumer = await self.dispatcher.kafka_consumer( - self.dispatcher._topics, self.dispatcher._broker_group_name, self.dispatcher._kafka_conn_data - ) - await self.dispatcher.handle_message(self.consumer) + await self.dispatcher.dispatch() async def stop(self, exception: Exception = None) -> Any: - if self.consumer is not None: - await self.consumer.stop() + """Stop the service execution. + :param exception: Optional exception that stopped the execution. + :return: This method does not return anything. + """ + await self.dispatcher.destroy() -class MinosCommandReplyPeriodicService(PeriodicService): + +class EventHandlerService(PeriodicService): """Minos QueueDispatcherService class.""" def __init__(self, config: MinosConfig = None, **kwargs): super().__init__(**kwargs) - self.dispatcher = MinosCommandReplyHandlerDispatcher.from_config(config=config) + self.dispatcher = EventHandler.from_config(config=config) async def start(self) -> None: """Method to be called at the startup by the internal ``aiomisc`` loigc. :return: This method does not return anything. """ - await super().start() await self.dispatcher.setup() + await super().start() async def callback(self) -> None: """Method to be called periodically by the internal ``aiomisc`` logic. :return:This method does not return anything. 
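After the split, each consumer is a plain aiomisc `Service` that blocks on `dispatch()` and each handler a `PeriodicService` that drains its queue on every tick. A wiring sketch, assuming aiomisc's `entrypoint` helper and that the `interval` keyword is forwarded to `PeriodicService`; the `config` instance is built elsewhere:

```python
from aiomisc import entrypoint

from minos.networks import EventConsumerService, EventHandlerService


def run(config):
    services = [
        EventConsumerService(config=config),               # blocks on Consumer.dispatch()
        EventHandlerService(config=config, interval=0.5),  # drains event_queue periodically
    ]
    with entrypoint(*services) as loop:
        loop.run_forever()
```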
""" - await self.dispatcher.queue_checker() + await self.dispatcher.dispatch() + + async def stop(self, err: Exception = None) -> None: + """Stop the service execution. + + :param err: Optional exception that stopped the execution. + :return: This method does not return anything. + """ + await super().stop(err) + await self.dispatcher.destroy() diff --git a/minos/networks/rest_interface/__init__.py b/minos/networks/rest/__init__.py similarity index 69% rename from minos/networks/rest_interface/__init__.py rename to minos/networks/rest/__init__.py index 524ba7a0..d6cc5c39 100644 --- a/minos/networks/rest_interface/__init__.py +++ b/minos/networks/rest/__init__.py @@ -6,9 +6,9 @@ Minos framework can not be copied and/or distributed without the express permission of Clariteia SL. """ -from .handler import ( - RestInterfaceHandler, +from .builders import ( + RestBuilder, ) -from .service import ( - REST, +from .services import ( + RestService, ) diff --git a/minos/networks/rest_interface/handler.py b/minos/networks/rest/builders.py similarity index 98% rename from minos/networks/rest_interface/handler.py rename to minos/networks/rest/builders.py index 3f738d73..51eb6687 100644 --- a/minos/networks/rest_interface/handler.py +++ b/minos/networks/rest/builders.py @@ -15,7 +15,7 @@ ) -class RestInterfaceHandler: +class RestBuilder: """ Rest Interface Handler diff --git a/minos/networks/rest_interface/service.py b/minos/networks/rest/services.py similarity index 58% rename from minos/networks/rest_interface/service.py rename to minos/networks/rest/services.py index 05e49df5..cded24a7 100644 --- a/minos/networks/rest_interface/service.py +++ b/minos/networks/rest/services.py @@ -1,3 +1,10 @@ +""" +Copyright (C) 2021 Clariteia SL + +This file is part of minos framework. + +Minos framework can not be copied and/or distributed without the express permission of Clariteia SL. +""" import typing as t from aiomisc.service.aiohttp import ( @@ -8,12 +15,12 @@ MinosConfig, ) -from .handler import ( - RestInterfaceHandler, +from .builders import ( + RestBuilder, ) -class REST(AIOHTTPService): +class RestService(AIOHTTPService): """ Rest Interface @@ -26,7 +33,7 @@ def __init__(self, config: MinosConfig, **kwds: t.Any): port = config.rest.broker.port super().__init__(address=address, port=port, **kwds) self._config = config - self.rest_interface = RestInterfaceHandler(config=self._config) + self.rest = RestBuilder(config=self._config) async def create_application(self): - return self.rest_interface.get_app() # pragma: no cover + return self.rest.get_app() # pragma: no cover diff --git a/minos/networks/snapshots/__init__.py b/minos/networks/snapshots/__init__.py index ab7b0571..7c62e525 100644 --- a/minos/networks/snapshots/__init__.py +++ b/minos/networks/snapshots/__init__.py @@ -5,12 +5,12 @@ Minos framework can not be copied and/or distributed without the express permission of Clariteia SL. 
""" -from .dispatchers import ( - MinosSnapshotDispatcher, +from .builders import ( + SnapshotBuilder, ) from .entries import ( - MinosSnapshotEntry, + SnapshotEntry, ) from .services import ( - MinosSnapshotService, + SnapshotService, ) diff --git a/minos/networks/snapshots/dispatchers.py b/minos/networks/snapshots/builders.py similarity index 84% rename from minos/networks/snapshots/dispatchers.py rename to minos/networks/snapshots/builders.py index 39dd3a00..bf982871 100644 --- a/minos/networks/snapshots/dispatchers.py +++ b/minos/networks/snapshots/builders.py @@ -20,7 +20,6 @@ from minos.common import ( Aggregate, MinosConfig, - MinosConfigException, MinosRepositoryAction, MinosRepositoryEntry, PostgreSqlMinosDatabase, @@ -32,11 +31,11 @@ MinosPreviousVersionSnapshotException, ) from .entries import ( - MinosSnapshotEntry, + SnapshotEntry, ) -class MinosSnapshotDispatcher(PostgreSqlMinosDatabase): +class SnapshotBuilder(PostgreSqlMinosDatabase): """Minos Snapshot Dispatcher class.""" def __init__(self, *args, repository: dict[str, Any] = None, **kwargs): @@ -48,18 +47,7 @@ async def _destroy(self) -> NoReturn: await self.repository.destroy() @classmethod - def from_config(cls, *args, config: MinosConfig = None, **kwargs) -> MinosSnapshotDispatcher: - """Build a new Snapshot Dispatcher from config. - :param args: Additional positional arguments. - :param config: Config instance. If `None` is provided, default config is chosen. - :param kwargs: Additional named arguments. - :return: A `MinosRepository` instance. - """ - if config is None: - config = MinosConfig.get_default() - if config is None: - raise MinosConfigException("The config object must be setup.") - # noinspection PyProtectedMember + def _from_config(cls, *args, config: MinosConfig, **kwargs) -> SnapshotBuilder: return cls(*args, **config.snapshot._asdict(), repository=config.repository._asdict(), **kwargs) async def _setup(self) -> NoReturn: @@ -103,7 +91,7 @@ async def _store_offset(self, offset: int) -> NoReturn: await self.submit_query(_INSERT_OFFSET_QUERY, {"value": offset}) # noinspection PyUnusedLocal - async def select(self, *args, **kwargs) -> AsyncIterator[MinosSnapshotEntry]: + async def select(self, *args, **kwargs) -> AsyncIterator[SnapshotEntry]: """Select a sequence of ``MinosSnapshotEntry`` objects. :param args: Additional positional arguments. @@ -111,9 +99,9 @@ async def select(self, *args, **kwargs) -> AsyncIterator[MinosSnapshotEntry]: :return: A sequence of ``MinosSnapshotEntry`` objects. 
""" async for row in self.submit_query_and_iter(_SELECT_ALL_ENTRIES_QUERY): - yield MinosSnapshotEntry(*row) + yield SnapshotEntry(*row) - async def _dispatch_one(self, event_entry: MinosRepositoryEntry) -> Optional[MinosSnapshotEntry]: + async def _dispatch_one(self, event_entry: MinosRepositoryEntry) -> Optional[SnapshotEntry]: if event_entry.action is MinosRepositoryAction.DELETE: await self._submit_delete(event_entry) return @@ -149,16 +137,16 @@ async def _select_one_aggregate(self, aggregate_id: int, aggregate_name: str) -> snapshot_entry = await self._select_one(aggregate_id, aggregate_name) return snapshot_entry.aggregate - async def _select_one(self, aggregate_id: int, aggregate_name: str) -> MinosSnapshotEntry: + async def _select_one(self, aggregate_id: int, aggregate_name: str) -> SnapshotEntry: raw = await self.submit_query_and_fetchone(_SELECT_ONE_SNAPSHOT_ENTRY_QUERY, (aggregate_id, aggregate_name)) - return MinosSnapshotEntry(aggregate_id, aggregate_name, *raw) + return SnapshotEntry(aggregate_id, aggregate_name, *raw) - async def _submit_instance(self, aggregate: Aggregate) -> MinosSnapshotEntry: - snapshot_entry = MinosSnapshotEntry.from_aggregate(aggregate) + async def _submit_instance(self, aggregate: Aggregate) -> SnapshotEntry: + snapshot_entry = SnapshotEntry.from_aggregate(aggregate) snapshot_entry = await self._submit_update_or_create(snapshot_entry) return snapshot_entry - async def _submit_update_or_create(self, entry: MinosSnapshotEntry) -> MinosSnapshotEntry: + async def _submit_update_or_create(self, entry: SnapshotEntry) -> SnapshotEntry: params = { "aggregate_id": entry.aggregate_id, "aggregate_name": entry.aggregate_name, diff --git a/minos/networks/snapshots/entries.py b/minos/networks/snapshots/entries.py index 973b351e..995197d5 100644 --- a/minos/networks/snapshots/entries.py +++ b/minos/networks/snapshots/entries.py @@ -26,7 +26,7 @@ ) -class MinosSnapshotEntry(object): +class SnapshotEntry(object): """Minos Snapshot Entry class. Is the python object representation of a row in the ``snapshot`` storage system. @@ -56,7 +56,7 @@ def __init__( self.updated_at = updated_at @classmethod - def from_aggregate(cls, aggregate: Aggregate) -> MinosSnapshotEntry: + def from_aggregate(cls, aggregate: Aggregate) -> SnapshotEntry: """Build a new instance from an ``Aggregate``. :param aggregate: The aggregate instance. @@ -84,7 +84,7 @@ def aggregate_cls(self) -> Type[Aggregate]: # noinspection PyTypeChecker return import_module(self.aggregate_name) - def __eq__(self, other: MinosSnapshotEntry) -> bool: + def __eq__(self, other: SnapshotEntry) -> bool: return type(self) == type(other) and tuple(self) == tuple(other) def __hash__(self) -> int: diff --git a/minos/networks/snapshots/services.py b/minos/networks/snapshots/services.py index b744d7e1..3cce546f 100644 --- a/minos/networks/snapshots/services.py +++ b/minos/networks/snapshots/services.py @@ -14,17 +14,17 @@ MinosConfig, ) -from .dispatchers import ( - MinosSnapshotDispatcher, +from .builders import ( + SnapshotBuilder, ) -class MinosSnapshotService(PeriodicService): +class SnapshotService(PeriodicService): """Minos Snapshot Service class.""" def __init__(self, config: MinosConfig = None, **kwargs): super().__init__(**kwargs) - self.dispatcher = MinosSnapshotDispatcher.from_config(config=config) + self.dispatcher = SnapshotBuilder.from_config(config=config) async def start(self) -> None: """Start the service execution. 
diff --git a/poetry.lock b/poetry.lock index debe2011..d57e04f3 100644 --- a/poetry.lock +++ b/poetry.lock @@ -33,7 +33,7 @@ snappy = ["python-snappy (>=0.5)"] [[package]] name = "aiomisc" -version = "14.0.3" +version = "14.0.7" description = "aiomisc - miscellaneous utils for asyncio" category = "main" optional = false @@ -144,6 +144,14 @@ typed-ast = ">=1.4.0" [package.extras] d = ["aiohttp (>=3.3.2)", "aiohttp-cors"] +[[package]] +name = "cached-property" +version = "1.5.2" +description = "A decorator for caching properties in classes." +category = "main" +optional = false +python-versions = "*" + [[package]] name = "certifi" version = "2020.12.5" @@ -165,7 +173,7 @@ pycparser = "*" [[package]] name = "cfgv" -version = "3.2.0" +version = "3.3.0" description = "Validate configuration and produce human readable error messages." category = "dev" optional = false @@ -181,7 +189,7 @@ python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*" [[package]] name = "click" -version = "8.0.0" +version = "8.0.1" description = "Composable command line interface toolkit" category = "dev" optional = false @@ -220,6 +228,23 @@ python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, <4" [package.extras] toml = ["toml"] +[[package]] +name = "dependency-injector" +version = "4.32.2" +description = "Dependency injection framework for Python" +category = "main" +optional = false +python-versions = "*" + +[package.dependencies] +six = ">=1.7.0,<=1.15.0" + +[package.extras] +aiohttp = ["aiohttp"] +flask = ["flask"] +pydantic = ["pydantic"] +yaml = ["pyyaml"] + [[package]] name = "distlib" version = "0.3.1" @@ -238,7 +263,7 @@ python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*" [[package]] name = "fastavro" -version = "1.4.0" +version = "1.4.1" description = "Fast read/write of AVRO files" category = "main" optional = false @@ -321,17 +346,17 @@ colors = ["colorama (>=0.4.3,<0.5.0)"] [[package]] name = "jinja2" -version = "2.11.3" +version = "3.0.1" description = "A very fast and expressive template engine." category = "dev" optional = false -python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*" +python-versions = ">=3.6" [package.dependencies] -MarkupSafe = ">=0.23" +MarkupSafe = ">=2.0" [package.extras] -i18n = ["Babel (>=0.8)"] +i18n = ["Babel (>=2.7)"] [[package]] name = "kafka-python" @@ -357,11 +382,11 @@ cffi = ">=0.8" [[package]] name = "markupsafe" -version = "1.1.1" +version = "2.0.1" description = "Safely add untrusted strings to HTML/XML markup." category = "dev" optional = false -python-versions = ">=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*" +python-versions = ">=3.6" [[package]] name = "mccabe" @@ -373,7 +398,7 @@ python-versions = "*" [[package]] name = "minos-microservice-common" -version = "0.0.12" +version = "0.0.14" description = "Python Package with common Classes and Utilities used in Minos Microservices." 
category = "main" optional = false @@ -382,11 +407,12 @@ python-versions = ">=3.9,<4.0" [package.dependencies] aiomisc = ">=14.0.3,<15.0.0" aiopg = ">=1.2.1,<2.0.0" +cached-property = ">=1.5.2,<2.0.0" +dependency-injector = ">=4.32.2,<5.0.0" fastavro = ">=1.4.0,<2.0.0" lmdb = ">=1.2.1,<2.0.0" orjson = ">=3.5.2,<4.0.0" PyYAML = ">=5.4.1,<6.0.0" -six = ">=1.16.0,<2.0.0" [[package]] name = "multidict" @@ -431,6 +457,14 @@ category = "dev" optional = false python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*" +[[package]] +name = "pbr" +version = "5.6.0" +description = "Python Build Reasonableness" +category = "dev" +optional = false +python-versions = ">=2.6" + [[package]] name = "pluggy" version = "0.13.1" @@ -579,7 +613,7 @@ socks = ["PySocks (>=1.5.6,!=1.5.7)", "win-inet-pton"] [[package]] name = "six" -version = "1.16.0" +version = "1.15.0" description = "Python 2 and 3 compatibility utilities" category = "main" optional = false @@ -595,7 +629,7 @@ python-versions = "*" [[package]] name = "sphinx" -version = "4.0.1" +version = "4.0.2" description = "Python documentation generator" category = "dev" optional = false @@ -607,8 +641,7 @@ babel = ">=1.3" colorama = {version = ">=0.3.5", markers = "sys_platform == \"win32\""} docutils = ">=0.14,<0.18" imagesize = "*" -Jinja2 = ">=2.3,<3.0" -MarkupSafe = "<2.0" +Jinja2 = ">=2.3" packaging = "*" Pygments = ">=2.0" requests = ">=2.5.0" @@ -625,6 +658,33 @@ docs = ["sphinxcontrib-websupport"] lint = ["flake8 (>=3.5.0)", "isort", "mypy (>=0.800)", "docutils-stubs"] test = ["pytest", "pytest-cov", "html5lib", "cython", "typed-ast"] +[[package]] +name = "sphinx-autodoc-typehints" +version = "1.12.0" +description = "Type hints (PEP 484) support for the Sphinx autodoc extension" +category = "dev" +optional = false +python-versions = ">=3.6" + +[package.dependencies] +Sphinx = ">=3.0" + +[package.extras] +test = ["pytest (>=3.1.0)", "typing-extensions (>=3.5)", "sphobjinv (>=2.0)", "Sphinx (>=3.2.0)", "dataclasses"] +type_comments = ["typed-ast (>=1.4.0)"] + +[[package]] +name = "sphinxcontrib-apidoc" +version = "0.3.0" +description = "A Sphinx extension for running 'sphinx-apidoc' on each build" +category = "dev" +optional = false +python-versions = "*" + +[package.dependencies] +pbr = "*" +Sphinx = ">=1.6.0" + [[package]] name = "sphinxcontrib-applehelp" version = "1.0.2" @@ -766,7 +826,7 @@ multidict = ">=4.0" [metadata] lock-version = "1.1" python-versions = "^3.9" -content-hash = "5f31e3a2bf1ed683b33253d0bad6a6284ef7bf15a1d64a6f9fe287bafcd35d8e" +content-hash = "8c366fb1d2708672c8beb0f00b594f33baad42c5227565062786f7dffc6d172b" [metadata.files] aiohttp = [ @@ -840,8 +900,8 @@ aiokafka = [ {file = "aiokafka-0.7.0.tar.gz", hash = "sha256:2a4bf3a7afc6406cb37c153256c35629c1d69caf63e4a3b5b1c3863d97a17b74"}, ] aiomisc = [ - {file = "aiomisc-14.0.3-py3-none-any.whl", hash = "sha256:b387c7662fab3b539ec607c034d727bbda2683a1e89d6c6f557ca4e64c95bfa8"}, - {file = "aiomisc-14.0.3.tar.gz", hash = "sha256:10b7fed39ee9a7d5d03cf354c14542901f489cd279cc0f35ad2bacc02b444f45"}, + {file = "aiomisc-14.0.7-py3-none-any.whl", hash = "sha256:409ff3548a1b5d92b332d6d6ea6a16afe389a4407490c02f48eb0a4212ee3c7c"}, + {file = "aiomisc-14.0.7.tar.gz", hash = "sha256:8eec96c1813f45d2c64f8020a26b561215e1bace94bc6d593958c2c4c425bbb9"}, ] aiopg = [ {file = "aiopg-1.2.1-py3-none-any.whl", hash = "sha256:736776982f49defe85351687abf745fc49184749eb9d245ab08ce52e0d26c3b7"}, @@ -875,6 +935,10 @@ black = [ {file = "black-19.10b0-py36-none-any.whl", hash = 
"sha256:1b30e59be925fafc1ee4565e5e08abef6b03fe455102883820fe5ee2e4734e0b"}, {file = "black-19.10b0.tar.gz", hash = "sha256:c2edb73a08e9e0e6f65a0e6af18b059b8b1cdd5bef997d7a0b181df93dc81539"}, ] +cached-property = [ + {file = "cached-property-1.5.2.tar.gz", hash = "sha256:9fa5755838eecbb2d234c3aa390bd80fbd3ac6b6869109bfc1b499f7bd89a130"}, + {file = "cached_property-1.5.2-py2.py3-none-any.whl", hash = "sha256:df4f613cf7ad9a588cc381aaf4a512d26265ecebd5eb9e1ba12f1319eb85a6a0"}, +] certifi = [ {file = "certifi-2020.12.5-py2.py3-none-any.whl", hash = "sha256:719a74fb9e33b9bd44cc7f3a8d94bc35e4049deebe19ba7d8e108280cfd59830"}, {file = "certifi-2020.12.5.tar.gz", hash = "sha256:1a4995114262bffbc2413b159f2a1a480c969de6e6eb13ee966d470af86af59c"}, @@ -919,16 +983,16 @@ cffi = [ {file = "cffi-1.14.5.tar.gz", hash = "sha256:fd78e5fee591709f32ef6edb9a015b4aa1a5022598e36227500c8f4e02328d9c"}, ] cfgv = [ - {file = "cfgv-3.2.0-py2.py3-none-any.whl", hash = "sha256:32e43d604bbe7896fe7c248a9c2276447dbef840feb28fe20494f62af110211d"}, - {file = "cfgv-3.2.0.tar.gz", hash = "sha256:cf22deb93d4bcf92f345a5c3cd39d3d41d6340adc60c78bbbd6588c384fda6a1"}, + {file = "cfgv-3.3.0-py2.py3-none-any.whl", hash = "sha256:b449c9c6118fe8cca7fa5e00b9ec60ba08145d281d52164230a69211c5d597a1"}, + {file = "cfgv-3.3.0.tar.gz", hash = "sha256:9e600479b3b99e8af981ecdfc80a0296104ee610cab48a5ae4ffd0b668650eb1"}, ] chardet = [ {file = "chardet-4.0.0-py2.py3-none-any.whl", hash = "sha256:f864054d66fd9118f2e67044ac8981a54775ec5b67aed0441892edb553d21da5"}, {file = "chardet-4.0.0.tar.gz", hash = "sha256:0d6f53a15db4120f2b08c94f11e7d93d2c911ee118b6b30a04ec3ee8310179fa"}, ] click = [ - {file = "click-8.0.0-py3-none-any.whl", hash = "sha256:e90e62ced43dc8105fb9a26d62f0d9340b5c8db053a814e25d95c19873ae87db"}, - {file = "click-8.0.0.tar.gz", hash = "sha256:7d8c289ee437bcb0316820ccee14aefcb056e58d31830ecab8e47eda6540e136"}, + {file = "click-8.0.1-py3-none-any.whl", hash = "sha256:fba402a4a47334742d782209a7c79bc448911afe1149d07bdabdf480b3e2f4b6"}, + {file = "click-8.0.1.tar.gz", hash = "sha256:8c04c11192119b1ef78ea049e0a6f0463e4c48ef00a30160c704337586f3ad7a"}, ] colorama = [ {file = "colorama-0.4.4-py2.py3-none-any.whl", hash = "sha256:9f47eda37229f68eee03b24b9748937c7dc3868f906e8ba69fbcbdd3bc5dc3e2"}, @@ -992,6 +1056,70 @@ coverage = [ {file = "coverage-5.5-pp37-none-any.whl", hash = "sha256:2a3859cb82dcbda1cfd3e6f71c27081d18aa251d20a17d87d26d4cd216fb0af4"}, {file = "coverage-5.5.tar.gz", hash = "sha256:ebe78fe9a0e874362175b02371bdfbee64d8edc42a044253ddf4ee7d3c15212c"}, ] +dependency-injector = [ + {file = "dependency-injector-4.32.2.tar.gz", hash = "sha256:dee9133b7042b5c224d2fd87430b6a6f6151276ff06a269907f064afbddc6f5d"}, + {file = "dependency_injector-4.32.2-cp27-cp27m-macosx_10_9_x86_64.whl", hash = "sha256:46d5c52a09d18b1b4cded6ab96f0bbbded89d47bacd6d2e584b285b9355cdefe"}, + {file = "dependency_injector-4.32.2-cp27-cp27m-manylinux1_i686.whl", hash = "sha256:761db302f783acfc6b3d3c7bf55d3843804060fe16c63dfe1cc1f63d8d50180e"}, + {file = "dependency_injector-4.32.2-cp27-cp27m-manylinux1_x86_64.whl", hash = "sha256:90a4b9ffa1e0c776fc67f26cb7669e2e1f064dc6f161e66266d4f65991b29ac9"}, + {file = "dependency_injector-4.32.2-cp27-cp27m-manylinux2010_i686.whl", hash = "sha256:4ef487f9c4d3845bb042399cf4ed173af42b254849b1e21ec953d79f249c72a6"}, + {file = "dependency_injector-4.32.2-cp27-cp27m-manylinux2010_x86_64.whl", hash = "sha256:8b4599a1b4d4e42cf76163dceecee56b4f5993c033f29c81cb0fd0df9ebd7460"}, + {file = 
"dependency_injector-4.32.2-cp27-cp27mu-manylinux1_i686.whl", hash = "sha256:49ec4a023974009fa84a6f3e2acaee3eb736eaf791261f240a281e08fbd3a76e"}, + {file = "dependency_injector-4.32.2-cp27-cp27mu-manylinux1_x86_64.whl", hash = "sha256:a41fb2af141e04bcd154ad30e55babaef75e3a2c5a7c5de747a48816462543f6"}, + {file = "dependency_injector-4.32.2-cp27-cp27mu-manylinux2010_i686.whl", hash = "sha256:a870aa290da2b4c6b0646e73a918906efa1ffaf958fae3eb269a609ebf44660b"}, + {file = "dependency_injector-4.32.2-cp27-cp27mu-manylinux2010_x86_64.whl", hash = "sha256:42bcdb6a2c025b3048a015f291b27c037c8b9ee715684574d2e5bc36ab20e140"}, + {file = "dependency_injector-4.32.2-cp35-cp35m-macosx_10_9_x86_64.whl", hash = "sha256:31633252bfc5bc0201262acbf2839850e7f920175426c66a549c8789b0e72e7b"}, + {file = "dependency_injector-4.32.2-cp35-cp35m-manylinux1_i686.whl", hash = "sha256:cd06e97761f242bdf1e5818f7fc46010b00757ba2ff09fb41a5e929071910794"}, + {file = "dependency_injector-4.32.2-cp35-cp35m-manylinux1_x86_64.whl", hash = "sha256:f9e4afcb9eb5ca9376301d4eac487eec1353ad9d05d6976842654aa36d679b64"}, + {file = "dependency_injector-4.32.2-cp35-cp35m-manylinux2010_i686.whl", hash = "sha256:1a1081543bc15ba81bfbe46e0bbb7f684370e2153188eea6a9bb2e0e7bb7d393"}, + {file = "dependency_injector-4.32.2-cp35-cp35m-manylinux2010_x86_64.whl", hash = "sha256:116c9dd8cb98c3bd524d834a7ece914a15aec4f3087bba0ea65e24710e3e3584"}, + {file = "dependency_injector-4.32.2-cp35-cp35m-manylinux2014_aarch64.whl", hash = "sha256:3f0908260a1e97556dea4c562f7b5674ba1213445c38b959fc81ae43d22ad2d9"}, + {file = "dependency_injector-4.32.2-cp35-cp35m-win32.whl", hash = "sha256:9a69d4edd4365cdade219b4a86331b814c09185fba7a37978467155deced1df3"}, + {file = "dependency_injector-4.32.2-cp35-cp35m-win_amd64.whl", hash = "sha256:49b37a6118eed7297f9161c7bd877d8ff5c66644a0e3d0c2cc453a55f4f1633e"}, + {file = "dependency_injector-4.32.2-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:252ea661465f12fd0aa04e7640583aedea82c782a2a54777af29349dfb292326"}, + {file = "dependency_injector-4.32.2-cp36-cp36m-manylinux1_i686.whl", hash = "sha256:03b5c86ed153430e7d793017950d6a6936a38cab7173838be1983cb6d2d74172"}, + {file = "dependency_injector-4.32.2-cp36-cp36m-manylinux1_x86_64.whl", hash = "sha256:73d5aab8e86b1a0fee334174c2dab8e315389195ae0721edd0a5cea7757c0f7e"}, + {file = "dependency_injector-4.32.2-cp36-cp36m-manylinux2010_i686.whl", hash = "sha256:76b1f73be50167d3879455126ff13eee2a6c63f99874defcd693a4607af82f58"}, + {file = "dependency_injector-4.32.2-cp36-cp36m-manylinux2010_x86_64.whl", hash = "sha256:873e419d11f656c081a7bc1aad9ab01ed45fb51fb91cc8bf579e110b77d5bf38"}, + {file = "dependency_injector-4.32.2-cp36-cp36m-manylinux2014_aarch64.whl", hash = "sha256:d7845eb55424883e383b473e96246e3a6f1e6ad3b1efec94619f38c849fcbda9"}, + {file = "dependency_injector-4.32.2-cp36-cp36m-win32.whl", hash = "sha256:f597e07a7871c14f1cf2ff1021b464d92c374fc2d7563d7ee045982c9397301c"}, + {file = "dependency_injector-4.32.2-cp36-cp36m-win_amd64.whl", hash = "sha256:e5cb38ac4d2adbc2658d3fd3f2879eecbed7fbc019433b305b4c4eadd9454dff"}, + {file = "dependency_injector-4.32.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:2d11e258831c8f1788725a27a29040695cdc601d6f53a6f4421698e242e467e1"}, + {file = "dependency_injector-4.32.2-cp37-cp37m-manylinux1_i686.whl", hash = "sha256:a735bd9b5c765b15db05424b4d8ea5f543414300d82f4f85fa029080f76edb86"}, + {file = "dependency_injector-4.32.2-cp37-cp37m-manylinux1_x86_64.whl", hash = 
"sha256:a8321098bcabeb1d224ccaac12e64ea23219b3d5352bacef195c81a12b4e51dc"}, + {file = "dependency_injector-4.32.2-cp37-cp37m-manylinux2010_i686.whl", hash = "sha256:412360a3d7f6a8f675cf9c69022c1071cf37037e00ece2d0e89eb4966123733a"}, + {file = "dependency_injector-4.32.2-cp37-cp37m-manylinux2010_x86_64.whl", hash = "sha256:ccd70373d3fe815cc90843fb54131972a99821ad7e4018c20dc1b2cdc6bed960"}, + {file = "dependency_injector-4.32.2-cp37-cp37m-manylinux2014_aarch64.whl", hash = "sha256:feab6538ea58b480946b8a4b12c657302ba29e576964404dffccc2b074280fbe"}, + {file = "dependency_injector-4.32.2-cp37-cp37m-win32.whl", hash = "sha256:77d9a578357f5950bb8c9a0984513dc0b30f78f3e8559229cbfe383ee8ef5062"}, + {file = "dependency_injector-4.32.2-cp37-cp37m-win_amd64.whl", hash = "sha256:694c35b7add7b82e66d3ebc1576b94c840b690e878df11406812bd876dcfacb2"}, + {file = "dependency_injector-4.32.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:cc9b102f1d6335c091e29b8e1ec8dc262775c98b9eb56560e411649a324d9f8e"}, + {file = "dependency_injector-4.32.2-cp38-cp38-manylinux1_i686.whl", hash = "sha256:95343c6f6946f2ea384729b665db0478073820ba45b8593a2114d5e802873ca3"}, + {file = "dependency_injector-4.32.2-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:431e12488539636fc3f915568d77aa5fade3f27c5d31dbf18f4234df65dcf027"}, + {file = "dependency_injector-4.32.2-cp38-cp38-manylinux2010_i686.whl", hash = "sha256:dce38f892d2f5f1289268cd7faf5959e2cd6c6614022c59319d25555c544d167"}, + {file = "dependency_injector-4.32.2-cp38-cp38-manylinux2010_x86_64.whl", hash = "sha256:f3a5cb237d6b8f88422752d14590041dd2604b4f6b7e8e496a6784ae0f8cdafd"}, + {file = "dependency_injector-4.32.2-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:d357161cb9e565674ea8fb7f949bc658808b97a3cd1148e3328d73127e7af544"}, + {file = "dependency_injector-4.32.2-cp38-cp38-win32.whl", hash = "sha256:c7db6638504f69fc38740cefb15086d9fda86866dfb2ed4169061f36cf00e0c9"}, + {file = "dependency_injector-4.32.2-cp38-cp38-win_amd64.whl", hash = "sha256:01f4ebba1fda1fff007663f66fb19e766e47fdde69bd6b2d852f33cea46134cf"}, + {file = "dependency_injector-4.32.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:024648b0ff1aa1f4943a71a85727977f086e15fe4c25c9c8e214a491573520d9"}, + {file = "dependency_injector-4.32.2-cp39-cp39-manylinux1_i686.whl", hash = "sha256:28ce9107722ec376ed99e4c2e3790850e927cbf77c40967fd4f503c03eb4488b"}, + {file = "dependency_injector-4.32.2-cp39-cp39-manylinux1_x86_64.whl", hash = "sha256:cfa8bd2562290cbf419dd381bf8aa4c409bfe315a5670bf409954711da94de82"}, + {file = "dependency_injector-4.32.2-cp39-cp39-manylinux2010_i686.whl", hash = "sha256:315f98c8fa3c0ee81984e7da8b6e7a20beab4058499fd20cc36ec2669b04cfd1"}, + {file = "dependency_injector-4.32.2-cp39-cp39-manylinux2010_x86_64.whl", hash = "sha256:cd5569a334ac51e27c11c616dc3234df12178892eea82d589ade7cec88025ee0"}, + {file = "dependency_injector-4.32.2-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:bf5c8589d0f37f317bd8876cbd6844e1af7c406b03d9e817e2201a9b0d7fdcab"}, + {file = "dependency_injector-4.32.2-cp39-cp39-win32.whl", hash = "sha256:652f003bc0fcc57bb2dd570c73db0ef42c165dd3377606c32f9dbe63306bc684"}, + {file = "dependency_injector-4.32.2-cp39-cp39-win_amd64.whl", hash = "sha256:2487da9fcda871cfe1ae4bf61baeb6bd5f4b1c2c10e309ecfc337d10f357d918"}, + {file = "dependency_injector-4.32.2-pp27-pypy_73-macosx_10_9_x86_64.whl", hash = "sha256:0d8a99d995248234ff42e3b3ad1a93203fd7bbc6b2aa55c171cf6ba0beda0da0"}, + {file = "dependency_injector-4.32.2-pp27-pypy_73-manylinux1_x86_64.whl", hash = 
"sha256:6f9c4beaf9aaedf83cd90bd5831368fa715f0a3d48e7b3a4cd9c4880a7d42760"}, + {file = "dependency_injector-4.32.2-pp27-pypy_73-manylinux2010_x86_64.whl", hash = "sha256:7be57ee9341328cecfeb6ca17d824c79b95543b9bf3078893122b91dbb942e69"}, + {file = "dependency_injector-4.32.2-pp27-pypy_73-win32.whl", hash = "sha256:746efe63f9503029061f6724c656826dc6d49b88b75f8c69517c18bee2046bec"}, + {file = "dependency_injector-4.32.2-pp36-pypy36_pp73-macosx_10_9_x86_64.whl", hash = "sha256:bb111fce11f65653acbcf4ea1c19f707e78666050c7c26788bd1b8b141d596f3"}, + {file = "dependency_injector-4.32.2-pp36-pypy36_pp73-manylinux1_x86_64.whl", hash = "sha256:73f155522f3cd272ab24c5fad258a3190ca49e374d3274c6a624776be9c19969"}, + {file = "dependency_injector-4.32.2-pp36-pypy36_pp73-manylinux2010_x86_64.whl", hash = "sha256:66c6f7a594332e42776aecb3773a7ab7e1de8a364936816ae169e959eb5bd996"}, + {file = "dependency_injector-4.32.2-pp36-pypy36_pp73-win32.whl", hash = "sha256:6ddb61017ec560138c6a74664093a176c7b6b0626fde20980b34fdd0a2746b1f"}, + {file = "dependency_injector-4.32.2-pp37-pypy37_pp73-macosx_10_9_x86_64.whl", hash = "sha256:9560295429495455a0223d94b1f6a2cdcc82e662fcb9999d89c24a6bf40f35ca"}, + {file = "dependency_injector-4.32.2-pp37-pypy37_pp73-manylinux1_x86_64.whl", hash = "sha256:f1279fcb570743f69bf8cf55ea578ecd1109799ebf438e4699c51a537e042538"}, + {file = "dependency_injector-4.32.2-pp37-pypy37_pp73-manylinux2010_x86_64.whl", hash = "sha256:c45f678c1b7042c95467ac9b7acd6c14afbe39f54b79f5b7aa345a4626a6e5f4"}, + {file = "dependency_injector-4.32.2-pp37-pypy37_pp73-win32.whl", hash = "sha256:c8122a7ee678f0782add0064aec341a196ca99b0ee054a1c1aef9563160e9414"}, +] distlib = [ {file = "distlib-0.3.1-py2.py3-none-any.whl", hash = "sha256:8c09de2c67b3e7deef7184574fc060ab8a793e7adbb183d942c389c8b13c52fb"}, {file = "distlib-0.3.1.zip", hash = "sha256:edf6116872c863e1aa9d5bb7cb5e05a022c519a4594dc703843343a9ddd9bff1"}, @@ -1001,20 +1129,20 @@ docutils = [ {file = "docutils-0.17.1.tar.gz", hash = "sha256:686577d2e4c32380bb50cbb22f575ed742d58168cee37e99117a854bcd88f125"}, ] fastavro = [ - {file = "fastavro-1.4.0-cp36-cp36m-macosx_10_14_x86_64.whl", hash = "sha256:4e240c99d484298b8ce6825ed239e806da6ff3cb2596077102b76a039e57c9fc"}, - {file = "fastavro-1.4.0-cp36-cp36m-manylinux2014_aarch64.whl", hash = "sha256:51d74395581afce5b3508950c7a0f791eb91838cc4cae9f11b9ebdf04e2894c6"}, - {file = "fastavro-1.4.0-cp36-cp36m-manylinux2014_x86_64.whl", hash = "sha256:0f7de00e84d61f1aa233ca22b1555c6a98693c3801829b89c52f1532850ea618"}, - {file = "fastavro-1.4.0-cp36-cp36m-win_amd64.whl", hash = "sha256:5f825bf6c90acdd47804ec343fb5337846cb87ff84211ed86defe92c43e8bb5e"}, - {file = "fastavro-1.4.0-cp37-cp37m-macosx_10_14_x86_64.whl", hash = "sha256:4dff942b220c7710c02dad189b23c935bf1b85d4a6e1dfef49ee6bd87379ef60"}, - {file = "fastavro-1.4.0-cp37-cp37m-manylinux2014_x86_64.whl", hash = "sha256:0770a8dafcac9b54f26cc54e2fbc683f3cb93b631d36f38c727aff65ffdb6c44"}, - {file = "fastavro-1.4.0-cp37-cp37m-win_amd64.whl", hash = "sha256:e7a8927764c0ba40cbf0ff0b6177875c4e9c2ce8eed1f33aaae432878c2923b2"}, - {file = "fastavro-1.4.0-cp38-cp38-macosx_10_14_x86_64.whl", hash = "sha256:a22107c204712679ce685ea3886d9f5ed1e395c8b64c081545b841b22dccf10b"}, - {file = "fastavro-1.4.0-cp38-cp38-manylinux2014_x86_64.whl", hash = "sha256:b69dc3ac12facef67d22c04afba318798ef1abaa512a1dc7407e3403d00a8979"}, - {file = "fastavro-1.4.0-cp38-cp38-win_amd64.whl", hash = "sha256:ad9d17c88dbddfc2c1b494c39124ceecc6417904a54ddd1a357270e69d05d1f8"}, - {file = 
"fastavro-1.4.0-cp39-cp39-macosx_10_14_x86_64.whl", hash = "sha256:6dbb58e051e62f0f71a40b6221ac80b47cdad7a8820a6403d36b357823bceb49"}, - {file = "fastavro-1.4.0-cp39-cp39-manylinux2014_x86_64.whl", hash = "sha256:3c9b1a11d17282267e3765cdb63f5155cdd0c37e985186fbcd72157fa6b98840"}, - {file = "fastavro-1.4.0-cp39-cp39-win_amd64.whl", hash = "sha256:8373c125b92b6460dc7b6863282409111a6f37f69efde82debb18cb368521ba6"}, - {file = "fastavro-1.4.0.tar.gz", hash = "sha256:306b87a55713ab5a9a37a315cbf9ecd74a1c640287c23a7926e960626c522d04"}, + {file = "fastavro-1.4.1-cp36-cp36m-macosx_10_14_x86_64.whl", hash = "sha256:3f99237de0f853f083a0f9929f54155b408a5c4b04fcfa8c59e589aa853f2111"}, + {file = "fastavro-1.4.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:18ed93c24ba96ed0e6cacf50170805188cc53a5e50c40fc94316387a98aa497b"}, + {file = "fastavro-1.4.1-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c298d2b2389049dfc6a06326a0a2f9809c7045e8cadb82b9e7e948fd32270547"}, + {file = "fastavro-1.4.1-cp36-cp36m-win_amd64.whl", hash = "sha256:948e69da16f4bf20bea65805ea210d793eae55f5f24f8be2c4d18c2a773aaf7b"}, + {file = "fastavro-1.4.1-cp37-cp37m-macosx_10_14_x86_64.whl", hash = "sha256:a4b6c2b2d126bf6246bead79cba43e17f07a1b99a25ccceb315c6e75ab7c9d39"}, + {file = "fastavro-1.4.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:453676a26e99f2f3af7f57c4236ceeee4e453ab7b9bd5f09e9c89bad5e572c78"}, + {file = "fastavro-1.4.1-cp37-cp37m-win_amd64.whl", hash = "sha256:5bd8a134daff2ea5ef0d72a528ca42dfe6c01deac7103dedf77d1a55936a981e"}, + {file = "fastavro-1.4.1-cp38-cp38-macosx_10_14_x86_64.whl", hash = "sha256:a8c5df1aa6c29409bbfe571504e87bf23553c88083a83febb42a74ec0cabe2ba"}, + {file = "fastavro-1.4.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:741cd757b7789e6ab821a1de02c1e18dfada417e1cfccae3f20dc2aadd6654fc"}, + {file = "fastavro-1.4.1-cp38-cp38-win_amd64.whl", hash = "sha256:6d7d4032ecb28bef3dd41e8c91c986df351e3323f526c901d1cbb53425617756"}, + {file = "fastavro-1.4.1-cp39-cp39-macosx_10_14_x86_64.whl", hash = "sha256:72c81690cef6ed9c87a146eaf9f608150bf78fd537a7e796780381b4a7baabb8"}, + {file = "fastavro-1.4.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cad3ecbac1fe1d319c617ff01104639795020cb72faf7e30ebe802c1d60ec915"}, + {file = "fastavro-1.4.1-cp39-cp39-win_amd64.whl", hash = "sha256:3e804c4fc9875314aa41901055941b199f87aeb1c880cc6fe3ad258fa08b24c6"}, + {file = "fastavro-1.4.1.tar.gz", hash = "sha256:237a741668316a2aadb14ba0532666a305dd14b4043aace89bcb0c6419c08162"}, ] filelock = [ {file = "filelock-3.0.12-py3-none-any.whl", hash = "sha256:929b7d63ec5b7d6b71b0fa5ac14e030b3f70b75747cef1b10da9b879fef15836"}, @@ -1045,8 +1173,8 @@ isort = [ {file = "isort-5.8.0.tar.gz", hash = "sha256:0a943902919f65c5684ac4e0154b1ad4fac6dcaa5d9f3426b732f1c8b5419be6"}, ] jinja2 = [ - {file = "Jinja2-2.11.3-py2.py3-none-any.whl", hash = "sha256:03e47ad063331dd6a3f04a43eddca8a966a26ba0c5b7207a9a9e4e08f1b29419"}, - {file = "Jinja2-2.11.3.tar.gz", hash = "sha256:a6d58433de0ae800347cab1fa3043cebbabe8baa9d29e668f1c768cb87a333c6"}, + {file = "Jinja2-3.0.1-py3-none-any.whl", hash = "sha256:1f06f2da51e7b56b8f238affdd6b4e2c61e39598a378cc49345bc1bd42a978a4"}, + {file = "Jinja2-3.0.1.tar.gz", hash = "sha256:703f484b47a6af502e743c9122595cc812b0271f661722403114f71a79d0f5a4"}, ] kafka-python = [ {file = "kafka-python-2.0.2.tar.gz", hash = "sha256:04dfe7fea2b63726cd6f3e79a2d86e709d608d74406638c5da33a01d45a9d7e3"}, @@ 
-1077,66 +1205,48 @@ lmdb = [ {file = "lmdb-1.2.1.tar.gz", hash = "sha256:5f76a90ebd08922acca11948779b5055f7a262687178e9e94f4e804b9f8465bc"}, ] markupsafe = [ - {file = "MarkupSafe-1.1.1-cp27-cp27m-macosx_10_6_intel.whl", hash = "sha256:09027a7803a62ca78792ad89403b1b7a73a01c8cb65909cd876f7fcebd79b161"}, - {file = "MarkupSafe-1.1.1-cp27-cp27m-manylinux1_i686.whl", hash = "sha256:e249096428b3ae81b08327a63a485ad0878de3fb939049038579ac0ef61e17e7"}, - {file = "MarkupSafe-1.1.1-cp27-cp27m-manylinux1_x86_64.whl", hash = "sha256:500d4957e52ddc3351cabf489e79c91c17f6e0899158447047588650b5e69183"}, - {file = "MarkupSafe-1.1.1-cp27-cp27m-win32.whl", hash = "sha256:b2051432115498d3562c084a49bba65d97cf251f5a331c64a12ee7e04dacc51b"}, - {file = "MarkupSafe-1.1.1-cp27-cp27m-win_amd64.whl", hash = "sha256:98c7086708b163d425c67c7a91bad6e466bb99d797aa64f965e9d25c12111a5e"}, - {file = "MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_i686.whl", hash = "sha256:cd5df75523866410809ca100dc9681e301e3c27567cf498077e8551b6d20e42f"}, - {file = "MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl", hash = "sha256:43a55c2930bbc139570ac2452adf3d70cdbb3cfe5912c71cdce1c2c6bbd9c5d1"}, - {file = "MarkupSafe-1.1.1-cp34-cp34m-macosx_10_6_intel.whl", hash = "sha256:1027c282dad077d0bae18be6794e6b6b8c91d58ed8a8d89a89d59693b9131db5"}, - {file = "MarkupSafe-1.1.1-cp34-cp34m-manylinux1_i686.whl", hash = "sha256:62fe6c95e3ec8a7fad637b7f3d372c15ec1caa01ab47926cfdf7a75b40e0eac1"}, - {file = "MarkupSafe-1.1.1-cp34-cp34m-manylinux1_x86_64.whl", hash = "sha256:88e5fcfb52ee7b911e8bb6d6aa2fd21fbecc674eadd44118a9cc3863f938e735"}, - {file = "MarkupSafe-1.1.1-cp34-cp34m-win32.whl", hash = "sha256:ade5e387d2ad0d7ebf59146cc00c8044acbd863725f887353a10df825fc8ae21"}, - {file = "MarkupSafe-1.1.1-cp34-cp34m-win_amd64.whl", hash = "sha256:09c4b7f37d6c648cb13f9230d847adf22f8171b1ccc4d5682398e77f40309235"}, - {file = "MarkupSafe-1.1.1-cp35-cp35m-macosx_10_6_intel.whl", hash = "sha256:79855e1c5b8da654cf486b830bd42c06e8780cea587384cf6545b7d9ac013a0b"}, - {file = "MarkupSafe-1.1.1-cp35-cp35m-manylinux1_i686.whl", hash = "sha256:c8716a48d94b06bb3b2524c2b77e055fb313aeb4ea620c8dd03a105574ba704f"}, - {file = "MarkupSafe-1.1.1-cp35-cp35m-manylinux1_x86_64.whl", hash = "sha256:7c1699dfe0cf8ff607dbdcc1e9b9af1755371f92a68f706051cc8c37d447c905"}, - {file = "MarkupSafe-1.1.1-cp35-cp35m-win32.whl", hash = "sha256:6dd73240d2af64df90aa7c4e7481e23825ea70af4b4922f8ede5b9e35f78a3b1"}, - {file = "MarkupSafe-1.1.1-cp35-cp35m-win_amd64.whl", hash = "sha256:9add70b36c5666a2ed02b43b335fe19002ee5235efd4b8a89bfcf9005bebac0d"}, - {file = "MarkupSafe-1.1.1-cp36-cp36m-macosx_10_6_intel.whl", hash = "sha256:24982cc2533820871eba85ba648cd53d8623687ff11cbb805be4ff7b4c971aff"}, - {file = "MarkupSafe-1.1.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:d53bc011414228441014aa71dbec320c66468c1030aae3a6e29778a3382d96e5"}, - {file = "MarkupSafe-1.1.1-cp36-cp36m-manylinux1_i686.whl", hash = "sha256:00bc623926325b26bb9605ae9eae8a215691f33cae5df11ca5424f06f2d1f473"}, - {file = "MarkupSafe-1.1.1-cp36-cp36m-manylinux1_x86_64.whl", hash = "sha256:717ba8fe3ae9cc0006d7c451f0bb265ee07739daf76355d06366154ee68d221e"}, - {file = "MarkupSafe-1.1.1-cp36-cp36m-manylinux2010_i686.whl", hash = "sha256:3b8a6499709d29c2e2399569d96719a1b21dcd94410a586a18526b143ec8470f"}, - {file = "MarkupSafe-1.1.1-cp36-cp36m-manylinux2010_x86_64.whl", hash = "sha256:84dee80c15f1b560d55bcfe6d47b27d070b4681c699c572af2e3c7cc90a3b8e0"}, - {file = "MarkupSafe-1.1.1-cp36-cp36m-manylinux2014_aarch64.whl", hash = 
"sha256:b1dba4527182c95a0db8b6060cc98ac49b9e2f5e64320e2b56e47cb2831978c7"}, - {file = "MarkupSafe-1.1.1-cp36-cp36m-win32.whl", hash = "sha256:535f6fc4d397c1563d08b88e485c3496cf5784e927af890fb3c3aac7f933ec66"}, - {file = "MarkupSafe-1.1.1-cp36-cp36m-win_amd64.whl", hash = "sha256:b1282f8c00509d99fef04d8ba936b156d419be841854fe901d8ae224c59f0be5"}, - {file = "MarkupSafe-1.1.1-cp37-cp37m-macosx_10_6_intel.whl", hash = "sha256:8defac2f2ccd6805ebf65f5eeb132adcf2ab57aa11fdf4c0dd5169a004710e7d"}, - {file = "MarkupSafe-1.1.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:bf5aa3cbcfdf57fa2ee9cd1822c862ef23037f5c832ad09cfea57fa846dec193"}, - {file = "MarkupSafe-1.1.1-cp37-cp37m-manylinux1_i686.whl", hash = "sha256:46c99d2de99945ec5cb54f23c8cd5689f6d7177305ebff350a58ce5f8de1669e"}, - {file = "MarkupSafe-1.1.1-cp37-cp37m-manylinux1_x86_64.whl", hash = "sha256:ba59edeaa2fc6114428f1637ffff42da1e311e29382d81b339c1817d37ec93c6"}, - {file = "MarkupSafe-1.1.1-cp37-cp37m-manylinux2010_i686.whl", hash = "sha256:6fffc775d90dcc9aed1b89219549b329a9250d918fd0b8fa8d93d154918422e1"}, - {file = "MarkupSafe-1.1.1-cp37-cp37m-manylinux2010_x86_64.whl", hash = "sha256:a6a744282b7718a2a62d2ed9d993cad6f5f585605ad352c11de459f4108df0a1"}, - {file = "MarkupSafe-1.1.1-cp37-cp37m-manylinux2014_aarch64.whl", hash = "sha256:195d7d2c4fbb0ee8139a6cf67194f3973a6b3042d742ebe0a9ed36d8b6f0c07f"}, - {file = "MarkupSafe-1.1.1-cp37-cp37m-win32.whl", hash = "sha256:b00c1de48212e4cc9603895652c5c410df699856a2853135b3967591e4beebc2"}, - {file = "MarkupSafe-1.1.1-cp37-cp37m-win_amd64.whl", hash = "sha256:9bf40443012702a1d2070043cb6291650a0841ece432556f784f004937f0f32c"}, - {file = "MarkupSafe-1.1.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6788b695d50a51edb699cb55e35487e430fa21f1ed838122d722e0ff0ac5ba15"}, - {file = "MarkupSafe-1.1.1-cp38-cp38-manylinux1_i686.whl", hash = "sha256:cdb132fc825c38e1aeec2c8aa9338310d29d337bebbd7baa06889d09a60a1fa2"}, - {file = "MarkupSafe-1.1.1-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:13d3144e1e340870b25e7b10b98d779608c02016d5184cfb9927a9f10c689f42"}, - {file = "MarkupSafe-1.1.1-cp38-cp38-manylinux2010_i686.whl", hash = "sha256:acf08ac40292838b3cbbb06cfe9b2cb9ec78fce8baca31ddb87aaac2e2dc3bc2"}, - {file = "MarkupSafe-1.1.1-cp38-cp38-manylinux2010_x86_64.whl", hash = "sha256:d9be0ba6c527163cbed5e0857c451fcd092ce83947944d6c14bc95441203f032"}, - {file = "MarkupSafe-1.1.1-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:caabedc8323f1e93231b52fc32bdcde6db817623d33e100708d9a68e1f53b26b"}, - {file = "MarkupSafe-1.1.1-cp38-cp38-win32.whl", hash = "sha256:596510de112c685489095da617b5bcbbac7dd6384aeebeda4df6025d0256a81b"}, - {file = "MarkupSafe-1.1.1-cp38-cp38-win_amd64.whl", hash = "sha256:e8313f01ba26fbbe36c7be1966a7b7424942f670f38e666995b88d012765b9be"}, - {file = "MarkupSafe-1.1.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:d73a845f227b0bfe8a7455ee623525ee656a9e2e749e4742706d80a6065d5e2c"}, - {file = "MarkupSafe-1.1.1-cp39-cp39-manylinux1_i686.whl", hash = "sha256:98bae9582248d6cf62321dcb52aaf5d9adf0bad3b40582925ef7c7f0ed85fceb"}, - {file = "MarkupSafe-1.1.1-cp39-cp39-manylinux1_x86_64.whl", hash = "sha256:2beec1e0de6924ea551859edb9e7679da6e4870d32cb766240ce17e0a0ba2014"}, - {file = "MarkupSafe-1.1.1-cp39-cp39-manylinux2010_i686.whl", hash = "sha256:7fed13866cf14bba33e7176717346713881f56d9d2bcebab207f7a036f41b850"}, - {file = "MarkupSafe-1.1.1-cp39-cp39-manylinux2010_x86_64.whl", hash = "sha256:6f1e273a344928347c1290119b493a1f0303c52f5a5eae5f16d74f48c15d4a85"}, - {file = 
"MarkupSafe-1.1.1-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:feb7b34d6325451ef96bc0e36e1a6c0c1c64bc1fbec4b854f4529e51887b1621"}, - {file = "MarkupSafe-1.1.1-cp39-cp39-win32.whl", hash = "sha256:22c178a091fc6630d0d045bdb5992d2dfe14e3259760e713c490da5323866c39"}, - {file = "MarkupSafe-1.1.1-cp39-cp39-win_amd64.whl", hash = "sha256:b7d644ddb4dbd407d31ffb699f1d140bc35478da613b441c582aeb7c43838dd8"}, - {file = "MarkupSafe-1.1.1.tar.gz", hash = "sha256:29872e92839765e546828bb7754a68c418d927cd064fd4708fab9fe9c8bb116b"}, + {file = "MarkupSafe-2.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:f9081981fe268bd86831e5c75f7de206ef275defcb82bc70740ae6dc507aee51"}, + {file = "MarkupSafe-2.0.1-cp36-cp36m-manylinux1_i686.whl", hash = "sha256:0955295dd5eec6cb6cc2fe1698f4c6d84af2e92de33fbcac4111913cd100a6ff"}, + {file = "MarkupSafe-2.0.1-cp36-cp36m-manylinux1_x86_64.whl", hash = "sha256:0446679737af14f45767963a1a9ef7620189912317d095f2d9ffa183a4d25d2b"}, + {file = "MarkupSafe-2.0.1-cp36-cp36m-manylinux2010_i686.whl", hash = "sha256:f826e31d18b516f653fe296d967d700fddad5901ae07c622bb3705955e1faa94"}, + {file = "MarkupSafe-2.0.1-cp36-cp36m-manylinux2010_x86_64.whl", hash = "sha256:fa130dd50c57d53368c9d59395cb5526eda596d3ffe36666cd81a44d56e48872"}, + {file = "MarkupSafe-2.0.1-cp36-cp36m-manylinux2014_aarch64.whl", hash = "sha256:905fec760bd2fa1388bb5b489ee8ee5f7291d692638ea5f67982d968366bef9f"}, + {file = "MarkupSafe-2.0.1-cp36-cp36m-win32.whl", hash = "sha256:6c4ca60fa24e85fe25b912b01e62cb969d69a23a5d5867682dd3e80b5b02581d"}, + {file = "MarkupSafe-2.0.1-cp36-cp36m-win_amd64.whl", hash = "sha256:b2f4bf27480f5e5e8ce285a8c8fd176c0b03e93dcc6646477d4630e83440c6a9"}, + {file = "MarkupSafe-2.0.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:0717a7390a68be14b8c793ba258e075c6f4ca819f15edfc2a3a027c823718567"}, + {file = "MarkupSafe-2.0.1-cp37-cp37m-manylinux1_i686.whl", hash = "sha256:6557b31b5e2c9ddf0de32a691f2312a32f77cd7681d8af66c2692efdbef84c18"}, + {file = "MarkupSafe-2.0.1-cp37-cp37m-manylinux1_x86_64.whl", hash = "sha256:49e3ceeabbfb9d66c3aef5af3a60cc43b85c33df25ce03d0031a608b0a8b2e3f"}, + {file = "MarkupSafe-2.0.1-cp37-cp37m-manylinux2010_i686.whl", hash = "sha256:d7f9850398e85aba693bb640262d3611788b1f29a79f0c93c565694658f4071f"}, + {file = "MarkupSafe-2.0.1-cp37-cp37m-manylinux2010_x86_64.whl", hash = "sha256:6a7fae0dd14cf60ad5ff42baa2e95727c3d81ded453457771d02b7d2b3f9c0c2"}, + {file = "MarkupSafe-2.0.1-cp37-cp37m-manylinux2014_aarch64.whl", hash = "sha256:b7f2d075102dc8c794cbde1947378051c4e5180d52d276987b8d28a3bd58c17d"}, + {file = "MarkupSafe-2.0.1-cp37-cp37m-win32.whl", hash = "sha256:a30e67a65b53ea0a5e62fe23682cfe22712e01f453b95233b25502f7c61cb415"}, + {file = "MarkupSafe-2.0.1-cp37-cp37m-win_amd64.whl", hash = "sha256:611d1ad9a4288cf3e3c16014564df047fe08410e628f89805e475368bd304914"}, + {file = "MarkupSafe-2.0.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:be98f628055368795d818ebf93da628541e10b75b41c559fdf36d104c5787066"}, + {file = "MarkupSafe-2.0.1-cp38-cp38-manylinux1_i686.whl", hash = "sha256:1d609f577dc6e1aa17d746f8bd3c31aa4d258f4070d61b2aa5c4166c1539de35"}, + {file = "MarkupSafe-2.0.1-cp38-cp38-manylinux1_x86_64.whl", hash = "sha256:7d91275b0245b1da4d4cfa07e0faedd5b0812efc15b702576d103293e252af1b"}, + {file = "MarkupSafe-2.0.1-cp38-cp38-manylinux2010_i686.whl", hash = "sha256:01a9b8ea66f1658938f65b93a85ebe8bc016e6769611be228d797c9d998dd298"}, + {file = "MarkupSafe-2.0.1-cp38-cp38-manylinux2010_x86_64.whl", hash = 
"sha256:47ab1e7b91c098ab893b828deafa1203de86d0bc6ab587b160f78fe6c4011f75"}, + {file = "MarkupSafe-2.0.1-cp38-cp38-manylinux2014_aarch64.whl", hash = "sha256:97383d78eb34da7e1fa37dd273c20ad4320929af65d156e35a5e2d89566d9dfb"}, + {file = "MarkupSafe-2.0.1-cp38-cp38-win32.whl", hash = "sha256:023cb26ec21ece8dc3907c0e8320058b2e0cb3c55cf9564da612bc325bed5e64"}, + {file = "MarkupSafe-2.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:984d76483eb32f1bcb536dc27e4ad56bba4baa70be32fa87152832cdd9db0833"}, + {file = "MarkupSafe-2.0.1-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:2ef54abee730b502252bcdf31b10dacb0a416229b72c18b19e24a4509f273d26"}, + {file = "MarkupSafe-2.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:3c112550557578c26af18a1ccc9e090bfe03832ae994343cfdacd287db6a6ae7"}, + {file = "MarkupSafe-2.0.1-cp39-cp39-manylinux1_i686.whl", hash = "sha256:53edb4da6925ad13c07b6d26c2a852bd81e364f95301c66e930ab2aef5b5ddd8"}, + {file = "MarkupSafe-2.0.1-cp39-cp39-manylinux1_x86_64.whl", hash = "sha256:f5653a225f31e113b152e56f154ccbe59eeb1c7487b39b9d9f9cdb58e6c79dc5"}, + {file = "MarkupSafe-2.0.1-cp39-cp39-manylinux2010_i686.whl", hash = "sha256:4efca8f86c54b22348a5467704e3fec767b2db12fc39c6d963168ab1d3fc9135"}, + {file = "MarkupSafe-2.0.1-cp39-cp39-manylinux2010_x86_64.whl", hash = "sha256:ab3ef638ace319fa26553db0624c4699e31a28bb2a835c5faca8f8acf6a5a902"}, + {file = "MarkupSafe-2.0.1-cp39-cp39-manylinux2014_aarch64.whl", hash = "sha256:f8ba0e8349a38d3001fae7eadded3f6606f0da5d748ee53cc1dab1d6527b9509"}, + {file = "MarkupSafe-2.0.1-cp39-cp39-win32.whl", hash = "sha256:10f82115e21dc0dfec9ab5c0223652f7197feb168c940f3ef61563fc2d6beb74"}, + {file = "MarkupSafe-2.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:693ce3f9e70a6cf7d2fb9e6c9d8b204b6b39897a2c4a1aa65728d5ac97dcc1d8"}, + {file = "MarkupSafe-2.0.1.tar.gz", hash = "sha256:594c67807fb16238b30c44bdf74f36c02cdf22d1c8cda91ef8a0ed8dabf5620a"}, ] mccabe = [ {file = "mccabe-0.6.1-py2.py3-none-any.whl", hash = "sha256:ab8a6258860da4b6677da4bd2fe5dc2c659cff31b3ee4f7f5d64e79735b80d42"}, {file = "mccabe-0.6.1.tar.gz", hash = "sha256:dd8d182285a0fe56bace7f45b5e7d1a6ebcbf524e8f3bd87eb0f125271b8831f"}, ] minos-microservice-common = [ - {file = "minos_microservice_common-0.0.12-py3-none-any.whl", hash = "sha256:9dac9b9062305ae91a9de132c864bf0a70ed3e8c7ec60c8c95fbec53afbb98b5"}, - {file = "minos_microservice_common-0.0.12.tar.gz", hash = "sha256:6e6f4e4ba1215196be0c7bdb0d4c75cbc3055f049b9509a65b086db4c9f41aba"}, + {file = "minos_microservice_common-0.0.14-py3-none-any.whl", hash = "sha256:c23e8d94f69bc8bd3a5cd61c8dd2a42cb195098fe8578ae7db632f8dbf82cb58"}, + {file = "minos_microservice_common-0.0.14.tar.gz", hash = "sha256:4afa38f82f445a776893195d45a7a5aeb3d17c0111bf62a79e76b7f1abf58be6"}, ] multidict = [ {file = "multidict-5.1.0-cp36-cp36m-macosx_10_14_x86_64.whl", hash = "sha256:b7993704f1a4b204e71debe6095150d43b2ee6150fa4f44d6d966ec356a8d61f"}, @@ -1214,6 +1324,10 @@ pathspec = [ {file = "pathspec-0.8.1-py2.py3-none-any.whl", hash = "sha256:aa0cb481c4041bf52ffa7b0d8fa6cd3e88a2ca4879c533c9153882ee2556790d"}, {file = "pathspec-0.8.1.tar.gz", hash = "sha256:86379d6b86d75816baba717e64b1a3a3469deb93bb76d613c9ce79edc5cb68fd"}, ] +pbr = [ + {file = "pbr-5.6.0-py2.py3-none-any.whl", hash = "sha256:c68c661ac5cc81058ac94247278eeda6d2e6aecb3e227b0387c30d277e7ef8d4"}, + {file = "pbr-5.6.0.tar.gz", hash = "sha256:42df03e7797b796625b1029c0400279c7c34fd7df24a7d7818a1abb5b38710dd"}, +] pluggy = [ {file = "pluggy-0.13.1-py2.py3-none-any.whl", hash = 
"sha256:966c145cd83c96502c3c3868f50408687b38434af77734af1e9ca461a4081d2d"}, {file = "pluggy-0.13.1.tar.gz", hash = "sha256:15b2acde666561e1298d71b523007ed7364de07029219b604cf808bfa1c765b0"}, @@ -1370,16 +1484,24 @@ requests = [ {file = "requests-2.25.1.tar.gz", hash = "sha256:27973dd4a904a4f13b263a19c866c13b92a39ed1c964655f025f3f8d3d75b804"}, ] six = [ - {file = "six-1.16.0-py2.py3-none-any.whl", hash = "sha256:8abb2f1d86890a2dfb989f9a77cfcfd3e47c2a354b01111771326f8aa26e0254"}, - {file = "six-1.16.0.tar.gz", hash = "sha256:1e61c37477a1626458e36f7b1d82aa5c9b094fa4802892072e49de9c60c4c926"}, + {file = "six-1.15.0-py2.py3-none-any.whl", hash = "sha256:8b74bedcbbbaca38ff6d7491d76f2b06b3592611af620f8426e82dddb04a5ced"}, + {file = "six-1.15.0.tar.gz", hash = "sha256:30639c035cdb23534cd4aa2dd52c3bf48f06e5f4a941509c8bafd8ce11080259"}, ] snowballstemmer = [ {file = "snowballstemmer-2.1.0-py2.py3-none-any.whl", hash = "sha256:b51b447bea85f9968c13b650126a888aabd4cb4463fca868ec596826325dedc2"}, {file = "snowballstemmer-2.1.0.tar.gz", hash = "sha256:e997baa4f2e9139951b6f4c631bad912dfd3c792467e2f03d7239464af90e914"}, ] sphinx = [ - {file = "Sphinx-4.0.1-py3-none-any.whl", hash = "sha256:b2566f5f339737a6ef37198c47d56de1f4a746c722bebdb2fe045c34bfd8b9d0"}, - {file = "Sphinx-4.0.1.tar.gz", hash = "sha256:cf5104777571b2b7f06fa88ee08fade24563f4a0594cf4bd17d31c47b8740b4c"}, + {file = "Sphinx-4.0.2-py3-none-any.whl", hash = "sha256:d1cb10bee9c4231f1700ec2e24a91be3f3a3aba066ea4ca9f3bbe47e59d5a1d4"}, + {file = "Sphinx-4.0.2.tar.gz", hash = "sha256:b5c2ae4120bf00c799ba9b3699bc895816d272d120080fbc967292f29b52b48c"}, +] +sphinx-autodoc-typehints = [ + {file = "sphinx-autodoc-typehints-1.12.0.tar.gz", hash = "sha256:193617d9dbe0847281b1399d369e74e34cd959c82e02c7efde077fca908a9f52"}, + {file = "sphinx_autodoc_typehints-1.12.0-py3-none-any.whl", hash = "sha256:5e81776ec422dd168d688ab60f034fccfafbcd94329e9537712c93003bddc04a"}, +] +sphinxcontrib-apidoc = [ + {file = "sphinxcontrib-apidoc-0.3.0.tar.gz", hash = "sha256:729bf592cf7b7dd57c4c05794f732dc026127275d785c2a5494521fdde773fb9"}, + {file = "sphinxcontrib_apidoc-0.3.0-py2.py3-none-any.whl", hash = "sha256:6671a46b2c6c5b0dca3d8a147849d159065e50443df79614f921b42fbd15cb09"}, ] sphinxcontrib-applehelp = [ {file = "sphinxcontrib-applehelp-1.0.2.tar.gz", hash = "sha256:a072735ec80e7675e3f432fcae8610ecf509c5f1869d17e2eecff44389cdbc58"}, diff --git a/pyproject.toml b/pyproject.toml index bcaca222..5fca890a 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -1,6 +1,6 @@ [tool.poetry] name = "minos_microservice_networks" -version = "0.0.2" +version = "0.0.3" description = "Python Package with the common network classes and utilities used in Minos Microservice." 
readme = "README.md" repository = "https://github.com/clariteia/minos_microservice_network" @@ -31,11 +31,12 @@ include = [ [tool.poetry.dependencies] python = "^3.9" -minos-microservice-common = "^0.0.12" +minos-microservice-common = "^0.0" aiokafka = "^0.7.0" aiomisc = "^14.0.3" aiopg = "^1.2.1" aiohttp = "^3.7.4" +dependency-injector = "^4.32.2" [tool.poetry.dev-dependencies] black = "^19.10b" @@ -45,6 +46,8 @@ coverage = "^5.5" flake8 = "^3.9.2" Sphinx = "^4.0.1" pre-commit = "^2.12.1" +sphinx-autodoc-typehints = "^1.12.0" +sphinxcontrib-apidoc = "^0.3.0" [build-system] requires = ["poetry-core>=1.0.0"] diff --git a/tests/test_config.yml b/tests/test_config.yml index 1894c1f6..ced66ef3 100644 --- a/tests/test_config.yml +++ b/tests/test_config.yml @@ -4,16 +4,16 @@ rest: host: localhost port: 8080 endpoints: - - name: AddOrder - route: /order - method: POST - controller: tests.services.TestRestService.RestService - action: add_order - - name: GetOrder - route: /order - method: GET - controller: tests.services.TestRestService.RestService - action: get_order + - name: AddOrder + route: /order + method: POST + controller: tests.services.TestRestService.RestService + action: add_order + - name: GetOrder + route: /order + method: GET + controller: tests.services.TestRestService.RestService + action: get_order repository: database: order_db user: minos @@ -30,12 +30,12 @@ events: broker: localhost port: 9092 items: - - name: TicketAdded - controller: tests.services.CqrsTestService.CqrsService - action: ticket_added - - name: TicketDeleted - controller: tests.services.CqrsTestService.CqrsService - action: ticket_deleted + - name: TicketAdded + controller: tests.services.CqrsTestService.CqrsService + action: ticket_added + - name: TicketDeleted + controller: tests.services.CqrsTestService.CqrsService + action: ticket_deleted queue: database: order_db user: minos @@ -48,18 +48,18 @@ commands: broker: localhost port: 9092 items: - - name: AddOrder - controller: tests.services.CommandTestService.CommandService - action: add_order - - name: DeleteOrder - controller: tests.services.CommandTestService.CommandService - action: delete_order - - name: UpdateOrder - controller: tests.services.CommandTestService.CommandService - action: update_order - - name: GetOrder - controller: tests.service.CommandTestService.CommandService - action: get_order + - name: AddOrder + controller: tests.services.CommandTestService.CommandService + action: add_order + - name: DeleteOrder + controller: tests.services.CommandTestService.CommandService + action: delete_order + - name: UpdateOrder + controller: tests.services.CommandTestService.CommandService + action: update_order + - name: GetOrder + controller: tests.service.CommandTestService.CommandService + action: get_order queue: database: order_db user: minos @@ -69,13 +69,15 @@ commands: records: 10 retry: 2 saga: + storage: + path: "./order.lmdb" items: - - name: AddOrder - controller: tests.services.SagaTestService.SagaService - action: add_order - - name: DeleteOrder - controller: tests.services.SagaTestService.SagaService - action: delete_order + - name: AddOrder + controller: tests.services.SagaTestService.SagaService + action: add_order + - name: DeleteOrder + controller: tests.services.SagaTestService.SagaService + action: delete_order queue: database: order_db user: minos diff --git a/tests/test_networks/test_broker/__init__.py b/tests/test_networks/test_brokers/__init__.py similarity index 100% rename from tests/test_networks/test_broker/__init__.py rename 
to tests/test_networks/test_brokers/__init__.py diff --git a/tests/test_networks/test_brokers/test_command_replies.py b/tests/test_networks/test_brokers/test_command_replies.py new file mode 100644 index 00000000..d2b8d757 --- /dev/null +++ b/tests/test_networks/test_brokers/test_command_replies.py @@ -0,0 +1,116 @@ +""" +Copyright (C) 2021 Clariteia SL + +This file is part of minos framework. + +Minos framework can not be copied and/or distributed without the express permission of Clariteia SL. +""" +import unittest + +import aiopg + +from minos.common import ( + MinosConfig, + MinosConfigException, +) +from minos.common.testing import ( + PostgresAsyncTestCase, +) +from minos.networks import ( + CommandReplyBroker, + Producer, +) +from tests.utils import ( + BASE_PATH, + NaiveAggregate, +) + + +class TestCommandReplyBroker(PostgresAsyncTestCase): + CONFIG_FILE_PATH = BASE_PATH / "test_config.yml" + + def test_from_config_default(self): + broker = CommandReplyBroker.from_config( + "CommandBroker", saga_id="9347839473kfslf", task_id="92839283hjijh232", config=self.config + ) + self.assertIsInstance(broker, CommandReplyBroker) + + def test_from_config_raises(self): + with self.assertRaises(MinosConfigException): + CommandReplyBroker.from_config() + + async def test_commands_broker_insertion(self): + broker = CommandReplyBroker.from_config( + "CommandBroker", config=self.config, saga_id="9347839473kfslf", task_id="92839283hjijh232" + ) + await broker.setup() + + item = NaiveAggregate(test_id=1, test=2, id=1, version=1) + + queue_id = await broker.send_one(item) + assert queue_id > 0 + + async def test_if_commands_was_deleted(self): + broker = CommandReplyBroker.from_config( + "CommandReplyBroker-Delete", config=self.config, saga_id="9347839473kfslf", task_id="92839283hjijh232" + ) + await broker.setup() + + item = NaiveAggregate(test_id=1, test=2, id=1, version=1) + + queue_id_1 = await broker.send_one(item) + queue_id_2 = await broker.send_one(item) + + await Producer.from_config(config=self.config).dispatch() + + async with aiopg.connect(**self.events_queue_db) as connection: + async with connection.cursor() as cursor: + await cursor.execute( + "SELECT COUNT(*) FROM producer_queue WHERE topic = '%s'" % "CommandReplyBroker-Delete" + ) + records = await cursor.fetchone() + + assert queue_id_1 > 0 + assert queue_id_2 > 0 + assert records[0] == 0 + + async def test_if_commands_retry_was_incremented(self): + broker = CommandReplyBroker.from_config( + "CommandReplyBroker-Delete", config=self.config, saga_id="9347839473kfslf", task_id="92839283hjijh232", + ) + await broker.setup() + + item = NaiveAggregate(test_id=1, test=2, id=1, version=1) + + queue_id_1 = await broker.send_one(item) + queue_id_2 = await broker.send_one(item) + + config = MinosConfig( + path=BASE_PATH / "wrong_test_config.yml", + events_queue_database=self.config.events.queue.database, + events_queue_user=self.config.events.queue.user, + ) + await Producer.from_config(config=config).dispatch() + + async with aiopg.connect(**self.events_queue_db) as connection: + async with connection.cursor() as cursor: + await cursor.execute( + "SELECT COUNT(*) FROM producer_queue WHERE topic = '%s'" % "CommandReplyBroker-Delete" + ) + records = await cursor.fetchone() + + await cursor.execute("SELECT retry FROM producer_queue WHERE id=%d;" % queue_id_1) + retry_1 = await cursor.fetchone() + + await cursor.execute("SELECT retry FROM producer_queue WHERE id=%d;" % queue_id_2) + retry_2 = await cursor.fetchone() + + assert queue_id_1 > 0 + 
assert queue_id_2 > 0 + assert records[0] == 2 + assert retry_1[0] > 0 + assert retry_2[0] > 0 + + +if __name__ == "__main__": + unittest.main() diff --git a/tests/test_networks/test_broker/test_command.py b/tests/test_networks/test_brokers/test_commands.py similarity index 79% rename from tests/test_networks/test_broker/test_command.py rename to tests/test_networks/test_brokers/test_commands.py index d3a8df67..be762b15 100644 --- a/tests/test_networks/test_broker/test_command.py +++ b/tests/test_networks/test_brokers/test_commands.py @@ -11,13 +11,14 @@ from minos.common import ( MinosConfig, + MinosConfigException, ) from minos.common.testing import ( PostgresAsyncTestCase, ) from minos.networks import ( - MinosCommandBroker, - MinosQueueDispatcher, + CommandBroker, + Producer, ) from tests.utils import ( BASE_PATH, @@ -25,11 +26,25 @@ ) -class TestMinosCommandBroker(PostgresAsyncTestCase): +class TestCommandBroker(PostgresAsyncTestCase): CONFIG_FILE_PATH = BASE_PATH / "test_config.yml" + def test_from_config_default(self): + broker = CommandBroker.from_config( + "CommandBroker", + saga_id="9347839473kfslf", + task_id="92839283hjijh232", + reply_on="test_reply_on", + config=self.config, + ) + self.assertIsInstance(broker, CommandBroker) + + def test_from_config_raises(self): + with self.assertRaises(MinosConfigException): + CommandBroker.from_config() + async def test_commands_broker_insertion(self): - broker = MinosCommandBroker.from_config( + broker = CommandBroker.from_config( "CommandBroker", config=self.config, saga_id="9347839473kfslf", @@ -44,7 +59,7 @@ async def test_commands_broker_insertion(self): assert queue_id > 0 async def test_if_commands_was_deleted(self): - broker = MinosCommandBroker.from_config( + broker = CommandBroker.from_config( "CommandBroker-Delete", config=self.config, saga_id="9347839473kfslf", @@ -58,7 +73,7 @@ async def test_if_commands_was_deleted(self): queue_id_1 = await broker.send_one(item) queue_id_2 = await broker.send_one(item) - await MinosQueueDispatcher.from_config(config=self.config).dispatch() + await Producer.from_config(config=self.config).dispatch() async with aiopg.connect(**self.events_queue_db) as connection: async with connection.cursor() as cursor: @@ -70,7 +85,7 @@ async def test_if_commands_was_deleted(self): assert records[0] == 0 async def test_if_commands_retry_was_incremented(self): - broker = MinosCommandBroker.from_config( + broker = CommandBroker.from_config( "CommandBroker-Delete", config=self.config, saga_id="9347839473kfslf", @@ -89,7 +104,7 @@ async def test_if_commands_retry_was_incremented(self): events_queue_database=self.config.events.queue.database, events_queue_user=self.config.events.queue.user, ) - await MinosQueueDispatcher.from_config(config=config).dispatch() + await Producer.from_config(config=config).dispatch() async with aiopg.connect(**self.events_queue_db) as connection: async with connection.cursor() as cursor: diff --git a/tests/test_networks/test_broker/test_events.py b/tests/test_networks/test_brokers/test_events.py similarity index 78% rename from tests/test_networks/test_broker/test_events.py rename to tests/test_networks/test_brokers/test_events.py index 4e32b666..695ff9a3 100644 --- a/tests/test_networks/test_broker/test_events.py +++ b/tests/test_networks/test_brokers/test_events.py @@ -4,13 +4,14 @@ from minos.common import ( MinosConfig, + MinosConfigException, ) from minos.common.testing import ( PostgresAsyncTestCase, ) from minos.networks import ( - MinosEventBroker, - MinosQueueDispatcher, + 
EventBroker, + Producer, ) from tests.utils import ( BASE_PATH, @@ -18,11 +19,18 @@ ) -class TestMinosEventBroker(PostgresAsyncTestCase): +class TestEventBroker(PostgresAsyncTestCase): CONFIG_FILE_PATH = BASE_PATH / "test_config.yml" + def test_from_config_default(self): + self.assertIsInstance(EventBroker.from_config("EventBroker", config=self.config), EventBroker) + + def test_from_config_raises(self): + with self.assertRaises(MinosConfigException): + EventBroker.from_config() + async def test_if_queue_table_exists(self): - broker = MinosEventBroker.from_config("EventBroker", config=self.config) + broker = EventBroker.from_config("EventBroker", config=self.config) await broker.setup() async with aiopg.connect(**self.events_queue_db) as connection: @@ -39,7 +47,7 @@ async def test_if_queue_table_exists(self): assert ret == [(1,)] async def test_events_broker_insertion(self): - broker = MinosEventBroker.from_config("EventBroker", config=self.config) + broker = EventBroker.from_config("EventBroker", config=self.config) await broker.setup() item = NaiveAggregate(test_id=1, test=2, id=1, version=1) @@ -48,14 +56,14 @@ async def test_events_broker_insertion(self): assert queue_id > 0 async def test_if_events_was_deleted(self): - broker = MinosEventBroker.from_config("EventBroker-Delete", config=self.config) + broker = EventBroker.from_config("EventBroker-Delete", config=self.config) await broker.setup() item = NaiveAggregate(test_id=1, test=2, id=1, version=1) queue_id_1 = await broker.send_one(item) queue_id_2 = await broker.send_one(item) - await MinosQueueDispatcher.from_config(config=self.config).dispatch() + await Producer.from_config(config=self.config).dispatch() async with aiopg.connect(**self.events_queue_db) as connection: async with connection.cursor() as cursor: @@ -67,7 +75,7 @@ async def test_if_events_was_deleted(self): assert records[0] == 0 async def test_if_events_retry_was_incremented(self): - broker = MinosEventBroker.from_config("EventBroker-Delete", config=self.config) + broker = EventBroker.from_config("EventBroker-Delete", config=self.config) await broker.setup() item = NaiveAggregate(test_id=1, test=2, id=1, version=1) @@ -81,7 +89,7 @@ async def test_if_events_retry_was_incremented(self): events_queue_user=self.config.commands.queue.user, ) - await MinosQueueDispatcher.from_config(config=config).dispatch() + await Producer.from_config(config=config).dispatch() async with aiopg.connect(**self.events_queue_db) as connection: async with connection.cursor() as cursor: diff --git a/tests/test_networks/test_broker/test_dispatcher.py b/tests/test_networks/test_brokers/test_producers.py similarity index 61% rename from tests/test_networks/test_broker/test_dispatcher.py rename to tests/test_networks/test_brokers/test_producers.py index be51f22d..ef8c25b5 100644 --- a/tests/test_networks/test_broker/test_dispatcher.py +++ b/tests/test_networks/test_brokers/test_producers.py @@ -7,31 +7,34 @@ PostgresAsyncTestCase, ) from minos.networks import ( - MinosQueueDispatcher, + Producer, ) from tests.utils import ( BASE_PATH, ) -class TestQueueDispatcher(PostgresAsyncTestCase): +class TestProducer(PostgresAsyncTestCase): CONFIG_FILE_PATH = BASE_PATH / "test_config.yml" def test_from_config(self): - dispatcher = MinosQueueDispatcher.from_config(config=self.config) - self.assertIsInstance(dispatcher, MinosQueueDispatcher) + dispatcher = Producer.from_config(config=self.config) + self.assertIsInstance(dispatcher, Producer) + + def test_from_config_default(self): + 
self.assertIsInstance(Producer.from_config(config=self.config), Producer) def test_from_config_raises(self): with self.assertRaises(MinosConfigException): - MinosQueueDispatcher.from_config() + Producer.from_config() async def test_select(self): - dispatcher = MinosQueueDispatcher.from_config(config=self.config) + dispatcher = Producer.from_config(config=self.config) await dispatcher.setup() self.assertEqual([], [v async for v in dispatcher.select()]) async def test_send_to_kafka_ok(self): - dispatcher = MinosQueueDispatcher.from_config(config=self.config) + dispatcher = Producer.from_config(config=self.config) response = await dispatcher.publish(topic="TestKafkaSend", message=bytes()) assert response is True diff --git a/tests/test_networks/test_broker/test_service.py b/tests/test_networks/test_brokers/test_services.py similarity index 64% rename from tests/test_networks/test_broker/test_service.py rename to tests/test_networks/test_brokers/test_services.py index 942dd9a0..14195e30 100644 --- a/tests/test_networks/test_broker/test_service.py +++ b/tests/test_networks/test_brokers/test_services.py @@ -21,46 +21,40 @@ PostgresAsyncTestCase, ) from minos.networks import ( - MinosQueueDispatcher, - MinosQueueService, + Producer, + ProducerService, ) from tests.utils import ( BASE_PATH, ) -class TestMinosQueueService(PostgresAsyncTestCase): +class TestProducerService(PostgresAsyncTestCase): CONFIG_FILE_PATH = BASE_PATH / "test_config.yml" def test_is_instance(self): - with self.config: - service = MinosQueueService(interval=0.1) - self.assertIsInstance(service, PeriodicService) + service = ProducerService(interval=0.1, config=self.config) + self.assertIsInstance(service, PeriodicService) def test_dispatcher_empty(self): with self.assertRaises(MinosConfigException): - MinosQueueService(interval=0.1) + ProducerService(interval=0.1) def test_dispatcher_config(self): - service = MinosQueueService(interval=0.1, config=self.config) + service = ProducerService(interval=0.1, config=self.config) dispatcher = service.dispatcher - self.assertIsInstance(dispatcher, MinosQueueDispatcher) + self.assertIsInstance(dispatcher, Producer) self.assertFalse(dispatcher.already_setup) - def test_dispatcher_config_context(self): - with self.config: - service = MinosQueueService(interval=0.1) - self.assertIsInstance(service.dispatcher, MinosQueueDispatcher) - async def test_start(self): - service = MinosQueueService(interval=0.1, loop=None, config=self.config) + service = ProducerService(interval=0.1, loop=None, config=self.config) service.dispatcher.setup = MagicMock(side_effect=service.dispatcher.setup) await service.start() self.assertTrue(1, service.dispatcher.setup.call_count) await service.stop() async def test_callback(self): - service = MinosQueueService(interval=0.1, loop=None, config=self.config) + service = ProducerService(interval=0.1, loop=None, config=self.config) await service.dispatcher.setup() service.dispatcher.dispatch = MagicMock(side_effect=service.dispatcher.dispatch) await service.callback() diff --git a/tests/test_networks/test_handler/test_command/test_command_server.py b/tests/test_networks/test_handler/test_command/test_command_server.py deleted file mode 100644 index de29345d..00000000 --- a/tests/test_networks/test_handler/test_command/test_command_server.py +++ /dev/null @@ -1,86 +0,0 @@ -from collections import ( - namedtuple, -) - -from minos.common import ( - Command, -) -from minos.common.testing import ( - PostgresAsyncTestCase, -) -from minos.networks import ( - 
MinosCommandHandlerServer, -) -from tests.utils import ( - BASE_PATH, - NaiveAggregate, -) - - -class TestCommandServer(PostgresAsyncTestCase): - CONFIG_FILE_PATH = BASE_PATH / "test_config.yml" - - def test_from_config(self): - dispatcher = MinosCommandHandlerServer.from_config(config=self.config) - self.assertIsInstance(dispatcher, MinosCommandHandlerServer) - - async def test_none_config(self): - event_server = MinosCommandHandlerServer.from_config(config=None) - - self.assertIsNone(event_server) - - async def test_queue_add(self): - model = NaiveAggregate(test_id=1, test=2, id=1, version=1) - event_instance = Command( - topic="AddOrder", - model=model.classname, - items=[], - saga_id="43434jhij", - task_id="juhjh34", - reply_on="mkk2334", - ) - bin_data = event_instance.avro_bytes - Command.from_avro_bytes(bin_data) - - event_server = MinosCommandHandlerServer.from_config(config=self.config) - await event_server.setup() - - affected_rows, id = await event_server.queue_add(topic=event_instance.topic, partition=0, binary=bin_data) - - assert affected_rows == 1 - assert id > 0 - - async def test_handle_message(self): - event_server = MinosCommandHandlerServer.from_config(config=self.config) - await event_server.setup() - - model = NaiveAggregate(test_id=1, test=2, id=1, version=1) - event_instance = Command( - topic="AddOrder", - model=model.classname, - items=[], - saga_id="43434jhij", - task_id="juhjh34", - reply_on="mkk2334", - ) - bin_data = event_instance.avro_bytes - - Mensaje = namedtuple("Mensaje", ["topic", "partition", "value"]) - - async def consumer(): - yield Mensaje(topic="TicketAdded", partition=0, value=bin_data) - - await event_server.handle_message(consumer()) - - async def test_handle_message_ko(self): - event_server = MinosCommandHandlerServer.from_config(config=self.config) - await event_server.setup() - - bin_data = bytes(b"test") - - Mensaje = namedtuple("Mensaje", ["topic", "partition", "value"]) - - async def consumer(): - yield Mensaje(topic="TicketAdded", partition=0, value=bin_data) - - await event_server.handle_message(consumer()) diff --git a/tests/test_networks/test_handler/test_command/test_command_services.py b/tests/test_networks/test_handler/test_command/test_command_services.py deleted file mode 100644 index 6a94a47c..00000000 --- a/tests/test_networks/test_handler/test_command/test_command_services.py +++ /dev/null @@ -1,56 +0,0 @@ -import unittest -from unittest.mock import ( - MagicMock, -) - -from minos.common.testing import ( - PostgresAsyncTestCase, -) -from minos.networks import ( - MinosCommandPeriodicService, - MinosCommandServerService, -) -from tests.utils import ( - BASE_PATH, -) - - -class TestMinosCommandServices(PostgresAsyncTestCase): - CONFIG_FILE_PATH = BASE_PATH / "test_config.yml" - - async def test_start(self): - service = MinosCommandServerService(loop=None, config=self.config) - - async def _fn(consumer): - self.assertEqual(service.consumer, consumer) - - mock = MagicMock(side_effect=_fn) - service.dispatcher.handle_message = mock - await service.start() - self.assertTrue(1, mock.call_count) - await service.stop() - - -class TestMinosQueueService(PostgresAsyncTestCase): - CONFIG_FILE_PATH = BASE_PATH / "test_config.yml" - - async def test_start(self): - service = MinosCommandPeriodicService(interval=1, loop=None, config=self.config) - mock = MagicMock(side_effect=service.dispatcher.setup) - service.dispatcher.setup = mock - await service.start() - self.assertTrue(1, mock.call_count) - await service.stop() - - async def 
test_callback(self): - service = MinosCommandPeriodicService(interval=1, loop=None, config=self.config) - await service.dispatcher.setup() - mock = MagicMock(side_effect=service.dispatcher.queue_checker) - service.dispatcher.queue_checker = mock - await service.callback() - self.assertEqual(1, mock.call_count) - await service.dispatcher.destroy() - - -if __name__ == "__main__": - unittest.main() diff --git a/tests/test_networks/test_handler/test_command/test_dispatcher.py b/tests/test_networks/test_handler/test_command/test_dispatcher.py deleted file mode 100644 index 11a91d09..00000000 --- a/tests/test_networks/test_handler/test_command/test_dispatcher.py +++ /dev/null @@ -1,157 +0,0 @@ -import datetime - -import aiopg - -from minos.common import ( - Command, -) -from minos.common.testing import ( - PostgresAsyncTestCase, -) -from minos.networks import ( - MinosCommandHandlerDispatcher, - MinosNetworkException, -) -from tests.utils import ( - BASE_PATH, - NaiveAggregate, -) - - -class TestCommandDispatcher(PostgresAsyncTestCase): - CONFIG_FILE_PATH = BASE_PATH / "test_config.yml" - - def test_from_config(self): - dispatcher = MinosCommandHandlerDispatcher.from_config(config=self.config) - self.assertIsInstance(dispatcher, MinosCommandHandlerDispatcher) - - async def test_if_queue_table_exists(self): - handler = MinosCommandHandlerDispatcher.from_config(config=self.config) - await handler.setup() - - async with aiopg.connect(**self.commands_queue_db) as connect: - async with connect.cursor() as cur: - await cur.execute( - "SELECT 1 " - "FROM information_schema.tables " - "WHERE table_schema = 'public' AND table_name = 'command_queue';" - ) - ret = [] - async for row in cur: - ret.append(row) - - assert ret == [(1,)] - - async def test_get_event_handler(self): - model = NaiveAggregate(test_id=1, test=2, id=1, version=1) - event_instance = Command( - topic="AddOrder", - model=model.classname, - items=[], - saga_id="43434jhij", - task_id="juhjh34", - reply_on="mkk2334", - ) - m = MinosCommandHandlerDispatcher.from_config(config=self.config) - - cls = m.get_event_handler(topic=event_instance.topic) - result = await cls(topic=event_instance.topic, command=event_instance) - - assert result == "add_order" - - async def test_non_implemented_action(self): - model = NaiveAggregate(test_id=1, test=2, id=1, version=1) - instance = Command( - topic="NotExisting", - model=model.classname, - items=[], - saga_id="43434jhij", - task_id="juhjh34", - reply_on="mkk2334", - ) - m = MinosCommandHandlerDispatcher.from_config(config=self.config) - - with self.assertRaises(MinosNetworkException) as context: - cls = m.get_event_handler(topic=instance.topic) - await cls(topic=instance.topic, command=instance) - - self.assertTrue( - "topic NotExisting have no controller/action configured, please review th configuration file" - in str(context.exception) - ) - - async def test_none_config(self): - handler = MinosCommandHandlerDispatcher.from_config(config=None) - - self.assertIsNone(handler) - - async def test_event_queue_checker(self): - handler = MinosCommandHandlerDispatcher.from_config(config=self.config) - await handler.setup() - - model = NaiveAggregate(test_id=1, test=2, id=1, version=1) - instance = Command( - topic="AddOrder", - model=model.classname, - items=[], - saga_id="43434jhij", - task_id="juhjh34", - reply_on="mkk2334", - ) - bin_data = instance.avro_bytes - Command.from_avro_bytes(bin_data) - - async with aiopg.connect(**self.commands_queue_db) as connect: - async with connect.cursor() as cur: - await 
cur.execute( - "INSERT INTO command_queue (topic, partition_id, binary_data, creation_date) " - "VALUES (%s, %s, %s, %s) " - "RETURNING id;", - (instance.topic, 0, bin_data, datetime.datetime.now(),), - ) - - queue_id = await cur.fetchone() - affected_rows = cur.rowcount - - assert affected_rows == 1 - assert queue_id[0] > 0 - - # Must get the record, call on_reply function and delete the record from DB - await handler.queue_checker() - - async with aiopg.connect(**self.commands_queue_db) as connect: - async with connect.cursor() as cur: - await cur.execute("SELECT COUNT(*) FROM command_queue WHERE id=%d" % (queue_id)) - records = await cur.fetchone() - - assert records[0] == 0 - - async def test_event_queue_checker_wrong_event(self): - handler = MinosCommandHandlerDispatcher.from_config(config=self.config) - await handler.setup() - bin_data = bytes(b"Test") - - async with aiopg.connect(**self.commands_queue_db) as connect: - async with connect.cursor() as cur: - await cur.execute( - "INSERT INTO command_queue (topic, partition_id, binary_data, creation_date) " - "VALUES (%s, %s, %s, %s) " - "RETURNING id;", - ("AddOrder", 0, bin_data, datetime.datetime.now(),), - ) - - queue_id = await cur.fetchone() - affected_rows = cur.rowcount - - assert affected_rows == 1 - assert queue_id[0] > 0 - - # Must get the record, call on_reply function and delete the record from DB - await handler.queue_checker() - - async with aiopg.connect(**self.commands_queue_db) as connect: - async with connect.cursor() as cur: - await cur.execute("SELECT COUNT(*) FROM command_queue WHERE id=%d" % (queue_id)) - records = await cur.fetchone() - - assert records[0] == 1 diff --git a/tests/test_networks/test_handler/test_command_reply/test_command__reply_server.py b/tests/test_networks/test_handler/test_command_reply/test_command__reply_server.py deleted file mode 100644 index 288564f3..00000000 --- a/tests/test_networks/test_handler/test_command_reply/test_command__reply_server.py +++ /dev/null @@ -1,86 +0,0 @@ -from collections import ( - namedtuple, -) - -from minos.common import ( - CommandReply, -) -from minos.common.testing import ( - PostgresAsyncTestCase, -) -from minos.networks import ( - MinosCommandReplyHandlerServer, -) -from tests.utils import ( - BASE_PATH, - NaiveAggregate, -) - - -class TestCommandReplyServer(PostgresAsyncTestCase): - CONFIG_FILE_PATH = BASE_PATH / "test_config.yml" - - def test_from_config(self): - dispatcher = MinosCommandReplyHandlerServer.from_config(config=self.config) - self.assertIsInstance(dispatcher, MinosCommandReplyHandlerServer) - - async def test_none_config(self): - event_server = MinosCommandReplyHandlerServer.from_config(config=None) - - self.assertIsNone(event_server) - - async def test_queue_add(self): - model = NaiveAggregate(test_id=1, test=2, id=1, version=1) - event_instance = CommandReply( - topic="AddOrder", - model=model.classname, - items=[], - saga_id="43434jhij", - task_id="juhjh34", - reply_on="mkk2334", - ) - bin_data = event_instance.avro_bytes - CommandReply.from_avro_bytes(bin_data) - - event_server = MinosCommandReplyHandlerServer.from_config(config=self.config) - await event_server.setup() - - affected_rows, id = await event_server.queue_add(topic=event_instance.topic, partition=0, binary=bin_data) - - assert affected_rows == 1 - assert id > 0 - - async def test_handle_message(self): - event_server = MinosCommandReplyHandlerServer.from_config(config=self.config) - await event_server.setup() - - model = NaiveAggregate(test_id=1, test=2, id=1, version=1) - 
event_instance = CommandReply( - topic="AddOrder", - model=model.classname, - items=[], - saga_id="43434jhij", - task_id="juhjh34", - reply_on="mkk2334", - ) - bin_data = event_instance.avro_bytes - - Mensaje = namedtuple("Mensaje", ["topic", "partition", "value"]) - - async def consumer(): - yield Mensaje(topic="AddOrder", partition=0, value=bin_data) - - await event_server.handle_message(consumer()) - - async def test_handle_message_ko(self): - event_server = MinosCommandReplyHandlerServer.from_config(config=self.config) - await event_server.setup() - - bin_data = bytes(b"test") - - Mensaje = namedtuple("Mensaje", ["topic", "partition", "value"]) - - async def consumer(): - yield Mensaje(topic="AddOrder", partition=0, value=bin_data) - - await event_server.handle_message(consumer()) diff --git a/tests/test_networks/test_handler/test_command_reply/test_command_reply_services.py b/tests/test_networks/test_handler/test_command_reply/test_command_reply_services.py deleted file mode 100644 index a193743d..00000000 --- a/tests/test_networks/test_handler/test_command_reply/test_command_reply_services.py +++ /dev/null @@ -1,51 +0,0 @@ -from unittest.mock import ( - MagicMock, -) - -from minos.common.testing import ( - PostgresAsyncTestCase, -) -from minos.networks import ( - MinosCommandReplyPeriodicService, - MinosCommandReplyServerService, -) -from tests.utils import ( - BASE_PATH, -) - - -class TestMinosCommandReplyServices(PostgresAsyncTestCase): - CONFIG_FILE_PATH = BASE_PATH / "test_config.yml" - - async def test_start(self): - service = MinosCommandReplyServerService(loop=None, config=self.config) - - async def _fn(consumer): - self.assertEqual(service.consumer, consumer) - - mock = MagicMock(side_effect=_fn) - service.dispatcher.handle_message = mock - await service.start() - self.assertTrue(1, mock.call_count) - await service.stop() - - -class TestMinosCommandReplyQueueService(PostgresAsyncTestCase): - CONFIG_FILE_PATH = BASE_PATH / "test_config.yml" - - async def test_start(self): - service = MinosCommandReplyPeriodicService(interval=1, loop=None, config=self.config) - mock = MagicMock(side_effect=service.dispatcher.setup) - service.dispatcher.setup = mock - await service.start() - self.assertTrue(1, mock.call_count) - await service.stop() - - async def test_callback(self): - service = MinosCommandReplyPeriodicService(interval=1, loop=None, config=self.config) - await service.dispatcher.setup() - mock = MagicMock(side_effect=service.dispatcher.queue_checker) - service.dispatcher.queue_checker = mock - await service.callback() - self.assertEqual(1, mock.call_count) - await service.dispatcher.destroy() diff --git a/tests/test_networks/test_handler/test_command_reply/test_dispatcher.py b/tests/test_networks/test_handler/test_command_reply/test_dispatcher.py deleted file mode 100644 index 39ef566e..00000000 --- a/tests/test_networks/test_handler/test_command_reply/test_dispatcher.py +++ /dev/null @@ -1,158 +0,0 @@ -import datetime - -import aiopg - -from minos.common import ( - CommandReply, -) -from minos.common.testing import ( - PostgresAsyncTestCase, -) -from minos.networks import ( - MinosCommandReplyHandlerDispatcher, - MinosNetworkException, -) -from tests.utils import ( - BASE_PATH, - NaiveAggregate, -) - - -class TestCommandReplyDispatcher(PostgresAsyncTestCase): - CONFIG_FILE_PATH = BASE_PATH / "test_config.yml" - - def test_from_config(self): - dispatcher = MinosCommandReplyHandlerDispatcher.from_config(config=self.config) - self.assertIsInstance(dispatcher, 
MinosCommandReplyHandlerDispatcher) - - async def test_if_queue_table_exists(self): - handler = MinosCommandReplyHandlerDispatcher.from_config(config=self.config) - await handler.setup() - - async with aiopg.connect(**self.saga_queue_db) as connect: - async with connect.cursor() as cur: - await cur.execute( - "SELECT 1 " - "FROM information_schema.tables " - "WHERE table_schema = 'public' AND table_name = 'command_reply_queue';" - ) - ret = [] - async for row in cur: - ret.append(row) - - assert ret == [(1,)] - - async def test_get_event_handler(self): - model = NaiveAggregate(test_id=1, test=2, id=1, version=1) - event_instance = CommandReply( - topic="AddOrder", - model=model.classname, - items=[], - saga_id="43434jhij", - task_id="juhjh34", - reply_on="mkk2334", - ) - m = MinosCommandReplyHandlerDispatcher.from_config(config=self.config) - - cls = m.get_event_handler(topic=event_instance.topic) - result = await cls(topic=event_instance.topic, command=event_instance) - - assert result == "add_order_saga" - - async def test_non_implemented_action(self): - model = NaiveAggregate(test_id=1, test=2, id=1, version=1) - instance = CommandReply( - topic="NotExisting", - model=model.classname, - items=[], - saga_id="43434jhij", - task_id="juhjh34", - reply_on="mkk2334", - ) - m = MinosCommandReplyHandlerDispatcher.from_config(config=self.config) - - with self.assertRaises(MinosNetworkException) as context: - cls = m.get_event_handler(topic=instance.topic) - await cls(topic=instance.topic, command=instance) - - self.assertTrue( - "topic NotExisting have no controller/action configured, please review th configuration file" - in str(context.exception) - ) - - async def test_none_config(self): - handler = MinosCommandReplyHandlerDispatcher.from_config(config=None) - - self.assertIsNone(handler) - - async def test_event_queue_checker(self): - handler = MinosCommandReplyHandlerDispatcher.from_config(config=self.config) - await handler.setup() - - model = NaiveAggregate(test_id=1, test=2, id=1, version=1) - instance = CommandReply( - topic="AddOrder", - model=model.classname, - items=[], - saga_id="43434jhij", - task_id="juhjh34", - reply_on="mkk2334", - ) - bin_data = instance.avro_bytes - CommandReply.from_avro_bytes(bin_data) - - async with aiopg.connect(**self.saga_queue_db) as connect: - async with connect.cursor() as cur: - await cur.execute( - "INSERT INTO command_reply_queue (topic, partition_id, binary_data, creation_date) " - "VALUES (%s, %s, %s, %s) " - "RETURNING id;", - (instance.topic, 0, bin_data, datetime.datetime.now(),), - ) - - queue_id = await cur.fetchone() - affected_rows = cur.rowcount - - assert affected_rows == 1 - assert queue_id[0] > 0 - - # Must get the record, call on_reply function and delete the record from DB - await handler.queue_checker() - - async with aiopg.connect(**self.saga_queue_db) as connect: - async with connect.cursor() as cur: - await cur.execute("SELECT COUNT(*) FROM command_reply_queue WHERE id=%d" % (queue_id)) - records = await cur.fetchone() - - assert records[0] == 0 - - async def test_command_reply_queue_checker_wrong_event(self): - handler = MinosCommandReplyHandlerDispatcher.from_config(config=self.config) - await handler.setup() - - bin_data = bytes(b"Test") - - async with aiopg.connect(**self.saga_queue_db) as connect: - async with connect.cursor() as cur: - await cur.execute( - "INSERT INTO command_reply_queue (topic, partition_id, binary_data, creation_date) " - "VALUES (%s, %s, %s, %s) " - "RETURNING id;", - ("AddOrder", 0, bin_data, 
datetime.datetime.now(),), - ) - - queue_id = await cur.fetchone() - affected_rows = cur.rowcount - - assert affected_rows == 1 - assert queue_id[0] > 0 - - # Must get the record, call on_reply function and delete the record from DB - await handler.queue_checker() - - async with aiopg.connect(**self.saga_queue_db) as connect: - async with connect.cursor() as cur: - await cur.execute("SELECT COUNT(*) FROM command_reply_queue WHERE id=%d" % (queue_id)) - records = await cur.fetchone() - - assert records[0] == 1 diff --git a/tests/test_networks/test_handler/test_event/test_dispatcher.py b/tests/test_networks/test_handler/test_event/test_dispatcher.py deleted file mode 100644 index e38e4630..00000000 --- a/tests/test_networks/test_handler/test_event/test_dispatcher.py +++ /dev/null @@ -1,136 +0,0 @@ -import datetime - -import aiopg - -from minos.common import ( - Event, -) -from minos.common.testing import ( - PostgresAsyncTestCase, -) -from minos.networks import ( - MinosEventHandlerDispatcher, - MinosNetworkException, -) -from tests.utils import ( - BASE_PATH, - NaiveAggregate, -) - - -class TestEventDispatcher(PostgresAsyncTestCase): - CONFIG_FILE_PATH = BASE_PATH / "test_config.yml" - - def test_from_config(self): - dispatcher = MinosEventHandlerDispatcher.from_config(config=self.config) - self.assertIsInstance(dispatcher, MinosEventHandlerDispatcher) - - async def test_if_queue_table_exists(self): - event_handler = MinosEventHandlerDispatcher.from_config(config=self.config) - await event_handler.setup() - - async with aiopg.connect(**self.events_queue_db) as connect: - async with connect.cursor() as cur: - await cur.execute( - "SELECT 1 " - "FROM information_schema.tables " - "WHERE table_schema = 'public' AND table_name = 'event_queue';" - ) - ret = [] - async for row in cur: - ret.append(row) - - assert ret == [(1,)] - - async def test_get_event_handler(self): - model = NaiveAggregate(test_id=1, test=2, id=1, version=1) - event_instance = Event(topic="TestEventQueueAdd", model=model.classname, items=[]) - m = MinosEventHandlerDispatcher.from_config(config=self.config) - - cls = m.get_event_handler(topic="TicketAdded") - result = await cls(topic="TicketAdded", event=event_instance) - - assert result == "request_added" - - async def test_non_implemented_action(self): - model = NaiveAggregate(test_id=1, test=2, id=1, version=1) - event_instance = Event(topic="NotExisting", model=model.classname, items=[]) - m = MinosEventHandlerDispatcher.from_config(config=self.config) - - with self.assertRaises(MinosNetworkException) as context: - cls = m.get_event_handler(topic=event_instance.topic) - await cls(topic=event_instance.topic, event=event_instance) - - self.assertTrue( - "topic NotExisting have no controller/action configured, please review th configuration file" - in str(context.exception) - ) - - async def test_none_config(self): - event_handler = MinosEventHandlerDispatcher.from_config(config=None) - - self.assertIsNone(event_handler) - - async def test_event_queue_checker(self): - event_handler = MinosEventHandlerDispatcher.from_config(config=self.config) - await event_handler.setup() - - model = NaiveAggregate(test_id=1, test=2, id=1, version=1) - event_instance = Event(topic="TicketAdded", model=model.classname, items=[]) - bin_data = event_instance.avro_bytes - Event.from_avro_bytes(bin_data) - - async with aiopg.connect(**self.events_queue_db) as connect: - async with connect.cursor() as cur: - await cur.execute( - "INSERT INTO event_queue (topic, partition_id, binary_data, 
creation_date) " - "VALUES (%s, %s, %s, %s) " - "RETURNING id;", - (event_instance.topic, 0, bin_data, datetime.datetime.now(),), - ) - - queue_id = await cur.fetchone() - affected_rows = cur.rowcount - - assert affected_rows == 1 - assert queue_id[0] > 0 - - # Must get the record, call on_reply function and delete the record from DB - await event_handler.queue_checker() - - async with aiopg.connect(**self.events_queue_db) as connect: - async with connect.cursor() as cur: - await cur.execute("SELECT COUNT(*) FROM event_queue WHERE id=%d" % (queue_id)) - records = await cur.fetchone() - - assert records[0] == 0 - - async def test_event_queue_checker_wrong_event(self): - handler = MinosEventHandlerDispatcher.from_config(config=self.config) - await handler.setup() - bin_data = bytes(b"Test") - - async with aiopg.connect(**self.events_queue_db) as connect: - async with connect.cursor() as cur: - await cur.execute( - "INSERT INTO event_queue (topic, partition_id, binary_data, creation_date) " - "VALUES (%s, %s, %s, %s) " - "RETURNING id;", - ("TicketAdded", 0, bin_data, datetime.datetime.now(),), - ) - - queue_id = await cur.fetchone() - affected_rows = cur.rowcount - - assert affected_rows == 1 - assert queue_id[0] > 0 - - # Must get the record, call on_reply function and delete the record from DB - await handler.queue_checker() - - async with aiopg.connect(**self.events_queue_db) as connect: - async with connect.cursor() as cur: - await cur.execute("SELECT COUNT(*) FROM event_queue WHERE id=%d" % (queue_id)) - records = await cur.fetchone() - - assert records[0] == 1 diff --git a/tests/test_networks/test_handler/test_event/test_event_server.py b/tests/test_networks/test_handler/test_event/test_event_server.py deleted file mode 100644 index 4bd14e60..00000000 --- a/tests/test_networks/test_handler/test_event/test_event_server.py +++ /dev/null @@ -1,72 +0,0 @@ -from collections import ( - namedtuple, -) - -from minos.common import ( - Event, -) -from minos.common.testing import ( - PostgresAsyncTestCase, -) -from minos.networks import ( - MinosEventHandlerServer, -) -from tests.utils import ( - BASE_PATH, - NaiveAggregate, -) - - -class TestEventServer(PostgresAsyncTestCase): - CONFIG_FILE_PATH = BASE_PATH / "test_config.yml" - - def test_from_config(self): - dispatcher = MinosEventHandlerServer.from_config(config=self.config) - self.assertIsInstance(dispatcher, MinosEventHandlerServer) - - async def test_none_config(self): - event_server = MinosEventHandlerServer.from_config(config=None) - - self.assertIsNone(event_server) - - async def test_queue_add(self): - model = NaiveAggregate(test_id=1, test=2, id=1, version=1) - event_instance = Event(topic="TestEventQueueAdd", model=model.classname, items=[]) - bin_data = event_instance.avro_bytes - Event.from_avro_bytes(bin_data) - - event_server = MinosEventHandlerServer.from_config(config=self.config) - await event_server.setup() - - affected_rows, id = await event_server.queue_add(topic=event_instance.topic, partition=0, binary=bin_data) - - assert affected_rows == 1 - assert id > 0 - - async def test_handle_message(self): - event_server = MinosEventHandlerServer.from_config(config=self.config) - await event_server.setup() - - model = NaiveAggregate(test_id=1, test=2, id=1, version=1) - event_instance = Event(topic="TicketAdded", model=model.classname, items=[model]) - bin_data = event_instance.avro_bytes - - Mensaje = namedtuple("Mensaje", ["topic", "partition", "value"]) - - async def consumer(): - yield Mensaje(topic="TicketAdded", partition=0, 
value=bin_data) - - await event_server.handle_message(consumer()) - - async def test_handle_message_ko(self): - event_server = MinosEventHandlerServer.from_config(config=self.config) - await event_server.setup() - - bin_data = bytes(b"test") - - Mensaje = namedtuple("Mensaje", ["topic", "partition", "value"]) - - async def consumer(): - yield Mensaje(topic="TicketAdded", partition=0, value=bin_data) - - await event_server.handle_message(consumer()) diff --git a/tests/test_networks/test_handler/test_event/test_event_services.py b/tests/test_networks/test_handler/test_event/test_event_services.py deleted file mode 100644 index 0fee90b7..00000000 --- a/tests/test_networks/test_handler/test_event/test_event_services.py +++ /dev/null @@ -1,56 +0,0 @@ -import unittest -from unittest.mock import ( - MagicMock, -) - -from minos.common.testing import ( - PostgresAsyncTestCase, -) -from minos.networks import ( - MinosEventPeriodicService, - MinosEventServerService, -) -from tests.utils import ( - BASE_PATH, -) - - -class TestMinosEventServices(PostgresAsyncTestCase): - CONFIG_FILE_PATH = BASE_PATH / "test_config.yml" - - async def test_start(self): - service = MinosEventServerService(loop=None, config=self.config) - - async def _fn(consumer): - self.assertEqual(service.consumer, consumer) - - mock = MagicMock(side_effect=_fn) - service.dispatcher.handle_message = mock - await service.start() - self.assertTrue(1, mock.call_count) - await service.stop() - - -class TestMinosQueueService(PostgresAsyncTestCase): - CONFIG_FILE_PATH = BASE_PATH / "test_config.yml" - - async def test_start(self): - service = MinosEventPeriodicService(interval=1, loop=None, config=self.config) - mock = MagicMock(side_effect=service.dispatcher.setup) - service.dispatcher.setup = mock - await service.start() - self.assertTrue(1, mock.call_count) - await service.stop() - - async def test_callback(self): - service = MinosEventPeriodicService(interval=1, loop=None, config=self.config) - await service.dispatcher.setup() - mock = MagicMock(side_effect=service.dispatcher.queue_checker) - service.dispatcher.queue_checker = mock - await service.callback() - self.assertEqual(1, mock.call_count) - await service.dispatcher.destroy() - - -if __name__ == "__main__": - unittest.main() diff --git a/tests/test_networks/test_handler/__init__.py b/tests/test_networks/test_handlers/__init__.py similarity index 100% rename from tests/test_networks/test_handler/__init__.py rename to tests/test_networks/test_handlers/__init__.py diff --git a/tests/test_networks/test_handler/test_command/__init__.py b/tests/test_networks/test_handlers/test_command_replies/__init__.py similarity index 100% rename from tests/test_networks/test_handler/test_command/__init__.py rename to tests/test_networks/test_handlers/test_command_replies/__init__.py diff --git a/tests/test_networks/test_handlers/test_command_replies/test_consumers.py b/tests/test_networks/test_handlers/test_command_replies/test_consumers.py new file mode 100644 index 00000000..4c093f76 --- /dev/null +++ b/tests/test_networks/test_handlers/test_command_replies/test_consumers.py @@ -0,0 +1,86 @@ +""" +Copyright (C) 2021 Clariteia SL + +This file is part of minos framework. + +Minos framework can not be copied and/or distributed without the express permission of Clariteia SL. 
+""" +import unittest +from unittest.mock import ( + MagicMock, +) + +from minos.common import ( + CommandReply, + MinosConfigException, +) +from minos.common.testing import ( + PostgresAsyncTestCase, +) +from minos.networks import ( + CommandReplyConsumer, +) +from tests.utils import ( + BASE_PATH, + FakeConsumer, + Message, + NaiveAggregate, +) + + +class TestCommandReplyConsumer(PostgresAsyncTestCase): + CONFIG_FILE_PATH = BASE_PATH / "test_config.yml" + + def test_from_config(self): + dispatcher = CommandReplyConsumer.from_config(config=self.config) + self.assertIsInstance(dispatcher, CommandReplyConsumer) + + def test_from_config_raises(self): + with self.assertRaises(MinosConfigException): + CommandReplyConsumer.from_config() + + async def test_queue_add(self): + model = NaiveAggregate(test_id=1, test=2, id=1, version=1) + event_instance = CommandReply( + topic="AddOrder", + model=model.classname, + items=[], + saga_id="43434jhij", + task_id="juhjh34", + reply_on="mkk2334", + ) + bin_data = event_instance.avro_bytes + CommandReply.from_avro_bytes(bin_data) + + async with CommandReplyConsumer.from_config(config=self.config, consumer=FakeConsumer()) as consumer: + id = await consumer.queue_add(topic=event_instance.topic, partition=0, binary=bin_data) + assert id > 0 + + async def test_dispatch(self): + model = NaiveAggregate(test_id=1, test=2, id=1, version=1) + event_instance = CommandReply( + topic="AddOrder", + model=model.classname, + items=[], + saga_id="43434jhij", + task_id="juhjh34", + reply_on="mkk2334", + ) + bin_data = event_instance.avro_bytes + consumer = FakeConsumer([Message(topic="AddOrder", partition=0, value=bin_data)]) + + async with CommandReplyConsumer.from_config(config=self.config, consumer=consumer) as dispatcher: + mock = MagicMock(side_effect=dispatcher.handle_single_message) + dispatcher.handle_single_message = mock + await dispatcher.dispatch() + self.assertEqual(1, mock.call_count) + + async def test_dispatch_ko(self): + bin_data = bytes(b"test") + consumer = FakeConsumer([Message(topic="AddOrder", partition=0, value=bin_data)]) + async with CommandReplyConsumer.from_config(config=self.config, consumer=consumer) as consumer: + await consumer.dispatch() + + +if __name__ == "__main__": + unittest.main() diff --git a/tests/test_networks/test_handlers/test_command_replies/test_handlers.py b/tests/test_networks/test_handlers/test_command_replies/test_handlers.py new file mode 100644 index 00000000..f193145e --- /dev/null +++ b/tests/test_networks/test_handlers/test_command_replies/test_handlers.py @@ -0,0 +1,157 @@ +import datetime +import unittest + +import aiopg + +from minos.common import ( + CommandReply, + MinosConfigException, +) +from minos.common.testing import ( + PostgresAsyncTestCase, +) +from minos.networks import ( + CommandReplyHandler, + MinosNetworkException, +) +from tests.utils import ( + BASE_PATH, + FakeSagaManager, + NaiveAggregate, +) + + +class TestCommandReplyHandler(PostgresAsyncTestCase): + CONFIG_FILE_PATH = BASE_PATH / "test_config.yml" + + def test_from_config(self): + dispatcher = CommandReplyHandler.from_config(config=self.config) + self.assertIsInstance(dispatcher, CommandReplyHandler) + + def test_from_config_raises(self): + with self.assertRaises(MinosConfigException): + CommandReplyHandler.from_config() + + async def test_if_queue_table_exists(self): + async with CommandReplyHandler.from_config(config=self.config): + async with aiopg.connect(**self.saga_queue_db) as connect: + async with connect.cursor() as cur: + await 
cur.execute( + "SELECT 1 " + "FROM information_schema.tables " + "WHERE table_schema = 'public' AND table_name = 'command_reply_queue';" + ) + ret = [] + async for row in cur: + ret.append(row) + + assert ret == [(1,)] + + async def test_get_event_handler(self): + model = NaiveAggregate(test_id=1, test=2, id=1, version=1) + event_instance = CommandReply( + topic="AddOrder", + model=model.classname, + items=[], + saga_id="43434jhij", + task_id="juhjh34", + reply_on="mkk2334", + ) + m = CommandReplyHandler.from_config(config=self.config) + + cls = m.get_event_handler(topic=event_instance.topic) + result = await cls(topic=event_instance.topic, command=event_instance) + + assert result == "add_order_saga" + + async def test_non_implemented_action(self): + model = NaiveAggregate(test_id=1, test=2, id=1, version=1) + instance = CommandReply( + topic="NotExisting", + model=model.classname, + items=[], + saga_id="43434jhij", + task_id="juhjh34", + reply_on="mkk2334", + ) + m = CommandReplyHandler.from_config(config=self.config) + + with self.assertRaises(MinosNetworkException) as context: + cls = m.get_event_handler(topic=instance.topic) + await cls(topic=instance.topic, command=instance) + + self.assertTrue( + "topic NotExisting have no controller/action configured, please review th configuration file" + in str(context.exception) + ) + + async def test_event_dispatch(self): + model = NaiveAggregate(test_id=1, test=2, id=1, version=1) + instance = CommandReply( + topic="AddOrder", + model=model.classname, + items=[], + saga_id="43434jhij", + task_id="juhjh34", + reply_on="mkk2334", + ) + bin_data = instance.avro_bytes + saga_manager = FakeSagaManager() + + async with CommandReplyHandler.from_config(config=self.config, saga_manager=saga_manager) as handler: + async with aiopg.connect(**self.saga_queue_db) as connect: + async with connect.cursor() as cur: + await cur.execute( + "INSERT INTO command_reply_queue (topic, partition_id, binary_data, creation_date) " + "VALUES (%s, %s, %s, %s) " + "RETURNING id;", + (instance.topic, 0, bin_data, datetime.datetime.now(),), + ) + + queue_id = await cur.fetchone() + + assert queue_id[0] > 0 + + # Must get the record, call on_reply function and delete the record from DB + await handler.dispatch() + + async with aiopg.connect(**self.saga_queue_db) as connect: + async with connect.cursor() as cur: + await cur.execute("SELECT COUNT(*) FROM command_reply_queue WHERE id=%d" % (queue_id)) + records = await cur.fetchone() + + assert records[0] == 0 + + self.assertEqual(None, saga_manager.name) + self.assertEqual(instance, saga_manager.reply) + + async def test_command_reply_dispatch_wrong_event(self): + async with CommandReplyHandler.from_config(config=self.config) as handler: + bin_data = bytes(b"Test") + + async with aiopg.connect(**self.saga_queue_db) as connect: + async with connect.cursor() as cur: + await cur.execute( + "INSERT INTO command_reply_queue (topic, partition_id, binary_data, creation_date) " + "VALUES (%s, %s, %s, %s) " + "RETURNING id;", + ("AddOrder", 0, bin_data, datetime.datetime.now(),), + ) + + queue_id = await cur.fetchone() + + assert queue_id[0] > 0 + + # Must get the record, call on_reply function and delete the record from DB + await handler.dispatch() + + async with aiopg.connect(**self.saga_queue_db) as connect: + async with connect.cursor() as cur: + await cur.execute("SELECT COUNT(*) FROM command_reply_queue WHERE id=%d" % (queue_id)) + records = await cur.fetchone() + + assert records[0] == 1 + + +if __name__ == "__main__": + 
unittest.main() diff --git a/tests/test_networks/test_handlers/test_command_replies/test_services.py b/tests/test_networks/test_handlers/test_command_replies/test_services.py new file mode 100644 index 00000000..34355fac --- /dev/null +++ b/tests/test_networks/test_handlers/test_command_replies/test_services.py @@ -0,0 +1,65 @@ +import unittest +from unittest.mock import ( + MagicMock, + patch, +) + +from minos.common.testing import ( + PostgresAsyncTestCase, +) +from minos.networks import ( + CommandReplyConsumerService, + CommandReplyHandlerService, +) +from tests.utils import ( + BASE_PATH, + FakeDispatcher, +) + + +class TestCommandReplyConsumerService(PostgresAsyncTestCase): + CONFIG_FILE_PATH = BASE_PATH / "test_config.yml" + + @patch("minos.networks.CommandReplyConsumer.from_config") + async def test_start(self, mock): + instance = FakeDispatcher() + mock.return_value = instance + + service = CommandReplyConsumerService(loop=None, config=self.config) + + self.assertEqual(0, instance.setup_count) + self.assertEqual(0, instance.setup_dispatch) + self.assertEqual(0, instance.setup_destroy) + await service.start() + self.assertEqual(1, instance.setup_count) + self.assertEqual(1, instance.setup_dispatch) + self.assertEqual(0, instance.setup_destroy) + await service.stop() + self.assertEqual(1, instance.setup_count) + self.assertEqual(1, instance.setup_dispatch) + self.assertEqual(1, instance.setup_destroy) + + +class TestCommandReplyHandlerService(PostgresAsyncTestCase): + CONFIG_FILE_PATH = BASE_PATH / "test_config.yml" + + async def test_start(self): + service = CommandReplyHandlerService(interval=1, loop=None, config=self.config) + mock = MagicMock(side_effect=service.dispatcher.setup) + service.dispatcher.setup = mock + await service.start() + self.assertTrue(1, mock.call_count) + await service.stop() + + async def test_callback(self): + service = CommandReplyHandlerService(interval=1, loop=None, config=self.config) + await service.dispatcher.setup() + mock = MagicMock(side_effect=service.dispatcher.dispatch) + service.dispatcher.dispatch = mock + await service.callback() + self.assertEqual(1, mock.call_count) + await service.dispatcher.destroy() + + +if __name__ == "__main__": + unittest.main() diff --git a/tests/test_networks/test_handler/test_command_reply/__init__.py b/tests/test_networks/test_handlers/test_commands/__init__.py similarity index 100% rename from tests/test_networks/test_handler/test_command_reply/__init__.py rename to tests/test_networks/test_handlers/test_commands/__init__.py diff --git a/tests/test_networks/test_handlers/test_commands/test_consumers.py b/tests/test_networks/test_handlers/test_commands/test_consumers.py new file mode 100644 index 00000000..74882082 --- /dev/null +++ b/tests/test_networks/test_handlers/test_commands/test_consumers.py @@ -0,0 +1,86 @@ +""" +Copyright (C) 2021 Clariteia SL + +This file is part of minos framework. + +Minos framework can not be copied and/or distributed without the express permission of Clariteia SL. 
+""" +import unittest +from unittest.mock import ( + MagicMock, +) + +from minos.common import ( + Command, + MinosConfigException, +) +from minos.common.testing import ( + PostgresAsyncTestCase, +) +from minos.networks import ( + CommandConsumer, +) +from tests.utils import ( + BASE_PATH, + FakeConsumer, + Message, + NaiveAggregate, +) + + +class TestCommandConsumer(PostgresAsyncTestCase): + CONFIG_FILE_PATH = BASE_PATH / "test_config.yml" + + def test_from_config(self): + dispatcher = CommandConsumer.from_config(config=self.config) + self.assertIsInstance(dispatcher, CommandConsumer) + + def test_from_config_raises(self): + with self.assertRaises(MinosConfigException): + CommandConsumer.from_config() + + async def test_queue_add(self): + model = NaiveAggregate(test_id=1, test=2, id=1, version=1) + event_instance = Command( + topic="AddOrder", + model=model.classname, + items=[], + saga_id="43434jhij", + task_id="juhjh34", + reply_on="mkk2334", + ) + bin_data = event_instance.avro_bytes + Command.from_avro_bytes(bin_data) + + async with CommandConsumer.from_config(config=self.config, consumer=FakeConsumer()) as dispatcher: + id = await dispatcher.queue_add(topic=event_instance.topic, partition=0, binary=bin_data) + assert id > 0 + + async def test_dispatch(self): + model = NaiveAggregate(test_id=1, test=2, id=1, version=1) + event_instance = Command( + topic="AddOrder", + model=model.classname, + items=[], + saga_id="43434jhij", + task_id="juhjh34", + reply_on="mkk2334", + ) + bin_data = event_instance.avro_bytes + + consumer = FakeConsumer([Message(topic="TicketAdded", partition=0, value=bin_data)]) + async with CommandConsumer.from_config(config=self.config, consumer=consumer) as dispatcher: + mock = MagicMock(side_effect=dispatcher.handle_single_message) + dispatcher.handle_single_message = mock + await dispatcher.dispatch() + self.assertEqual(1, mock.call_count) + + async def test_dispatch_ko(self): + bin_data = bytes(b"test") + consumer = FakeConsumer([Message(topic="TicketAdded", partition=0, value=bin_data)]) + async with CommandConsumer.from_config(config=self.config, consumer=consumer) as dispatcher: + await dispatcher.dispatch() + + +if __name__ == "__main__": + unittest.main() diff --git a/tests/test_networks/test_handlers/test_commands/test_handlers.py b/tests/test_networks/test_handlers/test_commands/test_handlers.py new file mode 100644 index 00000000..45deda90 --- /dev/null +++ b/tests/test_networks/test_handlers/test_commands/test_handlers.py @@ -0,0 +1,160 @@ +import datetime +import unittest + +import aiopg + +from minos.common import ( + Command, + MinosConfigException, +) +from minos.common.testing import ( + PostgresAsyncTestCase, +) +from minos.networks import ( + CommandHandler, + MinosNetworkException, +) +from tests.utils import ( + BASE_PATH, + FakeBroker, + NaiveAggregate, +) + + +class TestCommandHandler(PostgresAsyncTestCase): + CONFIG_FILE_PATH = BASE_PATH / "test_config.yml" + + def test_from_config(self): + dispatcher = CommandHandler.from_config(config=self.config) + self.assertIsInstance(dispatcher, CommandHandler) + + def test_from_config_raises(self): + with self.assertRaises(MinosConfigException): + CommandHandler.from_config() + + async def test_if_queue_table_exists(self): + async with CommandHandler.from_config(config=self.config): + async with aiopg.connect(**self.commands_queue_db) as connect: + async with connect.cursor() as cur: + await cur.execute( + "SELECT 1 " + "FROM information_schema.tables " + "WHERE table_schema = 'public' AND table_name 
= 'command_queue';" + ) + ret = [] + async for row in cur: + ret.append(row) + + assert ret == [(1,)] + + async def test_get_event_handler(self): + model = NaiveAggregate(test_id=1, test=2, id=1, version=1) + event_instance = Command( + topic="AddOrder", + model=model.classname, + items=[], + saga_id="43434jhij", + task_id="juhjh34", + reply_on="mkk2334", + ) + m = CommandHandler.from_config(config=self.config) + + cls = m.get_event_handler(topic=event_instance.topic) + result = await cls(topic=event_instance.topic, command=event_instance) + + assert result == "add_order" + + async def test_non_implemented_action(self): + model = NaiveAggregate(test_id=1, test=2, id=1, version=1) + instance = Command( + topic="NotExisting", + model=model.classname, + items=[], + saga_id="43434jhij", + task_id="juhjh34", + reply_on="mkk2334", + ) + m = CommandHandler.from_config(config=self.config) + + with self.assertRaises(MinosNetworkException) as context: + cls = m.get_event_handler(topic=instance.topic) + await cls(topic=instance.topic, command=instance) + + self.assertTrue( + "topic NotExisting have no controller/action configured, please review th configuration file" + in str(context.exception) + ) + + async def test_event_dispatch(self): + model = NaiveAggregate(test_id=1, test=2, id=1, version=1) + instance = Command( + topic="AddOrder", + model=model.classname, + items=[], + saga_id="43434jhij", + task_id="juhjh34", + reply_on="mkk2334", + ) + bin_data = instance.avro_bytes + + broker = FakeBroker() + + async with CommandHandler.from_config(config=self.config, broker=broker) as handler: + async with aiopg.connect(**self.commands_queue_db) as connect: + async with connect.cursor() as cur: + await cur.execute( + "INSERT INTO command_queue (topic, partition_id, binary_data, creation_date) " + "VALUES (%s, %s, %s, %s) " + "RETURNING id;", + (instance.topic, 0, bin_data, datetime.datetime.now(),), + ) + + queue_id = await cur.fetchone() + + assert queue_id[0] > 0 + + # Must get the record, call on_reply function and delete the record from DB + await handler.dispatch() + + async with aiopg.connect(**self.commands_queue_db) as connect: + async with connect.cursor() as cur: + await cur.execute("SELECT COUNT(*) FROM command_queue WHERE id=%d" % (queue_id)) + records = await cur.fetchone() + + assert records[0] == 0 + + self.assertEqual("add_order", broker.items) + self.assertEqual("43434jhijReply", broker.topic) + self.assertEqual("43434jhij", broker.saga_id) + self.assertEqual("juhjh34", broker.task_id) + + async def test_event_dispatch_wrong_event(self): + async with CommandHandler.from_config(config=self.config) as handler: + bin_data = bytes(b"Test") + + async with aiopg.connect(**self.commands_queue_db) as connect: + async with connect.cursor() as cur: + await cur.execute( + "INSERT INTO command_queue (topic, partition_id, binary_data, creation_date) " + "VALUES (%s, %s, %s, %s) " + "RETURNING id;", + ("AddOrder", 0, bin_data, datetime.datetime.now(),), + ) + + queue_id = await cur.fetchone() + + assert queue_id[0] > 0 + + # Must get the record, call on_reply function and delete the record from DB + await handler.dispatch() + + async with aiopg.connect(**self.commands_queue_db) as connect: + async with connect.cursor() as cur: + await cur.execute("SELECT COUNT(*) FROM command_queue WHERE id=%d" % (queue_id)) + records = await cur.fetchone() + + assert records[0] == 1 + + +if __name__ == "__main__": + unittest.main() diff --git a/tests/test_networks/test_handlers/test_commands/test_services.py 
b/tests/test_networks/test_handlers/test_commands/test_services.py new file mode 100644 index 00000000..b5143bea --- /dev/null +++ b/tests/test_networks/test_handlers/test_commands/test_services.py @@ -0,0 +1,65 @@ +import unittest +from unittest.mock import ( + MagicMock, + patch, +) + +from minos.common.testing import ( + PostgresAsyncTestCase, +) +from minos.networks import ( + CommandConsumerService, + CommandHandlerService, +) +from tests.utils import ( + BASE_PATH, + FakeDispatcher, +) + + +class TestCommandConsumerService(PostgresAsyncTestCase): + CONFIG_FILE_PATH = BASE_PATH / "test_config.yml" + + @patch("minos.networks.CommandConsumer.from_config") + async def test_start(self, mock): + instance = FakeDispatcher() + mock.return_value = instance + + service = CommandConsumerService(loop=None, config=self.config) + + self.assertEqual(0, instance.setup_count) + self.assertEqual(0, instance.setup_dispatch) + self.assertEqual(0, instance.setup_destroy) + await service.start() + self.assertEqual(1, instance.setup_count) + self.assertEqual(1, instance.setup_dispatch) + self.assertEqual(0, instance.setup_destroy) + await service.stop() + self.assertEqual(1, instance.setup_count) + self.assertEqual(1, instance.setup_dispatch) + self.assertEqual(1, instance.setup_destroy) + + +class TestCommandHandlerService(PostgresAsyncTestCase): + CONFIG_FILE_PATH = BASE_PATH / "test_config.yml" + + async def test_start(self): + service = CommandHandlerService(interval=1, loop=None, config=self.config) + mock = MagicMock(side_effect=service.dispatcher.setup) + service.dispatcher.setup = mock + await service.start() + self.assertTrue(1, mock.call_count) + await service.stop() + + async def test_callback(self): + service = CommandHandlerService(interval=1, loop=None, config=self.config) + await service.dispatcher.setup() + mock = MagicMock(side_effect=service.dispatcher.dispatch) + service.dispatcher.dispatch = mock + await service.callback() + self.assertEqual(1, mock.call_count) + await service.dispatcher.destroy() + + +if __name__ == "__main__": + unittest.main() diff --git a/tests/test_networks/test_handler/test_event/__init__.py b/tests/test_networks/test_handlers/test_events/__init__.py similarity index 100% rename from tests/test_networks/test_handler/test_event/__init__.py rename to tests/test_networks/test_handlers/test_events/__init__.py diff --git a/tests/test_networks/test_handlers/test_events/test_consumers.py b/tests/test_networks/test_handlers/test_events/test_consumers.py new file mode 100644 index 00000000..2bc675c8 --- /dev/null +++ b/tests/test_networks/test_handlers/test_events/test_consumers.py @@ -0,0 +1,73 @@ +""" +Copyright (C) 2021 Clariteia SL + +This file is part of minos framework. + +Minos framework can not be copied and/or distributed without the express permission of Clariteia SL. 
+""" +import unittest +from unittest.mock import ( + MagicMock, +) + +from minos.common import ( + Event, + MinosConfigException, +) +from minos.common.testing import ( + PostgresAsyncTestCase, +) +from minos.networks import ( + EventConsumer, +) +from tests.utils import ( + BASE_PATH, + FakeConsumer, + Message, + NaiveAggregate, +) + + +class TestEventConsumer(PostgresAsyncTestCase): + CONFIG_FILE_PATH = BASE_PATH / "test_config.yml" + + def test_from_config(self): + dispatcher = EventConsumer.from_config(config=self.config) + self.assertIsInstance(dispatcher, EventConsumer) + + def test_from_config_raises(self): + with self.assertRaises(MinosConfigException): + EventConsumer.from_config() + + async def test_queue_add(self): + model = NaiveAggregate(test_id=1, test=2, id=1, version=1) + event_instance = Event(topic="TestEventQueueAdd", model=model.classname, items=[]) + bin_data = event_instance.avro_bytes + Event.from_avro_bytes(bin_data) + + async with EventConsumer.from_config(config=self.config, consumer=FakeConsumer()) as dispatcher: + id = await dispatcher.queue_add(topic=event_instance.topic, partition=0, binary=bin_data) + assert id > 0 + + async def test_dispatch(self): + model = NaiveAggregate(test_id=1, test=2, id=1, version=1) + event_instance = Event(topic="TicketAdded", model=model.classname, items=[model]) + bin_data = event_instance.avro_bytes + consumer = FakeConsumer([Message(topic="TicketAdded", partition=0, value=bin_data)]) + + async with EventConsumer.from_config(config=self.config, consumer=consumer) as dispatcher: + mock = MagicMock(side_effect=dispatcher.handle_single_message) + dispatcher.handle_single_message = mock + await dispatcher.dispatch() + self.assertEqual(1, mock.call_count) + + async def test_dispatch_ko(self): + bin_data = bytes(b"test") + consumer = FakeConsumer([Message(topic="TicketAdded", partition=0, value=bin_data)]) + + async with EventConsumer.from_config(config=self.config, consumer=consumer) as dispatcher: + await dispatcher.dispatch() + + +if __name__ == "__main__": + unittest.main() diff --git a/tests/test_networks/test_handlers/test_events/test_handlers.py b/tests/test_networks/test_handlers/test_events/test_handlers.py new file mode 100644 index 00000000..7d2d9bdf --- /dev/null +++ b/tests/test_networks/test_handlers/test_events/test_handlers.py @@ -0,0 +1,131 @@ +import datetime +import unittest + +import aiopg + +from minos.common import ( + Event, + MinosConfigException, +) +from minos.common.testing import ( + PostgresAsyncTestCase, +) +from minos.networks import ( + EventHandler, + MinosNetworkException, +) +from tests.utils import ( + BASE_PATH, + NaiveAggregate, +) + + +class TestEventHandler(PostgresAsyncTestCase): + CONFIG_FILE_PATH = BASE_PATH / "test_config.yml" + + def test_from_config(self): + dispatcher = EventHandler.from_config(config=self.config) + self.assertIsInstance(dispatcher, EventHandler) + + def test_from_config_raises(self): + with self.assertRaises(MinosConfigException): + EventHandler.from_config() + + async def test_if_queue_table_exists(self): + async with EventHandler.from_config(config=self.config): + async with aiopg.connect(**self.events_queue_db) as connect: + async with connect.cursor() as cur: + await cur.execute( + "SELECT 1 " + "FROM information_schema.tables " + "WHERE table_schema = 'public' AND table_name = 'event_queue';" + ) + ret = [] + async for row in cur: + ret.append(row) + + assert ret == [(1,)] + + async def test_get_event_handler(self): + model = NaiveAggregate(test_id=1, test=2, id=1, 
version=1) + event_instance = Event(topic="TestEventQueueAdd", model=model.classname, items=[]) + m = EventHandler.from_config(config=self.config) + + cls = m.get_event_handler(topic="TicketAdded") + result = await cls(topic="TicketAdded", event=event_instance) + + assert result == "request_added" + + async def test_non_implemented_action(self): + model = NaiveAggregate(test_id=1, test=2, id=1, version=1) + event_instance = Event(topic="NotExisting", model=model.classname, items=[]) + m = EventHandler.from_config(config=self.config) + + with self.assertRaises(MinosNetworkException) as context: + cls = m.get_event_handler(topic=event_instance.topic) + await cls(topic=event_instance.topic, event=event_instance) + + self.assertTrue( + "topic NotExisting have no controller/action configured, please review th configuration file" + in str(context.exception) + ) + + async def test_event_dispatch(self): + async with EventHandler.from_config(config=self.config) as handler: + model = NaiveAggregate(test_id=1, test=2, id=1, version=1) + event_instance = Event(topic="TicketAdded", model=model.classname, items=[]) + bin_data = event_instance.avro_bytes + Event.from_avro_bytes(bin_data) + + async with aiopg.connect(**self.events_queue_db) as connect: + async with connect.cursor() as cur: + await cur.execute( + "INSERT INTO event_queue (topic, partition_id, binary_data, creation_date) " + "VALUES (%s, %s, %s, %s) " + "RETURNING id;", + (event_instance.topic, 0, bin_data, datetime.datetime.now(),), + ) + + queue_id = await cur.fetchone() + + assert queue_id[0] > 0 + + # Must get the record, call on_reply function and delete the record from DB + await handler.dispatch() + + async with aiopg.connect(**self.events_queue_db) as connect: + async with connect.cursor() as cur: + await cur.execute("SELECT COUNT(*) FROM event_queue WHERE id=%d" % (queue_id)) + records = await cur.fetchone() + + assert records[0] == 0 + + async def test_event_dispatch_wrong_event(self): + async with EventHandler.from_config(config=self.config) as handler: + bin_data = bytes(b"Test") + + async with aiopg.connect(**self.events_queue_db) as connect: + async with connect.cursor() as cur: + await cur.execute( + "INSERT INTO event_queue (topic, partition_id, binary_data, creation_date) " + "VALUES (%s, %s, %s, %s) " + "RETURNING id;", + ("TicketAdded", 0, bin_data, datetime.datetime.now(),), + ) + + queue_id = await cur.fetchone() + assert queue_id[0] > 0 + + # Must get the record, call on_reply function and delete the record from DB + await handler.dispatch() + + async with aiopg.connect(**self.events_queue_db) as connect: + async with connect.cursor() as cur: + await cur.execute("SELECT COUNT(*) FROM event_queue WHERE id=%d" % (queue_id)) + records = await cur.fetchone() + + assert records[0] == 1 + + +if __name__ == "__main__": + unittest.main() diff --git a/tests/test_networks/test_handlers/test_events/test_services.py b/tests/test_networks/test_handlers/test_events/test_services.py new file mode 100644 index 00000000..b9ebf208 --- /dev/null +++ b/tests/test_networks/test_handlers/test_events/test_services.py @@ -0,0 +1,65 @@ +import unittest +from unittest.mock import ( + MagicMock, + patch, +) + +from minos.common.testing import ( + PostgresAsyncTestCase, +) +from minos.networks import ( + EventConsumerService, + EventHandlerService, +) +from tests.utils import ( + BASE_PATH, + FakeDispatcher, +) + + +class TestEventConsumerService(PostgresAsyncTestCase): + CONFIG_FILE_PATH = BASE_PATH / "test_config.yml" + + 
+    @patch("minos.networks.EventConsumer.from_config")
+    async def test_start(self, mock):
+        instance = FakeDispatcher()
+        mock.return_value = instance
+
+        service = EventConsumerService(loop=None, config=self.config)
+
+        self.assertEqual(0, instance.setup_count)
+        self.assertEqual(0, instance.setup_dispatch)
+        self.assertEqual(0, instance.setup_destroy)
+        await service.start()
+        self.assertEqual(1, instance.setup_count)
+        self.assertEqual(1, instance.setup_dispatch)
+        self.assertEqual(0, instance.setup_destroy)
+        await service.stop()
+        self.assertEqual(1, instance.setup_count)
+        self.assertEqual(1, instance.setup_dispatch)
+        self.assertEqual(1, instance.setup_destroy)
+
+
+class TestEventHandlerService(PostgresAsyncTestCase):
+    CONFIG_FILE_PATH = BASE_PATH / "test_config.yml"
+
+    async def test_start(self):
+        service = EventHandlerService(interval=1, loop=None, config=self.config)
+        mock = MagicMock(side_effect=service.dispatcher.setup)
+        service.dispatcher.setup = mock
+        await service.start()
+        self.assertEqual(1, mock.call_count)
+        await service.stop()
+
+    async def test_callback(self):
+        service = EventHandlerService(interval=1, loop=None, config=self.config)
+        await service.dispatcher.setup()
+        mock = MagicMock(side_effect=service.dispatcher.dispatch)
+        service.dispatcher.dispatch = mock
+        await service.callback()
+        self.assertEqual(1, mock.call_count)
+        await service.dispatcher.destroy()
+
+
+if __name__ == "__main__":
+    unittest.main()
diff --git a/tests/test_networks/test_rest_interface/__init__.py b/tests/test_networks/test_rest/__init__.py
similarity index 100%
rename from tests/test_networks/test_rest_interface/__init__.py
rename to tests/test_networks/test_rest/__init__.py
diff --git a/tests/test_networks/test_rest_interface/test_service.py b/tests/test_networks/test_rest/test_services.py
similarity index 90%
rename from tests/test_networks/test_rest_interface/test_service.py
rename to tests/test_networks/test_rest/test_services.py
index 5ec9de69..fdc61e17 100644
--- a/tests/test_networks/test_rest_interface/test_service.py
+++ b/tests/test_networks/test_rest/test_services.py
@@ -7,14 +7,14 @@
     MinosConfig,
 )
 from minos.networks import (
-    REST,
+    RestService,
 )
 from tests.utils import (
     BASE_PATH,
 )
 
 
-class TestRestInterfaceService(AioHTTPTestCase):
+class TestRestService(AioHTTPTestCase):
     CONFIG_FILE_PATH = BASE_PATH / "test_config.yml"
 
     async def get_application(self):
@@ -22,7 +22,7 @@ async def get_application(self):
         """
         Override the get_app method to return your application.
         """
         config = MinosConfig(self.CONFIG_FILE_PATH)
-        rest_interface = REST(config=config)
+        rest_interface = RestService(config=config)
 
         return await rest_interface.create_application()
diff --git a/tests/test_networks/test_rest_interface/test_interface.py b/tests/test_networks/test_rest_interface/test_interface.py
deleted file mode 100644
index 4ca51709..00000000
--- a/tests/test_networks/test_rest_interface/test_interface.py
+++ /dev/null
@@ -1,36 +0,0 @@
-"""
-from aiohttp.test_utils import (
-    AioHTTPTestCase,
-    unittest_run_loop,
-)
-from minos.common import (
-    MinosConfig,
-)
-from minos.networks import (
-    RestInterfaceHandler,
-)
-from tests.utils import (
-    BASE_PATH,
-)
-
-class TestInterface(AioHTTPTestCase):
-    CONFIG_FILE_PATH = BASE_PATH / "test_config.yml"
-
-    async def get_application(self):
-        rest_interface = RestInterfaceHandler(config=MinosConfig(self.CONFIG_FILE_PATH))
-
-        return rest_interface.get_app()
-
-    @unittest_run_loop
-    async def test_methods(self):
-        url = "/order"
-        resp = await self.client.request("GET", url)
-        assert resp.status == 200
-        text = await resp.text()
-        assert "Order get" in text
-
-        resp = await self.client.request("POST", url)
-        assert resp.status == 200
-        text = await resp.text()
-        assert "Order added" in text
-"""
diff --git a/tests/test_networks/test_snapshots/test_dispatchers.py b/tests/test_networks/test_snapshots/test_builders.py
similarity index 71%
rename from tests/test_networks/test_snapshots/test_dispatchers.py
rename to tests/test_networks/test_snapshots/test_builders.py
index 89cb7c97..772f1c0d 100644
--- a/tests/test_networks/test_snapshots/test_dispatchers.py
+++ b/tests/test_networks/test_snapshots/test_builders.py
@@ -27,8 +27,8 @@
     PostgresAsyncTestCase,
 )
 from minos.networks import (
-    MinosSnapshotDispatcher,
-    MinosSnapshotEntry,
+    SnapshotBuilder,
+    SnapshotEntry,
 )
 from tests.aggregate_classes import (
     Car,
@@ -38,14 +38,14 @@
 )
 
 
-class TestMinosSnapshotDispatcher(PostgresAsyncTestCase):
+class TestSnapshotBuilder(PostgresAsyncTestCase):
     CONFIG_FILE_PATH = BASE_PATH / "test_config.yml"
 
     def test_type(self):
-        self.assertTrue(issubclass(MinosSnapshotDispatcher, object))
+        self.assertTrue(issubclass(SnapshotBuilder, object))
 
     def test_from_config(self):
-        dispatcher = MinosSnapshotDispatcher.from_config(config=self.config)
+        dispatcher = SnapshotBuilder.from_config(config=self.config)
         self.assertEqual(self.config.snapshot.host, dispatcher.host)
         self.assertEqual(self.config.snapshot.port, dispatcher.port)
         self.assertEqual(self.config.snapshot.database, dispatcher.database)
@@ -54,10 +54,10 @@ def test_from_config(self):
 
     def test_from_config_raises(self):
         with self.assertRaises(MinosConfigException):
-            MinosSnapshotDispatcher.from_config()
+            SnapshotBuilder.from_config()
 
     async def test_setup_snapshot_table(self):
-        async with MinosSnapshotDispatcher.from_config(config=self.config):
+        async with SnapshotBuilder.from_config(config=self.config):
             async with aiopg.connect(**self.snapshot_db) as connection:
                 async with connection.cursor() as cursor:
                     await cursor.execute(
@@ -68,7 +68,7 @@ async def test_setup_snapshot_table(self):
         self.assertEqual(True, observed)
 
     async def test_setup_snapshot_aux_offset_table(self):
-        async with MinosSnapshotDispatcher.from_config(config=self.config):
+        async with SnapshotBuilder.from_config(config=self.config):
             async with aiopg.connect(**self.snapshot_db) as connection:
                 async with connection.cursor() as cursor:
                     await cursor.execute(
@@ -80,39 +80,38 @@
     async def test_dispatch_select(self):
         await self._populate()
 
-        with self.config:
-            async with MinosSnapshotDispatcher.from_config() as dispatcher:
-                await dispatcher.dispatch()
-                observed = [v async for v in dispatcher.select()]
+        async with SnapshotBuilder.from_config(config=self.config) as dispatcher:
+            await dispatcher.dispatch()
+            observed = [v async for v in dispatcher.select()]
 
         expected = [
-            MinosSnapshotEntry.from_aggregate(Car(2, 2, 3, "blue")),
-            MinosSnapshotEntry.from_aggregate(Car(3, 1, 3, "blue")),
+            SnapshotEntry.from_aggregate(Car(2, 2, 3, "blue")),
+            SnapshotEntry.from_aggregate(Car(3, 1, 3, "blue")),
         ]
         self._assert_equal_snapshot_entries(expected, observed)
 
     async def test_dispatch_ignore_previous_version(self):
-        with self.config:
-            dispatcher = MinosSnapshotDispatcher.from_config()
-            await dispatcher.setup()
-            car = Car(1, 1, 3, "blue")
-            # noinspection PyTypeChecker
-            aggregate_name: str = car.classname
+        dispatcher = SnapshotBuilder.from_config(config=self.config)
+        await dispatcher.setup()
 
-            async def _fn(*args, **kwargs):
-                yield MinosRepositoryEntry(1, aggregate_name, 1, car.avro_bytes)
-                yield MinosRepositoryEntry(1, aggregate_name, 3, car.avro_bytes)
-                yield MinosRepositoryEntry(1, aggregate_name, 2, car.avro_bytes)
+        car = Car(1, 1, 3, "blue")
+        # noinspection PyTypeChecker
+        aggregate_name: str = car.classname
 
-            with patch("minos.common.PostgreSqlMinosRepository.select", _fn):
-                await dispatcher.dispatch()
-                observed = [v async for v in dispatcher.select()]
+        async def _fn(*args, **kwargs):
+            yield MinosRepositoryEntry(1, aggregate_name, 1, car.avro_bytes)
+            yield MinosRepositoryEntry(1, aggregate_name, 3, car.avro_bytes)
+            yield MinosRepositoryEntry(1, aggregate_name, 2, car.avro_bytes)
+
+        with patch("minos.common.PostgreSqlMinosRepository.select", _fn):
+            await dispatcher.dispatch()
+            observed = [v async for v in dispatcher.select()]
 
-            expected = [MinosSnapshotEntry(1, aggregate_name, 3, car.avro_bytes)]
-            self._assert_equal_snapshot_entries(expected, observed)
+        expected = [SnapshotEntry(1, aggregate_name, 3, car.avro_bytes)]
+        self._assert_equal_snapshot_entries(expected, observed)
 
-    def _assert_equal_snapshot_entries(self, expected: list[MinosSnapshotEntry], observed: list[MinosSnapshotEntry]):
+    def _assert_equal_snapshot_entries(self, expected: list[SnapshotEntry], observed: list[SnapshotEntry]):
         self.assertEqual(len(expected), len(observed))
         for exp, obs in zip(expected, observed):
             self.assertEqual(exp.aggregate, obs.aggregate)
@@ -121,7 +120,7 @@ def _assert_equal_snapshot_entries(self, expected: list[MinosSnapshotEntry], obs
 
     async def test_dispatch_with_offset(self):
         async with await self._populate() as repository:
-            async with MinosSnapshotDispatcher.from_config(config=self.config) as dispatcher:
+            async with SnapshotBuilder.from_config(config=self.config) as dispatcher:
                 mock = MagicMock(side_effect=dispatcher.repository.select)
                 dispatcher.repository.select = mock
 
diff --git a/tests/test_networks/test_snapshots/test_entries.py b/tests/test_networks/test_snapshots/test_entries.py
index 7c752783..40b20203 100644
--- a/tests/test_networks/test_snapshots/test_entries.py
+++ b/tests/test_networks/test_snapshots/test_entries.py
@@ -11,16 +11,16 @@
 )
 
 from minos.networks import (
-    MinosSnapshotEntry,
+    SnapshotEntry,
 )
 from tests.aggregate_classes import (
     Car,
 )
 
 
-class TestMinosSnapshotEntry(unittest.TestCase):
+class TestSnapshotEntry(unittest.TestCase):
     def test_constructor(self):
-        entry = MinosSnapshotEntry(1234, "example.Car", 0, bytes("car", "utf-8"))
+        entry = SnapshotEntry(1234, "example.Car", 0, bytes("car", "utf-8"))
         self.assertEqual(1234, entry.aggregate_id)
         self.assertEqual("example.Car", entry.aggregate_name)
         self.assertEqual(0, entry.version)
@@ -29,7 +29,7 @@ def test_constructor(self):
         self.assertEqual(None, entry.updated_at)
 
     def test_constructor_extended(self):
-        entry = MinosSnapshotEntry(
+        entry = SnapshotEntry(
             1234, "example.Car", 0, bytes("car", "utf-8"), datetime(2020, 1, 10, 4, 23), datetime(2020, 1, 10, 4, 25)
         )
         self.assertEqual(1234, entry.aggregate_id)
@@ -41,7 +41,7 @@ def test_constructor_extended(self):
 
     def test_from_aggregate(self):
         car = Car(1, 1, 3, "blue")
-        entry = MinosSnapshotEntry.from_aggregate(car)
+        entry = SnapshotEntry.from_aggregate(car)
         self.assertEqual(car.id, entry.aggregate_id)
         self.assertEqual(car.classname, entry.aggregate_name)
         self.assertEqual(car.version, entry.version)
@@ -50,30 +50,30 @@ def test_from_aggregate(self):
         self.assertEqual(None, entry.updated_at)
 
     def test_equals(self):
-        a = MinosSnapshotEntry(1234, "example.Car", 0, bytes("car", "utf-8"))
-        b = MinosSnapshotEntry(1234, "example.Car", 0, bytes("car", "utf-8"))
+        a = SnapshotEntry(1234, "example.Car", 0, bytes("car", "utf-8"))
+        b = SnapshotEntry(1234, "example.Car", 0, bytes("car", "utf-8"))
         self.assertEqual(a, b)
 
     def test_hash(self):
-        entry = MinosSnapshotEntry(1234, "example.Car", 0, bytes("car", "utf-8"))
+        entry = SnapshotEntry(1234, "example.Car", 0, bytes("car", "utf-8"))
         self.assertIsInstance(hash(entry), int)
 
     def test_aggregate_cls(self):
         car = Car(1, 1, 3, "blue")
-        entry = MinosSnapshotEntry.from_aggregate(car)
+        entry = SnapshotEntry.from_aggregate(car)
         self.assertEqual(Car, entry.aggregate_cls)
 
     def test_aggregate(self):
         car = Car(1, 1, 3, "blue")
-        entry = MinosSnapshotEntry.from_aggregate(car)
+        entry = SnapshotEntry.from_aggregate(car)
         self.assertEqual(car, entry.aggregate)
 
     def test_repr(self):
-        entry = MinosSnapshotEntry(
+        entry = SnapshotEntry(
             1234, "example.Car", 0, bytes("car", "utf-8"), datetime(2020, 1, 10, 4, 23), datetime(2020, 1, 10, 4, 25),
         )
         expected = (
-            "MinosSnapshotEntry(aggregate_id=1234, aggregate_name='example.Car', version=0, data=b'car', "
+            "SnapshotEntry(aggregate_id=1234, aggregate_name='example.Car', version=0, data=b'car', "
             "created_at=datetime.datetime(2020, 1, 10, 4, 23), updated_at=datetime.datetime(2020, 1, 10, 4, 25))"
         )
         self.assertEqual(expected, repr(entry))
diff --git a/tests/test_networks/test_snapshots/test_services.py b/tests/test_networks/test_snapshots/test_services.py
index ff43ad17..c4a6766f 100644
--- a/tests/test_networks/test_snapshots/test_services.py
+++ b/tests/test_networks/test_snapshots/test_services.py
@@ -22,46 +22,40 @@
     PostgresAsyncTestCase,
 )
 from minos.networks import (
-    MinosSnapshotDispatcher,
-    MinosSnapshotService,
+    SnapshotBuilder,
+    SnapshotService,
 )
 from tests.utils import (
     BASE_PATH,
 )
 
 
-class TestMinosSnapshotService(PostgresAsyncTestCase):
+class TestSnapshotService(PostgresAsyncTestCase):
     CONFIG_FILE_PATH = BASE_PATH / "test_config.yml"
 
     def test_is_instance(self):
-        with self.config:
-            service = MinosSnapshotService(interval=0.1)
-            self.assertIsInstance(service, PeriodicService)
+        service = SnapshotService(interval=0.1, config=self.config)
+        self.assertIsInstance(service, PeriodicService)
 
     def test_dispatcher_config_raises(self):
         with self.assertRaises(MinosConfigException):
-            MinosSnapshotService(interval=0.1)
+            SnapshotService(interval=0.1)
 
     def test_dispatcher_config(self):
-        service = MinosSnapshotService(interval=0.1, config=self.config)
+        service = SnapshotService(interval=0.1, config=self.config)
         dispatcher = service.dispatcher
-        self.assertIsInstance(dispatcher, MinosSnapshotDispatcher)
+        self.assertIsInstance(dispatcher, SnapshotBuilder)
         self.assertFalse(dispatcher.already_setup)
 
-    def test_dispatcher_config_context(self):
-        with self.config:
-            service = MinosSnapshotService(interval=0.1)
-            self.assertIsInstance(service.dispatcher, MinosSnapshotDispatcher)
-
     async def test_start(self):
-        service = MinosSnapshotService(interval=0.1, loop=None, config=self.config)
+        service = SnapshotService(interval=0.1, loop=None, config=self.config)
         service.dispatcher.setup = MagicMock(side_effect=service.dispatcher.setup)
         await service.start()
         self.assertTrue(1, service.dispatcher.setup.call_count)
         await service.stop()
 
     async def test_callback(self):
-        service = MinosSnapshotService(interval=0.1, loop=None, config=self.config)
+        service = SnapshotService(interval=0.1, loop=None, config=self.config)
         await service.dispatcher.setup()
         service.dispatcher.dispatch = MagicMock(side_effect=service.dispatcher.dispatch)
         await service.callback()
diff --git a/tests/utils.py b/tests/utils.py
index 60075770..e43e3631 100644
--- a/tests/utils.py
+++ b/tests/utils.py
@@ -5,12 +5,22 @@
 Minos framework can not be copied and/or distributed without the express
 permission of Clariteia SL.
 """
+from collections import (
+    namedtuple,
+)
 from pathlib import (
     Path,
 )
+from typing import (
+    NoReturn,
+)
 
 from minos.common import (
     Aggregate,
+    CommandReply,
+    MinosBroker,
+    MinosModel,
+    MinosSagaManager,
 )
 
 BASE_PATH = Path(__file__).parent
@@ -20,3 +30,84 @@ class NaiveAggregate(Aggregate):
     """Naive aggregate class to be used for testing purposes."""
 
     test: int
+
+
+Message = namedtuple("Message", ["topic", "partition", "value"])
+
+
+class FakeConsumer:
+    """For testing purposes."""
+
+    def __init__(self, messages=None):
+        if messages is None:
+            messages = [Message(topic="TicketAdded", partition=0, value=bytes())]
+        self.messages = messages
+
+    async def start(self):
+        """For testing purposes."""
+
+    async def stop(self):
+        """For testing purposes."""
+
+    async def getmany(self, *args, **kwargs):
+        return dict(enumerate(self.messages))
+
+    async def __aiter__(self):
+        for message in self.messages:
+            yield message
+
+
+class FakeDispatcher:
+    """For testing purposes"""
+
+    def __init__(self):
+        self.setup_count = 0
+        self.setup_dispatch = 0
+        self.setup_destroy = 0
+
+    async def setup(self):
+        """For testing purposes."""
+        self.setup_count += 1
+
+    async def dispatch(self):
+        """For testing purposes."""
+        self.setup_dispatch += 1
+
+    async def destroy(self):
+        """For testing purposes."""
+        self.setup_destroy += 1
+
+
+class FakeSagaManager(MinosSagaManager):
+    """For testing purposes."""
+
+    def __init__(self):
+        super().__init__()
+        self.name = None
+        self.reply = None
+
+    def _run_new(self, name: str, **kwargs) -> NoReturn:
+        self.name = name
+
+    def _load_and_run(self, reply: CommandReply, **kwargs) -> NoReturn:
+        self.reply = reply
+
+
+class FakeBroker(MinosBroker):
+    """For testing purposes."""
+
+    def __init__(self):
+        super().__init__()
+        self.items = None
+        self.topic = None
+        self.saga_id = None
+        self.task_id = None
+
+    async def send(
+        self, items: list[MinosModel], topic: str = None, saga_id: str = None, task_id: str = None, **kwargs
+    ) -> NoReturn:
+        """For testing purposes."""
+        self.items = items
+        self.topic = topic
+        self.saga_id = saga_id
+        self.task_id = task_id