OnlineShop

The main purpose of this project is to demonstrate communication between microservices. Following the basic approach of microservice architecture, all microservices work independently and do not affect each other, so each of them can be deployed independently. The project is built with the onion architecture.
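The onion architecture keeps dependencies pointing inward: the domain core knows nothing about persistence, and outer layers plug in through interfaces. The project itself is written in C#/.NET; the following is only a minimal language-agnostic sketch of that layering (all names such as `Product`, `IProductRepository`, and `CreateProductService` are illustrative, not taken from the actual codebase):

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Optional

# Domain (core ring): entities and repository abstractions, no outward dependencies
@dataclass
class Product:
    id: int
    name: str

class IProductRepository(ABC):
    @abstractmethod
    def add(self, product: Product) -> None: ...
    @abstractmethod
    def get(self, product_id: int) -> Optional[Product]: ...

# Application ring: use cases depend only on the domain abstractions
class CreateProductService:
    def __init__(self, repo: IProductRepository):
        self.repo = repo

    def create(self, product_id: int, name: str) -> Product:
        product = Product(product_id, name)
        self.repo.add(product)
        return product

# Infrastructure (outer ring): a concrete store, swappable without touching the core
class InMemoryProductRepository(IProductRepository):
    def __init__(self):
        self.items = {}

    def add(self, product: Product) -> None:
        self.items[product.id] = product

    def get(self, product_id: int) -> Optional[Product]:
        return self.items.get(product_id)

repo = InMemoryProductRepository()
service = CreateProductService(repo)
service.create(1, "Keyboard")
print(repo.get(1).name)  # Keyboard
```

Because each service wires its own concrete repository into the outer ring, one API can use PostgreSQL and another MongoDB without changing the inner layers.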

Used Technologies

  • .NET 6
  • Entity Framework Core
  • PostgreSQL
  • MongoDB
  • RabbitMQ
  • Elasticsearch
  • Kibana
  • Redis
  • Docker - Docker Compose

Prerequisites

You will need the following tools:

Project Infrastructure

First of all, the infrastructure used in the core part of the project is located in the relevant main classes. Application and persistence layers are coded separately for each service. An event bus infrastructure was created for the project, using MassTransit and RabbitMQ. Separate repositories have been created for the read and command structures in the WebApi layer; in this way, read and write operations can be separated when necessary. A Redis lock is used against synchronization errors that may occur in the APIs. Elasticsearch is used for the reflection data presented to external systems, and it is also used together with Kibana for log management.
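The Redis lock mentioned above typically follows the `SET key value NX EX ttl` pattern: the first caller to set the key owns the lock, later callers are rejected until the key expires or is released. The project does this from .NET against a real Redis server; below is only a minimal sketch of the pattern, with an in-memory stand-in (`FakeRedis`) instead of a real client, and hypothetical names throughout:

```python
import time
import uuid

class FakeRedis:
    """In-memory stand-in for a Redis client, mimicking SET NX EX semantics."""
    def __init__(self):
        self.store = {}

    def set(self, key, value, nx=False, ex=None):
        # SET key value NX EX ttl: with nx=True, succeed only if the key is absent
        if nx and self.get(key) is not None:
            return False
        self.store[key] = (value, time.time() + ex if ex else None)
        return True

    def get(self, key):
        entry = self.store.get(key)
        if entry is None:
            return None
        value, expiry = entry
        if expiry is not None and time.time() > expiry:
            del self.store[key]  # lazily expire, like Redis TTLs
            return None
        return value

    def delete_if_owner(self, key, token):
        # With real Redis this check-and-delete must be a Lua script to stay atomic
        if self.get(key) == token:
            del self.store[key]
            return True
        return False

def acquire_lock(client, resource, ttl_seconds=10):
    """Return an owner token on success, None if someone else holds the lock."""
    token = str(uuid.uuid4())
    if client.set(f"lock:{resource}", token, nx=True, ex=ttl_seconds):
        return token
    return None

r = FakeRedis()
t1 = acquire_lock(r, "product:42")
t2 = acquire_lock(r, "product:42")   # second caller is rejected
print(t1 is not None, t2 is None)    # True True
r.delete_if_owner("lock:product:42", t1)
```

The TTL guarantees the lock cannot be held forever if an API instance crashes, and the unique token prevents one instance from releasing a lock it no longer owns.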

(image: project)

The project consists of three main entities: Product, Customer and Order. OrderApi and CustomerApi work with PostgreSQL, while ProductApi works with MongoDB. The databases of all APIs work separately and independently from each other. All APIs are coded with the CQRS pattern.
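In CQRS, commands (writes) and queries (reads) travel through separate handlers and repositories, which is what lets the read and write sides be split later. The project implements this in C# (via mediator handlers); the sketch below only illustrates the separation with hypothetical names like `CreateProductCommand` and `ProductQueryHandler`:

```python
from dataclasses import dataclass

# Write side: a command carries intent and is processed by a command handler
@dataclass
class CreateProductCommand:
    product_id: int
    name: str

class ProductCommandHandler:
    def __init__(self, write_store):
        self.write_store = write_store

    def handle(self, cmd: CreateProductCommand) -> None:
        self.write_store[cmd.product_id] = {"id": cmd.product_id, "name": cmd.name}

# Read side: queries go through a separate repository, never the command path
class ProductQueryHandler:
    def __init__(self, read_store):
        self.read_store = read_store

    def get_by_id(self, product_id):
        return self.read_store.get(product_id)

store = {}  # a single dict here; the project keeps separate read/command repositories
ProductCommandHandler(store).handle(CreateProductCommand(1, "Keyboard"))
print(ProductQueryHandler(store).get_by_id(1))
```

Since no class touches both paths, the read store could later be moved to a different database or a cache without changing any command handler.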

(image: data_diagram)

The relationship between all entities is shown in the picture above. After the Product and Customer registration processes are completed, order registration can be done. If the order registration is successful, a separate event is created and the order data is sent to the reflection service. The reflection service allows us to both process the data and present it securely in a place separate from the main data, so the external API can serve data to external clients quickly and securely.
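The publish/consume flow above is handled in the project by MassTransit over RabbitMQ. As a rough sketch of the idea only, the snippet below replaces the broker with an in-memory bus; the topic name `order-created` and the event payload are invented for illustration:

```python
from collections import defaultdict

class InMemoryBus:
    """Stand-in for RabbitMQ/MassTransit: maps a topic to its subscribers."""
    def __init__(self):
        self.handlers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.handlers[topic].append(handler)

    def publish(self, topic, message):
        for handler in self.handlers[topic]:
            handler(message)

bus = InMemoryBus()

# The reflection service keeps its own copy, separate from the main order database
reflection_store = []
bus.subscribe("order-created", lambda event: reflection_store.append(event))

# OrderApi publishes an event only after a successful order registration
bus.publish("order-created", {"orderId": 101, "customerId": 7, "productId": 42})
print(reflection_store)
```

Because the reflection service only ever sees events, it stays decoupled from OrderApi: the order database can be unavailable and the external API can still serve the last reflected state.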

Logging has been added to all mediator handlers. Logs can be checked via Kibana using the correlationId information found in the classes.
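The idea is that every log line written while handling a request carries the same correlationId, so a single Kibana search reconstructs the whole request. A minimal sketch of the pattern, with a hypothetical handler name and payload (the real handlers are C# mediator classes):

```python
import logging
import uuid

logging.basicConfig(format="%(levelname)s %(message)s", level=logging.INFO)
log = logging.getLogger("OrderApi")

def handle_create_order(request: dict) -> str:
    """Hypothetical handler: every log line carries the request's correlationId."""
    correlation_id = request.get("correlationId") or str(uuid.uuid4())
    log.info("correlationId=%s CreateOrder started", correlation_id)
    # ... actual command handling would happen here ...
    log.info("correlationId=%s CreateOrder finished", correlation_id)
    return correlation_id

cid = handle_create_order({"correlationId": "abc-123", "customerId": 7})
```

Generating a fresh id only when the caller did not supply one lets the id propagate across services: CustomerApi can pass its correlationId along when it triggers work in OrderApi.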

How to Use

After opening a command prompt, go to the directory where the docker-compose file is located and run the command below.

 CMD>docker-compose up -d

The following endpoints are available after running the program with Docker Compose.

Api               Address
ProductApi        http://localhost:3000/swagger/index.html
OrderApi          http://localhost:3001/swagger/index.html
CustomerApi       http://localhost:3002/swagger/index.html
OrderExternalApi  http://localhost:3003/swagger/index.html

To see the logs and the external order data, use the Kibana address below.

  http://localhost:5601/