# ETL Data Pipeline


## Objective

Create a data pipeline that ingests user data via an API, processes and stores it, and then retrieves it in a serialized format.

## Components

1. Data Source: an API that generates random fake user data.
2. Python & Pandas: programming and data manipulation.
3. Redis: caching recent data for quick access.
4. Postgres: long-term data storage.
5. FastAPI: an API endpoint for data retrieval.
6. Docker: containerization of the entire pipeline.

## Steps

![Pipeline flow diagram](assets/flow.png)

1. Data Ingestion:
   - Python script to fetch random user data from an API.
   - Validate the data before processing.
   - Use Pandas for data cleaning and transformation.
2. Caching Layer:
   - Set up Redis to cache recent user data with a TTL.
   - Python logic to retrieve data from Redis, falling back to Postgres.
3. Data Storage:
   - Design and implement a Postgres database schema for the user data.
   - Hash PII before it is written to storage.
   - Store the processed data in Postgres.
4. Data Retrieval:
   - API endpoint (e.g., using FastAPI) for data retrieval.
5. Dockerization:
   - Dockerfile for the Python application.
   - Docker Compose for orchestrating the Redis and Postgres services.
6. Testing and Deployment:
   - Unit tests for the pipeline components.

## Learning Outcomes

- Data pipeline architecture.
- Skills in Python, Pandas, Redis, Postgres, FastAPI, and Docker.

## Further Enhancements

- Front-end dashboard for data display.
- Advanced data processing features.

## How to test the project

Clone the repo:

```bash
git clone https://github.com/mrpbennett/etl-pipeline.git
```

Then `cd` into the cloned repo and bring the stack up with Docker Compose:

```bash
docker compose up
```
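For orientation, a minimal Compose file for this stack might look like the sketch below. This is an assumption for illustration, not the repo's actual `docker-compose.yaml`: service names, images, and ports may differ.

```yaml
services:
  app:
    build: .              # Dockerfile for the Python application
    ports:
      - "8000:8000"       # FastAPI (assumed port)
    depends_on:
      - redis
      - postgres
  redis:
    image: redis:7
    ports:
      - "6379:6379"
  postgres:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: example   # placeholder credential
    ports:
      - "5432:5432"
```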

Then head over to the URL below to access the front end and see where the data is stored:

http://127.0.0.1:5173

## ⭐ Stargazers

Star History Chart