This project demonstrates deploying an Iris classification model as a Flask REST API, containerizing it with Docker, and running it on a Kubernetes cluster. It showcases skills in machine learning, containerization, and DevOps.
- Python
- Flask (for REST API)
- Scikit-learn (for ML model)
- Docker
- Kubernetes
- Clone the repository:
git clone https://github.com/your-repo/ml-flask-docker-k8s.git
cd ml-flask-docker-k8s
- Create a virtual environment and install dependencies:
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
- Run the Flask app (a minimal sketch of app.py follows this step):
python app.py
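The repository's app.py is not reproduced in this README. The sketch below is an illustrative stand-in, assuming the model is a scikit-learn classifier trained on the built-in Iris dataset at startup and that /predict accepts a JSON body with a "features" list; the actual file may differ.

```python
# Minimal sketch of app.py (illustrative, not the repository's exact code).
from flask import Flask, jsonify, request
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

app = Flask(__name__)

# Train a simple classifier at startup; a real project would more likely load a
# pre-trained, pickled model from disk instead.
iris = load_iris()
model = LogisticRegression(max_iter=200).fit(iris.data, iris.target)

@app.route("/predict", methods=["POST"])
def predict():
    # Assumed payload shape: {"features": [sepal_len, sepal_wid, petal_len, petal_wid]}
    features = request.get_json(force=True)["features"]
    label = int(model.predict([features])[0])
    return jsonify({"class_id": label, "class_name": iris.target_names[label]})

if __name__ == "__main__":
    # Bind to 0.0.0.0 so the server is reachable from outside a Docker container.
    app.run(host="0.0.0.0", port=5000)
```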
- Test the API endpoint (an example request is shown below):
http://127.0.0.1:5000/predict
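A quick way to exercise the endpoint is a POST request with a sample flower. The payload shape below matches the sketch above and is an assumption; adjust it to whatever your app.py actually expects.

```python
import requests

# Hypothetical payload shape; adapt the field name and feature order to your app.
response = requests.post(
    "http://127.0.0.1:5000/predict",
    json={"features": [5.1, 3.5, 1.4, 0.2]},
)
print(response.status_code, response.json())
```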
- Write a Dockerfile to containerize the Flask application:
FROM python:3.9
WORKDIR /app
COPY . /app
RUN pip install -r requirements.txt
EXPOSE 5000
CMD ["python", "app.py"]
- Build the Docker image:
docker build -t iris .
- Run the Docker container locally:
docker run -d -p 5000:5000 iris
- Create a repository on Docker Hub and push the image:
docker tag iris your-docker-username/iris:latest
docker push your-docker-username/iris:latest
- Set up a local Kubernetes cluster using Minikube:
minikube start
- Write Kubernetes deployment and service YAML files:
iris.yaml:
apiVersion: apps/v1
kind: Deployment
metadata:
  name: irisapp
spec:
  selector:
    matchLabels:
      app: irisapp
  template:
    metadata:
      labels:
        app: irisapp
    spec:
      containers:
        - name: iris
          # Replace with your image from the Docker registry, e.g. harshdupare/irisclass:v1.0
          image: your-docker-username/iris:latest
          resources:
            limits:
              memory: "128Mi"
              cpu: "500m"
          ports:
            - containerPort: 5000
---
apiVersion: v1
kind: Service
metadata:
  name: irisapp-service
spec:
  selector:
    app: irisapp
  ports:
    - protocol: TCP
      port: 80
      targetPort: 5000
      nodePort: 30000
  type: NodePort
- Deploy the Dockerized application on the Kubernetes cluster:
kubectl apply -f iris.yaml
- Access the application by port-forwarding to a pod of the deployment, then send a request as shown below:
kubectl port-forward your-pod-name 8080:5000
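With the port-forward running, the same kind of request verifies the deployment end to end (again assuming the payload shape used earlier).

```python
import requests

# Reaches the Flask app running inside the cluster through the kubectl port-forward.
response = requests.post(
    "http://localhost:8080/predict",
    json={"features": [6.3, 2.9, 5.6, 1.8]},  # hypothetical payload shape
)
print(response.json())
```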
This project demonstrates the complete workflow of developing, containerizing, and deploying a machine learning model as a REST API. By following the steps outlined above, you can deploy your own machine learning models and APIs using Flask, Docker, and Kubernetes.