Let's try to understand our complex project in the simplest way possible, divided into four sections.
- Section 1: Introduction, explaining the motivation behind the project.
- Section 2: Covers terminology, basic tools, and technology.
- Section 3: Details the procedure, including all the steps leading up to the project and how it is executed.
- Final Section: Discusses the impact the project has made, concluding with a presentation of the work.
import yfinance as yf
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import precision_score
import streamlit as st
import pickle
import matplotlib.pyplot as plt
The basic essence of the project is simple: a web application that, given the required input, predicts and returns the output, a signal to BUY or SELL.
Saving money belongs to the past; investing is the present. Stock markets have been a reliable option for decades, consistently yielding returns when approached systematically.
Data-driven Opportunity:
- Stock markets generate substantial data daily.
- Identifying the right moment within this data using machine learning techniques is the fundamental principle of this project.
Project Focus:
- Amidst the vast randomness and large volumes of data, this project is a small attempt to predict daily price movement using ML techniques.
- The main focus is on the Indian National Stock Exchange (NSE).
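To make "predicting daily movement" concrete, the snippet below labels each trading day as up (1) or down (0) based on the next day's close. This is a minimal sketch: the prices and column names here are illustrative stand-ins, while the actual project fetches NSE history via yfinance.

```python
import pandas as pd

# Illustrative closing prices (in practice these come from yfinance for an NSE ticker)
prices = pd.DataFrame({"Close": [100.0, 102.0, 101.0, 105.0, 104.0]})

# Compare tomorrow's close with today's: 1 = market moved up, 0 = down/flat
prices["Tomorrow"] = prices["Close"].shift(-1)
prices["Target"] = (prices["Tomorrow"] > prices["Close"]).astype(int)

print(prices["Target"].tolist())  # → [1, 0, 1, 0, 0]; the last row has no "tomorrow"
```

A column like `Target` is exactly the kind of binary label a classifier can be trained to predict.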
In this project, the following key technologies are employed:
- Programming: Python facilitates smooth communication between users and machine learning algorithms. (PYTHON)
- Visualization: Matplotlib is used for effective data visualization. (MATPLOTLIB, TABLEAU)
- Machine Learning: The random forest classifier serves as the backbone. (RANDOM FOREST CLASSIFIER)
- Operational Tools: Docker containers enhance operational efficiency. (DOCKER)
- Cloud Computing: AWS is employed to host the application in the real world. (AWS EC2)
- User Interface: Streamlit, a Python library, is used to develop the front end. (STREAMLIT)
To make predictions accessible, a user-friendly app built with Streamlit turns complicated stock data into a visual story. It's like turning financial complexity into a narrative that anyone can understand, with no need to be a financial expert.
import streamlit as st
Dockerizing is like packing a toolbox with everything needed for predicting stocks – codes, apps, and more – into a single, easy-to-move container. It's a travel-ready kit, simplifying the deployment of our stock prediction tool on any system. 🌍
Random Fact: Docker was inspired by shipping containers, streamlining the transport of goods globally. (Source: Docker)
FROM python:3.8
COPY . /app
...
The rest of the code is here: Dockerfile.
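Since the full Dockerfile is only linked rather than shown, here is a minimal sketch of what such a file typically looks like for a Streamlit app. The requirements file name, port, and start command are assumptions, not the project's exact contents.

```dockerfile
# Base image with Python 3.8 (matches the snippet above)
FROM python:3.8

# Copy the project into the image and work from there
COPY . /app
WORKDIR /app

# Install dependencies (assumes a requirements.txt exists in the project)
RUN pip install -r requirements.txt

# Streamlit serves on port 8501 by default
EXPOSE 8501

# Launch the app when the container starts
CMD ["streamlit", "run", "app.py"]
```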
In the digital clouds, Amazon EC2 instances are our virtual helpers. They provide a space for our prediction tool to run smoothly. These instances act like virtual servers, making our stock predictions accessible without worrying about physical limitations.
# Modernized ML Deployment Journey
1. ML Code Development:
- Kick off the journey by crafting intelligent machine-learning code using the powerful Random Forest Classifier.
# Creating RandomForestClassifier model
model = RandomForestClassifier(n_estimators=200, min_samples_split=100, random_state=1)
Think of it as 200 analysts (decision trees) voting together, where each analyst refines a judgment (splits a node) only when at least 100 data points support it; this keeps individual trees from overfitting to noise.
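To see the model in action end to end, here is a hedged sketch on synthetic data: a random-walk price series stands in for real NSE history, and `Change` is one illustrative feature (the actual project's features and data come from yfinance and may differ).

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import precision_score

# Synthetic stand-in for real NSE history: a random-walk close price
rng = np.random.default_rng(1)
df = pd.DataFrame({"Close": 100 + np.cumsum(rng.normal(0, 1, 500))})
df["Change"] = df["Close"].pct_change().fillna(0)          # illustrative feature
df["Target"] = (df["Close"].shift(-1) > df["Close"]).astype(int)

predictors = ["Close", "Change"]
train, test = df.iloc[:-100], df.iloc[-100:]               # time-ordered split, no shuffling

model = RandomForestClassifier(n_estimators=200, min_samples_split=100, random_state=1)
model.fit(train[predictors], train["Target"])

preds = model.predict(test[predictors])
precision = precision_score(test["Target"], preds, zero_division=0)
print(f"precision: {precision:.2f}")
```

Note the split is by time rather than at random: shuffling would leak future prices into training, which is the cardinal sin of market prediction. Precision is the natural metric here, since it measures how often a BUY signal was actually followed by an up day.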
The whole ML code can be accessed here.
2. Pickle Packaging:
- Bundle up the brilliance! Pack the ML code into a sleek "pickle," a binary storage box that holds the essence of your algorithm.
import pickle

# Save the trained model to a file using pickle
with open('random_forest_model.pkl', 'wb') as model_file:
    pickle.dump(model, model_file)
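The counterpart step is loading the pickle back. The self-contained sketch below trains a tiny stand-in model so the round trip can be demonstrated in isolation; the real project pickles its trained stock model the same way.

```python
import pickle
from sklearn.ensemble import RandomForestClassifier

# Tiny stand-in model (the project's real model is trained on stock data)
model = RandomForestClassifier(n_estimators=10, random_state=1)
model.fit([[0], [1], [2], [3]], [0, 0, 1, 1])

# Save the model, then load it back from disk
with open('random_forest_model.pkl', 'wb') as model_file:
    pickle.dump(model, model_file)
with open('random_forest_model.pkl', 'rb') as model_file:
    loaded = pickle.load(model_file)

# The round-tripped model behaves identically to the original
print(list(loaded.predict([[0], [3]])))
```

One caveat worth knowing: a pickle should be unpickled with the same scikit-learn version it was saved with, which is another reason the project pins its environment inside Docker.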
3. Streamlit for the Wow Factor:
- Inject life into your creation! Load the pickled ML code into a dynamic Python script using Streamlit—a magic wand for creating stunning frontends and user interfaces.
4. Streamlit Application Check:
- Give it a spin! Test the waters with the
streamlit run app.py
command, ensuring your Streamlit application dances flawlessly. (Remember, "app.py" is the script's spotlight name.)
5. Docker File Creation:
- Time to pack a punch! Wrap up the entire script and ML code into a Docker file—a set of instructions that brings your creation to life in the virtual world.
6. Docker Image Mastery:
- Layers on layers! Transform your Docker file into a mesmerizing "Docker image," each layer holding key details about your code.
7. Containerization Enchantment:
- Unleash the magic! As you hit the play button, your Docker image transforms into a portable "container," neatly encapsulating all the ingredients needed for the show—libraries, dependencies, and more.
8. EC2 Deployment:
- Elevate to the cloud! Your Docker container takes center stage as it gracefully steps onto an EC2 instance, a virtual server ready to amplify your creation.
9. AWS EC2 Grand Debut:
- Lights, camera, action! Your EC2 server runs the Docker container seamlessly, and the fusion of a public IP address and host opens the curtain to your website's grand debut.
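Steps 5 through 9 above boil down to a handful of commands. This is a sketch: the image name and tag are illustrative, and the last command is what you would run on the EC2 instance itself after transferring or pulling the image.

```shell
# Step 6: build the Docker image from the Dockerfile in the current directory
docker build -t stock-predictor .

# Step 7: run it locally as a container, mapping Streamlit's default port
docker run -p 8501:8501 stock-predictor

# Steps 8-9: on the EC2 instance, the same run command (detached) serves the app
# at http://<EC2-public-IP>:8501
docker run -d -p 8501:8501 stock-predictor
```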
Our story concludes with a symphony of decision trees, easy interfaces, containerization, and cloud instances – creating a tool that predicts stock movements in a user-friendly way. This project blends technology and finance, offering a simple yet powerful guide for making informed stock decisions in the dynamic market. 🚀
May your stocks rise, and your financial decisions echo the wisdom of our virtual forest of classifiers.
Innovative Machine Learning Infrastructure
In essence, the implemented machine learning (ML) code functions as the backbone, handling data and employing ML techniques to predict outputs from inputs. These predictions are then communicated seamlessly through the application's interface. The entire system is encapsulated, along with its dependencies, in a Docker image built from a Dockerfile. That image then runs on an AWS EC2 instance. Notably, the EC2 instance is assigned a public IP address, making the application visible across the vast expanse of the internet, all without the necessity of a domain name.