This 100-day series focuses on Machine Learning and Deep Learning, offering comprehensive tutorials and hands-on projects. Each day covers key concepts, algorithms, and techniques, helping learners build a solid foundation in AI. Ideal for beginners and enthusiasts aiming to master ML and DL through practical, step-by-step learning.

# MachineLearning-DataScience100-Days

## 100 days with Machine-Learning / Data-Science
Day 1:
Today I read about and practised everything to do with CSV files: how to import a CSV and how to use read_csv parameters such as skiprows, index_col, usecols, column names, na_values, converters, etc.
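A minimal sketch of those read_csv parameters; the file name and column names (data.csv, id, name, price) are made up for illustration:

```python
import pandas as pd

# Hypothetical file "data.csv" with one junk line above the header.
df = pd.read_csv(
    "data.csv",                       # made-up path, for illustration only
    skiprows=1,                       # skip the junk line above the header
    index_col="id",                   # use the "id" column as the row index
    usecols=["id", "name", "price"],  # load only these columns
    na_values=["NA", "-"],            # extra markers to treat as NaN
    converters={"price": lambda s: float(s.replace("$", ""))},  # strip "$" while parsing
)
print(df.head())
```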
Day 2:
Today I read about and practised everything to do with JSON and SQL: how to import JSON files and how to work with SQL.
Day 3:
Today I read about and practised how to understand a dataset.
Day 4:
Today I read about univariate data analysis and solved some practice questions.
Day 5:
Today I read about bivariate and multivariate data analysis and practised on a sample dataset.
Day 6:
Today I read about data profiling: how to generate an HTML analysis report of a dataset.
Day 7:
Today I read about the standard scaler from feature engineering.
Day 8:
Today I read about normalization from feature engineering.
Day 9:
Today I read about ordinal encoding from feature engineering.
Day 11:
Today I read about ColumnTransformer, which makes it quite easy to do in one step the tasks we otherwise do with LabelEncoder and OrdinalEncoder on separate columns.
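A minimal ColumnTransformer sketch on a made-up frame, scaling a numeric column, ordinal-encoding an ordered one, and one-hot encoding a nominal one:

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.preprocessing import OrdinalEncoder, OneHotEncoder, StandardScaler

df = pd.DataFrame({
    "age":  [25, 32, 47, 51],
    "size": ["S", "M", "L", "M"],                        # ordered categories
    "city": ["Lahore", "Karachi", "Lahore", "Quetta"],   # unordered categories
})

ct = ColumnTransformer([
    ("num", StandardScaler(), ["age"]),                              # scale the numeric column
    ("ord", OrdinalEncoder(categories=[["S", "M", "L"]]), ["size"]), # keep the order S < M < L
    ("ohe", OneHotEncoder(), ["city"]),                              # one-hot the nominal column
])

X = ct.fit_transform(df)
print(X.shape)  # (4, 5): 1 scaled + 1 ordinal + 3 one-hot columns
```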
Day 12:
Today I read about sklearn pipelines and practised on the Titanic dataset.
Day 13:
Today I read about sklearn's FunctionTransformer and practised on the Titanic dataset.
Day 14:
Today I read about sklearn's PowerTransformer.
Day 15:
Today I built a project using all the concepts covered so far.
Day 16:
Today I covered mixed data: how to handle columns that mix numerical and categorical values.
Day 17:
Today I covered how to handle date and time values in data.
Day 18:
Today I covered CCA (Complete Case Analysis), i.e. removing or dropping rows that contain missing values.
Day 19:
Today I covered handling missing values: univariate imputation of missing numerical values.
Day 20:
Today I covered handling missing categorical values.
Day 21:
Today I covered handling missing values with the "fill with a random value" technique and the missing indicator technique.
Day 22:
Today I covered KNNImputer and practised a side-by-side comparison of SimpleImputer, KNNImputer, the missing indicator, random fill, etc.
Day 23:
Today I understood the concept of multivariate imputation of missing values (e.g. sklearn's IterativeImputer).
Day 24:
Today I covered outliers. I took up the concept of handling outliers with the z-score, which applies when the data is normally distributed: z = (x - mean) / std, and points outside mean ± 3·std are treated as outliers. Then I read about capping and trimming the data.
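A quick sketch of z-score outlier detection and capping on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=50, scale=10, size=1000)   # roughly normal data
x = np.append(x, [120, -40])                  # inject two obvious outliers

z = (x - x.mean()) / x.std()                  # z-score of every point
print(x[np.abs(z) > 3])                       # points beyond mean ± 3·std

# Capping: clip extreme values to the 3-sigma bounds instead of dropping them.
lo, hi = x.mean() - 3 * x.std(), x.mean() + 3 * x.std()
x_capped = np.clip(x, lo, hi)
```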
Day 25:
Today I read about another outlier-handling technique, the IQR method, and solved some questions. The bounds used in the IQR method are Q1 - 1.5·IQR and Q3 + 1.5·IQR, where Q1 is the 25th percentile, Q3 is the 75th percentile, and IQR is the difference Q3 - Q1.
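A quick sketch of the IQR rule on made-up numbers:

```python
import numpy as np

x = np.array([10, 12, 12, 13, 12, 11, 14, 13, 15, 102, 12, 14, 17, 19, 107])

q1, q3 = np.percentile(x, [25, 75])      # 25th and 75th percentiles
iqr = q3 - q1
lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr

print(x[(x < lower) | (x > upper)])      # detected outliers: 102 and 107
```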
Day 25 (continued):
Today I read about the percentile technique and how to use it to detect and remove outliers.
Day 26:
Today I read about feature construction, i.e. how to construct new columns from existing data, and I learned how to do feature splitting.
Day 27:
Today I read about the curse of dimensionality, where "dimensions" means features (columns). If we have more features than the algorithm can usefully handle, the algorithm's performance degrades rather than benefits: the data becomes sparse, and the data points end up far from the mean. The curse of dimensionality motivates both FEATURE SELECTION and FEATURE EXTRACTION.
Day 28:
Today I read about Principal Component Analysis (PCA), a feature-extraction technique: what it is and how it works.
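A minimal PCA sketch on the iris dataset (scaling first, since PCA is scale-sensitive):

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

X, _ = load_iris(return_X_y=True)
X = StandardScaler().fit_transform(X)    # standardize before PCA

pca = PCA(n_components=2)                # keep 2 principal components
X2 = pca.fit_transform(X)
print(X2.shape)                          # (150, 2)
print(pca.explained_variance_ratio_)     # variance captured per component
```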
Day 29:
Today I practised earlier topics that I had already covered, such as feature construction and handling mixed data.
Day 30:
Today I read about simple linear regression and how it works, without the mathematical intuition.
Day 31:
Today I read about simple linear regression and how it works, with the mathematical intuition.
Day 32:
Today I read about multiple linear regression and how it works, without the mathematical intuition.
Day 33:
Today I read about multiple linear regression and how it works, with the mathematical intuition.
Day 34:
Today I read about Mean Absolute Error, Mean Squared Error, Root Mean Squared Error, the R² score, and the adjusted R² score.
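A quick sketch of these metrics on made-up predictions; adjusted R² is computed by hand since sklearn has no built-in helper for it:

```python
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

y_true = np.array([3.0, 5.0, 7.5, 10.0])
y_pred = np.array([2.8, 5.4, 7.0, 10.5])
n, k = len(y_true), 2                    # n samples, k = number of features (assumed)

mae  = mean_absolute_error(y_true, y_pred)
mse  = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
r2   = r2_score(y_true, y_pred)
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)   # penalizes extra features

print(mae, mse, rmse, r2, adj_r2)
```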
Day 35:
Today I read about gradient descent.
Day 36:
Today I read about the gradient descent variant batch gradient descent: its mathematical intuition and code from scratch.
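A minimal batch gradient descent sketch for simple linear regression on synthetic data (the learning rate and iteration count are arbitrary choices):

```python
import numpy as np

# Batch gradient descent for y ≈ m*x + b: every update uses ALL rows.
rng = np.random.default_rng(42)
x = rng.uniform(0, 10, 100)
y = 3 * x + 5 + rng.normal(0, 1, 100)    # true m=3, b=5, plus noise

m, b, lr = 0.0, 0.0, 0.01
for _ in range(2000):
    y_hat = m * x + b
    dm = -2 * np.mean(x * (y - y_hat))   # d(MSE)/dm over the whole dataset
    db = -2 * np.mean(y - y_hat)         # d(MSE)/db over the whole dataset
    m -= lr * dm
    b -= lr * db

print(m, b)   # should come out close to 3 and 5
```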
Day 37:
Today I read about the gradient descent variant stochastic gradient descent: its mathematical intuition and code from scratch. I also started practising and revising all topics from the start.
Day 38:
Today I read about the gradient descent variant mini-batch gradient descent: its mathematical intuition and code from scratch.
Day 39:
First I revised multiple linear regression and did some practice.
Then I read about polynomial regression, which is used when the data is non-linear.
Day 40:
First I revised gradient descent and did some practice.
Then I read about the bias-variance trade-off, i.e. the concepts of underfitting and overfitting.
Then I read about regularization, in particular the ridge technique, which is used against overfitting.
Day 41:
I read about ridge regression, solved problems on it, and wrote my own class from scratch.
I also solved ridge regression with gradient descent by writing my own class; there was a dimension error, which I will check later.
Day 42:
Today I practised the gradient descent variant stochastic gradient descent, which updates the weights on every row.
Day 43:
I practised the gradient descent variant mini-batch gradient descent, which updates the weights once per batch.
I also practised polynomial regression: why it is used and how it works, and then practised a polynomial regression problem.
Day 44:
Today I practised ridge regression, the overfitting-reduction technique, both with the sklearn library and with my own class, and compared the accuracy of the two.
Day 45:
Today I read about five key aspects of ridge regression:
1: How do the coefficients get affected by lambda?
2: Are higher coefficient values impacted more?
3: What is the effect of regularization on bias and variance?
4: What is lambda's impact on the loss function?
5: Why is it called ridge?
Day 46:
Today I read about and practised lasso regression: its key features and the intuition behind it.
Day 47:
Today I read about and practised the mathematics behind lasso regression, e.g. why lasso creates sparsity, i.e. why coef_ values go to zero as lambda/alpha increases.
Then I read about and practised elastic net regression, another technique for reducing overfitting that combines both ridge and lasso.
Day 48:
Today I read about logistic regression and the basic concept of the perceptron that logistic regression builds on.
Day 49:
Today I read about logistic regression with the sigmoid function.
Day 50:
Today I read about the loss function of logistic regression, i.e. the function sklearn optimizes inside its logistic regression implementation.
Then I read about classification metrics: the accuracy score and the confusion matrix.
Day 51:
Today I read about precision, recall, and the F1 score, and did some practice with these metrics.
I practised precision, recall, the F1 score, and the classification report.
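A small sketch of these metrics on made-up labels:

```python
from sklearn.metrics import (precision_score, recall_score, f1_score,
                             confusion_matrix, classification_report)

y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0, 1, 0]

print(confusion_matrix(y_true, y_pred))
print(precision_score(y_true, y_pred))   # TP / (TP + FP)
print(recall_score(y_true, y_pred))      # TP / (TP + FN)
print(f1_score(y_true, y_pred))          # harmonic mean of precision and recall
print(classification_report(y_true, y_pred))
```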
Day 52:
Today I read about softmax regression, which is used when there are more than two classes.
Then I read about polynomial logistic regression, and then about the hyperparameters of logistic regression.
Day 53:
Today I read about the decision tree model: what a decision tree is, how it works, how entropy works, etc.
I read about decision tree hyperparameters.
I also read about decision tree regression and explored the dtreeviz library.
Day 54:
Today I read about ensemble techniques and their types, such as bagging and voting.
Then I read about the voting ensemble and its assumptions.
Then I read about classification voting ensembles, hard voting and soft voting, and practised a voting ensemble on the iris dataset.
I coded a voting ensemble using different base models: logistic regression, SVM, and a decision tree (see the sketch below).
Then I read about the voting regressor.
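A minimal sketch of that voting ensemble on the iris dataset (soft voting averages predicted probabilities, so SVC needs probability=True):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)

voter = VotingClassifier(
    estimators=[
        ("lr",  LogisticRegression(max_iter=1000)),
        ("svm", SVC(probability=True)),       # needed for soft voting
        ("dt",  DecisionTreeClassifier()),
    ],
    voting="soft",   # average probabilities; "hard" would take a majority vote
)
print(cross_val_score(voter, X, y, cv=5).mean())
```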
Day 55:
Today I read about the bagging ensemble technique, its core idea and intuition, and then applied all of that knowledge in code.
I read about BaggingClassifier and practised a code example of bagging with bootstrapping; then I read about the bagging variants: pasting (sampling without replacement), random subspaces (column sampling), and random patches (both column and row sampling).
I read about BaggingRegressor and did a code example using linear regression, decision tree regression, and k-nearest neighbours, with GridSearchCV to find the best parameters.
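A minimal bagging sketch on synthetic data; the comments note which arguments give pasting, random subspaces, and random patches (the base-model argument is named estimator in recent scikit-learn versions, base_estimator in older ones):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=10, random_state=1)

# bootstrap=True  -> classic bagging (row sampling with replacement)
# bootstrap=False -> pasting (row sampling without replacement)
# max_features<1.0 with bootstrap_features=True -> random subspaces;
# combined with row sampling -> random patches
bag = BaggingClassifier(
    estimator=DecisionTreeClassifier(),
    n_estimators=100,
    max_samples=0.5,        # each tree sees 50% of the rows
    bootstrap=True,         # sample rows with replacement
    random_state=1,
)
print(cross_val_score(bag, X, y, cv=5).mean())
```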
Day 56:
Today I read the basics of the random forest algorithm.
Day 57:
Today I read all about random forest: the difference between random forest and a bagging ensemble, how to tune its hyperparameters, the OOB score, and everything else about random forest.
Day 58:
Today I read about the AdaBoost classifier: weak learners, decision stumps, etc.
Day 59:
Today I read about how the AdaBoost classifier works, from scratch. Then I implemented it through scikit-learn, tuned its hyperparameters, and checked which parameters are best for AdaBoost using GridSearchCV.
I also compared bagging vs boosting: which base models each uses (low-bias/high-variance vs high-bias/low-variance), parallel vs sequential training, and the weights given to each model.
Day 60:
Today I read about the intuition behind k-means clustering: how it works and how to choose the number of clusters using the elbow method.
Then I applied all that knowledge to a practical dataset.
Then I wrote a clustering class from scratch in Python and did all the work manually.
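A quick elbow-method sketch on synthetic blobs; inertia (within-cluster sum of squares) drops sharply until the true number of clusters, then flattens:

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=300, centers=4, random_state=0)

# Elbow method: inspect inertia against k and look for the bend (here at k=4).
for k in range(1, 8):
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
    print(k, round(km.inertia_, 1))
```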
Day 61:
Today I explored the gradient boosting algorithm and the intuition behind it.
I read about the math behind gradient boosting.
I also explored gradient boosting for classification.
Day 62:
Today I explored stacking and blending.
Then I coded both stacking and blending.
Day 63:
Today I explored another clustering algorithm, agglomerative clustering, which is an unsupervised machine learning algorithm.
The main difference between k-means and agglomerative clustering is that k-means works well when the clusters are well defined, whereas agglomerative clustering can handle more complex data.
Day 64:
Today I worked with k-nearest neighbours: how it works, its code, and everything around it.
Day 65:
I explored the five assumptions of linear regression.
Day 66:
Today I explored the support vector machine, its kernel trick, and everything else about SVM.
Day 67:
Today I explored how the Naive Bayes algorithm works internally, which is based on Bayes' theorem. Then I wrote the code without using any ML library.
Day 68:
Today I learned about XGBoost, a famous library built on gradient boosting. I learned to code an XGBoost regressor example.
Then I explored XGBoost classification and solved a practical problem on a real-world dataset.
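A minimal XGBoost classifier sketch on synthetic data (the hyperparameter values are arbitrary choices):

```python
# pip install xgboost
from xgboost import XGBClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=1000, n_features=20, random_state=7)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=7)

model = XGBClassifier(
    n_estimators=300,     # number of boosting rounds
    learning_rate=0.1,    # shrinkage applied to each new tree
    max_depth=4,          # depth of each tree
)
model.fit(X_tr, y_tr)
print(accuracy_score(y_te, model.predict(X_te)))
```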
Day 69:
Today I read about the DBSCAN algorithm, an unsupervised learning technique.
The machine learning portion, including the project work, is now complete.
I uploaded a case-study project that helps an NGO decide which international countries are most in need of funds.
Now I am starting deep learning. Today I read about what deep learning is, the difference between deep learning and machine learning, and a complete introduction to deep learning:
applications of deep learning;
what a perceptron is, and its types: the single-layer perceptron and the multi-layer perceptron;
the loss function of the perceptron;
the biggest flaw in the perceptron.
Day 70:
Today I read about multi-layer perceptron notation and everything else about the multi-layer perceptron.
I did another project on the MNIST dataset, and I read about forward propagation.
I did a project using an artificial neural network.
I read about the different loss functions in deep learning: mean squared error, mean absolute error, Huber loss, binary cross-entropy, categorical cross-entropy, and sparse categorical cross-entropy.
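A minimal Keras sketch tying this together: an MLP on MNIST trained with sparse categorical cross-entropy (labels are integers 0-9, so no one-hot encoding is needed):

```python
from tensorflow import keras

(X_tr, y_tr), (X_te, y_te) = keras.datasets.mnist.load_data()
X_tr, X_te = X_tr / 255.0, X_te / 255.0      # scale pixels to [0, 1]

model = keras.Sequential([
    keras.layers.Input(shape=(28, 28)),
    keras.layers.Flatten(),
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",  # integer labels
              metrics=["accuracy"])
model.fit(X_tr, y_tr, epochs=5, validation_split=0.1)
print(model.evaluate(X_te, y_te))
```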
Day 71:
Today I read about what backpropagation is, how it is implemented, why it works, and all the mathematics behind it.
Day 72:
Today I read about memoization, a technique that speeds up a program by caching previously computed results.
I read about the different variants of gradient descent: batch gradient descent, stochastic gradient descent, and mini-batch gradient descent.
I also read about the vanishing gradient problem.
Day 73:
Today I read about how to improve the performance of a neural network and how to fine-tune its hyperparameters.
Then I read about early stopping, i.e. deciding how many epochs to iterate over the data.
Then I read about another way of improving neural network performance: feature scaling in artificial neural networks.
Then I explored dropout layers, another technique for reducing overfitting and improving neural network performance.
Then I read about regularization against overfitting: L1 and L2 regularization in deep learning (a sketch combining several of these techniques follows below).
Then I explored different activation functions such as sigmoid, tanh (hyperbolic tangent), and ReLU (rectified linear unit), and the different ReLU variants: leaky ReLU, parametric ReLU, exponential ReLU, and scaled ReLU.
Then I read about weight initialization and explored two techniques: Glorot normal and He normal.
Then I read about batch normalization, which is used to speed up the training of neural networks.
Then I read about different optimizers such as momentum, Nesterov accelerated gradient, Adagrad, and Adam.
Then I explored more optimizers, such as RMSprop and Adam, and how they all differ from each other.
Today I also read about hyperparameter tuners for neural networks.
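A sketch combining several of these techniques in one small Keras model (He initialization, L2 regularization, batch normalization, dropout, Adam, and early stopping); the data is random and only for illustration:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers, regularizers

model = keras.Sequential([
    layers.Input(shape=(20,)),
    layers.Dense(64,
                 activation="relu",
                 kernel_initializer="he_normal",             # He init pairs well with ReLU
                 kernel_regularizer=regularizers.l2(1e-4)),  # L2 weight penalty
    layers.BatchNormalization(),                             # faster, more stable training
    layers.Dropout(0.3),                                     # randomly drop 30% of units
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer=keras.optimizers.Adam(1e-3), loss="binary_crossentropy")

# Early stopping: halt when validation loss stops improving.
early = keras.callbacks.EarlyStopping(patience=5, restore_best_weights=True)

X = np.random.rand(500, 20)                  # made-up data, illustration only
y = (X.sum(axis=1) > 10).astype(int)
model.fit(X, y, validation_split=0.2, epochs=100, callbacks=[early], verbose=0)
```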
Day 74:
Today I read the introductory part of CNNs.
I read about convolution operations, then padding and strides, then max pooling, and applied all of that knowledge to the MNIST dataset.
I read about backpropagation in CNNs.
Then I worked on an image-classification project: Dogs vs Cats.
Day 75:
Today I read about data augmentation, a technique for avoiding overfitting when you have little data.
Then I explored pretrained Keras models such as ResNet, GoogLeNet, and VGG16.
I read about transfer learning: when we use someone else's pretrained model with our custom data. There are two types: 1. feature extraction and 2. fine-tuning (see the sketch below).
I also read about the functional model API.
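A minimal transfer learning sketch in Keras, using VGG16 as a frozen feature extractor with a new classification head (the input size and the dog-vs-cat head are illustrative choices):

```python
from tensorflow import keras
from tensorflow.keras import layers

# Feature extraction: freeze VGG16's convolutional base, train only a new head.
base = keras.applications.VGG16(weights="imagenet",
                                include_top=False,       # drop the original classifier
                                input_shape=(150, 150, 3))
base.trainable = False                                   # freeze pretrained weights

model = keras.Sequential([
    base,
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(1, activation="sigmoid"),               # e.g. dog vs cat
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# For fine-tuning: unfreeze the top few conv layers and retrain with a low learning rate.
```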
Day 76:
Today I read about RNNs, their applications, and forward propagation in an RNN.
I read about the types of RNN and did a project with two different input-encoding techniques: 1. integer encoding and 2. embeddings.
I read about backpropagation in RNNs and the two major problems with RNNs: long-term dependencies and unstable gradients.
Day 77:
Today I read about the LSTM and its architecture:
the forget gate, the input gate, and the output gate.
I did an LSTM project on next-word prediction and got a training accuracy of 95%, which was quite good for a beginner.
Day 78:
Today I read about the gated RNN (GRU) and all of its workings.
Then I read about the deep (stacked) RNN: adding more layers to an RNN.
Then I read about the bidirectional RNN. All the code for Day 78 is attached.
Day 79:
Today I read about sequence-to-sequence tasks such as machine translation and named entity recognition. In sequence-to-sequence models I first explored the encoder and decoder, and the context vector, which contains a summary of the input block and is passed to the decoder block.
Then I explored attention, which is a most important topic.
I explored two types of attention:
1: Bahdanau attention
2: Luong attention, which is smarter.
I also read a complete introduction to transformers.
Then I read about self-attention, which converts static embeddings into contextual embeddings (a NumPy sketch follows at the end of this day's notes).
I explored multi-head attention, where we use several attention heads at the same time; as the number of heads increases, so does the number of context vectors.
I read about positional encoding, which is used to preserve the order of the sequence.
Then I read about layer normalization for getting the best results.
I read about the transformer architecture and understood each and every part of it.
I read about masked self-attention and then cross-attention.
Finally I went through the transformer architecture again in detail, all the details in and out.
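A minimal NumPy sketch of single-head scaled dot-product self-attention, the operation that turns static embeddings into contextual ones (the token count, dimensions, and random weights are all made up):

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention over a token sequence X."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])         # pairwise similarity, scaled
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability for softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # each row sums to 1
    return weights @ V                              # weighted mix = contextual embeddings

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))                         # 4 tokens, 8-dim static embeddings
Wq, Wk, Wv = (rng.normal(size=(8, 8)) * 0.1 for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)          # (4, 8) contextual embeddings
```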
Day 80:
Today I started NLP, beginning with an introduction and the NLP pipeline.
I covered the basic preprocessing steps in NLP, such as:
converting text to lower case,
removing punctuation,
chat-word treatment,
spelling correction,
removing stopwords,
handling emojis,
tokenization,
stemming.
Then I explored how to convert text to numbers and learned about the different techniques (a TF-IDF sketch follows below):
one-hot encoding,
bag of words,
n-grams (bigrams, trigrams),
TF-IDF.
I also learned about Word2vec; its two main and famous variants are 1. CBOW and 2. skip-gram.
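A small sketch of bag-of-words and TF-IDF vectorization on toy sentences (as referenced above):

```python
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

docs = ["the cat sat on the mat",
        "the dog sat on the log",
        "cats and dogs are pets"]

bow   = CountVectorizer()                     # bag of words: raw term counts
tfidf = TfidfVectorizer(ngram_range=(1, 2))   # unigrams + bigrams, TF-IDF weighted

print(bow.fit_transform(docs).toarray().shape)
print(tfidf.fit_transform(docs).toarray().shape)
print(tfidf.get_feature_names_out()[:8])      # a peek at the learned vocabulary
```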
Day 81:
Today I read about text classification in NLP; the code is attached in the file.
I explored part-of-speech tagging using the spaCy library and read about two famous algorithms: 1. the hidden Markov model, where we work with emission and transition probabilities, and 2. the Viterbi algorithm, which is used for optimization.
I made a small NLP project: an SMS spam classifier.
On leave for the next two days.
After the leave, from tomorrow I will start LLMs and generative AI.
Today I read about transformer text summarization using the Hugging Face library.
.......
