Notebooks for the Secure and Private AI Challenge course.
- Completed the 3rd lesson
- Completed the 4th lesson
- Completed the 5th lesson
- Finished the 6th lesson but didn't do the final project (will concentrate on it tomorrow)
- I finished going through the final project in lesson 5
Some interesting links:
- GitHub link to the implementation of the project
- An excellent explanation of the first part of the course
- Completed the first 5 parts of Lesson 7 about Federated Learning.
- Completed the 7th lesson
- Working on how to use a chatbot for a medical case
- Got tagged for the 1st game of the #wmn_who_code channel; it was fun responding
- Got tagged for the 1st game of #random on what affected me the most during the challenge.
It was the amazing effort and sharing spirit of the community
Lesson 8: Securing Federated Learning
- Completed parts 8.1 and 8.2
- Completed lesson 8
- Reading about machine-learning algorithms for processing conversations
- Finished the first 5 parts of lesson 9
- Finished the 9th lesson: Encrypted Deep Learning
- Working on the self-assessment part of the chatbot project
- Found some books to improve my Python skills
- Read the slide description for the Shirts project provided by the project group
- Searched for videos about GAN for style transfer
- Working on revamping my Resume
- Searched for tips to revamp my cover letter
- Worked the whole day on crafting a good cover letter (it's harder than it seems)
- Reading about the interview process for a Machine Learning Engineer.
- Checked the [Webinar] AMA with Robert Wagner | Secure & Private AI Challenge Scholarships; awesome input.
- Went through the video from #sg_project-t-shirt, Secure & Private AI Challenge - 60 Days of Udacity - Project T-Shirt; thank you, team.
- Watched videos about logistic regression, the cost function, and gradient descent (revising the basics of deep learning).
- Trying to get OpenAI GPT-2 to train
- Still trying to wrap my head around GANs (I understand the principle a little but am still trying to understand the code)
- Working on EDA for NLP
- Still trying to understand GANs through a video
- Learned about LSA (Latent Semantic Analysis) in NLP
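At its core, LSA is a truncated SVD of a term-document matrix: keeping only the strongest singular directions gives each document a low-dimensional "topic" representation. A minimal numpy sketch, with made-up counts (all data here is hypothetical):

```python
import numpy as np

# Tiny term-document count matrix: rows = terms, columns = documents.
# Hypothetical counts, just for illustration.
X = np.array([[2, 0, 1, 0],
              [1, 1, 0, 0],
              [0, 2, 0, 1],
              [0, 0, 1, 2]], dtype=float)

# LSA: factor X = U @ diag(s) @ Vt and keep only the top-k components.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
# Each document as a k-dimensional vector in latent "topic" space.
doc_vectors = (np.diag(s[:k]) @ Vt[:k]).T
```

Documents with similar word usage end up close together in `doc_vectors`, even when they share few exact words.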
- Finished EDA for my textual dataset
- Learned about vectorization: what it is, why we use it, and how to transform code from for loops to vectorized operations for gradient-descent and cost-function calculations.
- Continued on vectorization: vectorized the whole gradient-descent calculation process
- Learned about broadcasting in Python with the NumPy library
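Broadcasting is what makes the vectorized versions work: NumPy combines arrays of different but compatible shapes without explicit loops. A small sketch of the two cases that come up most often (the data is made up):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])      # shape (2, 3)

# Case 1: (2, 3) / (3,) -> the row vector is stretched down the rows.
col_sums = A.sum(axis=0)             # shape (3,)
percentages = 100 * A / col_sums     # each column scaled by its own sum

# Case 2: (2, 3) + (2, 1) -> the column vector is stretched across columns.
b = np.array([[10.0], [20.0]])       # shape (2, 1)
shifted = A + b                      # row i gets b[i] added to every entry
```

The same mechanism is why `X @ w + b` works with a scalar or vector bias in the gradient-descent code.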
- Checked out more Medium articles on chatbot creation and searched for ways to do multi-class classification on NLP tasks
- Implemented and compared the performance of the plain for-loop implementation of gradient descent and loss calculation against the vectorized implementation, using Python.
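For reference, a minimal sketch of that comparison for logistic-regression gradients, written once with explicit loops and once vectorized (function names are my own, not the course's):

```python
import numpy as np

def grad_loop(X, y, w, b):
    """Naive gradients: loop over the m examples and n features."""
    m, n = X.shape
    dw = np.zeros(n)
    db = 0.0
    for i in range(m):
        z = b
        for j in range(n):
            z += w[j] * X[i, j]
        a = 1.0 / (1.0 + np.exp(-z))        # sigmoid activation
        for j in range(n):
            dw[j] += (a - y[i]) * X[i, j]
        db += a - y[i]
    return dw / m, db / m

def grad_vectorized(X, y, w, b):
    """Same gradients with matrix operations and no Python loops."""
    m = X.shape[0]
    a = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    return X.T @ (a - y) / m, (a - y).mean()
```

Both return identical gradients; the vectorized one is dramatically faster because the loops run inside NumPy's compiled code.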
- Finished the Python basics with NumPy notebook from Andrew Ng's DL course.
- Started the Logistic Regression with a Neural Network Mindset notebook from Andrew Ng's DL course.
- Worked on my style transfer code, trying to find the best combination of style and content image to create something beautiful for the shirt style project.
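Style transfer implementations typically measure "style" with Gram matrices of convolutional feature maps (correlations between channels). A tiny numpy sketch of just that piece; the function name is mine, and a real pipeline would take `features` from a pretrained CNN rather than random data:

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a (channels, height, width) activation tensor:
    the channel-by-channel correlations used as a style representation."""
    c, h, w = features.shape
    F = features.reshape(c, h * w)   # flatten each channel's spatial map
    return F @ F.T / (h * w)         # normalize by the number of positions
```

The style loss then compares the Gram matrices of the generated image and the style image at several layers.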
- Added a second lab (after doing it) to my Hands-on ML ppt talk.
- I had a recap meeting about the event.
- I had my interview with Digital School Product about the three-month AI Engineer position. Thank you, #events_opportunities, for sharing that opportunity.
- I was asked to add an ML qwiklabs lab to hold a study jam.
- I requested access to ML Intro with the TensorFlow quest.
- I did the lab: Creating an Object Detection Application Using TensorFlow
- I added the lab to the ppt
- I wrote my first draft of the PTSD Omdena challenge article (I will share it with you when it's out)
- It's finally the event day. I held the talk and directed the study jam; please check it out here if interested: shorturl.at/bmrvP
- Completed the 'Machine Learning with TensorFlow' lab from qwiklabs (https://google.qwiklabs.com/focuses/3391?parent=catalog)
- Working on more examples for shirt style transfer project #sg_project-t-shirt
- Worked the whole day on style transfer for #sg_project-t-shirt; if interested, please check my "art" in the Facebook group album for the style shirt
- Currently working on Creating Models with Amazon SageMaker lab from qwiklabs
- Participated in the online Fairygodboss recruiting event; it was a refreshing, first-time experience.
- Tried the multi-class classification tutorial with an LSTM on the customer-complaints dataset, with the goal of implementing it on my own dataset.
- Trained the multi-class LSTM model from yesterday on the Omdena PTSD challenge dataset; the results were quite weird, so I am thinking about redefining the structure of the dataset. Any suggestions are welcome 🙏
- I tried training the same data using a pre-trained BERT model and it gave pretty good results (thinking about understanding BERT more deeply)
- I finished week 1 and started week 2 of Andrew Ng's TensorFlow course on Coursera; it's pretty smooth and quite beginner-friendly.
- Worked on understanding how the BERT model works: https://towardsdatascience.com/bert-explained-state-of-the-art-language-model-for-nlp-f8b21a9b6270
- Applied to some job opportunities shared in #jobs channel which I would like to thank.
- Studied the three notebooks of the 4th lesson from fastai course v3.
- Exploring how AI can help in the legal field (any suggestions on how to make the process faster are welcome)
- Back to BERT, but this time reading an article to understand the implementation:
- Began the Data Structures course from Udacity; I am thinking about brushing up on the basic algorithm concepts in English (I studied them in French).
https://towardsdatascience.com/bert-classifier-just-another-pytorch-model-881b3cf05784
https://github.com/sugi-chan/custom_bert_pipeline/blob/master/IMDB%20Dataset.csv
https://classroom.udacity.com/courses/ud513/lessons/7174469398/concepts/78885717720923
- Completed week 2 of the DL course from deeplearning.ai on Coursera.
- Checked this article on how we can fine-tune an ML model for AI conversational chatbots: AI-scholar-chatbots-that-improve-after-deployment
- Thinking about deploying ML models as mobile apps, so I checked out Flutter, but for iOS it's a real hassle, especially with the Apple developer ID.
- Checked the AWS DeepRacer course and finished lesson 1 and started lesson 2.
- Finished week 2 of the 1st TensorFlow course of the specialization on Coursera by Andrew Ng.
- Still working on setting up Flutter for iOS: https://www.youtube.com/watch?v=3oIFshgMgLA
- Stopped at efficiency in the Data Structures course from Udacity.
These days I have been focusing on advancing through the TF course on Coursera (and making truffles for Eid):
- Implementing convolution layers
- Implementing pooling layers
- Implementing the convolutional network from scratch using NumPy and SciPy, and understanding how it works
- Applying everything I learned to MNIST (I had been learning how it works using FashionMNIST)
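A minimal numpy-only sketch of the two building blocks above: a "valid" convolution (really cross-correlation, which is what deep-learning layers compute) and non-overlapping max pooling. Real layers add channels, strides, and padding; the function names here are mine:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation of a single-channel image."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Dot product of the kernel with the window under it.
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

def max_pool2d(fmap, size=2):
    """Non-overlapping max pooling with stride == window size."""
    h, w = fmap.shape
    trimmed = fmap[:h - h % size, :w - w % size]
    return trimmed.reshape(h // size, size, w // size, size).max(axis=(1, 3))
```

Stacking conv, a nonlinearity, and pooling is exactly the pattern the MNIST/FashionMNIST networks repeat.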
- Finished the first 5 videos of week 3 on Neural Networks from the 1st course of the deeplearning.ai specialization
- I finally got my first Flutter app running on an iOS simulator (thanks to the following tutorial): https://www.youtube.com/watch?v=H_xusHxICbk
- Completed this tutorial as well (it's fun): https://codelabs.developers.google.com/codelabs/first-flutter-app-pt1/index.html?index=..%2F..index#5
- Went through the 3rd and 4th parts of lesson 3 of AWS DeepRacer (but I still need to activate my AWS account)
- I watched the first parts about Image Generators of week 4 of Tensorflow in Coursera.
- Completed the lessons and quiz of week 3 of the deeplearning.ai 1st course of the DL specialization.
- Finished the first lesson of Data Structures & Algorithms (calculating efficiency)
- I went through videos and notebooks 7 through 12 of the TF course from Coursera (which expose the importance of having a validation set that is separate from the training set).
- Finished week 4 and the 1st course of the TF specialization on Coursera, exploring the difference in accuracy and training when the image size changes, and did the practice exercises, where I used everything I learned throughout week 4 to create a classifier for a set of happy or sad images.
- I ran and did half of the assignments of the notebook "Planar data classification with one hidden layer" from the deeplearning.ai course on Coursera.
- I finished the other half of the notebook, where I implemented the loss, forward and backward propagation, and gradient descent, then gathered all the functions into one function that builds the model and makes the predictions.
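The pieces fit together roughly like this; a minimal numpy sketch of a one-hidden-layer network (tanh hidden layer, sigmoid output, column-per-example convention), not the notebook's exact code:

```python
import numpy as np

def forward(X, params):
    """X: (n_x, m), one column per example. params = [W1, b1, W2, b2]."""
    W1, b1, W2, b2 = params
    Z1 = W1 @ X + b1
    A1 = np.tanh(Z1)                      # hidden activations
    Z2 = W2 @ A1 + b2
    A2 = 1.0 / (1.0 + np.exp(-Z2))        # sigmoid output
    return A1, A2

def loss(A2, Y):
    """Average binary cross-entropy over the m examples."""
    m = Y.shape[1]
    return -np.sum(Y * np.log(A2) + (1 - Y) * np.log(1 - A2)) / m

def backward(X, Y, params, A1, A2):
    W1, b1, W2, b2 = params
    m = X.shape[1]
    dZ2 = A2 - Y                          # cross-entropy + sigmoid shortcut
    dW2 = dZ2 @ A1.T / m
    db2 = dZ2.sum(axis=1, keepdims=True) / m
    dZ1 = (W2.T @ dZ2) * (1 - A1 ** 2)    # tanh'(z) = 1 - tanh(z)^2
    dW1 = dZ1 @ X.T / m
    db1 = dZ1.sum(axis=1, keepdims=True) / m
    return dW1, db1, dW2, db2
```

A gradient-descent step then updates each parameter with its gradient times a learning rate; a numerical gradient check is a good way to confirm the backward pass.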
- I implemented several machine learning models (SVM, Random Forest, Logistic Regression...) on the fully annotated dataset for the Omdena PTSD challenge, to help us compare novel methods like BERT and ULMFiT against traditional models.
- I reviewed the article I had to write about the challenge, an introduction to how we can help treat PTSD using AI: https://medium.com/omdena/ai-post-traumatic-stress-disorder-f8917cedb434
- I went through half of the concepts in week 1 of the 2nd course of TF by deeplearning.ai
- I watched Andrew Ng's interview with Ian Goodfellow.
- I finished the remaining lessons from week 1 and did the final exercise, which consists of classifying dogs and cats using the whole dataset from Microsoft: https://www.coursera.org/learn/convolutional-neural-networks-tensorflow/home/welcome
- I did the Introduction to Amazon EC2 Auto Scaling lab from qwiklabs
- I did all the labs in Cloud Hero Speedrun: IoT & Big Data:
- Internet of Things: Qwik Start
- Dataflow: Qwik Start - Templates
- Streaming IoT Core Data to Dataprep
- Building an IoT Analytics Pipeline on Google Cloud Platform
- I watched the first 4 videos on deep L-layer networks from week 4 of the deeplearning.ai 1st course
- I finished the week 4 lessons of the 1st course of the deeplearning.ai specialization.
- I went through the week 2 lessons and did the final exercise on classification using augmentation techniques from the Convolutional Neural Networks in TensorFlow course on Coursera.
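Augmentation just applies small, label-preserving transforms to each image on the fly; in the course this is done by Keras's `ImageDataGenerator`. A toy numpy re-implementation of two such transforms (this sketch is mine, not the course's code):

```python
import numpy as np

def augment(image, rng):
    """Randomly flip and slightly shift a 2-D image; the label is unchanged."""
    if rng.random() < 0.5:
        image = image[:, ::-1]            # horizontal flip
    shift = rng.integers(-2, 3)           # shift by -2..2 pixels
    image = np.roll(image, shift, axis=1) # wrap-around horizontal shift
    return image
```

Because the network sees a slightly different version of each image every epoch, augmentation acts as a regularizer and reduces overfitting on small datasets.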
- I studied all the lesson videos on transfer learning from the Convolutional Neural Networks in TensorFlow course on Coursera: how to take an existing model, freeze many of its layers to prevent them from being retrained, and effectively "remember" the convolutions it was trained on; and how to add a DNN underneath it so we can retrain on our own images using the other model's convolutions.
- I also learned regularization using dropout to make the network more effective at preventing over-specialization and thus overfitting.
- I applied everything I learned in the previous videos to improve the accuracy of classifying horses vs. humans.
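In Keras the freezing is a one-liner (`layer.trainable = False` on the pretrained layers). To keep this sketch dependency-free, here is a hypothetical numpy illustration of the same idea: a fixed "pretrained" feature extractor stays untouched while only the new head is trained, with inverted dropout applied to the head's input during training:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pretrained base: a fixed random feature extractor.
# "Frozen" = it receives no gradient updates.
W_base = rng.normal(size=(4, 8))

# New classification head, trained from scratch on top of the frozen features.
W_head = np.zeros((8, 1))

# Toy dataset (hypothetical): label depends on the first input feature.
X = rng.normal(size=(32, 4))
y = (X[:, 0] > 0).astype(float).reshape(-1, 1)

W_base_before = W_base.copy()
for _ in range(100):
    feats = np.maximum(X @ W_base, 0.0)       # frozen ReLU features
    keep = (rng.random(feats.shape) < 0.8) / 0.8  # inverted dropout, p=0.2
    h = feats * keep                          # train-time only; off at inference
    pred = 1.0 / (1.0 + np.exp(-(h @ W_head)))
    grad_head = h.T @ (pred - y) / len(X)     # only the head gets a gradient
    W_head -= 0.5 * grad_head                 # W_base is never updated
```

Freezing preserves the base's learned convolutions and makes training fast, since only the small head's weights move.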
- I completed the Convolutional Neural Networks in TensorFlow course on Coursera by finishing all the week 4 lessons on multi-class classification, applying what I learned to a CGI (computer graphics) generated dataset of rock, paper, scissors with different skin tones and nail polish: http://www.laurencemoroney.com/rock-paper-scissors-dataset/
- I did the final project, which classifies hand signs using the "Sign Language MNIST" dataset from Kaggle: https://www.kaggle.com/datamunge/sign-language-mnist/kernels
To run these notebooks you'll need Python 3.6, PySyft, NumPy, PyTorch 1.0, and Jupyter Notebook. The easiest way to set all of this up is to create a conda environment:

    conda create -n pysyft python=3.6
    conda activate pysyft
    conda install numpy jupyter notebook
    conda install pytorch torchvision -c pytorch
    pip install syft