Undisputed-jay/README.md

Hi!

I'm Ahmed Ayodele, a passionate coder who thrives on crafting exceptional projects and conquering exciting Kaggle challenges. My time is devoted to coding and pushing boundaries to create outstanding solutions. I'm currently honing my skills to become a better Data Engineer.

📫 Reach out!

Twitter Badge · LinkedIn Badge · Mail Badge

  • 🔭 MSc Data Science @ University of Salford, Manchester.
  • 💻 Most used line of code: `git commit -m "Initial Commit"`
  • 🤔 I'm always looking to share ideas and gain knowledge.
  • 📫 How to reach me: ayodeleahmed219@gmail.com.
  • 😄 Pronouns: He / Him.
  • ⚡ Hobbies: Music, Travelling and Reading.

Top Technologies

Python Badge SQL Badge Apache Airflow Badge Snowflake Badge AWS Badge GCP Badge PySpark Badge

IDEs & Tools

VS Code · SQL Server · Jupyter · Docker · PostgreSQL

More stuff about me

I am an experienced Data Scientist | Data Engineer with a track record of working in both the public and private sectors. My areas of expertise include data mining, analysis, and machine learning, which I leverage to provide insights to both technical and non-technical audiences.

My core competencies include natural language processing, time series forecasting, advanced data analytics, model development, data visualization, research and analysis, risk assessment, database management, and technical problem resolution. I leverage these skills to deliver effective, accurate results for my clients.

Additionally, I have a strong technical background and am proficient in several programming languages and tools, including Python, PySpark, Pandas, AWS, SQL, BeautifulSoup, NumPy, Matplotlib, Seaborn, scikit-learn, Power BI, Azure Data Factory, and Databricks.

Overall, my expertise enables me to provide valuable insights and solutions to complex business problems using data-driven approaches, and my ability to communicate technical findings effectively to a broad audience makes me a valuable asset to any organization.

GitHub Stats

Ahmed's GitHub stats

Pinned Repositories

  1. AWS-S3-Integration-with-snowflake

    This project sets up an ETL pipeline to load Citibike trip data from an AWS S3 bucket into Snowflake. It establishes a secure integration with S3, defines a CSV file format, stages the data, and lo…
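The staging-and-load flow described above ends in a Snowflake `COPY INTO` statement. A minimal sketch of composing that statement in Python — the table, stage, and file-format names here are hypothetical placeholders, not the repo's actual objects:

```python
def build_copy_statement(table: str, stage: str, file_format: str) -> str:
    """Compose the Snowflake COPY INTO statement that loads staged
    CSV files into a target table. All object names are placeholders."""
    return (
        f"COPY INTO {table} "
        f"FROM @{stage} "
        f"FILE_FORMAT = (FORMAT_NAME = '{file_format}') "
        f"ON_ERROR = 'CONTINUE'"  # skip bad rows instead of aborting the load
    )

# Hypothetical object names for illustration:
sql = build_copy_statement("citibike_trips", "citibike_stage", "csv_format")
```

Building the statement as a plain string keeps the load step easy to log and unit-test before it is ever sent to a Snowflake cursor.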

  2. Behavior-Driven-Development-Testing-for-Ecommerce-Login

    This project automates login testing with Behave and Selenium WebDriver, using BDD to verify login scenarios like valid and invalid credentials. The page object model (POM) keeps the code organized…
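The page object model mentioned above wraps each page's locators and actions in one class, so step definitions never touch raw selectors. A minimal sketch with hypothetical locators and a stub driver standing in for Selenium WebDriver:

```python
class LoginPage:
    """Page object for the login form. Locators are hypothetical;
    the real repo's selectors may differ."""

    USERNAME = ("id", "username")
    PASSWORD = ("id", "password")
    SUBMIT = ("id", "login-button")

    def __init__(self, driver):
        self.driver = driver  # any object exposing find_element(by, value)

    def login(self, username: str, password: str) -> None:
        self.driver.find_element(*self.USERNAME).send_keys(username)
        self.driver.find_element(*self.PASSWORD).send_keys(password)
        self.driver.find_element(*self.SUBMIT).click()


# Stub driver so the page object can be exercised without a browser:
class StubElement:
    def __init__(self):
        self.keys, self.clicked = [], False
    def send_keys(self, text):
        self.keys.append(text)
    def click(self):
        self.clicked = True

class StubDriver:
    def __init__(self):
        self.elements = {}
    def find_element(self, by, value):
        return self.elements.setdefault((by, value), StubElement())

driver = StubDriver()
LoginPage(driver).login("user", "secret")
```

Because the page object only depends on the `find_element` interface, the same class works unchanged against a real WebDriver inside Behave steps.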

  3. SpotifyAPI-Data-Engineering-Project

    This project uses an ETL (Extract, Transform, Load) pipeline to extract data from Spotify via its API and load the data into a data store (AWS Athena). The entire pipeline will be built using Ama…
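The transform step of such a pipeline typically flattens the API's nested JSON into tabular rows before loading. A sketch of that idea — the field layout loosely mirrors Spotify's playlist-tracks response but is an assumption, not the repo's exact schema:

```python
def flatten_tracks(payload: dict) -> list:
    """Flatten a Spotify-style playlist payload into flat row dicts.
    Field names are assumed from the public API shape, not this repo."""
    rows = []
    for item in payload.get("items", []):
        track = item["track"]
        rows.append({
            "track_id": track["id"],
            "name": track["name"],
            "artist": track["artists"][0]["name"],  # primary artist only
            "album": track["album"]["name"],
        })
    return rows

# Tiny illustrative payload:
sample = {"items": [{"track": {
    "id": "t1", "name": "Song A",
    "artists": [{"name": "Artist X"}],
    "album": {"name": "Album Y"},
}}]}
rows = flatten_tracks(sample)
```

Flat rows like these serialize directly to CSV or Parquet, which is what a query engine such as Athena expects to scan.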

  4. wikipedia_stadium_data_pipeline_with_apache_airflow

    An Apache Airflow pipeline that scrapes football stadium data from Wikipedia, processes it with pandas, stores it in PostgreSQL, and saves query results to CSV.
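The scraping step boils down to pulling rows out of Wikipedia's HTML tables. A stdlib-only sketch of that extraction (the repo itself uses pandas; this stand-in just shows the table-to-rows idea on a made-up snippet):

```python
from html.parser import HTMLParser

class TableParser(HTMLParser):
    """Collect the text of <td>/<th> cells, one list per <tr>.
    A stdlib stand-in for the repo's actual scraping step."""

    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_cell = [], [], False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
        elif tag in ("td", "th"):
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell and data.strip():
            self._row.append(data.strip())

# Illustrative fragment, not real scraped output:
html = """<table>
<tr><th>Stadium</th><th>Capacity</th></tr>
<tr><td>Camp Nou</td><td>99354</td></tr>
</table>"""
parser = TableParser()
parser.feed(html)
```

In the real pipeline the same row lists would be handed to pandas for cleaning before being written to PostgreSQL.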

  5. Airflow-ETL-Pipeline-with-PySpark-and-Google-Cloud-Dataproc

    This project automates daily vehicle data processing on Google Cloud using Apache Airflow. It uploads scripts to Google Cloud Storage, runs specific PySpark jobs on Dataproc based on the day, and s…
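The "based on the day" branching above can be sketched as a plain function mapping the run date to a script path; the bucket paths and the weekday-to-job mapping here are illustrative assumptions, not the repo's real layout:

```python
from datetime import date

# Hypothetical mapping of weekdays to PySpark job scripts in GCS.
JOBS = {
    0: "gs://example-bucket/jobs/weekly_aggregate.py",  # Monday
    6: "gs://example-bucket/jobs/full_refresh.py",      # Sunday
}
DEFAULT_JOB = "gs://example-bucket/jobs/daily_load.py"

def pick_job(run_date: date) -> str:
    """Return the PySpark script to submit to Dataproc for this date."""
    return JOBS.get(run_date.weekday(), DEFAULT_JOB)
```

Keeping the selection in a pure function makes the branching trivially unit-testable, separate from the Airflow operator that actually submits the Dataproc job.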

  6. Reddit_ETL_pipeline-using-Apache_Airflow (Python)