Automatically log into LinkedIn, scrape insights data, store it locally, and visualize it with Jupyter Notebook and Tableau.
- This code uses Python and Selenium to log into a LinkedIn Premium account, loop through a list of companies, and harvest the hidden table data from the page HTML (a minimal sketch of this flow follows this list).
- I then used pandas to store the scraped data in .pkl files.
- Finally, I created data visualizations with pandas and Matplotlib in a Jupyter Notebook, as well as in Tableau.
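A minimal sketch of the login-scrape-store flow described above. The environment variable names, the insights URL pattern, the CSS selectors, and the output file names are illustrative assumptions, not the repository's actual values:

```python
import os
import time
from io import StringIO

import pandas as pd
from selenium import webdriver
from selenium.webdriver.common.by import By

COMPANIES = ["example-company"]  # hypothetical company slugs

driver = webdriver.Chrome()
driver.get("https://www.linkedin.com/login")

# Log in with credentials supplied via environment variables (names assumed).
driver.find_element(By.ID, "username").send_keys(os.environ["LINKEDIN_USER"])
driver.find_element(By.ID, "password").send_keys(os.environ["LINKEDIN_PASS"])
driver.find_element(By.CSS_SELECTOR, "button[type='submit']").click()

for company in COMPANIES:
    # Insights pages embed their table data in the rendered HTML,
    # even when the tables are hidden from view (URL pattern assumed).
    driver.get(f"https://www.linkedin.com/company/{company}/insights/")
    time.sleep(5)  # crude wait for the page to render

    # pandas parses every <table> in the page source, hidden ones included.
    tables = pd.read_html(StringIO(driver.page_source))
    for i, df in enumerate(tables):
        df.to_pickle(f"{company}_table_{i}.pkl")  # store locally for later use

driver.quit()
```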
- Install all required packages:

```bash
pip install -r requirements.txt
```
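The pinned dependencies live in `requirements.txt`; a plausible minimal set for the steps described here (an assumption, not the file's actual contents) would be:

```text
selenium
pandas
matplotlib
jupyter
lxml
```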
- Run `main.py` with Python. This logs into LinkedIn, loops through a list of companies, scrapes the data from hidden tables, and saves it to pickle files for later visualization.
- Use Jupyter Notebook to manipulate the DataFrames and create visualizations with Matplotlib (see the sketch after this list).
- Use Tableau to visualize the data.
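A minimal sketch of the visualization step: load a stored pickle, chart it through pandas' Matplotlib wrapper, and export a CSV copy for Tableau, which cannot read .pkl files. File names and the CSV hand-off are illustrative assumptions:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Load one of the pickled tables produced by main.py (file name assumed).
df = pd.read_pickle("example-company_table_0.pkl")

# pandas plotting wraps Matplotlib, so a DataFrame can be charted directly.
ax = df.plot(kind="bar")
ax.set_title("example-company insights")
plt.tight_layout()
plt.savefig("example-company_insights.png")

# Tableau does not read pickles; a CSV export of the same frame is one way
# to feed a Tableau workbook (an assumption, not the repo's documented method).
df.to_csv("example-company_table_0.csv", index=False)
```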
Jared Fiacco - jaredfiacco2@gmail.com
A GCP Data Engineering Project of Mine: Publish Computer Statistics to Pub/Sub, Use Cloud Functions to Store in BigQuery