Automated tool for scraping job postings into a `.csv` file.

- Never see the same job twice!
- No advertising.
- See jobs from multiple job search websites all in one place.
JobFunnel requires Python 3.8 or later.

```
pip install git+https://github.com/PaulMcInnis/JobFunnel.git
```
By performing regular scraping and reviewing, you can cut through the noise of even the busiest job markets.
You can search for jobs with YAML configuration files or by passing command arguments.
Download the demo settings.yaml by running the command below:

```
wget https://git.io/JUWeP -O my_settings.yaml
```
NOTE:

- It is recommended to provide as few search keywords as possible (i.e. `Python`, `AI`).
- JobFunnel currently only supports `CANADA_ENGLISH` and `USA_ENGLISH` locales.
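For orientation, a trimmed configuration might look like the sketch below. The demo settings.yaml is the authoritative reference; the exact nesting and all values here are illustrative assumptions, built only from options mentioned elsewhere in this document.

```yaml
# Illustrative sketch only — consult the demo settings.yaml for the real schema.
cache_folder: cache
block_list_file: block_list.json
search:
  locale: CANADA_ENGLISH
  keywords: [Python]
  max_listing_days: 30
```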
Run `funnel` with your settings YAML to populate your master CSV file with jobs from available providers:

```
funnel load -s my_settings.yaml
```
Open the master CSV file and update the per-job `status`:

- Set to `interested`, `applied`, `interview` or `offer` to reflect your progression on the job.
- Set to `archive`, `rejected` or `delete` to remove a job from this search. You can review 'blocked' jobs within your `block_list_file`.
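Because the master list is a plain CSV, it is easy to summarize outside a spreadsheet. As a small sketch (assuming the header contains a `status` column, as described above; the file name is whatever your configuration uses):

```python
import csv
from collections import Counter

def status_counts(csv_path: str) -> Counter:
    """Tally the per-job `status` column of a JobFunnel master CSV.

    Assumes the CSV has a header row that includes a `status` column.
    """
    with open(csv_path, newline="", encoding="utf-8") as f:
        return Counter(row["status"] for row in csv.DictReader(f))
```

For example, `status_counts("master_list.csv")` returns a `Counter` mapping each status (`interested`, `applied`, ...) to how many jobs carry it.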
- **Automating Searches**: JobFunnel can be easily automated to run nightly with crontab. For more information, see the crontab document.
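As a sketch, a nightly crontab entry might look like the following; the schedule, paths, and log file here are assumptions, not part of JobFunnel itself:

```
# Run JobFunnel every night at 23:00 (paths are examples)
0 23 * * * /usr/bin/env funnel load -s /home/me/my_settings.yaml >> /home/me/jobfunnel.log 2>&1
```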
- **Writing your own Scrapers**: If you have a job website you'd like to write a scraper for, you are welcome to implement it. Review the Base Scraper for implementation details.
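At its core, a scraper fetches listings and yields normalized job records. The sketch below is purely illustrative — the class and method names are hypothetical and do not match JobFunnel's actual Base Scraper API, which you should review in the project source before implementing anything:

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Iterable

# NOTE: hypothetical stand-ins — JobFunnel's real Base Scraper has a
# different, richer interface. This only shows the general shape.

@dataclass
class Job:
    title: str
    company: str
    url: str

class BaseScraper(ABC):
    """Hypothetical stand-in for JobFunnel's Base Scraper."""

    @abstractmethod
    def scrape(self) -> Iterable[Job]:
        ...

class MyJobSiteScraper(BaseScraper):
    """Scraper for a fictional job website."""

    def scrape(self) -> Iterable[Job]:
        # A real scraper would fetch and parse listing pages here, using
        # whatever session headers and domain strings the site expects.
        yield Job(title="Python Developer", company="Example Co",
                  url="https://example.com/jobs/1")
```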
- **Remote Work**: Bypass a frustrating user experience looking for remote work by setting the search parameter `remoteness` to match your desired level, i.e. `FULLY_REMOTE`.
- **Adding Support for X Language / Job Website**: JobFunnel supports scraping jobs from the same job website across locales & domains. If you are interested in adding support, you may only need to define session headers and domain strings. Review the Base Scraper for further implementation details.
- **Blocking Companies**: Filter undesired companies by adding them to your `company_block_list` in your YAML, or pass them by command line as `-cbl`.
- **Job Age Filter**: You can configure the maximum age of scraped listings (in days) by setting `max_listing_days`.
- **Reviewing Jobs in Terminal**: You can review the job list in the command line: `column -s, -t < master_list.csv | less -#2 -N -S`
- **Respectful Delaying**: Respectfully scrape your job posts with our built-in delaying algorithms. To better understand how to configure delaying, check out this Jupyter Notebook, which breaks down the algorithm step by step with code and visualizations.
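The linked notebook documents JobFunnel's actual algorithm; as a generic illustration of the idea only (not JobFunnel's implementation), a randomized per-request delay can be as simple as:

```python
import random

def polite_delay(base: float = 5.0, jitter: float = 2.0) -> float:
    """Return a randomized delay in seconds centered on `base`.

    Generic throttling illustration — not JobFunnel's actual algorithm;
    see the Jupyter Notebook for the real step-by-step breakdown.
    """
    return max(0.0, base + random.uniform(-jitter, jitter))

# Between consecutive requests you would sleep for this long, e.g.
# time.sleep(polite_delay()).
```

Randomizing the interval avoids hammering a site on a fixed cadence while keeping the average request rate predictable.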
- **Recovering Lost Data**: JobFunnel can re-build your master CSV from your `cache_folder`, where all the historic scrape data is located: `funnel --recover`
- **Running by CLI**: You can run JobFunnel using the CLI only; review the command structure via: `funnel inline -h`