pyspark-installation-guide

A guide on how to install and run PySpark on your local machine

Using Docker

This is the recommended approach, as it ensures you're using a setup that is replicable across a wide variety of operating systems. Have a look at the docker_setup folder for instructions.

Install PySpark using Anaconda

If you can't use Docker, a conda-based environment is your next best option. Have a look at the anaconda_setup folder for further instructions. Please note that the instructions differ for Windows and Mac/Linux based systems.
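Whichever setup you choose, a quick smoke test confirms that PySpark can start a local session. The snippet below is a minimal sketch, assuming PySpark has already been installed into the active environment (for example via the steps in the anaconda_setup folder); it is not part of this repository's setup instructions.

```python
# Minimal smoke test for a local PySpark installation.
# Assumes pyspark is installed in the currently active environment.
from pyspark.sql import SparkSession

# Start a local Spark session that uses all available cores.
spark = SparkSession.builder \
    .master("local[*]") \
    .appName("pyspark-smoke-test") \
    .getOrCreate()

# Build a tiny DataFrame and run a simple action to confirm Spark works.
df = spark.createDataFrame([(1, "a"), (2, "b"), (3, "c")], ["id", "letter"])
df.show()
print("Row count:", df.count())

spark.stop()
```

If the session starts and the small DataFrame prints, the installation is working and you can move on to your own notebooks or scripts.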
