How to set up PySpark for your Jupyter notebook - Opensource.com
Nov 12, 2018

You can check your Spark setup by going to the /bin directory inside {YOUR_SPARK_DIRECTORY} and running the spark-shell --version command. Here you can see which version of Spark you have and which versions of Java and Scala it is using. That's it! Now you should be able to spin up a Jupyter Notebook and start using PySpark from anywhere.
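As a minimal sketch of that check, the snippet below builds the version-check command from a SPARK_HOME variable standing in for {YOUR_SPARK_DIRECTORY}; the /opt/spark default is a hypothetical path, so adjust it to your actual install before running the command.

```shell
# SPARK_HOME stands in for {YOUR_SPARK_DIRECTORY}; /opt/spark is a
# hypothetical default -- point it at your real Spark directory.
SPARK_HOME="${SPARK_HOME:-/opt/spark}"

# The version check described in the article lives in Spark's bin directory:
CMD="$SPARK_HOME/bin/spark-shell --version"
echo "$CMD"

# Running $CMD prints the Spark version along with the Java and Scala
# versions it was built against.
```

If the command prints the expected Spark version, your installation is on the PATH correctly and a Jupyter notebook launched from the same shell should be able to find it.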