Spark
First, follow the conda workflow described in the Python section, then install findspark into your conda virtual environment with `pip install findspark`.

Next, run `spack load jdk` and `spack load spark`. Loading the JDK is necessary so that `JAVA_HOME` is set; otherwise the Spark context cannot be created.

Finally, open Jupyter with `jupyter notebook` as usual.
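Put together, a session following the steps above might look like the sketch below. The environment name `myenv` is a hypothetical placeholder, and the exact spack package names depend on your cluster's spack configuration.

```shell
# Activate the conda environment prepared as described in the Python section
# ("myenv" is a placeholder name).
conda activate myenv

# Install findspark so notebooks can locate the Spark installation.
pip install findspark

# Load the JDK and Spark modules. Loading jdk sets JAVA_HOME,
# without which a SparkContext cannot be created.
spack load jdk
spack load spark

# Launch Jupyter as usual.
jupyter notebook
```

Inside a notebook, calling `findspark.init()` (after `import findspark`) before importing `pyspark` lets Python find the Spark installation loaded by spack.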