spark
First, follow the conda workflow described in the Python section, then run `pip install findspark` inside your conda virtual environment.
Next, run `spack load jdk` and `spack load spark`. Loading the JDK is necessary so that `JAVA_HOME` exists; otherwise the Spark context cannot be created (see the related Stack Overflow discussion).
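The full setup can be sketched as the following shell session; the package names `jdk` and `spark` are assumptions and may differ on your cluster's spack installation:

```shell
# Load the JDK first: it sets JAVA_HOME, which Spark needs
spack load jdk
# Load Spark itself: it sets SPARK_HOME, which findspark looks for
spack load spark

# Sanity check: JAVA_HOME should now be non-empty
echo "$JAVA_HOME"

# Inside your activated conda env
pip install findspark
```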
Finally, open Jupyter with `jupyter notebook` as usual.
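Inside the notebook, a minimal sketch of using findspark looks like this (it assumes `findspark` is installed and that `spack load jdk` and `spack load spark` were run before launching Jupyter):

```python
import findspark

# Locates the Spark installation (via SPARK_HOME) and adds
# pyspark to sys.path so it can be imported normally
findspark.init()

from pyspark.sql import SparkSession

# This step fails if JAVA_HOME is unset, i.e. if the jdk
# package was not loaded before starting the notebook
spark = (
    SparkSession.builder
    .master("local[*]")   # run locally in the notebook's node
    .appName("findspark-demo")
    .getOrCreate()
)

# Quick sanity check that the context works
print(spark.range(10).count())

spark.stop()
```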