ValueError: Cannot run multiple SparkContexts at once; existing SparkContext(app=PySparkShell, maste… - Rachel_nana's blog (CSDN)
Pyspark Kernel Error : ValueError: Cannot run multiple SparkContexts at once; existing SparkContext(app=pyspark-shell, master=yarn · Issue #2115 · jupyterhub/jupyterhub · GitHub
Initialization of SparklySession when SparkContext is already exists · Issue #66 · tubular/sparkly · GitHub
Running PySpark code fails with "ValueError: Cannot run multiple SparkContexts at once; existing SparkContext" - ablog