I am trying to run Apache Spark inside IPython Notebook, following these instructions (and all the advice in the comments): http://ramhiser.com/2015/02/01/configuring-ipython-notebook-support-for-pyspark/
But when I launch IPython Notebook with this command:
ipython notebook --profile=pyspark
I get this error:
Error: Must specify a primary resource (JAR or Python or R file)
If I run pyspark in the shell, everything works fine, so the problem seems to be in how IPython is wired up to Spark.
By the way, this is my .bash_profile:
export SPARK_HOME="$HOME/spark-1.4.0"
export PYSPARK_SUBMIT_ARGS='--conf "spark.mesos.coarse=true" pyspark-shell'
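As I understand it, spark-submit takes its first non-option argument as the primary resource, and pyspark-shell is the special token the PySpark shell passes for that. So my first suspicion is that the quoting in PYSPARK_SUBMIT_ARGS gets mangled somewhere and pyspark-shell never reaches spark-submit. A minimal check I can run in a plain python session (or a notebook cell) to see what actually ends up in the environment:

import os

# Show the submit args exactly as a child process such as spark-submit
# would receive them. If "pyspark-shell" is missing, or the quotes around
# spark.mesos.coarse=true were swallowed, that would explain the
# "Must specify a primary resource" error.
print(repr(os.environ.get("PYSPARK_SUBMIT_ARGS")))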
And this is what ~/.ipython/profile_pyspark/startup/00-pyspark-setup.py contains:
# Configure the necessary Spark environment
import os
import sys
# Spark home
spark_home = os.environ.get("SPARK_HOME")
# If Spark V1.4.x is detected, then add ' pyspark-shell' to
# the end of the 'PYSPARK_SUBMIT_ARGS' environment variable
spark_release_file = spark_home + "/RELEASE"
if os.path.exists(spark_release_file) and "Spark 1.4" in open(spark_release_file).read():
    pyspark_submit_args = os.environ.get("PYSPARK_SUBMIT_ARGS", "")
    if "pyspark-shell" not in pyspark_submit_args:
        pyspark_submit_args += " pyspark-shell"
    os.environ["PYSPARK_SUBMIT_ARGS"] = pyspark_submit_args
# Add the spark python sub-directory to the path
sys.path.insert(0, spark_home + "/python")
# Add the py4j to the path.
# You may need to change the version number to match your install
sys.path.insert(0, os.path.join(spark_home, "python/lib/py4j-0.8.2.1-src.zip"))
# Initialize PySpark to predefine the SparkContext variable 'sc'
execfile(os.path.join(spark_home, "python/pyspark/shell.py"))
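To narrow things down, the same initialization can be run outside IPython to see whether the error reproduces there. A minimal sketch (Python 2, matching the execfile above, and assuming SPARK_HOME is exported as in my .bash_profile):

import os
import sys

spark_home = os.environ["SPARK_HOME"]
sys.path.insert(0, os.path.join(spark_home, "python"))
sys.path.insert(0, os.path.join(spark_home, "python/lib/py4j-0.8.2.1-src.zip"))

# What spark-submit should actually see when shell.py launches the JVM
print(repr(os.environ.get("PYSPARK_SUBMIT_ARGS")))

# The same call the startup file makes; if it fails here too, the problem
# is in the environment rather than in the IPython profile.
execfile(os.path.join(spark_home, "python/pyspark/shell.py"))
print(sc)  # shell.py is expected to predefine sc on success

If this plain-python run works but the notebook still fails, then presumably the profile's startup directory is not being picked up at all.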
And in case it matters: yesterday I upgraded my OS X to 10.10.4.