I am trying the code suggested here: http://spark.apache.org/docs/1.2.1/mllib-ensembles.html#classification
using the Scala console (Scala version = Scala code runner version 2.10.4), and I get the following error:
scala> import org.apache.spark.mllib.tree.RandomForest
<console>:8: error: object apache is not a member of package org
import org.apache.spark.mllib.tree.RandomForest
^
I then followed the advice here (https://stackoverflow.com/a/28270036/190791) and tried to build a simple standalone application, but ran into a different problem:
root@sd:~/simple# sbt package
[info] Set current project to Simple Project (in build file:/root/simple/)
[info] Updating {file:/root/simple/}default-c5720e...
[info] Resolving org.scala-lang#scala-library;2.10.4 ...
[info] Resolving org.apache.spark#spark-core_2.10.4;1.2.0 ...
[warn] module not found: org.apache.spark#spark-core_2.10.4;1.2.0
[warn] ==== local: tried
[warn] /root/.ivy2/local/org.apache.spark/spark-core_2.10.4/1.2.0/ivys/ivy.xml
[warn] ==== public: tried
[warn] http://repo1.maven.org/maven2/org/apache/spark/spark-core_2.10.4/1.2.0/spark-core_2.10.4-1.2.0.pom
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: UNRESOLVED DEPENDENCIES ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: org.apache.spark#spark-core_2.10.4;1.2.0: not found
Can anyone suggest what I could try?
You can find detailed steps in this post (http://blog.prabeeshk.com/blog/2014/04/01/a-standalone-spark-application-in-scala/) on how to write a self-contained Spark application in Scala using SBT. In the sbt build file you should specify the dependent libraries. Note that the suffix on the artifact name must be the Scala binary version (2.10), not the full compiler version (2.10.4), which is why `spark-core_2.10.4` could not be resolved.
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.10" % "1.2.1",
  "org.apache.spark" % "spark-mllib_2.10" % "1.2.1")
Then compile it with
sbt package
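As a sketch, a complete build.sbt for this setup might look like the following (the project name and version are illustrative). Using the `%%` operator lets sbt append the Scala binary version suffix (`_2.10`) to the artifact name automatically, so mismatches like `spark-core_2.10.4` cannot happen:

```scala
// build.sbt — minimal sketch; project name/version are assumptions
name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.4"

// %% appends the Scala *binary* version (_2.10) to the artifact name,
// so the suffix is never written by hand and cannot drift out of sync
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "1.2.1",
  "org.apache.spark" %% "spark-mllib" % "1.2.1"
)
```

With this file in the project root, `sbt package` resolves `spark-core_2.10;1.2.1` from Maven Central and builds the jar.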