Creating a JAR for Spark

2024-02-05

I am following this guide: https://spark.apache.org/docs/1.2.0/quick-start.html, but I am unable to run my Scala code in Spark when I try to create the JAR with sbt.

I have simple.sbt as:

name := "Simple Project"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.0"
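One thing worth checking while still on an old sbt: releases before 0.13.7 required a blank line between settings in a .sbt file, and the version reported below (0.11.3) is one of them. A sketch of the same build file in the old-sbt-compatible layout:

```scala
// simple.sbt — sbt < 0.13.7 requires a blank line between settings
name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.0"
```

This does not cause the download failure shown in the error, but it is a common second trip-up with sbt 0.11.x builds.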

Here is the error:

sbt package
[INFO]  ..
[warn]  [NOT FOUND  ] org.eclipse.jetty.orbit#javax.servlet;3.0.0.v201112011016!javax.servlet.orbit (255ms)
[warn] ==== public: tried
[warn]   http://repo1.maven.org/maven2/org/eclipse/jetty/orbit/javax.servlet/3.0.0.v201112011016/javax.servlet-3.0.0.v201112011016.orbit
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::              FAILED DOWNLOADS            ::
[warn]  :: ^ see resolution messages for details  ^ ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: org.eclipse.jetty.orbit#javax.servlet;3.0.0.v201112011016!javax.servlet.orbit
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[error] {file:/home/gsamaras/spark-1.6.0-bin-hadoop2.6/code/}default-04a409/*:update: sbt.ResolveException: download failed: org.eclipse.jetty.orbit#javax.servlet;3.0.0.v201112011016!javax.servlet.orbit
[error] Total time: 25 s, completed Feb 10, 2016 5:11:30 PM

The Scala version, though I read somewhere that it is not relevant:

gsamaras@gsamaras:~/spark-1.6.0-bin-hadoop2.6/code$ scala -version
Scala code runner version 2.11.6 -- Copyright 2002-2013, LAMP/EPFL

And the sbt version:

gsamaras@gsamaras:~/spark-1.6.0-bin-hadoop2.6/code$ sbt sbt-version
[info] Set current project to Simple Project (in build file:/home/gsamaras/spark-1.6.0-bin-hadoop2.6/code/)
[info] 0.11.3

Related: SBT, Jetty and Servlet 3.0: https://stackoverflow.com/questions/9889674/sbt-jetty-and-servlet-3-0


Update sbt to the latest version by running the following (some of the steps may be optional):

# Remove the sbt package currently installed from the distribution repositories
sudo apt-get remove sbt
# Add the official sbt apt repository and import its signing key
echo "deb https://dl.bintray.com/sbt/debian /" | sudo tee -a /etc/apt/sources.list.d/sbt.list
sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv 642AC823
# Refresh the package index, then install and upgrade sbt
sudo apt-get update
sudo apt-get install sbt
sudo apt-get upgrade sbt

After that you should be fine.
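Once a newer sbt launcher is installed, you can also pin the sbt version per project so the build is not at the mercy of whatever the system default happens to be. A minimal sketch, assuming a project-local properties file (the version number here is just an example that was current at the time of the post):

```scala
// project/build.properties — read by the sbt launcher at startup
// sbt.version=0.13.9
```

The launcher will then download and use exactly that sbt release for this project, which avoids silently falling back to an old 0.11.x install.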

