I'm building an SBT multi-project build with a common module and a logic module, where logic.dependsOn(common).

In common, Spark SQL 2.2.1 ("org.apache.spark" %% "spark-sql" % "2.2.1") is pulled in as a dependency. logic also uses Spark SQL, but there I get a compile error: "object spark is not a member of package org.apache".
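For context, what logic does with Spark is nothing exotic; the failing code is essentially the standard SparkSession entry point (a minimal sketch with placeholder object/app names, not my real code):

package logic

// This import is what fails with "object spark is not a member of package org.apache"
import org.apache.spark.sql.SparkSession

object Job {
  def main(args: Array[String]): Unit = {
    // Standard Spark 2.x entry point; local master only so the sketch runs standalone
    val spark = SparkSession.builder().appName("logic-job").master("local[*]").getOrCreate()
    spark.stop()
  }
}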
Now, if I add the Spark SQL dependency to logic as "org.apache.spark" %% "spark-sql" % "2.2.1", it works. But if I add it as "org.apache.spark" %% "spark-sql" % "2.2.1" % Provided, I get the same error.
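In build-file terms, these are the two variants I tried in logic (a sketch with the raw coordinates; my actual file goes through a spark_sql alias, see below):

libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.1"            // variant 1: compiles
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.1" % Provided // variant 2: same compile error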
I don't understand why this happens, or why the dependency does not carry over transitively from common to logic.
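My mental model, as a minimal sketch with hypothetical projects a and b, was that an inter-project dependency exposes the upstream module's libraries to the downstream one:

lazy val a = project.settings(libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.1")
lazy val b = project.dependsOn(a) // my expectation: b sees spark-sql through a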
Here is the root sbt file:
lazy val commonSettings = Seq(
  organization := "...",
  version := "0.1.0",
  scalaVersion := "2.11.12",
  resolvers ++= Seq(
    clojars,
    maven_local,
    novus,
    twitter,
    spark_packages,
    artima
  ),
  test in assembly := {},
  assemblyMergeStrategy in assembly := {...}
)

lazy val root = (project in file(".")).aggregate(common, logic)

lazy val common = (project in file("common")).settings(commonSettings: _*)

lazy val logic = (project in file("logic")).dependsOn(common).settings(commonSettings: _*)
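As far as I understand, dependsOn(common) is shorthand for the default compile->compile configuration mapping, i.e. the last line could equally be written as:

lazy val logic = (project in file("logic"))
  .dependsOn(common % "compile->compile")
  .settings(commonSettings: _*)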
And here is the logic module's sbt file:
libraryDependencies ++= Seq(
  spark_sql.exclude("io.netty", "netty"),
  embedded_elasticsearch % "test",
  scalatest % "test"
)

dependencyOverrides ++= Seq(
  "com.fasterxml.jackson.core" % "jackson-core" % "2.6.5",
  "com.fasterxml.jackson.core" % "jackson-databind" % "2.6.5",
  "com.fasterxml.jackson.module" % "jackson-module-scala_2.11" % "2.6.5",
  "com.fasterxml.jackson.core" % "jackson-annotations" % "2.6.5",
  "org.json4s" %% "json4s-jackson" % "3.2.11"
)

assemblyJarName in assembly := "***.jar"
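In case it helps, the classpath logic actually compiles against can be inspected from the sbt shell (0.13-style syntax, matching the build above):

> show logic/compile:dependencyClasspath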