Why does the Scala compiler fail with "object SparkConf in package spark cannot be accessed in package org.apache.spark"?

2023-12-02

I cannot access SparkConf in my package, even though I have imported it with import org.apache.spark.SparkConf. My code is:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.rdd.RDD

import org.apache.spark._
import org.apache.spark.streaming._
import org.apache.spark.streaming.StreamingContext._

object SparkStreaming {
    def main(arg: Array[String]) = {

        val conf = new SparkConf.setMaster("local[2]").setAppName("NetworkWordCount")
        val ssc = new StreamingContext( conf, Seconds(1) )

        val lines = ssc.socketTextStream("localhost", 9999)
        val words = lines.flatMap(_.split(" "))
        val pairs_new = words.map( w => (w, 1) )
        val wordsCount = pairs_new.reduceByKey(_ + _)
        wordsCount.print() 

        ssc.start() // Start the computation
        ssc.awaitTermination() // Wait for the computation to terminate

    }
}

The sbt dependencies are:

name := "Spark Streaming"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % "1.5.2" % "provided",
    "org.apache.spark" %% "spark-mllib" % "1.5.2",
    "org.apache.spark" %% "spark-streaming" % "1.5.2"
)

But the compiler reports that SparkConf cannot be accessed:

[error] /home/cliu/Documents/github/Spark-Streaming/src/main/scala/Spark-Streaming.scala:31: object SparkConf in package spark cannot be accessed in package org.apache.spark
[error]         val conf = new SparkConf.setMaster("local[2]").setAppName("NetworkWordCount")
[error]                        ^

If you add parentheses after SparkConf, it compiles:

val conf = new SparkConf().setMaster("local[2]").setAppName("NetworkWordCount")

The key is that SparkConf is a class, not a function, so its name can also be used for scoping. Without the parentheses, Scala parses new SparkConf.setMaster("local[2]") as new applied to a type named setMaster selected from a value named SparkConf; the only value with that name is Spark's package-private companion object, which is why the compiler complains that it "cannot be accessed in package org.apache.spark". Adding parentheses after the class name guarantees that you are calling the class constructor first and then the setter methods on the resulting instance. Here is an example from the Scala shell illustrating the difference:

scala> class C1 { var age = 0; def setAge(a:Int) = {age = a}}
defined class C1

scala> new C1
res18: C1 = $iwC$$iwC$C1@2d33c200

scala> new C1()
res19: C1 = $iwC$$iwC$C1@30822879

scala> new C1.setAge(30)  // this doesn't work

<console>:23: error: not found: value C1
          new C1.setAge(30)
              ^

scala> new C1().setAge(30) // this works

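For completeness: the shell error "not found: value C1" appears because Scala parses new C1.setAge(30) as the instantiation of a path-dependent type, i.e. it looks for a stable value named C1 that contains a type named setAge. Since C1 is only a class, no such value exists. A minimal sketch (with hypothetical names PathDemo, Outer, and Inner) showing where that syntax is actually legal:

object PathDemo {
  class Outer {
    class Inner(val x: Int)  // an inner class: a type that lives inside an Outer instance
  }

  def main(args: Array[String]): Unit = {
    val o = new Outer
    val i = new o.Inner(30)  // o.Inner is a path-dependent type selected from the value o
    println(i.x)             // prints 30
  }
}

This is exactly the parse that makes new C1.setAge(30) fail: there is no value C1 from which to select a type setAge.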