How to periodically update an RDD in Spark Streaming

2024-06-18

My code looks like this:

from pyspark import SparkContext
from pyspark.streaming import StreamingContext

sc = SparkContext()
ssc = StreamingContext(sc, 30)

# load the reference data from a path as an RDD
initRDD = sc.textFile('path_to_data')
lines = ssc.socketTextStream('localhost', 9999)
res = lines.transform(lambda x: x.join(initRDD))

res.pprint()

My problem is that initRDD needs to be refreshed every day at midnight.

I tried this:

from pyspark import SparkContext
from pyspark.streaming import StreamingContext

sc = SparkContext()
ssc = StreamingContext(sc, 30)

lines = ssc.socketTextStream('localhost', 9999)


def func(rdd):
    # this closure runs once per batch, so the data is reloaded every batch
    initRDD = rdd.context.textFile('path_to_data')
    return rdd.join(initRDD)


res = lines.transform(func)

res.pprint()

But initRDD seems to be reloaded every 30 seconds, i.e. once per batchDuration.

Are there any good ideas?


One option is to check whether a deadline has passed before the transform. The check is a simple comparison, so it is cheap to perform at every batch interval:

import java.time.{LocalDate, ZoneOffset}

def nextDeadline(): Long = {
  // assumes midnight in the UTC timezone
  LocalDate.now.atStartOfDay().plusDays(1).toInstant(ZoneOffset.UTC).toEpochMilli()
}
// Note this is a mutable variable!
var initRDD = sparkSession.read.parquet("/tmp/learningsparkstreaming/sensor-records.parquet")
// Note this is a mutable variable!
var _nextDeadline = nextDeadline()

val lines = ssc.socketTextStream("localhost", 9999)
// we use the foreachRDD as a scheduling trigger. 
// We don't use the data, only the execution hook
lines.foreachRDD{ _ => 
    if (System.currentTimeMillis > _nextDeadline) {
      initRDD = sparkSession.read.parquet("/tmp/learningsparkstreaming/sensor-records.parquet")
      _nextDeadline = nextDeadline()
    }
}
// if the rdd was updated, it will be picked up in this stage.
val res = lines.transform(rdd => rdd.join(initRDD))
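For the PySpark setup in the question, the same pattern can be sketched in Python. This is a minimal sketch, not a tested implementation: the next_deadline helper is pure stdlib and mirrors the Scala nextDeadline above, and the driver-side wiring (the state dict and maybe_reload names are hypothetical) is shown in comments because it assumes the sc/ssc/lines objects from the question:

```python
from datetime import datetime, time, timedelta, timezone


def next_deadline():
    """Epoch milliseconds of the next UTC midnight (mirrors the Scala nextDeadline)."""
    tomorrow = datetime.now(timezone.utc).date() + timedelta(days=1)
    midnight = datetime.combine(tomorrow, time.min, tzinfo=timezone.utc)
    return int(midnight.timestamp() * 1000)


# Driver-side wiring, assuming sc, ssc and lines are set up as in the question:
#
# state = {'rdd': sc.textFile('path_to_data'), 'deadline': next_deadline()}
#
# def maybe_reload(_):
#     import time as t
#     if t.time() * 1000 > state['deadline']:
#         state['rdd'] = sc.textFile('path_to_data')
#         state['deadline'] = next_deadline()
#
# # foreachRDD is only used as a per-batch scheduling hook, as in the Scala answer
# lines.foreachRDD(maybe_reload)
# res = lines.transform(lambda rdd: rdd.join(state['rdd']))
```

This works for the same reason as the Scala version: the bodies passed to foreachRDD and transform are evaluated on the driver once per batch, so the mutable state dict is re-read every 30 seconds, while the expensive reload only happens after the deadline passes.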
