I tested the following query on Databricks with Apache Spark 2.4:
%sql
<step1>
create temporary view temp_view_t
as select 1 as no, 'aaa' as str;
<step2>
insert into temp_view_t values (2,'bbb');
Then I got this error message:
Error in SQL statement: AnalysisException: Inserting into an RDD-based table is not allowed.;;
'InsertIntoTable Project [1 AS no#824, aaa AS str#825], false, false
+- LocalRelation [col1#831, col2#832]
My questions are:
- Is it impossible to insert into a temporary table in Spark?
- How do I create temporary data in Spark SQL?
Thanks.
We can't insert data into a temporary table, but we can simulate an insert with union all (or union, which removes duplicates).
Example:
#create temp view
spark.sql("""create or replace temporary view temp_view_t as select 1 as no, 'aaa' as str""")
spark.sql("select * from temp_view_t").show()
#+---+---+
#| no|str|
#+---+---+
#| 1|aaa|
#+---+---+
#union all with the new data
spark.sql("""create or replace temporary view temp_view_t as select * from temp_view_t union all select 2 as no, 'bbb' as str""")
spark.sql("select * from temp_view_t").show()
#+---+---+
#| no|str|
#+---+---+
#| 1|aaa|
#| 2|bbb|
#+---+---+
#to eliminate duplicates, use union instead of union all
spark.sql("""create or replace temporary view temp_view_t as select * from temp_view_t union select 1 as no, 'aaa' as str""")
spark.sql("select * from temp_view_t").show()
#+---+---+
#| no|str|
#+---+---+
#| 1|aaa|
#| 2|bbb|
#+---+---+
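The difference between the two variants above can be sketched in plain Python, without Spark (the function names here are illustrative, not Spark API):

```python
# A minimal sketch of the semantics behind SQL's UNION ALL vs UNION:
# UNION ALL keeps every row; UNION deduplicates.

def union_all(left, right):
    # Concatenate rows, keeping duplicates -- like SQL UNION ALL.
    return left + right

def union(left, right):
    # Deduplicate while preserving first-seen order -- like SQL UNION.
    seen, out = set(), []
    for row in left + right:
        if row not in seen:
            seen.add(row)
            out.append(row)
    return out

rows = [(1, "aaa")]
print(union_all(rows, [(1, "aaa"), (2, "bbb")]))  # [(1, 'aaa'), (1, 'aaa'), (2, 'bbb')]
print(union(rows, [(1, "aaa"), (2, "bbb")]))      # [(1, 'aaa'), (2, 'bbb')]
```

This is why the last `create or replace ... union select 1 as no, 'aaa' as str` above leaves only one `(1, aaa)` row: the duplicate is dropped by union.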