Read the first table, execute the input in Spark SQL, and insert the results into another table.

Please suggest the best code for the requirement below; we are using the Apache Spark framework.

I have Table 1 and Table 2, as shown in the images below. First we need to read Table 1 by Test_id, execute each row's queries one after another (both the _SRC and the _Target script), compare the two output counts with some basic operators (such as <, >, =, etc.), and insert the results into Table 2 along with date and user details.

Thanks in advance.

Table 1 (screenshot)

Table 2 (screenshot)


Please check the code below.
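
Since the screenshots do not come through in text, this is the layout the code below assumes. The DDL is only an inferred sketch (the names dbnameaaa.table1 and dbnameaaa.table2 are placeholders), not the asker's actual schema:

spark.sql("""
  CREATE TABLE IF NOT EXISTS dbnameaaa.table1 (
    test_id INT,
    execution_script_src STRING,
    execution_script_target STRING)""")

spark.sql("""
  CREATE TABLE IF NOT EXISTS dbnameaaa.table2 (
    test_id INT, actual_result BIGINT, expected_result BIGINT,
    test_condition STRING, test_result STRING,
    create_date DATE, modify_date DATE,
    created_by STRING, modified_by STRING, execute_date DATE)""")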

Create Table 1 (as a sample DataFrame):

scala> :paste
// Entering paste mode (ctrl-D to finish)

val df = Seq(
    (1,"select count(*) from dbnameaaa.tbl_name","select count(*) from dbnameaaa.tbl_name"),
    (2,"select count(*) from dbnameaaa.tmp_tbl","select count(*) from dbnameaaa.tmp_tbl"))
    .toDF("test_id","execution_script_src","execution_script_target")

// Exiting paste mode, now interpreting.

df: org.apache.spark.sql.DataFrame = [test_id: int, execution_script_src: string ... 1 more field]

scala> df.show(false)
+-------+---------------------------------------+---------------------------------------+
|test_id|execution_script_src                   |execution_script_target                |
+-------+---------------------------------------+---------------------------------------+
|1      |select count(*) from dbnameaaa.tbl_name|select count(*) from dbnameaaa.tbl_name|
|2      |select count(*) from dbnameaaa.tmp_tbl |select count(*) from dbnameaaa.tmp_tbl |
+-------+---------------------------------------+---------------------------------------+
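
In practice, Table 1 would already exist in the metastore rather than being built from a Seq; assuming a hypothetical table name dbnameaaa.table1, the same DataFrame can be read directly:

// Read the real Table 1 from the metastore instead of hardcoding a Seq
val df = spark.table("dbnameaaa.table1")
  .select("test_id", "execution_script_src", "execution_script_target")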

Create the query-execution and condition UDFs:

scala> :paste
// Entering paste mode (ctrl-D to finish)

// Runs a count query and returns the first value; 0 on any failure.
val execute = udf((query: String) => {
    try { spark.sql(query).map(_.getAs[Long](0)).collect.head }
    catch { case _: Exception => 0L }
})

// Evaluates each comparison operator and reports Pass/Fail as a JSON string.
val condition = udf((actual: Long, expected: Long) => {
    def check(ok: Boolean) = if (ok) "Pass" else "Fail"
    s"""{"=":"${check(actual == expected)}","<":"${check(actual < expected)}",">":"${check(actual > expected)}","<>":"${check(actual != expected)}"}"""
})

// Exiting paste mode, now interpreting.

execute: org.apache.spark.sql.expressions.UserDefinedFunction = UserDefinedFunction(<function1>,LongType,Some(List(StringType)))
condition: org.apache.spark.sql.expressions.UserDefinedFunction = UserDefinedFunction(<function2>,StringType,Some(List(LongType, LongType)))
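
Note that calling spark.sql inside a UDF only works here because spark-shell in local mode runs driver and executors in one JVM; on a real cluster the SparkSession is not available on the executors and the execute UDF will fail. Since Table 1 is small, a safer sketch collects it to the driver and runs the queries there:

// Driver-side alternative (assumes Table 1 fits in driver memory)
def runCount(query: String): Long =
  try { spark.sql(query).first.getLong(0) } catch { case _: Exception => 0L }

val counted = df.collect.map { row =>
  (row.getInt(0), runCount(row.getString(1)), runCount(row.getString(2)))
}.toSeq.toDF("test_id", "actual_result", "expected_result")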


Final table result:

scala> :paste
// Entering paste mode (ctrl-D to finish)

df
.withColumn("actual_result",execute($"execution_script_src"))
.withColumn("expected_result",execute($"execution_script_target"))
.withColumn("test_condition",lit("[ =, <, >, <> ]"))
.withColumn("test_result",condition($"actual_result",$"expected_result"))
.withColumn("create_date",current_date)
.withColumn("modify_date",current_date)
.withColumn("created_by",lit(spark.sparkContext.sparkUser))
.withColumn("modified_by",lit(spark.sparkContext.sparkUser))
.withColumn("execute_date",current_date)
.show(false)

// Exiting paste mode, now interpreting.

+-------+---------------------------------------+---------------------------------------+-------------+---------------+---------------+----------------------------------------------+-----------+-----------+----------+-----------+------------+
|test_id|execution_script_src                   |execution_script_target                |actual_result|expected_result|test_condition |test_result                                   |create_date|modify_date|created_by|modified_by|execute_date|
+-------+---------------------------------------+---------------------------------------+-------------+---------------+---------------+----------------------------------------------+-----------+-----------+----------+-----------+------------+
|1      |select count(*) from dbnameaaa.tbl_name|select count(*) from dbnameaaa.tbl_name|11           |11             |[ =, <, >, <> ]|{"=":"Pass","<":"Fail",">":"Fail","<>":"Fail"}|2020-05-06 |2020-05-06 |srinivas  |srinivas   |2020-05-06  |
|2      |select count(*) from dbnameaaa.tmp_tbl |select count(*) from dbnameaaa.tmp_tbl |11           |22             |[ =, <, >, <> ]|{"=":"Fail","<":"Pass",">":"Fail","<>":"Pass"}|2020-05-06 |2020-05-06 |srinivas  |srinivas   |2020-05-06  |
+-------+---------------------------------------+---------------------------------------+-------------+---------------+---------------+----------------------------------------------+-----------+-----------+----------+-----------+------------+
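
To persist this into Table 2 instead of only showing it, drop the final .show(false), assign the chain to a val (say resultDF), and append it; dbnameaaa.table2 is an assumed name, use yours:

// insertInto matches columns by position, so resultDF's column order
// must match the existing table's schema
resultDF.write.mode("append").insertInto("dbnameaaa.table2")

// or let Spark create the table if it does not exist yet
// resultDF.write.mode("append").saveAsTable("dbnameaaa.table2")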