Registering a Spark UDF for use in the DataFrame DSL or SQL


import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object Test2 {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession
      .builder()
      .appName("Spark SQL basic example")
      .master("local[*]") // run locally; drop this when submitting to a cluster
      .config("spark.some.config.option", "some-value")
      .getOrCreate()

    // For implicit conversions like converting RDDs to DataFrames
    import spark.implicits._

    val df = Seq(("id1", 1, 100), ("id2", 4, 300), ("id3", 5, 800)).toDF("id", "value", "cnt")
    df.printSchema()

    // Register a custom UDF named "simpleUDF" on the SparkSession
    spark.udf.register("simpleUDF", (v: Int, w: Int) => v * v + w * w)

    df.select($"id", callUDF("simpleUDF", $"value", $"cnt")).toDF("id", "value").show

  }
}
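
Because the UDF was registered on the SparkSession, it can also be referenced by name from Spark SQL. Below is a minimal sketch, assuming the same spark session and df as above; the temp-view name squares is chosen only for illustration.

    // Expose the DataFrame to SQL under a temp view (name is illustrative)
    df.createOrReplaceTempView("squares")

    // The registered UDF is called by name directly in the SQL text
    spark.sql("SELECT id, simpleUDF(value, cnt) AS sum_of_squares FROM squares").show()

For pure DSL use, org.apache.spark.sql.functions.udf can also wrap the lambda into a column function that is applied directly, without registering it on the session.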
