1) Call sqlContext.udf.register()
A function registered this way is only visible inside sql() queries; it is not visible to the DataFrame API.
Usage: sqlContext.udf.register("makeDt", makeDT(_: String, _: String, _: String))
Example:
def makeDT(date: String, time: String, tz: String) = s"$date $time $tz"
sqlContext.udf.register("makeDt", makeDT(_: String, _: String, _: String))
// Now we can use our function directly in SparkSQL.
sqlContext.sql("SELECT amount, makeDt(date, time, tz) from df").take(2) // but not outside
df.select($"customer_id", makeDt($"date", $"time", $"tz"), $"amount").take(2) // fails
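Note that the sql() query above refers to a table named df; for that lookup to succeed, the DataFrame has to be registered as a temporary table first. A minimal sketch, assuming the Spark 1.x sqlContext API used in these snippets:

df.registerTempTable("df")  // exposes the DataFrame to sqlContext.sql(...) under the name "df"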
2) Call udf() from org.apache.spark.sql.functions
A function wrapped this way is visible outside sql() as well, so it can be used directly with the DataFrame API.
Usage: val makeDt = udf(makeDT(_: String, _: String, _: String))
Example:
import org.apache.spark.sql.functions.udf
val makeDt = udf(makeDT(_: String, _: String, _: String))
// now this works
df.select($"customer_id", makeDt($"date", $"time", $"tz"), $"amount").take(2)
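Putting the two approaches side by side, here is a minimal self-contained sketch. The SparkContext sc, the sample rows, and the column names (customer_id, date, time, tz, amount) are assumptions made for illustration, based only on the snippets above:

import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.functions.udf

val sqlContext = new SQLContext(sc)   // sc: an existing SparkContext (assumed)
import sqlContext.implicits._

def makeDT(date: String, time: String, tz: String) = s"$date $time $tz"

// Hypothetical sample data matching the column names used in the examples.
val df = Seq(
  ("c1", "2015-01-01", "10:00:00", "UTC", 42.0),
  ("c2", "2015-01-02", "11:30:00", "PST", 17.5)
).toDF("customer_id", "date", "time", "tz", "amount")
df.registerTempTable("df")

// 1) Registered for sql() only
sqlContext.udf.register("makeDt", makeDT(_: String, _: String, _: String))
sqlContext.sql("SELECT amount, makeDt(date, time, tz) FROM df").show()

// 2) Wrapped with udf(), usable in the DataFrame API
val makeDt = udf(makeDT(_: String, _: String, _: String))
df.select($"customer_id", makeDt($"date", $"time", $"tz"), $"amount").show()

The same Scala function backs both registrations; only the registration mechanism determines where it can be called from.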
