How to fix the Spark SQL error: Detected cartesian product for INNER join between logical plans


 

Running a Spark SQL program fails with the following error:

Exception in thread "main" org.apache.spark.sql.AnalysisException: Detected cartesian product for INNER join between logical plans

Solution: set spark.sql.crossJoin.enabled=true

This happens because Spark 2.x does not allow cartesian product (cross join) operations by default; they must be enabled explicitly through the spark.sql.crossJoin.enabled parameter.
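For illustration, here is a minimal sketch of a query that triggers this error. The DataFrame names df1 and df2 are assumptions for the example, not taken from the original program:

import org.apache.spark.sql.SparkSession

// Minimal reproducer sketch (assumed names, not the original job).
// Joining two DataFrames with no join condition forces Spark to plan a
// cartesian product, which Spark 2.x rejects by default.
val spark = SparkSession.builder
  .appName("CrossJoinRepro")
  .master("local[*]")
  .getOrCreate()

import spark.implicits._

val df1 = Seq(1, 2, 3).toDF("a")
val df2 = Seq("x", "y").toDF("b")

// Throws AnalysisException: Detected cartesian product for INNER join
// unless spark.sql.crossJoin.enabled is set to true.
df1.join(df2).show()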

 

To enable cartesian product operations in the program code, set the option on the SparkSession builder as shown below:

import org.apache.spark.sql.SparkSession

val sc: SparkSession = SparkSession.builder
  .appName("My Spark Application") // optional and will be autogenerated if not specified
  .master("local[*]")              // avoid hardcoding the deployment environment
  .config("spark.debug.maxToStringFields", "200")
  .config("spark.sql.crossJoin.enabled", "true") // allow cartesian products
  .getOrCreate()

val rst = discountFinancial.run(sc)
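Besides configuring the builder, the same flag can be set on an already-running session, passed at submit time, or avoided entirely by requesting the cartesian product explicitly. The following is a sketch assuming Spark 2.1+, where Dataset.crossJoin is available; df1 and df2 are placeholder DataFrames, not part of the original program:

// Option 1: set the flag at runtime on an existing session.
sc.conf.set("spark.sql.crossJoin.enabled", "true")

// Option 2 (Spark 2.1+): ask for the cartesian product explicitly;
// crossJoin does not require the flag.
val crossed = df1.crossJoin(df2)

// Option 3: pass the flag when submitting the job:
// spark-submit --conf spark.sql.crossJoin.enabled=true ...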

