Spark SQL error: "Detected cartesian product for INNER join between logical plans" and how to fix it


 

A Spark SQL program fails at runtime with the following error:

Exception in thread "main" org.apache.spark.sql.AnalysisException: Detected cartesian product for INNER join between logical plans

Solution: set spark.sql.crossJoin.enabled=true.

This happens because Spark 2.x does not allow Cartesian-product (cross) joins by default; they have to be enabled explicitly through the spark.sql.crossJoin.enabled parameter.
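Besides setting it in the SparkSession builder (shown further below), the flag can also be passed to spark-submit with --conf, or toggled at runtime. The following is a minimal sketch that assumes an already-created SparkSession named spark:

// Set the flag programmatically on an existing session
spark.conf.set("spark.sql.crossJoin.enabled", "true")

// Or set it through a SQL statement
spark.sql("SET spark.sql.crossJoin.enabled=true")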

 

Enable the Cartesian product in the application code, for example:

import org.apache.spark.sql.SparkSession

val sc: SparkSession = SparkSession.builder
  .appName("My Spark Application")               // optional; autogenerated if not specified
  .master("local[*]")                            // avoid hardcoding the deployment environment
  .config("spark.debug.maxToStringFields", "200")
  .config("spark.sql.crossJoin.enabled", "true") // allow Cartesian-product joins
  .getOrCreate()

val rst = discountFinancial.run(sc)
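For context, the AnalysisException is typically raised by an inner join whose condition is missing or does not relate the two sides, which the optimizer plans as a Cartesian product. A minimal sketch using hypothetical DataFrames df1 and df2 (not from the original program):

val df1 = sc.range(0, 3).toDF("a")
val df2 = sc.range(0, 3).toDF("b")

// Without spark.sql.crossJoin.enabled=true, this join fails with the
// "Detected cartesian product for INNER join" AnalysisException
val joined = df1.join(df2)        // no join condition => Cartesian product

// Alternatively, state the intent explicitly with crossJoin (available since Spark 2.1),
// which does not depend on the flag
val crossed = df1.crossJoin(df2)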

