[Original] Uncle's Experience Sharing (84): Setting hive.exec.max.dynamic.partitions in Spark SQL Has No Effect


Spark 2.4

 

Running the following in Spark SQL:

set hive.exec.max.dynamic.partitions=10000;

and then executing the SQL still fails with:

org.apache.hadoop.hive.ql.metadata.HiveException:
Number of dynamic partitions created is 1001, which is more than 1000.
To solve this try to set hive.exec.max.dynamic.partitions to at least 1001.

The parameter hive.exec.max.dynamic.partitions defaults to 1000, and the SET command above does not change the limit that is actually enforced.
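For context, an error like this comes from a dynamic-partition insert that creates more than 1000 partitions in a single statement. A minimal repro might look like the following (table and column names here are hypothetical):

set hive.exec.dynamic.partition=true;
set hive.exec.dynamic.partition.mode=nonstrict;
set hive.exec.max.dynamic.partitions=10000;

-- one partition is created per distinct dt value in source_table;
-- the statement fails once the number of partitions exceeds the enforced limit of 1000
insert overwrite table target_table partition (dt)
select col1, col2, dt from source_table;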

 

The reason is as follows:

`HiveClient` does not know the new value. There is no way to change the value of `hive.exec.max.dynamic.partitions` used by `HiveClient` with the `SET` command.

The root cause is that Hive parameters are passed to `HiveClient` when it is created. So the workaround is to use `--hiveconf` when starting `spark-shell`.
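In other words, SET only updates the value in Spark's own session configuration, not in the already-created HiveClient. That is why querying the parameter back inside the same spark-sql session typically echoes the new value even though the limit actually enforced has not changed:

set hive.exec.max.dynamic.partitions=10000;
set hive.exec.max.dynamic.partitions;
-- echoes the new value 10000, but the HiveClient still enforces 1000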

 

The solution is to set the value with --hiveconf when starting spark-sql:

spark-sql --hiveconf hive.exec.max.dynamic.partitions=10000
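An alternative, for sessions where --hiveconf is not convenient or not accepted (this is an assumption based on Spark's documented spark.hadoop.* passthrough, not something confirmed in the referenced JIRA), is to forward the value as a Hadoop property, either on the command line or in conf/spark-defaults.conf:

# copied into the Hadoop/Hive configuration used when the session (and its HiveClient) is created
spark-shell --conf spark.hadoop.hive.exec.max.dynamic.partitions=10000

# or persisted for all jobs in conf/spark-defaults.conf
spark.hadoop.hive.exec.max.dynamic.partitions    10000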

 

Reference:

https://issues.apache.org/jira/browse/SPARK-19881

