Spark 2.4
Executing the following in Spark SQL:
set hive.exec.max.dynamic.partitions=10000;
and then running the SQL again still fails with:
org.apache.hadoop.hive.ql.metadata.HiveException: Number of dynamic partitions created is 1001, which is more than 1000. To solve this try to set hive.exec.max.dynamic.partitions to at least 1001.
The default value of `hive.exec.max.dynamic.partitions` is 1000, and the change made with `SET` does not take effect,
for the following reason:
`HiveClient` does not know the new value 1001; there is no way to change the default value of `hive.exec.max.dynamic.partitions` in `HiveClient` with the `SET` command.
The root cause is that Hive parameters are passed to `HiveClient` when it is created. So the workaround is to use `--hiveconf` when starting `spark-shell`.
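To make the failure mode concrete, here is a minimal Scala sketch (the table names `target` and `source` are hypothetical; the behavior is as described in the JIRA comment above):

```scala
import org.apache.spark.sql.SparkSession

// The Hive client is created together with the Hive-enabled session,
// still using the default hive.exec.max.dynamic.partitions = 1000.
val spark = SparkSession.builder()
  .enableHiveSupport()
  .getOrCreate()

// This only updates Spark's session-level SQLConf; the already-created
// HiveClient never sees the new value.
spark.sql("SET hive.exec.max.dynamic.partitions=10000")

// A dynamic-partition insert that produces more than 1000 partitions
// therefore still fails with the HiveException shown above.
spark.sql(
  """INSERT OVERWRITE TABLE target PARTITION (dt)
    |SELECT value, dt FROM source""".stripMargin)
```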
The solution is to set the parameter with `--hiveconf` when launching spark-sql:
spark-sql --hiveconf hive.exec.max.dynamic.partitions=10000
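For jobs that build a `SparkSession` programmatically instead of going through the spark-sql CLI, the same principle applies: the value has to be in place before the Hive client is created. A minimal sketch, assuming the `spark.hadoop.*` prefix, which Spark copies into the Hadoop configuration that the Hive client is built from:

```scala
import org.apache.spark.sql.SparkSession

// The spark.hadoop.* prefix forwards the entry into the Hadoop
// configuration that HiveClient reads when it is instantiated,
// so the new limit is visible before any SQL runs.
val spark = SparkSession.builder()
  .config("spark.hadoop.hive.exec.max.dynamic.partitions", "10000")
  .enableHiveSupport()
  .getOrCreate()
```

The same setting should also be passable on the command line, e.g. spark-submit --conf spark.hadoop.hive.exec.max.dynamic.partitions=10000.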
Reference:
https://issues.apache.org/jira/browse/SPARK-19881