  • [Original] Uncle's Experience Sharing (84): Setting hive.exec.max.dynamic.partitions in Spark SQL Has No Effect

    Spark 2.4

    Running the following in Spark SQL:

    set hive.exec.max.dynamic.partitions=10000;

    and then executing the SQL still fails with:

    org.apache.hadoop.hive.ql.metadata.HiveException:
    Number of dynamic partitions created is 1001, which is more than 1000.
    To solve this try to set hive.exec.max.dynamic.partitions to at least 1001.
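
    For context, here is a minimal sketch (written for spark-shell) of the kind of statement that triggers this error; the table and column names (events, staging_events, dt, and so on) are invented for illustration:

    spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict")
    spark.sql("SET hive.exec.max.dynamic.partitions=10000")  // updates only Spark's session conf

    // A dynamic-partition insert whose partition column dt has more than
    // 1000 distinct values still fails with the HiveException shown above.
    spark.sql("""
      INSERT OVERWRITE TABLE events PARTITION (dt)
      SELECT user_id, action, dt
      FROM staging_events
    """)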

    The default value of hive.exec.max.dynamic.partitions is 1000, and changing it with SET does not take effect.

    The reason is as follows:

    `HiveClient` does not know the new value 1001. There is no way to change the default value of `hive.exec.max.dynamic.partitions` of `HiveClient` with the `SET` command.

    The root cause is that `hive` parameters are passed to `HiveClient` when it is created. So, the workaround is to use `--hiveconf` when starting `spark-shell`.

    The solution is to set it with --hiveconf when starting spark-sql:

    spark-sql --hiveconf hive.exec.max.dynamic.partitions=10000
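
    The same principle applies when creating a SparkSession programmatically: the value must be supplied at session-creation time, not afterwards with SET. A minimal Scala sketch, assuming the spark.hadoop. prefix route (Spark copies such properties into the Hadoop configuration that the Hive client is built from); the application name is made up:

    import org.apache.spark.sql.SparkSession

    // Pass the Hive setting at creation time so the HiveClient sees it;
    // "spark.hadoop."-prefixed keys are copied into the Hadoop conf.
    val spark = SparkSession.builder()
      .appName("dynamic-partitions-demo")  // hypothetical app name
      .config("spark.hadoop.hive.exec.max.dynamic.partitions", "10000")
      .enableHiveSupport()
      .getOrCreate()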

    Reference:

    https://issues.apache.org/jira/browse/SPARK-19881

  • Original post: https://www.cnblogs.com/barneywill/p/11618898.html