  • Spark: Could not write all entries

    When saving a Spark DataFrame to Elasticsearch, the following error appeared:

    Caused by: org.elasticsearch.hadoop.EsHadoopException: Could not write all entries [1/1] (Maybe ES was overloaded?). Error sample (first [1] error messages):
    	rejected execution of org.elasticsearch.transport.TransportService$4@7d5f91de on EsThreadPoolExecutor[bulk, queue capacity = 50, org.elasticsearch.common.util.concurrent.EsThreadPoolExecutor@3447703a[Running, pool size = 32, active threads = 32, queued tasks = 68, completed tasks = 9151096]]
    Bailing out...
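
    The rejection comes from the Elasticsearch bulk thread pool: its queue (capacity 50 here) is full, so new bulk requests are refused. One way to watch this from outside Spark is the _cat/thread_pool API; the snippet below is only a hedged sketch, reusing the node address configured further down and assuming a pre-7.x cluster where the pool is still named "bulk":

    import scala.io.Source

    // Hypothetical check: show active/queued/rejected counts for the bulk pool.
    // On Elasticsearch 7+ the pool is named "write" instead of "bulk".
    val poolStats = Source
      .fromURL("http://172.16.14.21:9200/_cat/thread_pool/bulk?v&h=name,active,queue,rejected")
      .mkString
    println(poolStats)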
    

    The stack trace does not pinpoint where the write actually fails, so the following settings were added when building the SparkSession:

    import org.apache.spark.SparkConf
    import org.apache.spark.sql.SparkSession

    val masterUrl = "local"
    val appName = "ttyb"
    val sparkConf = new SparkConf()
      .setMaster(masterUrl)
      .setAppName(appName)
      .set("es.nodes", "172.16.14.21")
      .set("es.port", "9200")
      // send one entry per bulk request to work around the "Bailing out..." error
      .set("es.batch.size.entries", "1")
      // retry failed bulk writes indefinitely instead of giving up
      .set("es.batch.write.retry.count", "-1")
      // wait time between bulk write retries
      .set("es.batch.write.retry.wait", "100")
    val spark = SparkSession.builder().config(sparkConf).getOrCreate()
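
    With these settings in place the DataFrame is written through the elasticsearch-spark connector's saveToEs. The snippet below is only a minimal sketch, assuming the elasticsearch-spark (es-hadoop) connector is on the classpath; the sample rows and the resource name "ttyb/docs" are illustrative, not from the original post.

    import org.elasticsearch.spark.sql._

    // Hypothetical DataFrame standing in for the real data being indexed.
    val df = spark.createDataFrame(Seq(
      (1, "foo"),
      (2, "bar")
    )).toDF("id", "value")

    // saveToEs picks up the es.* options set on the SparkConf above.
    df.saveToEs("ttyb/docs")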
    

    A new error then appeared:

    org.elasticsearch.hadoop.rest.EsHadoopInvalidRequest: 
    null
    

    The message indicated:

    ES was under too heavy a load and needed to be repaired.

    I was about to restart ES when I found that the machine's disk was full; that turned out to be the root cause.
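
    Since the culprit was a full disk, a quick way to confirm this from the outside is Elasticsearch's _cat/allocation API, which lists shard count and disk usage per data node. Again a hedged sketch, reusing the node address from the SparkConf above:

    import scala.io.Source

    // Hypothetical check: the disk.used / disk.avail columns make a full disk
    // (and the resulting bulk rejections) easy to spot.
    val allocation = Source
      .fromURL("http://172.16.14.21:9200/_cat/allocation?v")
      .mkString
    println(allocation)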

  • Original article: https://www.cnblogs.com/TTyb/p/8507241.html