  • Failed to rename HdfsNamedFileStatus

    Error message:

    21/03/29 14:05:08 WARN TaskSetManager: Lost task 53.0 in stage 6.0 (TID 580, test-bdp06, executor 7): org.apache.spark.SparkException: Task failed while writing rows
        at org.apache.spark.internal.io.SparkHadoopWriter$.org$apache$spark$internal$io$SparkHadoopWriter$$executeTask(SparkHadoopWriter.scala:155)
        at org.apache.spark.internal.io.SparkHadoopWriter$$anonfun$3.apply(SparkHadoopWriter.scala:83)
        at org.apache.spark.internal.io.SparkHadoopWriter$$anonfun$3.apply(SparkHadoopWriter.scala:78)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
        at org.apache.spark.scheduler.Task.run(Task.scala:109)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
    Caused by: java.io.IOException: Failed to rename HdfsNamedFileStatus{path=hdfs://mycluster/data/autohome-log/1614524400000.tmp/_temporary/0/_temporary/attempt_20210329140504_0016_m_000053_0/5; isDirectory=false; length=2859909; replication=3; blocksize=134217728; modification_time=1616997908781; access_time=1616997908596; owner=root; group=hdfs; permission=rw-r--r--; isSymlink=false; hasAcl=false; isEncrypted=false; isErasureCoded=false} to hdfs://mycluster/data/autohome-log/1614524400000.tmp/5
        at org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.mergePaths(FileOutputCommitter.java:473)
        at org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.mergePaths(FileOutputCommitter.java:486)
        at org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.commitTask(FileOutputCommitter.java:597)
        at org.apache.hadoop.mapred.FileOutputCommitter.commitTask(FileOutputCommitter.java:172)
        at org.apache.hadoop.mapred.OutputCommitter.commitTask(OutputCommitter.java:343)
        at org.apache.spark.mapred.SparkHadoopMapRedUtil$.performCommit$1(SparkHadoopMapRedUtil.scala:50)
        at org.apache.spark.mapred.SparkHadoopMapRedUtil$.commitTask(SparkHadoopMapRedUtil.scala:77)
        at org.apache.spark.internal.io.HadoopMapReduceCommitProtocol.commitTask(HadoopMapReduceCommitProtocol.scala:225)
        at org.apache.spark.internal.io.SparkHadoopWriter$$anonfun$4.apply(SparkHadoopWriter.scala:138)
        at org.apache.spark.internal.io.SparkHadoopWriter$$anonfun$4.apply(SparkHadoopWriter.scala:127)
        at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1415)
        at org.apache.spark.internal.io.SparkHadoopWriter$.org$apache$spark$internal$io$SparkHadoopWriter$$executeTask(SparkHadoopWriter.scala:139)
        ... 8 more

    Offending code:

    JavaPairRDD<String, String> domainLogRdd = logRdd.mapToPair(log -> {
        // BUG: "|" is passed to split() unescaped, so it is treated as a regex metacharacter
        return new Tuple2<>(log.split("|")[1], log);
    });

    JavaPairRDD<String, String> domainRdd = domainLogRdd.reduceByKey(
            (log1, log2) -> log1.concat("\n").concat(log2));

    domainLogRdd.saveAsHadoopFile(path, String.class, String.class, RDDMultipleTextOutputFormat.class);

    Solution:

    The delimiter was not escaped: String.split treats its argument as a regular expression, so the pipe must be escaped. Changing log.split("|") to log.split("\\|") (the double backslash is needed to write the regex escape inside a Java string literal) fixes the error.
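
    For reference, a minimal sketch of the corrected pairing step, using the same variable names as the snippet above; logRdd, path and RDDMultipleTextOutputFormat are assumed to exist as in the original code:

    // Escape the pipe so split() treats it as a literal character rather than regex alternation.
    JavaPairRDD<String, String> domainLogRdd = logRdd.mapToPair(log ->
            new Tuple2<>(log.split("\\|")[1], log));

    // Equivalent, and arguably clearer: let Pattern.quote build the escaped pattern.
    // JavaPairRDD<String, String> domainLogRdd = logRdd.mapToPair(log ->
    //         new Tuple2<>(log.split(java.util.regex.Pattern.quote("|"))[1], log));

    domainLogRdd.saveAsHadoopFile(path, String.class, String.class, RDDMultipleTextOutputFormat.class);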

    Root cause:

    String.split takes a regular expression, and an unescaped "|" is the regex alternation metacharacter. The pattern "|" matches the empty string at every position, so log.split("|") splits the line into single characters and [1] returns the second character of the line instead of the intended field. Every record therefore ends up with a one-character key (note that the failing target path above ends in /5), and since RDDMultipleTextOutputFormat presumably derives output file names from the key, many tasks produce files with the same single-character name. The resulting collisions during task commit are the most likely reason the rename in FileOutputCommitter.mergePaths fails.
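
    A quick standalone check of the two split calls (plain Java; the sample line is hypothetical):

    import java.util.Arrays;

    public class SplitDemo {
        public static void main(String[] args) {
            // Hypothetical log line: timestamp | domain | request
            String log = "1617000000000|www.autohome.com.cn|GET /index";

            // Unescaped "|" is regex alternation and matches the empty string at every
            // position, so (on Java 8+) the line is split into single characters:
            System.out.println(Arrays.toString(log.split("|")));
            // -> [1, 6, 1, 7, ...]  and log.split("|")[1] is the second character, "6"

            // Escaping the pipe splits on the literal delimiter:
            System.out.println(Arrays.toString(log.split("\\|")));
            // -> [1617000000000, www.autohome.com.cn, GET /index]
        }
    }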

  • Original post: https://www.cnblogs.com/zcqkk/p/14593875.html