  • Spark problem notes

    Serialization problem when using Spark from multiple threads (temporary notes)

     Exception in thread "Thread-28" org.apache.spark.SparkException: Task not serializable
        at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:166)
        at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:158)
        at org.apache.spark.SparkContext.clean(SparkContext.scala:1242)
        at org.apache.spark.rdd.RDD.flatMap(RDD.scala:277)
        at org.apache.spark.api.java.JavaRDDLike$class.flatMap(JavaRDDLike.scala:109)
        at org.apache.spark.api.java.JavaRDD.flatMap(JavaRDD.scala:32)
        at com.main.java.MyThread.run(MyThread.java:30)
        at java.lang.Thread.run(Thread.java:745)
    Caused by: java.io.NotSerializableException: org.apache.spark.api.java.JavaSparkContext
        at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1184)
        at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
        at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
        at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
        at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
        at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
        at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
        at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
        at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
        at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
        at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
        at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
        at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
        at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
        at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:42)
        at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:73)
        at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:164)
        ... 7 more
    hadoop@Node4:/usr/local/myjar$

    Fix: it turned out that Node3's /etc/hosts had an extra line: 127.0.1.1 Node3

    Removing that line resolved the problem.
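
    Separately from the /etc/hosts note above, the stack trace itself points to the classic
    "Task not serializable" pattern: the function passed to flatMap inside MyThread.run pulls a
    non-serializable JavaSparkContext into the serialized closure, most likely via a reference to
    the enclosing object. Below is a minimal sketch of that pattern and one way to avoid it; the
    class, field, and variable names are assumptions for illustration, not the original code, and
    the flatMap signature shown is the Spark 2.x+ one (returning an Iterator).

    import java.util.Arrays;

    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    // Hypothetical reconstruction -- names and structure are assumed, not taken from the post.
    public class MyThread implements Runnable {
        // The JavaSparkContext is only ever used on the driver; it must never end up
        // inside a closure that Spark serializes and ships to executors.
        private final JavaSparkContext sc;
        private final JavaRDD<String> lines;

        public MyThread(JavaSparkContext sc, JavaRDD<String> lines) {
            this.sc = sc;
            this.lines = lines;
        }

        @Override
        public void run() {
            // The lambda references only its own parameter, so it does not capture
            // `this` (and therefore not `sc`). An anonymous inner Function class
            // defined here would capture the enclosing MyThread instance and raise
            // exactly the NotSerializableException shown in the trace above.
            JavaRDD<String> words =
                    lines.flatMap(line -> Arrays.asList(line.split(" ")).iterator());
            System.out.println("words: " + words.count());
        }
    }

    In short, either keep driver-side objects such as the JavaSparkContext out of the functions
    passed to RDD operations, or mark such fields transient and make the enclosing class
    Serializable.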

  • Original post: https://www.cnblogs.com/gnivor/p/4404499.html