  • Bugs encountered installing Spark on CDH

    Starting Spark after installation:

    [zdwy@master spark]$ sbin/start-all.sh
    starting org.apache.spark.deploy.master.Master, logging to /home/zdwy/cdh5.9/spark/logs/spark-zdwy-org.apache.spark.deploy.master.Master-1-master.out
    failed to launch org.apache.spark.deploy.master.Master:
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 7 more
    full log in /home/zdwy/cdh5.9/spark/logs/spark-zdwy-org.apache.spark.deploy.master.Master-1-master.out
    slave1: starting org.apache.spark.deploy.worker.Worker, logging to /home/zdwy/cdh5.9/spark/logs/spark-zdwy-org.apache.spark.deploy.worker.Worker-1-slave1.out
    slave2: starting org.apache.spark.deploy.worker.Worker, logging to /home/zdwy/cdh5.9/spark/logs/spark-zdwy-org.apache.spark.deploy.worker.Worker-1-slave2.out
    slave1: failed to launch org.apache.spark.deploy.worker.Worker:
    slave1: at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    slave1: ... 6 more
    slave1: full log in /home/zdwy/cdh5.9/spark/logs/spark-zdwy-org.apache.spark.deploy.worker.Worker-1-slave1.out
    slave2: failed to launch org.apache.spark.deploy.worker.Worker:
    slave2: at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    slave2: ... 6 more
    slave2: full log in /home/zdwy/cdh5.9/spark/logs/spark-zdwy-org.apache.spark.deploy.worker.Worker-1-slave2.out
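The "... 7 more" lines in the console output hide the root cause; the full stack trace is in the .out file that the startup script names. A minimal way to pull out the first root-cause exception (log path copied from the output above; adjust for your host):

```shell
# Path taken from the startup output above; change it for your machine.
LOG=/home/zdwy/cdh5.9/spark/logs/spark-zdwy-org.apache.spark.deploy.master.Master-1-master.out

if [ -f "$LOG" ]; then
  # Show the first "Caused by" line plus a few lines of context,
  # which names the class that failed to load.
  grep -m1 -A5 'Caused by' "$LOG"
else
  echo "log not found: $LOG"
fi
```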

    Cause: the jar packages that Hadoop and Spark use to communicate (the Jackson libraries) are missing from the classpath.

    Solution: download three jars, jackson-core-xxx.jar, jackson-annotations-xxx.jar, and jackson-databind-xxx.jar, from http://mvnrepository.com/artifact/com.fasterxml.jackson.core/

    After downloading, place the jars in the hadoop/share/hadoop/common/ directory, then restart Spark.
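The steps above can be sketched as follows. This is a minimal sketch, not a definitive recipe: the Jackson version (2.6.5 here) and the Hadoop path are assumptions, so pick the version that matches your Spark build and substitute your own paths. The script only prints the download commands so you can check the coordinates before running them.

```shell
# Assumptions: Jackson version and install paths; adjust both to your cluster.
JACKSON_VERSION=2.6.5
HADOOP_COMMON=/home/zdwy/cdh5.9/hadoop/share/hadoop/common
BASE_URL=https://repo1.maven.org/maven2/com/fasterxml/jackson/core

# Print one wget command per jar (core, annotations, databind);
# run them by hand, or pipe the output through `sh`, once they look right.
for artifact in jackson-core jackson-annotations jackson-databind; do
  echo "wget ${BASE_URL}/${artifact}/${JACKSON_VERSION}/${artifact}-${JACKSON_VERSION}.jar -P ${HADOOP_COMMON}"
done
```

After the jars land in hadoop/share/hadoop/common/, restart the cluster with sbin/stop-all.sh followed by sbin/start-all.sh so the master and workers pick up the new classpath.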

  • Original article: https://www.cnblogs.com/zhaojinyan/p/9599521.html