Errors encountered when running Spark code that inserts data into an HBase table
1. Missing hadoop-mapreduce-client-core-2.5.1.jar
Error: java.lang.ClassNotFoundException: org.apache.hadoop.mapred.JobConf
2. Missing hbase-protocol-1.3.1.jar
Error: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingInterface
3. Missing metrics-core-2.2.0.jar
The terminal shows this error:
com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: com/yammer/metrics/core/Gauge
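For context, the three missing classes above are all pulled in by a typical RDD-to-HBase write. A minimal sketch of such a job is below (the table name "demo_table", the column family "cf", and the sample data are assumptions, not from the original job); it uses the classic JobConf / TableOutputFormat path, which is where the ClassNotFoundException for org.apache.hadoop.mapred.JobConf surfaces first:

```scala
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Put
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapred.TableOutputFormat
import org.apache.hadoop.hbase.util.Bytes
import org.apache.hadoop.mapred.JobConf
import org.apache.spark.{SparkConf, SparkContext}

object HbaseWriteSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("HbaseWriteSketch"))

    // JobConf lives in hadoop-mapreduce-client-core; without that jar this
    // line is the first to fail with ClassNotFoundException.
    val jobConf = new JobConf(HBaseConfiguration.create())
    jobConf.setOutputFormat(classOf[TableOutputFormat])
    jobConf.set(TableOutputFormat.OUTPUT_TABLE, "demo_table") // hypothetical table

    sc.parallelize(Seq(("row1", "value1")))
      .map { case (rowKey, value) =>
        val put = new Put(Bytes.toBytes(rowKey))
        // "cf"/"col" are placeholder column family / qualifier names
        put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("col"), Bytes.toBytes(value))
        (new ImmutableBytesWritable, put)
      }
      .saveAsHadoopDataset(jobConf) // connects to HBase; this RPC path needs
                                    // hbase-protocol (MasterProtos) and metrics-core

    sc.stop()
  }
}
```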
4. Required jars
hadoop-mapreduce-client-core-2.5.1.jar      // org.apache.hadoop.mapred.JobConf
hbase-client-1.3.1.jar
hbase-common-1.3.1.jar
hbase-server-1.3.1.jar                      // for HBase operations
hbase-protocol-1.3.1.jar                    // org.apache.hadoop.hbase.protobuf.generated.MasterProtos
kafka-clients-1.0.0.jar
kafka_2.11-1.0.0.jar                        // for Kafka operations
spark-core_2.11-2.1.1.jar
spark-streaming_2.11-2.1.1.jar
spark-streaming-kafka-0-10_2.11-2.1.1.jar   // for Spark Streaming operations
zkclient-0.10.jar
zookeeper-3.4.10.jar                        // for ZooKeeper operations
FlumeKafkaToHbase.jar                       // custom application jar
5. Submit command
/home/spark/bin/spark-submit --master local[2] --driver-class-path /usr/local/hbase/lib/metrics-core-2.2.0.jar --class com..FlumeKafkaToHbase --executor-memory 4G --total-executor-cores 2 FlumeKafkaToHbase.jar
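As an alternative to copying the jars from step 4 into Spark's classpath directories, they can be shipped with the job via --jars. This is a sketch only; the /path/to/libs directory is an assumption, and the truncated class name is kept as it appears above:

```shell
# Hypothetical variant of the command above: pass dependency jars explicitly
# with --jars (comma-separated) instead of pre-installing them.
/home/spark/bin/spark-submit \
  --master local[2] \
  --driver-class-path /usr/local/hbase/lib/metrics-core-2.2.0.jar \
  --jars /path/to/libs/hbase-client-1.3.1.jar,/path/to/libs/hbase-protocol-1.3.1.jar \
  --class com..FlumeKafkaToHbase \
  --executor-memory 4G --total-executor-cores 2 \
  FlumeKafkaToHbase.jar
```

Note that metrics-core-2.2.0.jar still goes on --driver-class-path, since the HBase client loads com.yammer.metrics classes on the driver side.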