Errors encountered when running Spark code that inserts data into an HBase table
1. Missing hadoop-mapreduce-client-core-2.5.1.jar
Error: java.lang.ClassNotFoundException: org.apache.hadoop.mapred.JobConf
2. Missing hbase-protocol-1.3.1.jar
Error: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingInterface
3. Missing metrics-core-2.2.0.jar
The terminal reports this error:
com.google.protobuf.ServiceException: java.lang.NoClassDefFoundError: com/yammer/metrics/core/Gauge
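All three errors above follow the same pattern: a class referenced at runtime is not on the classpath. A quick way to work out which jar to add is to search the jars you have for the class named in the exception. The helper below is our own sketch (not a standard tool); it lists jar entries with python3's zipfile module so it works even without unzip or a JDK installed, and the paths in the usage line are examples.

```shell
# Sketch of a helper that searches a set of jars for the class named in a
# ClassNotFoundException / NoClassDefFoundError.
find_class_jar() {
  # Convert dotted class name to the entry path stored in the jar,
  # e.g. com.yammer.metrics.core.Gauge -> com/yammer/metrics/core/Gauge.class
  class_path="$(printf '%s' "$1" | tr '.' '/').class"
  shift
  for jar in "$@"; do
    # python3 -m zipfile -l prints the entries of a zip/jar archive
    if python3 -m zipfile -l "$jar" 2>/dev/null | grep -q "$class_path"; then
      echo "$jar"
    fi
  done
}

# Example usage (path is an assumption, adjust to your installation):
# find_class_jar com.yammer.metrics.core.Gauge /usr/local/hbase/lib/*.jar
```

Running it against the HBase lib directory for com.yammer.metrics.core.Gauge is how you would discover that metrics-core-2.2.0.jar is the jar to put on the driver classpath.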
4. Required jars
hadoop-mapreduce-client-core-2.5.1.jar     // provides org.apache.hadoop.mapred.JobConf
hbase-client-1.3.1.jar
hbase-common-1.3.1.jar
hbase-server-1.3.1.jar                     // for operating on HBase
hbase-protocol-1.3.1.jar                   // provides org.apache.hadoop.hbase.protobuf.generated.MasterProtos
kafka-clients-1.0.0.jar
kafka_2.11-1.0.0.jar                       // for operating on Kafka
spark-core_2.11-2.1.1.jar
spark-streaming_2.11-2.1.1.jar
spark-streaming-kafka-0-10_2.11-2.1.1.jar  // for Spark Streaming
zkclient-0.10.jar
zookeeper-3.4.10.jar                       // for operating on ZooKeeper
FlumeKafkaToHbase.jar                      // our application jar
5. Submit
/home/spark/bin/spark-submit \
  --master local[2] \
  --driver-class-path /usr/local/hbase/lib/metrics-core-2.2.0.jar \
  --class com..FlumeKafkaToHbase \
  --executor-memory 4G \
  --total-executor-cores 2 \
  FlumeKafkaToHbase.jar
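Instead of copying the dependency jars into Spark's classpath directories, an alternative is to hand them to spark-submit directly with --jars, which ships them to the driver and executors. This is a sketch under the assumption that the jars sit in /opt/jars (an example path, not from the setup above); metrics-core still goes on the driver classpath as in the command above.

```shell
# Alternative sketch: pass dependencies via --jars (comma-separated, no spaces).
# /opt/jars is an assumed location for the jars listed in section 4.
/home/spark/bin/spark-submit \
  --master local[2] \
  --driver-class-path /usr/local/hbase/lib/metrics-core-2.2.0.jar \
  --jars /opt/jars/hbase-client-1.3.1.jar,/opt/jars/hbase-common-1.3.1.jar,/opt/jars/hbase-server-1.3.1.jar,/opt/jars/hbase-protocol-1.3.1.jar,/opt/jars/metrics-core-2.2.0.jar \
  --class com..FlumeKafkaToHbase \
  --executor-memory 4G \
  FlumeKafkaToHbase.jar
```

Note that --total-executor-cores only applies to standalone/Mesos clusters, so it is dropped here for the local[2] master, where the thread count in brackets already bounds parallelism.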