Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream
This means Spark cannot find the Hadoop classes on its classpath. You also need to configure the following in conf/spark-env.sh:
export SPARK_DIST_CLASSPATH=$(hadoop classpath)
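The configuration step above can be sketched as a small shell snippet that appends the export line to conf/spark-env.sh only if it is not already there. The scratch-directory fallback for SPARK_HOME is just so the sketch runs anywhere; point SPARK_HOME at your real installation (e.g. /opt/app/spark-1.6.0-cdh5.13.2).

```shell
# Sketch: idempotently add SPARK_DIST_CLASSPATH to spark-env.sh.
# The mktemp fallback is for demo purposes only; set SPARK_HOME to your install.
SPARK_HOME="${SPARK_HOME:-$(mktemp -d)}"
mkdir -p "$SPARK_HOME/conf"
ENV_FILE="$SPARK_HOME/conf/spark-env.sh"
LINE='export SPARK_DIST_CLASSPATH=$(hadoop classpath)'

touch "$ENV_FILE"
# Append the export only if it is not already present (safe to re-run).
grep -qF "$LINE" "$ENV_FILE" || echo "$LINE" >> "$ENV_FILE"
grep -qF "$LINE" "$ENV_FILE" || echo "$LINE" >> "$ENV_FILE"   # second run adds nothing
```

Running the snippet twice leaves exactly one copy of the export in the file, so it is safe to keep in a provisioning script.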
Spark startup fails with java.lang.NoClassDefFoundError: com/fasterxml/jackson/databind/Module
Use Maven to download the following dependencies: jackson-databind-xxx.jar, jackson-core-xxx.jar, jackson-annotations-xxx.jar,
and place them under $HADOOP_HOME/share/hadoop/common:
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>2.4.4</version>
</dependency>
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-core</artifactId>
    <version>2.4.4</version>
</dependency>
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-annotations</artifactId>
    <version>2.4.4</version>
</dependency>
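If you would rather script the download than run a Maven build, the three jars can be pulled straight from Maven Central's standard repository layout. This is a sketch: the HADOOP_HOME default below is an example path, and the curl line is left commented out so nothing is fetched until you opt in.

```shell
# Sketch: construct Maven Central URLs for the Jackson 2.4.4 jars from the POM
# coordinates above, then (optionally) download them into Hadoop's common dir.
VERSION=2.4.4
BASE=https://repo1.maven.org/maven2/com/fasterxml/jackson/core
DEST="${HADOOP_HOME:-/opt/app/hadoop}/share/hadoop/common"   # example path

for artifact in jackson-databind jackson-core jackson-annotations; do
  url="$BASE/$artifact/$VERSION/$artifact-$VERSION.jar"
  echo "fetch $url -> $DEST"
  # curl -fL -o "$DEST/$artifact-$VERSION.jar" "$url"   # uncomment to download
done
```

The URL pattern (groupId with dots replaced by slashes, then artifactId, version, and `artifact-version.jar`) is Maven Central's standard layout, so the same loop works for any coordinates.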
Spark startup reports java.lang.NoClassDefFoundError: parquet/hadoop/ParquetOutputCommitter
The fix is to download the jar below, place it on Spark's startup classpath, and then start Spark.
<dependency>
    <groupId>com.twitter</groupId>
    <artifactId>parquet-hadoop</artifactId>
    <version>1.4.3</version>
</dependency>
Place the jar above into /opt/app/spark-1.6.0-cdh5.13.2/sbin. If sbin alone does not work, put an identical copy under the lib directory as well.
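The copy step can be scripted as below. The scratch SPARK_HOME and the empty placeholder jar exist only so the sketch is self-contained; with a real install, set SPARK_HOME=/opt/app/spark-1.6.0-cdh5.13.2 and use the parquet-hadoop jar actually downloaded from Maven.

```shell
# Sketch: copy parquet-hadoop-1.4.3.jar into both sbin and lib of the Spark
# install. The mktemp fallbacks and the empty jar are demo stand-ins only.
SPARK_HOME="${SPARK_HOME:-$(mktemp -d)}"
mkdir -p "$SPARK_HOME/sbin" "$SPARK_HOME/lib"

WORK=$(mktemp -d)
JAR="$WORK/parquet-hadoop-1.4.3.jar"
: > "$JAR"   # placeholder; in practice this is the jar fetched via Maven

# Put a copy in each directory Spark may scan at startup.
for d in sbin lib; do
  cp "$JAR" "$SPARK_HOME/$d/"
done
```

Copying to both directories mirrors the note above: if the jar in sbin is not picked up at startup, the duplicate in lib covers it.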