Environment: Windows 10 + Hadoop 2.7.1 locally; server: Hadoop 2.6.0 + Spark 2.2.1 + Hive 1.1.0
Code:
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class Dwd_rec_kqmj {

    public static void main(String[] args) {
        // Local run on Windows; enableHiveSupport() makes Spark SQL use the Hive metastore.
        SparkSession spark = SparkSession
                .builder()
                .master("local[*]")
                .appName("Dwd_rec_kqmj")
                .enableHiveSupport()
                .config("spark.some.config.option", "some-value")
                .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
                .getOrCreate();

        // Query the Hive metastore; this is the call that triggers the error below.
        Dataset<Row> tables = spark.sql("show tables");
        tables.show();

        spark.stop();
    }
}
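As an optional sanity check (not part of the original code), the same session can also query the metastore through the Catalog API; if Hive support or the local scratch directory is misconfigured, these calls typically fail in the same way as the "show tables" query:

// Optional check, placed before spark.stop(); both calls go through the
// Hive metastore when enableHiveSupport() is active.
spark.catalog().listDatabases().show();
spark.catalog().listTables("default").show();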
Problem:
Error while running command to get file permissions :
java.io.IOException: (null) entry in command string: null ls -F E:\tmp\hive

Judging from the error message, this looks like a file permission problem: Spark (via Hadoop) tries to list E:\tmp\hive on the local machine and fails.
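The "(null) entry in command string: null" part also means Hadoop's Shell utilities could not resolve the path to winutils.exe. Besides the steps below, a common workaround is to point hadoop.home.dir at the local Hadoop directory from code, before the SparkSession is built; the path here is a placeholder assumption, not taken from this setup:

// Add at the very top of main(), before the SparkSession is created.
// "E:\\hadoop-2.7.1" is a placeholder -- use the directory whose bin\ contains winutils.exe.
System.setProperty("hadoop.home.dir", "E:\\hadoop-2.7.1");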
Solution:
1. Copy winutils.exe and hadoop.dll into C:\Windows\System32 and into the local %HADOOP_HOME%\bin.
2. Create the tmp\hive directory on the drive reported in the error (E:\tmp\hive here; Hive's default scratch dir /tmp/hive resolves against the drive the application runs from, so it is c:\tmp\hive only when running from C:).
3. From %HADOOP_HOME%\bin, grant full permissions on it: winutils.exe chmod -R 777 E:\tmp\hive. A quick way to verify the result is sketched after this list.
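To confirm the permission change took effect before re-running the job, a rough writability probe in plain Java can help; the class name and the E:\tmp\hive path are assumptions taken from the error message above:

import java.io.File;

public class CheckHiveScratchDir {
    public static void main(String[] args) throws Exception {
        // Drive letter follows the one in the error message; adjust if needed.
        File scratch = new File("E:\\tmp\\hive");
        System.out.println("exists: " + scratch.exists());
        // Creating (and deleting) a real file is a more reliable check than
        // File.canWrite() for Windows ACLs; an IOException here means the
        // directory is still not writable.
        File probe = File.createTempFile("probe", ".tmp", scratch);
        System.out.println("writable: " + probe.delete());
    }
}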