1. If, after loading a file into a Hive table, the query results come back as NULL (see the figure below)

This happens because no column delimiter was specified when the table was created; Hive's default field delimiter is Ctrl+A (\u0001).
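A minimal sketch of a table definition that sets the delimiter explicitly (the table name, columns, and the comma separator here are assumptions; use whatever separator your file actually contains):

hive> CREATE TABLE student (
          id INT,
          name STRING
      )
      ROW FORMAT DELIMITED
      FIELDS TERMINATED BY ','
      LINES TERMINATED BY '\n'
      STORED AS TEXTFILE;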
2. When I paste several lines of SQL into the Hive CLI, the following happens and the paste fails


This is because my SQL contained tab indentation (the Hive CLI intercepts Tab for autocompletion); removing the tabs fixes it.
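If you would rather not fix the indentation by hand, one option is to strip the tabs with sed, or to skip pasting entirely and run the script from a file (query.sql is a hypothetical filename; the \t escape assumes GNU sed):

sed -i 's/\t/    /g' query.sql    # replace each tab with four spaces
hive -f query.sql                 # run the script from a file instead of pasting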
3. When I used LOAD to load data, it threw a FAILED error:
hive> LOAD DATA LOCAL INPATH '/home/node4/Desktop/sutdent.txt' OVERWRITE INTO TABLE student_3;
FAILED: SemanticException Unable to load data to destination table. Error: The file that you are trying to load does not match the file format of the destination table.
This is because a table stored as SequenceFile cannot be populated with LOAD from a plain text file: LOAD only moves files into place, it does not convert their format.
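A common workaround is to stage the text file in a TEXTFILE table and let Hive convert it on insert; a sketch, assuming student_3 is the SequenceFile table from above and student_text is a hypothetical staging table whose columns and delimiter match the file:

hive> CREATE TABLE student_text (id INT, name STRING)
      ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
      STORED AS TEXTFILE;
hive> LOAD DATA LOCAL INPATH '/home/node4/Desktop/sutdent.txt' OVERWRITE INTO TABLE student_text;
hive> INSERT OVERWRITE TABLE student_3 SELECT * FROM student_text;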
4. If, when loading data from HDFS into a Hive table, you get the following error:
FAILED: SemanticException Line 1:17 Invalid path ''hdfs://Master:9000/user/test/qar_test'': No files matching path hdfs://Master:9000/user/test/qar_test
This means the path is wrong; you can check the warehouse location in MySQL, which hosts the Hive metastore.
Query the DB_LOCATION_URI column of the DBS table in the metastore database: select DB_LOCATION_URI from DBS;
+----------------------------------------+
| DB_LOCATION_URI                        |
+----------------------------------------+
| hdfs://10.1.51.200:9000/hive/warehouse |
+----------------------------------------+
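It also helps to confirm that the source path really exists and contains files before running LOAD; for the path from the error message above:

hdfs dfs -ls hdfs://Master:9000/user/test/qar_test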
5. When connecting to Hive from Eclipse, I kept getting a connection timeout; this was because the firewall on my server had not been turned off.
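A sketch of turning the firewall off (the right command depends on the distribution; firewalld applies to CentOS 7+ and other systemd systems, iptables to older releases):

systemctl stop firewalld    # CentOS 7+ / systemd
service iptables stop       # older CentOS / RHEL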
6. For the following error, you need to configure Hadoop's core-site.xml file:
Caused by: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): Unauthorized connection for super-user: root from IP 192.168.177.124
Add the following configuration:
<property>
  <name>hadoop.proxyuser.root.groups</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>
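After editing core-site.xml, the NameNode needs to pick up the new proxy-user settings; restarting Hadoop works, or, without a restart:

hdfs dfsadmin -refreshSuperUserGroupsConfiguration    # reload proxy-user (superuser group) settings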