When loading data from a source table into the target table with insert into ... from, the following error is reported:
insert into uniaction1 values('136.206.220.16','1542011089896','www.mi.com','Buy','2018-11-12','海南','67084475796635524');

ERROR : FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask.
Exception when loading 1 in table uniaction1 with loadPath=hdfs://host:8020/warehouse/tablespace/managed/hive/db1.db/action1
Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask.
Exception when loading 1 in table uniaction1 with loadPath=hdfs://host:8020/warehouse/tablespace/managed/hive/db1.db/action1 (state=08S01,code=1)
Target table structure:
0: jdbc:hive2://node3:2181,node2:2181,node1:2> desc uniaction1;
+--------------------------+------------+----------+
|         col_name         | data_type  | comment  |
+--------------------------+------------+----------+
| ipaddress                | string     |          |
| thetimestamp             | string     |          |
| web                      | string     |          |
| operator                 | string     |          |
| thedate                  | string     |          |
| prov                     | string     |          |
|                          | NULL       | NULL     |
| # Partition Information  | NULL       | NULL     |
| # col_name               | data_type  | comment  |
| thedate                  | string     |          |
| prov                     | string     |          |
| userid                   | string     |          |
+--------------------------+------------+----------+
13 rows selected (0.173 seconds)
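For reference, the insert-from-select form that triggers the same failure would look roughly like the sketch below. The source table name action1 is only an assumption inferred from the loadPath in the error message, and the dynamic-partition settings are the usual prerequisites, not something shown in the original post:

-- Hypothetical reproduction; action1 as the source table is inferred
-- from the loadPath in the error above.
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;

-- The trailing SELECT columns (thedate, prov, userid) populate the dynamic
-- partition spec; a Chinese value such as '海南' in prov becomes part of
-- the partition name.
INSERT INTO TABLE uniaction1 PARTITION (thedate, prov, userid)
SELECT ipaddress, thetimestamp, web, operator, thedate, prov, userid
FROM action1;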
Yes, the problem here is that columns containing Chinese data are used as automatic (dynamic) partition columns, and the partitions cannot be created automatically. I could not say exactly why at first,
but others have managed to solve it:
Many of the MySQL encoding-change recipes found online are unreliable; they cannot guarantee that the encoding of every column in every table is actually corrected.
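Before changing anything, it is safer to check which metastore columns are actually mis-encoded. A minimal audit query, assuming the metastore lives in a MySQL schema named hive (adjust TABLE_SCHEMA to your installation):

SELECT TABLE_NAME, COLUMN_NAME, CHARACTER_SET_NAME
FROM information_schema.COLUMNS
WHERE TABLE_SCHEMA = 'hive'
  AND CHARACTER_SET_NAME IS NOT NULL
ORDER BY TABLE_NAME, COLUMN_NAME;
-- Any row still reporting latin1 is a candidate for the fix described next.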
The log makes the reason this Chinese-valued partition cannot be created quite clear: a Hive metastore exception. So the search turned to Hive's metadata, and the investigation found that the PART_NAME column of the PARTITIONS table in the metastore was still latin1-encoded. After changing it to utf8, Chinese partitions could be created. As for how exactly to make the change, search online (the link below discusses it); a rough sketch also follows.
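A minimal sketch of that change, assuming a stock Hive metastore schema on MySQL, where PART_NAME is VARCHAR(767). Confirm the actual definition with SHOW CREATE TABLE PARTITIONS and back up the metastore database before altering anything:

-- Run inside the Hive metastore database (commonly named hive or metastore).
-- VARCHAR(767) matches the stock schema; keep whatever length yours uses.
ALTER TABLE PARTITIONS MODIFY COLUMN PART_NAME VARCHAR(767) CHARACTER SET utf8;

-- If partition creation still fails, the per-key value table may need the
-- same treatment (an assumption, not mentioned in the original post):
ALTER TABLE PARTITION_KEY_VALS MODIFY COLUMN PART_KEY_VAL VARCHAR(256) CHARACTER SET utf8;

After the change, rerunning the failing insert should create the partition directory, e.g. thedate=2018-11-12/prov=海南/userid=... under the table's warehouse path.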
https://www.oschina.net/question/2909997_2289170