Work has been busy and I hadn't touched Hive in a long time. Picking it back up to relearn it, I hit the following error while creating a table:
Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException: Column length too big for column 'TYPE_NAME' (max = 21845)

To prepare for importing CSV data later, the table was created with:
create table t1 (
  id    string,
  name  string,
  hobby string,
  age   string
)
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.OpenCSVSerde'
WITH SERDEPROPERTIES (
  "separatorChar" = ",",
  "quoteChar"     = "'",
  "escapeChar"    = "\\"
)
TBLPROPERTIES ("skip.header.line.count" = "1");
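Once the table exists, the CSV can be loaded with a plain LOAD DATA statement. A minimal sketch (the file path /tmp/data.csv is hypothetical; point it at your actual CSV):

```sql
-- Load a local CSV file into t1; the OpenCSVSerDe handles the
-- separator/quote/escape characters declared on the table, and
-- skip.header.line.count drops the header row.
LOAD DATA LOCAL INPATH '/tmp/data.csv' INTO TABLE t1;
```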
Solution: the error kept pointing at a data type, yet every column in this table is a plain string. The real culprit is the character set of the MySQL database that Hive uses to store its metastore. When that database defaults to utf8, MySQL reserves up to 3 bytes per character, so a VARCHAR column can hold at most 65535 / 3 = 21845 characters; the metastore schema declares some columns (such as TYPE_NAME) wider than that, which fails under utf8 but fits under single-byte latin1.
The fix (run in MySQL against the metastore database):
alter database hive character set latin1;
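Before altering anything, it's worth confirming the current charset of the metastore database. The sketch below assumes the metastore database is named `hive` (it matches the statement above, but the actual name depends on your javax.jdo.option.ConnectionURL setting):

```sql
-- Check the current default character set of the metastore database
-- ('hive' is an assumption; substitute your metastore database name).
SELECT default_character_set_name
FROM information_schema.schemata
WHERE schema_name = 'hive';

-- If it reports utf8/utf8mb4, switch the default to latin1:
ALTER DATABASE hive CHARACTER SET latin1;
```

Note that ALTER DATABASE only changes the default for tables created afterwards; metastore tables that already exist keep their original character set, so run this before Hive (via DataNucleus) auto-creates the affected table, or re-initialize the metastore schema afterwards.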

