In Hive, the arguments to concat_ws() must be of type string or array<string>; otherwise the query fails with an error like the following:
hive> select concat_ws(',',unix_timestamp('2012-12-07 13:01:03'),unix_timestamp('2012-12-07 15:01:03'));
FAILED: SemanticException [Error 10016]: Line 1:21 Argument type mismatch ''2012-12-07 13:01:03'':
Argument 2 of function CONCAT_WS must be "string or array<string>", but "bigint" was found.
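In Hive, the usual way around this error is to cast each non-string argument to string explicitly; a minimal sketch of the same query with explicit casts (not re-run here):

hive> select concat_ws(',',
        cast(unix_timestamp('2012-12-07 13:01:03') as string),
        cast(unix_timestamp('2012-12-07 15:01:03') as string));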
In Spark SQL, however, the arguments to concat_ws() do not have to be strings; non-string types such as int or bigint are also accepted (unix_timestamp() returns a bigint):
spark-sql> select concat_ws(',',unix_timestamp('2012-12-07 13:01:03'),unix_timestamp('2012-12-07 15:01:03'));
Output: 1354856463,1354863663
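This works because Spark SQL's implicit type-coercion rules cast the non-string arguments to string before concatenation. The same coercion should apply to plain int literals as well; a hypothetical sketch (not verified here, output omitted):

spark-sql> select concat_ws('-', 2012, 12, 7);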