Problem description:
Container killed by the ApplicationMaster. Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
21/10/10 08:51:52 INFO mapreduce.Job:  map 100% reduce 0%
21/10/10 08:51:53 INFO mapreduce.Job: Job job_1633826412371_0001 failed with state FAILED due to: Task failed task_1633826412371_0001_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0
21/10/10 08:51:54 INFO mapreduce.Job: Counters: 9
        Job Counters
                Failed map tasks=4
                Launched map tasks=4
                Other local map tasks=3
                Data-local map tasks=1
                Total time spent by all maps in occupied slots (ms)=52317
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=52317
                Total vcore-milliseconds taken by all map tasks=52317
                Total megabyte-milliseconds taken by all map tasks=53572608
21/10/10 08:51:54 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
21/10/10 08:51:54 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 110.2385 seconds (0 bytes/sec)
21/10/10 08:51:54 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
21/10/10 08:51:54 INFO mapreduce.ExportJobBase: Exported 0 records.
21/10/10 08:51:54 ERROR tool.ExportTool: Error during export: Export job failed!
Solution:
① First, check whether the table structure in Hive matches the structure of the target table in MySQL (desc <table name>).
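For example, the two structures can be listed side by side (a minimal sketch; the table names are the ones used in the export command later in this article):

    hive> DESC diyu_resaults;           -- columns and types of the Hive table
    mysql> DESC QX_diyu_results;        -- columns and types of the MySQL target table

The column names, order, and types reported by the two commands should line up one to one.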

If the table structures are consistent, the problem is likely a mismatch between the MySQL table's character set and the encoding used by the export.
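A quick way to check the table's current character set (a sketch, using the MySQL target table from the command below):

    mysql> SHOW CREATE TABLE QX_diyu_results\G   -- the DEFAULT CHARSET clause shows the table's character set

If the DEFAULT CHARSET shown there differs from the encoding the Sqoop connection uses, the export can fail even though the column definitions match.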
② The Sqoop command used to export the data to the MySQL database:
bin/sqoop export \
  --connect "jdbc:mysql://master:3306/mysql?useUnicode=true&characterEncoding=utf-8" \
  --username root \
  --password 000000 \
  --table QX_diyu_results \
  --num-mappers 1 \
  --export-dir /user/hive/warehouse/diyu_resaults \
  --input-fields-terminated-by ","
As the connect string shows, the command uses utf-8 encoding (characterEncoding=utf-8), so the character set of the MySQL table should be changed to utf8 to match.
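For example, the table and its existing character columns can be converted in the MySQL client (a sketch; utf8mb4 would also work if 4-byte characters such as emoji are needed):

    mysql> ALTER TABLE QX_diyu_results CONVERT TO CHARACTER SET utf8;

CONVERT TO CHARACTER SET rewrites both the table's default character set and every existing character column, which is usually what is wanted here.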

After making the change, rerun the command. If the job output ends with a line like "Exported N records." with a non-zero record count (instead of the "Exported 0 records." seen in the failing run above), the data was exported successfully.