MySQL provides a command for quickly importing data from a local file. Its syntax is as follows:
LOAD DATA [LOW_PRIORITY | CONCURRENT] [LOCAL] INFILE 'file_name'
    [REPLACE | IGNORE]
    INTO TABLE tbl_name
    [CHARACTER SET charset_name]
    [{FIELDS | COLUMNS}
        [TERMINATED BY 'string']
        [[OPTIONALLY] ENCLOSED BY 'char']
        [ESCAPED BY 'char']
    ]
    [LINES
        [STARTING BY 'string']
        [TERMINATED BY 'string']
    ]
    [IGNORE number LINES]
    [(col_name_or_user_var, ...)]
    [SET col_name = expr, ...]
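For instance, a minimal invocation might look like the following (the file path, table, and column names here are hypothetical, chosen only to illustrate the clauses above):

    -- Hypothetical example: load a comma-separated file, skipping its header row.
    LOAD DATA LOCAL INFILE '/tmp/example.csv'
    INTO TABLE example_table
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES
    (col_a, col_b, col_c);

Note that LOCAL makes the client read the file and send it to the server; the local_infile option must be enabled on the server (and permitted by the client) for this to work.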
Experiment: importing a 233 MB data file
The table structure is as follows:
mysql> SHOW COLUMNS FROM load_file_test;
+-------+------------+------+-----+---------+-------+
| Field | Type       | Null | Key | Default | Extra |
+-------+------------+------+-----+---------+-------+
| mid   | int(10)    | YES  |     | NULL    |       |
| time  | int(10)    | YES  |     | NULL    |       |
| type  | tinyint(4) | YES  |     | NULL    |       |
+-------+------------+------+-----+---------+-------+
3 rows in set (0.12 sec)
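For reference, a table with this structure could be created with something like the statement below (a sketch only; the original CREATE TABLE statement, storage engine, and character set are not shown in this post):

    CREATE TABLE load_file_test (
        mid  INT(10)    DEFAULT NULL,  -- id field
        time INT(10)    DEFAULT NULL,  -- unix timestamp
        type TINYINT(4) DEFAULT NULL
    );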
The data file format:
1450025|2|1343145600
1586865|1|1343145600
2557075|2|1343145600
2663240|2|1343145600
3787375|2|1343145600
4293640|1|1343145600
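As an aside, a pipe-delimited file like this can also be produced by MySQL itself with SELECT ... INTO OUTFILE, using the same delimiter options in reverse. The query below is only an illustration of that, not how this particular test file was generated:

    -- Export the table back out in the same pipe-delimited format.
    SELECT mid, type, time
      INTO OUTFILE '/tmp/export.txt'
      FIELDS TERMINATED BY '|'
      LINES TERMINATED BY '\n'
    FROM load_file_test;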
Execution result:
mysql> LOAD DATA LOCAL INFILE '/usr/local/wwwroot/texas/fansPrizeData/2012-12' INTO TABLE load_file_test FIELDS TERMINATED BY '|' LINES TERMINATED BY '\n' (mid,type,time);
Query OK, 11116864 rows affected (2 min 25.11 sec)
Records: 11116864 Deleted: 0 Skipped: 0 Warnings: 0
Just over 200 MB of data took about two and a half minutes, which is reasonably fast.
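A quick sanity check after an import like this (not part of the original run) is to compare the loaded row count against the file's line count and to inspect any warnings reported by the statement:

    mysql> SELECT COUNT(*) FROM load_file_test;
    mysql> SHOW WARNINGS;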
Reference: MySQL dev documentation
