HBase ships with several built-in tools for importing and exporting data.
1. Direct import with ImportTsv
1.1 Create the table in HBase
create 'testtable4','cf1','cf2'
1.2 Prepare the data file data.txt and upload it to HDFS
1,tom,m
2,jack,m
3,lili,f

hadoop fs -put data.txt /user/dw_hbkal/przhang
1.3 Run the import command
bin/hbase org.apache.hadoop.hbase.mapreduce.ImportTsv -Dimporttsv.separator=',' -Dimporttsv.columns=HBASE_ROW_KEY,cf1,cf2 testtable4 /user/dw_hbkal/przhang/data.txt
1.4 Inspect the data in HBase
hbase(main):069:0> scan 'testtable4'
ROW    COLUMN+CELL
 1     column=cf1:, timestamp=1533708793917, value=tom
 1     column=cf2:, timestamp=1533708793917, value=m
 2     column=cf1:, timestamp=1533708793917, value=jack
 2     column=cf2:, timestamp=1533708793917, value=m
 3     column=cf1:, timestamp=1533708793917, value=lili
 3     column=cf2:, timestamp=1533708793917, value=f
3 row(s) in 0.0300 seconds
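The `-Dimporttsv.columns` option above pairs each delimited field with a column spec, where `HBASE_ROW_KEY` marks which field becomes the row key and every other spec names the target column. A minimal Python sketch of that mapping logic (an illustration of the idea only, not HBase's actual implementation):

```python
def map_tsv_line(line, columns, separator=","):
    """Pair each field of a delimited line with its column spec.

    HBASE_ROW_KEY marks the field that becomes the row key; every
    other spec names the column (family, or family:qualifier) that
    the corresponding field is written to.
    """
    fields = line.rstrip("\n").split(separator)
    if len(fields) != len(columns):
        raise ValueError("field count does not match column spec")
    rowkey, cells = None, {}
    for spec, value in zip(columns, fields):
        if spec == "HBASE_ROW_KEY":
            rowkey = value
        else:
            cells[spec] = value
    return rowkey, cells

# Mirrors: -Dimporttsv.separator=',' -Dimporttsv.columns=HBASE_ROW_KEY,cf1,cf2
print(map_tsv_line("1,tom,m", ["HBASE_ROW_KEY", "cf1", "cf2"]))
# → ('1', {'cf1': 'tom', 'cf2': 'm'})
```

Because the specs here are bare family names (`cf1`, `cf2`) with no qualifier, the scan output shows empty qualifiers such as `column=cf1:`.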
2. ImportTsv: generate HFiles first, then bulk-load them
2.1 Create the data file data2.txt and upload it to HDFS
1,tom,f
5,jack2,m
6,lili2,m

hadoop fs -put data2.txt /user/dw_hbkal/przhang
2.2 Generate the HFiles
bin/hbase org.apache.hadoop.hbase.mapreduce.ImportTsv -Dimporttsv.separator=',' -Dimporttsv.columns=HBASE_ROW_KEY,cf1,cf2 -Dimporttsv.bulk.output=/user/dw_hbkal/przhang/hfile_tmp testtable4 /user/dw_hbkal/przhang/data2.txt
2.3 Load the HFiles into HBase (internally this is essentially an HDFS mv of the files into the table's region directories)
bin/hbase org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles /user/dw_hbkal/przhang/hfile_tmp testtable4
2.4 List cf1's HFiles on HDFS; the file with the later timestamp is the newly generated HFile
hadoop fs -ls /hbase/data/default/testtable4/ebaa89a06f73a0ecdc15b53bd88bc3a4/cf1
Found 2 items
-rwxrwxrwx   3 hdfs  bdos   1170 2018-08-08 14:23 /hbase/data/default/testtable4/ebaa89a06f73a0ecdc15b53bd88bc3a4/cf1/0e80f632a7214755a8e84e9fafea36eb_SeqId_6_
-rw-r--r--   3 hbase hbase  1065 2018-08-08 14:45 /hbase/data/default/testtable4/ebaa89a06f73a0ecdc15b53bd88bc3a4/cf1/347598bdf4e34b51909b6965fed11a99
2.5 Check the data in HBase
hbase(main):070:0> scan 'testtable4'
ROW    COLUMN+CELL
 1     column=cf1:, timestamp=1533709383463, value=tom
 1     column=cf2:, timestamp=1533709383463, value=f
 2     column=cf1:, timestamp=1533708793917, value=jack
 2     column=cf2:, timestamp=1533708793917, value=m
 3     column=cf1:, timestamp=1533708793917, value=lili
 3     column=cf2:, timestamp=1533708793917, value=f
 5     column=cf1:, timestamp=1533709383463, value=jack2
 5     column=cf2:, timestamp=1533709383463, value=m
 6     column=cf1:, timestamp=1533709383463, value=lili2
 6     column=cf2:, timestamp=1533709383463, value=m
5 row(s) in 0.0260 seconds
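Note that row 1's cf2 value changed from m to f: the old and new cells live in different HFiles, but a scan resolves each (row, column) pair to the cell with the newest timestamp. A toy Python model of that read-side resolution (a hypothetical illustration, not HBase code):

```python
# Model a column family's store as cells from multiple HFiles; a scan
# returns, per (row, column), only the cell with the newest timestamp.
def scan_latest(cells):
    latest = {}
    for row, col, ts, value in cells:
        key = (row, col)
        if key not in latest or ts > latest[key][0]:
            latest[key] = (ts, value)
    return {k: v for k, (ts, v) in sorted(latest.items())}

store = [
    ("1", "cf2:", 1533708793917, "m"),     # cell from the original import (data.txt)
    ("1", "cf2:", 1533709383463, "f"),     # cell from the bulk-loaded HFile (data2.txt)
    ("5", "cf1:", 1533709383463, "jack2"), # new row, present only in the bulk load
]
print(scan_latest(store))
# → {('1', 'cf2:'): 'f', ('5', 'cf1:'): 'jack2'}
```

This is why the bulk load appears to "overwrite" row 1 even though the old cell still exists on disk until a major compaction discards it.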
3. Export: dump table data to HDFS
bin/hbase org.apache.hadoop.hbase.mapreduce.Export testtable /user/dw_hbkal/przhang/hbaseexport/testdata // export testtable's data to an HDFS path; the number of versions and a start time can also be specified
4. Import: load data from HDFS
hbase org.apache.hadoop.hbase.mapreduce.Import testtable /user/dw_hbkal/przhang/hbaseexport/testdata // import the HDFS data into testtable; the table testtable must be created before the import
5. CopyTable: copy one table to another
hbase org.apache.hadoop.hbase.mapreduce.CopyTable --new.name=test3 test // copy the data in test into table test3; only the latest version of each cell is copied
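The "only the latest data" behavior above follows from copying one version per cell: the copy scans the source and, for each column, carries over only the newest timestamped value. A toy Python model of that semantics (an assumption-level sketch, not the real MapReduce tool):

```python
def copy_table(source, max_versions=1):
    """Copy a table modeled as {(row, column): [(timestamp, value), ...]},
    keeping only the newest max_versions cells per column.

    max_versions=1 mirrors the default behavior described above, where
    only the latest version of each cell reaches the destination table.
    """
    dest = {}
    for key, versions in source.items():
        newest = sorted(versions, key=lambda tv: tv[0], reverse=True)[:max_versions]
        dest[key] = newest
    return dest

# Hypothetical source table with two versions of one cell.
test_table = {("1", "cf1:name"): [(100, "old"), (200, "new")]}
print(copy_table(test_table))
# → {('1', 'cf1:name'): [(200, 'new')]}
```

Copying all historical versions would require raising the version limit on both the copy and the destination table's column family.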