Welcome to 一樂樂's blog on cnblogs
HBase Programming Practice
Contents:
✿ Preparation: starting and stopping hbase (and the bugs you may hit along the way)
I. Using shell commands in HBase:
  ① creating a table (create: table name first, then column-family names)
  ✿ inserting data: ② put (table name, row key, then the column-family entries; column qualifier optional)
  ✿ deleting data: ③ delete (the inverse of put), deleting a whole row, deleting a table
  ✿ reading data: get (one row), scan (the whole table)
  ✿ querying historical data: ④ querying a table's historical versions
  ✿ exiting the HBase shell: ⑤ exit
II. HBase programming practice: preparation (importing the jar packages); example: create a table, insert data, read it back
III. Exercise: practicing common HBase operations:
(1) implement the specified functions in code and with the HBase shell commands provided with Hadoop
(2) convert the relational tables Student, Course and SC into HBase tables, insert the data, and implement the specified functions ....
✿ Preparation:
■ Start hadoop first, then hbase. (To shut down: stop hbase first, then hadoop.)
□ Start hadoop:
ssh localhost
cd /usr/local/hadoop
./sbin/start-dfs.sh
□ Start hbase (hbase's bin directory has been added to the PATH, so its commands work from any terminal; hadoop's has not, so for hadoop we must switch into the sbin directory under its install path):
start-hbase.sh
□ Enter the shell:
hbase shell
□ Stop hbase:
stop-hbase.sh
□ Stop hadoop:
cd /usr/local/hadoop
./sbin/stop-dfs.sh
Possible bug: org.apache.hadoop.hbase.PleaseHoldException: Master is initializing
The logs report: master.HMaster: Master failed to complete initialization after 900000ms. Please consider submitting a bug report including a thread dump of this process.
master.SplitLogManager: error while splitting logs in [hdfs://localhost:9000/hbase/WALs...]
When it rains, it pours: it hung again....
Fix: Ctrl+C, stop hbase, stop hadoop, then close the shell, reboot, and log back in.... (ssh localhost then showed the message below; my gut said it was harmless, so I skipped it)
Failed to connect to https://changelogs.ubuntu.com/meta-release-lts. Check your Internet connection or proxy settings.
Then something strange.... (one moment a table was created successfully, the next everything froze)
The logs showed: [the strange part: creating a table named user hung, while tables with any other name were fine]
Warning: WARN [ProcExecTimeout] assignment.AssignmentManager: STUCK Region-In-Transition rit=OPENING, location=null, table=user, region=d7e...
And then: ERROR: master.HMasterCommandLine: Master exiting...
At this point the bugs kept multiplying, so I chose to reinstall (reinstalling is a perfectly good way to solve a problem; methods have no rank, the efficient one is the best one).
Perhaps the root cause was the order of my table-deletion commands: I first ran truncate 'user' ('user' is the table name) and then drop 'user' # a table must be disabled before it can be dropped
(it seems the table was already disabled before I ran drop, haha); no big deal, a reinstall sorts it out.
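For the record, the deletion sequence the shell expects (shown here with 'user' as the table name):

```
disable 'user'    # a table must be disabled before it can be dropped
drop 'user'       # remove the table
# truncate 'user' is different: it disables, drops and recreates the table in one command
```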
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
I. Using shell commands in HBase:
① Creating a table (the create command takes the table name first, then the column-family names):
Syntax: create '<table name>','<column family 1>','<column family 2>',...,'<column family N>'
create 'student','Sname','Ssex','Sage','Sdept','course'
Check the table structure with describe 'student' (desc '<table name>' does the same).
What follows are the routine HBase operations (create, read, update, delete).
✿ Inserting data
② put (the put command takes the table name first, then the row key, and from the third argument on, the column-family entries to add; column qualifier optional).
Note: one put writes a single cell at a time, that is, one column of one row of one table,
so inserting data directly from the shell is very inefficient; in practice data is almost always loaded programmatically.
Syntax: put '<table name>','<row key>','<column family>:<column qualifier>','<value>'
Example: add a row to the student table for student number 95001, named LiYing; the row key is 95001.
put 'student', '95001','Sname','LiYing'
Example: add a value to the math column of the course family in row 95001:
put 'student','95001','course:math','80'
Roughly what it looks like as a table (the names may not line up exactly; this is just to show the shape):
✿ Deleting data
③ The delete command:
□ the inverse of put:
□ delete all data in a row:
□ delete a table:
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
□ The inverse of put:
delete 'student','95001','Sname:firstName'
□ Delete all data in a row:
deleteall 'student','95001'
□ Delete a table:
disable 'student'   # make the table unavailable first
drop 'student'      # then delete it
✿ Reading data:
□ get: reads one row
□ scan: reads the whole table
get 'student','95001'
scan 'student'
You can also query at a finer grain, for example the data of a single column family:
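A couple of such finer-grained queries, sketched with the student table from above:

```
get 'student','95001',{COLUMN=>'course'}     # one row, course family only
scan 'student',{COLUMNS=>'course'}           # whole table, course family only
scan 'student',{COLUMNS=>'course:math'}      # whole table, one specific column
```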
✿ Querying historical data:
④ Querying a table's historical versions takes the following steps.
1. When creating the table, specify how many versions to keep (say 5):
create 'teacher',{NAME=>'username',VERSIONS=>5}
2. Insert data and then update it, so that historical versions accumulate. Note: both the insert and the updates use put.
put 'teacher','91001','username','Mary'
put 'teacher','91001','username','Mary1'
put 'teacher','91001','username','Mary2'
put 'teacher','91001','username','Mary3'
put 'teacher','91001','username','Mary4'
put 'teacher','91001','username','Mary5'
3. When querying, specify how many historical versions to return; by default only the newest is returned.
get 'teacher','91001',{COLUMN=>'username',VERSIONS=>3}
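What VERSIONS=>5 retention means can be simulated in plain Java with no HBase dependency (the class and method names here are mine, purely for illustration): each cell keeps a timestamp-ordered map of values, trimmed to the newest maxVersions entries, and a read returns the newest n of them.

```java
import java.util.*;

// Toy model of one HBase cell with a bounded version history (illustrative only).
public class VersionedCell {
    private final int maxVersions;
    // timestamp -> value, ordered newest-first by the reversed comparator
    private final TreeMap<Long, String> versions = new TreeMap<>(Comparator.reverseOrder());

    public VersionedCell(int maxVersions) { this.maxVersions = maxVersions; }

    public void put(long timestamp, String value) {
        versions.put(timestamp, value);
        while (versions.size() > maxVersions) {
            versions.pollLastEntry();   // discard the oldest version
        }
    }

    // Like get ... VERSIONS=>n: returns up to n values, newest first.
    public List<String> latest(int n) {
        List<String> out = new ArrayList<>();
        for (String v : versions.values()) {
            if (out.size() == n) break;
            out.add(v);
        }
        return out;
    }

    public static void main(String[] args) {
        VersionedCell cell = new VersionedCell(5);
        String[] values = {"Mary", "Mary1", "Mary2", "Mary3", "Mary4", "Mary5"};
        for (int i = 0; i < values.length; i++) cell.put(i, values[i]);  // six puts, five kept
        System.out.println(cell.latest(3));  // [Mary5, Mary4, Mary3]
    }
}
```

After the six puts above, "Mary" (the oldest) has been evicted, which mirrors why the shell's get with VERSIONS=>3 shows Mary5, Mary4, Mary3.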
✿ Exiting the HBase shell
⑤ Command: exit
Note: exit only leaves the shell session with the database; it does not stop the HBase processes running in the background.
II. HBase programming practice:
✿ Preparation: import the jar packages:
Steps: File -> Project Structure -> Libraries -> + -> select the jars to import; after importing, remember to click Apply, then OK.
(1) Go to the /usr/local/hbase/lib directory and select every jar file in it (but do not select the four directories client-facing-thirdparty, ruby, shaded-clients and zkcli).
(2) Go to the /usr/local/hbase/lib/client-facing-thirdparty directory and select every jar file in it.
Example: create a table, insert data, and read it back.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.*;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;
import java.io.IOException;

public class ExampleForHBase {
    public static Configuration configuration;
    public static Connection connection;
    public static Admin admin;

    public static void main(String[] args) throws IOException {
        init();  // connect to the hbase database
        createTable("student", new String[]{"score"});                // shell: create 'student','score'
        insertData("student", "zhangsan", "score", "English", "69");  // shell: put 'student','zhangsan','score:English','69'
        insertData("student", "zhangsan", "score", "Math", "86");
        insertData("student", "zhangsan", "score", "Computer", "77");
        getData("student", "zhangsan", "score", "English");
        close();
    }

    public static void init() {
        configuration = HBaseConfiguration.create();
        configuration.set("hbase.rootdir", "hdfs://localhost:9000/hbase");
        try {
            connection = ConnectionFactory.createConnection(configuration);
            admin = connection.getAdmin();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void close() {
        try {
            if (admin != null) admin.close();
            if (connection != null) connection.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void createTable(String myTableName, String[] colFamily) throws IOException {
        TableName tableName = TableName.valueOf(myTableName);
        if (admin.tableExists(tableName)) {
            System.out.println("table exists!");
        } else {
            TableDescriptorBuilder tableDescriptor = TableDescriptorBuilder.newBuilder(tableName);
            for (String str : colFamily) {
                ColumnFamilyDescriptor family =
                        ColumnFamilyDescriptorBuilder.newBuilder(Bytes.toBytes(str)).build();
                tableDescriptor.setColumnFamily(family);
            }
            admin.createTable(tableDescriptor.build());
        }
    }

    public static void insertData(String tableName, String rowKey, String colFamily, String col, String val) throws IOException {
        Table table = connection.getTable(TableName.valueOf(tableName));
        Put put = new Put(rowKey.getBytes());
        put.addColumn(colFamily.getBytes(), col.getBytes(), val.getBytes());
        table.put(put);
        table.close();
    }

    public static void getData(String tableName, String rowKey, String colFamily, String col) throws IOException {
        Table table = connection.getTable(TableName.valueOf(tableName));
        Get get = new Get(rowKey.getBytes());
        get.addColumn(colFamily.getBytes(), col.getBytes());
        Result result = table.get(get);
        System.out.println(new String(result.getValue(colFamily.getBytes(), col == null ? null : col.getBytes())));
        table.close();
    }
}
After running the program, type scan 'student' in the shell to check the result.
III. Exercise: practicing common HBase operations:
(1) Implement each of the following functions in code, and accomplish the same task with the HBase shell commands provided with Hadoop:
- list all tables in HBase with their information, e.g. table names;
- print every record of a specified table to the terminal;
- add and delete a specified column family or column in an existing table;
- clear all records of a specified table;
- count the rows of a table.
1. List all tables in HBase with their information, e.g. table names:
■ HBase Shell: list
■ Java API:
/**
 * As usual: open the connection, perform the operation, then close the connection.
 * The key call is HTableDescriptor[] hTableDescriptors = admin.listTables(), which
 * returns the list of tables to iterate over.
 */
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.*;
import org.apache.hadoop.hbase.client.*;

public class Test_1 {
    public static Configuration configuration;
    public static Connection connection;
    public static Admin admin;

    // open the connection
    public static void init() {
        configuration = HBaseConfiguration.create();
        configuration.set("hbase.rootdir", "hdfs://localhost:9000/hbase");
        try {
            connection = ConnectionFactory.createConnection(configuration);
            admin = connection.getAdmin();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    // close the connection
    public static void close() {
        try {
            if (admin != null) admin.close();
            if (connection != null) connection.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    // list the existing tables via listTables()
    public static void listTables() throws IOException {
        init();
        HTableDescriptor[] hTableDescriptors = admin.listTables();
        for (HTableDescriptor hTableDescriptor : hTableDescriptors) {
            System.out.println(hTableDescriptor.getNameAsString());
        }
        close();
    }

    public static void main(String[] args) {
        Test_1 t = new Test_1();
        try {
            System.out.println("Tables currently stored in the Hbase database:");
            t.listTables();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
2. Print every record of a specified table to the terminal:
■ HBase Shell: scan 'student'
■ Java API:
/**
 * As usual: open the connection, perform the operation, then close the connection.
 * The key calls:
 * Table table = connection.getTable(TableName.valueOf(tableName)); obtains the table object;
 * Scan scan = new Scan(); ResultScanner scanner = table.getScanner(scan); then the
 * ResultScanner is iterated and each Result printed.
 */
import java.io.IOException;
import java.util.Scanner;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.*;
import org.apache.hadoop.hbase.client.*;

public class Test_2 {
    public static Configuration configuration;
    public static Connection connection;
    public static Admin admin;

    // open the connection
    public static void init() {
        configuration = HBaseConfiguration.create();
        configuration.set("hbase.rootdir", "hdfs://localhost:9000/hbase");
        try {
            connection = ConnectionFactory.createConnection(configuration);
            admin = connection.getAdmin();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    // close the connection
    public static void close() {
        try {
            if (admin != null) admin.close();
            if (connection != null) connection.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    // print every record of the named table
    public static void getData(String tableName) throws IOException {
        init();
        Table table = connection.getTable(TableName.valueOf(tableName));
        Scan scan = new Scan();
        ResultScanner scanner = table.getScanner(scan);
        for (Result result : scanner) {
            showCell(result);
        }
        close();
    }

    // formatted output of one Result
    public static void showCell(Result result) {
        Cell[] cells = result.rawCells();
        for (Cell cell : cells) {
            System.out.println("Row key: " + new String(CellUtil.cloneRow(cell)));
            System.out.println("Timestamp: " + cell.getTimestamp());
            System.out.println("Column family: " + new String(CellUtil.cloneFamily(cell)));
            System.out.println("Column name: " + new String(CellUtil.cloneQualifier(cell)));
            System.out.println("Value: " + new String(CellUtil.cloneValue(cell)));
            System.out.println();
        }
    }

    public static void main(String[] args) throws IOException {
        Test_2 t = new Test_2();
        System.out.println("Enter the table name to view:");
        Scanner scan = new Scanner(System.in);
        String tableName = scan.nextLine();
        System.out.println("Table contents:");
        t.getData(tableName);
    }
}
3. Add and delete a specified column family or column in an existing table:
■ HBase Shell:
put 'student','95003','Sname','wangjinxuan'          (writes to the Sname family, no qualifier)
put 'student','95003','Sname:nickName','wang'        (adds a column under the Sname family)
put 'student','95003','Sname:firstName','jinxuan'    (adds another column under the Sname family)
The inverse of put is delete:
delete 'student','95003','Sname'
delete 'student','95003','Sname:nickName'
deleteall 'student','95003'                          (deletes the whole row)
■ Java API:
/**
 * HBase only fixes the row key and the column families; it never asks, at table-creation
 * time, which column qualifiers a family will contain. That is part of HBase's
 * column-oriented design, and it is why the API provides no call that deletes all data
 * under one qualifier across a whole table.
 *
 * As usual: open the connection, perform the operation, then close the connection.
 * 1. Table table = connection.getTable(TableName.valueOf(tableName)); obtain the table.
 * 2. Insert: create a Put from the row key, addColumn(family, qualifier, value), then table.put(put).
 * 3. Delete: create a Delete from the row key, then either
 *    delete.addFamily(colFamily.getBytes())                   (whole family, when no qualifier is given), or
 *    delete.addColumn(colFamily.getBytes(), col.getBytes())   (one qualifier),
 *    and finally table.delete(delete).
 */
import java.io.IOException;
import java.util.Scanner;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.CellUtil;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Delete;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.client.Table;

public class Test_3 {
    public static Configuration configuration;
    public static Connection connection;
    public static Admin admin;

    // open the connection
    public static void init() {
        configuration = HBaseConfiguration.create();
        configuration.set("hbase.rootdir", "hdfs://localhost:9000/hbase");
        try {
            connection = ConnectionFactory.createConnection(configuration);
            admin = connection.getAdmin();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    // close the connection
    public static void close() {
        try {
            if (admin != null) admin.close();
            if (connection != null) connection.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    /**
     * Insert a value into one column of one row.
     * @param tableName table name
     * @param rowKey    row key
     * @param colFamily column-family name
     * @param col       column name (may be null when the family has no qualifiers)
     * @param val       value
     */
    public static void insertRow(String tableName, String rowKey, String colFamily, String col, String val) throws IOException {
        init();
        Table table = connection.getTable(TableName.valueOf(tableName));
        Put put = new Put(rowKey.getBytes());
        put.addColumn(colFamily.getBytes(), col == null ? null : col.getBytes(), val.getBytes());
        table.put(put);
        table.close();
        close();
    }

    // print every record of the named table
    public static void getData(String tableName) throws IOException {
        init();
        Table table = connection.getTable(TableName.valueOf(tableName));
        Scan scan = new Scan();
        ResultScanner scanner = table.getScanner(scan);
        for (Result result : scanner) {
            showCell(result);
        }
        close();
    }

    // formatted output of one Result
    public static void showCell(Result result) {
        Cell[] cells = result.rawCells();
        for (Cell cell : cells) {
            System.out.println("Row key: " + new String(CellUtil.cloneRow(cell)));
            System.out.println("Timestamp: " + cell.getTimestamp());
            System.out.println("Column family: " + new String(CellUtil.cloneFamily(cell)));
            System.out.println("Column name: " + new String(CellUtil.cloneQualifier(cell)));
            System.out.println("Value: " + new String(CellUtil.cloneValue(cell)));
            System.out.println();
        }
    }

    /**
     * Delete data.
     * @param tableName table name
     * @param rowKey    row key
     * @param colFamily column-family name
     * @param col       column name (null to delete the whole family)
     */
    public static void deleteRow(String tableName, String rowKey, String colFamily, String col) throws IOException {
        init();
        Table table = connection.getTable(TableName.valueOf(tableName));
        Delete delete = new Delete(rowKey.getBytes());
        if (col == null) {
            // delete all data of the given family (the family has no qualifiers)
            delete.addFamily(colFamily.getBytes());
        } else {
            // delete the data of one column (i.e. one qualifier)
            delete.addColumn(colFamily.getBytes(), col.getBytes());
        }
        table.delete(delete);
        table.close();
        close();
    }

    public static void main(String[] args) {
        Test_3 t = new Test_3();
        boolean flag = true;
        while (flag) {
            System.out.println("---- add or delete a specified column family / column of an existing table ----");
            System.out.println("Choose an operation: 1 - add   2 - delete");
            Scanner scan = new Scanner(System.in);
            String choose1 = scan.nextLine();
            switch (choose1) {
            case "1":
                try {
                    // put 'student','95003','Sname','wangjinxuan'       (family, no qualifier)
                    // put 'student','95003','Sname:nickName','wang'     (column under the family)
                    // put 'student','95003','Sname:firstName','jinxuan' (column under the family)
                    t.insertRow("student", "95003", "Sname", null, "wangjingxuan");
                    t.insertRow("student", "95003", "Sname", "nickName", "wang");
                    t.insertRow("student", "95003", "Sname", "firstName", "jingxuan");
                    System.out.println("Inserted successfully:");
                    t.getData("student");
                } catch (IOException e) {
                    e.getMessage();
                }
                break;
            case "2":
                try {
                    System.out.println("---- table contents before deletion ----");
                    t.getData("student");
                    // delete 'student','95003','Sname'
                    // delete 'student','95003','Sname:nickName'
                    t.deleteRow("student", "95003", "Sname", "firstName");
                    System.out.println("---- deleted ----\n");
                    System.out.println("---- table contents after deletion ----");
                    t.getData("student");
                } catch (IOException e) {
                    e.getMessage();
                }
                break;
            }
            System.out.println("Continue? true / false");
            flag = scan.nextBoolean();
        }
        System.out.println("Program exited!");
    }
}
4. Clear all records of a specified table:
■ HBase Shell: truncate 'student'
■ Java API:
import java.io.IOException;
import java.util.Scanner;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.CellUtil;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class Test_4 {
    public static Configuration configuration;
    public static Connection connection;
    public static Admin admin;

    // open the connection
    public static void init() {
        configuration = HBaseConfiguration.create();
        configuration.set("hbase.rootdir", "hdfs://localhost:9000/hbase");
        try {
            connection = ConnectionFactory.createConnection(configuration);
            admin = connection.getAdmin();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    // close the connection
    public static void close() {
        try {
            if (admin != null) admin.close();
            if (connection != null) connection.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    /**
     * Clear all records of the specified table.
     * Overall idea: save the table's descriptor (name, column families, ...) first,
     * then disable and delete the table, and finally recreate it from the saved descriptor.
     */
    public static void clearRows(String tableName) throws IOException {
        init();
        // read the old table's descriptor before deleting it
        HBaseAdmin admin1 = new HBaseAdmin(configuration);
        HTableDescriptor tDescriptor = admin1.getTableDescriptor(Bytes.toBytes(tableName));
        TableName tablename = TableName.valueOf(tableName);
        // delete the table
        admin.disableTable(tablename);
        admin.deleteTable(tablename);
        // recreate it from the saved descriptor
        admin.createTable(tDescriptor);
        admin1.close();
        close();
    }

    // print every record of the named table
    public static void getData(String tableName) throws IOException {
        init();
        Table table = connection.getTable(TableName.valueOf(tableName));
        Scan scan = new Scan();
        ResultScanner scanner = table.getScanner(scan);
        for (Result result : scanner) {
            showCell(result);
        }
        close();
    }

    // formatted output of one Result
    public static void showCell(Result result) {
        Cell[] cells = result.rawCells();
        for (Cell cell : cells) {
            System.out.println("Row key: " + new String(CellUtil.cloneRow(cell)));
            System.out.println("Timestamp: " + cell.getTimestamp());
            System.out.println("Column family: " + new String(CellUtil.cloneFamily(cell)));
            System.out.println("Column name: " + new String(CellUtil.cloneQualifier(cell)));
            System.out.println("Value: " + new String(CellUtil.cloneValue(cell)));
            System.out.println();
        }
    }

    public static void main(String[] args) {
        Test_4 test_4 = new Test_4();
        Scanner scan = new Scanner(System.in);
        System.out.println("Enter the table name to clear:");
        String tableName = scan.nextLine();
        try {
            System.out.println("Table contents before clearing:");
            test_4.getData(tableName);
            test_4.clearRows(tableName);
            System.out.println("Table cleared.");
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
5. Count the rows of a table:
■ HBase Shell: count 'student'
■ Java API:
import java.io.IOException;
import java.util.Scanner;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.client.Table;

public class Test_5 {
    public static Configuration configuration;
    public static Connection connection;
    public static Admin admin;

    // open the connection
    public static void init() {
        configuration = HBaseConfiguration.create();
        configuration.set("hbase.rootdir", "hdfs://localhost:9000/hbase");
        try {
            connection = ConnectionFactory.createConnection(configuration);
            admin = connection.getAdmin();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    // close the connection
    public static void close() {
        try {
            if (admin != null) admin.close();
            if (connection != null) connection.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    // count rows by scanning the table and counting the Results
    public static void countRows(String tableName) throws IOException {
        init();
        Table table = connection.getTable(TableName.valueOf(tableName));
        Scan scan = new Scan();
        ResultScanner scanner = table.getScanner(scan);
        int num = 0;
        for (Result result = scanner.next(); result != null; result = scanner.next()) {
            num++;
        }
        System.out.println("Row count: " + num);
        scanner.close();
        close();
    }

    public static void main(String[] args) throws IOException {
        Test_5 test_5 = new Test_5();
        Scanner scan = new Scanner(System.in);
        System.out.println("Enter the table name whose rows should be counted:");
        String tableName = scan.nextLine();
        test_5.countRows(tableName);
    }
}
(2) Given the following relational-database tables and data, convert them into tables suitable for HBase storage and insert the data:
the student table (Student), the course table (Course) and the course-selection table (SC). Then implement the functions specified below:
create 'Student','S_No','S_Name','S_Sex','S_Age'
put 'Student','s001','S_No','2015001'
put 'Student','s001','S_Name','Zhangsan'
put 'Student','s001','S_Sex','male'
put 'Student','s001','S_Age','23'
put 'Student','s002','S_No','2015002'
put 'Student','s002','S_Name','Mary'
put 'Student','s002','S_Sex','female'
put 'Student','s002','S_Age','22'
put 'Student','s003','S_No','2015003'
put 'Student','s003','S_Name','Lisi'
put 'Student','s003','S_Sex','male'
put 'Student','s003','S_Age','24'
—————————————————————————————————————
create 'Course','C_No','C_Name','C_Credit'
put 'Course','c001','C_No','123001'
put 'Course','c001','C_Name','Math'
put 'Course','c001','C_Credit','2.0'
put 'Course','c002','C_No','123002'
put 'Course','c002','C_Name','Computer'
put 'Course','c002','C_Credit','5.0'
put 'Course','c003','C_No','123003'
put 'Course','c003','C_Name','English'
put 'Course','c003','C_Credit','3.0'
—————————————————————————————————————
create 'SC','SC_Sno','SC_Cno','SC_Score'
put 'SC','sc001','SC_Sno','2015001'
put 'SC','sc001','SC_Cno','123001'
put 'SC','sc001','SC_Score','86'
put 'SC','sc002','SC_Sno','2015001'
put 'SC','sc002','SC_Cno','123003'
put 'SC','sc002','SC_Score','69'
put 'SC','sc003','SC_Sno','2015002'
put 'SC','sc003','SC_Cno','123002'
put 'SC','sc003','SC_Score','77'
put 'SC','sc004','SC_Sno','2015002'
put 'SC','sc004','SC_Cno','123003'
put 'SC','sc004','SC_Score','99'
put 'SC','sc005','SC_Sno','2015003'
put 'SC','sc005','SC_Cno','123001'
put 'SC','sc005','SC_Score','98'
put 'SC','sc006','SC_Sno','2015003'
put 'SC','sc006','SC_Cno','123002'
put 'SC','sc006','SC_Score','95'
Then implement the following functions:
① createTable(String tableName, String[] fields): create a table. tableName is the table name, and the string array fields holds the names of the record's fields
(the field names are the column-family names). If a table named tableName already exists in HBase, delete the old table first and then create the new one.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import java.io.IOException;

public class CreateTable {
    public static Configuration configuration;
    public static Connection connection;
    public static Admin admin;

    public static void createTable(String tableName, String[] fields) throws IOException {
        init();
        TableName tablename = TableName.valueOf(tableName);
        if (admin.tableExists(tablename)) {
            // the table already exists: delete it before recreating it
            System.out.println("table exists!");
            admin.disableTable(tablename);
            admin.deleteTable(tablename);
        }
        HTableDescriptor hTableDescriptor = new HTableDescriptor(tablename);
        for (String str : fields) {
            HColumnDescriptor hColumnDescriptor = new HColumnDescriptor(str);
            hTableDescriptor.addFamily(hColumnDescriptor);
        }
        admin.createTable(hTableDescriptor);
        close();
    }

    public static void init() {
        configuration = HBaseConfiguration.create();
        configuration.set("hbase.rootdir", "hdfs://localhost:9000/hbase");
        try {
            connection = ConnectionFactory.createConnection(configuration);
            admin = connection.getAdmin();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void close() {
        try {
            if (admin != null) admin.close();
            if (connection != null) connection.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) {
        String[] fields = {"Score"};
        try {
            createTable("person", fields);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
② addRecord(String tableName, String row, String[] fields, String[] values): add the values in the array values to the cells identified by table tableName, row row (expressed with S_Name) and the string array fields. If an element of fields has a column qualifier under its column family, write it as "columnFamily:column". For example, to add scores for the three columns "Math", "Computer Science" and "English" at once, the array fields would be {"Score:Math", "Score:Computer Science", "Score:English"} and the array values would hold the three scores.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import java.io.IOException;

public class AddRecord {
    public static Configuration configuration;
    public static Connection connection;
    public static Admin admin;

    public static void addRecord(String tableName, String row, String[] fields, String[] values) throws IOException {
        init();
        Table table = connection.getTable(TableName.valueOf(tableName));
        for (int i = 0; i != fields.length; i++) {
            Put put = new Put(row.getBytes());
            String[] cols = fields[i].split(":");  // "columnFamily:column"
            put.addColumn(cols[0].getBytes(), cols[1].getBytes(), values[i].getBytes());
            table.put(put);
        }
        table.close();
        close();
    }

    public static void init() {
        configuration = HBaseConfiguration.create();
        configuration.set("hbase.rootdir", "hdfs://localhost:9000/hbase");
        try {
            connection = ConnectionFactory.createConnection(configuration);
            admin = connection.getAdmin();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void close() {
        try {
            if (admin != null) admin.close();
            if (connection != null) connection.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) {
        String[] fields = {"Score:Math", "Score:Computer Science", "Score:English"};
        String[] values = {"99", "80", "100"};
        try {
            addRecord("person", "Score", fields, values);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
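The fields[i].split(":") line above is where the "columnFamily:column" convention gets resolved. The same logic in standalone form, with no HBase dependency (the class name FieldSpec is mine, for illustration only):

```java
// Splits a field spec like "Score:Computer Science" into {family, qualifier};
// a bare family name like "Score" yields a null qualifier.
public class FieldSpec {
    public static String[] split(String field) {
        int i = field.indexOf(':');
        if (i < 0) {
            return new String[]{field, null};          // family only, no qualifier
        }
        return new String[]{field.substring(0, i), field.substring(i + 1)};
    }

    public static void main(String[] args) {
        String[] fq = split("Score:Computer Science");
        System.out.println(fq[0] + " / " + fq[1]);     // Score / Computer Science
    }
}
```

Splitting on the first ':' only (rather than every ':') keeps qualifiers that themselves contain a colon intact.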
③ scanColumn(String tableName, String column): browse the data of one column of table tableName; if a row has no data in that column, return null for it. When the parameter column is a column-family name with several column qualifiers underneath, list the data of each qualifier's column; when column names one specific column (e.g. "Score:Math"), list only that column's data.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.CellUtil;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;
import java.io.IOException;

public class ScanColumn {
    public static Configuration configuration;
    public static Connection connection;
    public static Admin admin;

    public static void scanColumn(String tableName, String column) throws IOException {
        init();
        Table table = connection.getTable(TableName.valueOf(tableName));
        Scan scan = new Scan();
        if (column.contains(":")) {
            // a specific column such as "Score:Math"
            String[] cols = column.split(":");
            scan.addColumn(Bytes.toBytes(cols[0]), Bytes.toBytes(cols[1]));
        } else {
            // a whole column family: list every qualifier underneath it
            scan.addFamily(Bytes.toBytes(column));
        }
        ResultScanner scanner = table.getScanner(scan);
        for (Result result = scanner.next(); result != null; result = scanner.next()) {
            showCell(result);
        }
        table.close();
        close();
    }

    public static void showCell(Result result) {
        Cell[] cells = result.rawCells();
        for (Cell cell : cells) {
            System.out.println("Row key: " + new String(CellUtil.cloneRow(cell)));
            System.out.println("Timestamp: " + cell.getTimestamp());
            System.out.println("Column family: " + new String(CellUtil.cloneFamily(cell)));
            System.out.println("Column name: " + new String(CellUtil.cloneQualifier(cell)));
            System.out.println("Value: " + new String(CellUtil.cloneValue(cell)));
        }
    }

    public static void init() {
        configuration = HBaseConfiguration.create();
        configuration.set("hbase.rootdir", "hdfs://localhost:9000/hbase");
        try {
            connection = ConnectionFactory.createConnection(configuration);
            admin = connection.getAdmin();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    // close the connection
    public static void close() {
        try {
            if (admin != null) admin.close();
            if (connection != null) connection.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) {
        try {
            scanColumn("person", "Score");
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
④ modifyData(String tableName, String row, String column): modify the data in the cell identified by table tableName, row row (the student name S_Name can be used) and column column.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import java.io.IOException;

public class ModifyData {
    public static long ts;
    public static Configuration configuration;
    public static Connection connection;
    public static Admin admin;

    public static void modifyData(String tableName, String row, String column, String val) throws IOException {
        init();
        Table table = connection.getTable(TableName.valueOf(tableName));
        // column is given as "family:qualifier", e.g. "Score:Math"
        String[] cols = column.split(":");
        byte[] family = cols[0].getBytes();
        byte[] qualifier = cols[1].getBytes();
        Put put = new Put(row.getBytes());
        // find the timestamp of the newest existing version of the cell,
        // so the new value overwrites it instead of adding one more version
        Scan scan = new Scan();
        ResultScanner resultScanner = table.getScanner(scan);
        for (Result r : resultScanner) {
            for (Cell cell : r.getColumnCells(family, qualifier)) {
                ts = cell.getTimestamp();
            }
        }
        put.addColumn(family, qualifier, ts, val.getBytes());
        table.put(put);
        table.close();
        close();
    }

    public static void init() {
        configuration = HBaseConfiguration.create();
        configuration.set("hbase.rootdir", "hdfs://localhost:9000/hbase");
        try {
            connection = ConnectionFactory.createConnection(configuration);
            admin = connection.getAdmin();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void close() {
        try {
            if (admin != null) admin.close();
            if (connection != null) connection.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) {
        try {
            modifyData("person", "Score", "Score:Math", "100");
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
⑤ deleteRow(String tableName, String row): delete the record of the row specified by row in table tableName.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import java.io.IOException;

public class DeleteRow {
    public static Configuration configuration;
    public static Connection connection;
    public static Admin admin;

    public static void deleteRow(String tableName, String row) throws IOException {
        init();
        Table table = connection.getTable(TableName.valueOf(tableName));
        // a Delete built from just the row key removes every cell of that row
        Delete delete = new Delete(row.getBytes());
        table.delete(delete);
        table.close();
        close();
    }

    public static void init() {
        configuration = HBaseConfiguration.create();
        configuration.set("hbase.rootdir", "hdfs://localhost:9000/hbase");
        try {
            connection = ConnectionFactory.createConnection(configuration);
            admin = connection.getAdmin();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void close() {
        try {
            if (admin != null) admin.close();
            if (connection != null) connection.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) {
        try {
            deleteRow("person", "Score");
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
References:
"HBase 2.2.2 installation and programming practice guide", Database Lab of Xiamen University (xmu.edu.cn)
"Experiment 3: practicing common HBase operations" https://blog.csdn.net/qq_38648558/article/details/83033050
"Experiment 3: practicing common HBase operations" https://blog.csdn.net/qq_50596778/article/details/120552574