Step 1: start it in the foreground: bin/hiveserver2
Step 2: or start it in the background: nohup bin/hiveserver2 1>/var/log/hiveserver.log 2>/var/log/hiveserver.err &
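The background-start line combines nohup, stream redirection, and `&`. A minimal runnable sketch of the same pattern, with a placeholder command standing in for bin/hiveserver2 and demo log paths of my own choosing:

```shell
#!/bin/sh
# Same redirection pattern as the hiveserver2 line above, shown with a
# placeholder command so it runs anywhere:
#   nohup  - keep the process alive after the terminal closes
#   1>file - send stdout to the .log file
#   2>file - send stderr to the .err file
#   &      - run in the background
LOG=/tmp/hiveserver.demo.log
ERR=/tmp/hiveserver.demo.err
nohup sh -c 'echo started; echo some-warning >&2' 1>"$LOG" 2>"$ERR" &
wait $!        # only for this demo; a real server keeps running
cat "$LOG"     # stdout went to the .log file
```

With the real server, you can then confirm it is listening on port 10000 before trying to connect from another node.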
1: Once hiveserver2 is up, you can connect to it from other nodes with beeline.
Method 1:
Run hive/bin/beeline and press Enter to open the beeline prompt.
Then issue the connect command:
beeline> !connect jdbc:hive2://master:10000
(master is the hostname of the machine running hiveserver2; the default port is 10000.)
Method 2:
Or connect directly at startup:
bin/beeline -u jdbc:hive2://master:10000 -n hadoop
After that you can run normal SQL queries.
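The one-shot form above can be parameterized in a small wrapper script. A sketch under variable names of my own choosing (it only prints the command, since actually connecting needs a live HiveServer2):

```shell
#!/bin/sh
# Hypothetical wrapper around the one-shot beeline invocation above.
# HS2_HOST / HS2_PORT / HS2_USER are names I chose, not from Hive itself.
HS2_HOST=master
HS2_PORT=10000
HS2_USER=hadoop
CMD="bin/beeline -u jdbc:hive2://${HS2_HOST}:${HS2_PORT} -n ${HS2_USER}"
echo "$CMD"    # prints: bin/beeline -u jdbc:hive2://master:10000 -n hadoop
```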
1: Hive runs as a server here, and other machines access it as clients:
2: Any client that speaks the Thrift protocol can connect to this service, and Hive ships with one (I simply opened two terminal windows on the same machine):
That starts the command-line client; connect it to your Hive instance. I ran into an error at this point, so I'll paste it here as well.
The error was caused by writing !connect jdbc:hive2//master:10000 — the colon after hive2 is missing. The URL must begin with jdbc:hive2://, e.g. !connect jdbc:hive2://master:10000 (or jdbc:hive2://localhost:10000).
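Since the broken URL and the working one differ by a single colon, a tiny pre-flight check catches the typo before beeline reports "No known driver". This helper is my own, not part of Hive:

```shell
#!/bin/sh
# check_url is a hypothetical helper: it accepts only URLs of the form
# jdbc:hive2://host:port, which is what beeline's Hive JDBC driver expects.
check_url() {
  case "$1" in
    jdbc:hive2://*:*) echo "ok" ;;
    *) echo "bad URL: $1" >&2; return 1 ;;
  esac
}

check_url "jdbc:hive2://master:10000"                    # prints "ok"
check_url "jdbc:hive2//master:10000" || echo "rejected"  # missing colon
```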
[root@master apache-hive-1.2.1-bin]# cd bin/
[root@master bin]# ls
beeline  ext  hive  hive-config.sh  hiveserver2  metatool  schematool
[root@master bin]# ./beeline
Beeline version 1.2.1 by Apache Hive
beeline> !connect jdbc:hive2//master:10000
scan complete in 1ms
17/12/12 04:42:53 [main]: ERROR beeline.ClassNameCompleter: Fail to parse the class name from the Jar file due to the exception:java.io.FileNotFoundException: minlog-1.2.jar (No such file or directory)
17/12/12 04:42:53 [main]: ERROR beeline.ClassNameCompleter: Fail to parse the class name from the Jar file due to the exception:java.io.FileNotFoundException: objenesis-1.2.jar (No such file or directory)
17/12/12 04:42:53 [main]: ERROR beeline.ClassNameCompleter: Fail to parse the class name from the Jar file due to the exception:java.io.FileNotFoundException: reflectasm-1.07-shaded.jar (No such file or directory)
scan complete in 5708ms
No known driver to handle "jdbc:hive2//master:10000"
Then I entered beeline> !connect jdbc:hive2://localhost:10000 and pressed Enter. It prompts for a username and password; use your Linux login credentials. Out of habit I work as root (not good practice, I know), so entering the root account and its password 123456 logs in fine.
beeline> !connect jdbc:hive2://localhost:10000
Connecting to jdbc:hive2://localhost:10000
Enter username for jdbc:hive2://localhost:10000: hadoop
Enter password for jdbc:hive2://localhost:10000:
Error: Failed to open new session: java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Permission denied: user=hadoop, access=EXECUTE, inode="/tmp":root:supergroup:drwx------
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkFsPermission(FSPermissionChecker.java:271)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:257)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:208)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:171)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6545)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4182)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:881)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:821)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:975)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2036)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2034) (state=,code=0)
0: jdbc:hive2://localhost:10000 (closed)> !connect jdbc:hive2://localhost:10000
Connecting to jdbc:hive2://localhost:10000
Enter username for jdbc:hive2://localhost:10000: root
Enter password for jdbc:hive2://localhost:10000: ******
Connected to: Apache Hive (version 1.2.1)
Driver: Hive JDBC (version 1.2.1)
Transaction isolation: TRANSACTION_REPEATABLE_READ
1: jdbc:hive2://localhost:10000>
Once logged in you can start running operations. (The earlier attempt as user hadoop failed because the /tmp directory in HDFS is owned by root with mode drwx------, so other users have no traverse permission on it; relaxing those permissions, e.g. with hdfs dfs -chmod -R 777 /tmp, would let non-root users connect as well.)
Updates stopped here......
2017-12-12 20:50:44