Connecting Spark to the Hive metastore (MySQL-backed)
Method 1:
1) Start the Hive metastore service:
[root@head42 ~]# hive --service metastore &
[root@head42 ~]# netstat -ano | grep 9083    # confirm the metastore is listening on port 9083
2) Start spark-shell pointing at the metastore (which in turn reads the metadata from MySQL):
[root@head42 ~]# spark-shell --conf spark.hadoop.hive.metastore.uris=thrift://localhost:9083
3) scala> spark.sql("show tables").show
spark.sql("select * from database_name.table_name").show // query a table in another database
+--------+--------------+-----------+
|database| tableName|isTemporary|
+--------+--------------+-----------+
| default| customer| false|
| default|text_customers| false|
+--------+--------------+-----------+
That's it!
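The `--conf` flag above can also be set programmatically when building a SparkSession in a standalone application. A minimal sketch, assuming Spark 2.x or later and the same thrift URI (the object and app names here are illustrative):

```scala
import org.apache.spark.sql.SparkSession

// Minimal sketch: build a SparkSession that talks to the Hive metastore
// at thrift://localhost:9083, the same URI as the spark-shell example above.
object HiveMetastoreExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("hive-metastore-example")
      .config("hive.metastore.uris", "thrift://localhost:9083")
      .enableHiveSupport() // required so spark.sql can see Hive tables
      .getOrCreate()

    spark.sql("show tables").show()
    spark.stop()
  }
}
```

This is the route you would take with `spark-submit` instead of an interactive shell.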
Method 2:
1) Copy Hive's hive-site.xml into Spark's conf directory.
2) Edit the hive-site.xml under Spark's conf directory,
adding the following:
<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://localhost:9083</value>
  </property>
</configuration>
3) In a separate terminal, start the metastore:
[root@head42 conf]# hive --service metastore
4) Start spark-shell:
[root@head42 conf]# spark-shell
5) Test:
spark.sql("select * from database_name.table_name").show // query a table in another database
scala> spark.sql("show tables").show
+--------+--------------+-----------+
|database| tableName|isTemporary|
+--------+--------------+-----------+
| default| customer| false|
| default|text_customers| false|
+--------+--------------+-----------+
That's it!
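Once hive-site.xml is in place, the Catalog API gives another quick way to confirm that spark-shell really sees the metastore, as a complement to `show tables`. A small sketch to run inside spark-shell, where `spark` is already defined:

```scala
// Sketch: inspect the metastore through the Catalog API instead of SQL.
// Both calls hit the metastore, so they fail fast if the thrift URI is wrong.
spark.catalog.listDatabases().show()        // should include "default"
spark.catalog.listTables("default").show()  // same tables as "show tables"
```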