[Original] Uncle's Experience Sharing (65): Spark cannot read Hive tables


Spark 2.4.3

Steps for Spark to read a Hive table:

1) hive-site.xml

Place hive-site.xml under $SPARK_HOME/conf.
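At a minimum the file has to tell Spark where the Hive metastore lives. A minimal sketch of hive-site.xml (the metastore host and port are placeholders, not values from the original setup; the warehouse dir matches the logs further below):

    <configuration>
      <property>
        <name>hive.metastore.uris</name>
        <value>thrift://metastore-host:9083</value>
      </property>
      <property>
        <name>hive.metastore.warehouse.dir</name>
        <value>/user/hive/warehouse</value>
      </property>
    </configuration>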

2) enableHiveSupport

SparkSession.builder.enableHiveSupport().getOrCreate()

3) Test code

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SparkSession

    val sparkConf = new SparkConf().setAppName(getName)   // getName: app-name helper defined elsewhere in the original code
    val sc = new SparkContext(sparkConf)                   // the SparkContext is created first, before the SparkSession
    val spark = SparkSession.builder.config(sparkConf).enableHiveSupport().getOrCreate()
    spark.sql("show databases").rdd.foreach(println)

After submitting the job with $SPARK_HOME/bin/spark-submit, it turned out that the Hive databases could not be read. The relevant logs:

19/05/31 13:11:31 WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
19/05/31 13:11:31 INFO SharedState: loading hive config file: file:/export/spark-2.4.3-bin-hadoop2.6/conf/hive-site.xml
19/05/31 13:11:31 INFO SharedState: spark.sql.warehouse.dir is not set, but hive.metastore.warehouse.dir is set. Setting spark.sql.warehouse.dir to the value of hive.metastore.warehouse.dir ('/user/hive/warehouse').
19/05/31 13:11:31 INFO SharedState: Warehouse path is '/user/hive/warehouse'.
19/05/31 13:11:31 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint

This shows that hive-site.xml is being picked up.
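A quick way to double-check what the running session actually resolved (a small sketch for debugging, not part of the original test):

    // Warehouse directory the session resolved; should match hive-site.xml
    println(spark.conf.get("spark.sql.warehouse.dir"))
    // Databases visible through the catalog API
    spark.catalog.listDatabases().show(truncate = false)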

Testing further, both $SPARK_HOME/bin/spark-sql and $SPARK_HOME/bin/spark-shell can see the Hive databases just fine, which is pretty baffling, isn't it?

The class launched by $SPARK_HOME/bin/spark-shell is org.apache.spark.repl.Main:

"${SPARK_HOME}"/bin/spark-submit --class org.apache.spark.repl.Main --name "Spark shell" "$@"

Following the org.apache.spark.repl.Main code:

...
      val builder = SparkSession.builder.config(conf)
      if (conf.get(CATALOG_IMPLEMENTATION.key, "hive").toLowerCase(Locale.ROOT) == "hive") {
        if (SparkSession.hiveClassesArePresent) {
          // In the case that the property is not set at all, builder's config
          // does not have this value set to 'hive' yet. The original default
          // behavior is that when there are hive classes, we use hive catalog.
          sparkSession = builder.enableHiveSupport().getOrCreate()
          logInfo("Created Spark session with Hive support")
        } else {
          // Need to change it back to 'in-memory' if no hive classes are found
          // in the case that the property is set to hive in spark-defaults.conf
          builder.config(CATALOG_IMPLEMENTATION.key, "in-memory")
          sparkSession = builder.getOrCreate()
          logInfo("Created Spark session")
        }
      } else {
        // In the case that the property is set but not to 'hive', the internal
        // default is 'in-memory'. So the sparkSession will use in-memory catalog.
        sparkSession = builder.getOrCreate()
        logInfo("Created Spark session")
      }
      sparkContext = sparkSession.sparkContext
      sparkSession
...

This differs somewhat from the test code. The key is the second-to-last line: here the SparkSession is created first, and the SparkContext is then obtained from the SparkSession. Also note the earlier WARN-level log:

19/05/31 13:11:31 WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
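My current reading (not verified yet, hence the "to be continued" below) is that getOrCreate() reuses the already-running SparkContext and deliberately does not update its SparkConf, so the spark.sql.catalogImplementation=hive setting added by enableHiveSupport() never reaches the context's conf, and the session falls back to the in-memory catalog. A small sketch to observe this, assuming a local run:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SparkSession

    val conf = new SparkConf().setMaster("local[*]").setAppName("catalog-check")
    val sc = new SparkContext(conf)   // created before the SparkSession, as in the failing test code
    val spark = SparkSession.builder.config(conf).enableHiveSupport().getOrCreate()
    // The existing context's conf was not updated, so this prints None and the in-memory catalog is used
    println(spark.sparkContext.getConf.getOption("spark.sql.catalogImplementation"))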

Modify the test code:

    import org.apache.spark.SparkConf
    import org.apache.spark.sql.SparkSession

    val sparkConf = new SparkConf().setAppName(getName)
    //val sc = new SparkContext(sparkConf)
    val spark = SparkSession.builder.config(sparkConf).enableHiveSupport().getOrCreate()
    val sc = spark.sparkContext   // obtain the SparkContext from the SparkSession instead of creating it first
    spark.sql("show databases").rdd.foreach(println)
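
With this ordering, getOrCreate() builds the SparkContext itself from the builder's options, so, as far as I can tell, spark.sql.catalogImplementation=hive lands in the context's SparkConf and the Hive catalog is selected. A quick sanity check (my addition, not from the original test):

    // Should print Some(hive) when the SparkSession created the context itself
    println(spark.sparkContext.getConf.getOption("spark.sql.catalogImplementation"))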

This time it does work. I'll dig into the detailed reason when I have time; to be continued.

 

