Spark 2.1 JSON operations (save/read)


Building configuration info:

case class BuildingConfig(buildingid: String, building_height: Long, gridcount: Long, gis_display_name: String, wear_loss: Double, path_loss: Double) extends Serializable
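As a quick sanity check, the case class can be instantiated on its own; a minimal sketch with made-up values (field names match the JSON keys that Spark will write later):

```scala
// Self-contained copy of the case class from above.
case class BuildingConfig(buildingid: String, building_height: Long, gridcount: Long,
                          gis_display_name: String, wear_loss: Double, path_loss: Double) extends Serializable

// Hypothetical sample record; all values are invented for illustration.
val cfg = BuildingConfig("B001", 25L, 12L, "SampleTower", 3.5, 102.7)
println(cfg.buildingid)      // B001
println(cfg.building_height) // 25
```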

Writing a JSON file to HDFS:

 // Requires `import spark.implicits._` in scope for the Encoder used by .map.
 sql(
      s"""|select buildingid,
          |height,
          |gridcount,
          |collect_list(gis_display_name)[0] as gis_display_name,
          |avg(wear_loss) as wear_loss,
          |avg(path_loss) as path_loss
          |from
          |xxx
          |""".stripMargin)
      .map(s => BuildingConfig(s.getAs[String]("buildingid"), s.getAs[Int]("height"), s.getAs[Long]("gridcount"), s.getAs[String]("gis_display_name"), s.getAs[Double]("wear_loss"), s.getAs[Double]("path_loss")))
      .toDF.write.format("json").mode(SaveMode.Overwrite).save(s"/user/my/buildingconfigjson/${p_city}")
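Spark's JSON writer produces JSON Lines output: one JSON object per row, one per line, in part files under the target directory. A minimal sketch of what one written record looks like, rendered by hand here without Spark (the `toJsonLine` helper and all values are invented for illustration):

```scala
// Self-contained copy of the case class from above.
case class BuildingConfig(buildingid: String, building_height: Long, gridcount: Long,
                          gis_display_name: String, wear_loss: Double, path_loss: Double) extends Serializable

// Hand-rolled rendering of one record in the JSON Lines shape Spark writes.
def toJsonLine(c: BuildingConfig): String =
  s"""{"buildingid":"${c.buildingid}","building_height":${c.building_height},""" +
  s""""gridcount":${c.gridcount},"gis_display_name":"${c.gis_display_name}",""" +
  s""""wear_loss":${c.wear_loss},"path_loss":${c.path_loss}}"""

val line = toJsonLine(BuildingConfig("B001", 25L, 12L, "SampleTower", 3.5, 102.7))
println(line)
// {"buildingid":"B001","building_height":25,"gridcount":12,"gis_display_name":"SampleTower","wear_loss":3.5,"path_loss":102.7}
```

Note that key order within a line may differ in real Spark output (the schema printed in the next section lists fields alphabetically), but each line is a complete, independently parseable object.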

Reading the JSON file back from HDFS:

 /**
      * scala> buildingConfig.printSchema
      * root
      * |-- building_height: long (nullable = true)
      * |-- buildingid: string (nullable = true)
      * |-- gis_display_name: string (nullable = true)
      * |-- gridcount: long (nullable = true)
      * |-- path_loss: double (nullable = true)
      * |-- wear_loss: double (nullable = true)
      **/
    // Requires `import spark.implicits._` in scope for the Encoder used by .map.
    spark.read.json(s"/user/my/buildingconfigjson/${p_city}")
      .map(s => BuildingConfig(s.getAs[String]("buildingid"), s.getAs[Long]("building_height"), s.getAs[Long]("gridcount"), s.getAs[String]("gis_display_name"), s.getAs[Double]("wear_loss"), s.getAs[Double]("path_loss")))
      .createOrReplaceTempView("building_scene_config")

 

