Parsing JSON with Spark


----

import org.apache.spark.{SparkConf, SparkContext}

import scala.util.parsing.json.JSON

object JSONParse {
  def main(args: Array[String]): Unit = {
    val inputFileName = "file:///Users/walker/learn/mycode/spark/test_data/people.json"

    val conf = new SparkConf().setAppName("JSONParse").setMaster("local")
    val sc = new SparkContext(conf)
    val jsonStrRDD = sc.textFile(inputFileName)
    val parsedResult = jsonStrRDD.map(JSON.parseFull(_)) // same as jsonStrRDD.map(line => JSON.parseFull(line))
    parsedResult.foreach {
      // A successful parse returns Some(map: Map[String, Any]); note the type
      // argument is unchecked at runtime because of erasure, which is fine here.
      case Some(map: Map[String, Any]) => println(map)
      // A failed parse returns None.
      case None => println("Parsing failed")
    }
  }
}


Original JSON data (the last two lines are deliberately malformed, so they parse to None):

{"name":"Michael"}
{"name":"Andy", "age":30}
{"name":"Justin", "age":19}

{"name":"Justin", "age":19,hello}
{57657:12345, "age":19}
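To see what JSON.parseFull returns for each of these lines without spinning up Spark, the parsing step can be exercised on its own. This is a minimal sketch (ParseFullDemo is a hypothetical name) and it assumes a Scala version where scala.util.parsing.json.JSON is still on the classpath; the class is deprecated and lives in the separate scala-parser-combinators module in newer Scala versions.

```scala
import scala.util.parsing.json.JSON

object ParseFullDemo {
  def main(args: Array[String]): Unit = {
    val lines = Seq(
      """{"name":"Michael"}""",                 // valid
      """{"name":"Andy", "age":30}""",          // valid; JSON numbers come back as Double
      """{"name":"Justin", "age":19,hello}""",  // invalid: bare token
      """{57657:12345, "age":19}"""             // invalid: non-string key
    )
    lines.foreach { line =>
      JSON.parseFull(line) match {
        case Some(map: Map[String, Any]) => println(map)
        case _                           => println("Parsing failed")
      }
    }
  }
}
```

One detail worth noticing: JSON.parseFull maps every JSON number to Double by default, so "age":30 comes back as 30.0 in the printed Map.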
