Troubleshooting IDEA connecting to a Spark cluster: Caused by: java.lang.ClassCastException


cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.sql.execution.aggregate.SortAggregateExec.aggregateExpressions of type scala.collection.Seq in instance of org.apache.spark.sql.execution.aggregate.SortAggregateExec
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 8 in stage 4.0 failed 4 times, most recent failure: Lost task 8.3 in stage 4.0 (TID 332, 172.16.43.200, executor 1): java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.List$SerializationProxy to field org.apache.spark.sql.execution.aggregate.SortAggregateExec.aggregateExpressions of type scala.collection.Seq in instance of org.apache.spark.sql.execution.aggregate.SortAggregateExec

  This error occurs when you only set the Spark master URL in SparkConf but do not call setJars: the executors never receive your application jar, so they cannot deserialize the driver's classes and fail with the ClassCastException above. The fix is to pass your application jar via setJars, for example:

// On Windows, point Hadoop at winutils.exe before creating the context
System.setProperty("hadoop.home.dir", "E:\\winutils")

val conf = new SparkConf()
  .setAppName("DetailRatio")
  .setMaster("spark://172.xx.xx.xx:7077") // use .setMaster("local") for local debugging
  // Ship the application jar to the executors so they can load the driver's classes
  .setJars(List("E:\\vense_work\\venseData\\out\\artifacts\\DetailRatio_jar\\venseData.jar"))
  //.set("spark.submit.deployMode", "client")
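As an alternative to hard-coding setJars in the driver, the same jar can be supplied through Spark's configuration. A minimal sketch (the master URL and jar path below are placeholders matching this example, not values you should copy):

```
# spark-defaults.conf — equivalent to calling setJars in code
spark.master  spark://172.xx.xx.xx:7077
spark.jars    E:/vense_work/venseData/out/artifacts/DetailRatio_jar/venseData.jar
```

When the application is launched with spark-submit instead of directly from the IDE, the submitted jar is distributed to the executors automatically, which is why this error typically only appears when running the driver from IDEA.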

  

