After installing Spark (spark-2.0.0-bin-hadoop2.6), running pyspark in the Ubuntu terminal failed with the error: Exception in thread "main" java.lang.UnsupportedClassVersionError. A search on Baidu turned up many blog posts explaining that "the problem is that you compiled with/for Java 8, but you are running Spark on Java 7 or older". In other words, the Spark class files target a newer JVM than the one actually running, so the older JVM refuses to load them. The fix was to download and install Java 8 (note: it must be installed on every node of the cluster). After that, running pyspark in the Ubuntu terminal succeeded:
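For reference, a quick way to confirm the mismatch is to check which JVM is on the PATH, and then, after installing Java 8, point Spark at it explicitly. This is a minimal sketch: the Java install path (/usr/lib/jvm/java-8-oracle) and the $SPARK_HOME variable are assumptions, so adjust them to your own locations.

    # Check the JVM on the PATH; UnsupportedClassVersionError typically reports
    # "major version 52" (Java 8 class files) when run on Java 7 (major 51).
    java -version

    # Assumed install path for Java 8 -- substitute your actual path.
    export JAVA_HOME=/usr/lib/jvm/java-8-oracle
    export PATH=$JAVA_HOME/bin:$PATH

    # Spark also reads JAVA_HOME from conf/spark-env.sh, so setting it there
    # (on every node) makes the change stick for the whole cluster.
    echo 'export JAVA_HOME=/usr/lib/jvm/java-8-oracle' >> $SPARK_HOME/conf/spark-env.sh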
For how to download and install Java 8 on Ubuntu 14, see zhuxp1's blog: https://blog.csdn.net/zhuxiaoping54532/article/details/70158200 (a rough sketch of one common method also follows below).
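In case that link ever goes stale: at the time, a common way to install Oracle Java 8 on Ubuntu 14.04 was the webupd8team PPA. This is only a sketch of that approach, not necessarily the exact steps in the linked post, and it must be repeated on every node:

    # Add the (then widely used) webupd8team PPA and install Oracle Java 8.
    sudo add-apt-repository ppa:webupd8team/java
    sudo apt-get update
    sudo apt-get install oracle-java8-installer

    # Verify the active Java version afterwards.
    java -version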