Spark distributed environment --- slave node fails to start (solved)


soyo@soyo-VPCCB3S1C:~$ start-slaves.sh
soyo-slave01: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local2/spark/logs/spark-soyo-org.apache.spark.deploy.worker.Worker-1-soyo-slave01.out
soyo-slave01: failed to launch: nice -n 0 /usr/local2/spark/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://soyo-VPCCB3S1C:7077
soyo-slave01:   /usr/local2/spark/bin/spark-class: line 71: /usr/lib/jvm/java-8-openjdk-amd64/bin/java: No such file or directory
soyo-slave01: full log in /usr/local2/spark/logs/spark-soyo-org.apache.spark.deploy.worker.Worker-1-soyo-slave01.out
Solution:
The worker on soyo-slave01 was trying to launch Java from a path that does not exist on that machine. Fix the JDK installation path in ~/.bashrc on the soyo-slave01 node (on this Ubuntu 14.04 box the JDK was not the default OpenJDK install, so the configured path was stale). After updating it, the worker starts normally.
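A minimal sketch of the fix, assuming a manually installed JDK (the path /usr/local/jdk1.8.0 below is an assumption; substitute the real install location on soyo-slave01):

```shell
# In ~/.bashrc on soyo-slave01: point JAVA_HOME at the JDK that is
# actually installed there (this path is an assumption, adjust it).
export JAVA_HOME=/usr/local/jdk1.8.0
export PATH=$JAVA_HOME/bin:$PATH
```

Then reload the environment and restart the worker from the master:

```shell
source ~/.bashrc          # on soyo-slave01, pick up the new JAVA_HOME
$JAVA_HOME/bin/java -version   # sanity check: the binary must exist
start-slaves.sh           # on the master, relaunch the workers
```

Alternatively, JAVA_HOME can be set per node in conf/spark-env.sh, which Spark's launch scripts read before starting the worker; that avoids depending on each user's shell configuration.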




