The error message was as follows:
spark02: failed to launch: nice -n 0 /usr/local/softwareInstall/spark-2.1.1-bin-hadoop2.7/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://spark01:7077
spark03: failed to launch: nice -n 0 /usr/local/softwareInstall/spark-2.1.1-bin-hadoop2.7/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://spark01:7077
spark02: JAVA_HOME is not set
spark02: full log in /usr/local/softwareInstall/spark-2.1.1-bin-hadoop2.7/logs/spark-spark-org.apache.spark.deploy.worker.Worker-1-spark02.out
spark03: JAVA_HOME is not set
spark03: full log in /usr/local/softwareInstall/spark-2.1.1-bin-hadoop2.7/logs/spark-spark-org.apache.spark.deploy.worker.Worker-1-spark03.out
I was baffled at first. I decided to search online for help, and eventually found a solution:
The solution was quite easy and straightforward: just add export JAVA_HOME=/usr/java/default in /root/.bashrc, and the Spark services started successfully from the root user without the "JAVA_HOME is not set" error. Hope it helps somebody facing the same problem.
In other words, append the JDK path to the end of /root/.bashrc under the root user on every worker node.
So I tried exactly that, but to my surprise the problem was not solved; the same error kept appearing, and after searching a while longer I still could not find an answer.
Then it occurred to me: the advice online was to add the Java path under the root user, but I was not running Spark as root. So I added the Java path to the .bashrc file in the home directory of the user I was actually using, and the problem was solved.
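The fix above can be sketched as the following commands, run as the non-root user on each worker node. The JDK path /usr/java/default is taken from the quoted solution and is only a placeholder; substitute the directory where your JDK is actually installed.

```shell
# Placeholder JDK path -- replace with your actual JDK install directory.
JDK_PATH=/usr/java/default

# Append the exports to the current (non-root) user's ~/.bashrc,
# so every new login shell on this worker picks them up.
cat >> ~/.bashrc <<EOF
export JAVA_HOME=$JDK_PATH
export PATH=\$JAVA_HOME/bin:\$PATH
EOF

# Also export it in the current shell so start-all.sh works immediately.
export JAVA_HOME=$JDK_PATH
echo "JAVA_HOME=$JAVA_HOME"
```

Repeat this on every worker (spark02, spark03), then rerun sbin/start-all.sh from the master.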
-- Reference: https://blog.csdn.net/Abandon_Sun/article/details/76686398