Fixing a startup problem in Spark standalone cluster mode


After configuring a Spark standalone cluster and running sbin/start-all.sh, one worker failed to start. Checking the log file under spark/logs on that worker showed the following error:

 

20/04/01 02:46:08 WARN Utils: Service 'sparkWorker' could not bind on a random free port. You may check whether configuring an appropriate binding address.
    ... (the warning above appears 16 times in total, once per bind retry) ...
20/04/01 02:46:08 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[main,5,main]
java.net.BindException: Cannot assign requested address: Service 'sparkWorker' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkWorker' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:433)
at sun.nio.ch.Net.bind(Net.java:425)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:128)
at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:558)
at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1283)
at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:501)
at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:486)
at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:989)
at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:254)
at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:364)
at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
at java.lang.Thread.run(Thread.java:748)

 

From the BindException above, we can tell this is an address-binding error. First, check the worker's IP address with ifconfig; it reports 192.28.12.243.
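The comparison between the resolved hostname and the real interface address can be sketched as a quick diagnostic (assuming a Linux worker where `getent` and `hostname -I` are available; the commands only read state and change nothing):

```shell
# What this worker's hostname resolves to (driven by /etc/hosts here):
resolved=$(getent hosts "$(hostname)" | awk '{print $1}')
# What addresses are actually configured on the network interfaces:
actual=$(hostname -I 2>/dev/null)
echo "hostname resolves to: ${resolved:-<unresolved>}"
echo "interface addresses:  ${actual:-<none>}"
# If the resolved address is not among the interface addresses, the
# 'sparkWorker' service cannot bind to it and fails with BindException.
```

On the worker described in this article, the resolved address (179.28.120.243) would not appear among the interface addresses (192.28.12.243), which is exactly the mismatch behind the bind failure.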

Then check the worker's /etc/hosts file; its entry is configured as:

179.28.120.243   dxhost8002

So the root cause is a wrong IP address in the hosts file, which made the worker fail to start. Correct the hosts entry to the real address, 192.28.12.243, restart Spark, and everything comes back up normally.
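As a complementary safeguard against this class of problem, the bind address can be pinned explicitly instead of relying on hostname resolution. Spark's standard per-node configuration file `conf/spark-env.sh` supports the `SPARK_LOCAL_IP` variable for this; the address below is the example IP from this article and must be replaced with each worker's own address:

```shell
# conf/spark-env.sh on the affected worker.
# Bind Spark daemons on this machine to an explicit address
# (192.28.12.243 is this article's example IP, not a universal value):
SPARK_LOCAL_IP=192.28.12.243
```

With this set, a stale /etc/hosts entry can no longer steer the worker toward an address it does not own, though fixing the hosts file is still the cleaner long-term remedy.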

 

