Spark:java.net.BindException: Address already in use: Service 'SparkUI' failed after 16 retries!


This error occurred when submitting and running multiple Spark jobs at the same time.

java.net.BindException: Address already in use: Service 'SparkUI' failed after 16 retries!
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:444)
    at sun.nio.ch.Net.bind(Net.java:436)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
    at org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
    at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
    at org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
    at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
    at org.eclipse.jetty.server.Server.doStart(Server.java:293)
    at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
    at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:194)
    at org.apache.spark.ui.JettyUtils$$anonfun$2.apply(JettyUtils.scala:204)
    at org.apache.spark.ui.JettyUtils$$anonfun$2.apply(JettyUtils.scala:204)
    at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1676)
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
    at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1667)
    at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:204)
    at org.apache.spark.ui.WebUI.bind(WebUI.scala:102)
    at org.apache.spark.SparkContext$$anonfun$10.apply(SparkContext.scala:269)
    at org.apache.spark.SparkContext$$anonfun$10.apply(SparkContext.scala:269)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:269)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
    at com.Spark.FileInput.main(FileInput.java:80)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Cause:

Each Spark application binds its own SparkUI port, starting from the default of 4040. If that port is already taken, Spark retries with the next port number (4041, 4042, and so on). The default retry limit is 16; once all 16 retries fail, the application gives up and the job does not run.
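The bind-and-increment behavior described above can be sketched roughly as follows. This is only an illustration of the retry loop, not Spark's actual implementation; the class name UiPortProbe and the probing logic are made up for demonstration, with 4040 and 16 matching the defaults mentioned above.

    import java.io.IOException;
    import java.net.InetSocketAddress;
    import java.net.ServerSocket;

    // Illustrative sketch of the SparkUI port-retry behavior (not Spark's real code).
    public class UiPortProbe {
        public static void main(String[] args) throws IOException {
            int basePort = 4040;   // default spark.ui.port
            int maxRetries = 16;   // default spark.port.maxRetries
            for (int attempt = 0; attempt <= maxRetries; attempt++) {
                int port = basePort + attempt;
                try (ServerSocket socket = new ServerSocket()) {
                    // Try to bind; succeeds only if the port is free.
                    socket.bind(new InetSocketAddress(port));
                    System.out.println("SparkUI would bind to port " + port);
                    return;
                } catch (IOException e) {
                    System.out.println("Port " + port + " already in use, trying the next one");
                }
            }
            // After exhausting the retries, Spark gives up with the error shown above.
            throw new IOException("Service 'SparkUI' failed after " + maxRetries + " retries!");
        }
    }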

Solution

Any one of the following raises the retry limit (100 is just an example value); a minimal Java sketch of the first option follows this list.
- When initializing SparkConf, add conf.set("spark.port.maxRetries", "100").
- When submitting with spark-submit, pass the setting on the command line, e.g. --conf spark.port.maxRetries=100.
- Add spark.port.maxRetries 100 to spark-defaults.conf.
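A minimal sketch of the SparkConf approach in Java. The class name mirrors the FileInput class from the stack trace, but the application name and job body are placeholders.

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class FileInput {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf()
                    .setAppName("FileInput")
                    // Allow up to 100 port retries instead of the default 16,
                    // so concurrently running jobs can each find a free SparkUI port.
                    .set("spark.port.maxRetries", "100");

            JavaSparkContext sc = new JavaSparkContext(conf);
            // ... job logic ...
            sc.stop();
        }
    }

The same setting can instead be supplied at submission time (--conf spark.port.maxRetries=100) or cluster-wide in spark-defaults.conf, as listed above, without touching the application code.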

 

