Hadoop fails to start


Neither the NameNode nor the DataNode comes up. `jps` shows no process other than itself.
The .out file contains the following:

  2020-10-19 20:10:50,206 ERROR [main] namenode.NameNode (NameNode.java:1587) - Failed to start namenode.
  java.net.SocketException: Permission denied
      at sun.nio.ch.Net.bind0(Native Method) ~[?:1.8.0_191]
      at sun.nio.ch.Net.bind(Net.java:433) ~[?:1.8.0_191]
      at sun.nio.ch.Net.bind(Net.java:425) ~[?:1.8.0_191]
      at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223) ~[?:1.8.0_191]
      at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74) ~[?:1.8.0_191]
      at org.mortbay.jetty.nio.SelectChannelConnector.open(SelectChannelConnector.java:216) ~[jetty-6.1.26.jar:6.1.26]
      at org.apache.hadoop.http.HttpServer2.openListeners(HttpServer2.java:934) ~[hadoop-common-2.7.7.jar:?]
      at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:876) ~[hadoop-common-2.7.7.jar:?]
      at org.apache.hadoop.hdfs.server.namenode.NameNodeHttpServer.start(NameNodeHttpServer.java:142) ~[hadoop-hdfs-2.7.7.jar:?]
      at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:761) ~[hadoop-hdfs-2.7.7.jar:?]
      at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:640) ~[hadoop-hdfs-2.7.7.jar:?]
      at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:820) ~[hadoop-hdfs-2.7.7.jar:?]
      at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:804) ~[hadoop-hdfs-2.7.7.jar:?]
      at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1516) ~[hadoop-hdfs-2.7.7.jar:?]
      at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1582) [hadoop-hdfs-2.7.7.jar:?]

The relevant part of the .log file reads:

  2020-10-19 20:10:50,024 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
  2020-10-19 20:10:50,168 INFO org.apache.hadoop.http.HttpServer2: Added filter 'org.apache.hadoop.hdfs.web.AuthFilter' (class=org.apache.hadoop.hdfs.web.AuthFilter)
  2020-10-19 20:10:50,169 INFO org.apache.hadoop.http.HttpServer2: addJerseyResourcePackage: packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources, pathSpec=/webhdfs/v1/*
  2020-10-19 20:10:50,200 INFO org.apache.hadoop.http.HttpServer2: HttpServer.start() threw a non Bind IOException
  java.net.SocketException: Permission denied
      at sun.nio.ch.Net.bind0(Native Method)
      at sun.nio.ch.Net.bind(Net.java:433)
      at sun.nio.ch.Net.bind(Net.java:425)
      at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
      at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
      at org.mortbay.jetty.nio.SelectChannelConnector.open(SelectChannelConnector.java:216)
      at org.apache.hadoop.http.HttpServer2.openListeners(HttpServer2.java:934)
      at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:876)
      at org.apache.hadoop.hdfs.server.namenode.NameNodeHttpServer.start(NameNodeHttpServer.java:142)
      at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:761)
      at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:640)
      at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:820)
      at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:804)
      at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1516)
      at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1582)
  2020-10-19 20:10:50,202 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Stopping NameNode metrics system...
  2020-10-19 20:10:50,203 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system stopped.
  2020-10-19 20:10:50,203 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system shutdown complete.
  2020-10-19 20:10:50,217 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1

At this point I still had no idea where the problem was, and searching the web turned up nothing. Only one official FAQ entry mentions this error; its explanation is that the program tried to use a port below 1024. But I went through every configuration file, and nothing anywhere uses a port below 1024.
The problem had started right after I installed Docker Desktop on this machine, so I suspected Docker was somehow involved. With no other leads, I decided to uninstall Docker for Windows and try again.


Uninstalling Docker didn't help. After several days of this, docker-hive inside Docker also failed to start with an error. Searching for that error led me to a Stack Overflow answer saying that Windows reserves certain port ranges and refuses to let anything bind to them, and it gave this command:

netsh interface ipv4 show excludedportrange protocol=tcp

Running the command produced a list of excluded port ranges.

(Screenshot of the `netsh` output omitted; the result shown there was captured after my fix.)
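To see why those ranges matter, you can check whether any of the ports Hadoop binds falls inside an excluded block. The sketch below is hypothetical: the sample ranges are made up, and on a real machine you would feed in the actual output of the `netsh` command above. Hadoop 2.x's default NameNode web UI port, 50070 (`dfs.namenode.http-address`), is exactly the kind of port that can land inside a reserved block.

```python
import re

# Hypothetical output of:
#   netsh interface ipv4 show excludedportrange protocol=tcp
# (the ranges here are invented for illustration)
SAMPLE_NETSH_OUTPUT = """
Protocol tcp Port Exclusion Ranges

Start Port    End Port
----------    --------
     49152       49251
     50000       50099
     50100       50199

* - Administered port exclusions.
"""

def parse_excluded_ranges(netsh_output: str) -> list[tuple[int, int]]:
    """Extract (start, end) pairs from the two-column netsh listing."""
    ranges = []
    for line in netsh_output.splitlines():
        # Each data row is two integers, optionally flagged with '*'
        m = re.match(r"\s*(\d+)\s+(\d+)\s*\*?\s*$", line)
        if m:
            ranges.append((int(m.group(1)), int(m.group(2))))
    return ranges

def is_excluded(port: int, ranges: list[tuple[int, int]]) -> bool:
    """True if the port lies inside any excluded range."""
    return any(start <= port <= end for start, end in ranges)

ranges = parse_excluded_ranges(SAMPLE_NETSH_OUTPUT)
print(is_excluded(50070, ranges))  # NameNode HTTP default (Hadoop 2.x)
print(is_excluded(8020, ranges))   # NameNode RPC default
```

With these invented ranges, 50070 is excluded while 8020 is not, which matches the symptom: the RPC side never gets far because the HTTP server fails first with "Permission denied" at bind time.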

I then searched for "administered port exclusions" and found an article that finally made everything clear!

Before installing Docker for Windows I had enabled Hyper-V, and the system reserves a block of ports for Hyper-V. After following that article's advice (changing the dynamic port range), Hadoop started normally again.
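For reference, the kind of change that article describes can be made with `netsh` from an elevated prompt. The exact values are an assumption here; the conventional choice is the IANA dynamic range, picked so that it no longer overlaps the ports your Hadoop services bind. A reboot may be needed for the new exclusions to take effect.

```
netsh int ipv4 set dynamicport tcp start=49152 num=16384
netsh int ipv6 set dynamicport tcp start=49152 num=16384
```

Afterwards, re-run `netsh interface ipv4 show excludedportrange protocol=tcp` to confirm your Hadoop ports are no longer inside a reserved block.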


