1. Domain name resolution
Temporary failure in name resolution.
vim /etc/sysconfig/network
Check the hostname there.
vim /etc/hosts
Map the IP address to the hostname, e.g.:
192.168.60.132 centos
Once the entry is correct, resolution works.
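A minimal sketch of the check-and-fix step, run against a scratch copy of /etc/hosts so it is safe to try (the IP and hostname are the example values above):

```shell
# Work on a scratch copy of /etc/hosts so the sketch is safe to run.
hosts=$(mktemp)
cp /etc/hosts "$hosts"

entry="192.168.60.132 centos"
# Append the mapping only if that exact line is not already present.
grep -qxF "$entry" "$hosts" || echo "$entry" >> "$hosts"
grep -F "$entry" "$hosts"
```

To apply it for real, run the same `grep || echo` against /etc/hosts itself as root.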
2. Startup errors
Starting namenodes on [localhost]
ERROR: Attempting to operate on hdfs namenode as root
ERROR: but there is no HDFS_NAMENODE_USER defined. Aborting operation.
Starting datanodes
ERROR: Attempting to operate on hdfs datanode as root
ERROR: but there is no HDFS_DATANODE_USER defined. Aborting operation.
Starting secondary namenodes [bogon]
ERROR: Attempting to operate on hdfs secondarynamenode as root
ERROR: but there is no HDFS_SECONDARYNAMENODE_USER defined. Aborting operation.

Fix 1: edit both sbin/start-dfs.sh and sbin/stop-dfs.sh and add, near the top of each file:
HDFS_DATANODE_USER=root
HDFS_DATANODE_SECURE_USER=hdfs
HDFS_NAMENODE_USER=root
HDFS_SECONDARYNAMENODE_USER=root

Fix 2: edit both sbin/start-yarn.sh and sbin/stop-yarn.sh and add, near the top of each file:
YARN_RESOURCEMANAGER_USER=root
HADOOP_SECURE_DN_USER=yarn
YARN_NODEMANAGER_USER=root
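The variables have to be defined before the scripts' start-up logic runs, which is why they go near the top. A sketch of doing that insertion mechanically, demonstrated on a scratch script standing in for sbin/start-dfs.sh rather than the real file:

```shell
# Scratch stand-in for sbin/start-dfs.sh.
f=$(mktemp)
printf '#!/usr/bin/env bash\necho "starting dfs"\n' > "$f"

# Insert the user variables right after the shebang line.
tmp=$(mktemp)
{
  head -n 1 "$f"
  printf '%s\n' \
    HDFS_DATANODE_USER=root \
    HDFS_DATANODE_SECURE_USER=hdfs \
    HDFS_NAMENODE_USER=root \
    HDFS_SECONDARYNAMENODE_USER=root
  tail -n +2 "$f"
} > "$tmp" && mv "$tmp" "$f"

cat "$f"
```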
3. Port problem
localhost: ssh: connect to host localhost port 22: Cannot assign requested address
cd /etc/ssh
vim sshd_config
Add:
Port 22
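A sketch of that edit, assuming the directive is merely commented out, applied to a scratch file standing in for /etc/ssh/sshd_config:

```shell
# Scratch stand-in for /etc/ssh/sshd_config (Port is commented out by default).
conf=$(mktemp)
printf '#Port 22\n#AddressFamily any\n' > "$conf"

# Uncomment the Port directive so sshd listens on 22.
sed -i 's/^#Port 22$/Port 22/' "$conf"
grep '^Port' "$conf"
```

Remember to restart sshd after editing the real file.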
4. Failed to get D-Bus connection: Operation not permitted
Fix: run the container privileged with systemd as init:
docker run --privileged -ti -e "container=docker" -v /sys/fs/cgroup:/sys/fs/cgroup hadoop-master /usr/sbin/init
5. Service error: sshd re-exec requires execution with an absolute path
This appears when starting the sshd service. Starting sshd with an absolute path instead fails with:
Could not load host key: /etc/ssh/ssh_host_key
Could not load host key: /etc/ssh/ssh_host_rsa_key
Could not load host key: /etc/ssh/ssh_host_dsa_key
Disabling protocol version 1. Could not load host key
Disabling protocol version 2. Could not load host key
sshd: no hostkeys available -- exiting
Fix: generate the missing host keys, then start sshd:
# ssh-keygen -t dsa -f /etc/ssh/ssh_host_dsa_key
# ssh-keygen -t rsa -f /etc/ssh/ssh_host_rsa_key
# /usr/sbin/sshd
After running this, sshd still reports:
Could not load host key: /etc/ssh/ssh_host_ecdsa_key
Could not load host key: /etc/ssh/ssh_host_ed25519_key
Fix: generate those keys too. Note that the key type passed to -t must match the file name (the original notes reused -t dsa and -t rsa here, which produces keys of the wrong type):
# ssh-keygen -t ecdsa -f /etc/ssh/ssh_host_ecdsa_key
# ssh-keygen -t ed25519 -f /etc/ssh/ssh_host_ed25519_key
# /usr/sbin/sshd
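The same idea, sketched safely against a temporary directory; note that OpenSSH also provides `ssh-keygen -A`, which generates every missing default host key type under /etc/ssh in a single step:

```shell
# Generate one host key of each type into a scratch directory
# (the real files belong in /etc/ssh). -N "" sets an empty passphrase.
dir=$(mktemp -d)
for type in rsa ecdsa ed25519; do
  ssh-keygen -q -t "$type" -N "" -f "$dir/ssh_host_${type}_key"
done
ls "$dir"
```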
6. Error output as follows
WARNING: HADOOP_SECURE_DN_USER has been replaced by HDFS_DATANODE_SECURE_USER. Using value of HADOOP_SECURE_DN_USER.
Starting namenodes on [master]
master: /usr/hadoop/hadoop-3.2.0/libexec/hadoop-functions.sh: line 982: ssh: command not found
Starting datanodes
localhost: /usr/hadoop/hadoop-3.2.0/libexec/hadoop-functions.sh: line 982: ssh: command not found
Starting secondary namenodes [b982e2adc393]
b982e2adc393: /usr/hadoop/hadoop-3.2.0/libexec/hadoop-functions.sh: line 982: ssh: command not found
Starting resourcemanager
Starting nodemanagers
localhost: /usr/hadoop/hadoop-3.2.0/libexec/hadoop-functions.sh: line 982: ssh: command not found

Fix (two separate problems):
First, in sbin/start-dfs.sh and sbin/stop-dfs.sh, replace HADOOP_SECURE_DN_USER=hdfs with HDFS_DATANODE_SECURE_USER=hdfs, the new variable name the warning itself points to.
Second, "ssh: command not found": CentOS installs the ssh server by default but not the client. Check what is installed:
# rpm -qa | grep openssh
openssh-5.3p1-123.el6_9.x86_64
openssh-server-5.3p1-123.el6_9.x86_64
openssh-clients is missing, so install it with yum:
yum -y install openssh-clients
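The detection step can be scripted. In this sketch, $pkgs stands in for the output of `rpm -qa | grep openssh` shown above:

```shell
# Sample output of `rpm -qa | grep openssh` on the affected machine.
pkgs='openssh-5.3p1-123.el6_9.x86_64
openssh-server-5.3p1-123.el6_9.x86_64'

# The server package is present but the client package is not.
if ! printf '%s\n' "$pkgs" | grep -q '^openssh-clients'; then
  echo "openssh-clients missing: yum -y install openssh-clients"
fi
```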
7. docker: Error response from daemon: cgroups: cannot find cgroup mount destination: unknown.
No root-cause fix found; the container became reachable again after a reboot.
8. Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password)
vim /etc/ssh/sshd_config
Change the following settings:
RSAAuthentication yes
PubkeyAuthentication yes
PasswordAuthentication no
After editing, be sure to restart the ssh service:
service sshd restart
or, on systemd systems:
systemctl restart sshd.service
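A sketch of the three edits applied with GNU sed, shown on a scratch copy rather than the live /etc/ssh/sshd_config (back the real file up before editing it):

```shell
# Scratch stand-in for /etc/ssh/sshd_config with typical default values.
conf=$(mktemp)
printf '#RSAAuthentication yes\n#PubkeyAuthentication yes\nPasswordAuthentication yes\n' > "$conf"

# Enable key-based auth and disable password auth, whether or not
# each directive is currently commented out.
sed -i \
  -e 's/^#\?RSAAuthentication.*/RSAAuthentication yes/' \
  -e 's/^#\?PubkeyAuthentication.*/PubkeyAuthentication yes/' \
  -e 's/^#\?PasswordAuthentication.*/PasswordAuthentication no/' \
  "$conf"
cat "$conf"
```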
9. WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
The system's preinstalled glibc is version 2.12, while the Hadoop native library expects 2.14, so Hadoop falls back to the builtin Java classes and prints this warning. (Running hadoop checknative -a shows which native libraries actually load.)
There are two options. The first is to build and install glibc 2.14 just for Hadoop's use, which is somewhat risky.
The second is simply to suppress the warning in the log4j configuration: add the following line to /usr/local/hadoop-2.5.2/etc/hadoop/log4j.properties:
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR
10. MapReduce error: Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster
Run on the command line:
hadoop classpath
Then edit yarn-site.xml and add the following, pasting in the classpath the command printed:
<configuration>
  <property>
    <name>yarn.application.classpath</name>
    <value>paste the output of hadoop classpath here</value>
  </property>
</configuration>
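The splice can also be scripted. In this sketch, $hcp stands in for the real output of `hadoop classpath`; the path shown is an illustrative stand-in, not a value from the source:

```shell
# Stand-in for: hcp=$(hadoop classpath)
hcp='/usr/hadoop/hadoop-3.2.0/etc/hadoop:/usr/hadoop/hadoop-3.2.0/share/hadoop/common/*'

# Emit the property block to merge into yarn-site.xml.
cat <<EOF
<property>
  <name>yarn.application.classpath</name>
  <value>${hcp}</value>
</property>
EOF
```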