Kettle: errors when connecting to Hadoop


Writing files from Windows to HDFS with Kettle

One

2016/07/19 14:14:53 - Spoon - Starting job...
2016/07/19 14:14:53 - load_hdfs - Start of job execution
2016/07/19 14:14:53 - load_hdfs - Starting entry [Hadoop Copy Files]
2016/07/19 14:14:53 - Hadoop Copy Files - Starting ...
2016/07/19 14:14:53 - Hadoop Copy Files - Processing row, source file/folder: [file:///E:/weblogs_rebuild.txt/weblogs_rebuild.txt] ... destination file/folder: [hdfs://hadoop:8020/data]... wildcard: [^.*\.txt]
2016/07/19 14:14:53 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
2016/07/19 14:14:53 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
2016/07/19 14:14:53 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
2016/07/19 14:14:53 - Hadoop Copy Files - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : File System Exception: Could not copy "file:///E:/weblogs_rebuild.txt/weblogs_rebuild.txt" to "hdfs://hadoop:8020/data/weblogs_rebuild.txt".
2016/07/19 14:14:53 - Hadoop Copy Files - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : Caused by: Could not write to "hdfs://hadoop:8020/data/weblogs_rebuild.txt".
2016/07/19 14:14:53 - Hadoop Copy Files - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : Caused by: Permission denied: user=Administrator, access=WRITE, inode="/data/weblogs_rebuild.txt":root:hadoop:drwxr-xr-x
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:320)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1698)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1682)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1665)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:2517)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2452)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2335)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:623)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:397)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2045)
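
The key part is the Permission denied message: when Hadoop runs without Kerberos, the client simply sends the local OS login name as the HDFS user, so a job started from Windows shows up as user=Administrator, and that account has no write access to /data, which is owned by root:hadoop with mode drwxr-xr-x. A quick way to confirm the ownership on the cluster (a sketch; /data is the target directory taken from the log above):

    hadoop fs -ls -d /data    # shows owner, group and mode of the target directory itself
    hadoop fs -ls /data       # lists its contents and their owners, if any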

 

Some solutions suggested online:

1. Modify the Hadoop configuration file hdfs-site.xml on the server: set the HDFS permission-check property to false and restart Hadoop. I tried this, but after restarting the cluster from Ambari the value flipped back to true, most likely because Ambari manages hdfs-site.xml and rewrites it from its own configuration database on restart, so the change would have to be made through Ambari instead (see the snippet after this list).

2. Grant chmod 777 on the target directory; the error still occurred.

3. The fix that finally worked (see the note after this list):

hadoop fs -mkdir /user/Administrator

hadoop fs -chown Administrator:hdfs /user/Administrator
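
For reference, the property from approach 1 would look roughly like this in hdfs-site.xml (a sketch only; the property is dfs.permissions.enabled on Hadoop 2.x and dfs.permissions on older releases, and on an Ambari-managed cluster the change has to be made on the Ambari HDFS configuration page rather than by editing the file directly):

    <property>
        <name>dfs.permissions.enabled</name>
        <value>false</value>
    </property>

After running the two commands in approach 3, hadoop fs -ls -d /user/Administrator should list the new directory with Administrator as its owner, and the Kettle job then completes without the permission error.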

 

Two

2016/07/20 10:07:03 - Hadoop Copy Files - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : File System Exception: Could not copy "file:///E:/Test/red.txt" to "hdfs://hadoop:8020/kettle/red.txt".
2016/07/20 10:07:03 - Hadoop Copy Files - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : Caused by: Could not close the output stream for file "hdfs://hadoop:8020/kettle/red.txt".
2016/07/20 10:07:03 - Hadoop Copy Files - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : Caused by: Connection timed out: no further information

Cause: this error only appeared when running against the POWER server; the same job succeeded against the x86 server.
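
In general, a "Connection timed out" while closing the HDFS output stream means the client could reach the NameNode on port 8020 but could not open a connection to the DataNodes that actually receive the blocks, for example because their addresses are not resolvable or not routable from the Windows machine. A rough connectivity check (a sketch; the host name hadoop comes from the job above, and 50010 is the default Hadoop 2.x DataNode transfer port, so adjust both to the cluster in question):

    hdfs dfsadmin -report            # run on the cluster: shows the DataNode addresses the NameNode advertises
    telnet hadoop 8020               # NameNode RPC port, evidently reachable here
    telnet <datanode-host> 50010     # DataNode data-transfer port; a timeout here matches this error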

The detailed fix is in another post of mine, "Starting Kettle on Linux, and writing data from Kettle to HDFS on Linux and Windows (3)": http://www.cnblogs.com/womars/p/5718349.html

 

