Hadoop development environment: Eclipse on Windows + Ubuntu 13.04 in a VM + hadoop-1.1.2 + JDK 1.7
Today, running the Hadoop example WordCount program from Eclipse (with Eclipse on Windows) produced:
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
WARN snappy.LoadSnappy: Snappy native library not loaded
WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
INFO mapred.JobClient: Task Id : attempt_201312271733_0015_m_000000_0, Status : FAILED
java.lang.RuntimeException: java.lang.ClassNotFoundException: com.kai.hadoop.WordCount$TokenizerMapper
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:849)
	at org.apache.hadoop.mapreduce.JobContext.getMapperClass(JobContext.java:199)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:719)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
	at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.lang.ClassNotFoundException: com.kai.hadoop.WordCount$TokenizerMapper
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:270)
	at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:802)
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:847)
	... 8 more
This same program runs fine directly under Ubuntu. I searched online and found these suggested solutions:
(1) Most answers say to add job.setJarByClass(WordCount.class);. My program already calls this, so it was not the fix. (setJarByClass can only locate a jar that actually contains the given class; when Eclipse runs loose .class files from the build directory, there is no such jar, so no user code gets shipped to the cluster — which is exactly what the "No job jar file set" warning means.)
(2) Another suggestion blames the checkReturnValue method of the FileUtil class in org.apache.hadoop.fs: modify the source to comment that method out, then rebuild hadoop-core-1.1.2.jar. That did not help either.
(3) This post — http://blog.csdn.net/zklth/article/details/5816435 — gave me a hint: my Hadoop setup is also a pseudo-distributed cluster, and when I package the code into a jar and run it on the master node, it works.
I'm not sure my understanding is correct, but for now the only workflow seems to be: write the code on Windows, package it into a jar, and upload it to Linux to run. Do I really have to do it this way? That would be far too cumbersome. I'd appreciate a pointer from anyone who knows (hopefully the answer isn't "move your development to Linux").
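One workaround I have seen for this situation (a sketch, not verified against this exact setup) is to build the job jar programmatically at submit time, so the driver running in Eclipse ships its own compiled classes to the cluster instead of relying on a pre-built jar. The jar-building step needs only the JDK; the class name JobJarUtil and all paths below are illustrative:

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.jar.Attributes;
import java.util.jar.JarEntry;
import java.util.jar.JarOutputStream;
import java.util.jar.Manifest;
import java.util.stream.Stream;

public class JobJarUtil {
    /**
     * Packages every .class file under classesDir into a temporary jar,
     * preserving the package-relative paths the task-side classloader expects
     * (e.g. com/kai/hadoop/WordCount$TokenizerMapper.class).
     */
    public static Path makeJar(Path classesDir) throws IOException {
        Path jar = Files.createTempFile("job-", ".jar");
        Manifest manifest = new Manifest();
        manifest.getMainAttributes().put(Attributes.Name.MANIFEST_VERSION, "1.0");
        try (JarOutputStream out = new JarOutputStream(Files.newOutputStream(jar), manifest);
             Stream<Path> files = Files.walk(classesDir)) {
            files.filter(p -> p.toString().endsWith(".class"))
                 .forEach(p -> {
                     try {
                         // Jar entries always use '/' separators, even on Windows.
                         String entry = classesDir.relativize(p).toString().replace('\\', '/');
                         out.putNextEntry(new JarEntry(entry));
                         Files.copy(p, out);
                         out.closeEntry();
                     } catch (IOException e) {
                         throw new UncheckedIOException(e);
                     }
                 });
        }
        return jar;
    }
}
```

In the driver, an assumed usage would then be conf.set("mapred.jar", JobJarUtil.makeJar(Paths.get("bin")).toString()); before submitting the job — "bin" being Eclipse's default output folder, and mapred.jar being the property that JobConf#setJar writes in Hadoop 1.x.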