The C++ example in chapter 2 of the Definitive Guide fails with an authentication error on Hadoop 0.20.2+737. Is there a configuration parameter we can set to permit this example to run successfully on a Hadoop cluster?
hadoop pipes -D hadoop.pipes.java.recordreader=true -D hadoop.pipes.java.recordwriter=true -input sample.txt -output output -program bin/max_temperature
10/10/15 10:48:34 WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
10/10/15 10:48:34 INFO mapred.FileInputFormat: Total input paths to process : 1
10/10/15 10:48:35 INFO mapred.JobClient: Running job: job_201010121147_0019
10/10/15 10:48:36 INFO mapred.JobClient: map 0% reduce 0%
10/10/15 10:48:52 INFO mapred.JobClient: Task Id : attempt_201010121147_0019_m_000001_0, Status : FAILED
java.io.IOException
at org.apache.hadoop.mapred.pipes.OutputHandler.waitForAuthentication(OutputHandler.java:188)
at org.apache.hadoop.mapred.pipes.Application.waitForAuthentication(Application.java:198)
at org.apache.hadoop.mapred.pipes.Application.<init>(Application.java:149)
at org.apache.hadoop.mapred.pipes.PipesMapRunner.run(PipesMapRunner.java:68)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:383)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:317)
at org.apache.hadoop.mapred.Child$4.run(Child.java:217)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1063)
at org.apache.hadoop.mapred.Child.main(Child.java:211)
attempt_201010121147_0019_m_000001_0: Server failed to authenticate. Exiting
I tried the same example and unfortunately hit the same error... Did you ever find a solution to your problem? I have spent two days looking for a way to resolve it, without success. Any help would be welcome.
I am using Hadoop 0.20.203, on a single-node cluster on Fedora.
After a few days of research and experimentation, I have concluded that Fedora, 64-bit, and Hadoop's C++ code are not a good match out of the box. I tried to compile the Hadoop C++ wordcount example with ant, as explained in the wiki, but ant failed with errors relating to libssl and stdint.
First, on Fedora you have to add -lcrypto to the LIBS variable in the configure script. This is because the dependency on libcrypto must now be stated explicitly on that platform when linking against libssl (see the corresponding Fedora bug). Second issue: ant produces a lot of errors about the C++ files; to resolve those, you just have to add an include of stdint.h at the top of each affected file, as sketched below.
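For concreteness, here is a minimal sketch of what the top of one of the affected Pipes source files looks like after the fix. The file name is illustrative, and the exact set of Hadoop headers depends on the example being built; the point is only that stdint.h must appear before the Pipes headers, and that -lcrypto must accompany -lssl on the link line:

// Top of e.g. wordcount-simple.cc (file name illustrative).
// <stdint.h> supplies the fixed-width integer types (uint16_t,
// uint64_t, ...) that the Pipes headers use but, on newer Fedora
// toolchains, are no longer pulled in transitively.
#include <stdint.h>

#include "hadoop/Pipes.hh"
#include "hadoop/TemplateFactory.hh"
#include "hadoop/StringUtils.hh"

// Corresponding link flags (note -lcrypto alongside -lssl):
//   -lhadooppipes -lhadooputils -lpthread -lssl -lcrypto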
With those changes, the build succeeds. I then ran the freshly built wordcount example on my Hadoop cluster and it worked, while my original binary still did not. I suspected the issue came from the library I had just corrected, and I was right: when I ran the Hadoop example linked against the library from the Hadoop install directory, it failed with the same error.
That can be explained by the fact that ant recompiles the C++ Pipes library needed by Hadoop (with the corrections I made) and links against it, instead of the prebuilt library shipped in the Hadoop install directory. The practical workaround is therefore to point your own builds at the rebuilt library, as in the compile line below.
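A hedged sketch of such a compile line, assuming a 64-bit Linux layout; the build/ and c++/ paths and the platform string Linux-amd64-64 are illustrative and depend on your Hadoop version and machine:

g++ max_temperature.cpp -o max_temperature -I$HADOOP_HOME/c++/Linux-amd64-64/include -L$HADOOP_HOME/build/c++/Linux-amd64-64/lib -lhadooppipes -lhadooputils -lpthread -lssl -lcrypto

The -L flag here points at the library produced by the ant build rather than the lib directory under the install tree, which is what made the difference in my case.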