Common problem while copying data from source to HDFS using Flume

Flume java.lang.ClassNotFoundException: org.apache.hadoop.io.SequenceFile$CompressionType

Scenario: I wanted to copy logs from the source to HDFS. The HDFS daemons are up and running on the cluster. I’ve pointed the sink to HDFS, but when I try to start the agent it fails to start. On checking the log files I see a stack trace like this:
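For context, a typical HDFS sink configuration looks something like the fragment below (the agent, sink, and channel names here are hypothetical, and the namenode host/port will differ on your cluster):

```
# Hypothetical names for illustration; adjust to your own agent definition
agent.sinks.hdfsSink.type = hdfs
agent.sinks.hdfsSink.hdfs.path = hdfs://namenode:8020/flume/logs
agent.sinks.hdfsSink.channel = memChannel
```

With this sink type configured, the Flume agent needs the Hadoop classes on its classpath at startup, which is where the error below comes from.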

Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.io.SequenceFile$CompressionType
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)

It is clear that Flume is not able to find the expected class on its classpath, hence the solution:
Copy your hadoop-core-xyz.jar to the $FLUME_HOME/lib directory.
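A minimal sketch of that copy step, assuming HADOOP_HOME and FLUME_HOME are set in your environment (the exact jar name depends on your Hadoop version, so a glob is used here):

```shell
# Copy the Hadoop core jar into Flume's lib directory so the agent
# can load org.apache.hadoop.io.SequenceFile and related classes.
# Assumes $HADOOP_HOME and $FLUME_HOME point at your installations.
cp "$HADOOP_HOME"/hadoop-core-*.jar "$FLUME_HOME/lib/"
```

Restart the agent after copying; the jar is only picked up on startup.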

Note: If you are running your Hadoop cluster on a 0.20 version, copying this file will solve the ClassNotFoundException, but you will end up getting authentication errors. Try using a stable 1.0.x version instead.
