Common problem while copying data from source to HDFS using Flume


Scenario: I wanted to copy logs from the source to HDFS. The HDFS daemons are up and running on the cluster. I pointed the sink to HDFS, but when I try to start the agent it does not start. On checking the log files I see a stacktrace like this:

Caused by: java.lang.ClassNotFoundException:$CompressionType
at Method)
at java.lang.ClassLoader.loadClass(
at sun.misc.Launcher$AppClassLoader.loadClass(
at java.lang.ClassLoader.loadClass(

It is very clear that Flume is not able to find the expected class on the classpath, hence the solution:
Copy your hadoop-core-xyz.jar to the $FLUME_HOME/lib directory.

Note: If you are running your Hadoop cluster on a 0.20 version, copying this file will fix the ClassNotFoundException, but you will end up getting authentication errors. Try using a 1.0.x stable version instead.
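For reference, here is a minimal sketch of the kind of agent configuration the scenario describes, in the Flume NG properties-file format. The agent name, source command, channel sizing, and NameNode address are all assumptions; adjust them to your setup.

```properties
# flume.conf — hypothetical agent named agent1

agent1.sources = tailsrc
agent1.channels = memch
agent1.sinks = hdfssink

# Exec source tailing an assumed log file
agent1.sources.tailsrc.type = exec
agent1.sources.tailsrc.command = tail -F /var/log/app.log
agent1.sources.tailsrc.channels = memch

# In-memory channel
agent1.channels.memch.type = memory
agent1.channels.memch.capacity = 1000

# HDFS sink — replace namenode:9000 with your cluster's NameNode
agent1.sinks.hdfssink.type = hdfs
agent1.sinks.hdfssink.channel = memch
agent1.sinks.hdfssink.hdfs.path = hdfs://namenode:9000/flume/logs
```

It is the HDFS sink that pulls in Hadoop classes at startup, which is why the agent fails with a ClassNotFoundException when the Hadoop jar is missing from $FLUME_HOME/lib.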

