Wednesday, September 25, 2013

Out of Memory Error in Hadoop on Ubuntu - 128m limit - really? - thanks to Stack Overflow

out of Memory Error in Hadoop

Q:
I tried installing Hadoop by following this document: http://hadoop.apache.org/common/docs/stable/single_node_setup.html. When I tried executing this
bin/hadoop jar hadoop-examples-*.jar grep input output 'dfs[a-z.]+' 
I got the following exception:
java.lang.OutOfMemoryError: Java heap space
Please suggest a solution so that I can try out the example. The entire exception is listed below. I am new to Hadoop and might have done something dumb. Any suggestion will be highly appreciated.
....
A:
For anyone using the RPM or DEB packages, the documentation and common advice are misleading. These packages install the Hadoop configuration files into /etc/hadoop, and those files take priority over other settings.
/etc/hadoop/hadoop-env.sh sets the maximum Java heap memory for Hadoop; by default it is:
export HADOOP_CLIENT_OPTS="-Xmx128m $HADOOP_CLIENT_OPTS"
This Xmx setting is too low. Simply change it to the following and rerun:
export HADOOP_CLIENT_OPTS="-Xmx2048m $HADOOP_CLIENT_OPTS"
that fixes the issue... – polerto Apr 7 at 22:36
I just had the exact same problem as the OP, and I was using the RPM package. This fixed the problem. Upvoted. – Aaron Aug 12 at 20:13
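[ed: a per-run alternative, if you'd rather not edit the file at all: the packaged hadoop-env.sh line above puts -Xmx128m in front of whatever is already in HADOOP_CLIENT_OPTS, and the JVM honors the last -Xmx it sees, so setting the variable for a single run should win. Just a sketch, assuming you invoke hadoop the same way the question does:

HADOOP_CLIENT_OPTS="-Xmx2048m" bin/hadoop jar hadoop-examples-*.jar grep input output 'dfs[a-z.]+'
# the exported -Xmx2048m lands after the default -Xmx128m, and the last -Xmx wins ]
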
[ed: so on Connie's Ubuntu install, the following worked (don't try this on a real network with automounts; it's for funny little standalone Ubuntu systems where you don't know where hadoop got installed):

find / -name hadoop-env.sh -print 2>/dev/null | xargs grep -l Xmx128m | sudo xargs sed -i 's/Xmx128m/Xmx2048m/'

sudo `which hadoop` jar ../*examp* grep input output 'dfs[a-z.]+'
# That was the example given in the docs, which now runs for me without error...

Why modify files by hand when you can use find and sed -i...   :-) ]
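[ed: and a quick sanity check (a sketch, same find as above, just grepping instead of editing) that the substitution actually took in every copy it found:

find / -name hadoop-env.sh -print 2>/dev/null | xargs grep -Hn Xmx
# each match prints file:line:content, so you can confirm Xmx2048m is now in place ]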

java - out of Memory Error in Hadoop - Stack Overflow

