out of Memory Error in Hadoop

Q: I tried installing Hadoop following this document: http://hadoop.apache.org/common/docs/stable/single_node_setup.html. When I tried executing the example, I got the following exception.
Please suggest a solution so that I can try out the example. The entire exception is listed below. I am new to Hadoop, so I may have done something dumb. Any suggestion will be highly appreciated. ....

A: For anyone using the RPM or DEB packages, the documentation and common advice are misleading. These packages install the Hadoop configuration files into /etc/hadoop, and those take priority over other settings. /etc/hadoop/hadoop-env.sh sets the maximum Java heap memory for the Hadoop client; by default it is:

export HADOOP_CLIENT_OPTS="-Xmx128m $HADOOP_CLIENT_OPTS"

This Xmx setting is too low. Simply change it to the following and rerun:

export HADOOP_CLIENT_OPTS="-Xmx2048m $HADOOP_CLIENT_OPTS"
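If you would rather not edit the packaged file, here is a minimal sketch of a per-session override. It assumes the hadoop launcher sources /etc/hadoop/hadoop-env.sh (which prepends its -Xmx128m to whatever HADOOP_CLIENT_OPTS already contains) and that the JVM honors the last -Xmx flag it sees, as HotSpot does:

# Session-level override: hadoop-env.sh expands this to
# "-Xmx128m -Xmx2048m", and the last -Xmx wins, so the client
# JVM gets the 2 GB heap without touching /etc/hadoop.
export HADOOP_CLIENT_OPTS="-Xmx2048m"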
# Find every hadoop-env.sh on the system that still sets Xmx128m
# and rewrite it in place:
find / -name hadoop-env.sh -print 2>/dev/null | xargs grep -l Xmx128m | sudo xargs sed -i 's/Xmx128m/Xmx2048m/'
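To confirm the substitution actually landed before rerunning anything (the path below assumes the RPM/DEB layout described above):

grep Xmx /etc/hadoop/hadoop-env.sh
# expected: export HADOOP_CLIENT_OPTS="-Xmx2048m $HADOOP_CLIENT_OPTS"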
sudo `which hadoop` jar ../*examp* grep input output 'dfs[a-z.]+'
# That was the example given in the docs, which now runs for me without error...
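Assuming the standalone setup from the single-node guide, where the grep example writes its results to a local output directory, the matches can be inspected directly:

# Show the matches the grep example produced
cat output/*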