I reduced the memory to 64 MB, but the job failed when I ran it; the error message is as follows:
Hadoop job information for Stage-1: number of mappers: 2; number of reducers: 2
2015-04-10 14:09:49,476 Stage-1 map = 0%, reduce = 0%
2015-04-10 14:10:25,976 Stage-1 map = 100%, reduce = 100%
Ended Job = job_1428645885603_0002 with errors
Error during job, obtaining debugging information...
Examining task ID: task_1428645885603_0002_m_000001 (and more) from job job_1428645885603_0002
Task with the most failures(4):
-----
Task ID:
task_1428645885603_0002_m_000000
URL:
http://master:8088/taskdetails.jsp?jobid=job_1428645885603_0002&tipid=task_1428645885603_0002_m_000000
-----
Diagnostic Messages for this Task:
Error: Java heap space
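If the task URL above does not show enough detail, the full container logs can usually be pulled with the YARN CLI after the job finishes; the application ID below is assumed to be the failed job ID with its prefix changed:

yarn logs -applicationId application_1428645885603_0002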
I then changed the memory settings again (config file: hdfs-site.xml):
<property>
  <name>mapreduce.map.memory.mb</name>
  <value>128</value>
</property>
<property>
  <name>mapreduce.map.java.opts</name>
  <value>-Xmx256M</value>
</property>
<property>
  <name>mapreduce.reduce.memory.mb</name>
  <value>128</value>
</property>
<property>
  <name>mapreduce.reduce.java.opts</name>
  <value>-Xmx128M</value>
</property>
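Two things are worth double-checking in the block above: these mapreduce.* properties are normally read from mapred-site.xml rather than hdfs-site.xml, and the -Xmx heap in *.java.opts should fit inside the container size from *.memory.mb, otherwise YARN may kill the container (above, the map JVM is given a 256 MB heap inside a 128 MB container). A sketch of one consistent pairing, with the values chosen only for illustration (heap roughly 80% of the container size):

<property>
  <name>mapreduce.map.memory.mb</name>
  <value>256</value>
</property>
<property>
  <name>mapreduce.map.java.opts</name>
  <value>-Xmx205m</value>
</property>
<property>
  <name>mapreduce.reduce.memory.mb</name>
  <value>256</value>
</property>
<property>
  <name>mapreduce.reduce.java.opts</name>
  <value>-Xmx205m</value>
</property>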
With these settings the job completes and produces results, but it still runs on only one machine.
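One quick check for why everything lands on a single node is whether more than one NodeManager is actually registered with the ResourceManager; the active nodes can be listed with the YARN CLI, and the same information is usually visible on the ResourceManager web UI at http://master:8088/cluster/nodes:

yarn node -list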