<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <title>GC Overhead limit exceeded error in Hadoop</title>
  <link rel="alternate" href="https://tis.xsede.org/c/message_boards/find_thread?p_l_id=&amp;threadId=2523439" />
  <subtitle>GC Overhead limit exceeded error in Hadoop</subtitle>
  <entry>
    <title>GC Overhead limit exceeded error in Hadoop</title>
    <link rel="alternate" href="https://tis.xsede.org/c/message_boards/find_message?p_l_id=&amp;messageId=2523438" />
    <author>
      <name>Enamul Karim</name>
    </author>
    <id>https://tis.xsede.org/c/message_boards/find_message?p_l_id=&amp;messageId=2523438</id>
    <updated>2020-07-30T09:13:38Z</updated>
    <published>2020-07-30T09:05:57Z</published>
    <summary type="html">I am working on a Hadoop Map/Reduce application. In one of the experiments with a large data set, I got an error saying that the GC overhead limit was exceeded. Can anyone tell me what the reason could be and how I can solve it in Hadoop?</summary>
    <dc:creator>Enamul Karim</dc:creator>
    <dc:date>2020-07-30T09:05:57Z</dc:date>
  </entry>
</feed>