When crawling a large number of links, the Webcrawler may exhaust a machine's memory, or slow down as its memory usage grows. A possible fix is to monitor memory usage on the machine and allow users to set a memory limit. When the limit is reached, the Webcrawler should write its buffered data to the database, free that memory, and resume crawling where it left off. Other prevention strategies may exist as well.
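The flush-on-limit idea above can be sketched roughly as follows. This is only an illustration, not the Webcrawler's actual implementation: the `Crawler` class, the `pages` table, and the use of a buffered-item count as a stand-in for real memory measurement are all assumptions made for the example.

```python
import sqlite3


class Crawler:
    """Minimal sketch: buffer crawl results in memory, flush to a
    database once a user-set limit is reached, then keep crawling."""

    def __init__(self, db_path=":memory:", limit=1000):
        # `limit` stands in for a real memory-usage threshold; a full
        # implementation would measure actual process memory instead.
        self.limit = limit
        self.buffer = {}  # in-memory results awaiting a flush
        self.conn = sqlite3.connect(db_path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS pages (url TEXT PRIMARY KEY, body TEXT)"
        )

    def record(self, url, body):
        # Store a crawled page in memory; flush when the limit is hit.
        self.buffer[url] = body
        if len(self.buffer) >= self.limit:
            self.flush()

    def flush(self):
        # Write buffered data to the database, then clear the memory
        # it occupied so crawling can continue from where it left off.
        self.conn.executemany(
            "INSERT OR REPLACE INTO pages VALUES (?, ?)",
            list(self.buffer.items()),
        )
        self.conn.commit()
        self.buffer.clear()
```

In this sketch the crawler never holds more than `limit` results in memory, and nothing is lost at the cutoff because each flush persists the buffer before clearing it.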