I don't understand how Java memory leaks happen

Minecraft is a common example. In older, buggier versions, memory use would slowly rise, and this memory would somehow not be eligible for collection by the garbage collector, until eventually the 'leak' caused the game to run out of memory.

But how does this even happen in a garbage-collected language? What do you have to do wrong to run into this?
Can't it be a bug in the JVM?
You would think you don't need to worry about memory leaks in Java, but that's not entirely true.

http://stackoverflow.com/questions/1281549/memory-leak-traps-in-the-java-standard-api
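One often-cited trap of the kind that link describes: in JVMs before Java 7u6, `String.substring` returned a string that shared the original's entire backing `char[]`, so a tiny substring could pin a huge string in memory. Here is a minimal sketch of that pattern using a hypothetical `SmallSlice` class (not a real JDK class), since modern `String` no longer behaves this way:

```java
import java.util.Arrays;

// Mimics pre-Java-7u6 String.substring: the small "slice" silently
// retains the whole backing array, so the GC cannot reclaim the large
// buffer as long as any slice of it is still alive.
class SmallSlice {
    private final byte[] backing; // the entire original buffer, kept alive
    private final int offset;
    private final int length;

    SmallSlice(byte[] backing, int offset, int length) {
        this.backing = backing;
        this.offset = offset;
        this.length = length;
    }

    // The fix (and what modern String.substring effectively does):
    // copy just the bytes you need, releasing the big buffer to the GC.
    byte[] detached() {
        return Arrays.copyOfRange(backing, offset, offset + length);
    }

    // How much memory this object actually forces the GC to keep.
    int retainedBytes() {
        return backing.length;
    }
}
```

A 10 MB buffer sliced down to 3 bytes still retains all 10 MB until the slice itself becomes unreachable; copying breaks the hidden link.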
I think it's more likely those memory leaks were bugs in the application itself, not the JVM. How can memory leaks happen in a garbage-collected environment? It's simple: the GC cannot release memory that the application still references, even if that memory is never actually used again.
Just keep adding objects to some collection and forget to remove them - the collection will grow and you'll have a leak.
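That pattern can be shown in a few lines. This is a minimal sketch, not code from any real application: a static, ever-growing cache. Everything added to it stays reachable through the static field, so the GC can never collect it, exactly the slow-rise-then-out-of-memory behavior described above.

```java
import java.util.ArrayList;
import java.util.List;

// A hypothetical "cache" that is only ever added to. Because CACHE is
// reachable from a static field for the lifetime of the class, every
// entry stays strongly referenced and the GC can reclaim none of it.
class LeakyCache {
    private static final List<byte[]> CACHE = new ArrayList<>();

    // Each call allocates 1 MB and "forgets" to ever remove it.
    static void handleRequest() {
        CACHE.add(new byte[1024 * 1024]);
    }

    static int entryCount() {
        return CACHE.size();
    }
}
```

Call `handleRequest()` in a loop and heap usage climbs by 1 MB per call until the JVM throws `OutOfMemoryError`, even though the GC is running fine: the memory is still referenced, so it isn't garbage.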
I thought a leak was when you no longer have any reference to the memory, so you can't deallocate it.
Could you free memory you allocate in a function with an outside reference to that pointer?
Lumpkin wrote:
Could you free memory you allocate in a function with an outside reference to that pointer?
Does not compute.
Never mind I answered it.