Minecraft is a common example. In older, buggier versions, memory use would slowly rise, and that memory would somehow never become eligible for garbage collection, until eventually the 'leak' caused the game to run out of memory.
But how does this even happen in a garbage-collected language? What do you have to do wrong to run into this?
I think it's more probable that those memory leaks were bugs in the application itself, not in the JVM. How can memory leaks happen in a garbage-collected environment? It's simple: the GC cannot release memory that the application still references, even if that memory will never actually be used again.
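A minimal sketch of how this happens in practice (the class and field names here are made up for illustration, not taken from Minecraft's actual code): a globally reachable collection is appended to but never cleaned up, so everything it holds stays strongly reachable and the GC can never reclaim it.

```java
import java.util.ArrayList;
import java.util.List;

public class LeakDemo {

    // Stands in for some per-chunk game data; each instance holds a large buffer.
    static class Chunk {
        final byte[] blockData = new byte[1024 * 1024]; // ~1 MB per chunk
    }

    // The leak: a static list that only ever grows. Because it is reachable
    // from a GC root (the class itself), so is every Chunk inside it.
    private static final List<Chunk> LOADED_CHUNKS = new ArrayList<>();

    static void loadChunk() {
        LOADED_CHUNKS.add(new Chunk());
        // Bug: nothing ever removes entries, so "unloaded" chunks remain
        // referenced and are never eligible for garbage collection.
    }

    public static void main(String[] args) {
        while (true) {
            loadChunk(); // heap usage climbs until OutOfMemoryError
        }
    }
}
```

Real-world leaks are usually subtler versions of the same mistake: event listeners that are registered but never unregistered, caches with no eviction policy, and so on. The fix is the same in every case: drop the reference (or use a weak one) once the object is no longer needed, so the GC is allowed to do its job.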