Help using 0xDEADBEEF in a memory pool problem

Hello,

I am working on a problem in which an integer pointer is initialized to 0xDEADBEEF. The pointer is allocated memory from my memory pool and then deallocated later.
I have read the docs on this magic number, but I am not sure how to use it.
Please help!

Thanks,
M
Have you tried grilling it? That's my favorite way!
What OS, compiler and IDE are you using?

And how are you seeing the deadbeef?

And what docs?

Andy
@mukulabdagiri

I don't know if it helps, but the hex value 0xDEADBEEF in decimal is 3,735,928,559.
The point of initialising memory to 0xDEADBEEF is that it's a pattern that stands out immediately when you look at a hex dump of memory. If you're inspecting memory in a debugger, the letters "DEADBEEF" are easy to spot, so you know straight away that you're looking at memory that was never initialised (or has already been freed).
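For example, here's a rough sketch of the idea (not your actual pool; DebugPool and its functions are names I made up just for illustration): the pool stamps freed blocks with 0xDEADBEEF so a stale pointer's contents jump out in the debugger's memory window.

// Rough sketch, assuming nothing about your real pool: freed blocks are
// stamped with 0xDEADBEEF so use of a stale pointer is obvious.
#include <cstddef>
#include <cstdint>
#include <cstdlib>
#include <cstring>
#include <iostream>

struct DebugPool {                        // hypothetical name, illustration only
    void* allocate(std::size_t bytes) {
        return std::malloc(bytes);        // a real pool would carve from its own block
    }
    void deallocate(void* block, std::size_t bytes) {
        // Stamp the whole region with the marker before recycling it.
        const std::uint32_t marker = 0xDEADBEEF;
        unsigned char* bytePtr = static_cast<unsigned char*>(block);
        for (std::size_t i = 0; i + sizeof(marker) <= bytes; i += sizeof(marker))
            std::memcpy(bytePtr + i, &marker, sizeof(marker));
        // A real pool would now push the block onto its free list; this
        // sketch deliberately keeps the storage alive so main() can show
        // what the stamped memory looks like.
    }
};

int main() {
    DebugPool pool;
    int* p = static_cast<int*>(pool.allocate(sizeof(int)));
    *p = 42;
    pool.deallocate(p, sizeof(int));
    // In a real program this read would be a use-after-free bug; the fill
    // pattern is exactly what makes that bug easy to spot.
    std::cout << std::hex << static_cast<std::uint32_t>(*p) << '\n';   // prints deadbeef
}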
Do not rely on the value being there. It's a debug-only aid: in the release version that memory is simply uninitialized.
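If your pool does the stamping itself, one way to keep it debug-only (just a sketch, stamp_word is a made-up helper name) is to guard the write on NDEBUG:

#include <cstdint>
#include <cstring>

// Hypothetical helper: write the marker only when NDEBUG is not defined,
// so release builds leave the memory untouched (i.e. indeterminate) and
// nothing can come to depend on the pattern being there.
inline void stamp_word(void* where) {
#ifndef NDEBUG
    const std::uint32_t marker = 0xDEADBEEF;
    std::memcpy(where, &marker, sizeof(marker));
#else
    (void)where;   // release build: deliberately do nothing
#endif
}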