I'm compiling with g++ using the -g and -O0 flags, and while debugging I often get errors like: No symbol "XXX" in current context. It doesn't always happen; sometimes the symbols are found just fine. The usual advice for this error is to compile with -g and make sure optimization is off, but neither is the issue here. I'm running DDD, which I believe is a GUI front end for gdb.
There is no such thing as "optimization off" in g++ (or gcc, for that matter). What -O0 means is "minimum optimization", and omitting the -O switch altogether produces even less optimization (at least that was true the last time I sat down and enumerated the differences). You always get some kind of optimization, no matter what you ask for. That's just the way it is.
What you're seeing with those "No symbol XXX in current context" errors is data objects getting optimized out. It can be very frustrating, and to avoid those kinds of issues during debugging, I'd recommend not passing any optimization option at all when compiling. That's what works best for me, anyway.
ddd shouldn't be affecting behavior at all -- set a breakpoint at the same spot in plain gdb and try it; I'll bet you get exactly the same result.