strcasestr returns wrong pointer

Hey all,
I've got a problem with the strcasestr function from string.h. Here is how I use it:
char *tmpcp;
tmpcp = strcasestr(buffer, "Content-Length:");
printf("%p \n%p\n",tmpcp,buffer);

It should look for the string "Content-Length:" in buffer and return a pointer to the beginning of that substring. But printf tells me:
0xffffffffec094b64 
0x7fffec094a00

or something similar. So if I try to use the pointer returned by the function, I naturally get a segmentation fault. The low bytes of the returned value look correct, but the high ones shouldn't differ like that.
I really don't know why it doesn't work correctly, so I would be glad if you could help me.
By the way, I use glibc-2.16.0-3 and gcc 4.7.1. Do you think it could be a bug in one of them?

Thank you very much for helping me (if you do),
ritka
Is the buffer null terminated?
%p? Why not %s?
I was able to reproduce similar output:
0xffffffffeee0ff64 
0x7fffeee0ff50

by following the manpage exactly and ignoring the compiler diagnostics:
test.c:9: warning: implicit declaration of function ‘strcasestr’
test.c:9: warning: assignment makes pointer from integer without a cast


If this is what happened to you, make sure to either #define _GNU_SOURCE on the first line of your file, or compile with -D_GNU_SOURCE

Note that #define'ing _GNU_SOURCE just before #include <string.h>, as the manpage suggests, doesn't work if you have already included something like stdio.h, which on Linux pulls in string.h internally: see http://ideone.com/2wfNi vs. http://ideone.com/x4inR
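
Here's a rough sketch of how the fixed file might look (the buffer contents are just made up for illustration):

#define _GNU_SOURCE   /* must come before the very first #include */
#include <stdio.h>
#include <string.h>

int main(void)
{
    /* made-up HTTP-style header data, just to have something to search */
    char buffer[] = "HTTP/1.1 200 OK\r\ncontent-length: 42\r\n\r\n";

    char *tmpcp = strcasestr(buffer, "Content-Length:");  /* case-insensitive search */
    if (tmpcp != NULL)
        printf("found at offset %ld: %s\n", (long)(tmpcp - buffer), tmpcp);
    else
        printf("not found\n");
    return 0;
}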
Thank you! This seems to work fine. But I don't understand why it is necessary to define _GNU_SOURCE. What does it do, and why does it not work without it?

@kbw: I wanted to show the pointers, not the content they point to.
_GNU_SOURCE makes string.h expose GNU's non-standard function prototypes, including the prototype for strcasestr.

Without it, since you ignored the compiler diagnostic, the compiler assumed that this unknown function returns int. On your system, int is 32-bit and char* is 64-bit. So the function actually returned a 64-bit value (0x7fffec094b64), but your code only took the lower 32 bits (0xec094b64). You then stored that 32-bit signed value in a char*, which made the compiler widen it to 64 bits, and since the high bit was 1, it filled the upper 32 bits with 1's as well, giving you 0xffffffffec094b64.
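
If you want to see that truncation and sign extension in isolation, here's a small sketch that does the same conversions explicitly (the address value is made up, and the out-of-range conversion to int is implementation-defined, but this is what a typical 64-bit Linux system does):

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint64_t real = 0x7fffec094b64ULL;       /* pretend this is what strcasestr returned */

    int32_t as_int = (int32_t)real;          /* only the low 32 bits survive: 0xec094b64, a negative int */
    uint64_t widened = (uint64_t)(int64_t)as_int;  /* sign extension fills the high 32 bits with 1's */

    printf("real    = 0x%llx\n", (unsigned long long)real);
    printf("as int  = 0x%x\n", (uint32_t)as_int);
    printf("widened = 0x%llx\n", (unsigned long long)widened);
    return 0;
}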

In short, always enable warnings and don't ignore them.
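
For example, compiling with something like

gcc -Wall -Wextra -Werror -D_GNU_SOURCE test.c

turns the implicit-declaration warning into a hard error, so this kind of mistake can't slip through.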
I understand, thanks! Problem solved :)