Strange bug converting little endian to host endian

Jan 9, 2013 at 11:23am
I am having a problem with a function I wrote to convert little-endian byte order integers to host byte order integers. I am using it to read little-endian integers from a binary file. I am on Windows, so the Linux endian.h header is not an option. Here is the function code:

uint32_t letoh_32(const uint32_t& num)
{
    char* buffer = (char*)&num;
    uint32_t temp = (buffer[0]) | (buffer[1]<<8) | (buffer[2]<<16) | (buffer[3]<<24);
    std::cout << "Little endian: " << num << ". Host endian: " << temp << std::endl;
    return num;
}
// Code from http://stackoverflow.com/questions/13001183/how-to-read-little-endian-integers-from-file-in-c 


And this is the output:

Little endian: 28. Host endian: 28
Little endian: 3683. Host endian: 3683
Little endian: 7337. Host endian: 4294967209
Little endian: 10991. Host endian: 4294967279
Little endian: 14645. Host endian: 14645
Little endian: 18299. Host endian: 18299


Each little-endian value should be the same as the host-endian value, because I'm on Windows (a little-endian platform). I know I could rewrite this to byte-swap only on big-endian platforms, detected with bool big_endian = ((*(char*)&i) == 0), and I have written a htole_32 function that way which works. I'm just curious why this code does not work.

Thanks
Jan 9, 2013 at 11:39am
Try unsigned char instead of char (which appears to be signed in your case). Bit shifts don't operate on single bytes; the operands are promoted to int first, so negative char values sign-extend and set the high bits of the result.
Last edited on Jan 9, 2013 at 11:40am
Jan 9, 2013 at 11:43am
Thanks heaps, I totally skimmed over that.