Strange bug converting little endian to host endian

I am having a problem with a function I wrote to convert little endian integers to host byte order integers. I am using it to read little endian integers from a binary file. I am on Windows, so the Linux <endian.h> header is not an option. Here is the function code:

uint32_t letoh_32(const uint32_t& num)
{
    char* buffer = (char*)&num;
    uint32_t temp = (buffer[0]) | (buffer[1] << 8) | (buffer[2] << 16) | (buffer[3] << 24);
    std::cout << "Little endian: " << num << ". Host endian: " << temp << std::endl;
    return num;
}
// Code from 

And this is the output:

Little endian: 28. Host endian: 28
Little endian: 3683. Host endian: 3683
Little endian: 7337. Host endian: 4294967209
Little endian: 10991. Host endian: 4294967279
Little endian: 14645. Host endian: 14645
Little endian: 18299. Host endian: 18299

Each little endian value should be the same as the host endian value, because I'm on Windows, which runs on little endian hardware. I know I could rewrite this to byte-swap only on big endian platforms, detecting the host order with something like bool big_endian = ((*(char*)&i) == 0), and I have already written a working htole_32 function that way. I'm just curious why this code does not work.

Try unsigned char instead of char (which appears to be signed on your implementation). Bit shifts don't operate on single bytes: the operands are promoted to int first, so a signed char holding a value above 127 gets sign-extended, and the high bits it sets survive the OR into your result.
Thanks heaps, I totally skimmed over that.