> What does actually happen here?
You're learning how numbers are stored in computer memory.
On my machine, "int" happens to be 4 bytes (it doesn't have to be: it was 2 on many old 16-bit systems, and it's 8 on a few).
My machine (IBM Power7) stores the four bytes in memory in "big-endian" order, the same way we write on paper: 0x12345678 is stored as 0x12, then 0x34, then 0x56, then 0x78. The number 512 (0x200 in hex) is stored as 0x00, then 0x00, then 0x02, then 0x00.
On consumer PCs, numbers are stored in "little-endian" order, backwards: 0x12345678 is stored as 0x78, then 0x56, then 0x34, then 0x12. The number 512 is stored as 0x00, then 0x02, then 0x00, then 0x00.
The assignment c = 1; above modifies the first (lowest-addressed) byte of i.
On my computer, the bytes of i become 0x01, then 0x00, then 0x02, then 0x00. That's 0x01000200 in hex, or 16777728 in decimal.
On yours, they become 0x01, then 0x02, then 0x00, then 0x00. That's 0x00000201 in hex (remember, backwards), which is 513 in decimal.