When converting decimal to binary, we use the largest power of 2 and subtraction to do it. But for decimal to hex, each hex digit corresponds to 4 binary digits. Why can't converting decimal to binary use "each decimal digit corresponds to 4 binary digits", like hex does?

=======

p.s. what does the prefix "hexa" in hexadecimal mean?

=======

p.s. what does the prefix "hexa" in hexadecimal mean?

You can convert from decimal to binary by repeated division by 2; the remainder each time gives the corresponding binary digit.

You could use a similar method for hexadecimal. Repeatedly divide by 16; the remainder each time is the corresponding hexadecimal digit.

It's just a convenience that groups of 4 binary digits can be represented as a hex digit. It makes conversion in one's head possible, rather than resorting to complex methods.

The prefix "hexadeci" means 16.

Sixteen, of course, is six + teen (6 + 10).

Similarly, hexadeci is hexa + deci (6 + 10).

why can't converting decimal to binary use "each decimal digit corresponds to 4 binary digits", like hex?

Because that would leave gaps.

We can represent 0 to 9 decimal as 0000 through 1001 binary.

But that would still leave 1010 through 1111, which don't match any decimal digit (naturally enough, because these correspond to the hex digits A to F).

why can't converting decimal to binary use "each decimal digit corresponds to 4 binary digits", like hex?

Well, the answer is it can. This encoding is called Binary Coded Decimal (BCD).

http://en.wikipedia.org/wiki/Binary-coded_decimal

It was apparently used by some early computers, but it wastes storage and goes against the way computers work, as Chervil has said. So all computers now use plain binary, conventionally written as hexadecimal (I guess there are exceptions, but not in any PC, Mac, Android device, iPhone, ...).

Though, as Chervil has also said, hexadecimal is just a convenience. Computers actually use binary, as they are, after all, just a huge load of on-off switches. Using hexadecimal allows humans to write down the values in a more readable form without making the computers do more work.

But BCD requires the computer to do more work.

So, it's not that it cannot be done; it's just less efficient, so people chose to do otherwise (i.e. stick to standard binary encoding).

Andy
