Huffman code question

I'm going through my notes from class but am confused about one part. Let's say I have the string ABRACADABRA.

Each letter gets 3 bits, e.g. A = 000, B = 001, ...
That means with 11 letters, the string uses 33 bits in total. Is there a reason for 3 bits and not any other amount?

The exact wording from class notes: Since we have 5 letters, we need 3 bits to represent each character.
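Here's a toy C++ version of what I think the notes mean (the code table is just something I made up myself):

#include <iostream>
#include <map>
#include <string>

int main()
{
    // My own made-up fixed 3-bit codes for the 5 distinct letters.
    std::map<char, std::string> code{
        {'A', "000"}, {'B', "001"}, {'C', "010"},
        {'D', "011"}, {'R', "100"}};

    std::string text = "ABRACADABRA", bits;
    for (char c : text) bits += code[c];

    std::cout << bits << '\n';               // the encoded bit stream
    std::cout << bits.size() << " bits\n";   // 11 letters * 3 = 33
}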
> Each letter gets 3 bits.
Taken on its own, that statement is misleading. Remember what a bit is: it stores one of exactly two values. Put N bits side by side and they can represent 2^N distinct patterns, i.e. the values 0 through 2^N - 1. With N = 3 that's 2^3 = 8 patterns, which is just enough to give each of the 5 distinct letters in ABRACADABRA its own code.

The full English alphabet, by contrast, has 26 letters (27 symbols if you also count a space), so 3 bits would not cover it; you would need 5, since 2^5 = 32.
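If it helps, here is a minimal C++ sketch of that rule (the function bitsNeeded is just a name I made up for illustration):

#include <iostream>

// Smallest N such that 2^N distinct bit patterns can cover
// `symbols` different characters.
int bitsNeeded(unsigned symbols)
{
    int bits = 0;
    while ((1u << bits) < symbols)   // 2^bits patterns so far
        ++bits;
    return bits;
}

int main()
{
    std::cout << bitsNeeded(5)  << '\n';   // 5 letters of ABRACADABRA -> 3
    std::cout << bitsNeeded(26) << '\n';   // full alphabet -> 5
}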

By the way, please edit the topic and move it to the Beginners section; it doesn't really seem to fit this forum.
https://en.wikipedia.org/wiki/Huffman_coding


@closed account 5a8Ym39o6

This part from the OP: "Since we have 5 letters, we need 3 bits to represent each character."

jjwooyoung wrote:
Is there a reason for 3 bits and not any other amount?


It's 3 for this example; there could be a larger set of codes for some other input. For example, look at the table on the rhs of the wiki page: it has 13 symbols, and codes there run up to 5 bits. (A fixed-length code for 13 symbols would only need 4 bits, since 2^4 = 16; the longer 5-bit codes appear because Huffman gives rarer symbols longer codes.) As closed account says, one needs enough bits to cover the number of symbols.

Hope this helps :+)
closed account (48T7M4Gy)
Thanks to the OP and the subsequent comments. Huffman coding gives characters varying bit lengths, unlike fixed-width ASCII for instance: frequent characters get short codes and rare ones get long codes, and that is why the 'normal' fixed-length rules don't apply.
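To make that concrete, here is a rough C++17 sketch (my own illustration, not anyone's reference implementation) that derives Huffman code lengths for ABRACADABRA by repeatedly merging the two rarest groups of symbols; every merge makes each symbol in the group one bit longer. It prints one bit-length per letter and a 23-bit total, versus 33 bits at a fixed 3 bits per letter.

#include <functional>
#include <iostream>
#include <map>
#include <queue>
#include <string>
#include <vector>

int main()
{
    std::string text = "ABRACADABRA";

    // Count how often each character occurs.
    std::map<char, int> freq;
    for (char c : text) ++freq[c];

    // Each entry: (combined frequency, symbols in that subtree).
    using Node = std::pair<int, std::string>;
    std::priority_queue<Node, std::vector<Node>, std::greater<Node>> pq;
    for (auto& [c, f] : freq) pq.push({f, std::string(1, c)});

    // Huffman step: merge the two rarest subtrees; every symbol
    // inside them gains one bit of code length.
    std::map<char, int> codeLen;
    while (pq.size() > 1)
    {
        Node a = pq.top(); pq.pop();
        Node b = pq.top(); pq.pop();
        for (char c : a.second + b.second) ++codeLen[c];
        pq.push({a.first + b.first, a.second + b.second});
    }

    int total = 0;
    for (auto& [c, len] : codeLen)
    {
        std::cout << c << ": " << len << " bits\n";
        total += len * freq[c];
    }
    std::cout << "total: " << total << " bits (vs 33 at 3 bits each)\n";
}

Note that A, the most frequent letter, ends up with a 1-bit code while C and D get 4 bits each; the 23-bit total is the same for any valid Huffman tree on these frequencies, even though tie-breaking can change the individual codes.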

There's a good demo on coding and decoding on YouTube.
https://www.youtube.com/watch?v=apcCVfXfcqE
Topic archived. No new replies allowed.