converting data from decimal to octal to binary

I hear you, but my need is not printing (printing is just for testing so I can lay eyes on what is happening). The point of this is to conveniently package bits for decoding, so that I can read the data payload from an encoded bit stream. Then, once the decoding is done, the bits have to go back into a single array for reading, because the payload is not in the form of 6-bit decimal numbers.
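If I follow, you want to split a byte stream into 6-bit groups for decoding, then merge the decoded bits back into a plain byte array. A minimal sketch of that (the function names `pack6`/`unpack6` are mine, not from your code) using a bit accumulator, MSB-first:

```cpp
#include <cstdint>
#include <vector>

// Pack 6-bit values into a contiguous byte buffer, MSB-first.
// The last byte is zero-padded on the right if the bit count
// is not a multiple of 8.
std::vector<std::uint8_t> pack6(const std::vector<std::uint8_t>& values)
{
    std::vector<std::uint8_t> out;
    std::uint32_t acc = 0;   // bit accumulator
    int nbits = 0;           // number of valid bits currently in acc
    for (std::uint8_t v : values)
    {
        acc = (acc << 6) | (v & 0x3F);   // append the low 6 bits of v
        nbits += 6;
        while (nbits >= 8)               // emit full bytes as they form
        {
            nbits -= 8;
            out.push_back(static_cast<std::uint8_t>((acc >> nbits) & 0xFF));
        }
    }
    if (nbits > 0)                       // flush leftover bits, padded with zeros
        out.push_back(static_cast<std::uint8_t>((acc << (8 - nbits)) & 0xFF));
    return out;
}

// Inverse operation: read `count` 6-bit values back out of the byte buffer.
std::vector<std::uint8_t> unpack6(const std::vector<std::uint8_t>& bytes,
                                  std::size_t count)
{
    std::vector<std::uint8_t> out;
    std::uint32_t acc = 0;
    int nbits = 0;
    for (std::uint8_t b : bytes)
    {
        acc = (acc << 8) | b;
        nbits += 8;
        while (nbits >= 6 && out.size() < count)
        {
            nbits -= 6;
            out.push_back(static_cast<std::uint8_t>((acc >> nbits) & 0x3F));
        }
    }
    return out;
}
```

For example, `pack6({42, 3, 63, 1})` is the 24-bit string `101010 000011 111111 000001`, which lands in three bytes `{0xA8, 0x3F, 0xC1}`, and `unpack6` on those bytes with `count = 4` gives the original four values back. No decimal or octal conversion happens anywhere; it is all shifts and masks on the integers themselves.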

I'm more than a little confused.

There shouldn't be any need to convert to binary here. In fact... converting a number to/from binary does not make any sense unless you are talking about how the number is displayed as text. There's no such thing as a "binary number" vs. a "decimal number". They're one and the same.
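To illustrate the point: the same `int` can be rendered as decimal, octal, hex, or binary text, and only the text differs. A quick sketch (helper names are mine, just for demonstration):

```cpp
#include <bitset>
#include <ios>
#include <sstream>
#include <string>

// Four textual renderings of the same stored value.
std::string as_dec(int n) { return std::to_string(n); }
std::string as_oct(int n) { std::ostringstream s; s << std::oct << n; return s.str(); }
std::string as_hex(int n) { std::ostringstream s; s << std::hex << n; return s.str(); }
std::string as_bin(int n) { return std::bitset<8>(static_cast<unsigned>(n)).to_string(); }
```

`as_dec(42)`, `as_oct(42)`, `as_hex(42)`, and `as_bin(42)` give `"42"`, `"52"`, `"2a"`, and `"00101010"` respectively, yet the variable holds one and the same value throughout. "Converting to binary" only ever happens at the formatting step.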

But maybe I'm just failing to understand what you're actually trying to do here. Other people seem to get it, so I'll just assume I'm misunderstanding and will step out and let them field it. =)
@Disch there was a previous thread which may shed a little light on the background to the question. I don't fully understand it either, but perhaps that doesn't matter.
Topic archived. No new replies allowed.