Converting Base to Base

For my CSC class I have to create a program that converts between all bases (i.e. bin to dec, bin to hex, dec to bin, dec to hex, hex to bin and hex to dec), a total of 6 conversions.

I missed one class and I seem to be completely lost. I also have to create a menu asking the user which base their input is in and which base they want for the output.

Granted, the second part is easy, but I can't seem to get the conversion to work at all.

Any help would be greatly appreciated; a tutorial of sorts would be ideal.
In general, number = sum over j of digit_j * base_j, where base_j = b^j.
For instance, 1011_2 = 1*2^0 + 1*2^1 + 0*2^2 + 1*2^3

Now do the conversion to decimal.
You could write
sum_j digit_j * b^j = digit_0 + b*(digit_1 + b*(digit_2 + b*(...)))   (Horner's method)

So number = digit_0 + b*something, which means number - b*something = digit_0.
Because 0 <= digit_0 < b by definition, that gives digit_0 = mod(number, b).

Now do the conversion from decimal.
This is often made much more confusing than it needs to be. Unless you are taking a math class where you must convert directly between radices, in a program you can usually think of it this way:

"base" or "radix" is something that humans need when they write a number.
A number itself is just a number (even inside the computer).

So the thing you must do is implement two functions: one to convert a human readable "number" (or string of digits) to a computer's number, and another to convert a computer number to a human readable "number" (a text string).

So, the process is like this:

    0A (hex)    -->    ten (in the computer)    -->    12 (oct)

How it is actually stored in the computer is unimportant, because the computer treats it as a number for you.

Now, the important operations for taking a number apart are integer division and remainder:

12%10 = 2
12/10 = 1

And the important operations for putting a number together are multiplication and addition:

1*10 + 2 = 12

In these examples, 10 is the radix (or "base").

The last thing to remember is that 1 != '1'. In C and C++, the value 1 is an unprintable control character, while the character '1' has a much larger value (49 in ASCII). Convert between a digit's value and its human-readable character with some addition:

'1' - '0' = 1
1 + '0' = '1'

Hope this helps.
I hope I'm not giving you the total solution (well, the containers aren't standard, at least), but I was working on a project with these functions last night. This is half of it; just do the reverse to get strings from unsigned int.

unsigned int ToNum(QChar in)
{
    int out = in.toLatin1();
    if (out >= 'A' && out <= 'F')
        out -= ('A' - 10);       // 'A'..'F' -> 10..15
    else if (out >= 'a' && out <= 'f')
        out -= ('a' - 10);       // 'a'..'f' -> 10..15
    else
        out -= '0';              // '0'..'9' -> 0..9
    return out;
}

unsigned int str2num(QString num, int base)
{
    unsigned int output = 0;
    for (int i = 0; i < num.length(); ++i)
    {
        output *= base;          // Horner step
        output += ToNum(num.at(i));
    }
    return output;
}