Type cast from char array to int in HEX to DEC function

#include <iostream>
#include <string.h>
#include <math.h>
using namespace std;
int hexdec(char* hex) {
    int k = strlen(hex);
    int i;
    int res = 0;
    for (i = 2; i < k; i++) {
        if (hex[i] == 'F') res += 15 * pow(16, k - i - 1);
        else if (hex[i] == 'E') res += 14 * pow(16, k - i - 1);
        else if (hex[i] == 'D') res += 13 * pow(16, k - i - 1);
        else if (hex[i] == 'C') res += 12 * pow(16, k - i - 1);
        else if (hex[i] == 'B') res += 11 * pow(16, k - i - 1);
        else if (hex[i] == 'A') res += 10 * pow(16, k - i - 1);
        else if (hex[i] >= 0 || hex[i] <= 9) res += hex[i] * pow(16, k - i - 1); // this line doesn't work; how do I type-cast here?
    }
    return res;
}

int main()
{
cout << hexdec("0x1F") << endl;
return 0;
}
Note that the digit characters are actually '0'..'9', and not 0..9 as you have in your code; the latter are the ASCII control characters with codes 0 to 9.

As for your actual question, you could make a function that converts those characters for you, or use the ASCII code math trick and subtract the code for '0'.
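
For instance, here is a minimal corrected sketch of the function. Like the original it assumes the input always starts with a "0x" prefix and uses uppercase hex letters; it also accumulates the result digit by digit instead of calling pow(), so everything stays in integer arithmetic:

#include <iostream>
#include <cstring>
using namespace std;

int hexdec(const char* hex) {
    int k = strlen(hex);
    int res = 0;
    for (int i = 2; i < k; i++) {      // start at 2 to skip the "0x" prefix
        res *= 16;                     // shift previous digits one hex place left
        if (hex[i] >= 'A' && hex[i] <= 'F')
            res += hex[i] - 'A' + 10;  // 'A'..'F' map to 10..15
        else if (hex[i] >= '0' && hex[i] <= '9')
            res += hex[i] - '0';       // subtracting '0' yields the digit's value
    }
    return res;
}

int main() {
    cout << hexdec("0x1F") << endl;    // prints 31
    return 0;
}

Note that the standard library can also do this for you: strtol("0x1F", nullptr, 16) from <cstdlib> handles the prefix and both letter cases.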
Why doesn't res = (int)hex[i] work, or is there another type-cast I should use?
Or simply an implicit typecast. How would you resolve this?
It works fine with numbers such as 0xFFFF, so the issue is just converting the decimal digit characters themselves?
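
Yes: a cast like (int)hex[i] only reinterprets the character's code, so for '5' you get 53 (its ASCII code) rather than 5. Because the digit characters are contiguous in ASCII, subtracting '0' recovers the numeric value. A quick illustration:

#include <iostream>
using namespace std;

int main() {
    char c = '5';
    cout << (int)c << endl;   // prints 53: the ASCII code of '5', not the digit
    cout << c - '0' << endl;  // prints 5: 53 - 48, the actual digit value
    return 0;
}

The 0xFFFF case works because the letter branches already map each character to its value by hand (res += 15, and so on); only the digit branch was adding raw character codes.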