hex2bin issue

Hello, I have a faulty hex2bin function here, and I'm wondering if anyone has suggestions to fix it.

#include <cstdlib>   // strtol
#include <sstream>   // std::ostringstream
#include <string>

std::string hex2Bin(const std::string &input) {
	std::ostringstream output;
	int len = input.length();
	char tmp[5], *ptr;
	tmp[0] = '0';    // "0x" prefix so strtol parses base 16
	tmp[1] = 'x';
	tmp[4] = '\0';
	for (int i = 0; i < len; i += 2) {
		if (i + 1 >= len)    // odd trailing digit: stop early
			return output.str();
		tmp[2] = input[i];
		tmp[3] = input[i+1];
		output << (char)strtol(tmp, &ptr, 16);
	}
	return output.str();
}


Here are two sample outputs generated from this hex string:

Hex: 96d013fa9cfe343952a952b671ada6867116948912726c890a732864198a1c3a861e994854796dce89da9a6d6e7f4b6220a8491d37ff113616c9a7812b0deb2d

The PHP script I have returns the correct value:
PHP: Ç_Èo<ë…9§q?bì=O—"d�ð%sM0²ÑøäÈÁ"ü»¸½¹&èÏø÷_œÌ2P¨¹PM‹(¯¡¢Ç[ToÊiÊ]æÍ\è!
Ú<V‚ø+?¬ñu!ܨÔX'Ø*‡ÒUE¨^&´àgùmOŸë[¯ôÉ¢ô 󃩼ÏQ

C++: Ç_Èo<ë…9§q?bì=O—"d ð%sM0²ÑøäÈÁ"ü»¸½¹&èÏø÷_œÌ2P¨¹PM‹(¯¡¢Ç[ToÊiÊ]æÍ\è!
Ú<V‚ø+?¬ñu!ܨÔX'Ø*‡ÒUE¨^&´àgùmOŸë[¯ôÉ¢ô
󃩼ÏQ


Any help would be great, Thanks.
What character set is that supposed to be? The output looks like garbage to me.
Not sure how much PHP you know, but I'm using the hex2bin function for OpenSSL's RSA functionality, more specifically RSA_verify, which requires the binary form of the signature.

This is the PHP version of the hex2bin function that I have, and it is what generates the correct output:

function hex2bin($data) {
    $newdata = "";   // initialize to avoid an undefined-variable notice
    $len = strlen($data);
    for ($i = 0; $i < $len; $i += 2) {
        $newdata .= pack("C", hexdec(substr($data, $i, 2)));
    }
    return $newdata;
}
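The PHP version's approach, parsing each two-character slice as one byte and appending it, translates directly to C++. A sketch of that port (the function name `hex2bin_cpp` is mine; std::string, unlike a C string, can hold embedded 0x00 bytes):

```cpp
#include <cstdlib>  // strtoul
#include <string>

// C++ port of the PHP hex2bin above: each two-character slice is
// parsed as a base-16 byte and appended to a std::string.
std::string hex2bin_cpp(const std::string &data) {
    std::string out;
    out.reserve(data.size() / 2);
    for (std::size_t i = 0; i + 1 < data.size(); i += 2) {
        char pair[3] = { data[i], data[i + 1], '\0' };
        out.push_back(static_cast<char>(std::strtoul(pair, nullptr, 16)));
    }
    return out;
}
```

For example, `hex2bin_cpp("48656c6c6f")` yields `"Hello"`, and `hex2bin_cpp("0041")` yields a two-byte string whose first byte is 0x00.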
Ok, I got a better description from another website. I'm dealing with X.509 certificates: during verification of a certificate's signature with OpenSSL, the hexadecimal signature needs to be converted to a byte array, and that's what the hex2bin function does.
I have another update; maybe it is solvable this time. The issue appears to lie in how the string is stored when it hits a 0x00 byte: the 0x00 is apparently being treated as whitespace (0x20) somewhere along the way, when I need the raw 0x00 byte to come through intact, and the substitution is throwing off the binary signature. I want to avoid doing a blanket replace of 0x20 with 0x00, because if a legitimate 0x20 byte does occur in the signature, the replacement would corrupt it.

So my question has now changed: how can I keep 0x00 in my string using the above code? Or would it be better to use a "table" swap method, i.e.:

const char * table[256] = {"0x00", "0x01", .... "0xff"};


If anyone knows, that would be great. I need to keep the data as std::string, const char *, or unsigned char *, or be able to convert between those formats while keeping the 0x00 bytes intact.