Problem with Base64 conversion of an MD5 hash string

Hi guys and gals!

First of all, I'd like to say that I'm pretty bad at C++ (I'm mostly a C# guy), and I apologize if this should have been posted in the Beginners forum (I was in doubt, since I think this question might not be that easy to answer).

Okay, so what I'm trying to do is make a C++ DLL (to be used in UnrealScript) that converts a string into an MD5 hash and then converts that hash into a Base64 string.

I've done this before in C#, using the System namespace for the Base64 conversion, and it's pretty simple (sorry if C# code is sacrilege here!):

using System.Security.Cryptography;
using System.Text;

private static string CreateMD5Hash(string input)
{
    // Hash the UTF-8 bytes of the input, then Base64-encode the raw 16-byte digest
    MD5 md5Hash = MD5.Create();
    byte[] data = md5Hash.ComputeHash(Encoding.UTF8.GetBytes(input));
    return System.Convert.ToBase64String(data);
}


The above code works fine in C#, but I've been less fortunate with my C++ DLL. The C++ code compiles fine, and I'm pretty sure the MD5 hash is computed correctly, so I think it's the Base64 conversion that's the problem.

Here's the code:

std::string wstrtostr(const std::wstring &wstr)
{
    // Convert a wide (UTF-16) string to a narrow string in the system codepage
    std::string strTo;
    char *szTo = new char[wstr.length() + 1];
    szTo[wstr.length()] = '\0';
    // Passing -1 tells the API the source is null-terminated, so the output
    // buffer needs room for the terminator as well: length() + 1 bytes
    WideCharToMultiByte(CP_ACP, 0, wstr.c_str(), -1, szTo, (int)wstr.length() + 1, NULL, NULL);
    strTo = szTo;
    delete[] szTo;
    return strTo;
}

__declspec(dllexport) void DLLBase64(wchar_t* inputStr, wchar_t* outputStr)
{
    std::wstring wstr(inputStr);
    std::string stdstr = wstrtostr(wstr);
    // md5() returns the digest as a 32-character hex string
    std::string md5str = md5(stdstr);
    // Marshal the hex string into a System::String and Base64-encode its UTF-8 bytes
    String^ sysstr = gcnew String(md5str.c_str());
    array<Byte>^ byarr = Encoding::UTF8->GetBytes(sysstr);
    String^ convertedstr = Convert::ToBase64String(byarr);
    std::string convertedstdstr = marshal_as<std::string>(convertedstr);
    // Copy the result into the caller-supplied wide buffer (which must be large enough)
    size_t origsize = convertedstdstr.length() + 1;
    size_t convertedChars = 0;
    mbstowcs_s(&convertedChars, outputStr, origsize, convertedstdstr.c_str(), _TRUNCATE);
}


The input I send to the DLL's inputStr is basically a JSON string combined with a private key.
The output I get from the C# code for a specific test JSON string (which is what I also expect/want to get from the C++ code) is:
MMbymhNEPJHFfIRFnpoB+g==

The output I actually get with the c++ DLL is:
MzBjNmYyOWExMzQ0M2M5MWM1N2M4NDQ1OWU5YTAxZmE=

I'm using the System namespace for the Base64 conversion in the C++ DLL (like I did in the C# code), so it's a mystery to me why the results are so different.

It took me some time to get here, and there are probably a lot of things I should have done differently, so I would appreciate any input helping me solve the problem :)

Thanks for any replies :)
/Simon
Use the CryptBinaryToString() API function from Crypt32.dll.
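
For what it's worth, your two outputs show exactly what's happening: MzBjNmYyOWExMzQ0M2M5MWM1N2M4NDQ1OWU5YTAxZmE= is the Base64 of the ASCII text "30c6f29a13443c91c57c84459e9a01fa". Your md5() returns the digest as a 32-character hex string, and you're Base64-encoding the UTF-8 bytes of that text, whereas the C# code Base64-encodes the raw 16-byte digest. Here's a minimal sketch of the CryptBinaryToString() route (the helper name Md5HexToBase64 is made up, and it assumes md5() returns the usual lowercase hex string, so it first converts the hex back to raw bytes):

#include <windows.h>
#include <wincrypt.h>   // CryptBinaryToStringA; link with crypt32.lib
#include <string>
#include <vector>
#include <cstdlib>
#include <cstring>

// Hypothetical helper: takes the hex string produced by md5() and returns
// the Base64 of the raw digest bytes
std::string Md5HexToBase64(const std::string &hex)
{
    // Turn "30c6f29a..." back into the 16 raw digest bytes
    std::vector<BYTE> raw;
    for (size_t i = 0; i + 1 < hex.length(); i += 2)
        raw.push_back((BYTE)std::strtoul(hex.substr(i, 2).c_str(), NULL, 16));

    // First call: ask how large the output buffer must be (count includes the null)
    // CRYPT_STRING_NOCRLF stops the API appending "\r\n" (Vista and later)
    DWORD len = 0;
    CryptBinaryToStringA(&raw[0], (DWORD)raw.size(),
                         CRYPT_STRING_BASE64 | CRYPT_STRING_NOCRLF, NULL, &len);

    // Second call: perform the actual conversion
    std::string base64(len, '\0');
    CryptBinaryToStringA(&raw[0], (DWORD)raw.size(),
                         CRYPT_STRING_BASE64 | CRYPT_STRING_NOCRLF, &base64[0], &len);
    base64.resize(std::strlen(base64.c_str()));   // trim the trailing null
    return base64;
}

Calling Md5HexToBase64(md5(stdstr)) in DLLBase64, instead of round-tripping through System::String, should give the same MMbymhNEPJHFfIRFnpoB+g== that the C# code produces. Alternatively, keep the managed code and just convert the hex string to an array<Byte>^ before calling Convert::ToBase64String.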