How can I read a webpage's source?

I am trying to view a webpage's source with C++ so I can get the response
for a whitelist. The code below doesn't read the specified page's source; instead it reads the contents of the parent directory. How can I have it read the XeonWhitelist page's source?

#include <urlmon.h>
#include <fstream>
#include <iostream>
#include <string>
#pragma comment(lib, "urlmon.lib") // must link against urlmon

using namespace std;

int main() {
    const char* website = "http://www.wearedevs.net/Assets/Modules/SoftwareCom/XeonWhitelist.php";
    const char* page = "XeonWhitelist.php";
    string source;

    // URLDownloadToFileA saves the URL's response body to a local file.
    HRESULT hr = URLDownloadToFileA(NULL, website, page, 0, NULL);
    if (hr == S_OK) {
        ifstream fin(page);
        char szBuff[2048];
        while (fin.getline(szBuff, 2048)) {
            source += szBuff;
            cout << szBuff << endl;
        }
        //cout << "Source: " << source << endl;
    }
    else {
        cout << "Could not get the webpage..!";
    }
    return 0;
}
Your code is fine. The problem is that the server returns an empty response body, and for some reason URLDownloadToFile refuses to create an empty file.
It works when you use a different URL, such as http://www.wearedevs.net
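If the empty-file behavior is what's tripping you up, one workaround is to skip the temporary file entirely and read the response into memory with URLOpenBlockingStream, which also lives in urlmon.h. Below is a minimal sketch of that approach against the same URL from the question; it hasn't been tested against this particular server, but an empty response should simply produce an empty string rather than a failed download.

#include <urlmon.h>
#include <iostream>
#include <string>
#pragma comment(lib, "urlmon.lib")

int main() {
    const char* website = "http://www.wearedevs.net/Assets/Modules/SoftwareCom/XeonWhitelist.php";

    IStream* stream = NULL;
    // Fetch the URL straight into a COM stream; no file is written to disk.
    HRESULT hr = URLOpenBlockingStreamA(NULL, website, &stream, 0, NULL);
    if (FAILED(hr)) {
        std::cout << "Could not get the webpage..!";
        return 1;
    }

    std::string source;
    char buffer[2048];
    ULONG bytesRead = 0;
    // Read until the stream is exhausted; an empty body just leaves
    // source empty instead of failing the whole download.
    while (SUCCEEDED(stream->Read(buffer, sizeof(buffer), &bytesRead)) && bytesRead > 0)
        source.append(buffer, bytesRead);
    stream->Release();

    std::cout << "Source: " << source << std::endl;
    return 0;
}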