C++ with internet input/output

I'm trying to write a program that will read text from a server page*, then use some of the terms from that text as search terms on a different webpage. I'll then copy the results of each search. Is this something that is possible to do in C++?

I've only ever programmed on Windows before, but I'm using a Linux machine for this task. Are there any major differences I should be aware of?

Using C++ isn't strictly necessary (I'm allowed to use whichever language I want), but it's the only language I'm familiar with, so using it is preferable to learning a new one.

Also, in my research into this, I heard a lot of references to "sockets," but I wasn't able to determine how that might apply to this specific situation.

*my knowledge of how the Internet works is woefully limited, but I'm fairly sure that the page I need to read from is not on the public Internet but rather on a server connected to this computer. There is no "www" in the URL, if that matters.
You could use TCP streams from Boost.Asio, as in this web scraping example: http://rosettacode.org/wiki/Web_scraping#C.2B.2B , but Boost.Asio is still quite low-level for convenient work with web pages from C++ (and raw sockets are even lower-level!). I would take a look at the POCO library: http://pocoproject.org/ or some other web libraries. Or consider a language that has these features built in.
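
To give a rough idea, here is a minimal sketch (untested against your server) that fetches a page over plain HTTP with Boost.Asio's tcp::iostream, in the spirit of the Rosetta Code example. The host name and path are placeholders you would replace with your own:

// Minimal HTTP GET sketch using boost::asio::ip::tcp::iostream.
// "example.com" and "/" are placeholders for the real host and path.
#include <boost/asio.hpp>
#include <iostream>
#include <string>

int main()
{
    boost::asio::ip::tcp::iostream stream("example.com", "http");
    if (!stream)
    {
        std::cerr << "Could not connect\n";
        return 1;
    }

    // HTTP/1.0 with "Connection: close" makes the server close the
    // connection when it is finished, so we can read until EOF.
    stream << "GET / HTTP/1.0\r\n"
           << "Host: example.com\r\n"
           << "Connection: close\r\n\r\n"
           << std::flush;

    // Dump the response: headers first, then the page body.
    std::string line;
    while (std::getline(stream, line))
        std::cout << line << '\n';
}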
I sometimes use cURL:
http://cplusplus.com/forum/unices/45878/#msg249287
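
Roughly, a libcurl fetch looks like the sketch below (link with -lcurl; the URL is a placeholder and error handling is kept to a minimum):

// Fetch a URL into a std::string with libcurl's easy interface.
#include <curl/curl.h>
#include <iostream>
#include <string>

// libcurl calls this for each chunk of the response body.
static size_t write_cb(char* data, size_t size, size_t nmemb, void* userp)
{
    static_cast<std::string*>(userp)->append(data, size * nmemb);
    return size * nmemb;
}

int main()
{
    curl_global_init(CURL_GLOBAL_DEFAULT);

    CURL* curl = curl_easy_init();
    if (!curl)
        return 1;

    std::string body;
    curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/");  // placeholder URL
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, write_cb);
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, &body);

    CURLcode res = curl_easy_perform(curl);
    if (res != CURLE_OK)
        std::cerr << "curl error: " << curl_easy_strerror(res) << '\n';
    else
        std::cout << body << '\n';

    curl_easy_cleanup(curl);
    curl_global_cleanup();
}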

Alternatively, if I want to minimise external dependencies:
http://cplusplus.com/forum/general/58677/#msg316810
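
For the no-dependency route on Linux, a sketch along these lines issues an HTTP GET by hand over a plain POSIX socket; the host name and path are again placeholders:

// Dependency-free HTTP GET on Linux using POSIX sockets.
#include <sys/socket.h>
#include <netdb.h>
#include <unistd.h>
#include <cstring>
#include <iostream>
#include <string>

int main()
{
    // Resolve the host name; port 80 is plain HTTP.
    addrinfo hints;
    std::memset(&hints, 0, sizeof hints);
    hints.ai_family = AF_UNSPEC;
    hints.ai_socktype = SOCK_STREAM;

    addrinfo* result = 0;
    if (getaddrinfo("example.com", "80", &hints, &result) != 0)
    {
        std::cerr << "getaddrinfo failed\n";
        return 1;
    }

    int fd = socket(result->ai_family, result->ai_socktype, result->ai_protocol);
    if (fd < 0 || connect(fd, result->ai_addr, result->ai_addrlen) < 0)
    {
        std::cerr << "connect failed\n";
        freeaddrinfo(result);
        return 1;
    }
    freeaddrinfo(result);

    // Send an HTTP/1.0 request and read until the server closes the socket.
    const std::string request =
        "GET / HTTP/1.0\r\n"
        "Host: example.com\r\n"
        "Connection: close\r\n\r\n";
    send(fd, request.c_str(), request.size(), 0);

    char buffer[4096];
    ssize_t n;
    while ((n = recv(fd, buffer, sizeof buffer, 0)) > 0)
        std::cout.write(buffer, n);

    close(fd);
}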