Spiders to gather information from the internet

I am looking to make a tedious task fast. My friends and I listen to "A State of Trance" by Armin Van Buuren. It's a weekly radio show that features many different artists per two-hour mix. You can find the track listings at: http://www.arminvanbuuren.com/asot/ . I want to make a spider where you can just type in (either on the command line or in a GUI) 283, and it will save the track listing of episode 283 (http://www.arminvanbuuren.com/asot/98/) to a text file. Is this easy to do? Would it be easier to do in another language? Thanks again for your past help and current stuff too,

You can open a plain TCP connection, send a basic HTTP request by hand, and parse the response yourself. Fairly simple.
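Since the thread ends up on Ubuntu, here's a minimal bash sketch of what "a basic HTTP request by hand" looks like. The host and path are taken from the link in the question; `/dev/tcp` is a bash-only feature, and the actual fetch is left commented out since it needs network access.

```shell
#!/bin/bash
# Build a raw HTTP request by hand -- this is essentially all an
# HTTP library would do for a simple GET.
host="www.arminvanbuuren.com"
path="/asot/98/"

# HTTP/1.0 with Connection: close so the server ends the response
# by closing the socket. \r\n sequences are expanded by printf '%b'.
request="GET ${path} HTTP/1.0\r\nHost: ${host}\r\nConnection: close\r\n\r\n"

# Uncomment to actually send it over a TCP connection (bash's /dev/tcp):
# exec 3<>"/dev/tcp/${host}/80"
# printf '%b' "$request" >&3
# cat <&3 > page.html

# Show the request we would send.
printf '%b' "$request"
```

The response comes back with status line and headers first, then a blank line, then the HTML body, which is the part you'd parse for the track listing.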

If you can find an HTTP library, that'd make it even easier. But overall it's not too hard.
wget and grep can do this work.
If you are using Linux, you can just write a bash script.
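A minimal sketch of that bash-script idea: take an episode number on the command line, fetch the page with wget, and strip it down to text. Note the URL pattern below is a guess; the original post shows episode 283's listing living under /asot/98/, so the real mapping from episode number to page path would need checking by hand, as would the grep/sed pattern for the actual track-list markup.

```shell
#!/bin/bash
# Usage: ./asot.sh 283
# Saves the (assumed) track-listing page for the given episode as text.

build_url() {
    # Assumed URL scheme -- verify against the real site, since the
    # thread's example maps episode 283 to /asot/98/.
    echo "http://www.arminvanbuuren.com/asot/${1}/"
}

fetch_episode() {
    local episode="$1"
    local url
    url="$(build_url "$episode")"
    # -q: quiet, -O: choose the output file name.
    wget -q -O "asot-${episode}.html" "$url"
    # Crude tag-stripping; swap in a grep for the track-list markup
    # once the page structure is known.
    sed -e 's/<[^>]*>//g' "asot-${episode}.html" > "asot-${episode}.txt"
}

# Only fetch when an episode number was actually given.
if [ -n "$1" ]; then
    fetch_episode "$1"
fi
```

wget saves the page, and sed (or grep) trims it to the text file the original poster wanted.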
A script to download the source of the page? I'm running Ubuntu 8.10; I'll look into it, thanks,

Used wget, worked like a charm, thanks again,
