I recently got here from a Google search and found this website quite helpful (hats off, by the way, and keep it up :) )

But I sometimes have a problem: an internet connection isn't always available to me.

So I was wondering, is there an option to download the website's content for offline use?

On Linux it would be done with the 'wget' console program. You can download wget and use it on Windows or OS X as well. You can recursively download a site with:
wget -r <url>

You can download and install it for your respective OS here:

(NOTE: if you're already running Linux, it's probably already installed; you can check with wget --version.)
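A bare wget -r often leaves an offline copy with broken links and missing images. A fuller invocation, sketched from wget's documented mirroring options (the URL below is just a placeholder, not a site from this thread):

```shell
# Mirror a site so it browses cleanly offline:
#   --mirror           recursive retrieval with timestamping (-r -N -l inf --no-remove-listing)
#   --convert-links    rewrite links in downloaded pages so they work locally
#   --page-requisites  also fetch the images/CSS/JS needed to render each page
#   --no-parent        never climb above the starting directory
wget --mirror --convert-links --page-requisites --no-parent https://www.example.com/
```

--convert-links is what makes the saved pages point at your local copies instead of back at the live site.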
Use a program like httrack to download the page and the links on that page. Be careful, though: if you don't set it up right or stop it, it will download all the links on that page, then all the links on those pages, and so on. The disk usage adds up fast.
lol, imagine if that site had links to other sites, which had links to other sites, which had links to Google and other search engines... you'd effectively download the entire internet until your HDD space runs out LOL
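That runaway recursion is exactly what the tools let you bound. With wget, for instance, you can cap the depth and pin the crawl to one domain (a sketch; the URL is a placeholder):

```shell
# Keep a recursive download from spidering the whole internet:
#   -r -l 3            recurse at most 3 levels deep (wget's default is 5)
#   --no-parent        stay below the starting directory
#   --domains ...      only follow links within the listed domains
wget -r -l 3 --no-parent --domains example.com https://www.example.com/
```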

Btw, I've used httrack before, as well as other similar applications. If I remember correctly, though, httrack doesn't arrange the files the same way they are on the website; instead it creates a single index.html, with every other file held in a subfolder. I could be mistaking it for a different app, though.
OP, once you get the site downloaded, you can copy the directory containing it to a mobile device (e.g. an iPod, iPhone, or Android phone) and have access to it any time you have access to your device, regardless of internet connection. I did this myself when I was learning the basics of the language.