Which software should I use to 'save' a website?

1

I want to save a few pages so that I can view them offline. I'm looking for good free software that can save a website for me, let me view it offline, and also search the saved pages for text.

Can anybody recommend any software?

Pete2k

Posted 2011-07-08T14:20:41.883

Reputation: 713

Question was closed 2014-08-09T05:26:00.453

I deleted my post. You can't give -1 to people just because they gave you a suggestion. I don't want to be like them, and I don't want to help you – None – 2011-07-08T14:28:40.497

Your browser can do it. If you're on IE, just save the website as a .mht file; in FF, save it as "Web Page, complete". You can open either file offline and all the relevant parts will be there. – None – 2011-07-08T14:34:41.350

btw, if you're not downvoting the below suggestions (and really you shouldn't be because they are good), say so. It would be a shame to have your question closed because someone else is behaving poorly. – None – 2011-07-08T14:36:10.170

@dnagirl Both FF and IE offer to save a complete page, not an entire site (which in most cases is a set of pages). – fvu – 2011-07-08T14:44:07.007

Answers

8

I like HTTrack. Very flexible, yet relatively easy to use.

As much as I like wget, the visual feedback HTTrack gives helps you spot problems with your mirroring operation, and that saves a lot of time and frustration.

fvu

Posted 2011-07-08T14:20:41.883

Reputation: 233

6

I use wget, which you can get here for Windows, and which is included with pretty much every *nix out there. The -m option lets you mirror a website, though in my experience it has always required some fiddling.
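For reference, a typical mirroring invocation looks something like the following. The URL is a placeholder and the exact flags you need will vary by site; the command is shown via echo rather than executed, so nothing is actually fetched:

```shell
# Mirror a site for offline viewing (example.com is a placeholder).
#   -m    mirror: recursive download with timestamping
#   -k    convert links so pages work when browsed offline
#   -p    also fetch page requisites (images, CSS, scripts)
#   -np   never ascend above the starting directory
cmd='wget -m -k -p -np https://example.com/docs/'
echo "$cmd"
```

The -k and -p flags are what make the result usable offline: without them, pages still point at the live site and images are missing.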

Teleport Pro is something I've always seen recommended for this kind of problem on Windows, but I've never used it myself.

Once the files are downloaded, you can search their contents using any grep-like tool.
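To illustrate the search step, here is a minimal sketch using plain grep. The directory and file below are made-up stand-ins for a downloaded mirror:

```shell
# Create a tiny stand-in for a mirrored site (paths are hypothetical).
mkdir -p mirror/example.com
printf '<html><body>works offline</body></html>\n' > mirror/example.com/index.html

# -r  recurse into the mirror directory
# -i  match case-insensitively
# -l  list only the files that contain the text
grep -ril 'offline' mirror
# prints mirror/example.com/index.html
```

Drop the -l flag if you want to see the matching lines themselves rather than just the file names.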

ravuya

Posted 2011-07-08T14:20:41.883

Reputation: 161

+1. I have no idea why anyone would downvote this answer. – You – 2011-07-08T14:27:26.667

Same here; when I started writing the post that I have since deleted, everything was at -1. – None – 2011-07-08T14:31:13.227

3

wget is a command-line tool that can do this. Once you've downloaded the files, you can search them using any file-searching utility.

paulmorriss

Posted 2011-07-08T14:20:41.883

Reputation: 1 639

What fool has marked you down 1? – Pete2k – 2011-07-08T14:26:41.500

0

A site is just a set of HTML documents. Saving a full site would require some automation and may take a notable amount of disk space.

There are a few browser plug-ins that will load all the resources a site uses. For sites that pull their content from databases, you might not get the full site.

If you want to save specific pages, right-click the page and choose "Save page as".

Most browsers do it the same way.

Here is a related question: How can I download an entire website?

Uncreative Name

Posted 2011-07-08T14:20:41.883

Reputation: 64