Free Mac OS X application for downloading an entire website

27

19

Is there any free application, installable on Mac OS X 10.6, for downloading an entire site?

Am1rr3zA

Posted 2009-11-08T20:00:55.413

Reputation: 4 715

Answers

21

I've always loved the name of this one: SiteSucker.

UPDATE: Versions 2.5 and above are not free any more. You may still be able to download earlier versions from their website.

Grzegorz Adam Hankiewicz

Posted 2009-11-08T20:00:55.413

Reputation: 1 029

Ditto! This thing works, has a nice GUI, and is easy to configure. – Brad Parks – 2014-08-11T01:58:38.983

It's not free. On the App Store, they're asking $5. – JohnK – 2014-09-16T04:26:20.293

2

@JohnK looks like they changed policy for 2.5.x and above, but earlier versions are still available for free from http://www.sitesucker.us/mac/versions2.html.

– Grzegorz Adam Hankiewicz – 2014-09-17T17:46:44.480

1

@GrzegorzAdamHankiewicz is right, you can download 2.4.6 free from their site here - http://ricks-apps.com/osx/sitesucker/archive/2.x/2.4.x/2.4.6/SiteSucker_2.4.6.dmg

– csilk – 2017-04-11T04:25:11.467

FYI: 2.4.6 still works great on macOS Mojave 10.14.5 – Xavi Esteve – 2019-07-29T09:40:30.660

I like this one too. Simple to use interface also. – Troggy – 2009-11-10T16:59:14.490

44

You can use wget with its --mirror switch.

wget --mirror -w 2 -p --html-extension --convert-links -P /home/user/sitecopy/ http://example.com/

See the man page for additional switches.

For OS X, you can easily install wget (and other command-line tools) using Homebrew (brew).
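
For example, a minimal install-and-check sequence (assuming Homebrew itself is already set up) might look like this:

# Install wget via Homebrew, then confirm which version got installed
brew install wget
wget --version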

If using the command line is too difficult, then CocoaWget is an OS X GUI for wget. (Version 2.7.0 includes wget 1.11.4 from June 2008, but it works fine.)

John T

Posted 2009-11-08T20:00:55.413

Reputation: 149 037

As of v1.12, --html-extension was renamed to --adjust-extension – Rog182 – 2015-01-15T04:58:02.637

@JohnT Does wget download all CSS/JS/images and fonts too? – Volatil3 – 2016-11-12T14:40:27.347

@Volatil3 yes, it does. – CodeBrauer – 2016-12-23T17:35:38.440

Wget is great. I use wget --page-requisites --adjust-extension --convert-links when I want to download single but complete pages (articles etc.). – ggustafsson – 2012-02-04T15:05:04.227

I need software, man, I don't want to use wget – Am1rr3zA – 2009-11-08T20:07:29.707

wget is software, and it's the most flexible. – John T – 2009-11-08T20:09:37.317

wget is brilliant software; it's a one-stop shop for any downloading you might fancy. – Phoshi – 2009-11-08T20:15:36.837

OK then, thanks for your answer – Am1rr3zA – 2009-11-08T20:17:39.513

3

SiteSucker has already been recommended, and it does a decent job for most websites.

I also find DeepVacuum to be a handy and simple tool with some useful "presets".

Screenshot is attached below.

[Screenshot: simple interface with "presets"]

PKHunter

Posted 2009-11-08T20:00:55.413

Reputation: 193

Awesome tool, just what I needed! – Hello World – 2014-10-23T12:26:35.517

It's really simple and works very well! – madx – 2015-09-03T21:18:54.557

2

MicTech

Posted 2009-11-08T20:00:55.413

Reputation: 9 888

1

http://epicware.com/webgrabber.html

I use this on Leopard; not sure if it will work on Snow Leopard, but it's worth a try.

Robbie

Posted 2009-11-08T20:00:55.413

Reputation:

1

pavuk is by far the best option... It is a command-line tool but has an X Window GUI if you install X11 from the installation disc or as a download. Perhaps someone could write an Aqua shell for it.

pavuk will even find links in external JavaScript files that are referenced and point these to the local copy if you use the -mode sync or -mode mirror options.

It is available through MacPorts (the OS X ports project); install MacPorts and type

port install pavuk

Lots of options (a forest of options).
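
As a rough sketch, a mirror run might look like the lines below; the URL and output directory are placeholders, and the -cdir flag (local storage directory) is an assumption based on pavuk's documented options, so check pavuk -help before relying on it:

# Hypothetical invocation: mirror a site into ~/sitecopy using the -mode mirror option mentioned above
# -cdir (where downloaded documents are stored) is assumed; verify with pavuk -help
pavuk -mode mirror -cdir ~/sitecopy http://example.com/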

user25971

Posted 2009-11-08T20:00:55.413

Reputation:

0

A1 Website Download for Mac

It has presets for various common site download tasks and many options for those who wish to configure in detail. Includes UI + CLI support.

Starts as a 30-day trial, after which it turns into "free mode" (still suitable for small websites under 500 pages).

Tom

Posted 2009-11-08T20:00:55.413

Reputation: 347

-2

Use curl; it's installed by default in OS X. wget isn't, at least not on my machine (Leopard).

Typing:

curl http://www.thewebsite.com/ > dump.html

This will download the page to the file dump.html in your current folder.
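
If you only need a handful of known pages (curl won't follow links on its own), a small loop along these lines is one sketch; the page names below are just placeholders:

# Fetch a fixed list of pages one by one; curl does not recurse into linked pages
for page in index about contact; do
  curl -o "$page.html" "http://www.thewebsite.com/$page.html"
done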

Fred

Posted 2009-11-08T20:00:55.413

Reputation: 140

The main problem with that is that it downloads only the homepage, not the entire website. – Phoshi – 2009-11-08T21:58:17.190

Well, look at the man page – Fred – 2009-11-08T21:59:46.580

Last I checked, curl doesn't do recursive downloads (that is, it can't follow hyperlinks to download linked resources like other web pages). Thus, you can't really mirror a whole website with it. – Lawrence Velázquez – 2009-11-09T00:05:58.697

Well, then do a quick script to get the links; we are in command-line land, right? Otherwise, just use a tool with a graphical front end. – Fred – 2009-11-09T00:54:33.943

A quick script, I dare you ;-) Last time I checked, curl also didn't even download the media embedded within that single web page. So: I'd love to see that script that, for a single page, 1) fetches all images etcetera and 2) rewrites the HTML to refer to those local copies... ;-) – Arjan – 2009-11-10T16:32:35.963

(And its name is cURL... I think John T's edits really improved your answer.) – Arjan – 2009-11-10T22:04:13.940

There is no difference, and it's an erroneous way to write it, although often used in marketing and so on. It's a name, so I suppose the correct way would be to capitalize the first letter. Look up the man page in a terminal and you'll see what I'm talking about; it's either curl or Curl. But what are you arguing about here, something of substance? I think not. – Fred – 2009-11-10T23:04:52.380