I'd like to back up the content of my blog, which is powered by posterous.com. I'd like to save all texts and images to the local disk. The ability to browse it offline is a plus.
What I've already tried:
wget -mk http://myblogurl
It downloads the first page with the list of posts, then stops with a "20 redirections exceeded" message.
It downloads the first page with a redirect to the www.posterous.com home page instead of the real page content.
Edit: The URL of the site I'm trying to back up is blog.safabyte.net
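For reference, `wget` has a few more flags (documented in the GNU Wget manual) that help when the goal is an offline-browsable mirror rather than a bare recursive fetch. A sketch, using the blog URL from the question; the command is printed rather than executed here:

```shell
#   -m   mirror: recursive download with timestamping
#   -k   convert links in the saved pages so they work offline
#   -p   also fetch page requisites (images, CSS) needed to render each page
#   -E   save HTML files with an .html extension
#   -np  never ascend to the parent directory
URL="http://blog.safabyte.net"
CMD="wget -mkpE -np $URL"
echo "$CMD"   # shows the full command instead of running it
```

Whether the extra flags help against posterous.com's redirects is untested; they only make the mirror more complete and browsable once the fetch itself works.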
I tried on a random user on posterous, and it worked without any problems. How about giving us the actual site URL? – gorilla – 2010-01-23T00:45:48.970
Link added. See bottom of the question. – Martin Vobr – 2010-01-23T01:21:03.287
Just tried; wget picked up your full blog contents – Sathyajith Bhat – 2010-01-23T05:33:38.013
Could you post the command line? In my case 'wget -mk http://blog.safabyte.com' gets index.html only. No images are downloaded. No pages with posts are downloaded. I'm using wget 1.11.3 from cygwin running on WinXP. – Martin Vobr – 2010-01-23T10:37:37.883
@Martin Vobr:
wget -mk http://blog.safabyte.net
GNU Wget 1.11.1 on openSUSE 11.0 – Sathyajith Bhat – 2010-01-23T17:03:05.593
Added a 'windows' tag as it seems to be OS-specific. After trying a few things I've found a solution. It looks like
wget -mk http://blog.safabyte.net
does not work on Windows. However
wget -mk http://blog.safabyte.net/*
DOES work. – Martin Vobr – 2010-01-23T18:50:43.163
Thanks @Sathya and @gorilla. Your proof that it works for others made me fiddle with the parameters again and find out how to get it working. – Martin Vobr – 2010-01-23T18:52:20.710
@Martin : Glad to hear it worked out. You might want to post your comment as an answer and mark it as accepted, it would help others in the future. – Sathyajith Bhat – 2010-01-24T06:03:09.373
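For anyone landing here later, the variant Martin reported as working under cygwin wget 1.11.3 on WinXP simply appends /* to the URL. A sketch of that invocation; quoting the URL is an added precaution (an assumption about Unix shells, not about wget) so an interactive shell cannot glob-expand the * before wget sees it:

```shell
# The trailing /* is the only change from the failing invocation above.
URL='http://blog.safabyte.net/*'
CMD="wget -mk $URL"
echo "$CMD"   # shows the full command instead of running it
```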