I have a Google Code project which has a lot of wiki'ed documentation. I would like to create a copy of this documentation for offline browsing, using wget or a similar utility.
I have tried the following:
$ wget --no-parent \
--recursive \
--page-requisites \
--html-extension \
--base="http://code.google.com/p/myProject/" \
"http://code.google.com/p/myProject/"
The problem is that links within the mirrored copy come out like:
file:///p/myProject/documentName
Rewriting the links this way causes 404 (not found) errors, since they point nowhere valid on the filesystem.
What options should I use with wget instead, so that I can make a local copy of the site's documentation and other pages?
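From reading the wget manual, I suspect the missing piece is --convert-links, which rewrites links in the downloaded files so they resolve locally. Here is a sketch of what I would expect to work (same options as above, with --convert-links in place of --base), though I haven't confirmed it against Google Code:
$ wget --no-parent \
--recursive \
--page-requisites \
--html-extension \
--convert-links \
"http://code.google.com/p/myProject/"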
Just FYI, the source for the wiki pages is located in your source repository. So you could download them all and run them through your own renderer. – Der Hochstapler – 2012-03-26T10:25:09.083
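(For example, assuming the project uses Subversion hosting: Google Code kept the wiki sources as .wiki files under the repository's /wiki path, so a checkout along these lines should fetch them all:
$ svn checkout http://myProject.googlecode.com/svn/wiki/ myProject-wiki
)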
That doesn't help me because it contains a lot of Google Code markup. I'm asking how I would do this with wget or curl (or similar), I think. – Alex Reynolds – 2012-03-26T10:34:46.420

@AlexReynolds This may help with that: https://addons.mozilla.org/en-US/firefox/addon/google-code-wiki-viewer/ – HackToHell – 2012-03-26T12:09:37.657