Most wikis store their data in a database. The pages are not static files you can download off the web server; they are generated dynamically at the moment you request them, using a number of queries to that database.
Finding out the size would be tricky: you would need the total size of the database, plus any supporting files in the web-accessible directory.
I suppose if you wanted to download all 2000 articles as they stand today, you could write a script that queries the database for each article and downloads it to your machine. But to reach the revisions of each article, and to access any deleted articles, you would need to understand the URL scheme of the wiki software in question. Then you could measure the size of all of those downloaded files... but that still may not give you an accurate idea of the size as it is actually stored on the web and database servers.
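As a minimal sketch of that approach, assuming the wiki runs MediaWiki and you have fetched each page's revision list through its standard API (`action=query&prop=revisions&rvprop=size`): sum the reported byte size of every revision of every page. The sample payload below only mimics the API's JSON shape; the numbers are hypothetical, and other wiki engines would need different endpoints entirely.

```python
# Sum the size of every revision of every page, given data in the shape
# MediaWiki's API returns (action=query&prop=revisions&rvprop=size).
# Fetching the real data over HTTP is left out; this only does the tally.

def total_revision_bytes(pages):
    """Sum the 'size' field (bytes of wikitext) over all revisions of all pages."""
    return sum(rev["size"]
               for page in pages
               for rev in page.get("revisions", []))

# Illustrative response fragment (hypothetical page names and sizes):
sample_pages = [
    {"title": "Main Page", "revisions": [{"size": 2048}, {"size": 1900}]},
    {"title": "Help",      "revisions": [{"size": 512}]},
]

print(total_revision_bytes(sample_pages))  # 4460
```

Note that this counts raw wikitext only; images, attachments, and database overhead are on top of that.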
You can't. It's like asking "How long is a rope?". And the answer is "It depends.". – TFM – 2011-08-06T07:14:17.223
BTW: You could make a guess: "estimated bytes per page" x "number of pages" x "estimated number of revisions per page". But what about the pictures? – TFM – 2011-08-06T07:19:07.233
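The back-of-the-envelope formula in that comment works out like this, with the 2000 pages from the question and two entirely assumed averages (bytes per page and revisions per page are guesses, not measurements):

```python
# Rough size estimate per the formula:
#   estimated bytes per page x number of pages x estimated revisions per page
pages = 2000            # number of articles, from the question
bytes_per_page = 5000   # ASSUMED average wikitext size per revision
revs_per_page = 10      # ASSUMED average number of revisions per page

estimate = pages * bytes_per_page * revs_per_page
print(estimate)  # 100000000 bytes, i.e. roughly 95 MiB -- pictures not included
```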
It totally depends on the website in question, therefore much too broad. You'll only know the size if you fully (recursively) download every page there is and then look at the size. By the way: Some Wikis allow content to be downloaded in one huge dump. – slhck – 2011-08-06T10:37:17.393
I'm not asking "how long is a rope" - I'm asking, in a way, "how to find the given rope's length". – Adobe – 2011-08-06T11:13:23.603