Downloading a RAR file from Mediafire using WGET

Example: http://www.mediafire.com/?tjmjrmtuyco

This was what I tried...

wget -A rar [-r [-l 1]] <mediafireurl>

That is to say, I tried with and without the recursive option. It ends up downloading an HTML page of a few KB, while the file I want is a RAR archive in the 90–100 MB range.

What happens with MediaFire for those who may not be aware, is that it first says

Processing Download Request...

This text after a second or so turns into the download link and reads

Click here to start download..

I would appreciate it if someone would tell me how to write a proper script for this situation.

Radha Krishna Murthy Lolla

Posted 2011-07-19T11:24:39.757

Please mark mine as the answer: https://superuser.com/a/1517096/635532

– Zibri – 2020-01-14T21:49:05.880

This is probably not allowed according to the Mediafire TOS and they will do their best to make it as hard as possible for you to do.

– Joachim Sauer – 2011-07-19T11:38:27.210

Seems to be difficult with the CAPTCHA, JavaScript timer, and all the other things in place... they also have mechanisms to block downloads from much more sophisticated download managers. – tumchaaditya – 2012-06-12T02:39:09.773

You can try JDownloader; it automates the download process from such file-sharing sites (Mediafire, FileSonic, etc.). – tumchaaditya – 2012-06-12T02:40:49.617

Answers

From Mediafire's Terms of Service:

General Use of the Service, Permissions and Restrictions

You agree while using MediaFire Services, that you may not:

Alter or modify any part of the Services;

Use the Services for any illegal purpose;

Use any robot, spider, offline readers, site search and/or retrieval application, or other device to retrieve or index any portion of the Services, with the exception of public search engines

So essentially, by using anything other than the tools Mediafire provides via its website, you are in fact breaking their terms of service.

blendmaster345

I've never tried myself, but there are a few things you could try to "cheat" the website.

For example, --referer lets you specify a referer URL. The site may expect you to arrive from a specific "home" page; with this option, wget pretends it came from there.

Also, --user-agent makes wget identify itself as a different client, such as a browser like Firefox.

--header lets you add arbitrary HTTP headers, so the whole request can mimic that of a browser.

If none of those work, there are more options dealing with cookies and other advanced settings; see man wget for the full list.
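Putting those options together, a rough sketch might look like the following. The referer and user-agent values are illustrative assumptions, not values known to defeat Mediafire's checks; the command is built in an array so you can inspect it before running it.

```shell
# Sketch only: the referer and user-agent strings below are assumptions.
url="http://www.mediafire.com/?tjmjrmtuyco"

# Build the wget invocation in a bash array for inspection before execution.
cmd=(wget
     --referer="http://www.mediafire.com/"
     --user-agent="Mozilla/5.0 (X11; Linux x86_64; rv:10.0) Gecko/20100101 Firefox/10.0"
     --header="Accept: text/html,application/xhtml+xml"
     "$url")

echo "${cmd[@]}"   # review the full command line first
# "${cmd[@]}"      # uncomment to actually attempt the download
```

Even with all three options, this will only work if the site's protections stop at header checks.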

I hope this helps a bit: if you succeed, please post how you did it!

MacThePenguin

Actually it can be done. What you have to do is:

  • Go to the link like you're going to download to your computer
  • When the download button comes up, right-click it, copy the link, and pass that to wget.

It'll be something like

wget http://download85794.mediafire.com/whatever_your_file_is

T Jones

That's right! It works this way. – Ahmed Essam – 2016-04-18T22:09:23.783

2

Sites like this use multiple methods to prevent simple/automated downloading. A few examples of such techniques include:

  • Using sessions
  • Generating unique download links/keys
  • Using CAPTCHAS (can be defeated, but certainly not by wget)
  • Timers for non-premium users to delay the download
  • IFrames containing the download link
  • Providing the link from another site/domain
  • Checking the web client (is it a web browser or something else)
  • Checking referer to prevent hotlinking (did the download request come from the site or elsewhere)
  • Checking the headers to verify it conforms to their expectations
  • Using POST instead of GET so "hidden" form fields can be submitted
  • Setting and checking cookies
  • Using JavaScript to redirect or generate the download link
  • Using Flash to test the user or generate the download link

Basically, downloading files from sites like this with tools like cURL or wget would, at best, be difficult, and certainly not practical.
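To make the cookie and referer checks above concrete, here is a hedged sketch of the "session dance" wget can attempt. The URLs are placeholders, and this covers only two items from the list; CAPTCHAs, timers, and JavaScript-generated links would still defeat it.

```shell
# Placeholder URLs: this sketches only the cookie/referer steps.
landing="http://example.com/download-page"
file="http://example.com/files/archive.rar"

# Step 1: visit the landing page and keep any session cookies it sets.
step1=(wget --save-cookies cookies.txt --keep-session-cookies -O page.html "$landing")

# Step 2: request the file with those cookies, claiming we came from the landing page.
step2=(wget --load-cookies cookies.txt --referer="$landing" "$file")

printf '%s\n' "${step1[*]}" "${step2[*]}"   # inspect; run the arrays to execute
```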

Synetech

Right-click the download button, choose "Copy link address", then run:

wget <url>

Easy as that; I just did it.

AnonymousUser

bash function:

mdl () {
  # Scrape the page for the last direct-download href and fetch it.
  url=$(curl -Lqs "$1" | grep "href.*download.*media.*" | tail -1 | cut -d '"' -f 2)
  aria2c -x 6 "$url"   # or: wget "$url" if you prefer.
}

Example:
$ sudo apt install aria2
$ mdl "http://www.mediafire.com/?tjmjrmtuyco"

01/14 13:58:34 [NOTICE] Downloading 1 item(s)
38MiB/100MiB(38%) CN:4 DL:6.6MiB ETA:9s]

Zibri
