I have over 100 GB worth of files on a server sitting on a dedicated 1 Gbps port. Our office sits on a 100 Mbps port. Each of the archive files is about 1 to 5 GB.
What protocol would be the fastest way to download these files?
I was thinking that a direct HTTP connection would be better than FTP or BitTorrent.
Calculate the theoretical limit, test HTTP, and see how close you get to that limit. – the8472 – 2017-02-03T16:09:33.473
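Following the comment above, the theoretical floor is set by the slower of the two links, the 100 Mbps office port. A back-of-the-envelope sketch, assuming a 100 GiB payload and ignoring protocol overhead:

```shell
# Bottleneck is the office side: 100 Mbit/s, regardless of protocol.
awk 'BEGIN {
  bytes = 100 * 1024^3        # 100 GiB of archives
  link  = 100 * 10^6          # 100 Mbit/s office port
  secs  = bytes * 8 / link    # bits to send / bits per second
  printf "theoretical minimum: %.0f s (%.1f h)\n", secs, secs/3600
}'
# prints: theoretical minimum: 8590 s (2.4 h)
```

Any protocol that keeps the 100 Mbps pipe full gets within a few percent of this; the choice between HTTP, FTP, and BitTorrent mostly matters if a single connection cannot saturate the link.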
Possible duplicate of "For file transfer, does ftp perform better than http?"
– Ƭᴇcʜιᴇ007 – 2017-02-03T16:11:36.087
This depends on many factors, so there isn't really a silver-bullet answer to give you (as discussed in the linked duplicate). What have your tests shown to be fastest in your scenario? – Ƭᴇcʜιᴇ007 – 2017-02-03T16:13:58.790
I've found setting up a bittorrent service and using bittorrent to be faster so far. – Jason – 2017-02-03T16:36:45.680
I'd personally go with SCP for its security, and enable compression if both ends can cope with it... – djsmiley2k TMW – 2017-02-03T16:51:29.573
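A minimal sketch of the SCP approach suggested above; the user, host, and paths are hypothetical placeholders. `-C` turns on compression (of limited benefit for already-compressed archives) and `-p` preserves file timestamps:

```shell
# Hypothetical host and paths; adjust to your server.
# -C: compress in transit  -p: preserve modification times
scp -Cp user@fileserver:/data/archives/part-01.tar.gz ./downloads/
```

Note that SCP's encryption costs some CPU on both ends, so on a fast link it can fall short of plain HTTP unless both machines keep up.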
Dunno, but it may be more a question of the quickest way in practice, and I don't know that either. Your question is fine, though you've framed it somewhat theoretically, whereas in practice one method may simply work faster for reasons not purely down to the protocol. – barlop – 2017-02-03T16:55:42.600