How can I calculate how long it will take to transfer a file over LAN?

6

3

I want to know about the method with which we can calculate data transfer speed.

If I am sending a 1 GB file through a 1 Gb/s LAN connection, from one computer to another, how much time will it take to transfer that file?

Rohit

Posted 2010-09-16T06:30:29.497

Reputation: 61

There will be multiple bottlenecks along the way that you have to consider: for example, hard drive read/write times, other programs using the LAN connection, and other programs using the CPU at the same time. The copy dialogs try to indicate this based on current throughput, but changes in network speed and load will shift the estimate midway. Theoretically, with no bottlenecks, it should take 8 seconds, as there are 8 gigabits in a gigabyte - although due to the aforementioned and other factors, this is almost never the case. – QuickishFM – 2020-01-06T18:32:08.063

3Have you ever noticed how the Windows file-copy dialog says 1 min, then 10 seconds, then 8 minutes, then 14 seconds? That should give you an idea of how hard doing this correctly is. – Toby Allen – 2012-07-10T20:38:20.127

Answers

12

As Hippo and MaQleod have stated, a byte is 8 bits.
This means 1 Gigabit = 0.125 GigaBytes = 125 MegaBytes.

This means the theoretical maximum of a 1Gbps connection is 0.125 GigaBytes per second.

Remember, the entire connection will run at the speed of the slowest element. So, if you're downloading to your hard drive you'd expect it to be limited to the speed of the drives - about 60-70MB/s for a common mechanical hard drive.

Chances are even if there's nothing else to limit the speed you will still not achieve the theoretical maximum speed for data transfer because of other restricting factors such as packet overhead.
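As a back-of-the-envelope check, the arithmetic above can be put into a few lines of Python. Note that the `efficiency` factor here is an illustrative assumption standing in for packet overhead and other losses, not a measured value:

```python
# Rough transfer-time estimate for a file over a network link.
# The efficiency factor is an illustrative assumption for protocol
# and packet overhead - real-world values vary.
def transfer_time_seconds(file_bytes, link_bits_per_sec, efficiency=0.9):
    file_bits = file_bytes * 8          # a byte is 8 bits
    return file_bits / (link_bits_per_sec * efficiency)

# 1 GB (SI) file over a 1 Gb/s link, assuming ~90% efficiency:
print(transfer_time_seconds(1_000_000_000, 1_000_000_000))  # ~8.9 s
```

With `efficiency=1.0` this reduces to the ideal 8 seconds; any realistic overhead pushes it higher, before the hard drive is even considered.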

Also, you ideally want to make sure you are using Cat6 cabling, not Cat5/5e.


Note on size prefixes

This section is why I felt I'd add my answer, even though it's a moderate dupe of the answers so far.

There are two main schemes for prefixing bytes to indicate magnitude:

SI Prefix (abbr)= Num Bytes             |  IEC Prefix (abbr)= Num Bytes       
-------------------------------------------------------------------------------
1 GigaByte (GB) = 1 000 000 000 (10^9)  |  1 GibiByte (GiB) = 1 073 741 824 (2^30)
1 MegaByte (MB) = 1 000 000     (10^6)  |  1 MebiByte (MiB) = 1 048 576     (2^20)
1 KiloByte (KB) = 1 000         (10^3)  |  1 KibiByte (KiB) = 1 024         (2^10)

It is very common for people to use the SI prefixes to mean the IEC numbers of bytes, although in all official terms this usage is deprecated and shouldn't be used. It doesn't help that both prefix schemes are often incorrectly represented by the same abbreviations - you often can't tell just by looking whether "GB" means GigaByte or GibiByte. Even though it should mean Giga, it is often used to represent Gibi - in Windows Explorer, for example.

This is why you often buy a 500GB hard drive that, when connected, only has ~465GiB of space - the manufacturer is using Giga, and the OS is using Gibi.

In terms of Gigabit Ethernet, it runs at a speed of 1000 megabits per second - or 1 000 000 000 bits/s - so for completeness the final results are:

1 Gigabit  =  125 000 000 Bytes  =   125 MegaBytes  =   0.125 GigaBytes 
                                 =  ~119 MebiBytes  =  ~0.116 Gibibytes  
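These conversions are easy to verify with a few lines of Python (a minimal sketch; the variable names are just illustrative):

```python
# Convert 1 Gigabit (SI) into bytes under both prefix schemes.
bits = 1_000_000_000            # 1 Gigabit (SI, 10^9 bits)
num_bytes = bits // 8           # 125,000,000 bytes

megabytes = num_bytes / 10**6   # SI:  125.0 MegaBytes
mebibytes = num_bytes / 2**20   # IEC: ~119.2 MebiBytes
gibibytes = num_bytes / 2**30   # IEC: ~0.1164 GibiBytes

print(megabytes, round(mebibytes, 1), round(gibibytes, 4))
```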

DMA57361

Posted 2010-09-16T06:30:29.497

Reputation: 17 581

The IEC prefixes are not shortened like the SI ones: Gibibyte is shortened to GiB. Also, all telecommunication speeds are entirely in the SI scheme, and it is not uncommon for hard drive manufacturers to mark a 1000 GiB hard drive as a 1 T(i)B one (the GiB is real, and the TiB is imaginary here). – whitequark – 2010-11-29T15:05:20.260

@white you are correct, thanks, I'll see about fixing it. I'm starting to think I should leave old posts alone even in the face of an error, there's also something else that crops up to change after the initial one. ;) – DMA57361 – 2010-11-29T18:59:46.770

In my experience, the size of the file doesn't matter as much as the protocol involved. NFS has much higher transfer speeds than SMB due to the packet overhead. There is much more information being sent over SMB than there is with NFS. So the calculation really is moot if you don't account for the protocol. – MaQleod – 2012-07-10T20:36:40.143

1

In a 1 Gbps connection, 1 Gigabit will take 1 second. Since there are 8 bits in a byte, 1 Gigabyte will take 8 times longer.

So your 1 GB file will take 8 seconds in ideal conditions. However, hard disk speeds are usually much slower, so your file transfer might take three times longer to complete.

Hippo

Posted 2010-09-16T06:30:29.497

Reputation: 474

so potentially 24 seconds. How come server-to-server 1Gbit transfers take 35 minutes? – chovy – 2015-12-18T07:31:11.620

0

1 byte = 8 bits; this means that 1 gigabyte (in the binary sense, 2^30 bytes) is equal to 8,589,934,592 bits, or roughly 8.6 gigabits. So 1 gigabyte will take about 8 seconds on a 1 gigabit/second LAN (but you have to allow for some packet overhead, so it will take a bit longer).

I must add that this will vary GREATLY depending on protocol. For instance, transfers over NFS have much lower overhead than transfers over SMB, and both are significantly faster/leaner than NetBIOS (which hopefully no one in their right mind is using anymore).

MaQleod

Posted 2010-09-16T06:30:29.497

Reputation: 12 560

0

The transfer protocol matters. I am assuming you are using Windows, since it is the most common OS, and the lack of detail in your question implies that you are using Windows file sharing, which uses SMB. I would say that you'll see 20-30 megabytes per second. That is, again, assuming all the computers are running Windows 7, or at least Vista SP1, with Gigabit Ethernet connected correctly.

I suggest you get TeraCopy or something similar and watch the copy speed to get a better estimate.

Mavromatis Lozay

Posted 2010-09-16T06:30:29.497

Reputation: 424

0

Note that the 1 GB file = 1 × 2^30 bytes = 2^33 bits (on Windows, as it incorrectly uses SI prefixes when it should be using IEC ones instead [1]).

While the data transfer rate is 1 Gb/s = 10^9 bps. [2]

So "ideally" it would take...

2^33 bits / 10^9 bps = 8,589,934,592 b / 10^9 bps ≈ 8.59 s

Of course, the HDD latency, network parameters, propagation delay, etc. play a part in the final estimate.
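The same calculation, as a quick Python sketch of the arithmetic above:

```python
# Windows reports "1 GB" as 2**30 bytes, i.e. 2**33 bits.
file_bits = 2 ** 33        # 8,589,934,592 bits
link_rate = 10 ** 9        # 1 Gb/s in bits per second
seconds = file_bits / link_rate
print(f"{seconds:.2f} s")  # ~8.59 s
```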


Reference:

  1. http://en.wikipedia.org/wiki/Data_rate_units#Problematic_variations
  2. http://en.wikipedia.org/wiki/Data_rate_units#Conversion_formula

Kent Pawar

Posted 2010-09-16T06:30:29.497

Reputation: 562