Many download managers like this, this and this support downloading a file over multiple parallel connections, one per thread.
The idea is that each connection downloads a different part of the file. For example, with 5 connections, the first connection downloads the 0-20% portion of the file, the second downloads the 20-40% portion, and so on.
Similarly, on the server side, there would be 5 threads, each one reading its own 20% slice of the file in parallel.
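To make it concrete, here is roughly what I imagine the client side doing (a minimal sketch in Python; the URL, output file name, and the assumption that the server honours HTTP Range requests and reports Content-Length are all just for illustration):

```python
import concurrent.futures
import requests

URL = "https://example.com/big-file.bin"   # hypothetical file
CONNECTIONS = 5

def download_range(start, end):
    # Ask the server for only the bytes [start, end] of the file.
    resp = requests.get(URL, headers={"Range": f"bytes={start}-{end}"})
    resp.raise_for_status()
    return start, resp.content

def parallel_download():
    # Assumes the server reports the total size in Content-Length.
    total = int(requests.head(URL).headers["Content-Length"])
    chunk = total // CONNECTIONS
    ranges = [
        (i * chunk, (i + 1) * chunk - 1 if i < CONNECTIONS - 1 else total - 1)
        for i in range(CONNECTIONS)
    ]
    parts = {}
    with concurrent.futures.ThreadPoolExecutor(max_workers=CONNECTIONS) as pool:
        for start, data in pool.map(lambda r: download_range(*r), ranges):
            parts[start] = data
    # Reassemble the chunks in order on disk.
    with open("big-file.bin", "wb") as f:
        for start in sorted(parts):
            f.write(parts[start])

parallel_download()
```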
But I thought that concurrently reading a single file with multiple threads would actually make the download significantly slower, since the read head of a mechanical disk would have to do far more seeks than before.
Even if we assume that the disk controller's queuing mechanism is intelligent enough to batch all 5 multipart requests for the same file into a single sequential read, that still gives us no advantage over doing the read in just one thread and serving the file over a single HTTP connection.
So how do parallel downloads of a single file end up being faster?