Concurrent FTP access

8

1

How do FTP servers handle concurrent access to the same file, e.g. one user updating a file while another is reading it? Should I be worried about getting corrupt data? Does it depend on the FTP server, or even on the operating system?

Kristian

Posted 2009-08-20T11:53:09.643

Reputation: 802

Answers

2

I think the FTP server itself does not handle this;
the underlying file system manages the concurrent accesses involved here.

If a read starts before a write, the reader would typically get the older version.

So the answer to this question depends on
how the server's file system handles a file being overwritten.
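As an aside, the usual file-system trick that lets an in-progress read keep seeing the complete older version is write-to-temp-then-rename, since `rename()` is atomic on POSIX. A minimal Python sketch (not FTP-specific; the file name is made up for illustration):

```python
import os
import tempfile

def atomic_overwrite(path, data):
    # Write to a temp file in the same directory, then rename it over the
    # target. The rename is atomic on POSIX, so a reader that already has
    # the old file open keeps reading the old (complete) version.
    dirname = os.path.dirname(os.path.abspath(path))
    fd, tmp = tempfile.mkstemp(dir=dirname)
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(data)
        os.replace(tmp, path)  # atomic replacement
    except BaseException:
        os.unlink(tmp)
        raise

# A reader that opened the file before the overwrite still sees old data:
with open("demo.txt", "wb") as f:
    f.write(b"old version")
reader = open("demo.txt", "rb")
atomic_overwrite("demo.txt", b"new version")
old = reader.read()   # open handle still points at the old inode
reader.close()
with open("demo.txt", "rb") as f:
    new = f.read()    # a fresh open sees the new contents
print(old, new)
```

Whether an FTP server overwrites in place or uses a scheme like this is entirely up to the server implementation.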

nik

Posted 2009-08-20T11:53:09.643

Reputation: 50 788

1

I think FTP implementations just don't deal with this, and behaviour varies by operating system: Windows might lock the file, while Linux will happily give you partial data.

Yes, you should be worried, especially under high usage. The solutions I have found in the past were sketchy at best, including separate folders for upload and download, with a monitor process that copies a file from upload to download only once it has been fully uploaded.
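A rough sketch of such a monitor process (the directory names and the size-stable heuristic are illustrative assumptions, not a robust implementation): a file is assumed to be fully uploaded once its size stops changing between two polls, and only then is it moved to the download folder.

```python
import os
import shutil

UPLOAD_DIR = "upload"      # hypothetical: where clients write
DOWNLOAD_DIR = "download"  # hypothetical: where clients read

def sweep(sizes):
    """Move files whose size has not changed since the previous sweep.

    `sizes` maps filename -> size seen on the previous call; a file is
    assumed fully uploaded once its size stops growing.
    """
    for name in os.listdir(UPLOAD_DIR):
        src = os.path.join(UPLOAD_DIR, name)
        size = os.path.getsize(src)
        if sizes.get(name) == size:
            # Size stable across two polls: assume the upload finished.
            shutil.move(src, os.path.join(DOWNLOAD_DIR, name))
            sizes.pop(name, None)
        else:
            sizes[name] = size

# Example: one file, two sweeps.
os.makedirs(UPLOAD_DIR, exist_ok=True)
os.makedirs(DOWNLOAD_DIR, exist_ok=True)
with open(os.path.join(UPLOAD_DIR, "report.zip"), "wb") as f:
    f.write(b"data")
seen = {}
sweep(seen)  # first poll: size recorded, file stays in upload/
sweep(seen)  # second poll: size unchanged, file moved to download/
print(os.listdir(DOWNLOAD_DIR))
```

In practice you would poll with a delay between sweeps, and this still cannot distinguish a finished upload from a stalled one.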

This gets worse as files get larger and/or people upload over slow connections.

webclimber

Posted 2009-08-20T11:53:09.643

Reputation: 111

0

I am pretty sure you could have problems. Try this:

Start an upload of a large file to your FTP server. Refresh the view of the folder it is being uploaded to and you will see the file size increase as the upload progresses.

If you try this with an .mp3 file, you can access it via the browser and see that it will only play up to the point that has currently been uploaded.

This is why programs like Dreamweaver have a check-in/check-out system, so that if someone is working on a .html file, someone else cannot upload an older version or cause that sort of problem.

I don't think FTP uses any kind of temp files or queuing either...

ian

Posted 2009-08-20T11:53:09.643

Reputation: 103

Correct. I once unzipped a file that had not actually been transferred all the way. Weird error messages, of course... – Arjan – 2009-08-20T12:23:02.367

I think updating an existing file is different from creating a new file (the read will not start before the create happens, and will then chase the write). If the read races beyond the write, you get an incompletely terminated read. An incomplete ZIP file will show corruption; an incomplete mp3 will play up to the first point of corruption (I think). That is a difference in the file formats. – nik – 2009-08-20T13:28:09.917

A modify/overwrite of an existing file should detect a read in progress and create a new version to write, retaining the older one for the read to complete. That is why I say in my answer that a typical file system would give an older copy of the file in such a case. – nik – 2009-08-20T13:32:59.597

0

Whenever I attempt to access a page I haven't uploaded yet, I get a "connection reset while loading". I think it's really up to the software, though.

Phoshi

Posted 2009-08-20T11:53:09.643

Reputation: 22 001

-1

Using revision control software would help in this case; have a look at Git and SVN. Note that there are two main types (centralized and distributed) and many other applications in addition to those two.

Shadok

Posted 2009-08-20T11:53:09.643

Reputation: 3 760

This answer has nothing at all to do with the question... – Kristian – 2011-09-10T17:25:10.850

Operations in those systems being atomic, there is no way you could get an incomplete file; problem solved. – Shadok – 2011-09-11T01:21:24.800

Yes, I know, but the question was specifically about FTP servers. – Kristian – 2011-09-11T20:47:31.087