I work from home, off-grid, with a somewhat shaky satellite connection. There's a good chance that when I'm uploading web files, the connection will hiccup and the upload will start over.
Until recently, our company server was Windows Server 2003 using the Windows FTP service. When my sat connection hiccuped, the upload would just start over, but the file would go through fine... irritating, but I could get work done.
We recently upgraded the server to Windows Server 2008 R2, still using the Windows FTP service, with the same version and setup of FileZilla that I was using previously.
Now, intermittently, when a file has to be re-uploaded, I get a critical transfer error:
"550 The process cannot access the file because it is being used by another process."
I have to log into the server using Remote Desktop, manually delete the 0-byte file left there, and then re-upload. (Though if I wait a few minutes before trying again, the upload succeeds.)
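In case it helps narrow things down, below is a rough sketch of the kind of script I could run on the server to measure how long the leftover file stays locked. The path is made up, and it assumes Python is available on the server; it just keeps retrying the same delete I do by hand and reports when the lock is finally released.

```python
import os
import time

# Hypothetical path to the stale 0-byte file left behind by the failed upload.
STALE_FILE = r"C:\inetpub\ftproot\site\stale-upload.html"

start = time.time()
while True:
    try:
        os.remove(STALE_FILE)  # same thing I do manually over Remote Desktop
        print("Lock released after %.0f seconds" % (time.time() - start))
        break
    except FileNotFoundError:
        print("File is already gone.")
        break
    except OSError:
        # Still locked; on Windows this surfaces as WinError 32
        # ("The process cannot access the file because it is being
        # used by another process"). Check again in a few seconds.
        time.sleep(5)
```

Knowing roughly how long the handle stays open after a failed transfer might help point at whichever server-side timeout is holding it.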
This doesn't happen every time I get a hiccup; sometimes the file overwrites with no problem. The leftover file is nearly always 0 bytes, with an occasional partially-uploaded file in there just to confuse me. I had the guy who set up the server try various settings to fix this, but the problem is that it's intermittent. So he would try something, we'd wait until it happened again, he'd try something else... it was like spitting into the wind.
I upgraded to the newest version of FileZilla, but that didn't help. One of the things my consultant said he tried was allowing uploads to continue after an error, but that didn't help either. It has to be something on the 2008 R2 server that's different from the Windows 2003 server.
Thus far, our staff members (who all have regular DSL connections) don't have this problem. It's only with my sat connection, and only since our server upgrade.
What server settings might temporarily lock a file after a bad upload, so that an FTP program is unable to overwrite it?
I have a section of the FileZilla log showing the error, if that would help.