
I recently needed to transfer large tar files between two Red Hat ES 5 servers (neither with network connectivity) using an external USB hard drive.

Originally I was trying to move a ~90 GB tar file from the server to the external drive. The transfer worked, but when it came time to unmount the drive I received an I/O error refusing the device disconnect; the drive then unmounted, but not cleanly. I tried ~20-30 GB files and the same thing happened. After breaking things up into ~10-15 GB tarballs, everything transferred just fine.

I was hoping someone could provide some insight as to why I experienced problems with the larger tar files (other than my own inexperience). Any information would be great!
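For anyone hitting the same thing, here is a rough sketch of the split-up approach that worked for me, scripted with `split` rather than hand-picking directories; the data directory and mount point below are example paths, not my actual ones:

```
# Example paths; substitute your own data directory and mount point.
# Write the archive in ~10 GB pieces directly onto the USB drive
# instead of one huge tarball:
tar -cf - /path/to/data | split -b 10G - /mnt/usb/backup.tar.part-

# The pieces can be reassembled and extracted later with:
cat /mnt/usb/backup.tar.part-* | tar -xf -
```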

hsatterwhite
  • Are you sure you exited the mounted folder before attempting to unmount the drive? Also, what method did you use to copy the file(s)? Is it possible that it was still operating in the background? – Chris Nava Feb 24 '11 at 05:41
  • Hey Chris, yes, I was not in the mounted folder of the device, and I ran `fuser -m /dev/device1`, which returned no processes still running against that device. I was just using tar from the directory of files on the server and creating the actual tarball on the USB hard drive. – hsatterwhite Feb 24 '11 at 13:22
  • It is possible that the files are still being transferred "in the background" when the `cp` command returns. When the `cp` command returns, run the `sync` command. If the `sync` command blocks (does not return), then data is still being written to the drive. The `sync` command will return when the 'dirty' data has been completely flushed to the disk. After `sync` returns, are you *still* unable to unmount? – JeffG Mar 02 '11 at 22:19
  • Also try `lsof +D /mntpoint` – Steven Mar 03 '11 at 15:43
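For reference, the checks suggested in the comments above can be run back to back as a quick pre-unmount test; `/mnt/usb` below is an example mount point, not one from the original question:

```
# /mnt/usb is a placeholder; use your actual mount point.
fuser -m /mnt/usb    # lists PIDs with files open anywhere on that filesystem
lsof +D /mnt/usb     # lists every open file under the mount point
```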

1 Answer


Try running the `sync` command before unmounting the drive. It's possible that the system hasn't actually finished writing everything to the disk yet.
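A minimal sketch of that sequence, assuming the drive is mounted at `/mnt/usb` (an example path):

```
sync               # blocks until all cached ('dirty') writes reach the disk
umount /mnt/usb    # should now detach cleanly, with no I/O error
```

If `sync` sits there for a long time before returning, the earlier copy was still being flushed from the page cache, which would explain why only the larger files triggered the error.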

jd50