
I am trying to tar a huge directory (about 600GB), but sometimes my SSH connection drops. When I run the tar command again, Linux overwrites the previous file and the whole tar starts over from the beginning.

Is there some parameter that allows my tar file to be resumed after some problem?

amandanovaes

2 Answers


Is there some parameter that allows my tar file to be resumed after some problem?

Nope.

What you should do, though, is run your command from within a terminal multiplexer like screen or tmux. That way if your connection drops, the process keeps running.
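For example, a typical screen workflow might look like this (the session name and paths are hypothetical):

```shell
# Start tar inside a named, detached screen session; it keeps running
# even if the SSH connection drops. Session name and paths are made up.
screen -dmS bigtar tar -czf /backups/huge.tar.gz /mnt/storage/hugedir

# After reconnecting over SSH, reattach to check on progress:
screen -r bigtar
# (detach again without stopping tar: Ctrl-a d)
```

`tmux` works the same way with `tmux new -d -s bigtar '...'` and `tmux attach -t bigtar`.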

EEAA
  • Or use the nohup command – Engineer2021 Apr 23 '14 at 20:04
  • Will the nohup command keep the process running after I close my SSH window? I use PuTTY, so will the process keep going after I close PuTTY? – amandanovaes Apr 23 '14 at 20:23
  • @amandanovaes [This answer](http://serverfault.com/questions/463366/does-getting-disconnected-from-an-ssh-session-kill-your-programs/463375#463375) explains what happens when your terminal disconnects. (shameless self plug) – Andrew B Apr 23 '14 at 20:28
  • I don't know what a terminal multiplexer is! That's why I think nohup may be the better solution, but I want to make sure I understand it. – amandanovaes Apr 23 '14 at 20:47
  • @amandanovaes - if you're doing linux sysadmin, you need to know what screen and/or tmux do. They are essential tools in your sysadmin toolbox, for many other reasons than just this specific use case. – EEAA Apr 23 '14 at 20:54
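For the nohup suggestion in the comments above, a minimal sketch might look like this (all paths and the log file name are hypothetical):

```shell
# nohup ignores the hangup signal sent when the terminal closes, and '&'
# puts the job in the background, so closing PuTTY won't kill it.
# Paths and the log file name here are made up.
nohup tar -czf /backups/huge.tar.gz /mnt/storage/hugedir > /tmp/tar.log 2>&1 &
echo "tar running in the background as PID $!"
```

Note that unlike screen/tmux, nohup gives you no way to reattach and interact with the job later; you can only watch its log file.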

Over ssh, I would use rsync. This transfers the directory as-is rather than as a single archive, but it only re-sends changed files, which is likely what you want anyway. A 600GB transfer will probably suffer a few connection drops, and rsync recovers from them automatically.

I've used variants of this script over time. I no longer remember where it came from, so if the author is identified, I would like to give credit.

#!/bin/bash

### ABOUT
### Runs rsync, retrying on errors up to a maximum number of tries.
### Simply edit the rsync line in the script to whatever parameters you need.

# Trap interrupts and exit instead of continuing the loop
trap "echo Exited!; exit;" SIGINT SIGTERM

MAX_RETRIES=50
i=0

# Set the initial return value to failure so the loop body runs at least once
false

while [ $? -ne 0 ] && [ "$i" -lt "$MAX_RETRIES" ]
do
  i=$((i+1))
  # --partial keeps partially transferred files so the next retry resumes them
  rsync -avz --progress --partial -e "ssh -i /home/youngian/my_ssh_key" /mnt/storage/duplicity_backups backupuser@backup.dreamhost.com:.
done

if [ $i -eq $MAX_RETRIES ]
then
  echo "Hit maximum number of retries, giving up."
fi
rickfoosusa