Environment: Ubuntu 16.04 in Azure
I'm attempting to back up specific folders. I'm using a simple tar command with no compression. My objective is to keep the archive for a month, appending changes incrementally once per day, then compress the tar and start a new one each month.
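For clarity, the daily/monthly cycle I have in mind could be sketched like this. Throwaway mktemp paths stand in for my real /savelocation and /var/www/foldertobackup so the sketch runs anywhere; the file names are made up for the demo.

```shell
#!/bin/sh
# Sketch of the intended cycle, with throwaway paths instead of the
# real /savelocation and /var/www/foldertobackup.
set -e
SAVE=$(mktemp -d)            # stands in for /savelocation
SRC=$(mktemp -d)             # stands in for the folder to back up
ARCHIVE="$SAVE/backup.tar"

echo one > "$SRC/a.txt"

# Day 1 of the month: create the archive
tar -cpf "$ARCHIVE" -C "$SRC" .

# Later days: -u should append only files newer than their archived copies
echo two > "$SRC/b.txt"
tar -upf "$ARCHIVE" -C "$SRC" .

# End of month: compress the finished archive and start a fresh cycle
gzip "$ARCHIVE"
mv "$ARCHIVE.gz" "$ARCHIVE.$(date +%Y-%m).gz"
```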
My reasoning is this: local backups make no sense to me, because if the local filesystem dies or corrupts, the original and the backup die together. (Seems obvious, but I state it for clarity.)
I have tried backing up to a separate Azure storage container mounted over SMB, and also to a second, identical Ubuntu machine over NFS.
I tried the second option after reading this article: tar incremental backup is backing everything up, every time when used on the Dropbox directory
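For context, that article is about GNU tar's snapshot-based incremental mode, which keeps state in a .snar file instead of comparing against the archive itself. A minimal sketch of that mode, with throwaway mktemp paths standing in for my real ones:

```shell
#!/bin/sh
# GNU tar --listed-incremental sketch: a level 0 (full) run creates
# the snapshot file; later runs archive only what changed since.
set -e
TMP=$(mktemp -d)
mkdir "$TMP/src"
echo one > "$TMP/src/a.txt"

# Level 0 (full) backup; creates the snapshot file
tar --listed-incremental="$TMP/backup.snar" -cpf "$TMP/level0.tar" -C "$TMP" src

# A change, then a level 1 backup that should contain only the delta
echo two > "$TMP/src/b.txt"
tar --listed-incremental="$TMP/backup.snar" -cpf "$TMP/level1.tar" -C "$TMP" src
```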
So, no matter what I do, tar seems to ignore the -u (--update) flag when I issue the following command:
cd /savelocation; sudo /bin/tar --ignore-failed-read -up -f /savelocation/backupfoldername.tar /var/www/foldertobackup
Instead of the archive growing by a few megabytes, which would represent the files added or changed, I'm getting a file that's 44 GB, roughly twice the size of the original data.
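To see whether -u is re-appending members it shouldn't, one check is to count how often each name occurs in the archive listing (with -u, an updated file legitimately appears once per update; unchanged files should appear once). This demo builds a throwaway archive; on my real file the last command would run against /savelocation/backupfoldername.tar instead.

```shell
#!/bin/sh
# Count duplicate member names in a tar archive. If -u re-appends
# everything on every run, most names will show up multiple times.
set -e
TMP=$(mktemp -d)
echo hi > "$TMP/f.txt"
touch -t 202001010000 "$TMP/f.txt"   # give the file an old mtime
tar -cf "$TMP/demo.tar" -C "$TMP" f.txt
touch "$TMP/f.txt"                   # newer mtime, so -u re-appends it
tar -uf "$TMP/demo.tar" -C "$TMP" f.txt

# f.txt now appears twice: the original copy plus the updated one
tar -tf "$TMP/demo.tar" | sort | uniq -c | sort -rn
```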
Any thoughts or questions are most appreciated.