Let's say we have:
- a local computer (Windows or Linux)
- a distant dedicated server with 2TB storage, running Linux
I'd like to back up my local computer's data folder (1 TB) to the distant server, updated every week by a few GB, but I have a very specific requirement:
Even if a hacker gets root access on the distant server, they must not be able to use the data.
NB: The files never need to be used on the distant server; it's only for storage.
TL;DR I want the data to be totally unusable on the distant server.
I was thinking about using `rsync`, but then the files sit unencrypted in the distant server's filesystem, so a hacker with root can read them.
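For reference, I mean a plain mirror like this (paths and host are just placeholders):

```bash
# Plain rsync mirror (placeholder paths): efficient and incremental,
# but the files land unencrypted on the server, readable by root there.
rsync -avz --delete /home/me/data/ backupuser@server.example.com:/backups/data/
```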
(Working) proof of concept, but totally inefficient: create a `backup_20201116_1538.7z` archive of the whole data with 7-Zip, AES-encrypted with a long password/key, and send this file to the server. If someone gets root access on the server, they can reach this .7z file, but they can't do anything with it because they don't have the decryption key.
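In command form, the PoC is roughly this (password, host, and paths are placeholders):

```bash
# Create an AES-256-encrypted archive; -mhe=on also encrypts the file names.
7z a -mhe=on -p'my-long-password' backup_20201116_1538.7z /home/me/data/

# Send the encrypted archive over SSH; only the ciphertext leaves my machine,
# and the key never touches the server.
scp backup_20201116_1538.7z backupuser@server.example.com:/backups/
```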
Inefficient because:
- I need to create a temporary .7z file of ~800 GB before I can even send it via SFTP!
- If I want to update the backup next week, I have to resend a whole new 800 GB archive (argh!), so it's not incremental/diff-based like `rsync`!