
I have large files (1 GB+) that should be kept replicated at two locations, and internet speeds make a cloud or online solution impractical. (The connection is already slow, approximately 6 Mbps, in a remote part of Canada, with a hardware VPN on top.)

I frequently (daily) move between the locations with my laptop. A solution that uses the laptop as a shuttle, kicking in upon connection to the WiFi network at each location, would work well. Automation is a huge plus, as I don't want to manually check for changes every day.

I've no problem rolling my own if required.

Both servers are Linux machines.

Never underestimate the bandwidth of a station wagon full of tapes

  • *A solution that uses the laptop as a shuttle would work well* - call me skeptical. How much do they change and how long can they go being out of sync? Do a differencing-copy over the internet, prioritise it below any other traffic and forget about it (a rough rsync sketch follows these comments) – TessellatingHeckler Sep 09 '15 at 04:04
  • @EEAA Sorry, should have included more information. I originally tried writing my own shell scripts and Python scripts that monitor the mounting of a specified USB disk on the server. This never quite worked right; I suspect my understanding of those events failed me. I did try solutions such as git with a cron job, but the files I am syncing (media, art assets) do not diff well. – Aaron Dale Sep 10 '15 at 04:13
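
For reference, the differencing-copy idea from the first comment could be sketched with rsync from a cron job on one server. The host, paths, and bandwidth cap below are placeholders, not from the question; --bwlimit is in KiB/s, so 400 leaves headroom on a 6 Mbps link:

    # Hypothetical paths and host; run periodically from cron on one server.
    # rsync's delta algorithm transfers only the changed portions of files,
    # and ionice/nice keep the job below other local workloads.
    ionice -c3 nice -n19 rsync -a --partial --bwlimit=400 \
        /srv/assets/ user@remote.example:/srv/assets/

Compression (-z) is deliberately omitted, since media and art assets rarely compress further.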

1 Answer


I would recommend Unison.

It supports bi-directional synchronisation of two or more sets of directories.

You will have to be careful to properly resolve conflicts, but Unison makes this fairly easy.
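
As a minimal sketch (the profile name, paths, and server address below are assumptions, not from the question), a Unison profile on the laptop might look like this:

    # ~/.unison/assets.prf  (hypothetical profile name and paths)
    root = /home/aaron/assets                  # local copy on the laptop
    root = ssh://server-a.example//srv/assets  # the server at one location
    batch = true     # don't prompt; skip conflicts for later review
    prefer = newer   # on conflict, keep the more recently modified file
    times = true     # preserve modification times
    log = true       # record actions in ~/.unison/unison.log

Run it with `unison assets`; a second profile pointing at the other location's server turns the laptop into the shuttle.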

Automation would be a matter of detecting the connection to the WiFi network at each end, and that is highly dependent on what system the laptop runs on.
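
For example, if the laptop runs Linux with NetworkManager, a dispatcher script can fire the sync whenever a known network comes up. A rough sketch, with hypothetical SSIDs, user name, and profile names:

    #!/bin/sh
    # /etc/NetworkManager/dispatcher.d/90-unison-shuttle (make executable)
    # NetworkManager calls this with the interface as $1 and the event as $2.
    IFACE="$1"
    ACTION="$2"

    [ "$ACTION" = "up" ] || exit 0

    # iwgetid (wireless-tools) prints the SSID of the connected network.
    SSID="$(iwgetid -r "$IFACE" 2>/dev/null)"

    case "$SSID" in
        "LocationA-WiFi") PROFILE=assets-a ;;  # hypothetical SSIDs and
        "LocationB-WiFi") PROFILE=assets-b ;;  # Unison profile names
        *) exit 0 ;;
    esac

    # Run as the user who owns the Unison profiles, without prompting.
    su - aaron -c "unison $PROFILE -batch" >> /var/log/unison-shuttle.log 2>&1

Other systems would need their own hook (e.g. a launch agent watching network state), but the pattern is the same: detect the network, then run the matching profile unattended.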

– Aaron Tate
  • Unison looks like it will hit the nail on the head. I will conduct trials for the situation and report back. – Aaron Dale Sep 10 '15 at 04:26