Syncing desktop to a notebook

I have a notebook which is essentially a copy of my desktop. It goes with me when I'm away from my home office, which is often.

I would like to use a tool to sync specified directories so that the most recent version of each file automatically ends up on both hosts.

Recommendations please...
 
I already use rsync and scp, but they have to be directed manually to achieve a two-way sync, and finger trouble can cause loss of data.

I want time stamps on the inodes to automatically dictate the direction of the sync.
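
For the record, the closest I get with rsync alone is running it twice with --update, so the modification times decide which copy wins. A sketch (the paths and host name are placeholders; note this still doesn't handle deletions or simultaneous edits safely):

    # push files that are newer here, then pull files that are newer there;
    # -a preserves attributes, -u skips files that are newer on the receiver
    rsync -au /home/me/work/ notebook:/home/me/work/
    rsync -au notebook:/home/me/work/ /home/me/work/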

I just had a look at the documentation on the syncthing website. It looks promising, claiming:
Adding files to the shared directory on either device will synchronize those files to the other side.
It keeps extra permanent metadata (.stfolder) on the disk of each host, which will have to be managed if files or directories are deleted or moved. I'm not wild about that feature, but maybe it's required for satisfactory performance.
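
If I do try it, I expect the FreeBSD setup to look roughly like this (a sketch, assuming the net/syncthing package; the rc variable and service name are my assumption from the port):

    # install syncthing and run it as a service
    pkg install syncthing
    sysrc syncthing_enable=YES
    service syncthing start

The devices would then be paired through the web GUI, which by default listens on 127.0.0.1:8384.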

I'm planning to give it a try. Thanks.
 
Git is a bad idea for syncing two computers, because Git's handling of binaries inside a repository really isn't great at all. If you want Git, then you should use devel/hs-git-annex instead, which manages the binaries outside the Git repository. This might sound really weird at first glance, but it actually makes a lot of sense. Plus it keeps a nice history of file changes.
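
A minimal sketch of what that workflow looks like (the directory and remote names are examples; add, commit, and sync are standard git-annex subcommands):

    # one-time setup on each machine
    cd ~/work
    git init
    git annex init "desktop"

    # large/binary files are handed to the annex instead of Git's object store
    git annex add .
    git commit -m "initial import"

    # with the other machine configured as a git remote, one command
    # exchanges both the history and the file contents in both directions
    git annex sync --content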

Syncthing is nice, but depending on your amount of data, the initial hashing will take quite some time; in my experience, once the data set is big enough, it isn't exactly the fastest tool for that reason. Rsync is another nice fit for the job, and for backup purposes there is also Rsnapshot, which is Rsync plus snapshots through hard links. If both computers have ZFS, ZFS send/receive might also be good.
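
For the ZFS route, a rough sketch (pool and snapshot names are made up; incremental sends require a snapshot both sides already share, and this replicates one way only, so it suits backups better than two-way sync):

    # initial full copy from desktop to notebook;
    # -F rolls the target dataset back to match the incoming stream
    zfs snapshot tank/home@monday
    zfs send tank/home@monday | ssh notebook zfs receive -F tank/home

    # later, send only the blocks changed since the common snapshot
    zfs snapshot tank/home@tuesday
    zfs send -i tank/home@monday tank/home@tuesday | ssh notebook zfs receive -F tank/home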

For large quantities of data, though, Git-Annex is probably the best fit.
 
I use net/rsync, actually sysutils/py-rdiff-backup, for backups and net/unison for syncing working folders between computers.

unison fits your use case -- it propagates changes in both directions when you ask it to, using ssh transport, and asks you which computer's version to keep when there are conflicts (changes made on both computers). It will not automatically propagate changes while you are working, which can be a problem with syncthing (see the discussion "journey from unison to syncthing and back to unison"). Things I like about unison:

- no special "shared" directory is needed; just list the home folders you want synced in the config file (a sample profile below);
- you can specify stuff to exclude like you would with rsync;
- it tolerates power loss or network loss during the transfer and never (?) corrupts either computer or loses track of its state (it was written by Prof. Benjamin Pierce);
- it works very well syncing multiple computers in a star topology (e.g. multiple notebook computers syncing to a workstation, expecting all changes made anywhere to eventually sync to every computer).
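
A profile along these lines is what that looks like in practice (file name, host, and paths are illustrative; root, path, ignore, auto, and batch are standard unison preferences):

    # ~/.unison/default.prf
    root = /home/me
    root = ssh://desktop//home/me

    # only these folders are synced
    path = Documents
    path = projects

    # exclusions, rsync-style
    ignore = Name *.tmp
    ignore = Path .cache

    # propagate non-conflicting changes without prompting
    auto = true
    batch = true

Running "unison default" then syncs both roots and prompts only on conflicts.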

[One caution on unison. It uses OCaml for serialization, and FreeBSD's unison built against OCaml 4.05 is not compatible with Linux versions of unison built against OCaml 4.1x. I sync multiple computers in a star topology using FreeBSD and it works great. I do sometimes run Gentoo on those same datasets, but I only sync when running FreeBSD. If you want to sync between FreeBSD and Linux using unison, you would want to compile unison locally against an up-to-date OCaml.]
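
On the Linux side, building against a current OCaml could look something like this (a sketch, assuming opam is installed; the switch version is just an example):

    # build unison with an up-to-date OCaml via opam
    opam init
    opam switch create 4.14.1
    eval $(opam env)
    opam install unison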
 
> Syncthing is nice, but depending on your amount of data, the initial hashing will take quite some time; in my experience, once the data set is big enough, it isn't exactly the fastest tool for that reason. Rsync is another nice fit for the job, and for backup purposes there is also Rsnapshot, which is Rsync plus snapshots through hard links. If both computers have ZFS, ZFS send/receive might also be good.
rsync(1) is exactly what I use at the moment, but the invocation and safety checks are manual, and the direction is one-way. The solution I seek is automated and two-way (but with a guarantee not to destroy anything gratuitously).

The data quantities to be synced are moderate -- a few gigabytes of data and a few thousand files. I don't yet have a feel for what the metadata-management overhead would be for any of the suggested schemes. I'm only likely to look into it if performance issues materialise.

Backups are a different problem. I already use rsnapshot(1) for primary backups of all hosts to a dedicated ZFS pool, and ZFS send/receive to copy those data to an external disk for off-site backups.
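
For completeness, the relevant part of my rsnapshot.conf looks roughly like this (paths and retention counts are examples; note that rsnapshot requires TABs, not spaces, between fields):

    # /usr/local/etc/rsnapshot.conf -- fields must be TAB-separated
    snapshot_root   /backup/snapshots/
    retain  daily   7
    retain  weekly  4
    backup  me@desktop:/home/me/    desktop/
    backup  me@notebook:/home/me/   notebook/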

Thank you.
 