Here's what I do; it's custom.
I don't bother with "non-living" backups. If a machine fails I want to slide over to the next one and continue (a habit from my college days: get the report *finished*). The cost of a (2nd-hand) PC to house the backup disk is minimal; the benefits are great.
I keep 2-3 machines synchronized BY HAND (rsync) and use RCS to keep EACH FILE incrementally saved. Here's how and why: it saves time, and it's more useful and more stable.
I use RCS whenever I edit a hand-edited script, or after an hour of changes to some edited file. This means there are two copies of each file: dir/file and dir/RCS/file,v. Lost a file? Just co -l file - no need to reach (telnet/rsh/ssh) a live running "backup". If I somehow delete a whole dir tree (very rare), I just rsync it back from the other PC; no backup software needed. If I need another PC, I just copy the whole drive and boot the new machine - no restoring of a backup needed. Another thing: say a software install clobbers a file - just co -l file. RCS is 100x more efficient than keeping incremental backups.
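A minimal sketch of that per-file workflow, assuming GNU RCS (ci/co) is installed; "myscript" is a hypothetical hand-edited file, and the sandbox dir is just for the demo:

```shell
#!/bin/sh
# Sketch of the per-file RCS habit described above.
command -v ci >/dev/null 2>&1 || exit 0   # skip the demo if RCS isn't installed
set -eu
cd "$(mktemp -d)"                         # sandbox for the demo

mkdir -p RCS                              # ci/co find the RCS/ subdir automatically
printf '#!/bin/sh\necho hello\n' > myscript

# first checkpoint: -t- gives the description, -m the log message,
# -l keeps the working file checked out and locked for more editing
ci -l -t-"hand-edited script" -m"initial checkpoint" myscript

echo 'echo goodbye' >> myscript           # ...an hour of edits later...
ci -u -m"added goodbye" myscript          # checkpoint again; -u keeps an unlocked copy

rm -f myscript                            # oops - working copy gone (or clobbered)
co -l myscript                            # recovered from RCS/myscript,v, locked for editing
```

Every checkpoint is a delta inside the single myscript,v file, which is what makes this so much cheaper than whole-tree incremental backups.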
Caveat: only broadcast/synchronize file and RCS/file,v to all hosts (srdist file) by hand. Only a human knows which copy is really the newer one with the just-edited changes - and unix time is too easily wrong to use timestamps as an indicator! I tried a few automated schemes: they all had the serious flaw of occasionally treating A WRONG versioned file as newer - and one file can make or break a whole backup, as far as whether restoring it helps!
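A hypothetical sketch in the spirit of that manual broadcast - push one file plus its RCS history, only after a human has decided this copy is the newest. For a runnable demo the "hosts" here are local directories; in real use they would be rsync-over-ssh targets (see the actual srdist script on SourceForge for the real thing):

```shell
#!/bin/sh
# Demo of broadcasting file AND file,v together, srdist-style.
command -v rsync >/dev/null 2>&1 || exit 0   # skip the demo without rsync
set -eu
cd "$(mktemp -d)"                            # sandbox for the demo

# host2/host3 are stand-ins for the other living PCs
mkdir -p work/RCS host2/work/RCS host3/work/RCS
printf 'echo hi\n' > work/myscript
printf 'dummy,v history\n' > work/RCS/myscript,v

# the human decided THIS copy is newest - push both pieces, by hand
for host in host2 host3; do
    rsync -a work/myscript       "$host/work/"
    rsync -a work/RCS/myscript,v "$host/work/RCS/"
done
```

Since the decision of which copy wins is never automated, a wrong-direction sync can't silently overwrite the freshly edited version.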
At first it sounds like work, using RCS for each edit. But compared to using incremental backups it's a HUGE timesaver. It also means you can shift between two PCs at any time for reasons other than backups - or just use two at once.
I don't back up the OS, because that can be re-installed - and it likely changes anyhow, whether I like it, or agree, or not. I hate incompatibility changes - often new releases are just damaged UNIX: changed defaults and fewer features *available*, from hacking unix away from being unix. Not to mention: nothing works and everything must be hacked to fit Joe Schmo's hack-to-the-core fix for his misfit past hacks.
These days, rebuilding/recompiling a complete distro fresh (assuming one has a script for doing so) takes not much longer than restoring a backup. Then one only needs one's hand-edited configs - which are RCS-kept and stored outside system directories. I NEVER keep hand-edited files in system directories, as those are systematically destroyed during re-installs, upgrades, or whatever.
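One possible shape for that (my illustration, not the author's actual layout): master copies live in a home "conf" dir (RCS-kept in practice) and get copied into their live locations after a rebuild. Here $root is a scratch dir so the demo is safe to run; in real use it would be "/":

```shell
#!/bin/sh
# Sketch: reinstall the OS, then lay the hand-edited configs back on top.
set -eu
root=$(mktemp -d)                 # stand-in for the freshly rebuilt system's /
conf=$root/home/me/conf           # master copies, outside any system directory
mkdir -p "$conf/RCS" "$root/etc"

printf '127.0.0.1 localhost\n' > "$conf/hosts"   # a master copy (RCS-kept in practice)

# map: master copy -> live system location (example entries only)
while read -r src dst; do
    cp "$conf/$src" "$root$dst"
done <<'EOF'
hosts /etc/hosts
EOF
```

Because /etc and friends are treated as disposable output of this script, an upgrade clobbering them costs nothing but a re-run.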
I found out long ago that 1 backup isn't enough: there's a chance of a fatal flaw in the backup, OR a chance you'll damage the backup (or the system being restored) while trying to restore. Therefore I prefer 3 living copies - just like accounting, always.
Caveat: beginners would not understand why to do this, or what to do if they lose a file or a system. Beginners would not understand how to keep hand-edited files clear of system directories that get clobbered. And beginners would not necessarily appreciate - or be able - to keep 3 different kinds of PC up, running, living and synchronized - though the process is simple once OS installation is left aside (problem: OS installation is not really aside).
Where to see the scripts I use (here, only srdist really matters - my convenient rsync wrapper for simple file distributing):
https://sourceforge.net/projects/x-lfs-2010/
https://sourceforge.net/projects/ (see srdist)
I've been wanting to install BSD 4.3 or 10. BTW, I'm new to BSD on Apple, but Linux has a lot in common (i.e., early Linux had a mostly/wholly ripped BSD 4.3 userland).