
There are currently many tools that let you fully back up a running Raspberry Pi, either to a clone device (SD/pendrive), like rpi-clone; or to a series of rsync incremental backups (RaspiBackup by framp is a hugely versatile tool, and adds a LOT of functions on top of that); or to an .img file ready to be flashed (I started by using bkup_rpimage, and just now switched to a more recent version, dolorosus/raspibackup).

So, if your intent is to obtain a FULL system backup, you couldn't really wish for more: there is a tool for every taste, and then some.

And now, for something completely different.

How can you create an essential backup, ideally containing only those files that you would need to "copy over" onto a freshly installed Raspbian image straight from https://www.raspberrypi.org/downloads/raspbian/, so that the resulting system is "mostly surely" equivalent to the original one, functionality-wise?

The objective is NOT to replace full system backups, but rather to keep a "fail-safe" alternative to them, in case your backup disks are stolen, get corrupted, your wife formats them, your house catches fire, whatever. This alternative backup will be stored encrypted in the cloud (where you cannot casually store several multi-gigabyte .img files every few days), ideally without ever needing to use it (which is the ideal for any kind of backup, or any security measure in general).

The requirements are:

  1. There must be a reasonably easy way to restore the backup (not a big problem with a script).
  2. The restored system should have practically the same functionality as the original, maybe requiring only minimal manual "post-processing".
  3. The size should be as small as possible, so that it can easily be encrypted and uploaded to the cloud.

I am working on something like this, but I need suggestions to help create something robust, efficient and, above all, more complete than what I, as a hobbyist IT enthusiast, am doing.

My own tool (https://github.com/ephestione/bazidrop) uses the following strategy:

  1. rsync /home and /etc to a local backup location, where crontabs, database dump(s), and a list of all installed packages are also saved.
  2. 7-zip that backup location, then encrypt the archive with GPG.
  3. Upload it to the cloud, where a crude versioning scheme is used (a rough sketch of these steps follows the list).
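
Just to make those three steps concrete, here is a minimal sketch of the flow. This is NOT the actual bazidrop code: the staging path, archive name, GPG recipient and the commented-out upload command are placeholders I made up for illustration.

    #!/bin/bash
    # Minimal sketch of the backup flow (illustrative names, not bazidrop itself)
    set -e

    BACKUP_DIR=/var/backups/essential          # assumed local staging area
    ARCHIVE=/tmp/essential-$(date +%Y%m%d).7z  # assumed archive name
    GPG_RECIPIENT=you@example.org              # assumed GPG key

    mkdir -p "$BACKUP_DIR"

    # 1. rsync the essential trees; also save crontabs, DB dumps and the package list
    sudo rsync -a --delete /home /etc "$BACKUP_DIR/"
    crontab -l > "$BACKUP_DIR/crontab-$(whoami).txt" || true
    sudo crontab -l > "$BACKUP_DIR/crontab-root.txt" || true
    mysqldump --all-databases > "$BACKUP_DIR/mysql-all.sql"  # only if MySQL/MariaDB is installed
    dpkg --get-selections > "$BACKUP_DIR/packages.list"

    # 2. 7-zip the staging area, then encrypt the archive with GPG
    7z a "$ARCHIVE" "$BACKUP_DIR"
    gpg --encrypt --recipient "$GPG_RECIPIENT" --output "$ARCHIVE.gpg" "$ARCHIVE"

    # 3. upload the encrypted archive with your uploader of choice, e.g.
    # dropbox_uploader.sh upload "$ARCHIVE.gpg" /backups/

The package list is what keeps the archive small compared to a full image: the binaries themselves are never stored, only the information needed to reinstall them.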

Yet I feel I am missing something. For example, it will not take into account programs that have been compiled from source rather than installed via apt, and there is surely more that can be improved.
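
To illustrate both the restore requirement and that gap, this is roughly what the restore side could look like on a freshly flashed Raspbian image. Again, just a hedged sketch: the file names are assumptions matching the sketch above, not an actual restore script.

    #!/bin/bash
    # Sketch of the restore side on a fresh Raspbian image (names assumed from the sketch above)
    set -e

    gpg --decrypt essential.7z.gpg > essential.7z
    7z x essential.7z -o/tmp/restore

    # put the saved trees back and reinstall the recorded packages
    sudo rsync -a /tmp/restore/essential/etc/  /etc/
    sudo rsync -a /tmp/restore/essential/home/ /home/
    sudo dpkg --set-selections < /tmp/restore/essential/packages.list
    sudo apt-get -y dselect-upgrade

    # still manual: re-import crontabs and database dumps, and rebuild anything
    # that was compiled from source (the package list knows nothing about it)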

ephestione
  • I think the following is too complicated: "... essential backup, only those files need to copy over into a freshly installed raspbian image right off, ... resulting system that is "mostly surely" equivalent to the original one, functionality wise?". One reason is that "only those files that need to be copied" is very difficult to classify. Data and program files are easy: I keep them off site (an SSD in my personal bank safe for an annual fee) and on websites (I have a personal WordPress/GitHub to store things I always worry about losing, cloud backup, etc.). So data files are easy. (to cont.) – tlfong01 Sep 30 '19 at 01:13
  • I agree with you, in the sense that I don't expect to receive a "definitive answer" of 42, but rather to collect as many pointers as possible as to what, in the file system, needs to be saved in such a backup so that you preserve as much of the original OS as possible, while still saving tons of space on the backup media. I could back up to my websites, since they are dispersed among several servers, but I totally dislike the idea of mixing my own private data with the public_html of my online presence :) – ephestione Sep 30 '19 at 07:53
  • Yes, I agree. I commented on data and program files. Let me carry on telling you how I "back up" Rpi system files. Actually I have more than 50 8GB/16GB SD cards. Usually every week or two I clone a new copy of the Rpi image from the working one, store the working one away, and use the newly cloned one as the working copy. Whenever I think I have reached a milestone, I store the current copy in a safe place. I learned this brute-force, complete (not incremental at all) approach from huge developers such as Microsoft. They did huge disk (tape in the old days) backups every day, or even every hour in critical times. – tlfong01 Sep 30 '19 at 08:05
  • Whenever they had big trouble, they just fell back to the day before, or the week before, or the month before. Of course they use version control software, like GitHub. But that is still not as reliable as brute-force backing up everything, as tapes and disks are dirt cheap these days. Just random thoughts, sorry for the typos. – tlfong01 Sep 30 '19 at 08:07
  • Thanks for the insight ;) Yes in fact storage is "cheap", but it's perishable if it's stolen, broken, electrocuted, anything! Cloud storage assures dependability to an extent, in the sense that I trust the disaster recovery policies of a big company more than my own, but cloud space is not cheap, and free cloud space is limited. I noticed, though, that a 3GB raspbian installation of mine takes less than 700MB when imaged and 7zipped, so I could use that as a last resort for cloud backups... – ephestione Sep 30 '19 at 13:46
  • Wow, hope you get nice answers, but thanks a lot for the whole package of information that you presented in the question. Dropbox Uploader rocks and I'm about to install your script. Thx again! – brasofilo Oct 03 '19 at 03:15
  • @brasofilo dolorosus and I are working on GitHub on an enhanced version of the imaging backup, and it's not just because it's my creature, but I really think it's currently the best (as in most streamlined, no-frills, efficient and optimized) imaging backup tool available. The bleeding-edge version is currently in this branch: https://github.com/ephestione/RaspiBackup/tree/auto-/boot-%26 – ephestione Oct 03 '19 at 09:42

0 Answers