Sunday, November 28, 2010

Monitoring remote rsync.net storage quota with munin

I'm using rsync.net's networked storage for my duplicity backups (driven by backupninja). rsync.net uses quotas to limit each user's storage space. Since I already use munin to monitor my local machines, including their disk capacity, I wanted a similar graph for the rsync.net quota too.

Here's a very basic munin plugin, to be installed as /etc/munin/plugins/rsyncnetquota, which graphs the output of the quota command:

#!/bin/bash

user=12345
host=whatever.rsync.net

# Fetch the quota line for the remote filesystem and strip leading spaces.
quota=$(ssh $user@$host quota | grep -e '^ */' | sed 's/^ *//g')
current=$(echo $quota | cut -d ' ' -f 2 | sed 's/\*$//')  # blocks used, in KB
quota=$(echo $quota | cut -d ' ' -f 3)                    # quota limit, in KB
warning=$((quota*80/100*1024))   # 80% of the quota, in bytes
critical=$((quota*95/100*1024))  # 95% of the quota, in bytes

case $1 in
    config)
        echo "graph_title Rsync.net quota for $user (quota : $quota)"
        cat <<"EOM"
graph_vlabel quota
quota.label quota
EOM
        echo "quota.warning $warning"
        echo "quota.critical $critical"
        #graph_args --base 1000
        exit 0;;
esac

echo "quota.value $((current*1024))"

In my case, I want a warning alert at 80% and a critical message at 95%. Note that you may prefer hard-coding the values rather than issuing two ssh connections: one for the call with the 'config' parameter and one for the actual value collection.
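To illustrate that alternative, here is a minimal sketch of the threshold arithmetic with the quota hard-coded, so that the 'config' call needs no ssh round-trip (the 10 GB quota below is a hypothetical value, not my actual one):

```shell
# Variant sketch: hard-code the quota reported by rsync.net (in KB)
# instead of fetching it over ssh on every 'config' call.
quota_kb=10485760                 # hypothetical 10 GB quota, in KB
warning=$((quota_kb*80/100*1024))   # 80% of the quota, in bytes
critical=$((quota_kb*95/100*1024))  # 95% of the quota, in bytes
echo "quota.warning $warning"
echo "quota.critical $critical"
```

The value fetch still needs ssh, of course, but the limits only change when rsync.net changes your plan.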

The script should be run as a user (here, root) that is allowed password-less ssh onto the remote rsync.net account (ssh public keys, etc.), so a corresponding configuration should be added to /etc/munin/plugin-conf.d/munin-node in the form of:

[rsyncnetquota]
user root
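For completeness, the one-time key setup and a manual test of the plugin could look like this (the user/host are the hypothetical values from the plugin above; munin-run ships with munin-node):

```shell
# Give root a key pair and install it on the rsync.net account.
ssh-keygen -t rsa
ssh-copy-id 12345@whatever.rsync.net

# Then exercise the plugin by hand, as munin-node would:
munin-run rsyncnetquota config
munin-run rsyncnetquota
```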

Wednesday, March 3, 2010

Restoring duplicity backups with a different Debian distro: beware of incompatible versions

I've recently had to test my backupninja + duplicity backups (yes, I don't think I've blogged about backups since I described my previous setup, which used amanda).

The Zonbu PC that managed the disks targeted by my duplicity backups, and which was running Debian stable, has died.

I tried to restore the contents of its system (to have a look at the config files I had set up there) on a Debian testing system, using (more or less) the same duplicity command line.

However, I couldn't, since duplicity in Debian stable (0.4.11) and testing (0.6.06) apparently aren't exactly compatible.

I managed to restore anyway by reinstalling duplicity 0.4.11 with a custom prefix, which worked fine. The command line then goes something like this (excerpt from the tarball's README):

python setup.py install --prefix=/usr/local
PYTHONPATH='/usr/local/lib/python2.x/site-packages/' /usr/local/bin/duplicity -V
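From there, a restore through the prefixed install would look something like the sketch below; the backend URL and target directory are hypothetical, and the python version in the site-packages path must match your interpreter (duplicity restores implicitly when the source is a backend URL and the target is a local directory):

```shell
# Restore with the locally installed 0.4.11 rather than the system package.
export PYTHONPATH=/usr/local/lib/python2.4/site-packages  # adjust 2.4 to your python
/usr/local/bin/duplicity file:///mnt/backup-disk /tmp/restored-system
```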

I've reported this problem in the Debian BTS (#572102), as I'm afraid of the consequences when people try to restore, on the next stable distro, backups made with the previous stable...

You've been warned anyway ;)

Thursday, August 28, 2008

Amanda backups to VFAT partition on external (USB) drive on Debian

I have setup a mini-PC on which I have installed my network backup infrastructure, using Amanda.

In this post, I try to summarize some useful options and links... to be improved, of course: comments most welcome.

Read more...

Thursday, May 3, 2007

BackupPc saved my life (almost ;)

I've been using BackupPc for quite some time to backup the home network computers on an external disk.

BackupPc offers a nice Web interface which allows retrieving files from the backups, in case of an accidental deletion for instance, which is quite user-friendly. Apart from occasionally restoring deleted files, I had never had to restore a full backup.

I was more or less annoyed by the processing power needed by those backups over and over again: the external disk on which I save the files is quite slow although connected through firewire, and some of the machines are a bit old and don't like the compression, md5 and other computations needed by the backup process. Even though a differential backup mechanism means full backups only run from time to time, the length of the backups and the slow-down and noise of the disk were recurrent annoyances :(

But two days ago, I got proof that all this was worth the effort, since one of the disks on which I had stored more or less all my data suddenly broke down :(

I had to buy a new disk, but I was able to restore more or less all the files quite easily (using the command line and tar format instead of the web interface), and I'm back on track with only one day of email lost.

So thanks a lot, BackupPc, even though you make my computers slower and noisier!

Tuesday, January 23, 2007

Backing up a SPIP dump (from a Free personal page) via CURL

I have set up a SPIP site on a Free personal page.

SPIP provides an interface to save a dump file, which can be used to restore the site (if all goes well), but it is only accessible as a logged-in administrator and requires setting something up over FTP... in short, not convenient for automating backups.

So I hacked together something with CURL that fetches the dump, allowing automated backups from a crontab.

Read more...

Backing up the MySQL database of Free personal pages with CURL

I have put a site online on a Free personal page, which uses a Free MySQL database.

There is a backup tool for these databases in the form of a Web form, which requires authenticating and submitting the form, but I prefer an automatic backup triggered periodically.

So I hacked together a script that does the same thing with CURL, so it can run from a crontab. It is meant to run in a dedicated directory and to keep only the 10 most recent dumps of the database.
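The keep-the-last-10 rotation can be sketched independently of the CURL part (whose URL and form fields are in the full post); the sketch below demonstrates it on dummy files in a temporary directory:

```shell
# Rotation sketch: keep only the 10 most recent dumps in the directory.
# In the real script this runs right after the curl download.
dir=$(mktemp -d)
cd "$dir"
for i in $(seq -w 1 15); do touch "dump-2007-01-$i.sql.gz"; done

# Newest first; drop everything from the 11th entry on.
ls -1t dump-*.sql.gz | tail -n +11 | xargs -r rm --
ls dump-*.sql.gz | wc -l   # 10 files remain
```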

Read more...