Wednesday 13 June 2007

Backing up your home to DVD with debian

I've finally got a DVD writer (external - it's a laptop) and it works quite amazingly well: no setup issues at all!

Now, the whole point of the exercise was that my Dell is getting a little dodgy (you have to press down hard around the power button to flex the PCB below in order to get it to boot!) and I wanted to back up my documents, source code, emails etc. in case the worst should befall my poor old laptop. But I had a German exam the next day, and while I was burning the DVD I should also be revising!

In general, burning a data DVD on linux is easy. You get a copy of mkisofs and growisofs (on debian: apt-get install dvd+rw-tools mkisofs). Then you call mkisofs on a directory tree to create an iso image (put it in /tmp!) and growisofs to write the lot to your DVD (I got the instructions here).

Ahah, but not so fast! I wanted to do my home directory whilst xfce, claws-mail, firefox and probably a whole load of other processes were writing tiddly little bits to all my hidden dot-files and dot-directories in ~/ (you know, the ones like ~/.gnome and ~/.mozilla etc.). In fact, I didn't twig the first time and mkisofs threw a wobbly! Moreover, I didn't want to make another copy of everything because there's quite a lot of stuff and I didn't really have space for that on the hard disk partition.

So here's the solution:

First I made a folder called backmeup in my home directory:
mkdir backmeup
Then I moved all of the "normal" i.e. non-hidden files in my home directory into backmeup:
mv * backmeup
This will complain that you can't move backmeup into itself, but ignore the complaint - it did what we wanted! Now for the dot-files. We don't want to move them, because some of the running programs would throw a wobbly, so we'll need to copy this lot instead. But don't run cp -Rp .* backmeup/: the .* glob also matches the special entries . and .., so cp ends up copying your whole home directory (backmeup included) plus its parent into backmeup (disclaimer: yes, this did go horribly wrong for me. And it took me 5 minutes to notice!)

Instead, call the following command:
cp -Rp `find -maxdepth 1 -name '.*' | sed '/^\.$/d'` backmeup
which is what we really wanted.
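If your find supports -mindepth (GNU find does), the sed step isn't needed at all, since -mindepth 1 excludes the starting point "." itself. Here's the same idea demonstrated in a scratch directory (like the original one-liner, this still falls over on filenames containing spaces):

```shell
# Demo in a throwaway directory: set up a couple of dot-entries,
# a normal file, and the backmeup target.
cd "$(mktemp -d)"
touch .config-file normalfile
mkdir .dotdir backmeup
echo hello > .dotdir/inner

# -mindepth 1 makes find skip "." itself, so no sed filter is needed;
# -maxdepth 1 keeps it to the top level, and -name '.*' picks out
# just the hidden entries.
cp -Rp $(find . -mindepth 1 -maxdepth 1 -name '.*') backmeup

ls -A backmeup    # .config-file and .dotdir, but not normalfile
```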

Now we're almost done! Just make the iso from the backmeup tree (no-one's going to be writing automatically to something called that!):
nice -n 19 mkisofs -r -o /tmp/rupert.iso /home/rupert/backmeup
The nice command is just to stop the job hogging the whole system.
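Before reaching for a blank disc, it's also worth checking the image will actually fit. A little helper along these lines does the trick (the 4482 MiB figure is the usual usable capacity of a single-layer "4.7 GB" disc - treat it as an assumption, and note stat -c%s is the GNU form):

```shell
# fits_on_dvd FILE: succeed if FILE is small enough for a
# single-layer DVD (~4482 MiB of the nominal "4.7 GB").
fits_on_dvd() {
    size=$(stat -c%s "$1") || return 2
    [ "$size" -le $(( 4482 * 1024 * 1024 )) ]
}

# e.g.: fits_on_dvd /tmp/rupert.iso && echo "fits"
```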

... And write the iso to the disk:
growisofs -Z /dev/scd0=/tmp/rupert.iso

(clearly the device to which you write will need to depend on what the kernel calls it - I'm trying to do something clever with udev, but haven't quite got there yet!)
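For reference, the kind of udev rule I'm after would look something like this - an untested sketch, and the file name and symlink name are my own invention:

```
# /etc/udev/rules.d/99-dvdwriter.rules (hypothetical path)
# Give the writer a stable name, /dev/dvdwriter, whatever the
# kernel decides to call it this boot (sr0, sr1, ...).
KERNEL=="sr[0-9]*", SUBSYSTEM=="block", SYMLINK+="dvdwriter"
```

With something like that in place, growisofs -Z /dev/dvdwriter=/tmp/rupert.iso would work regardless of how the kernel numbers the device.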

Wednesday 6 June 2007

Ical subscriptions with orage

Orage looks to be an exciting program for my new Xfce desktop, but what good's a calendar if it can't subscribe to ical feeds from the internet?! Fortunately, I'm supposed to be revising at the moment, so have had an urge to hack code together...

And so we present a simple shell script to download .ics files periodically from the internet to your filesystem. The one nifty feature is that if the internet's down, this won't blat the current copies as using wget blindly would, which should be useful with my laptop.


There are probably more sophisticated systems, but I was going for the 25 minute job option. Firstly, we need a list of feeds - mine looked something like this:

## Lines must be either urls, blank, or start with a #.

# UK holidays
http://ical.mac.com/ical/UK32Holidays.ics

# NASA Space Missions
http://ical.mac.com/tonyfarley/SpaceMissions.ics

# Astronomical Events
http://hewgill.com/astrocal/astrocal.ics


The lines beginning with hashes really are comments - they get stripped by the downloading script, which uses wget:

#!/bin/sh

dir=/home/rupert/.icals

for f in `cat $dir/feeds.list | sed -e '/^$/d' -e '/^\#/d'`
do
    fout=$dir/`echo $f | md5sum | cut -f 1 -d ' '`.ics
    tmp=`mktemp $dir/getfile.XXXXXX`
    wget -q --tries=3 -O $tmp $f
    if [ -s $tmp ]; then
        mv $tmp $fout
    else
        rm $tmp
    fi
done


To make it work, save the feed list as feeds.list and point dir at the directory containing it - that's also where the downloaded .ics files will end up. The quickest way I could think of to name the files was the md5sum of the url, so once you've got everything in place, set the executable bit on the script (which I called get_feeds.sh) and run it.
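If you want to know which file corresponds to which feed, you can reproduce the naming scheme by hand - note the echo includes a trailing newline, just as in the script, so the sums match:

```shell
# Reproduce the script's filename (minus the .ics suffix) for a feed.
url=http://ical.mac.com/ical/UK32Holidays.ics
echo $url | md5sum | cut -f 1 -d ' '
```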

If all goes to plan, you should get some files called things like "5b7ed8afdd4a8f71525d0df0e47231e5.ics" appearing in $dir. Now we need to automate it: I used cron, so call crontab -e and add the following line to your crontab:

*/15 * * * * /home/rupert/.icals/get_feeds.sh

(well clearly, you'll need to get the path right!) This calls the get_feeds.sh script every 15 minutes. Maybe we should slow that down, but I was testing!
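For instance, once things are working, an hourly fetch is probably plenty - the minute field is arbitrary; I'd pick something off the hour to avoid hammering the servers along with everyone else:

```
3 * * * * /home/rupert/.icals/get_feeds.sh
```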

Finally, use the new "Exchange Data" item on Orage's File menu to add a foreign file corresponding to the .ics you downloaded and you're away! Wahey!

Incidentally, the SVN orage that I downloaded this evening doesn't appear to be getting multiple foreign files quite right - if I can work out what's going on, I'll file a bug tomorrow!