I maintain a Drupal codebase that hosts multiple sites, and until today I had been shamefully lax about taking regular backups of those files and databases. Here is a pretty basic bash script that creates a bzip2 archive of the Drupal codebase (including the site-specific dirs), then creates a mysqldump export of each site's database and gzips those files. This particular script requires that the Drupal install directory be named "drupal", but you can change that easily enough to suit your needs.
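In broad strokes it works like the sketch below. This is an approximation rather than the script verbatim: the backup directory, the codebase path, and the assumed one-line-per-site layout of ~/.drupalsites are placeholders.

    #!/bin/bash
    # Sketch of the backup script described above. Paths, the backup
    # directory, and the ~/.drupalsites format are illustrative guesses.
    set -e

    BACKUP_DIR="$HOME/backups"
    STAMP=$(date +%Y%m%d)
    PARENT_DIR="/var/www"   # directory that contains the "drupal" install dir

    mkdir -p "$BACKUP_DIR"

    # bzip2 archive of the whole Drupal codebase, site-specific dirs included
    tar -cjf "$BACKUP_DIR/drupal-$STAMP.tar.bz2" -C "$PARENT_DIR" drupal

    # one gzipped mysqldump per site; ~/.drupalsites is assumed here to hold
    # one "sitename dbname dbuser dbpass" line per site
    while read -r site db user pass; do
        [ -z "$site" ] && continue   # skip blank lines
        mysqldump -u "$user" -p"$pass" "$db" \
            | gzip > "$BACKUP_DIR/$site-$STAMP.sql.gz"
    done < "$HOME/.drupalsites"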

This does require a ~/.drupalsites file that contains the needed database login info (I suggest chmod 600 on it). Like I said, nothing too clever. The script runs fine from cron; I have it scheduled for a weekly run. I then plan to rsync the backups to my home server for an offsite copy, even though the hosting service provides backups as well.
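For reference, the weekly cron schedule and the offsite rsync amount to something like this (the day, time, paths, and hostname are placeholders, and the rsync is shown being pulled from the home server side):

    # crontab entry on the web host: run the backup every Sunday at 3am
    0 3 * * 0  /home/me/bin/drupal-backup.sh

    # on the home server: pull down a copy of the backup directory
    rsync -avz me@webhost.example.com:backups/ /srv/backups/webhost/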