I selected the free service offered by CloudSafe GmbH as the replacement for iDisk. They offer 2 GB for free. Their site is quite secure: all access is via HTTPS, and all data stored there is encrypted and can only be decrypted with a lengthy key. They also offer WebDAV access to the data over HTTPS.
The free CloudSafe accounts can have up to three WebDAV mountable remote drives, called “safes”, each with its own encryption key and access rules. For the purposes of backup, I created a safe called “Backup”.
In order to use the remote drive, you first have to use CloudSafe's dashboard to enable WebDAV on the safe. When you do this, the system displays two critical codes. The first is part of the address used to access the drive, and is a 10-digit number, as in « https://0123456789.webdav.cloudsafe.com/ ». The second is used, along with the e-mail address you use to access your CloudSafe data online, to access (i.e., decrypt) the data; it consists of four six-character alphanumeric strings, like ACB123-DEF456-GHI789-JKLMN0.
When you have received those codes, the first thing to do is to open the safe with Finder's “Connect to Server…” command (Cmd-K). It may be necessary to have some content in the safe for it to open correctly; in my case, I created a folder called Daily there. When Finder asks you to authenticate, enter the full https address as the server address, the e-mail address as the name, and the decryption string as the password. IMPORTANT: tell it to remember this in your login keychain; the script below relies on that.
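Using the sample codes above, the dialog values would look something like this (the e-mail address here is, of course, a placeholder for your own):

Server Address: https://0123456789.webdav.cloudsafe.com/
Name:           you@example.com
Password:       ACB123-DEF456-GHI789-JKLMN0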
Now, some of what follows can be done differently if you prefer, but this is what I did.
I have a miniature partial unix-style file system called “usr” under Documents in my home directory. I put it there to keep it relatively unobtrusive and to avoid cluttering the main file system. In what follows, it is assumed that the folder “~/Documents/usr/libexec” exists to contain the script.
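If you don't already have that folder, it can be created in one step:

mkdir -p ~/Documents/usr/libexec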
Next, the script itself:
#!/bin/ksh
# backs up a list of folders or files to the CloudSafe Daily folder.
# The backups are done in subfolders of Daily as follows: there is a
# folder for every month (%m; 01-12) in every year (%Y). The backup is
# done there whenever the corresponding folder (%Y%m) doesn't exist. On
# all other days, the backup is done in a 7-day cycle based on the day
# of the week (%u; 1-7; Monday = 1). All previous contents (if any) are
# removed before each backup.
# NOTE: the CloudSafe file system is very simple and does not support
# links and the like, so nothing complicated should be backed up here.
# All source paths are relative to $HOME. If it ever becomes necessary
# to back up more complicated filesystem structures, we could back up a
# tar archive or a disk image instead.
# script name, used to tag log entries
Me=$(basename "$0" .ksh)
# server info
SAFE=0123456789 # REPLACE THIS WITH YOUR SAFE'S INFORMATION
SERVER=webdav.cloudsafe.com
URL="https://$SAFE.$SERVER/Daily"
# mountpoint info
MNT=/Volumes
DEST="$MNT/Daily"
# date components used to choose the destination folder
Year=$(date +%Y)
Month=$(date +%m)
Day=$(date +%u) # day of week, 1-7, Monday = 1
# try a command up to n times or until it succeeds, pausing in between
function tryrep {
    typeset i ntry=$1 ; shift
    for (( i=0 ; i<ntry ; i++ )) ; do
        if "$@" ; then return 0 ; fi
        sleep 10
    done
    return 1
}
log(){
    print -- "$Me: $*" | logger -s
}
err(){
    log "$*"
    exit 1
}
# like err, but unmount and remove the mountpoint first
errum(){
    if tryrep 100 umount "$DEST" ; then
        sleep 5
        if [[ -d "$DEST" ]] ; then
            rmdir "$DEST"
        fi
    fi
    err "$*"
}
# the list of assets to back up, relative to $HOME
set -A Src \
    Library/Keychains/personal.keychain \
    Library/Keychains/login.keychain
# mount volume
if ! mkdir "$DEST" ; then
    err "Mountpoint '$DEST' is in use or $MNT is unwritable"
fi
# assumes that the authentication is in the user's keychain & mount_webdav has access
if ! tryrep 10 /sbin/mount_webdav "$URL" "$DEST" ; then
    rmdir "$DEST"
    err "Failed to mount '$DEST'"
fi
log "Mounted '$URL' at '$DEST'"
# establish and zero the destination folder
if [[ ! -d "$DEST/$Year$Month" ]] ; then
    Dest="$DEST/$Year$Month" # first run this month: monthly backup
else
    Dest="$DEST/$Day" # otherwise, a day-of-week backup
fi
rm -rf "$Dest"
if ! mkdir "$Dest" ; then
    errum "Could not create destination folder '$Dest'"
fi
for (( i=0 ; i<${#Src[*]} ; i++ )) ; do
    where=$(dirname "${Src[i]}") # recreate the folder structure under $Dest
    mkdir -p "$Dest/$where"
    if ! cp -Rp "$HOME/${Src[i]}" "$Dest/$where" ; then
        errum "Copy returned an error (${Src[i]})"
    fi
    log "Copied '${Src[i]}' to '$Dest/$where'"
done
log "Backup complete"
if tryrep 100 umount "$DEST" ; then
    sleep 5
    if [[ -d "$DEST" ]] ; then
        rmdir "$DEST"
    fi
else
    err "Problem unmounting '$DEST'"
fi
log "Unmounted '$DEST', exiting"
exit 0
The version of the script above backs up only your main login keychain plus a “personal” keychain, but you can alter the « Src » array to contain what you want to include. These can be either files or folders. Note that they shouldn't include symlinks or Finder aliases, because those aren't supported in the CloudSafe filesystem.
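For example, to add a (hypothetical) folder of notes kept under Documents, you would extend the array like this, keeping a backslash at the end of every line but the last:

set -A Src \
    Library/Keychains/personal.keychain \
    Library/Keychains/login.keychain \
    Documents/Notes # hypothetical example; substitute your own paths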
Next, use the crontab -e command to create an entry in your personal crontab like this:
30 2 * * * ~/Documents/usr/libexec/cloudSafeDaily.ksh

This entry will run the above script at 2:30 AM every day. Take a look at the documentation in crontab(1) and crontab(5) for more information about how you can set this up to run.
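Since cron executes the file directly, the script must be executable; it is also worth running it once by hand first to confirm that the mount and copy steps work:

chmod +x ~/Documents/usr/libexec/cloudSafeDaily.ksh
~/Documents/usr/libexec/cloudSafeDaily.ksh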
Basically, what the script does is to try (heroically) to mount your Backup safe at the indicated time. It figures out the year, month, and day of the week using the date(1) command. It looks to see whether there is already a long-term backup for the current year and month (for example, /Volumes/Daily/201203); if there isn't, it uses that as the destination; otherwise, it uses the day of the week (for example, /Volumes/Daily/1) as the destination. Then it copies the indicated data into the destination (after first removing whatever was there before), creating all folders in the paths as needed. For example, it will create /Volumes/Daily/1/Library/Keychains/login.keychain along with the intermediate Library and Keychains folders. This folder creation is necessary to prevent files of the same name in different folders from overwriting each other.
This will allow you always to go back up to 7 days, plus it will keep one backup per month for as long as you let it run.
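If the script has been running since, say, March 2012, the safe's Daily folder will end up looking something like this:

Daily/
    201203/   long-term backup made on the first run in March 2012
    201204/   long-term backup made on the first run in April 2012
    1/        most recent Monday backup
    2/        most recent Tuesday backup
    ...
    7/        most recent Sunday backup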
It does not check for space, because the WebDAV filesystem doesn't support that feature correctly. So, it will keep going until you get an error, which shouldn't be a problem if you use this only for smallish files. If the script works normally, there will be a few lines of information written to the system log; if there are errors, a descriptive log entry will be made to help you try to pinpoint the problem.
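For instance, a successful run of the script above writes entries along these lines to the system log (the exact syslog prefix depends on your configuration):

cloudSafeDaily: Mounted 'https://0123456789.webdav.cloudsafe.com/Daily' at '/Volumes/Daily'
cloudSafeDaily: Copied 'Library/Keychains/personal.keychain' to '/Volumes/Daily/1/Library/Keychains'
cloudSafeDaily: Copied 'Library/Keychains/login.keychain' to '/Volumes/Daily/1/Library/Keychains'
cloudSafeDaily: Backup complete
cloudSafeDaily: Unmounted '/Volumes/Daily', exiting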
Why did I make the login and personal keychains the default items to back up?
There is a bunch of critical information in the login keychain; plus, you can store text in there as encrypted secure notes. I use these for all of my password information and various other important, secret information.
Note that secure notes are not unlocked automatically by default, but some passwords are. Also note that the password for the login keychain is normally the same as your login password, and some feel that this is a security problem. If you agree, then my advice is to create a second keychain file, which I call « personal.keychain », for example. Put things that are unlikely to be needed by programs, such as your secure notes and certain passwords and certificates, into it, and give it its own, different password. I added it to the nightly backup on a line before « Library/Keychains/login.keychain » that says « Library/Keychains/personal.keychain \ », so both keychains are backed up. Note the backslash at the end of the non-final line: it is critical. Another option would be to remove the final « /login.keychain » from the existing line; this causes the entire Keychains folder to be backed up, no matter how many keychains you have in there (I didn't do that by default because a lot of useless files can sometimes accumulate in the Keychains folder).
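With that second option, the asset list collapses to a single entry covering the whole folder:

set -A Src \
    Library/Keychains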
UPDATE: It turns out that, in order for the cron job to get access to the WebDAV credentials, they must be added to the System keychain, and access to them must not be restricted. This doesn't seem acceptable to me.