Mac/Linux – Backing up MySQL Databases

For similar reasons to those behind drupal_db_dump, my script for dumping Drupal databases, I wrote a wrapper script for the mysqldump command. One difference, however, is that you can pass options through to the mysqldump command, which allows for greater utility. You do this by appending -- to the script's arguments and then specifying any valid mysqldump flags.


./msdw -d database1,database2,database3 -p /path/to/store/dumps -a -- -u mysql_user --password=itsasecret --single-transaction

The only mysqldump flags you shouldn't specify are ones relating to which databases to dump (ex: --all-databases). The database(s) you wish to dump must be the argument to the -d flag, and if you specify multiple databases, they must be separated by commas with no spaces in between. As with drupal_db_dump, specifying -a will auto-purge all but the last dump for the previous month when a new month begins. But again, by design, it will do this only if the backups span a maximum of 2 months.
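A minimal sketch of that splitting idiom, for anyone curious (this illustrates the pattern; it is not msdw's actual source):

```shell
#!/bin/bash
# Everything before "--" belongs to the wrapper; everything after it is
# handed to mysqldump untouched.
split_args() {
  wrapper_args=()
  while [ $# -gt 0 ]; do
    case "$1" in
      --) shift; break ;;
      *)  wrapper_args+=("$1"); shift ;;
    esac
  done
  mysqldump_args=("$@")
}

split_args -d db1,db2 -p /backups -a -- -u mysql_user --single-transaction
echo "wrapper: ${wrapper_args[*]}"      # wrapper: -d db1,db2 -p /backups -a
echo "mysqldump: ${mysqldump_args[*]}"  # mysqldump: -u mysql_user --single-transaction
```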

For anyone wondering, msdw is an abbreviation for mysqldump wrapper. The guy that usually names my scripts got fired for being too verbose. drupal_db_dump, really? Terrible.


Linux/Mac – Backing up Drupal Databases

I manage several servers that host a variety of Drupal 6 & 7 sites, and I needed a script to intelligently dump each site's database. Enter drupal_db_dump, a script I wrote in Ruby that uses drush (a command line tool for managing Drupal) to do exactly that. This script is best run via a cron job and takes two or three parameters: a Drupal sites directory, a directory where you'd like the dumps to be stored, and an optional archive toggle. Here's what it will do:

  1. Go into every site in the specified sites directory and dump each database to a dated folder inside the dump directory (organized by site).
  2. Create an md5 checksum of the dump and write it to a file inside the dated folder.
  3. Compress the dated folder, and remove the original.
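The three steps above can be sketched in shell (the real script is Ruby and leans on drush's sql-dump command; the function and directory names here are my own placeholders):

```shell
#!/bin/bash
# Per-site dump logic, sketched. Assumes drush is on the PATH and each site
# lives in its own folder under the sites directory.
dump_site() {
  local site_dir="$1" dump_dir="$2"
  local name today out
  name=$(basename "$site_dir")
  today=$(date +%Y-%m-%d)
  out="$dump_dir/$name/$today"
  mkdir -p "$out"
  # 1. Dump the database; drush reads the credentials from settings.php
  (cd "$site_dir" && drush sql-dump --result-file="$out/$name.sql")
  # 2. Write an md5 checksum next to the dump
  (cd "$out" && md5sum "$name.sql" > "$name.sql.md5")   # `md5` on a Mac
  # 3. Compress the dated folder and remove the original
  tar -czf "$out.tar.gz" -C "$dump_dir/$name" "$today"
  rm -rf "$out"
}
```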

I also included an optional archiving functionality. If specified (-a), it will purge all but the last backup for the previous month when a new month begins. The last backup is moved to a folder called monthly_archives. By default, this functionality is turned off.
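The purge rule can be sketched like this, assuming dated archives named YYYY-MM-DD.tar.gz (again, an illustration rather than the script's actual code):

```shell
#!/bin/bash
# -a purge rule, sketched: once a new month begins, keep only the last dump
# from the previous month (moved into monthly_archives) and delete the rest.
archive_previous_month() {
  local site_dir="$1" prev_month="$2"    # prev_month e.g. "2013-04"
  local dumps=("$site_dir/$prev_month"-*.tar.gz)
  [ -e "${dumps[0]}" ] || return 0       # nothing from that month
  mkdir -p "$site_dir/monthly_archives"
  local last="${dumps[${#dumps[@]}-1]}"  # glob order is chronological here
  mv "$last" "$site_dir/monthly_archives/"
  rm -f "${dumps[@]:0:${#dumps[@]}-1}"   # everything but the last
}
```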

Benefits of use:

  1. Eliminates the need to configure a separate backup for each site, as it will traverse all sites in the given directory.
  2. Quicker to set up, as you do not need to configure a dedicated MySQL user; drush will use the information in the site's settings.php file to dump the database.
  3. Checksums!


Caveats:

  1. If the backups for a site span more than two months, this script will not attempt to archive them. This is by design. You will need to manually move or remove the backups until they span only one or two months.

Mac – Prevent Dropbox & Google Drive from Asking for Administrator Privileges

The first time Dropbox or Google Drive is launched on a Mac, it will ask the current user for Administrator privileges so that it can install a helper utility to modify folder icons, which is then used for all subsequent users. This obviously presents a problem when you deploy these applications through ARD, munki, Casper, or Dell KACE (why!?), as the first user that tries to configure their account will most likely not have Administrator privileges; otherwise they probably would not have needed you to install it.

To solve this problem I've created two post-install scripts, one for Dropbox and the other for Google Drive, that put these helper utilities in place so that a user is never asked for administrative credentials. You use them either by appending one to your install through your software management tool of choice, or by running it as a script after copying one of the applications to the Applications directory.

You can find the script for Dropbox here, and the one for Google Drive here. These scripts have been tested with Dropbox 2.0+ and Google Drive 1.9+. But of course, I recommend testing them with whatever version you're using before you deploy en masse.
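For a sense of what such a post-install script does, here is a generic sketch; the real scripts use the specific paths each application expects, and the chown step needs root to take effect:

```shell
#!/bin/bash
# Generic sketch: pre-install the helper with the root ownership and setuid
# bit it would otherwise ask an administrator to grant. Both paths are
# placeholders, not the locations the real scripts use.
place_helper() {
  local src="$1" dest="$2"
  mkdir -p "$(dirname "$dest")"
  cp "$src" "$dest"
  chown root:wheel "$dest" 2>/dev/null || true  # needs root to take effect
  chmod 4755 "$dest"                            # setuid, world-executable
}
# Usage: place_helper "/Applications/Example.app/Contents/Resources/helper" \
#                     "/Library/ExampleHelperTools/helper"
```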

Mac – Munkiserver Puppet Module

Forget what you know about the word puppet, because when I say Puppet I want you to think of configuration/state management to the extreme, and not marionettes. What can Puppet do, you ask? Puppet can manage a machine from the beginning to the end of its lifecycle. It can enforce a state on a machine. You want to ensure that the SSH/Apache/MySQL services are always running? No problem, Puppet will do that. And you'll see this firsthand after the jump, but it can also automate repetitive tasks (ex: setting up clients) and quickly deploy additional servers to help load balance a critical service. Alright, the sales pitch is over. If you want to know more, you can learn about how Puppet works here.

Munkiserver? You know about munki, Greg Neagle's fantastic software management application, but what is munkiserver? Munkiserver is a Ruby on Rails web application for managing your munki setup, developed by Jordan Raine. It uses munki a little bit differently but adds some neat features. For example, clients are in a 1-1 relationship with the server (i.e. each client has its own manifest), making it super easy to specify one-off installs. However, you can still group clients together using computer groups and apply software bundles to them, thus achieving the same level of functionality as regular manifests in vanilla munki. Another difference is that all configurations (ex: pkginfo, manifests, bundles, etc…) are stored in a backend database; there is no flat repo. This does add some complexity and makes it impossible to add manifest logic. However, munkiserver does give you the ability to add raw tags to a package's pkginfo file via the web application. Now that I mention it, everything is done through the web application:

  • Adding/removing computer clients
  • Uploading/editing packages
  • Editing manifests
  • Assigning user/group permissions
  • Viewing which packages have updates
  • Viewing warranty information
  • The list goes on…

If munkiserver sounds like something you want to try but you don't want to spend the time setting it up, today is your lucky day. I've written a Puppet module that will automatically configure a new instance of the munkiserver application on any Mac OS X 10.6+ system in 20 minutes or less (depending on your internet connection and CPU speed). This writeup assumes you have no knowledge of Puppet and no existing Puppet server: I will explain how to deploy a new munkiserver using only a local Puppet manifest (no Puppet server required). If you already have a Puppet server, I think you'll know what to do for the most part. Details after the jump.

Continue reading

Mac/Windows/Linux – Disabling Adobe’s PDF Plugin in Firefox

Back in January 2010 Greg Neagle wrote up a popular article detailing how to set default settings for Firefox. While his article was Mac focused, this technique can be used on all platforms. And in using it, you can effectively control a user's experience with the browser. I have previously been using custom cfg files to disable the annoying update popups users receive, as we use munki to update software and most of our users aren't administrators anyway. I recently updated our cfg to disable Adobe's PDF plugin and thought I'd share how I did it as, if you can believe this, it's prone to problems. If you don't already have a custom cfg file in place, I recommend reading Greg's article and following his instructions first. If you're doing this on Windows or Linux, swap out the file locations mentioned in Greg's article for the ones posted in the resource section at the end of this article.

Got a custom cfg file? Good. Add the following lines (tested on Firefox 17+):

pref("browser.preferences.inContent", true);
pref("pdfjs.disabled", false);
pref("plugin.disable_full_page_plugin_for_types", "application/pdf");

The first two lines enable the in-browser PDF viewer Firefox recently added. The third line forces Firefox to switch back to its default PDF handling action (Use Preview) if any 3rd party PDF plugin is selected on launch. Note that non 3rd party options (i.e. Use Preview, Always Ask, Preview in Firefox, Save File) can still be chosen and will stick on re-launch. I want to emphasize this point: the above does not stop a user from selecting a 3rd party PDF plugin for their session; it merely resets that choice on the next launch.



Resources – file locations:

Windows (may vary on the architecture version installed):

cfg: "C:\Program Files\Mozilla Firefox"

js: "C:\Program Files\Mozilla Firefox\defaults\pref"

Linux (may vary on flavour, installation method, and/or architecture version installed):

cfg: "/usr/lib/firefox/"

js: "/usr/lib/firefox/defaults/preferences/"

Mac – Setting Up a Chroot User/Group for SSH

“A chroot on Unix operating systems is an operation that changes the apparent root directory for the current running process and its children. A program that is run in such a modified environment cannot name (and therefore normally not access) files outside the designated directory tree. The term “chroot” may refer to the chroot(2) system call or the chroot(8) wrapper program. The modified environment is called a “chroot jail”.”


Why would someone want to do this? Well, sometimes a user doesn't need access to the entire filesystem and every command to do their job. In my case, I was setting up an SSH SOCKS proxy for some outside collaborators and wanted to limit what that SSH user could do on the command line, since they didn't need full access. I'll show you step by step how to set up a chroot jail environment after the jump.
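As a preview, the heart of the server-side setup is a Match block in sshd_config; note that TCP forwarding stays on, since the SOCKS proxy (-D) depends on it. The group name and chroot path below are illustrative, not the values from my setup:

```shell
#!/bin/bash
# Generate an example sshd_config Match block for a chrooted SSH group.
snippet=$(cat <<'EOF'
Match Group proxyusers
    ChrootDirectory /var/chroot
    AllowTcpForwarding yes
    X11Forwarding no
EOF
)
echo "$snippet"
```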

Continue reading

Mac/Linux – Parallel Rsync Utility

I'm in the process of migrating 10 TB of data from an NFS share to a CIFS share, and while talking over the details with my team lead he mentioned that he would slap me if I proceeded to do the transfer in serial :-). With that motivation, I wrote prsync_transfer! He was joking of course, but in all seriousness he was right: if you run rsync in a serial fashion, the initial "receiving file list..." process may take a while to complete, especially if you have a lot of small files to transfer. After the jump I'll show you the utility I wrote to resolve this issue.

Continue reading