DTrace in MySQL: Documentation and a MySQL University Session

DTrace has been something that I’ve been trying to get into the MySQL server for more than a year now.

After a combination of my own patches and working with Mikael Ronstrom and Alexey Kopytov we finally have a suite of probes in MySQL 6.0.8. Better still, after a short hiatus while I was busy working on a million-and-one other things, the documentation for those probes is now available: Tracing mysqld with DTrace.

The documentation is comparatively light and deep all at the same time. It’s lightweight from the perspective that I’ve added very little detail on the mechanics of DTrace itself, since there is no need to replicate the excellent guides that Sun already provide on the topic. At the same time, I’ve tried to provide at least one (and sometimes two) D script examples for each of the groups of probes in the 6.0.8 release.
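To give a flavour of what those D scripts look like, here is a minimal sketch along the lines of the examples in the documentation. It assumes the query-start and query-done probe names from the 6.0.8 probe set, and that arg0 of query-start carries the query text; check the documentation for the exact probe arguments in your build.

```d
#!/usr/sbin/dtrace -s

#pragma D option quiet

/* Record the query text and start time for this thread */
mysql*:::query-start
{
    self->query = copyinstr(arg0);
    self->start = timestamp;
}

/* On completion, report the query and its elapsed time */
mysql*:::query-done
/self->start/
{
    printf("%-60s %d us\n", self->query,
        (timestamp - self->start) / 1000);
    self->query = 0;
    self->start = 0;
}
```

Run as root against a running mysqld, a script like this prints each query along with the time it took to execute, which is exactly the kind of lightweight tracing the probes were designed for.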

So what next for MySQL DTrace probes?

Well, work on the next version of the probes is already under way. I’ve been testing them for a month or so; a problem with the probes on SPARC meant I couldn’t approve the patch, but I managed to resolve that last week. The new patch extends the probes to enable a more detailed look at certain operations, and it allows us to place probes anywhere within the server, including individual engine-level row operations.

If you want a demonstration of DTrace in MySQL, covering some of the things you can monitor without the user probes we’ve added as well as the new probes I just mentioned, then you will want to attend the MySQL University session this Thursday (12th Feb) at 14:00 UTC, where I’ll be doing all of the above.

As a side note, because I know there are people interested, last week I also finished the patch for these probes to go into the MySQL 5.1.30 version that we will be putting into OpenSolaris. Sunanda is working on getting that release out there as I type this.

LOSUG Presentation Slides Now Available

My presentation at LOSUG on Tuesday went down like a house on fire – I think it would be safe to say that the phrase for the evening was ‘It’s a cache!’.

For that to make sense, you need to look at the slides, which are now available here.

Attendance was great, but it seems the last minute change of day meant that some people missed the session. We had 151 people register, and about 80 turned up on the night.

LOSUG January: MySQL/DTrace and Memcached

Next Tuesday (27th Jan), I’ll be speaking at the London OpenSolaris User Group again. For those that follow the LOSUG meetings, we normally meet on the third Thursday of the month, but due to the overwhelming popularity of this month’s event (more than 100 registrations so far) we have had to push it back to the last Tuesday of the month.

This month, I’ll be talking about the DTrace probes that we have added to MySQL and demonstrating their use and functionality. Along the way I’ll also cover some of the internals of MySQL and how it works (and how they relate to the DTrace probes we’ve added), how to use the probes to analyze and diagnose your queries, and how I’ve already used the DTrace probes to feed information into the Query Analysis functionality within Enterprise Monitor.

After we’ve looked at performance monitoring and optimization with DTrace, I’ll then demonstrate how to get a little more performance out of your application using MySQL by taking advantage of the memory cache offered by Memcached.

As before, I’ll provide a link to the finished presentation once I’ve demonstrated everything to the LOSUG folks on Tuesday. If you happen to be in London, and don’t already have plans of some sort, please feel free to come along – food and drink will be provided after the session and I’ll be around for as long as I can for any questions.

Podcast Producer: Scheduling Podcasts

A new article in my series on getting the most out of Podcast Producer is now available, this time looking at a solution involving iCal and the command-line elements of Podcast Producer that can automate the process of recording.

This can be particularly useful if you have cameras set up in various classrooms or offices and have set times for different presentations. You can set the presentations to be recorded automatically, and use the information within the iCal event to set the metadata for the podcast itself. Using the system in the article, you can control your entire podcast process by creating new events within iCal.

If you work with podcasts on an ongoing basis, scheduling for the recording and publishing of your podcasts is critical. From an audience perspective, you want to have a regular stream of content to keep people interested in your podcasts. You also want to make sure that you are making the best use of your presenters by giving them the time and flexibility to create a podcast at a time that suits their schedule and yours.

Instead of explicitly recording a podcast and having the information submitted into the system, why not schedule podcast recordings to take place automatically according to your organizational needs? For example, within a school or university you may have regular class sessions that you want to record and publish as a podcast.

The article demonstrates how to use AppleScript in combination with iCal to automate the recording of podcasts from cameras without having to create each recording manually. The solution reduces scheduling a recording to creating a suitable event within iCal, and then lets iCal and Podcast Producer handle all of the complexities of the recording process.

Read: Podcast Producer: Scheduling Podcasts

Podcast Producer: Publishing to YouTube

My new article at the Apple Developer Connection is now available.

When creating podcasts you don’t always want to publish to one of the blog or wiki services on your Leopard Server, or to iTunes. How about posting to YouTube?

From the intro:

YouTube has created a whole new generation of users who like to view video over the Internet, whether at their computer, their laptop or when using their iPhone. With Podcast Producer, you have many workflows available to you on your Mac OS X Server, but you can also customize workflows and publish content directly to YouTube. Follow along to build a custom workflow that will take an existing video podcast through Podcast Producer and post the content directly onto YouTube.

The solution involves a custom application, using the YouTube Java kit, that submits the converted podcast content to your YouTube account.

Read: Podcast Producer: Publishing to YouTube

I love my Moleskine

Not remotely related to computing at all, but I’ve just been updating my diaries, and I use Moleskine. I go everywhere with a Moleskine of some description (they’ve recently released some really tiny notebooks, which make it easier to carry your life around with you).

Despite having computers, organization tools, and email, there is something decidedly comforting about writing, by hand, into a physical notebook.

Of course, you write in pencil so that the contents don’t smudge, and somehow that just feels even better.

Multiple VCS Updates and Cleanups

I spend a lot of time updating a variety of different repositories of different varieties and denominations, and I hate having to do that all by hand – I’d rather just go up into a top-level directory and say update-all and let a script work out what to do, no matter what different repos are there.

I do it with a function defined within my bash profile/rc scripts, and it covers git, bzr, svn, bk, and cvs. The trick is to identify what type of directory we are updating. I do this, lazily, for each type individually, rather than for each directory, but I’ve found this method to be more reliable.

update-all ()
{
    local file realdir
    # Subversion checkouts (identified by a .svn directory)
    for file in */.svn; do
        [ -d "$file" ] || continue
        realdir=${file%%/*}
        echo "Updating in $realdir"
        ( cd "$realdir" && svn update )
    done
    # Bazaar branches
    for file in */.bzr; do
        [ -d "$file" ] || continue
        realdir=${file%%/*}
        echo "Updating in $realdir"
        ( cd "$realdir" && bzr pull )
    done
    # Git repositories
    for file in */.git; do
        [ -d "$file" ] || continue
        realdir=${file%%/*}
        echo "Updating in $realdir"
        ( cd "$realdir" && git pull )
    done
    # CVS checkouts
    for file in */CVS; do
        [ -d "$file" ] || continue
        realdir=${file%%/*}
        echo "Updating in $realdir"
        ( cd "$realdir" && cvs up )
    done
    # BitKeeper repositories
    for file in */BitKeeper; do
        [ -d "$file" ] || continue
        realdir=${file%%/*}
        echo "Updating in $realdir"
        ( cd "$realdir" && bk pull )
    done
}

That’s it – a quick way to update any directory of repos.
