It's funny what a day's wrestling with a hard problem can lead to as inspiration.

I'm going to start with a piece of Django code that someone else wrote: Django Activity Stream, a simple application that lets you track "everything" the actors in your system do: every bookmark made, every comment left, every story read in my current publishing system. If you're an author, you want to know when your stories are read and when comments are made on them. If you're a reviewer, you want to know when someone replies to your review or gives you a thumbs up. And so forth and so on.

The trouble with all this is that, in order for it to work, you would have to touch every one of your modules with calls like action.send(), cluttering each piece of code with knowledge of this logging facility. Not cool. The activity stream is orthogonal to your business logic but critical to your goal: this is known as a cross-cutting concern. Having to touch all of those Django applications in order to get logging is known as tangling.

Django provides an easier way of monitoring its own internal activity, called signals. A Signal is a specialized object that publishes the fact that a certain event has occurred, and other systems can subscribe to be notified when the event fires.

Django provides a comprehensive collection of signals associated with the database, and since most Django applications are MVC, altering the model fires a database event that we can easily intercept and use to our advantage. The most common signal is post_save, which is emitted whenever anything saves to the database. Here's an example that listens for the creation of a FacebookUser (which also creates a standard Django User), and guarantees that a Profile for the user will also be generated:

from facebookuser.models import FacebookUser
from django.db.models.signals import post_save
# Wherever your Profile model lives; "profiles" here is a placeholder.
from profiles.models import Profile

def set_up_profile_for_remote_user(sender, **kwargs):
    if not kwargs['created']:
        return
    Profile(user=kwargs['instance'].user).save()

post_save.connect(set_up_profile_for_remote_user, sender=FacebookUser)

The post_save.connect method says "Listen for database saves, and if the sending object is a FacebookUser, call the function set_up_profile_for_remote_user." The callback gets three arguments: the sending class (FacebookUser), the specific instance that was just saved, and a flag indicating whether that instance was just created or merely modified. The last two are sent as named arguments, so most signal handlers use the **kwargs syntax.

When used with the activity stream (as with logging), the database is emitting all of these events. The activity stream has a method for capturing which object emitted the event, a "verb" associated with that event, and an optional target object. All we need to do is capture the various database events, associate them with actions, and route them to the stream. For this, I have my various event generators, the activity stream, and a small project-specific application I call "hub," into which I put the signal routers:

from django.db.models.signals import post_save
from actstream import action          # django-activity-stream's action signal
from comments.models import Comment   # wherever your Comment model lives

def record_comment(sender, **kwargs):
    if not kwargs['created']:
        return
    comment = kwargs['instance']
    action.send(sender=comment.user, verb='commented',
                target=comment, public=True)

post_save.connect(record_comment, sender=Comment)

By using a simple application silo like "hub," neither the activity stream application, nor any of the various applications being monitored, need know anything about one another, yet the cross-cutting concerns of the activity stream can be fully addressed.  This ease of dealing with such concerns is the cornerstone of Aspect-oriented programming, and is a good way of easing into an understanding of AOP.

A couple of caveats: One, Django signals are synchronous. In these examples, immediately after the database write completes, the signal receivers are called sequentially within the thread of the original request. Any signal handler that takes significant time to resolve ought to hand its work off to a queue manager. (This fact seems to surprise a lot of people when they encounter it.) Two, the order in which callbacks are invoked for a given signal cannot be set by the developer. If a signal has multiple subscribers, do not rely on the order in which they handle it.
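The hand-off pattern for the first caveat looks like this: the handler does only the cheap part inside the request thread — capture what happened and enqueue it — and a worker does the slow part later. In production the queue would likely be Celery or a similar task queue; this sketch uses a stdlib Queue purely to show the shape:

```python
import queue

# Stand-in for a real task queue such as Celery (not shown).
activity_queue = queue.Queue()

def record_comment_async(sender, **kwargs):
    # Cheap work only: capture the event and enqueue it. A separate
    # worker process would pull from the queue and do the slow part.
    if not kwargs['created']:
        return
    activity_queue.put((kwargs['instance'], 'commented'))

# In Django this would be wired with:
#   post_save.connect(record_comment_async, sender=Comment)
# Here we invoke it directly with fake arguments to show the flow.
record_comment_async('Comment', instance='a comment', created=True)
```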

There are quite a few Django signals covering pre- and post-save, deletion, and HTTP request management. You can always write your own if your application does something unique enough that the standard set won't cover it but must still publish events so other applications can listen in and get signaled as needed. For the most part, though, the standard set provides you with awesome capabilities that should get you on your way.

p.s. After a while handling signals, you may notice that the callback record_comment calls action.send, which is itself a signal publisher; the generic "add an action to the Action table" handler subscribes to it. The idea behind activity stream is that you have many publishers of actions and only one subscriber listening for those events. By centralizing the activity stream this way, we avoid the kind of clutter seen in Pinax, which is riddled with "optional" calls to the notification application. Take this example from the microblogging application:

try:
    from notification import models as notification
except ImportError:
    notification = None
...
    if notification:
        notification.send([reply_recipient], "tweet_reply_received", {'tweet': instance,})

This is exactly the kind of cross-cutting concern (note the use of a signal publisher for notification!) that would be better handled with a signal from the microblogging app. The app could have published "tweet received" as a signal, and an optional "module for common associated applications" could dispatch that event to logging, activity streams, notifications, or whatever is available. The presence of explicit publication would have signaled to any developer to look for subscribers, so I don't think this is a dangerous case of unlabeled subroutines and the spaghetti code that goes with them. Most Django apps are small silos of code; it would not have been onerous to separate this cross-cutting concern from the rest of the microblogging application's functionality.