Like many designers, I've been enjoying Clients from Hell, a collection of incidents freelance designers and developers have had with customers over the years.  Oh, boy, do I know some of these.  But there's one that stuck in my craw: "Slow It Down."  The basic complaint is common to Ajax-based applications: if you're close to the server when using Ajax for state management (sending commands back to the server to change or save some detail), the transition from "working" to "done" can be so short that some users won't realize the action they committed has been saved.

I saw this on a project for an intranet deployment once.  The user was presented with a list of alert states, and if an alert was being dealt with he could click a checkbox to "silence" it and it would stop harassing his pager.  Since it was an intranet, the LAN was very fast and the server lightly loaded, so the "working" spinner would blink by so fast that some people complained they didn't know whether the action had been saved.  Where, they asked, was the submit button?

In "Slow It Down," the developer says that he added a 1.5 second delay "on the server."  He presents this as an example of a dumb client, but here the client is right and the developer is solving the problem the wrong way.  Using sleep() on the server ties up a thread, may slow down other transactions, and most importantly blocks the client from performing other actions.  The correct solution is to impose the delay on the client, where it uses no server resources and the user can have many such transactions going on at the same time.

Here's the whole solution, using jQuery. I create a function that clears the spinner out. Then I add an event handler that handles my data transaction and registers an on-end event handler. That event handler checks to see if the transaction took longer than the "display spinner" timeout. If it did, it clears the spinner.  If it did not, it creates a new setTimeout event to clear the spinner after the specified amount of time has passed.

(function() {
    var timeout = 500;

    // Restore the UI once the spinner has been visible long enough.
    // (Assumes the spinner lives in a #spinner container next to the form.)
    function swapSpinner() {
        $('#spinner').empty();
    }

    $('#submit_button').click(function() {
        // Show the "working" spinner and note when we started.
        $('#spinner').html(
            '<img src="/assets/images/spinner.gif" alt="" />');
        var form = $('#remote_form');
        var data = form.serialize();
        var action = form.attr('action');
        var started = new Date();

        $.post(action, data,
               function(data, status) {
                   var when = new Date();
                   var inter = when.getTime() - started.getTime();
                   console.log(status, inter);
                   if (inter < timeout) {
                       // Too fast: leave the spinner up for the rest of
                       // the minimum display time, then clear it.
                       setTimeout(swapSpinner, timeout - inter);
                   } else {
                       swapSpinner();
                   }
               });
        return false;
    });
})();
This is the correct way to do this: a server thread is freed the moment the transaction is done, but the cognitive message that the server is doing something has been successfully delivered to the user, and the (admittedly minimal, but as Google can tell you, gazillions of minimal costs add up) CPU load has been distributed to the client. Because JavaScript is single-threaded and event-driven, many such events can be going on at once, so in the example I described above, the customer could click several boxes in a row and the system would do the right thing.
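The pattern generalizes beyond jQuery. Here's a minimal sketch in plain JavaScript of the same "minimum display time" idea, written as a Promise wrapper so several transactions can be in flight at once, each with its own timer. The helper name withMinimumDelay is mine, not from the original post:

    // Run an async task, but don't report completion until at least
    // minMs milliseconds have elapsed since it started.
    function withMinimumDelay(task, minMs) {
        var started = Date.now();
        return task().then(function(result) {
            var elapsed = Date.now() - started;
            if (elapsed >= minMs) {
                // The transaction was slow enough on its own.
                return result;
            }
            // Too fast: hold the result back for the remaining time.
            return new Promise(function(resolve) {
                setTimeout(function() { resolve(result); },
                           minMs - elapsed);
            });
        });
    }

A click handler would show the spinner, call withMinimumDelay with the actual POST, and clear the spinner in the .then() callback; because each call keeps its own started timestamp, clicking several checkboxes in a row just creates several independent timers.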

The developer might say that this is a case of educating the user.  I disagree: this is a case of unpleasant surprise.  There's a rule in web design: exceed your client's expectations gently.  If you're going to make transactions instantaneous, you're taking away the cognitive step of submitting the transaction.  You must replace it with some alternative acknowledgment.  A delay does that.  A delay done right does that and creates new opportunities for further exploration of the technique.