Mean Time Between Stupid Events

Mean time between stupid events (MTBSE) is a politically incorrect term, so offensive that it is rarely spoken aloud. Yet MTBSE is understandable by almost everyone, particularly in Monday-morning-quarterback sessions. MTBSE ruins reputations. It applies to humans, as it measures their failure rates of both commission and omission.

Mathematically, MTBSE = (Summation of time)/(Summation of stupid events).
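The formula above can be sketched as a few lines of code. The function name and the numbers in the example are illustrative assumptions, not data from this article:

```python
def mtbse(total_operating_hours, stupid_event_count):
    """MTBSE = (summation of time) / (summation of stupid events)."""
    if stupid_event_count == 0:
        # No observed stupid events: MTBSE is undefined (or "very large, so far").
        raise ValueError("MTBSE is undefined with zero stupid events")
    return total_operating_hours / stupid_event_count

# Hypothetical example: one year of operation (8760 hours) with 4 stupid events
print(mtbse(8760, 4))  # 2190.0 hours between stupid events
```

Note that, like any mean-time-between metric, the estimate is only as good as your willingness to count every stupid event honestly.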

Stupid events are those situations you’d rather forget if you performed them.  They’re situations where you exclaim: “What were they thinking when they did that!”  They’re irrational acts performed by usually rational people that result in severe embarrassment.  Stupid events can be as innocent as malapropisms (unintentional use of the wrong word), mispronunciations, and other statements you’d like to retract as soon as they are spoken.  Stupid events are e-mails launched that you wish you could retrieve.  Stupid events can also be deadly: situations that started out well but came to a sad and fatal ending.  You know a stupid event when you see it (see Figure 1).

Figure 1: Train Wreck—MTBSE? 

Did you say: “What were they thinking?”   This October 21, 1895 accident occurred at the Gare Montparnasse station in Paris.  The locomotive overran the bumper stop, careened across the station concourse, sailed out of the station, and crashed into the street below.

MTBSE particularly characterizes many events performed by immature people, people who have consumed too much alcohol, and people who are on legal or illegal drugs.  It frequently characterizes our personal performance.  MTBSE often characterizes institutional performance.

MTBSE characterizes management Go-I-tus (“we have to maintain the schedule; what will people think about us if we delay!”).  Go-I-tus is a well-known phenomenon among pilots who commit unsafe acts simply to get from point A to point B when conditions are unfavorable; they think they’re bulletproof, and we get another monument to MTBSE.   Go-I-tus was a primary cause of the Challenger space shuttle failure, where management took unnecessary risks to maintain the schedule in the face of technical discussions rejecting the launch (was it the same institutional performance for Columbia?).  In the end, the actions are characterized by MTBSE.  Neither space shuttle loss was a failure of aging equipment; both were MTBSE.

MTBSE is induced by demanding too much to occur in too short an interval of time.  Consider the probability of human error in a nuclear reactor control room for a single event, as shown in Table 1 (see the reading list for the Human Reliability & Safety Analysis Data Handbook by Gertman and Blackman, John Wiley, 1994, ISBN 0-471-59110-6, based on Swain, A.D. (1987), Accident Sequence Evaluation Procedure (ASEP) Program, NUREG/CR-4772, U.S. Nuclear Regulatory Commission, Washington, DC).  We set up our people to fail by demanding that their responses be too quick.  If you need fast responses, then you had best automate; otherwise, you’ll be pointing your finger at an MTBSE (and you may be the root of the MTBSE problem by asking your humans to do too much too quickly).

Table 1: Time To React And Human Reliability In Control Rooms

Time To React For A Single Or First Event (Swain 1987) | Probability Of Human Error (Swain 1987) | Human Reliability*

[The numeric rows of Table 1 did not survive in this copy of the page.]

* If they haven’t forgotten what they are to do!
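The relationship implied by Table 1 is that human reliability is simply the complement of the probability of human error. A minimal sketch follows; the reaction times and error probabilities in it are made-up placeholders for illustration, not Swain’s published ASEP values:

```python
def human_reliability(p_error):
    """Reliability = 1 - probability of human error."""
    if not 0.0 <= p_error <= 1.0:
        raise ValueError("a probability must lie in [0, 1]")
    return 1.0 - p_error

# Hypothetical (time-to-react, P(error)) pairs -- NOT the Table 1 data
illustrative = {1: 0.9, 10: 0.1, 60: 0.001}

for minutes, p in illustrative.items():
    print(f"{minutes:>3} min to react: P(error)={p}, reliability={human_reliability(p)}")
```

The qualitative trend is the point: the more time a human has to react, the lower the error probability and the higher the reliability, which is why demanding instant responses sets people up to fail.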

In case you don’t believe what the nuclear reactor business has been able to achieve in reducing MTBSE, read the short article on Wikipedia about NUREG-1150 or go directly to the Nuclear Regulatory Commission for the report.  You may also find the organizational effort to error-proof the system of interest in the book Hostages of Each Other by Joseph Rees, which describes improvements following the Three Mile Island incident.

The amount of risk we take is $Risk = (probability of failure) × ($consequence of the failure).  Don’t take too much risk: it’s unwise, and you can be a direct contributor to MTBSE.  Don’t take too little risk either, as it’s unprofitable to spend excess money to cover so little risk.  You must know when to accept the risk and when to deny the risk.
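The risk equation above reduces to one multiplication. The probabilities and dollar figures below are hypothetical, chosen only to show how two alternatives compare:

```python
def dollar_risk(p_failure, consequence_dollars):
    """$Risk = (probability of failure) * ($consequence of the failure)."""
    return p_failure * consequence_dollars

# Two hypothetical alternatives facing the same $1,000,000 consequence:
hasty   = dollar_risk(0.10, 1_000_000)  # rushing the schedule
prudent = dollar_risk(0.01, 1_000_000)  # taking time to do it right

print(f"hasty:   ${hasty:,.0f}")
print(f"prudent: ${prudent:,.0f}")
```

Comparing the two numbers against the cost of reducing the failure probability is how you decide when to accept the risk and when to deny it.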


Last revised 10/15/2007
© Barringer & Associates, Inc. 2004