The Hawaii False Alert: Design Flaws in Public Alert System

A Professor's Perspective on Current Events

 

By Professor Michael Roberto, D.B.A.

Is it really human error? Professor Michael A. Roberto discusses the design flaws behind the panic-inducing false alert in Hawaii.

This article originally appeared on Michael Roberto’s Blog.

Don Norman has written an outstanding article for Fast Company about the false alert that caused panic in Hawaii over the weekend. In the aftermath of the incident, we heard that “human error” caused the false alert to be transmitted widely to citizens of the state. Norman challenges this initial conclusion. He writes:

When some error occurs, it is commonplace to look for the reason. In serious cases, a committee is formed which more or less thoroughly tries to determine the cause. Eventually, it will be discovered that a person did something wrong. “Hah,” says the investigation committee. “Human error. Increase the training. Punish the guilty person.” Everyone feels good. The public is reassured. An innocent person is punished, and the real problem remains unfixed. The correct response is for the committee to ask, “What caused the human error? How could that have been prevented?” Find the root cause and then cure that. To me, the most frustrating aspect of these errors is that they result from poor design. Incompetent design. Worse, for decades we have known how proper, human-centered design can prevent them.


Norman points out several egregious design flaws in this alert system. First, why was no confirmation required before the alert was sent? Ideally, he notes, the confirmation should come from a second person working independently of the person who selected the alert message. Second, when the system operates in test mode, every message should start with a clear indication that it is only a test. That indication should be in bold! It should be capitalized! It should be crystal clear! Finally, the system should be designed to enable an immediate correction. The long delay in correcting the false alert was preventable with better design.
In sum, you can look at any failure in two contrasting ways. You can examine it individualistically (i.e., it was human error), or you can look at it systemically (i.e., what systems, procedures, and situational factors contributed to poor actions or decisions?). The latter approach is much more likely to lead to learning, improvement, and future accident prevention. We have shown that in our own research on tragic accidents such as the Columbia space shuttle disaster.
For more with Professor Roberto, check out his courses on The Great Courses Plus!
