A few days ago, a ballistic missile alert went out to residents of the U.S. state of Hawaii, causing them to receive this text to their phones:
Source: Andy Thammavongsa/Twitter
Following the alert—which took 38 minutes to correct—state officials blamed the situation on a user error. Hawaii Governor David Ige had this to say:
“It was a mistake made during a standard procedure at the change over of a shift, and an employee pushed the wrong button.”
But is that account fair on the user—and is it a fair analysis of what happened?
A photo of the screen in question has emerged, which makes it pretty clear how this mistake happened. These are the options the operator in question apparently faced:
The options for an alert drill and a real alert are almost identical in text and appearance. What’s more, the link for a real alert is confusingly placed below text that reads “1. TEST message”, making it look like a section heading.
The system operator working at the time has reportedly been moved to other duties. But was it really their fault? What this incident shows is what happens when we build systems that are not merely unfriendly to users but, in many cases, actively user-hostile.
Many of us who have worked with point of sale machines in bars, or with finance systems in back offices, will have experienced screens that look a lot like this one. Clunky, counterintuitive interfaces easily mislead us and cause us to make errors, even — perhaps especially — in systems that we have used hundreds of times before.
Compounding this problem, many employees are under constant pressure to work quickly, or to handle multiple duties simultaneously. Of course, for most of us, a misplaced click on a computer system at work does not carry significant consequences. Unfortunately for the operator in Hawaii, choosing the wrong link led to a major statewide security alert and international headlines.
If we’re going to play the blame game, we should recognise that in this and similar cases, the blame really lies with the process (or lack of one) that led to the operator being confronted with the screen above.
As contributors to this Hacker News thread have noted, the false alert was the result not only of one user action, but also of the work of designers (or its absence), the decisions of officials who procured the system and signed off on the interface, the developers who coded and modified it — in short, everyone who potentially added to the number of options available in the list, obscuring core functionality and making it harder to distinguish between test and real alerts.
As with many medical systems, a big part of the problem is that those choosing, commissioning, and approving systems are not the end users but rather their managers—often people several levels of seniority removed from the front line.
UX design authority Don Norman has identified three simple ways in which a different user experience design could have prevented the false alert in Hawaii:
- Have a separate test mode for the system, with each message stating clearly “this is only a test”.
- Have the system default to test mode.
- Require a second person to confirm the sending of any alert (real or test)—much as a pharmacist has a colleague check each prescription before it is handed to the patient.
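To make the three safeguards concrete, here is a minimal sketch of how they might look in code. Everything here is hypothetical—the class and method names (`AlertConsole`, `confirm`, `send`) are illustrative, not taken from any real alerting system:

```python
from dataclasses import dataclass, field
from enum import Enum


class Mode(Enum):
    TEST = "test"
    LIVE = "live"


@dataclass
class AlertConsole:
    """Hypothetical alert console embodying Norman's three safeguards."""

    # Safeguard 2: the system defaults to test mode; going live
    # requires an explicit, deliberate change.
    mode: Mode = Mode.TEST
    confirmations: set = field(default_factory=set)

    def confirm(self, operator_id: str) -> None:
        # Safeguard 3: each operator independently confirms the send.
        self.confirmations.add(operator_id)

    def send(self, message: str) -> str:
        if len(self.confirmations) < 2:
            raise PermissionError(
                "Two distinct operators must confirm before any alert is sent."
            )
        self.confirmations.clear()
        if self.mode is Mode.TEST:
            # Safeguard 1: test messages are unmistakably labelled.
            return f"THIS IS ONLY A TEST: {message}"
        return message
```

With this design, a lone operator clicking the wrong link cannot broadcast anything: a second confirmation is required, and even then the default output is clearly marked as a test.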
Importantly, these errors of user experience design are not free of consequences. It’s tempting to think that the only material effect of this mistake was some embarrassment for state officials. But let’s remember that an entire state population was, for a short time, panicked into expecting an imminent, deadly nuclear attack. That’s a pretty serious, and traumatic, consequence of poor UX design.
Concerningly, poor interaction design and outdated software exist on far more critical platforms than the Hawaii alert system. For example, the UK’s Trident nuclear weapons system continues to run on Windows XP—albeit a modified version with custom support from Microsoft.
In a world armed with nuclear weapons, there are of course also launch systems. If we had the security clearance to see them, we would probably be surprised at how primitive some of them are. Particularly in a time of tense international relations, those in a position to do so would do well to review how resilient their systems are to user error—which is really a euphemism for bad design.