The Human Factor: How Three Mile Island Could Have Been Prevented
Three Mile Island stands out as one of the most significant accidents in U.S. history, and arguably the first major disaster of the nuclear age. As one of the first and most publicized reactor meltdowns in the world, it would not be overshadowed until the Chernobyl disaster nearly a decade later.
One of the most commonly cited causes of the disaster was confusion about the state of a relief valve in the reactor's coolant system. Specifically, the indicator was designed to tell the operators whether the valve was open or closed, but it was built to report the status of the solenoid commanding the valve rather than the valve's actual physical position.
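The distinction between commanded state and sensed state can be made concrete in code. The sketch below is purely illustrative (the class and function names are invented, not drawn from any real control system): the flawed indicator derives the light from the command sent to the solenoid, while the corrected one derives it from a position sensor on the valve itself.

```python
# Hypothetical sketch of the indicator-light flaw: the light reports the
# command sent to the solenoid, not the valve's sensed position.

class ReliefValve:
    def __init__(self):
        self.solenoid_energized = False  # command state (what was asked for)
        self.actually_open = False       # physical state (what is true)

    def command_close(self):
        # De-energize the solenoid. In a fault, the valve can stick open
        # despite this command, so actually_open is left unchanged here.
        self.solenoid_energized = False

def flawed_indicator(valve):
    # TMI-style design: the light is derived from the command.
    return "OPEN" if valve.solenoid_energized else "CLOSED"

def corrected_indicator(valve):
    # Human-factors fix: derive the light from the valve's position sensor.
    return "OPEN" if valve.actually_open else "CLOSED"

# Stuck-open scenario: close is commanded, but the valve fails to seat.
valve = ReliefValve()
valve.actually_open = True  # valve stuck open
valve.command_close()

print(flawed_indicator(valve))     # CLOSED -- what the operators saw
print(corrected_indicator(valve))  # OPEN   -- the actual state
```

Under this sketch, the flawed display tells the operators exactly what they expect to see, which is precisely why the broken expectancy goes unnoticed.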
The application of human factors principles might well have prevented this problem ahead of time.
In the field of human factors, we often speak about expectancies: an individual's expectations about the way a system should work. We also speak about visibility, which is how transparent a system is to the operator, specifically with respect to the operator's understanding of the system's underlying processes.
For example, when you get in your car and turn the key, you expect the engine to start. A broken expectation often tells us that something is wrong with the system we are working with. If you turn your key and the engine doesn't start, you know something is probably wrong with your car.
Problems arise when these expectancies are broken, yet the user is unaware of the change. It is as if your engine still started, but someone had drained out all the oil without any indication, destroying your engine as soon as it ran. Now imagine you had a light that came on telling you there was no oil in your car: you'd never start the engine.
Three Mile Island's control panel had a light showing that the valve of concern was closed. The operators were so accustomed to trusting this light that they pursued every other possible explanation, even though this valve being stuck open was an obvious cause of the initial problems. Worse still, downstream information that might have hinted to the operators that something was seriously wrong was either hidden from view by poor interface design or was something the operators had not been trained to interpret.
The field of human factors works to prevent events like this from ever happening again, through the application of usability, design, and training.