At least since the first water-powered milling machines, people have depended on automation. When something can be automated, there's a good chance it eventually will be. But in some cases, improvements intended to make processes more effective and safer have the opposite effect: they erode people's ability to think critically.
Flying high on trust
A perfect illustration of people putting too much trust in machines is the vast body of research into airline pilots becoming overly reliant on computers. Stephen Casner, who has been studying this subject since 1990, was quoted in The New Yorker saying, “What we’re doing is using human beings as safety nets or backups to computers, and that’s completely backward. It would be much better if the computing system watched us and chimed in when we do something wrong.”
A person's mind wanders, and they become so reliant on automation that they trust it never to make mistakes. This can happen to anyone in any job, but pilots exemplify it perfectly. According to Casner's research, the likelihood of pilots' thoughts drifting increases with the amount of automation in the cockpit. Automation can also cause some of a pilot's skills to atrophy. All of this leads to pilots making more errors and becoming complacent.
People trust machines to do what they're programmed to do. The problem is that sometimes machines don't: a component might fail, the power source could be disrupted, the control system could develop a fault, electromagnetic interference could intrude, or the machine could be programmed incorrectly. Plenty of things can go wrong. Or someone might expect the machine to perform a job it was never intended to do, as happened recently when a driver was killed while using his car's autopilot system in a way it was not designed to be used.
The more automated a procedure is and the more comfortable people are with it, the more complacent they become. This often leads to workers paying less conscious attention to the task at hand than they should be. Complacency is responsible for a huge number of injuries and deaths each year, but complacency involving machines can be especially dangerous.
People trust in the infallibility of machines to such an extent that they often miss vital warning signs or ignore their own doubts. This happens because of automation bias—the tendency for people to favor suggestions from automated systems, even when there is contradictory (and correct) information available from other sources. It happens in hospitals, pharmacies, nuclear power plants, on airplanes, and in other workplaces all over the world.
In an ideal scenario, automation monitors human actions (like alarms that alert drivers when they start drifting off the road) and serves as the last line of defense rather than as a principal performer. But since that's not possible in workplaces where certain jobs are becoming increasingly automated, the next best thing is to make sure workers don't count on the machines' infallibility and always remain vigilant; complacency when dealing with hazardous energy or materials can lead to serious incidents.
Trusting the brain
The human tendency to blindly trust machines is why it’s so important to ensure that workers actively fight complacency. The introduction of a good human factors safety training program, combined with regular toolbox talks and refreshers on the potential danger of automation, can drastically reduce the risk of complacency-related injuries.
The right human factors training and a safety-first approach can also teach workers that questioning computers is not only allowed but necessary. Of course, most of the time the computers will be right. But when they're not, that training and a safety-focused attitude will empower workers to take safety into their own hands.