Managing human opportunities and limitations is important
Understanding and taking account of how people receive, process and utilise information are crucial for achieving safe and efficient operation. “Humans make mistakes all the time. That’s normal,” says Jens Christen Rolfsen at Safetec Nordic.
“We humans are a fantastic resource,” observed Rolfsen, a psychologist who heads the company’s leadership, organisation and safety department, when addressing a conference on human opportunities and limitations staged by the Petroleum Safety Authority Norway (PSA) in September.
“We ensure things happen safely and efficiently day after day,” he noted. “We solve new problems, we’re adaptable, we manage a lot. Now and again, however, we reveal our limitations in the form of incidents and accidents. And we must be aware of these if we’re to operate a safe system.”
Situational understanding
Rolfsen emphasised the importance of situational awareness, and how losing it is often used as an imprecise explanation for accidents.
“We can read in many reports that ‘the accident was due to a loss of situational awareness’. That explains nothing. It’s equally meaningless to say that it was due to human error. When somebody fails to grasp the situation, we must understand why.”
That requires comprehending a person’s information processing system, he said – including the way people receive stimuli via their senses, pay attention, and select relevant information through cognitive processes.
Big implications
How people receive and process information has big implications for the design of control-room and alarm systems, where the need to support changing perceptions in an ongoing process is crucial.
At a time of increased automation and autonomous processes, Rolfsen emphasised, technology development must be human-centred to ensure that decision-makers receive the right information.
“The key question is what those sitting at the sharp end need in order to maintain a correct understanding of the situation,” he said.
Probability of error
Workload affects the ability of individuals to perform, Rolfsen noted, and having either too much or too little to do could increase the probability of making errors.
This has particular relevance in control rooms where operators monitor autonomous processes. Maintaining an optimum workload becomes a key challenge where both inattentiveness and tunnel vision can arise if the level of activity is inappropriate.
“People make mistakes all the time,” observed Rolfsen. “A pilot makes an average of 3.8 of them per flight. We can’t do much about that. But what we can manage are the preconditions which increase or reduce the probability of errors, and not least picking them up and reducing their consequences.”
When workloads are low, a person’s attention may not be “turned on” – so they fail to notice that things are starting to move in another and perhaps erroneous direction. With excessive workloads, attention is weakened and people often develop what is known as tunnel vision.
Demanding
Rolfsen cited an example of such narrowing of attention, which he had personally witnessed, during simulator training of ship’s navigators where participants had to handle a demanding scenario.
These capable people were extremely concentrated on doing things right. But when the fire alarm suddenly sounded, they took no notice. It continued ringing for five minutes and 40 seconds before anyone responded.
“Imagine that – five minutes and 40 seconds,” said Rolfsen. “It’s incredible what we fail to pick up.
“When the workload rises, we deactivate our senses. Hearing is among the first to become disengaged. That’s pretty fascinating, considering how many alarms are sound-based.
“And we lose the ability to be flexible, to take one, two or three steps back and ask what’s happening now, what haven’t we grasped, and what in our surroundings have we failed to pick up.
“The consequence is an increased probability of error.”
Design and operate
Nothing can be done about the fact that people make mistakes, Rolfsen said. What can be done is to design and operate systems in such a way that they take account of human opportunities and limitations.
“People contribute, day after day, to operations in high-risk industries being executed safely and efficiently. Human limitations are well documented, and good methods exist for identifying and managing these to prevent them becoming a risk factor.
“We can apply these methods in the design and operational phases, in change and restructuring processes, and when introducing new technology.”