Exploring the impact of design, workload and other factors on performance

[Table: the impact of different performance influencing factors]

The Control Panel game involves participants playing a simple process task on their mobile phones; after each round the facilitator displays the average group responses (and fastest responders) on the screen. The primary purpose of the game is to introduce delegates to the impact of a range of Performance Influencing Factors (PIFs) on their performance. In each round a new PIF is added to the game and affects performance. Over the four rounds the participants' performance deteriorates in different ways, but not always as expected. In this blog I will discuss early feedback results and compare them against the research. We do not yet have large enough numbers to produce conclusive results, but this will come!

Practice Round & Round 1

The participants get a practice round to familiarise themselves with the controls, and then they take part in their first timed round. The task is fairly simple: press the button with the corresponding colour when the indicator goes above the red line. It is rare for people to make a slip (an average of 0.15 per person), e.g. pressing the wrong button or pressing a button when the indicator is not above the line. Response times are typically quick, i.e. under 1 second. If users do not respond within 3 seconds of the indicator going above the red line, a fire breaks out and they need to press the 'water' button. Fires are very rare in the first round (0.02 per person).
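For readers who like to see the mechanics spelled out, the scoring just described can be sketched as a simple classification rule. This is a minimal illustration under my own assumptions: the 3-second timeout is from the game description above, but the names and structure are hypothetical, not the game's actual code.

```python
from dataclasses import dataclass
from typing import Optional

FIRE_TIMEOUT_S = 3.0  # from the game: no response within 3 seconds and a fire breaks out


@dataclass
class Press:
    delay_s: float  # seconds from the indicator crossing the red line to the press
    colour: str     # colour of the button that was pressed


def classify_response(indicator_colour: str, press: Optional[Press]) -> str:
    """Classify one indicator event into the error categories used in this blog."""
    if press is None or press.delay_s > FIRE_TIMEOUT_S:
        return "omission"  # no timely response: a fire breaks out
    if press.colour != indicator_colour:
        return "slip"      # wrong button pressed
    return "correct"
```

With per-event classifications like this, the per-person averages reported above (slips and fires per round) fall out of a simple count.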
Round 2

In this round there is only one change: the buttons are switched around so that their colours are no longer placed underneath the corresponding indicators (Human-Machine Interface design). The participants are reminded that they have to press the button with the corresponding colour, but not that the button placement has changed. The results are very consistent. Over 230 people have completed these two rounds, and the average response time increases by about 0.3s. The biggest increase is in the number of times participants press the wrong button or press a button when not required (slips). In the most recent data (in the table above) these go from 0.15 to an average of 1.42 per participant. The interface design is unintuitive, and the period while users are still unfamiliar with it is the prime time for slips to occur. Fires (omissions) do not seem to increase, so people appear to quickly notice that the indicator is still above the red line and then press the correctly coloured button.

Round 3

In this round a messaging system is added. The system gives participants a number of banal messages and occasionally an emergency instruction to press the 'water' button, so they are instructed to pay attention to both the messages and the indicators (divided attention).

The surprising result is that 'slip' errors drop right down, from 1.52 to 0.58. It seems that participants are able to adjust to the buttons being in the wrong place. Their average response time increases from 1.1s to 1.55s, which is likely to be partly due to having to stop themselves from pressing the wrong button*.

The other influence is that the average number of fires per person increases dramatically, from 0.07 to 0.73. The messaging system repeats the 'press the water button' message on the screen, but it seems that many people miss it until a fire breaks out. The messages do not stand out in any way from the other messages that appear, so they are not grabbing the participants' attention. In addition, participants get no confirmation when they press the water button, so it may be that some people are not pressing it correctly and do not know! (Human-Machine Interface (HMI) feedback.)

*Note that even after testing the game many times, I still have to stop myself from pressing the wrong button.

Round 4

In this round there is an additional task to complete. A green level indicator keeps increasing, and the participants have to press the 'reset' button when it is between the two red lines; it then goes back to 0 and starts climbing again. In the current data, a participant failing to reset it before it reaches the top has been counted as a slip. In retrospect this should have been counted as an omission. This has been corrected, and future analysis will identify the true numbers.
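As a minimal sketch of that correction, assuming each logged event carries a task label and an error category (the field names here are my own, not the game's actual schema):

```python
def recategorise_missed_resets(events: list[dict]) -> list[dict]:
    """Re-label missed green-level resets as omissions rather than slips.

    A sketch of the correction described above; the 'task' and 'category'
    keys are assumptions, not taken from the game's data.
    """
    for event in events:
        if event.get("task") == "green_reset" and event.get("category") == "slip":
            event["category"] = "omission"
    return events
```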

In reality this round increases the workload to the point that some people struggle. The HMI design is such that the green indicator and the messaging system require a great deal of attention to know when action is needed. Omission failures and response times increase in this round, but so do slips. More research will confirm the real split here.

Comparing to other research

Research by Takano & Reason (1999) examined the impact of psychological biases and PIFs on human cognitive performance in a mixture of nuclear power plant incidents and simulator experiments. They found that, of the 12 PIFs they measured, excessive workload (and high job demand) affected the most errors (50.6%); inattention was next at 38%, and poor human-machine interface and low arousal were equal third at 30.4%. The Control Panel game demonstrates the impact of the combination of workload and poor Human-Machine Interface design.

Of the cognitive biases they measured, frequency bias appeared the most often (24.1%), with confirmation bias next at 16.5%. They describe frequency bias as follows: 'This bias is relating to habit intrusion, and means that human performance often can be captured by familiar behavioural patterns that occur so frequently in his experiences'. It is possible that there was an element of this in round 2, in that people will be familiar with the button being underneath the correct indicator (supported by the behaviour in the practice round and round 1). Designs that work differently from the way workers expect have played a major part in safety incidents, for example the fire in Baton Rouge, which was animated by the CSB: https://www.youtube.com/watch?v=QyIIe5T5beM

Reason (2002) identified and demonstrated that omission errors are more likely under certain conditions (eight of them), including when memory demands are high, when steps lack conspicuousness, and when steps are not part of the main goal. He notes that 'where two similar steps are required to achieve a particular goal, it is the second of these two steps that is most likely to be neglected'. This could certainly be the case in our study, where the main goal was pressing the buttons as quickly as possible and similar secondary steps were then added, but perhaps not deemed as important.

Accessing Control Panel

The Control Panel game allows the facilitator to demonstrate these PIFs through an engaging experience. It opens up discussion on a range of topics. The screen below shows an example of feedback that the game gives after each round.

The game is available for consultants to rent per training course and for companies to rent on a monthly or yearly basis. Please contact info@caspianpsychology.com for more information.

References

Reason, J. (2002). Combating omission errors through task analysis and good reminders. BMJ Quality & Safety, 11, 40-44. https://qualitysafety.bmj.com/content/11/1/40

Takano, K. & Reason, J. (1999). Psychological Biases Affecting Human Cognitive Performance in Dynamic Operational Environments. Journal of Nuclear Science and Technology, 36(11), 1041-1051. DOI: 10.1080/18811248.1999.9726296
