A psychological perspective does not start from the assumption that people are fundamentally irrational. Rather, it emphasises a different logic: one suited to the challenges we evolved to meet. For much of our evolutionary history, the environment differed markedly from the modern business world, and we developed a range of cognitive mechanisms to cope with adverse conditions in which resources were scarce.
These mechanisms include a range of simplifying and confidence-sustaining mental short cuts (heuristics) that help us to make quick decisions when pausing to undertake a full analysis would be unwise. While these ways of thinking are not the same as rigorous logic or formally rational reasoning, they are well suited to fast-paced intuitive judgements and actions. However, these evolved modes of thinking also create some major traps.
The use of heuristics
As decision makers, none of us has infinite resources or time to devote to gathering and analysing information. In addition, we all have significant limitations to the amount of complexity we can cope with. Thus, even where we make conscious efforts to make decisions according to a formally rational process, we often need to make simplifying assumptions and accept limits on the availability of information and the thoroughness of our analysis. As noted above, we constantly use heuristics as a way of reducing the complexity of decision making: for example, associating a particular brand with quality rather than engaging in a detailed evaluation of the merits of different breakfast cereals or clothing stores.
Many of these are entirely unconscious. They are often useful, but also lead to some significant biases in our decision making. Some of the most important are:
- framing the problem
- using information
- problems of judgement
- post-decision evaluation
Framing the problem
The way in which a problem is framed can have a significant effect on how you make decisions. Medical decisions can be affected by whether outcomes are framed as likelihood of deaths or of saving patients. Financial decisions can be affected by whether you see yourself in a position of loss or gain. In a position of gain we tend to become risk averse; in a position of loss we will tend to take risks to avoid or recover losses. You may know people who are good at using this to their advantage; they exert influence by framing choices so that others will choose the option they prefer.
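This pattern of risk aversion for gains and risk seeking for losses is the core insight of prospect theory. As a rough illustration only (the value-function form and the parameters alpha = 0.88 and lambda = 2.25 are assumptions drawn from Tversky and Kahneman's 1992 estimates, not from this article), a short Python sketch shows how the same gamble is evaluated differently depending on whether it is framed as a gain or a loss:

```python
# Sketch of a prospect-theory value function. The curvature (alpha)
# and loss-aversion (lam) parameters are assumed values, taken from
# Tversky and Kahneman's 1992 estimates, not from this article.
def value(x, alpha=0.88, lam=2.25):
    """Subjective value of a monetary gain (x >= 0) or loss (x < 0)."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

# Gains frame: a sure $50 versus a 50% chance of $100.
sure_gain = value(50)
gamble_gain = 0.5 * value(100)
# The concave value curve makes the sure gain feel better: risk averse.

# Losses frame: a sure $50 loss versus a 50% chance of losing $100.
sure_loss = value(-50)
gamble_loss = 0.5 * value(-100)
# The convex curve for losses makes the gamble feel less bad: risk seeking.
```

Under these assumed parameters, the sure thing is preferred in the gains frame while the gamble is preferred in the losses frame, even though the expected monetary values are identical in both cases.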
Framing effects can be quite subtle and even affect our recall of events. For example, in one study, groups of students were shown a film of a car accident. Each group of students was shown the same film clip and then asked ‘How fast were the cars going when they ---- each other?’ where ‘----’ was a different word for each group, variously ‘smashed into’, ‘collided into’, ‘bumped into’, ‘hit’ and ‘contacted’. The table below shows the average speed estimated by each group.
| Verb | Mean estimate of speed (mph) |
|---|---|
| Smashed | 40.8 |
| Collided | 39.3 |
| Bumped | 38.1 |
| Hit | 34.0 |
| Contacted | 31.8 |
Those who were asked the ‘smashed’ question were also more likely to believe they had seen broken glass in the film clip than those who were asked the ‘hit’ question. There was no broken glass.
Using information
Our use of information is often biased in important regards. First, we pay more attention to information that is easily available. Second, we overweight memories which are more easily retrievable – usually because they are emotionally vivid or have personal relevance. We pay selective attention to information, often in a self-serving way. We will often give greater weight to information which shows us in a favourable light (self-serving bias), or information that supports an already established point of view (confirmation bias).
For example, in some research that colleagues and I carried out into the decision making of traders in investment banks, one trader told us:
‘I spend time talking to a lot of people; consultants, other traders on the desk, in the markets, finding out what people are doing. I am always absorbing information. ... I like to find people who have the same thought processes as me.’
This trader may have been suffering from confirmation bias: unconsciously avoiding people who might offer views too different from his own.
Problems of judgement
We are constantly bombarded by information. Simply walking through a room risks flooding us with more sensory information than we can possibly process. Stop for a moment and consider all the different things you can see, hear, smell, or feel.
Which of them do you usually tune out? From birth we start learning to filter information out and to prioritise, label and classify the phenomena we observe. This is a vital process. Without it we literally could not function in our day-to-day lives. In our work lives, if we did not filter information and discard options we would suffer from analysis paralysis: the inability to make any decision in the face of the complexity and the ambiguity of the real world.
However, this filtering comes at a cost and introduces some significant biases into the judgements we make. One is overconfidence: we tend to be unduly optimistic about estimates and judgements that we make and filter out of our awareness many of the sources of uncertainty. Another problem is our tendency to be swayed by how a problem is framed.
Many decisions need revisiting and updating as new information becomes available. However, most of us adjust insufficiently from our initial anchor: we tend to fail to update our judgements as the environment changes. Once a manager has made an initial decision or judgement, this provides a mental anchor which acts as a source of resistance to reaching a significantly different conclusion as new information becomes available. It is what happens when one has made a snap judgement and then disregards feedback that is inconsistent with this position.
This bias can affect judgements about people as well as technical judgements. Making early judgements about someone, for example in a job interview, may put you in an anchored position and later information may come too late to shift your opinion.
For most normally functioning people, maintaining self-esteem is an important internal goal. This can cause us to filter out or discount information that might show us in an unfavourable light. This is what lies behind the self-serving attribution bias: the tendency to attribute good outcomes to our own actions and bad outcomes to factors outside our control. While such defences against loss of self-esteem can be helpful to the extent that they help us persist in the face of adversity, they can reduce learning and reduce opportunities to take corrective action.
Another important internal goal is to maintain a sense of control over events and our environment. In consequence, a common way in which we distort our understanding of events is to assume we have greater control of events than we really do. When we suffer from this illusion of control, we are likely to underestimate the risks of our actions and decisions, and have problems in learning from experience, as we discount information that suggests we are not in control.
This psychological perspective sees people as driven to achieve cognitive mastery of their environment. It is essential to be aware of, and try to counteract, the biases inherent in our coping mechanisms.
Car accident study sourced from EF Loftus and JC Palmer, ‘Reconstruction of automobile destruction: an example of the interaction between language and memory’, Journal of Verbal Learning and Verbal Behavior, 1974.
About this article
This article is based on extracts taken from the Open University Business School course Making a Difference (B830).