
Introducing consciousness


4.5 The function of consciousness

There is another problem I want to mention briefly. What is the function of consciousness? What difference does it make to have phenomenally conscious experiences?

This may seem an odd question. Surely, the answer is obvious: the function of consciousness is to provide us with information about our environment – about colours, shapes, sounds and so on. But this is too swift. We do not need to have conscious experiences in order to acquire perceptual information about our environment. Cog's sensors provide it with information about colours and shapes and sounds, too – it is just that this information does not have a phenomenal character to it. What is added by supplementing this information with phenomenal character? We can put the same point in terms of the distinction between access-consciousness and phenomenal consciousness. It is obvious why it is useful for a creature's experiences to be access-conscious – to be available to the processes controlling reasoning and behaviour. But why is it useful for them to be phenomenally conscious, too?

It might be suggested that the phenomenal character of an experience affects our reaction to it. Pain, for example, not only tells us that our body has been damaged, but also induces us to react to the damage in a certain way. If I touch something hot, then the pain moves me to withdraw my hand. Smells, tastes and colours also provoke characteristic reactions. Again, however, this is too swift. For a sensory state could cause a reaction without having any phenomenal character. As I mentioned earlier, it would be possible to program Cog to take avoiding action when it detects damage to itself – so that if someone pokes it in its eye, for example, it registers the fact, withdraws its head quickly and says ‘Ouch!’. Yet it might still not actually feel anything – not have any conscious sensations of pain. So what is the point of consciousness? Provided Cog reacts to damage in the right way, why need it feel pain as well?

There is a general problem here. Whatever effects a conscious mental state has, it seems, a non-conscious one could also have. (‘Conscious’ here means ‘phenomenally conscious’ of course.) So why did evolution equip us with conscious experiences, rather than non-conscious ones? What survival advantage does phenomenal consciousness confer? Does it do anything at all? Or is it just a by-product of other processes, like the exhaust from a car's engine, which does not play any useful role?

This problem is closely connected with that of providing a reductive explanation of consciousness. Reductive explanations of mental phenomena typically exploit the fact that mental states can be characterised in functional terms – in terms of the role they play in mental processing and behavioural control. If a mental state can be characterised in this way, then we can identify it with whatever brain state plays the role in question. But if consciousness does not have a function, then this approach is a non-starter.

You may be feeling that something must have gone wrong here. Surely it is absurd to suggest that consciousness has no function – that the painfulness of a pain makes no difference to its effects. The suggestion is certainly counterintuitive; but that does not necessarily mean that we should rule it out. Even our strongest intuitions can mislead us (it seemed obvious to our ancestors that the earth was flat and that the sun moved through the sky), and we may have to escape the confines of our familiar outlook if we are to understand consciousness.