Laurie Taylor:
If you're all sitting quietly - here's a rather wonderful poem by Emily Dickinson:

[Image: 'yesterday was tomorrow's last week' by Shirin K. A. Winiger, used under a CC-BY-NC-SA licence]

The Brain - is wider than the Sky -
For - put them side by side -
The one the other will contain
With ease - and You - beside -
The Brain is deeper than the sea -
For - hold them - Blue to Blue -
The one the other will absorb -
As Sponges - Buckets do -
The Brain is just the weight of God -
For - Heft them - Pound for Pound -
And they will differ - if they do -
As Syllable from Sound -

The Brain is wider than the sky. Let's consider that first verse. Well, it's got one obvious meaning, and I'm just going to quote an interpretation of that first verse:

"It is a celebration of the sheer scale of the brain, its ability to encompass the entire world…" and then the writer goes on, "But there is a twist in that verse. Not only does the brain include the whole sky, it also includes the mind - the 'You' or self that perceives the sky and can think about it. It is this mind that makes the brain wider than everything."

And those words of appreciation come from a new book by author and Sunday Times journalist Bryan Appleyard, who enjoys their resonance so much that he's nicked the first line of the poem for his title: The Brain Is Wider Than The Sky, subtitled Why Simple Solutions Don't Work in a Complex World.


Bryan now joins me in the studio together with John Gray, the author of Straw Dogs and Emeritus Professor of European Thought at the LSE.

Let's just concentrate a little, Bryan, to begin with on that poem. Just expand on the meaning of it, why you love it so much.

Bryan Appleyard:
Well, first of all I love Emily Dickinson, and the extraordinary thing about her was that she lived this very sequestered life in Massachusetts; one theory suggests she had epilepsy, which was considered rather unrespectable at the time.

And I like it because very quickly, in the first line almost, it gets you there - to this mystery of how the mind seems to create a world, to encompass the world and itself, which is obviously what human consciousness does.

The first two verses say the same thing in different ways; the third verse is much argued about. Sometimes I think it means one thing, sometimes another, but it's precisely about the puzzle of what it is to have a mind, to have consciousness.

Laurie Taylor:
This does introduce the whole idea of consciousness - of our capacity for being aware of ourselves - as something which is, or has been, inexplicable. This is what you call the hard problem of consciousness, isn't it?

Bryan Appleyard:
Yeah, the hard problem, I suppose in one sense, is simply how matter becomes mind in the first place; but it's also how, having become mind, it has become self-conscious - so there's a sort of double hard problem.

And I became very interested in this as an aspect of a wider issue in our time. The reason we're so interested in it is that neuroscience has suddenly raced ahead of the other sciences in fashionability, and that's primarily because of the fMRI machine, which can track blood flows in the brain in real time.

I had an fMRI scan as part of this process and it didn't work, because I had to read that poem and I got no emotional response whatsoever.

Laurie Taylor:
Would you accept the idea that there's going to be a material basis for consciousness? I mean, you may want to say that perhaps these MRI scanners can't find consciousness, can't stick a pin on it as some sort of territory of the cortex - can't do that - but nevertheless, when an explanation of consciousness is produced - unless you want to say it's inexplicable - it is going to be a material explanation; it's one that's going to be to do with neurons and cells and synapses and so on?

Bryan Appleyard:
I don't know, is the short answer to that. I mean, personally, there's a perfectly logical reason for saying it's impossible - it's perfectly rational to say it's impossible for the mind to take a picture of itself, as it would be for a camera to take a picture of its own film.

But I do know that what we're looking at now is very, very basic. I think Colin Blakemore said it's 'like looking at the cosmos through Galileo's telescope' - it's very, very primitive and we're not seeing fine detail at all. These are low-resolution scans, effectively.

Laurie Taylor:
John Gray - are you a materialist in this respect?

John Gray:
No, I'm not - at least, I'm certainly more agnostic than I could be if I were a materialist - and for some of the reasons that Bryan explains in his book.

I mean, one important argument in his book is that reductionist programmes in science - that's to say, programmes which aim to explain some phenomena or structures or things in the world fully and exhaustively by reference to something which is simpler or more rudimentary: a few elements or principles, or ultimately maybe only one such element or principle - that those programmes don't seem to work in lots and lots of contexts, in that you can put together a number of simple elements and something completely different comes out, in ways that can't be reduced in that way.

So it could be that mind or consciousness is like that, with respect to matter. I mean it might be that if there were no matter there'd be no mind. But it could still be the case that mind is not reducible to matter. And that's a sort of view that I would be inclined to think was plausible.

Laurie Taylor:
The reason you want to cite consciousness and draw our attention to the difficulty of explaining it or finding a material basis for it is because you want to do something to undermine the grandiose claims that are being made about what computers are going to be able to do.

Particularly you refer to this concept of singularity - perhaps you'd better explain what singularity means before you demolish it.

Bryan Appleyard:
It's troublingly similar to the American fundamentalist idea of the Rapture. In this view our technology is increasing exponentially and it will converge at some point - 2045, I think - when we'll build the last machine we ever need to build. It will be a hyper-intelligent computer which will be able to make itself more intelligent, and that will sort of solve all our problems.

Now, in some versions we download ourselves - upload ourselves - into this computer and become effectively immortal, or it simply makes us medically immortal as we are.

This sounds very extreme and slightly crazy, but in fact it's quite a conventional thought in large areas of Silicon Valley and so on - that we are indeed going to attain this machine consciousness.

And it underpins a lot of the way consumer computers are going - the way they're becoming more intimate to us, so in smartphones, GPS devices and everything, they're tracking us and sort of borrowing our - stealing our - information. And I think that tendency is the other side of the neuroscientific equation: these machines are closing in on us.

Laurie Taylor:
But in terms of solving problems - accepting data, interpreting data and producing solutions - we might be moving towards a stage where computers can do far more than they can at the moment. Aren't we in some danger of being, if you like, almost Luddite if we continually say 'oh, it's all very well that they can do that - they can run the world, they can manage us, they can do the economy, they can sort out the trade balance, they can do all that, but they don't have consciousness'? I mean, don't we sound a little bit old-fogeyish, clinging on to this one little tiny bit of human difference that we've got, and saying 'look, you think you've got it beaten, you think you're being clever with your wonderful computers, but we've got consciousness'?

John Gray:
Well, it might not only be consciousness that is lacking from the computers. I mean, it may well be lacking, and that might be crucial, as Bryan says, but it might also be the case that anything which can't appear in consciousness, or which can't be broken up into bits of information by conscious beings such as ourselves, is missing as well.

And one of the things I'm bothered about, which is also discussed in Bryan's book, is that many of the sources of human creativity are not conscious, and many of the sources of the meanings in our lives come from areas of our mental life that are not conscious and in some cases perhaps can't even be made conscious.

So one of the oddities about this idea - which people who call themselves trans-humanists have taken up as a serious practical project: a higher type of intelligence embodied in a machine, or in some further enhanced version of ourselves, that is designed by us, or perhaps in collaboration with the machine - is that anything which goes into that design, the design of a higher human being, would only be the part of ourselves that we can process as information; the parts that we can't will inescapably be missed out.

So we might end up with a being, a creature - a half-human, half-machine, or whatever - which is in some senses superior to us, but it may lack the parts of ourselves that make our lives worth living.

And so we might then ask, having created it, why should we make way for it, if the life it leads is meaningless to us?

Laurie Taylor:
So you're talking about parts of the human character being excluded, rather in the way they are when we're attempting to have a conversation on the phone with a computer which forces us into 'yes', 'no' or ridiculously restricted answers?

John Gray:
Yes, yes exactly.

Laurie Taylor:
But I mean this is on a grand scale. We're talking about emotion as well here, aren't we? We're talking about the way in which this notion of singularity or whatever pays no attention to the relevance or significance of emotion in our lives?

Bryan Appleyard:
Well, exactly. I mean, you mention these call trees, as they're known - when you talk to them you have no human communication with these things, none of the unspoken cues, even the unconscious cues, that come from hearing tone of voice.

One of the reasons I wrote this book was that I got so angry about one of these things - and I realised that what it was doing was desperately simplifying me so I could be machine-readable. It was stripping away all the things - the things that John refers to - stripping away my personality, just in order to get me to buy something.

Laurie Taylor:
Yeah, but if I'm getting an insurance policy, I don't know whether I want all sorts of aspects of my personality to come into it.

I mean, I'd like the phone system to get better, but that's what I'm asking for - not for it to be abolished, am I, John? I mean, aren't I complaining about its inefficiency rather than the idea that it would be quite nice to have machines...

John Gray:
But it's you, Laurie, who's complaining, not some future enhanced replica of you which sees nothing wrong in this tree system.

In other words, what's complaining is a highly complex mixture of conscious and unconscious emotion and thought - immensely more complex, according to Bryan (and I think he's absolutely right here, and so is Emily Dickinson), than we ourselves can conceive. That's the key. In other words, anything that we construct or design - in terms of an improved version of ourselves - will be based on our present limited understanding of ourselves.

Now, even if, having designed it, that machine can then go on to design itself to be even higher, it would still have missed out what we haven't put into it.

Bryan Appleyard:
Yeah, it might go off in a completely different direction and reject us.

Laurie Taylor:
Is it a question of either/or? I mean, if we go back to the 18th and 19th centuries we find Romantic poets objecting to progress on rather similar grounds to the ones you're describing: 'we're going to lose all the emotion, it's going to become a soulless world' - 'getting and spending, we lay waste our powers'...

Bryan Appleyard:
We do have the technical means.

Laurie Taylor:
Well, you want to - I mean, but Wordsworth was saying that the trains arriving in the Lake District were an indication, really, that all the mystery and wonder of life was going to...

Bryan Appleyard:
Well, they left us with a great legacy: we preserve our environment to a large extent - we have national parks and things - that was a great legacy of that position. So in a sense they won; I mean, Ruskin was as powerful as Robert Stephenson in that sense.

So I mean that legacy, I think, does feed through into what I'm saying, and it's very important. And indeed, one of the important points in the book is about art and the phenomenon of neuro-aesthetics, where they're trying to use neuroscience to probe our reactions to art. Now, it's not really working, to be honest, but...

Laurie Taylor:
It sounds like Huxley's phrase about 'trying to prove the world is round musically' - I mean, it's really a sort of incompatibility of spheres, isn't it?

Bryan Appleyard, John Gray, thank you very much.

Actually, my most recent attempt to use one of those systems foundered this week. I spent half an hour on the screen putting in passwords and my mother's maiden name to get some insurance, and then it said: 'We are unable to offer insurance as your property does not match the particulars of any existing property.'

There you go. Who said machines weren't logical?

This discussion was originally broadcast as part of Thinking Allowed on BBC Radio 4 on Wednesday, 16th November 2011.