Thursday, April 23, 2009

The Two-Dimensional Mind


A world leader in the field of neuroscience, Professor Susan Greenfield has devoted her life to studying the 1.5kg of tissue that makes each one of us who we are. She talks to Eimear Vize about her fears that incessantly escaping into cyberspace may be corroding our sense of who and what we are.

Baroness Susan Greenfield wants to warn us about the loss of personal identity that threatens our screen-addicted teen generation.

It’s not the first time that someone has held aloft the shiny boxes of IT and game technologies and warned that no good will come of them. But one tends to take notice when a brilliant neuroscientist preaches the dire prediction that computer games and virtual lives could drastically alter our brains, and impact on how we interact and think, leading to a loss of empathy and imagination, even blurring the cyber world and reality with ominous consequences.

Susan Greenfield calls it the Nobody Scenario. By spending inordinate quantities of time in the interactive, virtual, two-dimensional realms of cyberspace, she maintains, we are destined to lose an awareness of who and what we are. This increasing addiction to technology could limit our individuality: “not someones, or anyones, but nobodies”, she warns.

A renowned and at times provocative British scientist, writer, broadcaster, and member of the House of Lords, Baroness Greenfield kicked up a media storm in the UK and the USA recently following concerns she expressed during a House of Lords debate, in which she said that social networking, including Twitter, Bebo and Facebook, as well as computer games, might be particularly harmful to children, and could be behind the observed rise in cases of attention deficit-hyperactivity disorder and autism.

Her controversial theories are fleshed out in her new book - ID: The Quest for Identity in the 21st Century (Sceptre, 2008), which emphasises the plasticity of the brain and its vulnerability to these invasive new screen-based technologies.

Susan is concerned that, for the first time in human history, individuality could be obliterated in favour of a passive state of reacting to a flood of incoming sensations, one that could shift the landscape of the brain into one where personalised brain connectivity is either not functional or absent altogether.

Given the time young people spend glued to TV and computer screens – estimated to be six hours daily by the 2009 ChildWise survey in the UK – the Baroness believes the minds of the younger “screen” generation are developing differently from those of previous “book” generations.

But which concerns her most: the subject matter – mindless blasting of zombie fiends on game consoles, desensitising the youth of today? – or the method of delivery?

“It’s the method of delivery I think, hours spent in front of screens that could be spent in conversation, or socialising, or reading a good book. But what surprises me, as I said in the book, is that this is the first time grown-ups are playing these computer games.

“Normally you play charades or you play bridge or something, but that’s a means to an end, it’s usually a social thing, but to play a game on your own as an adult, as a matter of choice to spend six hours a day playing World of Warcraft, for example, I find very…” she breaks off, searching for the right adjective, then adds, “especially that people have the time to do this of course, what a waste of time, yeah? Also socially it’s completely devoid of any value whatsoever; it worries me that adults are playing these games by themselves.”

The plasticity of our minds – “not just young minds, it’s older adult minds also” she emphasises – is a central theme in her latest book. “Although neuroscientists are very familiar with this, not many in the general public perhaps have realised that each one of us has a unique brain because each individual brain, even if you’re a clone, an identical twin, will have unique configurations that are driven in turn by the unique experiences that people have. The brain adapts rather quickly to the environment it’s in. If you have an environment that’s different then the brain will adapt accordingly. Interacting only with the screen, and doing so in a solitary way, mandates you living in two dimensions,” she says, sitting in her office at Oxford University where she is Professor of Pharmacology, and also heads a multi-disciplinary group researching brain mechanisms linked to neurodegeneration.

The first female director of the Royal Institution of Great Britain, Susan’s credentials as someone able to bridge the gap between scientists and non-scientists - one of the main reasons the Royal Institution was originally established - are impeccable. Her research career for the past decade has run parallel with another as prolific writer and publicist for science. She has appeared on television as host of the BBC documentary series Brain Story, in glossy magazines including Hello! and on innumerable public platforms. Her books include Journey to the Centres of the Mind (1995), The Human Brain: A Guided Tour (1997), The Private Life of the Brain (2000), and Tomorrow's People: How 21st Century Technology is Changing the Way We Think and Feel (2003).

A star media boffin, Susan is the public face of science for many in the English-speaking world. When her latest book ‘ID’ was published last year the angle adopted by many reviewers was that galloping technological advances have pushed humankind to the brink of a mass identity crisis. In her own words, however, Susan qualifies her concern, stressing that we are allowing ourselves to ‘sleepwalk’ towards this potential transformation: “It’s only a crisis if we allow ourselves to sleepwalk into it, whereas it could just as easily offer huge opportunities for us as well as a crisis,” she remarks.

Her primary objective in questioning this “screen life of two dimensions” is to throw open the door to debate on this issue and get the scientific community and general public, finally, exploring what may be lost or gained for humanity from their increasingly frequent interface with technology.

So, are we on a virtual journey towards irrevocably losing certain mental skills that have taken millions of years to develop?

“Well that’s what I think should be found out,” she answers. “I think the Government should have a look into this. I think there are winners and losers cognitively, and all I ask is that people should study that more and evaluate it in a systematic way so that you can actually see what people do better, what people do worse, then we can decide how we harness these things. But at the moment it’s anecdotal,” she concedes.

Susan highlights a particular issue that she says worries her “hugely”: the threefold increase over the last ten years in cases of Attention Deficit Hyperactivity Disorder and the rise of Ritalin prescriptions. “And I just wonder, could it be due to a ‘screen world’ mandated by a short attention span. That’s only a theory of course, but shouldn’t we be looking further into this?

“What we really need first is to decide what we want our kids to learn. What values, talents and skills do you want them to have? That they are very good at solving abstract problems, like in an IQ test, or is it that they are creative, or do you care more about how imaginative they are? Do you care about how well they read, or do you care about how many languages they speak, or do you want them to do science and have a natural curiosity about experiments?

“Until you answer those questions, how can the computer technologist design the software for you, yeah? You have to answer those things first.”

While the South Australian Government's thinker in residence in 2004 and 2005, Susan was on a mission to get scientists out of their ivory research towers and talking to the media, to the private sector, to politicians, to educationalists in order to contribute to mainstream life in the 21st century. “Science is touching everything we do. We need to do this,” she says.


One of her projects in Adelaide was to get neuroscientists and educationalists to work together to explore how being 'people of the screen' might be different from 'people of the book', as she refers to older generations.

The question is: can we map those differences?

“I think there could be but you’d have to do it systematically. Certainly generationally, people like me, I was born in 1950, are obviously people of the book, yeah? Whereas there are certain people alive now who are quite articulate and coming up to adulthood, who will always remember computers. So I think one could actually compare and contrast in large groups, what we can and can’t do, and what they can and can’t do, it might be quite interesting. Neuroscientists, computer experts and educationalists should work closely together, bankrolled by the Government.”

But she readily expresses reservations about whether modern imaging technologies would be able to assist in this mapping exercise. “I’m one of the cynics about imaging because it only shows you bits of the brain. It’s a bit like saying the monitor light in your iron is on, it doesn’t really tell you much more one way or another. Apart from perhaps that brains may be working in a different way it doesn’t say what’s happening.

“It’s a bit like those old Victorian photographs where the exposure time was so slow that you would see the buildings but you wouldn’t see the people, so ongoing steady states can be picked up because the time window, the exposure, is greater than a second. And we know the brain works on thousandths of a second, so if you want to see ongoing sustained, protracted situations then you can - of course it helps with diagnosis of tumours and this sort of thing - but not with understanding exactly what’s happening in the brain.

“My own subjective view about what is important is that people should be fulfilled, and that of course will change from century to century. What gives you personal fulfilment? I think we could, especially with the technologies available and coming on-stream, be lured into a rat race where you want to be smarter than other people, like with these cognitive enhancers, and it begs the question why do you want a better memory, what’s that going to do for you?

“I can see an increase in the appeal of smart drugs, which I think is very sad because why must you feel inadequate about a thing that the brain does anyway, brilliantly, which is to learn and adapt?

“But it appears we all want everything in an instant. I was horrified to hear on the radio someone say that books were cognitive enhancers and pills were quicker routes,” she comments, aghast. “I mean that’s just… we’re living in a world now where everything has to be fast and instantly delivered. I heard on the BBC recently: ‘your revision getting you down? Here it is in bite-sized chunks’, and I felt like saying that’s wrong, you don’t want to atomise facts, you want to put them into another bigger context, to correlate one thing to something else.”

Susan’s writings have, to varying degrees, criticised the Technophobe, the Technophile and the Cynic. So what would she advocate? “What we need is a techno-savvy populace who say, yes, we have this technology, now how can we use it? How can we make the most of it? We should ask ourselves how we could stave off the more worrying excesses of the new technologies that could erode our human individuality. It starts with this awareness, asking these questions.”
