Mind Change: How Digital Technologies Are Leaving Their Mark on Our Brains

Release Date: February 11, 2015
Publisher: Random House
“Anything we practice repeatedly changes the brain; fixate on iPhones and similar screens, and we become better at staying helplessly glued to them.”

For eons humans have adapted splendidly to every niche on the planet. In the digital era our brains have likewise been adapting to an increasing bombardment of data, and the results are decidedly mixed. So says Baroness Susan Greenfield, a much-decorated neuroscientist and former director of Britain’s Royal Institution.

What she says should give us pause: in her view, the known benefits of screen technology do not compensate for its more numerous negatives.

Dr. Greenfield’s perspective begins at the microscopic level, with the configurations of brain-cell connections unique to each of us that yield a singular mind. It is these idiosyncratic associations that give personal significance to the people, objects, and events in one’s life. Yet the digital world’s influence on identity is merely one of four grand themes that she unpacks in the rich pages of Mind Change.

Currently, external forces shape identity more than an internal sense of who one is. A potential audience is always in mind, as is its approval. Research shows that around age 11 youngsters undergo a personality split, inventing online personas that indulge in behavior their real-life selves might consider inappropriate. Uninhibited by the feedback inherent in face-to-face interactions, they can be meaner, more opinionated, reckless, and sexually adventurous. Yet this liberating abandon is paradoxically paired with a relentless self-monitoring in which everything these youngsters do is judged by whether or not it is worthy of sharing on social media.

The need for affirmation is ancient, but in the digital world it has produced a milieu in which accomplishment and talent are no longer the measures of self-worth. What matters is how many followers you have, how much feedback and attention you can garner in cyberspace. It’s about numbers and the ultimate desire to “go viral.”

Instead of experiencing life, you engage in a performance of life before a critical, vigilant audience. And so the digital world brings out the worst of human nature: a shallow desire for status and recognition irrespective of merit. The rampant growth of selfies is indicative of this narcissism.

The effect of digital devices on memory and learning is another theme. Teachers and parents complain of digital natives’ limited attention spans and their inability to synthesize the information they cut and paste from the Internet. They cannot connect the dots, cannot turn that information into knowledge, let alone advance to wisdom.

Distracted minds learn poorly because learning takes time. Knowledge cannot be downloaded to a jacked-in brain as in The Matrix. Sustained, focused attention is required to understand anything in depth—which is precisely what most digital natives lack.

Heavy reliance on search engines, Greenfield says, and surfing rather than delving into a subject are inducing physical brain changes and altering the way we think. When you can always look something up, there is no need to remember facts. Indeed, studies show that subjects learn facts less well when they believe the information will be accessible later. Google is wiping out factual memory, the dots to be connected over the course of a lifetime, just as calculators earlier eroded the ability to do arithmetic in one’s head.

Decreased empathy and emotional intelligence are associated with a screen world in which you never look anyone in the eye and no one looks back at you. It takes practice to know oneself accurately, to read others via tone, body language, and expression, and to discern their state of mind. Emotionally numbed and isolated from live feedback, the digitally immersed fail to see how they come across. They must be told when their words or actions are hurtful. Even then, people eschew conflict resolution in person, preferring to “apologize” via text.

Evidence suggests that obsessive gaming leads to greater recklessness and an increasingly aggressive disposition, perhaps because games suspend consequences, reduce a complex world to a cartoon sketch, and remove impediments that abound in real life. Whether screen exposure similarly leads to autistic-like behaviors is inconclusive yet suggestive.

Greenfield cites a link between early TV watching and autism, and notes that autistic individuals have long been known to be more comfortable in cyberspace than in the real world.

Lastly, there is the cultural clash between digital natives, who have never known a world without the Internet, and older “digital immigrants,” who still struggle with the data deluge or prefer to resist it and its culture of over-sharing. At one extreme is George Clooney, who says, “I don’t like to share my personal life . . . It wouldn’t be personal if I shared it.” At the other is Facebook founder Mark Zuckerberg, blind to his arrogance and blithe in arrogating from millions a prerogative that is not his: “We decided that [sharing everything] would be the social norm now and we just went for it.”