Maggie Jackson: Technology, distraction, digital health, and the future

About Maggie Jackson

Photo credit: Karen Smul

Maggie Jackson is an award-winning author and journalist known for her writings on technology’s impact on humanity. Her acclaimed book Distracted: Reclaiming Our Focus in a World of Lost Attention was compared by Fast Company magazine to Silent Spring for its prescient warnings of a looming crisis in attention. The book, with a foreword by Bill McKibben, will be published in a new updated edition in September.

Jackson’s articles have appeared in The New York Times, The Wall Street Journal, and the Los Angeles Times, and on National Public Radio, among many other outlets, and her work and comments have been featured in media worldwide. Her essays appear in numerous anthologies, including The State of the American Mind: Sixteen Leading Critics on the New Anti-Intellectualism (Templeton, 2015) and The Digital Divide (Penguin, 2010).

A former Boston Globe contributing columnist, Jackson is the recipient of Media Awards from the Work-Life Council of the Conference Board, the Massachusetts Psychological Association, and the Women’s Press Club of New York. She was a finalist for the Hillman Prize, one of journalism’s highest honors for social justice reporting, and has served as a Visiting Fellow at the Bard Graduate Center, an affiliate of the Institute for the Future in Palo Alto, and a University of Maryland Journalism Fellow in Child and Family Policy. A graduate of Yale University and, with highest honors, the London School of Economics, Jackson lives with her family in New York and Rhode Island.


How can technology facilitate a healthy work-life balance? 

I believe that the crucial question today is improving the balance between digital and non-digital worlds

Over the last 20 years, technology has radically changed the human experience of time and space. Distance no longer matters much, nor duration, as devices allow us to fling our bodies and thoughts around the globe near-instantly. While on a business trip, a parent can Skype a bedtime story with a child at home. The boss can reach a worker who’s hiking on a remote mountaintop. Technology has broken down cultural and physical boundaries and walls – making home, work, and relationships portable. That’s old news now, and yet we’re still coming to grips with the deep impact of such changes.

For instance, it’s becoming more apparent that the anywhere-anytime culture isn’t simply a matter of carrying our work or home lives around with us and attending to them as we wish. It’s far from that simple. First, today’s devices are designed to be insistent, intrusive systems of delivery, so any single object of our focus – an email, a text, a news alert – is in competition with others at every moment. We now inhabit spaces of overlapping, often-conflicting commitments, and so we have trouble choosing the nature and pace of our focus.

The overall result, I believe, is a life of continual negotiation of roles and attentional priorities. Constant checking behavior (polls suggest Americans check their phones up to 150 times a day on average) is a visible symptom of the need to rewrite work-life balance dozens of times a day. The “fear of missing out” that partly drives always-on connectivity is also a symptom of the necessity of continually renegotiating the fabric of life on- and off-line.

Because this trend toward boundary-less living is so tech-driven, I believe that the crucial question today is improving the balance between digital and non-digital worlds. After that, work-life balance will follow. 

We need to save time for uninterrupted social presence, the kind that nurtures deeper relationships. We urgently need space in our lives where we are not mechanically poked, prodded, and managed – that is, when we are in touch with and able to manage our inner lives. (Even a silent phone in “off” mode undercuts both focus and cognitive ability, according to research by Adrian Ward at the University of Texas at Austin.)

One solution would be to think more deliberately about boundaries in all parts of our lives, but especially in the digital sphere. Too often lines of division are seen as a confinement, a kind of archaic Industrial Age habit. But boundaries demarcate; think of a job description, a child’s bedtime, or the invention of the weekend, a ritual that boosts well-being even among the jobless. Boundaries are systems of prioritization, safety zones, and structures for depth – crucial tools for ordering life in a digital age. A family that turns off its cell phones at dinner is creating opportunities for the kind of in-depth bonding that is rarely forged online.

Technology can help facilitate creative boundary-making – think of the new Apple and Google product designs that prompt offline time. But our devices cannot do the work of inventing and managing the boundaries that are crucial for human flourishing. 


Can you tell us about your new book?

My new book will draw from my research into how technology is changing our ideas of what it means to “know” something and what it means to be smart

I have a couple of book projects on the front burner. My most recent book, Distracted: Reclaiming Our Focus in a World of Lost Attention, explores the fragmentation of focus and the science of attention in the digital age. One of the first books to warn of our current crisis of inattention, it’s been compared by Fast Company magazine to Rachel Carson’s Silent Spring, and will be published in a new updated edition in September. 

After I finished that book, I realized that attention, as crucial a human faculty as it is, is nevertheless a vehicle, a means to whatever goals we are pursuing. And I began to see that once we have a moment’s focus, the crucial next stepping stone to human flourishing is to be able to think well, especially in a digital age. Those musings have led me on a multi-year journey into the nature of deliberation and contemplation, and in particular to the realization that uncertainty is the overlooked gateway to good thinking in an age of snap judgment.

We think of uncertainty as something to avoid, particularly in an age that quite narrowly defines productivity, efficiency, and good thinking as quick, automatic, machine-like, neat, packaged, and outcome-oriented. Of course humans need to pursue resolution, yet the uncertainty that we scorn is a key trigger for deep thought and itself a space of possibilities. Without giving uncertainty its due, humans don’t have choices. When we open ourselves to speculation or a new point of view, we create a space where deeper thinking can unfold.

My new book will draw from my research into how technology is changing our ideas of what it means to “know” something and what it means to be smart. I am also drawing on new research on the upsides of uncertainty in numerous domains, including medicine, business, education, philosophy, and of course psychology and cognitive science. It’s even a topic of conversation and interest in the HCI world, as Rich Pak and others have told me.

I believe that today more and more people are retreating politically, psychologically, and culturally into narrow-mindedness, but I am heartened by the possibility that we can envision uncertainty as a new language for critical thinking.


What does the future of human relationships with technology look like: good, bad, or ugly?

The essential question is: will our technologies help us flourish? The potential – the wondrous abundance, the speed of delivery, the possibility for augmenting the human or inspiring new art forms – is certainly there. But I would argue that at the moment we aren’t for the most part using these tools wisely, mostly because we aren’t doing enough to understand technology’s costs, benefits, and implications.

I’ve been thinking a lot about one of technology’s main characteristics: instantaneity. When information is instant, answers begin to seem so, too. After a brief dose of online searching, people become significantly less willing to struggle with complex problems; their “need for cognition” drops even as they begin to overestimate their ability to know. (The findings echo the well-documented “automation effect,” in which humans stop trying to get better at their jobs when working closely with machines, such as automated cockpits.) In other experiments, people on average ranked themselves far better at locating information than at thinking through a problem themselves.

Overall, the instantaneity that is so commonplace today may shift our ideas about what human cognition can be. I see signs that people have less faith in their own mental capacities, as well as less desire to do the hard work of deliberation. Instead, their faith increasingly lies with technology. These trends will affect a broad range of future activities: whether people can manage a driverless car gone awry, or even think it’s their role to do so; whether they still recognize the value of “inefficient” cognitive states of mind such as daydreaming; and whether they have the tenacity to push beyond the surface understanding of a problem on their own. Socially, similar risks are raised by instant access to relationships – whether to a friend on social media or to a companion robot that’s always beside a child or elder. Suddenly the awkwardness of depth need no longer trouble us as humans!

These are the kinds of questions that we urgently need to be asking across society in order to harness technology’s powers well. We need to ask better questions about the unintended consequences and the costs and benefits of instantaneity, or of gaining knowledge from essentially template-based formats. We need to be vigilant in understanding how humans may be changed when technology becomes their nursemaid, coach, teacher, or companion.

Recently, an interview with the singer Taylor Goldsmith of the LA rock band Dawes caught my eye. The theme of the band’s latest album, Passwords, is hacking, surveillance, and espionage. “I recognize what modern technology serves,” he told The New York Times. “I’m just saying, ‘Let’s have more of a conversation about it.’”

Well, there is a growing global conversation about technology’s effects on humanity, as well there should be. But we need to do far more to truly understand, and so better shape, our relations with technology. That should mean far more robust schooling of children in information literacy, in the market-driven nature of the Net, and in critical thinking skills generally. It should mean training developers to become more accountable to users, perhaps by trying to visualize more completely the unintended consequences of their creations. And it certainly must mean becoming more measured in our own personal attitudes; all too often we still gravitate to exclusively dystopian or utopian viewpoints on technology.

Will we have good, bad, or ugly future relations to technology? At best, we’ll have all of the above. But at the moment, I believe that we are allowing technology in its present forms to do far more to diminish human capabilities than to augment them. By better understanding technology, we can avert this frightening scenario.