"It’s harder to imagine the past that went away than it is to imagine the future." (From his Paris Review interview.)
Contemplative computing may sound like an oxymoron, but it's really quite simple. It's about how to use information technologies and social media so they're not endlessly distracting and demanding, but instead help us be more mindful, focused and creative.
My book on contemplative computing, The Distraction Addiction, was published by Little, Brown and Co. in 2013, and is available in bookstores and online. This 2011 talk is a good introduction to the project and its big ideas.
I've got a piece on binge-watching in the latest issue of Slate. In that piece, I mention that I've been doing interviews with people about their viewing habits. If you want to share your experiences binge-watching, I'd be happy to hear them.
The questions are below. You can cut and paste them into an email, with your answers, and send it to email@example.com.
1. What kinds of shows do you binge watch? What does a show have to offer to be worth watching for hours at a time?
2. Thinking about your mental state or level of focus when you binge watch, is it a different kind of experience than watching regular live television, or YouTube videos, or things on the DVR?
3. Do you pay more or less attention during these multiple hours of watching?
4. Are there other activities-- the theatre, the movies, concerts-- where you're similarly engaged?
5. Are there things you consciously ignore or filter out while you're watching, either on screen or in your environment-- e.g., fast-forward through commercials, turn off your phone, not check your email?
6. Are there things you do during a binge-watching session that you don't do when watching "normal" TV-- e.g., look up the cast on IMDB to see what else they've been in, knit, do your taxes?
7. Where do you binge watch? In the living room, bedroom, on planes, elsewhere? And on what device(s)?
8. How does this differ from what we used to call “watching lots of television”? For you, is there a qualitative difference between binge watching and hours of channel-surfing, or watching whatever happens to be on?
9. Do you binge watch with other people, by yourself, or a mix?
10. If you could share a bit about your age and occupation, that would be great.
In a recent Random House blog essay, writer and bookseller Fiona Duncan argues that "we read more, and in more ways, than ever, and this is thanks to all the book-killing culprits" online.
[Photo: books and devices on the #2 uptown, via Flickr]
She also makes a great point about how thinking about books as technology lets us see their value more clearly, and think more usefully about how printed and digital media can coexist in the future:
Literary types privilege the book as the ultimate form for reading. To privilege the book as reading, though—to forget that it is a technology—is analogous to forgetting one has a body (something lit types are also wont to do), and to forget one has a body is to let it soften and lay to waste. When you recognize the book as technology, you realize that print and screen, like body and mind, are not mutually exclusive mediums, but that they are increasingly mutually influencing.
On the heels of Kathryn Schulz’s essay on the impact of Twitter on her writing comes high school teacher Andrew Simmons’ equally thoughtful explanation of how "Facebook Has Transformed My Students' Writing—for the Better:"
As a high-school English teacher, I read well over a thousand student essays a year. I can report that complete sentences are an increasingly endangered species…. However, while Facebook and Twitter have eroded writing conventions among my students, they have not killed the most important ingredients in personal writing: self-reflection and emotional honesty. For younger high school boys particularly, social networking has actually improved writing – not the product or the process, but the sensitivity and inward focus required to even begin to produce a draft that will eventually be worth editing….
Many of my students grow up in households in which machismo reigns supreme. They've never been allowed to cry. Their mothers and sisters cook and wash the dishes and clean. They've been encouraged to see themselves as dominant, powerful, swaggering, sullen men, not sensitive and reflective men, powerfully kind, confidently open. Fostering those traits is a woman’s responsibility, like housework. In this sense, Facebook is a genuine outlet for the young men I teach. Just as social networking frees users from public decorum and encourages the birthing of troll alter egos, it allows my students to safely, if temporarily, construct kinder, gentler versions of themselves as well.
The great news is that this has a positive effect on teaching and learning. My students in 2013 are more comfortable writing about personal issues than were my classmates in the mid-late '90s. When I assign narrative essays, students discuss sexual abuse, poverty, imprisoned family members, alcoholic parents, gang violence, the struggle to learn English in America – topics they may need to address, not merely subjects they believe might entertain or interest a reader.
For Schulz (and for me), tweeting and blogging offer some of the same emotional rewards as what I think we'd both classify as "real" writing (i.e. paid and published, but mainly paid-- true writers are completely mercenary about their work), but in the psychological equivalent of a low-nutrition, energy-shot form. At my age, and because of the way I use it, Facebook isn't about self-discovery so much as self-presentation and keeping up with a few friends. But if I were younger, I can imagine it working the way Simmons describes.
The contrast between these essays serves as a reminder of how differently the same medium can affect users at different stages in their lives, or with different skills. It reminds me of this recent study (behind a firewall) of the impact of video games and TV on children's psychosocial adjustment: yes, it concludes that playing games for three hours per day or more has no discernible impact on children's levels of aggression or sociability, but it covers 5- and 7-year-olds, who probably aren't playing a lot of Quake 4 or GTA 5. Assuming these kids are playing games that are age-appropriate, applying the results to older kids, or to violent games, or to "gaming" generally (as if that were a useful category) involves making a bunch of leaps that probably aren't warranted.
I’ve never before gone mad for any type of technology. Even the Internet did not particularly seduce me before the Twitter portal. I used it only for e-mail, and for targeted research; as recently as 2009, I probably spent, on average, under 30 minutes a day online. I didn’t have a cell phone until 2004, didn’t have a smartphone until 2010. I only got addicted to coffee three years ago. But then along came that goddamned bluebird, which seems to have been built with uncanny precision to hijack my kind of mind....
It's one of the most thoughtful pieces about the upsides and downsides of Twitter I've read.
Part of what makes it attractive is not that it's simply a distraction; rather, "it's sufficiently smart and interesting that spending massive amounts of time on it is totally possible and semi-defensible." It occupies a spot somewhere in the "exploration" region of the graphic I described a couple years ago.
The particular strength of the piece is that, as well as anything I've seen, it explains why Twitter is especially appealing to writers-- and why that appeal is a two-edged thing:
A tweet is basically a genre in which you try to say an informative thing in an interesting way while abiding by its constraint (those famous 140 characters) and making use of its curious argot (@, RT, MT, HT). For people who love that kind of challenge — and it’s easy to see why writers might be overrepresented among them — Twitter has the same allure as gaming. It is, essentially, Sentences With Friends.
It's a bit like speed chess as described in Searching for Bobby Fischer. At one point the chess master (Ben Kingsley) explains to his young student that speed chess is fun, but destroys your ability to play serious games.
I am convinced that steadily attending to an idea is the core of intellectual labor, and that steadily attending to people is the core of kindness. And I gravely worry that Twitter undermines that capacity for sustained attention. I know it has undermined my own: I’ve watched my distractibility increase over the last few years, felt my time get divided into ever skinnier and less productive chunks.
More disturbing, I have felt my mind get divided into tweet-size chunks as well. It’s one thing to spend a lot of time on Twitter; it’s another thing, when I’m not on it, to catch myself thinking of — and thinking in — tweets. This is a classic sign of addiction: “Do you find yourself thinking about when you’ll have your next drink?” etc. In context, though, it’s more complicated than that, because thinking in tweets is only a half-step removed from what I’ve done all my life, which is to try to match words to thoughts and experiences. The job of a writer is to do that in a sustained way — a job I find brutally hard, and, when it works, deeply gratifying. The trouble with Twitter is that it produces a watered-down version of that gratification, at a very rapid rate, with minimal investment — and, if I am going to be honest with myself, minimal payoff, and minimal point.
It's not just that training your brain to write in 140 characters can wither your ability to concentrate in the kind of serious, sustained way every writer finds is essential to write books; the very public, immediately rewarding nature of the writing is also a problem. Schulz talks about spending years in a virtual cave when she was writing her book. "In my experience, and the experience of most writers I know, that cave is the necessary setting for serious writing," she says. "Unfortunately, it is also a dreadful place: cold, dark, desperately lonely." (I agree completely that it's essential for every writer, but I find it a more felicitous place; still, I get what she means.)
Twitter, by contrast, is a warm, cheerful, readily accessible, 24-hour-a-day antidote to isolation. And that is exactly the issue. The trouble with Twitter isn’t that it’s full of inanity and self-promoting jerks. The trouble is that it’s a solution to a problem that shouldn’t be solved. Eighty percent of the battle of writing involves keeping yourself in that cave: waiting out the loneliness and opacity and emptiness and frustration and bad sentences and dead ends and despair until the damn thing resolves into words. That kind of patience, a steady turning away from everything but the mind and the topic at hand, can only be accomplished by cultivating the habit of attention and a tolerance for solitude.
Schulz is arguing that by removing the need to isolate yourself to write, offering an alternative that's social, and providing immediate emotional rewards (but no financial ones), Twitter lets writers practice a version of their craft that is fun, that is occasionally valuable enough to keep doing (there's our old friend Mr. Intermittent Rewards!)-- but that can also make it harder for writers to do more serious writing.
Not that she has any plans of giving it up (any more than I do). The challenge is to learn to use it thoughtfully, to observe how your usage can unintentionally affect you, and to retake control over it when it tries to push you in a different direction.
When I was writing The Distraction Addiction, I got into the habit of getting up super-early to write. I'm not a morning person, and never have been, but I was willing to try anything to get words on the page, and lots of writers talk about getting up before dawn to write.
After a few painful days, I found it really worked for me. In the pre-dawn hours, I was just too damn bleary for Facebook to be appealing: it was the socio-cognitive equivalent of still being in your bathrobe and not wanting to interact with anyone until you've showered and dressed. I also felt like if I was going to get up this early, I wasn't going to waste it, dammit.
I also discovered that if I stopped in mid-sentence the night before, I could sit down at the keyboard, finish that thought, and that would ease me into writing before the birds were up. Eventually, I'd get into a flow, start coming up with new things, and by the time it was time to get the kids to school, I had a few hundred words.
I also tried a mobile version of this. After dropping the kids off at school, I would take my old dog Christopher for a walk around the school (he had lived in that neighborhood, so he knew it well), carrying my little Moleskine notebook and a pen. Christopher walked slowly, so it was pretty easy to jot down ideas as we walked.
They didn't have to be brilliant turns of phrase, or blazing insights; my aim was just to keep thinking, and keep writing. Probably 95% of what I wrote down didn't go anywhere, didn't lead to another better thought, and never made it into the manuscript. But the other 5% did.
Writing a 90,000 word manuscript happens one word at a time, so having lots of good ideas matters.
That led me to a bigger realization: that while I'd spent most of my life thinking that the way you write is to wait for inspiration to hit and then start writing, a more reliable (and more professional) approach is to start writing, keep going, and trust that you'll say something inspired. Writing is not an activity in which the creativity comes first; writing is the act through which creativity is sparked and expresses itself. You're not a scribe to a Muse who throws lightning bolts (or whatever muses threw) at you; you're an artisan whose inspiration is bound up in engagement with your instruments and materials and workspace.
Drake Baer has a piece in Fast Company that explains why this is so-- why inspiration follows work, rather than the other way around. "[A]dvances in science have allowed us to get a better idea as to why better ideas come after jumping into our workflow," he writes, "rather than waiting for sudden inspiration to strike." The big idea is that regular, persistent practice puts you in a state in which big ideas-- which actually are always shooting around in your mind, undetected and inaccessible-- have a chance to get out.
You can go read the article for the details, but this strikes me as exactly right, and also not at all surprising-- though our belief in native youthful genius and our admiration for people who seem naturally gifted tend to obscure that fact. Baer's account of the relationship between regular, persistent work and creativity is not at all unlike Christian contemplatives' description of silence and meditation as essential for hearing the voice of God. As Thomas Merton would have put it, that voice is always present, but you have to still yourself enough to become aware of it, and to hear what it says.
There's another takeaway from the piece. We tend to think of creativity as a kind of Brownian collision of different ideas; indeed, Fast Company's definition is "finding the connections between seemingly unrelated things." Like this restaurant in Finland.
This tends to encourage us to think of creativity as something that happens during the give-and-take of brainstorming sessions, as arising from collaboration between interdisciplinary groups, through scintillating shop talk around the lab bench or water cooler, etc.
But the insight about persistence and focus reminds us that it's critically important to have times where you can actually make and observe and reflect on those connections in your own mind. To have ideas worth sharing, you have to have a persistent practice that lets those connections become visible and tangible, expressed as words or code or pictures or plans.
A little while ago I got an invitation to contribute to Medium, and decided to try it out this morning. I posted an essay there on Nigel Thrift's new piece on attention in the digital age, which I kind of went Morozov on; it was a nice, brief example of how not to think about attention, media, and education.
I've copied it below the fold, in case Medium goes belly up. But they do lovely work, and I'm curious to see what kind of traffic one gets there.
This week's Chronicle of Higher Education has a long piece (behind a firewall) on the work of University of Washington professor David Levy. Levy wrote a very smart book in the early 2000s called Scrolling Forward (I reviewed it for the Los Angeles Times), and for the last several years has been thinking about how the pursuit of ever-faster information technologies and decision-making drive away "slower, 'endangered' practices, such as time to think and reflect, time to listen, and time to cultivate our humanity."
The Chronicle article talks about how students in his class "scrutinize their use of technology: how much time they spend with it, how it affects their emotions, how it fragments their attention. They watch videos of themselves multitasking and write guidelines for improving their habits. They also practice meditation—during class—to sharpen their attention." All good. More broadly, though, Levy
sees these techniques as the template for a grass-roots movement that could spur similar investigations on other campuses and beyond. Mr. Levy hopes to open a fresh window on the polarized cultural debate about Internet distraction and information abundance.
At its extreme, that debate plays out in the writing of authors whom the critic Adam Gopnik has dubbed the Never-Betters and the Better-Nevers. Those camps duke it out over whether the Internet will unleash vast reservoirs of human potential (Clay Shirky) or destroy our capacity for concentration and contemplation (Nicholas Carr)….
Mr. Levy... sees a problem with many discussions about what technology is doing to our minds.
"So many of those debates fail to even acknowledge or realize that we can educate ourselves, even in the digital era, to be more attentive," he says. "What's crucial is education."
I think we may be seeing a definitive turn in the public conversation about how information technologies are affecting us. When I was writing the book, I wrote a section on "digital Panglosses and digital Cassandras" that I eventually cut out. The book was already long enough, and I wasn't that interested in explicitly locating my own work in this debate; instead, I wanted to chart a way out of it, and reframe the questions in ways that let them be answered.
At the base of this enterprise is a sense that we need to recover the belief that connection is inevitable, but distraction is a choice: that despite claims of technological determinism, we have the ability to design and redesign our relationships with information technologies, and make choices about how they affect us.
Clearly, opening up this space of possibility is important for Levy and his students. And I think it's going to be the place to be in the coming years.
Alex Mar talks about how the Internet is invading the writers' retreat, with predictable and lamentable results.
Residencies have long been the writer’s last defense against the distractions of the outside world. But now the incessant digital static of the Internet, that irresistible force we live in such close, constant contact with, is setting the deep-immersion experience necessary to produce great works of literature against a constant barrage of information. Whenever the work becomes difficult (i.e. every 10 minutes), the Web is there with promises of more barely relevant factoids.
The MacDowell Colony, the oldest artists' colony in the United States, opened its doors in 1907, at a time when the biggest technical distractions would have been electric lighting and the telegraph. Of course, you also would have had to deal with the everyday distractions of family, your day job, and all the other stuff we've had to balance for centuries. And if you lived in a city, you might also have to deal with incessant noise, horrible smells in the summer, and the prospect of cholera outbreaks. (One day I'll write a history of distraction and get to the bottom of this-- document what it was that people in previous centuries were getting away from, and looking for, when they went on retreat.)
And of course the problem isn't just that writers' retreats now have cell towers and wifi access; the same technology that distracts writers is now also essential for professional self-fashioning and self-promotion.
[B]ook marketing now almost always requires the author’s regular engagement with his disembodied fans. Junot Díaz had finally weaned himself off Facebook when his publisher asked him to jump back on to promote This Is How You Lose Her. It had been so hard “to wean myself off the damn e-crack, and here I was jumping back in voluntarily,” he said. “Took only about two days to get right back to my check-it-every-five-seconds cycle.” He went from reading something like a book a week to reading a book a month: a projected total loss of 36 books per year.
As someone who just created a public author page on Facebook, I can certainly sympathize. I'm just starting to think about how much I want to engage with readers-- you don't want to be aloof, but at the same time, if you're an expert on conquering digital distraction, you don't want to fall prey to it yourself!
While at first blush it might seem like the erosion of enforced solitude in writers' retreats is one of those issues for which the phrase "First World problems" doesn't convey enough elite privilege, the issue was clarified by an exchange between newly-published author Julian Tepper and the great Philip Roth, recounted in the Paris Review. Tepper, who works at a diner Roth visits, had just given Roth a copy of his new novel-- and then:
Roth, who, the world would learn sixteen days later, was retiring from writing, said, in an even tone, with seeming sincerity, “Yeah, this is great. But I would quit while you’re ahead. Really, it’s an awful field. Just torture. Awful. You write and write, and you have to throw almost all of it away because it’s not any good. I would say just stop now. You don’t want to do this to yourself. That’s my advice to you.”
I managed, “It’s too late, sir. There’s no turning back. I’m in.”
Nodding slowly, he said to me, “Well then, good luck.”
After which I went back to work.
Okay, so that's a great story, actually, and Tepper tells it well. But then he goes on to say:
I still feel strongly that the one thing a writer has above all else, the reward which is bigger than anything that may come to him after huge advances and Hollywood adaptations, is the weapon against boredom. The question of how to spend his time, what to do today, tomorrow, and during all the other pockets of time in between when some doing is required: this is not applicable to the writer. For he can always lose himself in the act of writing and make time vanish. After which, he actually has something to show for his efforts. Not bad. Very good, in fact. Maybe too romantic a conceit, but this, I believed, was the great prize for being born … an author.
Elizabeth Gilbert was incredulous: "seriously--is writing really all that difficult?"
Yes, of course, it is; I know this personally--but is it that much more difficult than other things? Is it more difficult than working in a steel mill, or raising a child alone, or commuting three hours a day to a deeply unsatisfying cubicle job, or doing laundry in a nursing home, or running a hospital ward, or being a luggage handler, or digging septic systems, or waiting tables at a delicatessen, or--for that matter--pretty much anything else that people do?
Not really, right?
In fact, I'm going to go out on a limb here and share a little secret about the writing life that nobody likes to admit: Compared to almost every other occupation on earth, it's f*cking great. I say this as somebody who spent years earning exactly zero dollars for my writing (while waiting tables, like Mr. Tepper) and who now makes many dollars at it. But zero dollars or many dollars, I can honestly say it's the best life there is, because you get to live within the realm of your own mind, and that is a profoundly rare human privilege.
People who make it as writers are intensely self-directed, and enjoy that part of the work. Now, that doesn't mean we're all reclusive, or that the rest of the business doesn't interest us: one of the things I've really enjoyed about The Distraction Addiction experience-- not the research and writing, but the search for people to blurb the book, the dealing with the marketing and promotions people, the arguments over what the cover art should look like-- is that I've learned a lot about the industry, about the craft of turning 90,000 words into that object on the table at your local bookstore, and about how low the odds of succeeding are without a great agent and editor (something I've talked about before). For someone who's spent a lifetime around books, and is no stranger to academic and corporate and magazine writing, it's been very eye-opening, in a good way.
But ultimately, all the machinery and plans and pitches rely on my ability to get up at 5 AM and write undistracted for a few hours, several days a week, for a couple years; to live in my own mind; to push ideas as far as I possibly can; and, as Tepper puts it, to lose myself in the act of writing and make time vanish-- and reemerge at the end of that with words worth keeping. That's the heart of writing, what has to be preserved at all costs, and what the shiny-blinky attractiveness of the Web threatens.
The day I'm trying to finish the copy edits to my book and dealing with an author questionnaire that's about as long as something written for the House Un-American Activities Committee is the perfect time for both my agent and editor to send me links to interesting pieces about digital distraction. So here goes:
In this age of distraction and multitasking, it’s all too easy to blame technology for luring us away and ruining our writing productivity.
I see writers lamenting Facebook and Twitter as distractions. And they certainly can be. But all forms of technology — even social media — can be powerful tools when used properly.
Since we’re often writing using interfaces that entice us into that distracting online world (desktop computers, laptops, tablets, even phones), it behooves us to take advantage of the many ways to make the most of online and offline technology to keep us on track with our writing.
And perhaps more importantly, since the big dream of writing can trigger resistance, procrastination, and massive quantities of self-doubt, it helps to find workarounds and little tricks to keep writing rather than getting distracted.
I won't reproduce all her recommendations; go check them out for yourself.
Second, my agent points me to a Harvard Business Review piece on "Battling Your Online Addiction:"
How much time do you spend each day responding to email, checking Facebook, sending and reading Tweets, aimlessly surfing your favorite websites and buying things you don't need? How much time, in other words, do you spend doing stuff online that doesn't add much value in your life, or in anyone else's?...
95 percent of our behaviors occur on automatic pilot, out of habit or in reaction to an external demand or stimulus. We spend a crazily disproportionate amount of time seeking the next source of instant gratification, rather than pursuing the more challenging goals that ultimately deliver more long-term value and greater satisfaction.
It's not about summoning the strength to say "no." Each time we intentionally forgo something desirable, we deplete our already limited reservoir of will and discipline. When was the last time you resisted the seductive ping of an incoming email?
So how, then, to withstand this Pavlovian pull? And how, in turn, to take back control of your attention, so you can put it to better and richer use?
The advice in the two pieces is different, but not mutually exclusive. Figure out what works for you.
I got copyedits of my book today. I made the mistake of hitting some button that provided me with a summary of all the changes, and in a roughly 90,000-word manuscript, there are over 5100 insertions, 5200 deletions, and 130 comments. All of which will require my attention.
If being confronted with roughly one edit for every nine words isn't enough to make you believe that The Onion's recent headline "4 Copy Editors Killed In Ongoing AP Style, Chicago Manual Gang Violence" could have been real, I don't know what is.
Not that I'm complaining. It's just another stage on the way to this thing being real, and being out in the world.
My experience writing The Distraction Addiction followed many of the rules the video describes: boosting your productivity by dividing big projects (like chapters) into smaller, easier-to-deal-with pieces; making it easier to get started (for example, by stopping in mid-sentence the day before, so you can start more easily the next morning); having a disciplined but realistic schedule; and alternating intense periods of work with breaks.
All easy to say, but a challenge to do. But just get started, and you'll figure it out.
Or so Silas House argues in a recent essay, "The Art of Being Still:"
I’m not talking about the kind of stillness that involves locking yourself in a room with a laptop, while you wait for the words to come. We writers must learn how to become still in our heads, to achieve the sort of stillness that allows our senses to become heightened. The wonderful nonfiction writer Joyce Dyer refers to this as seeing like an animal.
When I was in Cambridge and first started working seriously on the contemplative computing project, one of the first things I tried to get my head around was the idea of "calm": what it meant in human-computer interaction, what it means in contemplative traditions, and how to reconcile the two. Discovering Yvonne Rogers' response to Weiser's idea of calm computing, and reading Mihaly Csikszentmihalyi's Flow (buy your own copy-- it's really good), led me to think of calm not only as a physiological state (e.g., being relaxed) or as the absence of stressors (e.g., lying in a hammock with a piña colada), but also as something skilled and active-- and that last definition turned out to be the most useful one in my work. Contemplative computing isn't about escaping stresses or quitting your job; it's about learning to use devices in ways that produce a samurai-like, hunter-like calm.
This, I think, is what House and Dyer mean when they talk about "seeing like an animal." It's a form of mindfulness. It's detached, deliberate observation.
Likewise, the "stillness" that House is advocating is skilled, active, and disciplined. It's not the stillness of a rock at the bottom of a ravine. It's the stillness of a hunter watching its prey, or the stillness of a yoga pose. It's a form of stillness that requires discipline, focus, and the patience to hold yourself unmoving for a long time. House isn't the only one with this idea. As Tenzin Priyadarshi put it, stillness is a precondition to silence, which is necessary for insight and clarity.
Most writers today have jobs or families or responsibilities, and most often, all three. We don’t have time to sit in the woods for a few hours every day, staring at the leaves, pondering life’s mysteries and miracles and the ways we can articulate them for the reading masses.
We writers must become multitaskers who can be still in our heads while also driving safely to work, while waiting to be called “next” at the D.M.V., while riding the subway or doing the grocery shopping or walking the dogs or cooking supper or mowing our lawns.
The term multitasking at first struck me as ambiguous, but I think House is onto something. As I noted recently, "multitasking" involves weaving together several activities that all lead to some larger goal (though I admit I might have already lost the war to rehabilitate the term, and might have to use something else). What House is talking about, I think, is learning how to use your everyday life to feed your life as a writer: in effect, to learn how to view everything you do as contributing to the larger goal. Multitasking, not switch-tasking between writing and parenting and unloading the dishwasher. A tall order, but if you can pull it off, a great one.
(House essay stumbled upon via Matt Thomas)
I thought it would be interesting to create a photo set that mixes images I use in my contemplative computing talks, with photos from Cambridge and my year writing the book.
It's currently rather on the long side, but perhaps I'll get it cut down to a clean 100 pictures, or 50, or something that doesn't exhaust everyone. Right now there's a kind of Brian Eno ambient imagery gone mad quality to it.
I'm just getting around to Carl Wilkinson's recent Telegraph essay on writers "Shutting out a world of digital distraction." It's about how Zadie Smith, Nick Hornby and others deal with digital distraction, which for writers is particularly challenging. Successful writing requires a high degree of concentration over long periods, but the Internet can be quite useful for doing the sort of research that supports imaginative writing (not to mention serious nonfiction). Add in communicating with agents, getting messages from fans, and the temptation to check your Amazon rank, and you have a powerfully distracting device.
Unfortunately, the piece also has a couple paragraphs featuring that mix of technological determinism and neuroscience that I now regard as nearly inevitable. Editors seem to require a section like this:
the internet is not just a distraction – it’s actually changing our brains, too. In his Pulitzer Prize-nominated book The Shallows: How the Internet is Changing the Way We Think, Read and Remember (2010), Nicholas Carr highlighted the shift that is occurring from the calm, focused “linear mind” of the past to one that demands information in “short, disjointed, often overlapping bursts – the faster, the better”….
Our working lives are ever more dominated by computer screens, and thanks to the demanding, fragmentary and distracting nature of the internet, we are finding it harder to both focus at work and switch off afterwards.
“How can people not think this is changing your brain?” asks the neuroscientist Baroness Susan Greenfield, professor of pharmacology at Oxford University. “How can you seriously think that people who work like this are the same as people 20 or 30 years ago? Whether it’s better or worse is another issue, but clearly there is a sea change going on and one that we need to think about and evaluate.... I’m a baby boomer, not part of the digital-native generation, and even I find it harder to read a full news story now. These are trends that I find concerning.”
As with Nick Carr's recent piece, Katie Roiphe's piece on Freedom, everything Sven Birkerts has written since about 1991, and the rest of the "digital Cassandra" literature (Christopher Chabris and Daniel Simons called it "digital alarmism"), I think the problem here is that statements like these emphasize the flexibility of neural structure in a way that ironically diminishes our sense of agency and capacity for change. The argument works like this:
I don't want to argue, pace Steven Poole, that this is merely neurobollocks (though I love that phrase). Nor do I want to single out Baroness Greenfield, who's come in for lots of criticism for the ways she's tried to talk about these issues.
All I want to argue is that 1-4 can be true, but that doesn't mean 5 must be true as well.
Technological determinism is not, absolutely not, a logical consequence of neuroplasticity.
It's possible to believe that the world is changing quickly, that our brains seek to mirror these changes or adapt to them in ways that we're starting to understand (but have a long way to go before we completely comprehend), and that lots of this change happens without our realizing it, before we're aware of it, and becomes self-reinforcing.
But-- and this is the important bit, so listen up-- we also have the ability to observe our minds, to retake control of the direction in which they develop, and to use neuroplasticity for our own ends.
Because we can observe our minds at work, we can draw on a very long tradition of practice in building attention and controlling our minds-- no matter what the world is doing. Yes, the great Jeff Hammerbacher line that "The best minds of my generation are thinking about how to make people click ads" is absolutely true, but when all is said and done, even Google hasn't taken away free will.
We can get our minds back. It's just a matter of remembering how.
And it can even be represented in graphical form.
Chad Wellmon has a smart essay in The Hedgehog Review arguing that "Google Isn’t Making Us Stupid…or Smart."
Our compulsive talk about information overload can isolate and abstract digital technology from society, human persons, and our broader culture. We have become distracted by all the data and inarticulate about our digital technologies…. [A]sking whether Google makes us stupid, as some cultural critics recently have, is the wrong question. It assumes sharp distinctions between humans and technology that are no longer, if they ever were, tenable…. [T]he history of information overload is instructive less for what it teaches us about the quantity of information than what it teaches us about how the technologies that we design to engage the world come in turn to shape us.
It's something you should definitely read, but it also reminded me of a section of my book that I lovingly crafted but ultimately edited out, and indeed pretty much forgot about until tonight. It describes the optimistic and pessimistic evaluations of the impact of information technology on-- well, everything-- as a prelude to my own bold declaration that I was going to go in a different direction. (It's something I wrote about early in the project.)
I liked what I wrote, but ultimately I decided that the introduction was too long; more fundamentally, I was really building on the work of everyone I was talking about, not trying to challenge them. (I don't like getting into unnecessary arguments, and I find you never get into trouble being generous in giving credit to people who've written before you.) Still, the section is worth sharing.
No, it's not a new study of people listening to The Sex Pistols while in fMRI machines (though that would make an awesome research project): rather, it's Steven Poole's latest essay (in the New Statesman) on the misuse of neuroscience, particularly by what Evgeny Morozov might describe as the "naked and the TED" set:
An intellectual pestilence is upon us. Shop shelves groan with books purporting to explain, through snazzy brain-imaging studies, not only how thoughts and emotions function, but how politics and religion work, and what the correct answers are to age-old philosophical controversies. The dazzling real achievements of brain research are routinely pressed into service for questions they were never designed to answer. This is the plague of neuroscientism – aka neurobabble, neurobollocks, or neurotrash – and it’s everywhere.
How's it work?
You start each chapter with a pat anecdote about an individual’s professional or entrepreneurial success, or narrow escape from peril. You then mine the neuroscientific research for an apparently relevant specific result and narrate the experiment, perhaps interviewing the scientist involved and describing his hair. You then climax in a fit of premature extrapolation, inferring from the scientific result a calming bromide about what it is to function optimally as a modern human being. Voilà, a laboratory-sanctioned Big Idea in digestible narrative form. This is what the psychologist Christopher Chabris has named the “story-study-lesson” model, perhaps first perfected by one Malcolm Gladwell. A series of these threesomes may be packaged into a book, and then resold again and again as a stand-up act on the wonderfully lucrative corporate lecture circuit.
This all actually speaks to something I've been thinking about, and strategizing over, quite a bit. When I started working on the contemplative computing project, I was really taken with the brain science literature, and imagined drawing rather heavily on it in my own work.
[Photo: the Exploratorium, via Flickr]
After a couple months, though, I found myself backing away from that, and the finished book (which is making its way through the offices of Little, Brown at this very moment) refers to but doesn't rely on neuroscience much at all.
So what happened?
[Photo: the Underground, via Flickr]
First of all, being in Cambridge, I was within striking distance of some very good people in the field, and a few conversations with them-- the Eagle attracts as many scientists as it did when Watson and Crick hung out there-- convinced me that while current work in neuroscience is very interesting, you can't responsibly extrapolate from it to make the sorts of claims I would have made.
[Photo: the Eagle, via Flickr]
It's a challenge to speak to general audiences about technical subjects-- hell, it's hard to speak to expert audiences about technical subjects-- and you don't do anyone any favors by introducing error into the conversation.
[Photo: the Exploratorium, via Flickr]
My boss at Microsoft Cambridge trained in the philosophy of mind, and he was very insistent that I pay attention to the difference between mind and brain in my writing, and be clear about which one I was really talking about.
[Photo: if you see the Buddha on the shelf…, via Flickr]
Just as important, I realized that Buddhist writings provide a really well-developed language for talking about the mind. Indeed, there was very little I needed to say about consciousness or the mind that wasn't very old, pretty commonsensical, and verifiable by people on their own; they didn't have to take my claims on faith, nor did they need their own neuroscience lab to check things out for themselves.
So I ended up quite content to write a book that I think can be read and enjoyed by people who love Malcolm Gladwell and Jonah Lehrer and Susan Greenfield, but which doesn't rely so heavily on neuroscience. And I think it'll be a better book because of that.
Not that I would mind breaking into the ranks of people who can pay the mortgage (or the second mortgage) by giving keynotes at swank corporate affairs. I just want to do it by talking about complicated things in ways that make them accessible to people, not by talking about complicated things in ways that make me look smart but mislead the audience.
Here's the cover for the contemplative computing book:
Little, Brown spent a lot of time on it, and I think they've managed to communicate a lot in a very small, challenging medium. They were also really good about explaining the design choices, making clear what they thought worked, and accommodating those changes I thought would improve it (or explaining why they would be hard to implement).
So the machine chugs along, and we get one step closer to having a finished book on the shelves!
I write about people, technology, and the worlds they make.
My book on contemplative computing, The Distraction Addiction, was published by Little, Brown and Company in 2013. (It's been translated into Dutch (as Verslaafd aan afleiding) and Russian (as Ukroschenie tsifrovoy obezyany); Spanish, Chinese and Korean translations are in the works.)
I'm a senior consultant at Strategic Business Insights, a Menlo Park, CA consulting and research firm. I also have two academic appointments: I'm a visitor at the Peace Innovation Lab at Stanford University, and an Associate Fellow at Oxford University's Saïd Business School.
The Distraction Addiction
My latest book, and the first book from the contemplative computing project. The Distraction Addiction is published by Little, Brown and Co. It's been widely reviewed and garnered lots of good press. You can find your own copy at your local bookstore, or order it through Barnes & Noble, Amazon (check B&N first, as it's usually cheaper there), or IndieBound.
The Spanish edition
The Dutch edition
Empire and the Sun
My first book, Empire and the Sun: Victorian Solar Eclipse Expeditions, was published by Stanford University Press in 2002 (order via Amazon).