I hadn’t heard of this idea until this Atlantic video from James Hamblin:
Contemplative computing may sound like an oxymoron, but it's really quite simple. It's about how to use information technologies and social media so they're not endlessly distracting and demanding, but instead help us be more mindful, focused and creative.
My book on contemplative computing, The Distraction Addiction, was published by Little, Brown and Co. in 2013, and is available in bookstores and online. This 2011 talk is a good introduction to the project and its big ideas.
I write about people, technology, and the worlds they make.
My book on contemplative computing, The Distraction Addiction, was published by Little, Brown and Company in 2013. It has been translated into Dutch (as Verslaafd aan afleiding) and Spanish (as Enamorados de la Distracción); Russian, Chinese, and Korean translations are in the works.
My next book, Rest: Why Working Less Gets More Done, is under contract with Basic Books. Until it's out, you can follow my thinking about deliberate rest, creativity, and productivity on the project Web site.
Tom Chatfield’s short essay, "The attention economy,” raises an interesting question: why do we think of attention as a resource?
For all the sophistication of a world in which most of our waking hours are spent consuming or interacting with media, we have scarcely advanced in our understanding of what attention means. What are we actually talking about when we base both business and mental models on a ‘resource’ that, to all intents and purposes, is fabricated from scratch every time a new way of measuring it comes along?
For the ancients, Chatfield notes, attention wasn’t a resource; it was a relationship.
For the ancient Greeks and Romans, this wooing [i.e., getting other’s attention] was a sufficiently fine art in itself to be the central focus of education. As the manual on classical rhetoric Rhetorica ad Herennium put it 2,100 years ago: ‘We wish to have our hearer receptive, well-disposed, and attentive (docilem, benivolum, attentum).’ To be civilised was to speak persuasively about the things that mattered: law and custom, loyalty and justice.
In this understanding, there is no such thing as “attention” existing outside a relationship. It’s not like energy, or a pint of blood: it only exists between the person giving their attention and the person trying to hold it. Indeed, Chatfield points out,
In Latin, the verb attendere — from which our word ‘attention’ derives — literally means to stretch towards. A compound of ad (‘towards’) and tendere (‘to stretch’), it invokes an archetypal image: one person bending towards another in order to attend to them, both physically and mentally.
I think there’s still some value in the attention-as-resource model, if only because we can demonstrate that humans have only a certain amount of attention they can “pay” in a day; in that respect, it’s like self-discipline or decision-making. But the notion that it can be treated as essentially interchangeable with coal or wind does bear some rethinking.
One of the things you always, and I mean always, hear about Internet of Things and smart home devices is that they “just work.” They’re all like these magic autonomous robots that’ll connect themselves to your wifi, then go do their thing, yet also be totally unobtrusive and intuitive (whatever those two words mean). Sounds cool, right?
Of course, the reality is very different, as this essay from IoS explains. The light went on sometime around the point when the author’s Internet-enabled thermostat stopped working whenever the wifi connection was lost (and “The only way to control the gadget is via the app, so when it breaks you’re really screwed"), and it came time to update their Philips Hue light bulbs: “When the first firmware update rolled around, it was exciting, until I spent an hour trying to update lightbulbs. Nobody warned me that being an adult would mean wasting my waking hours updating Linux on a set of lightbulbs, rebooting them until they’d take the latest firmware. The future is great."
In other words, things work great until they don’t, at which point all the wheels come off. Further, as we’ve learned recently, connected devices are “connected” to the fates of their companies, in a way that “dumb” devices are not. If the company that made your hammer or pants goes belly-up, that doesn’t affect your ability to pound nails or cover up your naughty bits. But that’s not the case with smart home devices.
A one-time purchase of a smart device isn’t a sustainable plan for companies that need to run servers to support those devices. Not only are you buying into a smart device that might not turn out to be as smart as you thought, it’s possible it’ll just stop working in two years or so when the company goes under or gets acquired.
The Internet of Things right now is a mess. It’s being built by scrappy startups with delusions of grandeur, but no backup plan for when connectivity fails, or consideration for if their business models reach out more than a year or two — leaving you and me at risk.
Just another indicator of how technologies of the future could turn out to be really distracting.
We often regard a failure of focus as a failure of will, or a moral failure. But there’s also a physical and physiological foundation to our capacity to focus on a problem, or remember a number. And there’s an interesting study that suggests that our tendency to wander off-topic isn’t so much a function of willpower, or our mental inadequacies, as it is a reflection of our natural capacity for what scientists call “habituation."
Habituation is the phenomenon where you stop noticing regular things in your environment: the rain on the roof, the ticking of a clock, the objects in your field of vision. We think our vision encompasses a nearly-hemispherical area in front of us, but in fact our eyes are only focused on a small part of that world at any given time, and we stop keeping track of things that aren’t moving. Our brains are good at creating a sense that we’re continuously observing the world, though that illusion is not perfect— if we’re concentrating hard while reading, for example, we can be surprised by the “sudden" appearance of a bird on the windowsill or a person in the room.
A couple years ago, University of Illinois psychology professor Alejandro Lleras wondered, what if focus is subject to the same rules that govern sensory habituation? What if our minds naturally tend to wander off things we think are repetitive? As he explained in 2012,
For 40 or 50 years, most papers published on the vigilance decrement treated attention as a limited resource that would get used up over time, and I believe that to be wrong. You start performing poorly on a task because you've stopped paying attention to it. But you are always paying attention to something. Attention is not the problem.
That insight that attention isn’t something that waxes and wanes, but instead is something that’s always directed somewhere, led him to draw a parallel between the attention we give to a task, and the fact that we tend to “edit out” stationary objects in our environment:
Constant stimulation is registered by our brains as unimportant, to the point that the brain erases it from our awareness. So I thought, well, if there's some kind of analogy about the ways the brain fundamentally processes information, things that are true for sensations ought to be true for thoughts. If sustained attention to a sensation makes that sensation vanish from our awareness, sustained attention to a thought should also lead to that thought's disappearance from our mind!
He and his colleague Atsunori Ariga, then a postdoc at University of Illinois, constructed a simple test. Four groups of students were given slightly different tasks.
To be clear, the purpose of the experiment wasn’t to test whether people could remember the numbers; it was testing whether having this other brief task helped people pay attention to the lines— that is, their performance on the vigilance test.
What they found was that the performance of the third group was pretty consistent, but everybody else got worse over time.
So does this mean that multitasking is actually good? Does texting while driving make you a better driver?
As they put it, "heightened levels of vigilance can be maintained over prolonged periods of time with the use of brief, relatively rare and actively controlled disengagements from the vigilance task.” But they’re testing how well you do on a very simple task. If you’re working on an assembly line, and literally the only thing you do is make sure that three bolts are properly tightened, then this kind of break is essential. But if you’re doing something complex, then introducing a second task isn’t going to improve your performance. Indeed, the opposite is a lot more likely.
The challenge is to find a brief respite that is different, but doesn’t threaten to take too much time. This is why a “quick” email check is problematic: checking your email is rarely quick, because there’s almost always something that you feel needs an immediate reply, or leads to something else.
But you can imagine that automobile autopilots could be really useful here: if they were designed to let you take 30 seconds every 10 minutes or so to refocus your eyes, blink, and maybe run through some mental exercise— a couple of Trivial Pursuit questions, for example— that could recharge your ability to stay focused on the road.
Here’s the abstract:
We newly propose that the vigilance decrement occurs because the cognitive control system fails to maintain active the goal of the vigilance task over prolonged periods of time (goal habituation). Further, we hypothesized that momentarily deactivating this goal (via a switch in tasks) would prevent the activation level of the vigilance goal from ever habituating. We asked observers to perform a visual vigilance task while maintaining digits in-memory. When observers retrieved the digits at the end of the vigilance task, their vigilance performance steeply declined over time. However, when observers were asked to sporadically recollect the digits during the vigilance task, the vigilance decrement was averted. Our results present a direct challenge to the pervasive view that vigilance decrements are due to a depletion of attentional resources and provide a tractable mechanism to prevent this insidious phenomenon in everyday life.
Georgia State University researcher Susan Snyder is studying the impact of Internet addiction (or PIU, Problematic Internet Use, described as >25 hours/week of non-school or -work use) on family ties. A new article finds that
College students who are addicted to the Internet report positive and negative effects on their family relationships….
On the plus side, these students reported their time on the Internet often improved family connectedness when they and their family were apart. However, their excessive Internet use led to increased family conflict and disconnectedness when family members were all together. And most students with PIU felt their families also overused the Internet, with parents not setting enough limits for either parent or sibling Internet use.
I’m sure there’s more to it, but until I read more, I’ll have to file this under what my mentor Riki Kuklick described as “the power of the social sciences” studies— things like detailed statistical studies of tax records that showed that— TAA-DAA!!!— incomes rose during the Industrial Revolution.
Part of me is also not sure about classifying only non-work and non-school use as problematic, but I’m not sure why.
I’ve been thinking about different forms of distraction, and the differences between distraction and mind-wandering. We talk a lot about focus and distraction, and often talk about them as if they’re two different mental categories. Like many everyday terms, we don’t use formal definitions; we just take for granted that they’re different, and that one (focus) is good, and the other (distraction) is bad.
When we talk about distraction, we can be talking about a multitude of things. For example:
Likewise, we tend to use distraction and mind-wandering somewhat interchangeably, or at least recognize that there’s some overlap between them. Some examples of distraction definitely involve mind-wandering: staring out the window rather than paying attention to the teacher is one we’re all familiar with.
But if we’re to make the argument that mind-wandering is beneficial, I think it would be useful to explain how despite our common usage, mind-wandering is actually quite different from distraction— and indeed, that distraction is a little more multifaceted than we usually think. It’s always worth knowing your enemy.
First, let me explain the critical components of focus, or what I'll call Directed Attention. Consider the crude drawing below.
This is directed attention. You have something that you should be focusing on (the square in the upper right), and you direct your attention at it. There are two important components to this scenario:
So if you’re driving, this would be you watching the road, the traffic, etc. If you’re reading, you’re paying attention to the words, not just scanning the page while you think about something else.
Now, what happens when you’re distracted? Distraction isn’t a single condition; I can think of two major kinds of distraction.
Let’s call the first Hijacked. It looks like this:
In this case, you ought to be paying attention to the square in the upper right— and for a while you were— but then your attention is captured by the shiny blinky thing on the left, and you veer off-course and pay attention to that instead.
For example, if you’re driving, you take your eyes off the road because your phone tells you that someone’s liked your last tweet. Or you’re reading War and Peace, and you get a notification that something is happening on the Internet.
I use the term hijacked because the shiny blinky thing may be designed to capture your attention, or behave in ways that assume that you should pay attention to it whenever it wants you to. In other words, this is not accidental.
My favorite example of a technology designed this way is the Facebook Messenger iPhone app, which is constantly reminding me that I haven’t turned on notifications, and that I really, really should. It (or rather Facebook Messenger’s designers) assumes that my attention should be interruptible whenever it decides, and that failing to do so constitutes an error on my part. (I haven’t figured out how to switch this off, and consequently rarely use Messenger on the iPhone. But since research has revealed that even awareness of notifications is distracting, I’m standing my ground.)
The other important thing about having your attention hijacked is that you’re still paying attention to something: your attention is diverted, pulled away, but it’s not diffused. You may be focused as intently on the shiny blinky thing as on the thing you should be concentrating on. Any parent who’s had to pull kids away from a video game or the television when they should be getting ready for dinner or doing their homework understands the power of this kind of distraction. It’s not that the kid is failing to focus; it’s that they’re focused on the wrong thing at the wrong time.
So this kind of distraction isn’t about not being able to concentrate; it’s about concentrating on the wrong thing at the wrong time, and doing so because your attention is pulled away. It’s the state of “machine flow” that makes Vegas casinos rich. It’s the conscious effort to capture and commoditize your attention that critics of social media and videogame design protest against.
Note that a “distracting” social media service or video game isn't interested in you never being able to pay attention to anything; it just wants you to always pay attention to it. It wants to be the shiny-blinky thing that you pay attention to, instead of whatever else you’re doing. The objective isn’t to put you in a state of continuous partial attention, but to hijack and hold your attention for as long as possible.
There’s another kind of distraction, which I call Aimless. It looks like this:
In this case, you should be directing your attention at the object in the top right, but instead your attention just goes around randomly. This is what Buddhists call the monkey mind, the mind that never pays attention to any one thing for very long, and indeed cannot focus on anything. In this case:
For example, you’re in the car and you should be paying attention to traffic, but instead your attention drifts in quick sequence to a dozen other things. Or you should be reading War and Peace, but instead you’re mindlessly clicking through BuzzFeed listicles of the best BuzzFeed listicles.
Now, so far as I can tell, nobody tries to promote aimlessness: Facebook and Twitter and Netflix and EA don’t gain anything by having you not be able to pay attention to them for any length of time. Aimlessness is a product of cognitive overload, of having too many different things competing for your attention and weighing on your mind. This is the state that Nicholas Carr complains about in The Shallows.
Of course, you can move between these forms of distraction. Consider the following schedule:
One other essential thing to notice about distraction is that it’s not just a measurable cognitive state: it’s a contextual thing as well. Distraction requires being distracted away from something, whether it’s an object you should be looking at, a piece of music you should be listening to, a task you should be completing.
So how does mind-wandering differ from these forms of distraction? This to me is what Wandering looks like:
In this instance, your attention isn’t focused on any particular thing, but it doesn’t need to be; you can safely let your thoughts go off on their own and spontaneously generate, because you don’t have something you need to concentrate on. In other words:
For example, if you’re doing something completely automatic, like folding laundry or unloading the dishwasher, you can pretty safely let your mind wander.
To continue the geographical metaphor, I quite enjoy walking around places like London and Cambridge because I can just wander around them, with no set destination, nowhere I need to be, but a pretty good likelihood that I’ll come across something interesting.
It’s this absence of a goal, and the absence of pressure to focus on something, that distinguishes wandering from distraction, rather than a distinctive mental state.
But psychologically, there’s a very big difference between letting your mind drift while you’re walking the dogs or lingering over a cup of coffee; having a long to-do list but not being able to focus on any specific thing because your brain refuses to settle; and having a long to-do list but not doing any of it because something shiny and blinky is capturing your attention.
Writer Joe Fassler has a piece in The Atlantic on “How Fiction Can Survive in a Distracted World.” It’s mainly a conversation with author Kevin Barry, and it makes the case that “novelists shouldn’t even try to compete for people’s eyes,” which means competing with screens and everything that’s on them. Rather, "they should go for their ears instead…. Barry argued that the human voice still has the power to mesmerize us the way screens seem to, and that modern fiction should be heard and not seen."
Barry argues that "one thing can still arrest us, slow us down, and stop us in our tracks: the human voice."
I think this explains the explosion in podcasts and radio narratives. The human voice still holds our attention, allowing us to tune in to a narrative in a way we find increasingly difficult on the page.
Readers and listeners increasingly want their stories to come at them directly in the form of a human voice. While everybody says that book sales are dropping, there’s an explosion in literary events, book festivals, spoken word events. People want to listen, and they want to hear stories.
Barry uses Dylan Thomas’ Under Milk Wood to illustrate the kind of approach he’s advocating. I won’t reproduce it all here, or try to summarize it; it’s long, and deserves to be read. But I’ll highlight this bit:
I love the refrain, “listen,” which repeats all the way through the work:
Listen. It is night moving in the streets …
Listen. It is night in the chill, squat chapel, hymning in bonnet and brooch and bombazine black …
Time passes. Listen. Time passes.
With this injunction to listen, Thomas is saying stop, stop, stop. He’s slowing us down so that we can enter this world.
This is striking because stopping is exactly what we instinctively do when we’re listening carefully to something. If you watch people talking on their phones while walking, you’ll often see them slow down or pause when they’re paying really close attention to the conversation. I’m one of those people who usually will pace around when talking, but I find when I really have to listen to someone, I stand still.
When we’re out on a walk and we want to listen for something— a bird, or something in the bushes— what do we naturally do? We stop. We still the self-generated noise that usually surrounds us, so we can better hear what’s going on outside ourselves. So this injunction to stop, stop, stop isn’t one that we only treat as a metaphor; in our daily lives, there’s an embodied aspect to concentration and listening as well. Listening requires slowing down, or being still.
Michael Schulson in Aeon writes about designing devices for addiction:
[S]hould individuals be blamed for having poor self-control? To a point, yes. Personal responsibility matters. But it’s important to realise that many websites and other digital tools have been engineered specifically to elicit compulsive behaviour.
A handful of corporations determine the basic shape of the web that most of us use every day. Many of those companies make money by capturing users’ attention, and turning it into pageviews and clicks. They’ve staked their futures on methods to cultivate habits in users, in order to win as much of that attention as possible. Successful companies build specialised teams and collect reams of personalised data, all intended to hook users on their products.
‘Much as a user might need to exercise willpower, responsibility and self-control, and that’s great, we also have to acknowledge the other side of the street,’ said Tristan Harris, an ethical design proponent who works at Google. (He spoke outside his role at the search giant.) Major tech companies, Harris told me, ‘have 100 of the smartest statisticians and computer scientists, who went to top schools, whose job it is to break your willpower.’
I met Harris not long ago, and it seems to me that we’re reaching a turning point in the way we talk about the addictive quality of devices and social media: it’s no longer sufficient to invoke dopamine and intermittent rewards, then shrug and assume either that these are inherent, unavoidable features of our technologies, or that they’re addictive because of flaws in our human programming, rather than effects that designers work hard to create. Behind every claim that some technology or technological feature is inevitable is someone working hard to make money off that feature, while also convincing you that it just happened, and there’s nothing to be done about it.
As “classic” spam has declined, it’s become clear that the internet in general – indeed, life in general – has become an awful lot spammier. Partly, this is simply because spammers have found ways to spam that don’t involve email, using texts, Twitter, Gchat and so on. But there’s a deeper point here, too....
If spamming is about abusing the resource of other people’s attention, the ethos of spam is everywhere: in clickbait headlines that promise far more than they deliver; in tweets that exploit the “curiosity gap” by tantalizingly omitting key information; in the daily email I now receive... from a clothing store where I once bought one shirt.
One of the most effective things I’ve done to get my phone to defend rather than attack my attention is to turn off as many notifications and alerts as possible. I started this a couple years ago, and now consider it essential. I have a super-quiet ringtone for people who aren’t on my “call in case of zombie apocalypse” list; the people who really matter in my life, in contrast, get the opening bars of Derek and the Dominos’ “Layla.” The virtue of this practice is that I can more easily ignore calls from people who I might or might not want to talk to, or might or might not have the bandwidth for. (This article provides an overview of why this is good. For those of you who have iPhones and want to try this for yourself, here’s how you set up whitelists, and here’s how you create custom ringtones.)
A new study from Florida State provides confirmation that I’m on the right track. In an experiment, the researchers had about 150 undergraduates take a test measuring their attention levels. In the test, students had to watch a screen and press a button every time a new number appeared, unless the number was 3. Measuring their response speeds, and whether they mistakenly pressed the button when a 3 appeared, gives you a measure of each participant’s attention level. You can see an example of the screen below:
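As a rough illustration (this is my own sketch, not the researchers' actual code or procedure), the scoring logic of a go/no-go test like this can be written in a few lines: you press for every digit except 3, and your errors and response times index how well your attention is holding up.

```python
# A minimal sketch of scoring a go/no-go vigilance task.
# Each trial is (digit, response_time_ms), with None meaning no press.
# Rule: press for every digit EXCEPT 3.

def score_vigilance(trials):
    """Return (commission_errors, omission_errors, mean_go_rt_ms)."""
    commissions = 0   # pressed when the digit was 3
    omissions = 0     # failed to press on a non-3 digit
    go_rts = []       # response times on correct "go" trials
    for digit, rt in trials:
        if digit == 3:
            if rt is not None:
                commissions += 1
        else:
            if rt is None:
                omissions += 1
            else:
                go_rts.append(rt)
    mean_rt = sum(go_rts) / len(go_rts) if go_rts else None
    return commissions, omissions, mean_rt

# Example: four trials, with one commission error (pressed on a 3).
trials = [(7, 412.0), (3, 390.0), (5, 430.0), (3, None)]
print(score_vigilance(trials))  # → (1, 0, 421.0)
```

Slower responses and more commission errors over the course of the session are the signature of flagging attention; in the study, a notification alone was enough to produce them.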
Here’s the abstract:
It is well documented that interacting with a mobile phone is associated with poorer performance on concurrently performed tasks because limited attentional resources must be shared between tasks. However, mobile phones generate auditory or tactile notifications to alert users of incoming calls and messages. Although these notifications are generally short in duration, they can prompt task-irrelevant thoughts, or mind wandering, which has been shown to damage task performance. We found that cellular phone notifications alone significantly disrupted performance on an attention-demanding task, even when participants did not directly interact with a mobile device during the task. The magnitude of observed distraction effects was comparable in magnitude to those seen when users actively used a mobile phone, either for voice calls or text messaging.
In other words, just knowing you got a call or text can be almost as distracting as talking on the phone. Or as The Atlantic explains,
The researchers found that performance on the assessment suffered if the student received any kind of audible notification. That is, every kind of phone distraction was equally destructive to their performance: An irruptive ping distracted people just as much as a shrill, sustained ring tone. It didn’t matter, too, if a student ignored the text or didn’t answer the phone: As long as they got a notification, and knew they got it, their test performance suffered.
“Our results suggest that mobile phones can disrupt attention performance even if one does not interact with the device,” write the study’s authors. “As mobile phones become integrated into more and more tasks, it may become increasingly difficult for people to set their phones aside and concentrate fully on the task at hand, whatever it may be.”
You can add this to the discovery that distraction in the classroom is contagious as another reason to encourage students to go device-free, and to encourage people to leave their phones in the office during meetings.
The Wall Street Journal has an article about how to keep your Apple Watch from distracting you. Some of the recommendations are similar to the ones I made in my mindful iPhone posts: use the VIP feature to make whitelists, turn off most notifications, delete useless or interruption-generating apps. Though ultimately, Joanna Stern says,
As angry as I’ve wanted to be at the Apple Watch for interrupting my life, it’s on me to limit its distractions. As technology becomes an extension of our bodies, we need to find our own controls; we need to resist burying ourselves in the digital world and stay present in the real one.
And no, I’m not getting an Apple Watch. Not only do I have an older, pre-Bluetooth 4.0 phone, but I’ve thrown my lot in with a Seiko dive watch (with NATO strap, natch).
A couple days ago I was interviewed on ABC Sunshine Coast's morning show about technology and distraction. You can now listen to the interview on Soundcloud:
For a short interview it covers a fair amount of ground. The interviewer asked good questions.
Two new books, The Organized Mind by psychologist Daniel Levitin, and Matthew Crawford’s The World Beyond Your Head, talk about the importance of learning how to intelligently offload memory and tasks onto your physical environment. This is something that we often do without much thought— anyone who’s written a note to themselves, or leaves a bag near the front door so they’ll remember to take it to work the next day, has done this— and indeed I talk about it a little bit in The Distraction Addiction. But they both make the case that this is a skill worth paying more attention to, and worth cultivating more consciously.
As Oliver Burkeman explains in a recent review, Levitin argues that lots of "information overload" problems are really problems of information management and attention management; and that seen this way, the solution
says Levitin, is to “shift the burden of organising from our brains to the external world”. Presidents and celebrities employ people to “narrow the attentional filter”, making sure they only see the stuff they need to see. But if you can’t afford an entourage, use the physical environment instead. Levitin’s specific tips might not blow your mind. One is to leave items you need to take to work on the doormat, so you’ll see them on leaving; another is to keep stacks of index cards for stray ideas and to-dos, then designate a time to gather and process them…. These aren’t revolutionary. Yet it’s intriguing to think of them not as one-off fixes for absent-mindedness but part of a comprehensive plan to structure how information flows through your life.
Crawford’s The World Beyond Your Head (an excerpt is here) explains how this isn’t just something that “knowledge workers” (gag) do: it’s something that good bartenders, cooks, and other people who have to juggle lots of tasks learn to do.
A bartender gets an order from a waitress: a vodka and soda, a glass of house red, a martini up, and a mojito. What does he do? He lays out the four different kinds of glass that the drinks require in a row, so he doesn’t have to remember them. If another order comes in while he is working on the first, he lays out more glasses. In this way, the sequence of orders, as well as the content of each order, is represented in a spatial arrangement that is visible at a glance. It is in the world, rather than in his head.
Consider a short-order cook on the breakfast shift. As he finishes his coffee, the first order of the morning comes in: a sausage, onion, and mushroom omelet with wheat toast. The cook lays out the already chopped sausage next to the pan, the onions next to the sausage, then the bread, and finally the mushrooms, farthest from the pan. He now has the ingredients in a spatial order that corresponds to the temporal order in which he will require them: once it gets hot, the sausage will provide the grease in which the onions will cook, and the onions take longer to fry than the mushrooms do. He places the bread between the onions and the mushrooms as a reminder to himself to start toasting the bread at such a time that the toast will be ready just as he is sliding the omelet out of the pan.
Crawford mentions the work of David Kirsh, particularly his essay “The Intelligent Use of Space,” and Andy Clark’s work on embodied cognition (which I’ve written about at length). I’m not at all surprised that he talks about these different crafts and jobs: anyone who’s a philosophy Ph.D. who makes a living as a motorcycle repairman will have a respect for varied types of work. Crawford’s last book, Shop Class as Soulcraft, upended our casual assumption that any kind of work that involves “content” is complicated, highly-skilled, and elite, while anything that involves mere stuff or machines or service is essentially for dullards.
If I were in Seattle, I’d go to this, and not just because the title name-checks my book: Matthew Crawford is speaking about his new book, The World Beyond Your Head.
Many point to our technology addiction (namely, the influx of smartphones and the internet) as the root of society’s lack of focus, but according to Matthew Crawford (Shop Class as Soulcraft) the problem goes much deeper. Crawford’s The World Beyond Your Head takes a historical approach to our mass distraction, revealing that the trouble can be traced to the very foundations of Western culture in the Enlightenment. From short-order cooks to gambling addicts, he examines success stories of the extreme focus many of us seek, but fail to achieve. He’ll share his findings on what it takes to refocus our lives by mastering our minds, and the implications this has for culture, democracy, and even how children are raised.
I spoke at Town Hall in 2013, when I was on tour for The Distraction Addiction, and it was a great venue.
And Seattle itself is a wonderful city, with audiences that are a pleasure to engage.
I’ve been talking about this concept for a while (most recently in the Penn Gazette, my alumni magazine), and have even created a few myself; the WNYC Bored and Brilliant campaign also made some. I like the homemade quality and whimsy of Molly's, though. Not that they're necessarily that easy to make: this one has that improvised look that I suspect takes hours to get right.
This one could be the cover of my next book, on rest.
I especially like this one because Mom just got a new smartphone, and is calling me regularly for tech support help.
Another new article of mine, this time in the Penn Gazette, which happens to be my alumni magazine. I really like the 1970s cybernetic thing going on in the accompanying illustration:
We know how to use tools; the problem is that our smartphones don’t know how to be good ones. Our natural inclination is to treat them as extensions of our selves—and sometimes that works just fine. For example, aside from those of my wife and children, I haven’t memorized a phone number in years, because I can trust my smartphone’s flawless recall. But other apps are designed to capture and resell my attention; and the more I interact with them, the better they get at distracting me. (I’m looking at you, Facebook.) Our phones are clever enough to grab our attention, but not smart enough to guard it, or know when we should be left alone.
The good news is, you can turn these weapons of mass distraction, these interruption amplifiers, into filters that protect your attention rather than compete for it.
Tuesday’s Bored and Brilliant challenge is an interesting one for me, because I agree with it, with some strong reservations. Here it is:
Your instructions: See the world through your eyes, not your screen. Take absolutely no pictures today. Not of your lunch, not of your children, not of your cubicle mate, not of the beautiful sunset. No picture messages. No cat pics.
Now, as someone who loves taking pictures, and who has written a lot about digital photography, I think that mindless photography is a bad thing.
But I think all mindless activities are bad. I even wrote a book about why they’re bad, and what to do about them.
However, mindful photography is a great thing, and I think this challenge should focus less on giving up pictures entirely and more on avoiding mindless picture-taking— photographing our sandwiches out of habit— in favor of mindful photography.
There’s a long history of criticizing the practice of continuous documentation. Sherry Turkle had a piece in the New York Times in 2013, for example, about how constant photography “interrupts experience to mark the moment,” and “makes us accustomed to putting ourselves and those around us ‘on pause’ in order to document our lives.”
Of course, other people have found this kind of attitude off-putting, as this classic XKCD comic explains:
But as Australian philosopher Damon Young put it in an essay in 2013, the problem isn’t so much with the quality of the picture, but rather with how "ubiquitous photography can be a distraction from a more fraught, awkward or intense response to life:"
So the problem is not necessarily the imagery – it's the avoidance it enables. And the technology does not force us to do this. It is a human, all-too-human urge for ease: instead of confronting life, we turn away to a kitsch scene with a schmick filter….
The point is not to shun technology – this idea is simply more distraction, in a romantic key. The point is to reflect on how it's used: to savour the run, or to just keep running away.
Certainly many of the photographs we take are not Cartier-Bresson level achievements, and many aren’t meant to permanently document events or people (indeed, you rather hope that that revealing selfie actually vanishes into the Snapchat nether void).
But I think Young is onto something, because in my experience it’s absolutely the case that you can use photography— even iPhone photography, or somewhat gimmicky programs like Hipstamatic— to see the world more deeply.
When I was in England, and taking a lot of pictures, I became very aware (as I explained at the time at great length) of how having a camera made me look at my surroundings more intensely, pay attention to materials and colors and shadow, and appreciate even the wintry beauty of the landscape.
Having the camera didn’t just allow me to record what was a life-changing trip; it made me more engaged during it.
So photography can take us out of the world. Or it can encourage us to pay closer attention to it.
But the camera itself doesn't change the way we see the world; learning to use the camera, the practice of seeing the world through it, is what changes us. That’s an important distinction, because it places the agency, the ability to change, back with us.
So the aim should be to think a bit more about whether a picture— or taking a picture under some circumstances— is an act that interrupts what you’re doing, or flows with it; whether it redirects your attention away from the subject, or encourages you to look at it more deeply. If it’s the latter, go for it.
This kind of mindful photography matters these days because it’s a good place to learn a more contemplative attitude toward technology: it shows how you can become more mindful both about technology and while using it.
Of course, humans are no longer the only species that takes lots of pictures:
this monkey stole an iPhone. you won't believe what happens next
So maybe we really can all benefit from this challenge, so long as we use it not to renounce technology, but to become more thoughtful about it, and to think about the virtues of using it more mindfully.
Yahoo! tech columnist Rob Walker takes a gentle prod at Bored and Brilliant:
Finally! After decades of tech innovations aimed squarely at squishing boredom out of existence by offering a nonstop barrage of distractions, we are ready to technologize boredom itself.
The Bored and Brilliant people seem to be taking my suggestion about lock and home screens as nudges to heart: their Flickr account now includes this lock screen:
The idea of using the lock screen as a nudge is one I've been talking about for a while, and I've used a couple different messages to remind me to use my phone more mindfully. It works.
WNYC’s Bored and Brilliant challenge starts today.
What’s on the agenda?
As you move from place to place, keep your phone in your pocket, out of your direct line of sight. Better yet, keep it in your bag.
I would think that if there were one place in the world you could wander around without looking at your cellphone, it would be New York, but as host Manoush Zomorodi recently found, a third of people on the streets are looking down at their phones while walking. (In my experience the number is astronomical on subways.)
The podcast features an interview with me, which we conducted a few weeks ago.
You can listen to it below. Manoush and her team did an excellent job editing it.
It concludes with several suggestions for how to better manage your phone, using whitelists, special ringtones, and so on. It was fun.
I really like the Bored and Brilliant challenge because, unlike many “put down your phone and get back to the real world” sorts of challenges, Manoush and her team seem intent on providing listeners with advice about what to do instead of checking their mail a dozen times an hour. Too often these campaigns treat digital distraction as a moral failing that simply requires Being A Better Person; the Bored and Brilliant approach is more constructive.
It’s also perfectly balanced between my last book and my next one. As I said in another recent interview, while The Distraction Addiction is about the benefits of mindfulness, the next book is about the benefits of mind-wandering— and how digital technologies do a brilliant job of intruding on both, by offering diversions that seep into our time as effectively as water into a basement.
Mindfulness and mind-wandering don’t just share a mutual enemy. They’re linked to each other. (By mind-wandering I mean not distraction— having your attention drawn to B when it should be on A— but rather allowing your mind to focus on nothing in particular, leaving it free to attend to what it wants, without conscious effort.) The evidence I’m seeing is that people who are capable of concentrating really hard on a subject are also very good at intentionally disengaging their minds; in effect, improving your ability to do one improves your ability to do the other.
So to be brilliant, it seems, you must be bored.
The Distraction Addiction
My latest book, and the first book from the contemplative computing project. The Distraction Addiction is published by Little, Brown and Co. It's been widely reviewed and has garnered lots of good press. You can find your own copy at your local bookstore, or order it through Barnes & Noble, Amazon (check B&N first, as it's usually cheaper there), or IndieBound.
Empire and the Sun
My first book, Empire and the Sun: Victorian Solar Eclipse Expeditions, was published by Stanford University Press in 2002 (order via Amazon).