Contemplative computing may sound like an oxymoron, but it's really quite simple. It's about how to use information technologies and social media so they're not endlessly distracting and demanding, but instead help us be more mindful, focused and creative.
My book on contemplative computing, The Distraction Addiction, was published by Little, Brown and Co. in 2013, and is available in bookstores and online. This 2011 talk is a good introduction to the project and its big ideas.
This is an Onion parody, but who hasn't thought about how much money they could make selling out their user base?
I can't cover the whole world, so I only just found out about Offtime, a project out of Berlin to develop an app that manages calls, texts, and the like while you go offline. The crowdfunding campaign describes what it does:
With ( OFFTIME ) you are the one to decide what is important right now: Keep out all apps, phone calls and text messages that distract you and go back to concentrating on your work, a chat with a friend or simply a moment of peace and quiet. ( OFFTIME ) will look after everything else. That way you can be more efficient or recharge the batteries properly.
Or you can just look at the video.
In principle, this seems like exactly the kind of app lots of people need. Of course, it's the sort of thing that'll only work if people recognize the need for it. Indeed, Offtime cofounder Alexander Steinhart argued at TEDxEutropolis 2013 that
A key thing from my perspective is that we have to have a broader public awareness about these issues. Because only when we have reached this awareness and recognition, we will not only be able to individually make more informed decisions, but also develop and use a new form of technology: technology that doesn’t exploit us, but that does support us.
This dovetails nicely with my discoveries about Zenware and why it works: not just because of the functional properties of Freedom or LeechBlock, but because of the attitude that users bring to the programs. Zenware works, in part, because users want it to work. But it only works for those who recognize a problem and have a desire to change.
Sherry Turkle recently published an op-ed in the New York Times about selfies and "The Documented Life." In it, she argues that the focus on taking selfies, on self-documentation, and our willingness to interrupt our lives in order to document our lives gets in the way of living:
A selfie, like any photograph, interrupts experience to mark the moment. In this, it shares something with all the other ways we break up our day, when we text during class, in meetings, at the theater, at dinners with friends. And yes, at funerals, but also more regularly at church and synagogue services. We text when we are in bed with our partners and spouses. We watch our political representatives text during sessions.
Technology doesn’t just do things for us. It does things to us, changing not just what we do but who we are. The selfie makes us accustomed to putting ourselves and those around us “on pause” in order to document our lives. It is an extension of how we have learned to put our conversations “on pause” when we send or receive a text, an image, an email, a call.
Why is this a problem? Because, Turkle argues, "When you get accustomed to a life of stops and starts, you get less accustomed to reflecting on where you are and what you are thinking." It makes it less likely for us to "have serious conversations with ourselves and with other people," or simply "sit still with our thoughts."
But all is not lost:
It is not too late to reclaim our composure. I see the most hope in young people who have grown up with this technology and begin to see its cost. They respond when adults provide them with sacred spaces (the kitchen, the family room, the car) as device-free zones to reclaim conversation and self-reflection.
Now, to readers of Alone Together, this will sound familiar. Turkle argues in that book that too often, interaction with technology serves as an easy, self-distracting substitute for the harder but ultimately more fulfilling work of interacting with people. The painful irony is that while these technologies offer the opportunity to make us more "social," they leave us less capable of being social.*
Jason Feifer at Fast Company dismissed her as "scaremongering" in an essay titled "Google Makes You Smarter, Facebook Makes You Happier, Selfies Make You A Better Person." His response to Turkle:
Have you ever pulled your friends together for a photograph, and struck a pose? Have you ever crowded heads with your boyfriend or girlfriend, making faces into a camera to make each other laugh? These aren’t interruptions of a moment; they are a moment. Some selfies are frivolous, of course, but so are some conversations. Technology can create togetherness....
Turkle imagines that any interaction with technology somehow negates all the time spent doing other things. She also imagines that we must devote ourselves in only one way to every task: At a dinner table, we are only serious and focused on conversation; at a memorial service, we are only mournful.
The message is reinforced by the article's banner picture, a kind of visual illustration of an Upworthy level of amazement:
This woman used the Internet. You won't believe what happened next.
There are two points I want to make. First, critics are missing an important part of Turkle's argument. And second, everyone in this argument has an unhelpful model of how technologies affect us.
Selfies vs. Self-Understanding
First of all, Turkle isn't arguing that selfies-- or other technology-mediated engagements with others and the world-- are always bad. Believe me, she's too smart for that. What this essay argues is that this kind of easy self-occupation and self-distraction reduces the space in which we can make sense of ourselves. Note that she talks about having "serious conversations with ourselves" as well as others. And why is sitting "still with our thoughts" important? Because it "does honor to what we are thinking about," and "does honor to ourselves."
That is the central point of this article. For an academic like Turkle, the notion that the unexamined life is not worth living (as the Greeks would have put it) is self-evident. This is one of the oldest and most important functions of philosophy and higher education, and cutting it off-- substituting selfies for self-understanding, as it were-- is one of the most destructive effects of distracting technology and social media.
The idea that you need periods of silence, or quiet, or leisure, or slowness, to make sense of your life is not exactly new. It's a line of argument that you can find in Thomas Merton and every other contemplative who's ever said more than two words. Unfortunately, it's a point that Feifer and other critics miss in their eagerness to defend Facebook.
So they overlook the main point of Turkle's argument in favor of the shiny-blinky distraction of "Internet BAD!" This is Doge-level analysis.
The Vitamin Theory of Interaction
But if they elide her main argument, they end up replicating one of her flaws. Feifer is right that Turkle proposes a binary choice: you can either use devices that bleed you of your capacity for deep feeling and useful boredom, or you can drop them and reclaim your humanity.
The problem with both Turkle's essay and Feifer's response is that they treat our relations with technology as something like Newtonian physics, describable by laws of cause and effect: using this technology does that to us. It reflects what I'm starting to think of as the Vitamin Theory of Interaction With Technology: that our interactions with technology are as predictable and easy to understand as the effects of vitamins on our bodies. Vitamin A does this, Facebook does that; C does this, Twitter does that.
Turkle, in her New York Times piece, focuses on the downsides, and on how self-distraction is ultimately self-destruction, while Feifer emphasizes the upsides of connecting with others through technology. But both frame their arguments in a language of cause and effect.
The problem is that this model is utterly wrong. Context and intention, for example, really matter in shaping how we use technologies, and what we get out of them. As I talk about in my book, people who like Zenware like it not just because of the formal benefits or the cool design. They like it because it reminds them of the need to focus, and serves as an external expression of a commitment to concentrate. For Buddhist monks who live in monasteries, the fact that the sacred and profane are one screen away from each other presents a challenge to be extra mindful when online.
To take an example from the end of Feifer's essay:
Is it good that a dad pays more attention to Google than his daughter? No, of course not. That guy should put down his phone. But we should be careful about distinguishing between actual problems and perceived ones. The phone is not, by itself, the problem. In fact, if that dad closed his Google app, opened his camera app, and took a selfie with his 14-year-old daughter, I bet she’d be thrilled.
Or she might react the way my daughter reacts.
In other words, while it would be convenient if our interactions with technology were like physical phenomena, they have a phenomenology that we cannot ignore.
Photography can take us out of the world, or encourage us to pay closer attention to it. But the camera doesn't change the way we see the world; learning to use the camera, the practice of seeing the world through it, is what changes us. Practice, not the product, is what affects us. It's not phenomenon, it's phenomenology.
*Another issue I had with Alone Together was that the book's core studies involved groups who are socially marginal in ways it never acknowledges: its description of a project to give robotic seal-like creatures to elderly people in nursing homes doesn't talk about how this population can often be socially isolated, dealing with higher-than-average rates of depression, and thus more likely to want to interact with both a robot seal and graduate students running the experiment. Trying to extract Big Lessons About The Modern Condition from studies of little kids and isolated elders seems a little problematic to me.
On the heels of Kathryn Schulz’s essay on the impact of Twitter on her writing comes high school teacher Andrew Simmons’ equally thoughtful explanation of how "Facebook Has Transformed My Students' Writing—for the Better:"
As a high-school English teacher, I read well over a thousand student essays a year. I can report that complete sentences are an increasingly endangered species…. However, while Facebook and Twitter have eroded writing conventions among my students, they have not killed the most important ingredients in personal writing: self-reflection and emotional honesty. For younger high school boys particularly, social networking has actually improved writing – not the product or the process, but the sensitivity and inward focus required to even begin to produce a draft that will eventually be worth editing….
Many of my students grow up in households in which machismo reigns supreme. They've never been allowed to cry. Their mothers and sisters cook and wash the dishes and clean. They've been encouraged to see themselves as dominant, powerful, swaggering, sullen men, not sensitive and reflective men, powerfully kind, confidently open. Fostering those traits is a woman’s responsibility, like housework. In this sense, Facebook is a genuine outlet for the young men I teach. Just as social networking frees users from public decorum and encourages the birthing of troll alter egos, it allows my students to safely, if temporarily, construct kinder, gentler versions of themselves as well.
The great news is that this has a positive effect on teaching and learning. My students in 2013 are more comfortable writing about personal issues than were my classmates in the mid-late '90s. When I assign narrative essays, students discuss sexual abuse, poverty, imprisoned family members, alcoholic parents, gang violence, the struggle to learn English in America – topics they may need to address, not merely subjects they believe might entertain or interest a reader.
For Schulz (and for me), tweeting and blogging offer some of the same emotional rewards as what I think we'd both classify as "real" writing (i.e. paid and published, but mainly paid-- true writers are completely mercenary about their work), but in the psychological equivalent of a low-nutrition, energy-shot form. At my age, and because of the way I use it, Facebook isn't about self-discovery so much as self-presentation and keeping up with a few friends. But if I were younger, I can imagine it working the way Simmons describes.
The contrast between these essays serves as a reminder of how differently the same medium can affect users at different stages in their lives, or with different skills. It reminds me of this recent study (behind a paywall) of the impact of video games and TV on children's psychosocial adjustment: yes, it concludes that playing games three hours per day or more has no discernible impact on children's levels of aggression or sociability, but it covers 5- and 7-year-olds, who probably aren't playing a lot of Quake 4 or GTA 5. Assuming these kids are playing games that are age-appropriate, applying the results to older kids, or to violent games, or to "gaming" generally (as if that were a useful category) involves making a bunch of leaps that probably aren't warranted.
I’ve never before gone mad for any type of technology. Even the Internet did not particularly seduce me before the Twitter portal. I used it only for e-mail, and for targeted research; as recently as 2009, I probably spent, on average, under 30 minutes a day online. I didn’t have a cell phone until 2004, didn’t have a smartphone until 2010. I only got addicted to coffee three years ago. But then along came that goddamned bluebird, which seems to have been built with uncanny precision to hijack my kind of mind....
It's one of the most thoughtful pieces about the upsides and downsides of Twitter I've read.
Part of what makes it attractive is not that it's simply a distraction; rather, "it’s sufficiently smart and interesting that spending massive amounts of time on it is totally possible and semi-defensible." It occupies a spot somewhere in the "exploration" region of the graphic I described a couple years ago.
The particular strength of the piece is that, as well as anything I've seen, it explains why Twitter is especially appealing to writers-- and why that appeal is a two-edged thing:
A tweet is basically a genre in which you try to say an informative thing in an interesting way while abiding by its constraint (those famous 140 characters) and making use of its curious argot (@, RT, MT, HT). For people who love that kind of challenge — and it’s easy to see why writers might be overrepresented among them — Twitter has the same allure as gaming. It is, essentially, Sentences With Friends.
It's a bit like speed chess as described in Searching for Bobby Fischer. At one point the chess master (Ben Kingsley) explains to his young student that speed chess is fun, but destroys your ability to play serious games.
I am convinced that steadily attending to an idea is the core of intellectual labor, and that steadily attending to people is the core of kindness. And I gravely worry that Twitter undermines that capacity for sustained attention. I know it has undermined my own: I’ve watched my distractibility increase over the last few years, felt my time get divided into ever skinnier and less productive chunks.
More disturbing, I have felt my mind get divided into tweet-size chunks as well. It’s one thing to spend a lot of time on Twitter; it’s another thing, when I’m not on it, to catch myself thinking of — and thinking in — tweets. This is a classic sign of addiction: “Do you find yourself thinking about when you’ll have your next drink?” etc. In context, though, it’s more complicated than that, because thinking in tweets is only a half-step removed from what I’ve done all my life, which is to try to match words to thoughts and experiences. The job of a writer is to do that in a sustained way — a job I find brutally hard, and, when it works, deeply gratifying. The trouble with Twitter is that it produces a watered-down version of that gratification, at a very rapid rate, with minimal investment — and, if I am going to be honest with myself, minimal payoff, and minimal point.
It's not just that training your brain to write in 140 characters can wither the capacity for serious, sustained concentration that every writer finds essential to writing books; the very public, immediately rewarding nature of the writing is also a problem. Schulz talks about spending years in a virtual cave when she was writing her book. "In my experience, and the experience of most writers I know, that cave is the necessary setting for serious writing," she says. "Unfortunately, it is also a dreadful place: cold, dark, desperately lonely." (I agree completely that it's essential for every writer, but I find it a more felicitous place; still, I get what she means.)
Twitter, by contrast, is a warm, cheerful, readily accessible, 24-hour-a-day antidote to isolation. And that is exactly the issue. The trouble with Twitter isn’t that it’s full of inanity and self-promoting jerks. The trouble is that it’s a solution to a problem that shouldn’t be solved. Eighty percent of the battle of writing involves keeping yourself in that cave: waiting out the loneliness and opacity and emptiness and frustration and bad sentences and dead ends and despair until the damn thing resolves into words. That kind of patience, a steady turning away from everything but the mind and the topic at hand, can only be accomplished by cultivating the habit of attention and a tolerance for solitude.
Schulz is arguing that by removing the need to isolate yourself to write, offering an alternative that's social, and providing immediate emotional rewards (but no financial ones), Twitter lets writers practice a version of their craft that is fun, that is occasionally valuable enough to keep doing (there's our old friend Mr. Intermittent Rewards!)-- but that can also make it harder for them to do more serious writing.
Not that she has any plan of giving it up (any more than I do). The challenge is to learn to use it thoughtfully, to observe how your usage can unintentionally affect you, and to retake control over it when it tries to push you in a different direction.
Your dose of anti-alarmism, from Florida teacher David Cutler, writing in The Atlantic:
Last year, I discussed former New York Congressman Anthony Weiner’s sexting scandal with seniors in my United States Government course.
We not only considered the ramifications of Weiner's actions-- and how his inappropriate use of Twitter had truncated his political career-- but I also asked my students to examine their own use of social media....
My goal isn’t to scare students away from using social media, which can be an extremely useful tool. I just want them to use it wisely. As a teacher, I believe it’s my job to teach my students about digital literacy and citizenship, equipping them with the tools to navigate an increasingly open digital world. This means making students aware of potential pitfalls and helping them to make good choices with current and emerging communication platforms.
Read the whole piece. It's short but good.
Mindful magazine has an interview with child and adolescent psychiatrist Tristan Gorrindo about teens and social media. It has a nice overview of how teenage social and neurological development can create problems with social media use-- how identity formation, the growing complexity of relationships with parents, and impulsivity can make life online (and life in general) difficult-- as well as a couple other good observations:
parents who don’t understand the technology... lean towards either setting totally strict limits (as in: “You’re not going to be on the Internet at all”), or they are totally laissez-faire. They don’t understand the tech, don’t know how to set the limits, and so they don’t.
Gorrindo also notes the futility of keeping kids offline:
Abstinence is not an option. You’re better off teaching your kids how to use these tools effectively and appropriately and help them navigate the times when they are going to trip and stumble as opposed to just trying to insulate and wall them off.
Personally, I think this is very smart, and exactly the right advice. As parents of older kids, our responsibility is less to protect children from dangers and hardships and complexity (certainly the last two), and more to prepare our kids to deal with them. In effect, our job is to make our children independent enough to function without us always being around, because at some point we won't be (and probably won't want to be).
Further, using technologies well can be a real pleasure, can make you smarter, and make you a better person. Teaching kids that information technologies are bad is like teaching them that food is bad, or sex is inherently evil: you can make the case, but over the long run the wiser thing to do is to teach kids how to make good, self-aware choices.
It also reflects the fact that the most powerful way to get kids to use technologies well is to model good behavior, and to reflect on your own experience. Unless you've grown up in a prison camp in North Korea and only recently moved here, you've grown up with computers and video games, and you know their attractions and downsides better than your children do. The fact that you remember when cellphones weighed two pounds and only made calls doesn't make you obsolete or mean you have no authority to help your kids find their own balance with technology: it means you have a valuable perspective that they don't.
This almost makes me want to buy Grand Theft Auto V: its in-game version of Twitter, called Bleeter. Here's the description:
Information isn't about imparting knowledge anymore. The Internet changed all that.
Welcome to the world of self-aggrandizing shorthand. Keep strangers and people you hated in high school up to speed with every mundane detail of your life 24/7. Welcome to the delusion of having an interesting life and friends.
Bleeter is the perfect storm of blogging, social networking and text messaging. We're demolishing 100,000 years of complex linguistic development 140 characters at a time.
Sometimes parody really does offer a way to speak the truth.
Slate contributor Steve Kolowich writes about going through-- and often deleting-- old Facebook posts, likes, and messages:
A wall post, comment, or "like" stops being useful once everyone involved is done enjoying the fleeting rush of having been publicly acknowledged, in some lightweight way, by another person. Yet Facebook holds on to these data points indefinitely, working them over for information, like old confidants turned informants.
In the end, however, that righteousness rings hollow. Only you can see your Activity Log. And besides, Facebook only cares to understand your past in a way that will help predict what advertisements you might click on in the future. It doesn't care that you used to post self-indulgent status updates while sniping at other people's grammar. It doesn't care that you posted in support of gay rights while trading homophobic jabs with your friends. It doesn't care that you shamelessly flirted with other women on their walls while your girlfriend was posting notes on yours, writing in Swedish, counting down the days until you would visit her in London.
We care. That is what makes Activity Log so discomfiting. We dread being taken out of context. But a lot of context can be too much to bear.
I had a similar experience when I saw a demo of Futureful, a great app that searches the Web for things it thinks you're interested in. To do that, it dives into your Facebook and Twitter accounts (with your permission) to see what you post about. As I write in my book about seeing it analyze me:
I’ve always assumed that my Facebook and Twitter pages reflect my true self. But when I look at the Futureful algorithm’s profile of my interests—its sense of who I am, its snapshot of what I look like online — I’m puzzled at first, then actually alarmed.
The person Futureful thinks I am is very interested in politics; most of its recommendations are from partisan American Web sites or European news sources. (To the program’s credit, most of them are new to me; the system is doing what it’s supposed to.) According to Futureful, I’m also very cynical. It thinks I like to read about corruption, scandals, and disasters caused by shortsightedness and greed. There’s nothing about history, design, computer science, or futures. Nothing about Buddhism or religion. Nothing about science. This person is an observer of the follies and stupidity of mankind, a cybernetic H. L. Mencken.
If I were talking to this person at a party, I’d concoct an excuse to get away from him.
It actually inspired me to change the way I use those services.
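If you're curious how an app could build that kind of profile, the core move-- matching the words you post against topic keyword lists-- is easy to sketch. This is a minimal illustration, not Futureful's actual algorithm; the topics and keywords here are invented for the example:

```python
from collections import Counter

# Hypothetical topic keyword lists; Futureful's real model was surely more sophisticated.
TOPICS = {
    "politics": {"senate", "election", "congress", "scandal", "corruption"},
    "buddhism": {"meditation", "mindfulness", "zen", "dharma"},
    "science": {"research", "study", "physics", "biology"},
}

def interest_profile(posts):
    """Score each topic by how often its keywords appear across a user's posts."""
    counts = Counter()
    for post in posts:
        words = set(post.lower().split())
        for topic, keywords in TOPICS.items():
            counts[topic] += len(words & keywords)
    return counts.most_common()

# The profile reflects what you happen to post about, not necessarily what you care about.
print(interest_profile([
    "Another corruption scandal in the senate",
    "Great meditation session this morning",
]))
# [('politics', 3), ('buddhism', 1), ('science', 0)]
```

Even this toy version makes the lesson of my Futureful moment concrete: a profile built from your posts mirrors what you broadcast, not who you are.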
The one good thing that's come of my paying any attention to Sean Parker's post-wedding social media tempest is that, via New York Magazine, I found out about unplugged weddings. (Since I'm currently at an age when my friends aren't getting married, and my kids' friends aren't old enough, I don't go to many of them these days.) San Francisco-based photographer Shang Chen explains the case for unplugged weddings:
When the involvement of cell phones, smartphones, iPads got to be too much, I had to put my foot down. There’s a reason why about half my couples this year chose to have an unplugged wedding ceremony. It’s to bring the wedding back to what it is really about: The emotions and beauty of the marriage itself. And in that, electronics have no place.
Shang recommends doing a few things to make this palatable to guests: making it "clear to the guests that you want them to be present;" giving them "other ways to get photos with you," such as a photobooth at the reception (never seen that), or keeping their hands busy with "something fun for the exit." This keeps it from being about banning picture-taking, or enforcing a professional monopoly, but positions a ban on picture-taking during the ceremony as an effort to remove a distraction from the event-- just like not answering your phone during the vows.
That's the main idea behind unplugged weddings: that unless you're a professional, picture-taking is at odds with your ability as a guest to savor the event, and is an imposition on others. As another photographer argues,
By bringing a camera and using it, you are not only excluding yourself from THE moment in the hopes that you’ll take one good picture, but you create an emotional void where you (as a friend or relative) should have been as part of the wedding. You are absent from the very mission that was trusted on you: to be part of the collective celebration of the love of two people.
My experience at a Maroon 5 concert was that picture-taking was less of a distraction than I expected, but there the band themselves were playing behind a giant set of screens; unless the wedding music is playing at 90 dB and the altar is showing videos of the couple rappelling out of matching helicopters, I can see how taking pictures is going to be a bit of an issue. And even though I've found that photography has made me look at the world more closely, I can see how during a wedding my taking pictures could impinge on others' experience.
However, not all wedding photographers agree. Mitchell Dyer, for example, argues that
Technology is everywhere. It makes things easier, we can’t live without it and we all love to take photos and share them….
The guests are there to have fun and to celebrate the couple. They are there to live in the moment. In this day in age, living in the moment includes capturing every second with images and video clips with smart phones and ipads. I have zero problem with this. In fact, I encourage it. Embrace it. It’s not changing.
Maybe the most interesting counterarguments against unplugged weddings, though, come from Andrea Grimes. First,
I don’t think Facebook and Twitter transport most people out of the real world and into some sterile digital sphere, devoid of meaningful human interaction. I think most people use Facebook and Twitter and Vine the way they use their mouths and their arms and their facial expressions: to comment on and create their experiences of the world around them…. We don’t tweet and text message and Vine and Instagram in spite of our memories, but rather we use those technologies to bookmark them and bring them into focus.
In effect, she argues, picture-taking isn't a kind of distraction; it's a form of focus.
Second, it's a form of collective memory, which is especially valuable at a wedding, which for the bride and groom often feels like an out-of-body event. "[F]lipping through my friends’ Twitter and Facebook feeds" the next day, she said, let her "see how they experienced the evening... while I was off playing bride, daughter, niece, friend, hostess, and general social butterfly."
Of course, I take issue with Dyer's idea that we can't change; here, I think the issue is how much we can change others, or whether in the context of a wedding asking other people to put away the cellphones during the ceremony is either 1) an invasion of their personal space, or 2) a sign that you're a control freak (or as Andrea Grimes puts it, you think your wedding is "such a magical and important event that regular people can’t take it all in without the careful direction of the couple").
Personally, I think there's nothing at all wrong with telling guests to refrain from picture-taking during the ceremony, any more than it's wrong to expect them to stand when the wedding party walks down the aisle, or to not talk during the vows. If their concern is to have pictures of the ceremony itself, designate one non-professional to take the pictures that go up on Facebook in near real-time. People will have lots of opportunities to take pictures later.
The deepest discipline on the American scene today is silence, because we are absolutely consumed with insane babble.... I've made conscious decisions not to participate in most [social media]. Why? Well, our biggest problem is distraction.... Why should I contribute to the constant noise? Let's learn the discipline of large periods of silence.
People don't quite understand that, but it's a spiritual problem when we send out 150 tweets every hour and a half, just to fill our own ego. Why in the world do people want to know what I eat for breakfast? It doesn't help them.
I'm so glad the Bible does not say in the first chapter of John, "and the Word became a tweet, and twittered among us." People can tweet if that helps them be relational to someone. But the idea that we have to reduce language that way...
I've mentioned Foster before, and I think his book Celebration of Discipline is a terrific introduction to contemplative practices that get too little attention. Ironically it's possible to grow up in America-- even in the religious South, as I did-- thinking of meditation as an exclusively Buddhist or Zen practice, and hear very little about centering prayer, or the spiritual exercises of St. Ignatius, or other forms of Christian contemplation. Though perhaps Pat Robertson and Jim Bakker and Jerry Falwell-- the leading evangelical figures of my youth-- talked about lectio divina and the Cloud of Unknowing, and I missed it.
"Everyone talks about deleting their Facebook account, but we rarely take action," Social Roulette co-creator Kyle McDonald says. "Sometimes we need a simple game to help take the responsibility off our shoulders, and provide a moment for reflection." So he and friends created Social Roulette, which as TechCrunch explains is "an online version of Russian Roulette, the lethal real-life game where a player places one bullet in a six-chamber revolver pistol, spins the cylinder, and fires the gun at their head."
Social Roulette works on your virtual self: play, and you have a one in six chance of having your Facebook account deleted. When I explained it to her, my wife asked, "Is that winning or losing?"
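The mechanic itself is as simple as the analogy suggests. Here's a minimal sketch of the one-in-six logic-- not the app's actual code, with a hypothetical delete_account function standing in for whatever the real app did to a Facebook profile:

```python
import random

def delete_account():
    # Hypothetical stand-in for whatever the real app did to delete a profile.
    print("Bang. Your Facebook account is gone.")

def social_roulette():
    """One chamber in six is loaded, as in the game's premise."""
    if random.randint(1, 6) == 1:
        delete_account()
    else:
        print("Click. You live to post another day.")

social_roulette()
```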
As of May 11th, Facebook has disabled the app's API key, on the grounds that it was "creating a negative user experience." No kidding.
...and now it has. On Tumblr, natch.
I don't have much to add to Evan Selinger's Wired piece about Facebook Home and the message of its advertising videos:
to be cool, worthy of admiration and emulation, we need to be egocentric. We need to care more about our own happiness than our responsibilities towards others.
Take, for example, this advertisement for how to make dinner more fun-- by ignoring your family:
There are several things that jump out at me in this.
First, everyone, and I mean EVERYONE IN THE UNIVERSE, can tell when you're looking at your phone under the table. That thing where you put both hands under the table, and there are little clicking noises, and your face glows? Yeah. People can tell.
But the conceit that she's getting away with it is part of the bigger point of the ad. Not only is it egocentric, it's a celebration of a certain kind of psychopathic entitlement: the idea that if the mark is dumb enough to trust you, he deserves to be taken. Getting away with it is proof that it's okay to do it.
That casual cruelty is what lets her ignore her boring aunt in the first place, of course. But I wonder: what's the aunt's story? Is she just a terrible, boring person? Or is there something else here that inspires everybody else to be patient and polite to her? Is she getting over a stroke, and only recently recovered the ability to talk? Maybe this is the most social she's been in ages. Maybe she's a widow and has no one to talk to. Maybe something else is going on, and everyone recognizes that this performance, boring as it looks to us, is actually kind of a milestone.
Well, almost everyone.
But I detect a note of melancholy, one that Evan missed, running through this ad and its companion piece, "Airplane."
Selfish Dinner Girl is at home, trapped in the conventionalized, normative bourgeois paradigm of the family dinner. Airplane Dude is settling into the Economy (otherwise known as "self-loading cargo") section of a plane. As Airplane Dude checks to make sure his seat back and tray table are in their full and upright position, one set of friends chillaxes on the beach, another seems to be doing a burlesque show in drag, his nephew is ODing on chocolate cake, and some other friends are raving in Norway's answer to Ibiza. Selfish Dinner Girl's friends are rocking out, dancing with the Bolshoi, having a snowball fight outside their condo in Vail.
And while Selfish Dinner Girl is hitting Like like it's going out of style, part of her is thinking, You're not there. You weren't invited.
She's got to wonder: When your friends come in from the cold and settle into the oversized leather sofa with mugs of hot cocoa and Facebook Home, what will they see from you?
Stuck here with boring family :-(
If in the attention economy your role is to be a distraction for your friends-- if they see you the way you see them-- are they going to be impressed? If you're not delivering tasty treats for their monkey minds, what value are you creating for them?
Enjoy your "friends" while you have them, Selfish Dinner Girl. They're having lots of fun without you. Sooner or later, they're going to realize they don't need you any more than you need your boring family.
All too true:
After two days without Twitter, I barely missed it; by the second week, I was downright happy not to be thinking about "staying on top" of my feed. I've uninstalled Tweetdeck from my phone, and going forward will only use Twitter to post links to my own blog posts. So my first piece of advice is that you should just stop using Twitter altogether, or find a way to show only those tweets that contain links.
While I tend to check in on Twitter a couple times a day, I agree completely that the feeling that it's necessary to stay on top of your feed is ridiculous. It's one of those kinds of manufactured urgency that we use to make ourselves feel busy or important, or just like everyone else who's overworked and stressed and perpetually behind.
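As for the "only tweets that contain links" filter he mentions: the idea is simple enough to sketch. A minimal illustration, assuming your timeline is available as plain strings rather than objects from any particular Twitter client:

```python
import re

# Crude URL detector; real tweets carry entity metadata that is more reliable.
URL_PATTERN = re.compile(r"https?://\S+")

def tweets_with_links(tweets):
    """Keep only the tweets that contain at least one URL."""
    return [t for t in tweets if URL_PATTERN.search(t)]

timeline = [
    "Just had the best sandwich of my life",
    "New post on contemplative computing: http://example.com/post",
]
print(tweets_with_links(timeline))  # only the second tweet survives
```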
A few weeks ago, I took Facebook and Twitter off my iPhone (though I replaced Facebook with FB Messages), and haven't missed them. No, that's not true. I have missed them occasionally, but I have no regrets about removing them, and don't plan to reinstall them. The phone is just not a congenial platform for reading-- and especially writing-- tweets, any more than it's a good platform for long-form composition. Writing 140 characters that are worth reading takes enough effort; I don't need the additional work of battling the tiny screen of my iPhone.
And, just as important, there's simply no situation in my life where I need to tweet in real time. None at all.
I'm in the midst of another experiment with my phone: I've created a 40-second ringtone that consists of silence, and assigned it to anyone who's not on a short list of family, close friends, and colleagues; the people on that list get a purposely loud ringtone, and an alert when they email or text me. We think of being always-connected as a great thing, and it can be-- if you're ruthless about who you're always connected to. The mobile phone makes no distinction between people who matter to you and people who don't; it defaults to overconnection. Even the iPhone is pretty unsubtle in this regard: it's easy to silence all calls, but you can't make a whitelist; you have to go person-by-person, changing their default ringtone or message tone. It's a bit of a pain, and I'm surprised that Apple's not more socially nuanced.
What I've found after several weeks is that I haven't missed a single call that matters. I've missed a lot of spam calls, appeals for donations, alumni fundraisers, and all sorts of other things. The only times I've missed calls from family are when I had my phone off, or in another room, or something.
So I recommend the whitelist. Let the technology work for you: make it possible for the people you care about to reach you, and provide the rest of the world the opportunity to take your attention when it's convenient for you to give it.
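The underlying logic is trivial, which is what makes its absence from the phone so frustrating. A minimal sketch of the whitelist idea-- the numbers and ringtone filenames are made up for illustration:

```python
# The short list: family, close friends, colleagues. Numbers are invented.
WHITELIST = {"+15550001111", "+15550002222"}

def ringtone_for(caller_id):
    """Loud ring for people on the list; 40 seconds of silence for everyone else."""
    return "loud.m4r" if caller_id in WHITELIST else "silence-40s.m4r"

assert ringtone_for("+15550001111") == "loud.m4r"         # family gets through
assert ringtone_for("+15559998888") == "silence-40s.m4r"  # spam rings silently
```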
The next experiment I'm going to try is to not carry my phone on me, but to leave it in my bag, with the ringer turned up high. I confess I get a bit of an anxious twinge just thinking about it-- which is exactly why I should do it.
I've been reading the page proofs for The Distraction Addiction, and should be done with it in a few days. As with any 250-page book, there are a few typos and the occasional bit of repetition, so while it's good (I'm very pleased with it, I have to admit), it's also good that I'm looking at it one more time.
I'm starting to think about the media campaign for the book, and the potential irony of using social media to promote a book about how to not be overexposed to social media. I'll have to figure out how to mindfully market the book. Of course, these days there's no such thing as just marketing a book: all marketing is self-marketing as well. I cut a lot of the autobiographical stuff from the book, but there's still plenty of me in it. So I was very interested to see Douglas Rushkoff, who has a new book out, saying goodbye to Facebook:
Facebook has never been merely a social platform. Rather, it exploits our social interactions the way a Tupperware party does.
Facebook does not exist to help us make friends, but to turn our network of connections, brand preferences and activities over time -- our "social graphs" -- into money for others....
The true end users of Facebook are the marketers who want to reach and influence us. They are Facebook's paying customers; we are the product. And we are its workers.
This isn't much of a surprise. After all, if you're not paying for the product, you are the product. But things are changing in ways Rushkoff finds even more disturbing. Now,
any of our updates might be converted into "sponsored stories" by whatever business or brand we may have mentioned. That innocent mention of a cup of coffee at Starbucks, in the Facebook universe, quickly becomes an attributed endorsement of their brand....
Through a new variation of the Sponsored Stories feature called Related Posts, users who "like" something can be unwittingly associated with pretty much anything an advertiser pays for. Like e-mail spam with a spoofed identity, the Related Post shows up in a newsfeed right under the user's name and picture. If you like me, you can be shown implicitly recommending me or something I like -- something you've never heard of -- to others without your consent.
Now, of course everyone who doesn't radically overshare engages in some measure of self-fashioning and editing. But there's a difference between you doing this yourself, and someone else doing it under your name-- and purposely obscuring their tracks. (This even happens with dead Facebook users.)
This raises an interesting possibility. Come this fall, I could appear on Related Posts as liking Google Glass, or brain-building software, or some corporate motivational program-- i.e., appear to "promote" or "like" things that run counter to what I argue in the book. Hmmm....
Two quick things:
MT @tirosenberg: The Pope goes on Twitter, and a few months later loses all interest in work. Coincidence? I DON'T THINK SO. -- Daniel Pink (@DanielPink) February 12, 2013
Please don't let interest in all this peak before August 20.
I'm old enough to remember how traveling internationally in the 1970s and 1980s meant communicating through letters or occasional expensive phone calls.
At the same time, I've really liked being able to keep in touch with family when I've traveled, and the existence of Skype and email made it a lot easier for my wife and me to go to Cambridge and leave the kids at home (with my mother, not just a few cases of ramen noodles). We Skyped with them daily.
keeping in touch with the kids, via flickr
Trinity University professor Robert Huesca spent six months in Ouidah, Benin, and has a piece in the latest Chronicle of Higher Education arguing that Facebook Can Ruin Study Abroad. He and his fellow visitors all "came equipped with various levels of advanced electronics, including computers, mobile phones, digital cameras, iPads, iPods, and other media players loaded with movies, television programs, and music," not to mention access to Facebook, IM and Skype. He argues that "online communication constitutes a vortex that consumes hours of a traveler's time," and that too many people studying abroad listen to the same music, watch the same movies, and talk to the same people they do when at home-- which means they spend less time exploring their local environments, or going through the uncomfortable but ultimately valuable experiences of dealing with culture shock and dislocation.
The corrosive consequences of new communication technologies are evident when the hours spent chatting online, listening to a homegrown playlist, or watching television reruns take time away from conversing with a local friend, hearing a native song, or learning an indigenous dance or game….
If the Internet and social media allow study abroad students to live within a protective "bubble" (call it a cultural comfort zone) of life back home, the experience abroad is diminished considerably....
In the rush to protect our students and our universities through the adoption of digital technologies, we unwittingly have extinguished the necessary conditions for personal transformation that justify the expense, risk, and sacrifice of study abroad.
Of course, this is a technology-enhanced version of a couple old problems, as a couple commenters rightly point out. As one notes, "The reason they don't experience 'there' is because they've never experienced 'here':" that a personal media bubble dulls your engagement with home and alma mater just as much as it does your engagement with novelty. Another commenter who'd spent time in Germany in the 1980s says that "while I'm sure technology makes it easier, even without much of it, people can ignore their surroundings and focus on themselves and their comfort group." (This is an old complaint: you can find accounts of foreign students at Oxford in the 1700s, or at Italian universities in the 1500s, who do nothing but carouse, hunt, or fight.)
I'm sympathetic to the idea that one should err in favor of making sure you're open to new experiences and places. When my daughter went on a school trip this summer to England, I told her to ignore Facebook, precisely because I worried that her days would be less full, and she'd be less engaged with what she was experiencing, if she was posting about it at the same time. (I didn't really need to worry. She was going with her closest friends, she's not a big Facebook user anyway, and the last thing she was going to do was stay in touch with her parents.)
At the same time, I wouldn't advocate leaving everything with batteries behind. When I travel alone, I go walking in the evening and take pictures as a way of connecting with the place, and I'm certain that the camera helps me observe and remember new places in a way I might not otherwise.
My Skyping with home when I was in Cambridge worked in part because it could only happen at certain times of day, before and after school. We have to learn how to make choices about connectedness: how to use it to make parts of our regular home life easier, and as a retreat from the dislocations of being away, while still making sure we have the freedom necessary to make the most of the opportunities in front of us.
That's from a classic Far Side cartoon featuring two bears in a circus, one of whom is holding a muzzle in his hands. It came to mind today when I read Conor Friedersdorf's At What Age Will You Stop Using Facebook?
Imagine 7 years spent living in a college dorm, or 15 years spent attending the parties you went to in your twenties. Now imagine yourself perusing a Facebook stream daily for a full 25 years.
Doesn't that just feel like too long?...
The popularity of Facebook among older people today doesn't really tell us much. Like everyone already grown up when social media came along, they experienced the addicting novelty of remaking long-lost, far-flung connections while in between tasks at work or waiting for the onions to caramelize. People who grew up with social media all along will experience it differently in middle age.
I don't know the answer to the question, but I suspect that I'll be on as long as people I really care about (my grown-up kids) are on it.
But raising the question is very worthwhile, because it implicitly suggests that Facebook is something that one could use for a time and then abandon, or switch off for some period of your life and then reactivate-- that the assumption that your life should always be on display is just an assumption; that perhaps there are times when we can choose not to live lives on display (displays that work to the benefit of marketers and the distraction of acquaintances, as well as the possible enrichment of friends).