"It’s harder to imagine the past that went away than it is to imagine the future." (From his Paris Review interview.)
Contemplative computing may sound like an oxymoron, but it's really quite simple. It's about how to use information technologies and social media so they're not endlessly distracting and demanding, but instead help us be more mindful, focused and creative.
My book on contemplative computing, The Distraction Addiction, was published by Little, Brown and Co. in 2013, and is available in bookstores and online. This 2011 talk is a good introduction to the project and its big ideas.
Rebecca Rosen has an interview in The Atlantic with Sara Hendren, creator of the blog Abler. "All bodies are getting assistance from technology all the time," Rosen notes, "yet some [assistive technologies] are stigmatized:"
You have written and spoken extensively about the idea that “all technology is assistive technology.” What do you mean?
Scholars working in disability studies have called attention to this as a redundancy in technical terminology before me, and I’m trying to bring it to tech journalism: What technology would not be called assistive?
Indeed, the whole point of technology is to be assistive: there are no everyday technologies that don't help us do things, extend our physical abilities, augment our memories, etc. But we classify (and stigmatize) some technologies as "assistive," while placing others on the side of just plain old technology. As Hendren writes elsewhere,
Honestly—what technology are you using that’s not assistive? Your smartphone? Your eyeglasses? Headphones? And those three examples alone are assisting you in multiple registers: They’re enabling or augmenting a sensory experience, say, or providing navigational information. But they’re also allowing you to decide whether to be available for approach in public, or not; to check out or in on a conversation or meeting in a bunch of subtle ways; to identify, by your choice of brand or look, with one culture group and not another.
The boundary between assistive and "normal" technologies isn't solid. In my lifetime, eyeglasses have gone from being signs of nerdishness and weakness (when I was a kid in the South in the 1970s), to being fashion accessories (my son just ordered a new pair of Oakley prescription glasses). In the 19th century, canes were fashionable as well as functional; now, unless you're a hiker who lives near a bespoke wood-turner with exquisite taste in driftwood, your cane options tend to look like props out of late-night infomercials about mail-order vitamin supplements.
The Guardian reports that
a team led by [Robin] Coningham, a professor of archaeology and pro-vice-chancellor at Durham University, had made a startling discovery about the date of the Buddha's birth, one that could rewrite the history of Buddhism. After a three-year dig on the site of the Maya Devi temple at Lumbini in Nepal, Coningham and his team of 40 archaeologists discovered a tree shrine that predates all known Buddhist sites by at least 300 years.
The impact of Coningham's work is groundbreaking in many ways. Prior to this discovery, it had been thought that the shrine at Lumbini – an important pilgrimage site for half a billion Buddhists worldwide – marked the birthplace of the Buddha in the third century BC. But the timber structure revealed by archaeologists was radio-carbon-dated to the sixth century BC.
"It has real significance," says Coningham, 47. "What we have for the first time is something that puts a date on the beginning of the cult of Buddhism. That gives us a really clear social and economic context... It was a time of huge transition where traditional societies were being rocked by the emergence of cities, kings, coins and an emerging middle class. It was precisely at that time that Buddha was preaching renunciation – that wealth and belongings are not everything."
This last paragraph is a reference to Karl Jaspers's Axial Age thesis, which you should look up.
The article also has some nice accounts of what it's like conducting archaeological fieldwork in Nepal in January and February (spoiler alert: it gets cold).
I've been thinking a lot about vacations and leisure recently. Not because I feel I don't have enough of either: I'm lucky enough to have an employer who has a generous (if perfectly normal) vacation plan, and I don't really feel like I'm pinched for time. Rather, I've been thinking about how people take vacations, and the relationship they believe vacations have to work.
On one hand, vacations are seen (rightly) as a major perk, yet they're underused. Surveys reveal that more than half of American workers don't use all their vacation days, forfeiting up to two weeks-- whether to stay in their bosses' good graces, because they fear being seen out of the office too much, or simply because they like work.
Is this a bad thing, though? Maybe, in today's world of blurred lines between work and play, it's natural for the vacation to go the way of the rotary phone or the newspaper. Is this once-beloved and familiar institution being rendered obsolete by globalization, the Internet, and the 24/7 pace of business?
I'm not ready to give up on the vacation, though. In the social sciences, we have something called "natural experiments" where two groups are very similar, save for one key variable-- access to resources, for example. Such experiments help us better understand how humans work, and can make clear that what we think is natural or inevitable is actually changeable. I'd like to propose something like a natural experiment about our attitudes to connectivity, speed, work, and leisure.
I've found a group that has to deal with global problems and fantastic rates of technological change; who are notoriously, almost supernaturally, productive; whose scientific advances, literary achievements, and architectural monuments enlarge our view of the world and invite our admiration. While this sounds like our world, they differ from us in one critical respect: every year, they disappear into the mountains, seaside resorts, or rural retreats for weeks at a time. They see leisure and contemplation as a necessary part of life; get more out of their vacations because they're longer and more strenuous than ours; and enjoy broad agreement on the importance of breaks, disconnection and reflection.
Who are they? The Victorians.
Forget the stereotype of Victorians as prudish, sexually repressed, dour and dull. I've studied them for years, and two things still impress me about them.
The first is how much their world was like ours. The Victorians explored and mapped the world, conquered it, then wrapped it in telegraph cable and railroad track. They globalized world trade, urbanized and industrialized the West, and invented entire scientific disciplines. The telegraph, as Tom Standage put it, was "the Victorian Internet," a technology that dramatically quickened the pace of long-distance communication.
All this led to regular debates about whether people were overly focused on commercial life and material goods, whether transportation and communication networks had eroded a sense of place, whether the pace of life had become unhealthy. (Even the era's critics sound modern: today's worries about unequal income distribution, growing economic uncertainty, poverty and disenfranchisement were anticipated by Victorians as different as Charles Dickens and Karl Marx.)
The second thing is how hard they worked, and how much they got done. Charles Darwin could move from incredibly detailed studies of barnacles and worms to grand theorizing about the mechanisms of evolution; he wrote a multi-volume account of his voyage on HMS Beagle before he was thirty, and went on to publish another eighteen books.
Many of Darwin's contemporaries weren't just accomplished in one field; they could leave their mark on many. Politician Benjamin Disraeli was a novelist; novelist Charles Dickens was also a social reformer. The great physicist William Thomson (Lord Kelvin) made a reputation studying electricity and magnetism and a fortune advising telegraph companies.
Countless other figures known only to historians combined careers as lawyers, civil servants, printers, or military officers with second lives as authors, composers, inventors, scientists, athletes, and explorers.
So the Victorians created and lived in a world that felt much like ours-- hyperconnected, fast-paced, globalizing, furiously reinventing itself-- and maintained lives of admirable productivity and accomplishment. Yet the choices they made about leisure and vacations were dramatically different from ours.
Month-long escapes from the office were not at all unusual, and six- or eight-week vacations were not out of the question. The great critic John Ruskin commented on how the peace in Europe allowed thousands of his fellow Englishmen to travel, "on the average, each two or three months." Of course, these tended to be better-off professionals-- "the noblest born, the best taught, the richest in time and money, having more leisure, knowledge, and power than any other portion of the nation," as Ruskin put it-- but with the rise of the railroads and inexpensive vacation destinations, the middle classes took weeks off as well.
It wasn't just the British who took their vacations more seriously. For city dwellers in the United States, escaping the summer heat (not to mention the smells and cholera outbreaks) was essential: in Washington DC, even clerks had a month off, and many headed for the mountains or shore.
Nor did the ideal of the long vacation pass when Queen Victoria died in 1901. In 1910, President William Howard Taft and a number of other "men of affairs" told the New York Times that the ideal vacation lasted two or three months. Railroad executive Frank Headley thought that "men who work under a mental strain" required a few weeks, while "the man whose work is merely physical effort" was restored by the weekend. Taft himself spent three months at a golf resort in Hot Springs, Virginia between his nomination to the Republican presidential ticket and his election in 1908.
What was their secret? Hardworking as they were, Victorians recognized the need to get away from the chattering of the telegraph and noise of the market. Vacations were an opportunity to challenge themselves physically, to immerse themselves in the quiet of the country or seaside, to occupy the mind with invention rather than legislation, or astronomy rather than accounting.
The politician William Gladstone recovered from his battles with Disraeli by conquering volcanoes and mountains. The organic chemist Edward Frankland took punishingly ambitious six-week "rambles" in the mountains of Cumbria and the Swiss Alps, often with fellow scientist and climber John Tyndall. (Both found the mountain air encouraged deep thinking.) Thomas Edison's 1878 summer vacation to the Rocky Mountains included a side-trip to observe a total solar eclipse.
Some learned to vacation like this in school: Oxford students would organize weeks-long "reading parties" to rustic lodges or mountain chalets where they would spend mornings reading and afternoons outdoors.
For many scientists and scholars, studious mornings and exhausting hikes balanced each other nicely: physical activity left them little energy for frivolity.
Not everybody approved of this new kind of leisure, though. George Eliot complained that "even idleness is eager now-- eager for amusement; prone to excursion-trains, art museums, periodical literature, and exciting novels; prone even to scientific theorizing, and cursory peeps through microscopes."
For high-minded, driven people, the vacation was an opportunity for restoration, for "leisure" in the ancient sense of the word: a time to live to the very fullest. Modern science suggests that they were onto something. Psychologist Stephen Kaplan argues that immersing ourselves in challenging activities is a surer cure for overwork than just lounging by the pool.
Mihaly Csikszentmihalyi and happiness psychologists find that people who seek out difficult but rewarding activities have more satisfying lives than people who pursue sybaritic pleasures. A week partying on a tropical island will give you a hangover and some embarrassing pictures on Facebook; a week that includes climbing the volcano that created the island will leave you readier to face work.
We've largely forgotten that the Victorians combined hard work and play so effectively; even biographers usually race past the long weekends and months their subjects spent abroad, and don't connect leisure time to their subject's amazing productivity.
But even worse, we've forgotten how valuable good vacations can be for ambitious, productive people. Sometimes we love our jobs and don't really want to get away, or hate the idea of having a thousand unread messages waiting in our in-box. Or we believe that the world is too fast-paced and uncertain to get away: in an era when clients expect an immediate callback, colleagues want instant answers to their questions, and bosses want quick analyses of their options, you can either be on 24/7, or be replaced. Executives believe in their own indispensability, subordinates fret about their job security, and nobody feels they can afford to disconnect. And some of us become so accustomed to being constantly online that the idea of going offline makes us nervous.
It doesn't have to be this way. People who take digital Sabbaths report that you can keep your vacation in-box from blowing up if you give colleagues plenty of advance notice that you're going to be offline. Leaving your devices behind when you go away-- or, if that's intolerable, burying them at the bottom of your bag-- is also essential. The first couple of days will be strange, even nerve-wracking, but pretty soon you'll feel your mind shifting into a new gear.
A more active vacation will also give your mind something completely new to think about: it's hard to fret about office politics when you're learning to kayak or hiking a steep trail.
So long as we wear stress like a badge of honor, and consider overwork a sign of virtue rather than a warning that we're burning out, we won't accomplish what we want on the job, or in life. The lives of the Victorians challenge our assumption that eternal busyness is a natural route to higher productivity and more ideas. The world doesn't need us so badly that we can't disconnect from it. We'll live more fulfilling lives, even hyper-networked lives, if we unplug from time to time.
You should read this very interesting article about daydreaming by Jessica Lahey in the Atlantic.
I’ve been reading about daydreaming extensively lately, and it has caused me regret every time I roused one of my students out of their reverie so they would start working on something “more productive.” Daydreaming has been found to be anything but counter-productive. It may just be the hidden wellspring of creativity and learning in the guise of idleness.
Not all mental downtime is alike, of course. Downtime spent playing a video game or zoning out with a television show may have its charms, but the kind of downtime I am talking about is different. I’m talking about the kind of mind-wandering that happens when the brain is free of interruption and allowed to unhook from the runaway train of the worries of the day. When the mind wanders freely between random thoughts and memories that float through our consciousness, unbidden. Television, videogames, and other electronic distractions prevent this kind of mental wandering because they interrupt the flow of thoughts and memories that cement the foundation of positive, productive daydreaming.
This is a point I make in my book. Eventually concentration fails, and your capacity to focus your attention on a problem flags; but the smart thing to do isn't to spend an hour surfing the Web or watching Home Alone 4: The Homecoming, but instead to do things that allow your mind to unfocus and wander. In that state your mind can continue to turn over problems without your active management. Sometimes your mind will arrive at answers that have been eluding you, and it'll feel as if they've just popped into your head; at other times a new approach to a problem will dawn. But either way, it's essential to allow the mind to unfocus itself a bit, to let it loose to work on its own.
In other words, Lahey continues,
daydreaming only appears lazy from the outside, but viewed from the inside—or from the perspective of a psychologist, such as Kaufman, or a neuroscientist, such as Mary Helen Immordino-Yang—a complicated and extremely productive neurological process is taking place. Viewed from the inside, our children are exploring the only space where they truly have autonomy: their own minds.
Immordino-Yang’s work on the virtue of mental downtime includes the paper “Rest is not Idleness: Implications of the Brain’s Default Mode for Human Development and Education.” The title quotes a 19th-century British banker named John Lubbock, who wrote, “Rest is not idleness, and to lie sometimes on the grass under trees on a summer's day, listening to the murmur of the water, or watching the clouds float across the sky, is by no means a waste of time.”
Lubbock, according to Immordino-Yang, was way ahead of his time in understanding the value of idleness to our essential neurological functioning. What Lubbock called rest, Immordino-Yang calls “constructive internal reflection,” and she considers it vital to learning and emotional well-being.
John Lubbock is not very well-known now, except among historians of Victorian science; but all nine of us recognize him as more than a banker. In his day he was a politician, occasional university chancellor, a member of the X Club (a circle of elite Victorian scientists), and the son of noted mathematician and statistician Sir John Lubbock. The elder Lubbock (the 3rd Baronet, to distinguish him from the first two baronets, who confusingly were also named John) was a member of the British Association for the Advancement of Science and an officer of the Royal Society.
From 1842, the Lubbocks were also Charles Darwin's next-door neighbors. Both were products of fairly nouveau families that had prospered during the Industrial Revolution; both were Cambridge men; T. H. Huxley was their mutual friend; and they were members of some of the same scientific societies. Darwin moved from London to the village of Downe and bought Down House, where he lived the rest of his life. The Lubbock estate, High Elms, was adjacent to Down House, and Lubbock leased to Darwin a piece of land on which Darwin built his Sandwalk, a walking path where he would go every day to walk and think.
Darwin would use the Sandwalk as a break from the lab or writing, but often in the course of his walks, as he looked at the landscape or the trees (he had the path planted with trees and shrubs, and there were several well-established trees there as well), his mind would turn over a problem he was working on. He had the habit of setting a few stones on the side of the path, and moving them to the other side as he passed, to keep track of how far he'd walked; his sons would sometimes test to see how deep in thought he was by moving them around, and seeing if he would notice.
The point is, Darwin's walks were exactly the kind of productive rest, or constructive internal reflection, that Immordino-Yang advocates, and that the younger Lubbock was thinking of when he distinguished between rest and idleness. Indeed, he might have had Darwin in mind: he was a regular visitor to Down as a child, and probably would have seen Darwin on the Sandwalk.
I'm growing convinced that one quiet but serious problem we have today is that we've unlearned the real value of rest. As Diarmaid MacCulloch puts it in his wonderful Silence: A Christian History, the Sabbath as described in Genesis "was a vital part of the creative process rather than simply an end of it." Likewise, we should recognize-- in our educational policies, and in our own lives-- that we can choose forms of rest that are "a vital part of the creative process," rather than merely distractions.
Update: I'd not realized when I first wrote this that the author of the "rest is not idleness" quote was the fourth baronet, not the third, and so previously had conflated the two. I've updated the piece to separate them.
I ran across this lament from Thomas à Kempis in a piece by Virginia writer Liza Field:
“Our thoughts are given to things which avail little, but that which is vital we ignore. The whole man flows out to things external, and unless he speedily recover himself, he willingly remains immersed in them.”
It's a fifteenth-century complaint, but it still has a modern ring.
These kinds of ancient quotations used to be invoked as part of an argument that distraction is nothing new, that people get past it, and we should stop worrying about whether Facebook is making us stupid.
Now, a piece like this can end with a call to relearn the kinds of mental disciplines that Kempis would have practiced to "recover himself." There's such a thing as progress after all.
The first big talk I gave about contemplative computing was at a conference on technology whose theme was "Slow." Last night I revisited that ground: I spoke at an event sponsored by the Hayden Group, a branding and marketing consultancy. (Do we say consultancy in America?) They run a great speaker series, and my friend Rich Green and I talked about slow technology in its various manifestations.
rich showing off slow technologies, via flickr
Rich showed a number of slow technologies, including some very old cameras, computing devices (slide rules and the like), and notebooks.
old and new cameras (Zeiss Ikon folding camera and iPhone), via flickr
I did a brief overview of the book, as I am wont to do these days. The crowd at these events is very high-powered, with some really smart people who aren't afraid to express their opinions; but it was that kind of conversation where even the disagreements yielded interesting things.
A couple people raised objections about the artifacts Rich had brought, and whether they would have been considered "slow" in their own time. A camera is a lot faster than a painting, a slide rule is faster than calculating with pencil and paper; even a bow and arrow (which if you read Eugen Herrigel, you associate with Zen and leaves falling from trees and practiced effortlessness) was faster than a club.
zen and the art of archery, via flickr
If that's the case, then isn't "slowness" just a kind of nostalgia, a projection of a desire for a once-simpler life (which probably didn't seem simpler back then)?
We went back and forth on this a bit, and I came to this question: is "slow" a technical descriptor, or is it something else? Put another way, does a "slow" technology have to be one that actually is s… l… o… w… as measured by a stopwatch?
slow, via flickr
Of course, there are no slow technology police, so you can define it however you want. But it seems to me that people who talk about it emphasize or seek a few things.
Slow technologies promote a measure of reflection, even if they're easy to use.
They offer opportunities for cultivating and using skills. (The disappearance of older skills, and the perspective they encouraged, is what architects complain about when they lament the impact of CAD on architectural education and practice.)
They offer their own inherent pleasures, even as you're able to use them without regard for their craftsmanship. Like a well-made goblet, a slow technology rewards attention to its fine details, but you can also ignore it while you're focused on its contents. Good cameras are a great example of this kind of balance between geeky beauty and utility: the Leica M9 is truly a thing of beauty, but you can also shift your attention away from its heavy precision and use it to see the world.
slow technologies, via flickr
They invite deliberation. As the Slow Media Manifesto declares, "Slow Media are... about choosing the ingredients mindfully and preparing them in a concentrated manner."
Using them can be so engaging that you lose track of time. This is the other critical form of slowness: not just that you're doing things at a slower pace than you are in your normal hurried life, but that your sense of time slows and stretches.
foggy park this morning, via flickr
To me, all this points to a conclusion: that the "slowness" is a psychological thing, not a purely technical one. It's the kind of slowness that comes when you're so engaged in something you don't notice the time passing: you look up and four hours have passed, and it's surprising and blissful.
The exercise of skill that challenges (but doesn't overwhelm), immersion of attention, time slowing… We've all seen this before. It's Mihaly Csikszentmihalyi's definition of flow. As you'll recall, flow, or optimal experience, involves
a sense that one’s skills are adequate to cope with the challenges at hand, in a goal-directed, rule-bound action system that provides clear clues as to how well one is performing. Concentration is so intense that there is no attention left over to think about anything irrelevant, or to worry about problems. Self-consciousness disappears, and the sense of time becomes distorted. An activity that produces such experiences is so gratifying that people are willing to do it for its own sake, with little concern for what they will get out of it, even when it is difficult, or dangerous. (71)
This psychological dimension, I should add, isn't one that proponents of slow technology (or other slow experiences) ignore: Jack Cheng describes the Slow Web, for example, as
a feeling we get when we consume certain web-enabled things, be it products or content…. [my emphasis] It’s not so much a checklist as a feeling, one of being at greater ease with the web-enabled products and services in our lives.
But the point is this: looking for slowness in the technologies themselves, or evaluating slow technologies on the basis of whether they're "really" slow or not compared to today's technologies or their contemporary competitors, misses the point. The slowness is experiential, and it's the simplest expression of a bigger set of phenomena that emerge when you're able to engage mindfully and skillfully with technologies. It's psychological, a product of your active use of a technology, not something the technology lets you consume.
It took a while to figure this out, and the discussion went about an hour longer than I expected. But I didn't notice. The time just flew by.
Morgan Ames, whom I interviewed for my book, has a new project: My Little Pony or Porn Star? Technological Utopian Edition. (I hadn't realized that Morgan wrote English copy for Japanese technical manuals, but hey, we all have hidden talents. Here's the real backstory to the name, if you dare.)
MLPOPS:TUE presents quotes that may be about computers, or some older technology, and invites you to guess where they're from. For example--
[This technology] would bring the world to the classroom and make universally available the services of the finest teachers, the inspiration of the greatest leaders.
--certainly sounds like someone talking about computers in the classroom, but it's actually a quote from the 1920s about the radio.
I also like this cartoon from the 1870s, which casts novels as the great distraction of the age.
The point, of course, is that breathless predictions about the revolutionary impacts of new media and technologies-- or lamentations of their ill effects-- are nothing new. They've been part of the modern discourse about technology and progress for at least 150 years.
Another example of how humans have coevolved with our tools:
Until around 250 years ago in the West, archaeological evidence suggests that most human beings had an edge-to-edge bite, similar to apes. In other words, our teeth were aligned like a guillotine, with the top layer clashing against the bottom layer. Then, quite suddenly, this alignment of the jaw changed: We developed an overbite, which is still normal today. The top layer of teeth fits over the bottom layer like a lid on a box.
This change is far too recent for any evolutionary explanation. Rather, it seems to be a question of usage. An American anthropologist, C. Loring Brace, put forward the thesis that the overbite results from the way we use cutlery, from childhood onwards.
There is a pretty substantial, if still-controversial, literature on cooking and its effect on humans (though some recent discoveries-- such as the discovery of cooking fires from a million years ago-- at least help establish that cooking is very, very old), but now a study about cooking and brain size in primates adds to the plausibility of the argument.
When I was at Cambridge (almost two years ago!), I stumbled on the work of cognitive archaeologists Lambros Malafouris and Colin Renfrew. Renfrew is a professor at Cambridge with whom I had a really interesting lunch, while Lambros is a fellow at Oxford (and one of those brilliant young academics who in a more generous and expansive era would have gotten tenure years ago). They in turn led me to studies of ancient multitasking, particularly Monica Smith's reconstruction of multitasking in everyday ancient life and Lyn Wadley's work on hafting.
So naturally, new research indicating that hafting is a much older practice than we realized caught my eye. Science has a new article on the subject, which is summarized on the AAAS Web site:
Early humans were lashing stone tips to wooden handles to make spears and knives about 200,000 years earlier than previously thought, according to a study in the 16 November issue of Science.
Attaching stone points to handles, or “hafting,” was an important technological advance that made it possible to handle or throw sharp points with much more power and control. Both Neandertals and early Homo sapiens made hafted spear tips, and evidence of this technology is relatively common after about 200,000 to 300,000 years ago.
Jayne Wilkins of the University of Toronto and colleagues present multiple lines of evidence implying that stone points from the site of Kathu Pan 1 in South Africa were hafted to form spears around 500,000 years ago. The points’ damaged edges and marks at their base are consistent with the idea that these points were hafted spear tips.
So why does this matter? The Guardian explains,
The invention of stone-tipped spears was a significant point in human evolution, allowing our ancestors to kill animals more efficiently and have more regular access to meat, which they would have needed to feed ever-growing brains. "It's a more effective strategy which would have allowed early humans to have more regular access to meat and high-quality foods, which is related to increases in brain size, which we do see in the archaeological record of this time," said Jayne Wilkins, an archaeologist at the University of Toronto who took part in the latest research.
The technique needed to make stone-tipped spears, called hafting, would also have required humans to think and plan ahead: hafting is a multi-step manufacturing process that requires many different materials and skill to put them together in the right way. "It's telling us they're able to collect the appropriate raw materials, they're able to manufacture the right type of stone weapons, they're able to collect wooden shafts, they're able to haft the stone tools to the wooden shaft as a composite technology," said Michael Petraglia, a professor of human evolution and prehistory at the University of Oxford who was not involved in the research. "This is telling us that we're dealing with an ancestor who is very bright."
This joins recent work on arrow-making, which both demonstrates that the manufacture and use of arrows is older than we thought, and that its complexity suggests ancient multitasking abilities:
"These operations would no doubt have taken place over the course of days, weeks or months, and would have been interrupted by attention to unrelated, more urgent tasks," observes paleoanthropologist Sally McBrearty of the University of Connecticut in a commentary accompanying the team’s report. "The ability to hold and manipulate operations and images of objects in memory, and to execute goal-directed procedures over space and time, is termed executive function and is an essential component of the modern mind," she explains.
McBrearty, who has long argued that modern cognitive capacity evolved at the same time as modern anatomy, with various elements of modern behavior emerging gradually over the subsequent millennia, says the new study supports her hypothesis. A competing hypothesis, advanced by Richard Klein of Stanford University, holds that modern human behavior only arose 50,000 to 40,000 years ago, as a result of some kind of fortuitous genetic mutation that kicked our ancestors’ creativity into high gear. But discoveries of symbolic items much older than that supposed mutation–and older than the PP5-6 arrowheads for that matter–have cast doubt on Klein’s theory. And other finds hint that Neandertals, too, engaged in symbolic behaviors, which would suggest that the capacity for symbolic thinking arose in our common ancestor perhaps half a million years ago.
Chad Wellmon has a smart essay in The Hedgehog Review arguing that "Google Isn’t Making Us Stupid…or Smart."
Our compulsive talk about information overload can isolate and abstract digital technology from society, human persons, and our broader culture. We have become distracted by all the data and inarticulate about our digital technologies…. [A]sking whether Google makes us stupid, as some cultural critics recently have, is the wrong question. It assumes sharp distinctions between humans and technology that are no longer, if they ever were, tenable…. [T]he history of information overload is instructive less for what it teaches us about the quantity of information than what it teaches us about how the technologies that we design to engage the world come in turn to shape us.
It's something you should definitely read, but it also reminded me of a section of my book that I lovingly crafted but ultimately edited out, and indeed pretty much forgot about until tonight. It describes the optimistic and pessimistic evaluations of the impact of information technology on-- well, everything-- as a prelude to my own bold declaration that I was going to go in a different direction. (It's something I wrote about early in the project.)
I liked what I wrote, but ultimately I decided that the introduction was too long; more fundamentally, I was really building on the work of everyone I was talking about, not trying to challenge them. (I don't like getting into unnecessary arguments, and I find you never get into trouble being generous in giving credit to people who've written before you.) Still, the section is worth sharing.
This article by Southeastern Louisiana University professor Matt Rossano, published in the Cambridge Archaeological Journal in 2007, has an awesome premise:
Imagine you travelled back in time 100,000 years and happened upon a group of our ancestors gathered around an evening fire. Would anyone be surprised to find them chanting, clapping, dancing in unison, or maybe just sitting mesmerized before the flickering flame? The thesis of this article is that this commonplace activity, which I will call campfire rituals of focused attention, created an important selective pressure in the evolution of the modern human mind. Ritualized gatherings before an open fire — repeated night after night, generation after generation for thousands of years — contributed significantly, though not necessarily exclusively, to the evolution of the enhanced working memory capacity required for symbolic thinking.
Makes sense to me!
I really want to write a book about the history of consciousness and contemplative practices. It would be an awesome way to write a grand history of everything in drag, and it's an obvious sequel to the contemplative computing book.
I've recently had a mild obsession with the Sandwalk, the path that Charles Darwin laid out on his estate, Down House, and which he walked for nearly forty years. Tonight I came across this art video that's shot on part of the walk.
The Thinking Path is part of a project called Darwin Originals, "a series of eight artists’ films inspired by the life, work and legacy of Charles Darwin." Here's another video of the Sandwalk, shot by a visitor to Down House.
From Janet Browne's great Charles Darwin: The Power of Place, a description of Darwin's house at Down that is also a wonderful meditation on the ways action and stillness, circulation and stability, conviviality and solitude, are balanced in creative lives.
The tumble of ideas that had characterized the first half of his existence [gave way]… to the methodical intensity of documenting and reinforcing his notions. His home and garden were his experimental laboratories, his book-lined study was his manufactory; these were the places where he most liked to be…. [Darwin's] home and his homelife became an actual part of his intellectual enterprise….
Although his Beagle experiences were still important to him and always carried due weight in his writings, and his particular insight into nature remained undimmed, these home-based researches were the hidden triumph of his theory of evolution. His family setting, his house and garden, the surrounding Kent countryside, and his own sense of himself at the heart of the life he had created and the property he owned provided the finely crafted examples of adaptation in action that lifted his work far out of the ordinary. His thinking path, the path he called the Sandwalk that skirted the edge of a copse at the bottom of the Down House garden, became the private source of his conviction that his theory was true-- true, if only he could show it.
Solitude served him well here. But Darwin was not a complete rural recluse. Systematically, he turned his house into the hub of an ever-expanding web of scientific correspondence. Tucked away in his study… Darwin wrote letters to a remarkable number and variety of individuals.
I've been interested in the use of the term "addiction" in technology and social media-- when the high-tech industry started talking about technology being addictive (or put more starkly, when it became a good thing to try to deny users' agency and choice), and how ordinary people use the term to describe their relationships with technology.
Today I stumbled on an interesting fact while reading an essay by George Ainslie on addiction and willpower: the term "addiction" derives from a Latin (or Vulgar Latin, depending on your source) word, "addictus," which denoted a kind of slavery. An 1875 article explains that a debtor was "assigned over to the creditor (addictus) by the sentence of the praetor. The creditor was required to keep him for sixty days in chains… [and was free to] treat the debtor, who was addictus, as a slave, and compel him to work out his debt."
As Michael Quinion notes, the first English language use of the word was in Shakespeare's Henry V, when one character says of young Prince Hal that "his addiction was to courses vain," but this meaning of addiction didn't carry the connotations that it does today: it was more like habit or preference or liking, not an involuntary, uncontrollable impulse. That use of the term appears in the early 1900s, in writings about morphine and opium use.
I've been a little dismissive of people who talk about being addicted to social media, but it turns out that there's a strangely fitting double meaning of the word when it's used to describe the feeling of having to check email or say something on Facebook. Clearly it references the modern meaning of a compulsion that you're powerless to control. But in an interesting twist, it also hearkens back to the ancient meaning of being addictus to others. Hmmm.
Update, June 29 2012: Historian of medicine Howard Markel writes on our modern and ancient ideas of addiction in the New York Times:
When we say that someone is “addicted” to a behavior like gambling or eating or playing video games, what does that mean? Are such compulsions really akin to dependencies like drug and alcohol addiction — or is that just loose talk?
This question arose recently after the committee writing the latest edition of the Diagnostic and Statistical Manual of Mental Disorders (D.S.M.), the standard reference work for psychiatric illnesses, announced updated definitions of substance abuse and addiction, including a new category of “behavioral addictions.” At the moment, the only disorder featured in this new category is pathological gambling, but the suggestion is that other behavioral disorders will be added in due course. Internet addiction, for instance, was initially considered for inclusion but was relegated to an appendix (as was sex addiction) pending further research....
As anyone familiar with the history of the diagnosis of addiction can tell you, the D.S.M.’s changes accurately reflect our evolving understanding of what it means to be an addict.
The concept of addiction has been changing and expanding for centuries. Initially, it wasn’t even a medical notion. In ancient Rome, “addiction” referred to a legal dependency: the bond of slavery that lenders imposed upon delinquent debtors. From the second century A.D. well into the 1800s, “addiction” described a disposition toward any number of obsessive behaviors, like excessive reading and writing or slavish devotion to a hobby. The term often implied a weakness of character or a moral failing.
“Addiction” entered the medical lexicon only in the late 19th century, as a result of the over-prescription of opium and morphine by physicians. Here, the concept of addiction came to include the notion of an exogenous substance taken into the body. Starting in the early 20th century, another key factor in diagnosing addiction was the occurrence of physical withdrawal symptoms upon quitting the substance in question.
Via James Fallows, I just found Bytes of China, written by a Fulbright scholar doing fieldwork in China. It's great work: check out this post on supercomputing in China, and the soft infrastructure, cultural factors, and stories that support innovation. The last caught my eye, because I've been doing a lot of work recently on how computers shape the way we think about ourselves.
Reading the great article by Liam Bannon, director of the Interaction Design Centre at the University of Limerick, on forgetting as a feature, not a bug. His central insight is that "human-computer interaction is largely founded on a view that compares the capability of humans and machines" (9), and that in the case of memory this raises some problems.
[T]he dominant perspective in the human sciences over the past quarter-century has been one that views the human mind as an information-processing device, similar to computing machines. This computer model of mind has blinded us to a number of crucial features of human thinking, most importantly, the active and embodied nature of human thinking and acting in the world. In the context of our discussions on memory, I argue that this approach has over- emphasized a passive rather than an active model of human memory, ignoring the fact that remembering and forgetting are active processes....
[New] technologies are currently being viewed as either substitutes for, or possible augmentations of, human faculties. I argue that the proffered scenarios of computerized ‘help’ for human activities evident in the ubiquitous computing world tends to focus on augmentation of human remembering, with sensors and computer networks archiving vast amounts of data, but neglects to consider what augmentation might mean when it comes to that other human activity, namely, forgetting. (5)
Our models of memory are replete with technical terms such as ‘erasure’, ‘content addressing’, ‘retrieval’, which equate human and computer memory. Yet it has been common knowledge within the human sciences for decades that human memory is not akin to the storage model of computer memory. (5)
[In this model] forgetting is seen as one more example of the fragility of the human mind, where it loses out to computers, with their ability to retain information indefinitely. Forgetting is thus seen as a bug in the human makeup, an aspect of the human memory system that has negative connotations. (6)
There's no computational equivalent of human institutions for forgetting, which is problematic because forgetting is bound up with forgiveness: as Bannon notes, there are any number of social and legal practices-- "pardon, amnesty, Catholic absolution in confession" (10)-- that officially serve to close off an event in a person's (or a group's) past. Not only is there no similar function with digital technologies, they work against any such ability: it's harder now to expunge criminal records, for example. (Some of the people who are mining public arrest records are awesomely colorful, or shady, characters, depending on your point of view.) Bannon does a nice job of helping us become more aware of the stakes in not creating means of digital forgetting, and in treating forgetting as a bug rather than a feature.
A provocative... well, idea... in a recent New York Times piece:
[W]e are living in an increasingly post-idea world — a world in which big, thought-provoking ideas that can’t instantly be monetized are of so little intrinsic value that fewer people are generating them and fewer outlets are disseminating them, the Internet notwithstanding. Bold ideas are almost passé....
The post-idea world has been a long time coming, and many factors have contributed to it. There is the retreat in universities from the real world, and an encouragement of and reward for the narrowest specialization rather than for daring — for tending potted plants rather than planting forests.
There is the eclipse of the public intellectual in the general media by the pundit who substitutes outrageousness for thoughtfulness, and the concomitant decline of the essay in general-interest magazines. And there is the rise of an increasingly visual culture, especially among the young — a form in which ideas are more difficult to express.
But these factors, which began decades ago, were more likely harbingers of an approaching post-idea world than the chief causes of it. The real cause may be information itself. It may seem counterintuitive that at a time when we know more than we have ever known, we think about it less.
Here's the crux of the argument: that the perceived need to keep up with information crowds out our opportunity (and perhaps reduces our sense of the need) to make sense of it.
In the past, we collected information not simply to know things. That was only the beginning. We also collected information to convert it into something larger than facts and ultimately more useful — into ideas that made sense of the information. We sought not just to apprehend the world but to truly comprehend it, which is the primary function of ideas. Great ideas explain the world and one another to us....
But if information was once grist for ideas, over the last decade it has become competition for them. We are like the farmer who has too much wheat to make flour. We are inundated with so much information that we wouldn’t have time to process it even if we wanted to, and most of us don’t want to. ...
We prefer knowing to thinking because knowing has more immediate value. It keeps us in the loop, keeps us connected to our friends and our cohort.
Digital cameras are now ubiquitous - it is estimated that 2.5 billion people in the world today have a digital camera. If the average person snaps 150 photos this year that would be a staggering 375 billion photos.... Every 2 minutes today we snap as many photos as the whole of humanity took in the 1800s. In fact, ten percent of all the photos we have were taken in the past 12 months.
I'm often more than a little skeptical of these comparisons of content creation or consumption today versus in the past-- how do you estimate how much information was generated in 1600, and can you really use the same metrics to measure how much information there is in an issue of the LA Times or Wall Street Journal?-- but photographs seem more easily comparable between analog and digital realms.
The post also reports the great statistic that in 1960, 55% of all photos were of babies.
There's a claim-- I found it first in one of Geoff "Nunberg Error" Nunberg's articles-- that before they were all put on CDs, the documentation for a 747 weighed more than the plane itself. I've also heard the claim that the average person today deals with more information in a day than someone in the 17th century dealt with in their entire lives.
I wonder if you could make-- and more important, mobilize some kind of data to validate-- the claim that today's Internet users spend more time online than yesteryear's computer professionals. Say, "The average 12-year-old spends more time each year connected to the Internet than everyone using the Internet did in 1971." Or, "My iPhone will perform more calculations figuring out directions to the HP Pavilion in San Jose than Isaac Newton made in his whole life."
Both of these sound truthy, to use Stephen Colbert's famous phrase. Maybe that'll be reason enough for some readers to pick up these claims, or make up their own, and start tweeting them as if they were true.
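Just to show how one of these could be sanity-checked rather than simply tweeted, here's a minimal back-of-envelope sketch in Python. The photo numbers come from the post quoted above; every other input (a kid's daily hours online, the size and usage of the 1971 ARPANET) is an assumption I've invented purely for illustration, not a sourced figure.

```python
# Back-of-envelope checks for information-volume claims.
# All inputs below are illustrative assumptions, not sourced statistics.

# The arithmetic from the quoted photo post: 2.5 billion camera owners
# taking ~150 photos each per year.
camera_owners = 2.5e9
photos_per_owner_per_year = 150
print(f"photos/year: {camera_owners * photos_per_owner_per_year:.2e}")  # ~3.75e11

# The "average 12-year-old vs. the 1971 Internet" claim.
# Assumption: a heavily connected kid averages 4 hours/day online.
kid_hours_per_year = 4 * 365  # 1,460 hours

# Assumption: the 1971 ARPANET had ~2,000 users averaging 1 hour/day.
arpanet_hours_per_year = 2_000 * 1 * 365  # 730,000 hours

print(f"kid: {kid_hours_per_year:,} h/yr vs. ARPANET 1971: "
      f"{arpanet_hours_per_year:,} h/yr")
# With these invented inputs the claim fails by a factor of ~500 --
# which is the point: truthiness is no substitute for mobilized data.
```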
I write about people, technology, and the worlds they make.
My book on contemplative computing, The Distraction Addiction, was published by Little, Brown and Company in 2013. (It's been translated into Dutch (as Verslaafd aan afleiding) and Spanish; Russian, Chinese and Korean translations are in the works.)
The Distraction Addiction
My latest book, and the first book from the contemplative computing project. The Distraction Addiction is published by Little, Brown and Co. It's been widely reviewed and garnered lots of good press. You can find your own copy at your local bookstore, or order it through Barnes & Noble, Amazon (check B&N first, as it's usually cheaper there), or IndieBound.
The Spanish edition
The Dutch edition
Empire and the Sun
My first book, Empire and the Sun: Victorian Solar Eclipse Expeditions, was published by Stanford University Press in 2002 (order via Amazon).