Smarter Than You Think: How Technology Is Changing Our Minds for the Better

Overview


It's undeniable—technology is changing the way we think. But is it for the better? Amid a chorus of doomsayers, Clive Thompson delivers a resounding "yes." The Internet age has produced a radical new style of human intelligence, worthy of both celebration and analysis. We learn more and retain it longer, write and think with global audiences, and even gain an ESP-like awareness of the world around us. Modern technology is making us smarter, better connected, and often deeper—both as individuals and as a society.
 
In Smarter Than You Think, Thompson shows that every technological innovation—from the written word to the printing press to the telegraph—has provoked the very same anxieties that plague us today. We panic that life will never be the same, that our attention is eroding, that culture is being trivialized. But as in the past, we adapt—learning to use the new and retaining what's good of the old.
 
Thompson introduces us to a cast of extraordinary characters who augment their minds in inventive ways. There's the seventy-six-year-old millionaire who digitally records his every waking moment—giving him instant recall of the events and ideas of his life, even going back decades. There's a group of courageous Chinese students who mounted an online movement that shut down a $1.6 billion toxic copper plant. There are experts and there are amateurs, including a global set of gamers who took a puzzle that had baffled HIV scientists for a decade—and solved it collaboratively in only one month.
 
Smarter Than You Think isn't just about pioneers. It's about everyday users of technology and how our digital tools—from Google to Twitter to Facebook and smartphones—are giving us new ways to learn, talk, and share our ideas. Thompson harnesses the latest discoveries in social science to explore how digital technology taps into our long-standing habits of mind—pushing them in powerful new directions. Our thinking will continue to evolve as newer tools enter our lives. Smarter Than You Think embraces and extols this transformation, presenting an exciting vision of the present and the future.
 

Editorial Reviews

The New York Times Book Review - Walter Isaacson
…judicious and insightful…[Thompson] provides…interesting current examples of how human-computer symbiosis is enlarging our intellect…Thompson avoids both the hype and the hand-wringing so common among digital age pontificators…He comes across as a sensible utopian, tending toward the belief that our digital devices and social networks are, on balance, enhancing our lives and improving the world in the same mixed-blessing sort of way that writing, paper, the printing press and the telephone did.
Publishers Weekly
Does technology make us lazy, incapable of thinking smartly about solutions to cultural problems? Does it make us shallower thinkers, ever reliant on computers to help us mold our responses to any issues? In this optimistic, fast-paced tale about the advent of technology and its influence on humans, journalist Thompson addresses these and other questions. He admits that we often allow ourselves to be used by facets of new technologies and that we must exercise caution to avoid this; yet, he demonstrates, digital tools can have a huge positive impact on us, for they provide us with infinite memory, the ability to discover connections between people, places, or ideas previously unknown to us, and new and abundant avenues for communication and publishing. For example, Thompson shares the tale of Gordon Bell, who walks around equipped with a small fish-eye camera and a tiny audio recorder. Bell uses these devices to record every moment of his life, which he stores in a “lifelog” on his laptop. Because of these devices, Bell—and we, if we embrace the technology—lives in a world of infinite memory. Using technology also helps us make connections, not only with old friends on Facebook or other social media but with the world around us as we search for knowledge and facts about it. Thompson points out that “transactive memory”—which arises out of our need to understand details and to connect to larger sets of facts outside our own limited social or familial setting—allows “us to perform at higher levels, accomplishing acts of reasoning that are impossible for us alone.” In the end, Thompson believes, these features of digital tools will allow us to think more deeply and become more deeply connected both as individuals and as a society. (Sept.)
Library Journal
09/01/2013
In the face of concerns over whether digital connectivity has made us generally lazy and shallow, Thompson (columnist, Wired) investigates and demonstrates how technology can be used to improve the ways we think and learn. For instance, he looks at the educational impact of the not-for-profit free online video tutorial platform Khan Academy and describes how its student dashboard system, a teaching platform for instructors, has considerably improved teaching and learning. Thompson includes compelling examples of how technology is furthering scientific discovery. For example, he writes, neuroscientists have been seeking to understand protein folding; an online protein-folding game helped them understand the phenomenon when thousands of gamers found new and interesting strategies by collaborating online. VERDICT Thompson succeeds in making the case for digital technologies enhancing how we learn, discover, and collaborate. A comparable work on crowd-sourced activism and the benefits of Web 2.0 technologies is Clay Shirky's Here Comes Everybody: The Power of Organizing Without Organizations. A well-written case for the power and advantages of new digital technologies and possibilities for human achievement, this book will appeal to a technically savvy crowd as well as to nontech readers interested in how adopting new technologies may better their lives.—Jim Hahn, Univ. of Illinois Lib., Urbana
Kirkus Reviews
A sprightly tip of the hat to the rewards and pleasures--and betterments--of our digital experiences. Who, asks Wired and New York Times Magazine contributor Thompson, hasn't felt a twinge of concern? How many times have we let Google feed us the answer to all manner of random inquiries? Indeed, does Google allow our memory muscles to grow flabby? How much is important to retain without a crib card? How much byzantine, brain-busting junk do we need at our fingertips or leave dangling at the tip of our tongues? Thompson is a firm believer in the school of digital information. Why not offload all the minutiae and free up the brain for bigger questions? Then let the computer serve as the external memory, find connections and accelerate communication and publishing. The author also argues that, despite all the excesses, writing on the Internet encourages discipline and economy of expression--if not harking back to the golden age of letter writing, at least making people put thought to screen. In addition, think of all the stuff that computers do in a wink--data crunching, calling you to task in the word cloud for repetitiveness, and more. Computers also bring analysis, logic and acuity to the table, while humans bring intuition, insight, psychology and strategy, as well as sentience. Near the beginning of the book, Thompson discusses the mind vs. computer dilemma in the context of chess: "The computer would bring the lightning fast--if uncreative--ability to analyze zillions of moves, while the human would bring intuition and insight, the ability to read opponents and psych them out." A well-framed celebration of how the digital world will make us bigger, rather than diminish us.

Product Details

  • ISBN-13: 9781594204456
  • Publisher: Penguin Group (USA)
  • Publication date: 9/12/2013
  • Pages: 352
  • Sales rank: 214,286
  • Product dimensions: 6.60 (w) x 9.34 (h) x 1.20 (d)

Meet the Author

Clive Thompson is a contributing writer for the New York Times Magazine and Wired. He also writes for Fast Company, and appears regularly on many NPR programs, CNN, Fox News, and NY1, among others.


Read an Excerpt

The “extended mind” theory of cognition argues that the reason humans are so intellectually dominant is that we’ve always outsourced bits of cognition, using tools to scaffold our thinking into ever-more-rarefied realms. Printed books amplified our memory. Inexpensive paper and reliable pens made it possible to externalize our thoughts quickly. Studies show that our eyes zip around the page while performing long division on paper, using the handwritten digits as a form of prosthetic short-term memory. “These resources enable us to pursue manipulations and juxtapositions of ideas and data that would quickly baffle the unaugmented brain,” as Andy Clark, a philosopher of the extended mind, writes.

Granted, it can be unsettling to realize how much thinking already happens outside our skulls. Culturally, we revere the Rodin ideal—the belief that genius breakthroughs come from our gray matter alone. The physicist Richard Feynman once got into an argument about this with the historian Charles Weiner. Feynman understood the extended mind; he knew that writing his equations and ideas on paper was crucial to his thought. But when Weiner looked over a pile of Feynman’s notebooks, he called them a wonderful “record of his day-to-day work.” No, no, Feynman replied testily. They weren’t a record of his thinking process. They were his thinking process:

“I actually did the work on the paper,” he said.

“Well,” Weiner said, “the work was done in your head, but the record of it is still here.”

“No, it’s not a record, not really. It’s working. You have to work on paper and this is the paper. Okay?”

Every new tool shapes the way we think, as well as what we think about. The printed word helped make our thought linear and abstract and vastly increased our artificial memory. Newspapers shrank the world; then the telegraph shrank it even further, producing a practically teleportational shift in the world of information. With every innovation, cultural prophets bickered over whether we were facing a technological apocalypse or utopia. Depending on which Victorian-age pundit you asked, the telegraph was either going to usher in a connected era of world peace or drown us in idiotic trivia. Neither was quite right, of course, yet neither was quite wrong. The one thing that both apocalyptics and utopians understand is that every new technology invisibly pushes us toward new forms of behavior while nudging us away from older, familiar ones. Harold Innis—the lesser known but arguably more interesting intellectual midwife of Marshall McLuhan—called it the “bias” of a new tool.

What exactly are the biases of today’s digital tools? There are many, but I’d argue three large ones dominate. First, they’re biased toward ridiculously huge feats of memory; smartphones, hard drives, cameras and sensors routinely record more information than any tool did before, and keep it easily accessible. Second, they’re biased toward making it easier to find connections—between ideas, pictures, people, bits of news—that were previously invisible to us. Third, they’re biased toward a superfluity of communication and publishing. This last feature has a lot of surprising effects that are often ill understood. Any economist can tell you that when you suddenly increase the availability of a resource, people not only do more things with it but they do increasingly odd and unpredictable things. As electricity became cheap and ubiquitous in the West, its role expanded from things you’d expect—like nighttime lighting—to the unexpected and seemingly trivial: battery-driven toy trains, electric blenders. The superfluity of communication today has produced everything from a rise in self-organized projects like Wikipedia to curious new forms of expression: television-show recaps, video-game walk-throughs, map-based storytelling.

In one sense, these three shifts—infinite memory, dot-connecting, explosive publishing—are screamingly obvious to anyone who’s ever used a computer. Yet they also somehow constantly surprise us by producing ever-new “tools for thought” (to use the writer Howard Rheingold’s lovely phrase) that upend our daily mental habits in ways we never expected. Indeed, these phenomena have already woven themselves so deeply into the lives of people around the globe that it’s difficult to stand back and take account of how much things have changed and why. While this book maps out what I call the future of thought, it’s also frankly rooted in the present, because many parts of our future have already arrived, even if they are only dimly understood. As the sci-fi author William Gibson famously quipped: “The future is already here—it’s just not very evenly distributed.” This is an attempt to understand what’s happening to us right now, the better to see where our augmented thought is headed. Rather than dwell in abstractions, like so many marketers and pundits—not to mention the creators of technology, who are often remarkably poor at predicting how people will use their tools—I focus more on the actual experiences of real people.


Interviews & Essays

Barnes & Noble Review Interview with Clive Thompson

Many of us are certain we know what the advent of smartphones, social media and ubiquitous computing has done to us. The corrosive effect of new technology on society is a crisis that generates op-eds and outrage over everything from digitally spawned breaches of etiquette to broad-based interrogations into the extent that computing is rewiring our brains away from deep thought. The only thing that generates more reliable Internet traffic than a video of a baby panda and manatee playing together is a fresh article calling attention to a shocking new low in tech-oriented behavior.

Clive Thompson, a contributing editor at Wired magazine and longtime monitor of the digital world's reach into our lives, started out from much the same presumption -- that growing dependence on technology must be having a straightforwardly bad effect on our abilities to think for ourselves. But two decades of reporting on how people actually use technology -- through conversations with scientists, chess players, activists, designers, writers and ordinary people -- have led him to a very different and more hopeful conclusion. Smarter than You Think documents our entanglement with new technology and finds that as high-powered computing has permeated much of the developed world, a kind of co-evolution is taking place, in which both humans and the tools we use to communicate and calculate are changing, each in response to the other. The result, he argues, is an explosion of written language, new opportunities for creativity and deep analysis of problems small and large.

I spoke with Thompson recently via the old-fashioned method of email about his book, the questions his findings raise, and what -- even given his book's optimism -- we ought to be cautious about as the Age of the "Digital Centaur" dawns. An edited transcript of our conversation follows. --Bill Tipper

The Barnes & Noble Review: You begin Smarter than You Think with the concept of the "centaur" -- your term for the computer-assisted human brain -- and tell the story of how, rather than turning chess over to ever-more-competent computers, the chess world is excited by the notion of two computer-aided grandmasters playing one another. Later, you talk about the notion that IBM's Jeopardy-champion computer, Watson, could become a sort of genius physician's assistant. Will we have to undergo some sort of great retraining to take up the centaur life?

Clive Thompson: Yes, I do think training is necessary to really use our new cognitive tools well. That's already true for the ones we use every day! Search engines are basically massive A.I. machines using statistical analysis to help us find documents. They're very powerful -- particularly when wedded, "centaur" style, to our unique human ability to make meaning of things. But they have a whole suite of assumptions built into their A.I. For example, they rank each web site partly based on how many other people link to it, which is in one sense a useful social signal (if other people have found this useful, you might also), but can also descend into a popularity contest -- or get hijacked by search-engine-optimization strategies. The more you know about how a search engine works, the more you're able to use it intelligently -- and that means training.

Where's that training going to happen? Adults are, at the moment, on their own -- we have to read up on this stuff if we want to be intelligent and critical users of these tools. For kids, you can see the shift happening in schools, which is heartening. Smart teachers -- and most of all librarians -- are already working on making sure kids learn these skills.

This sort of training has always been necessary, frankly. A research library is a powerful tool for thinking -- but only if you've been taught how to use it. The same goes for a slide rule. And historically, only a small number of folks ever bothered to acquire these skills. I went to the University of Toronto, which has a fabulous research library -- but out of thousands of incoming students every year, barely a handful ever attended the training sessions on how to use the library. Professors would complain endlessly about students not using secondary sources, never consulting journals, not even being aware journals existed.

So some of these challenges we face today aren't particularly new.

BNR: One person who's become a "centaur" rather sooner than most is "wearable computing" pioneer Thad Starner, the computer science professor who has helped Google develop its famous Google Glass specs, and who has for years now carried a specially built computer that connects to the internet and projects the products of his searches on a screen in front of one eye. He suggests that in order to use such devices safely and politely we need to "multiplex" -- use technology to supplement what we're doing in the moment -- not "multitask." Is that a distinction people are ready to take in?

CT: Sure -- in fact, I think many people already are aware that they shouldn't let their devices come between them.

There's been a flood of handwringing op-eds lately about how glassy-eyed mobile-phone zombies are ignoring each other at the restaurant instead of talking to one another. I think these pundits are somewhat overblowing the frequency of this behavior, frankly. Very similar alarms were raised about the wave of supposedly society-ending isolation that would be wreaked by previous newfangled media -- like the telephone in the late 19th century, and the Walkman in the '80s. We didn't suffer a social apocalypse then, and I don't think we're going to suffer one now.

That said, I actually think the op-ed handwringing is useful in its own way. It's part of how a society creates social codes around new technologies. When mobile phones inched into the mainstream in the '90s, people who bought them used to answer them, every single time they rang, whenever and wherever they rang: At the dinner table, at the funeral, while having sex. It took about a decade of this behavior before society collectively began to realize it was kind of terrible, and we started poking fun at it -- you saw lots of jokes about it, like that "inconsiderate cell phone man" ad that used to run before movies. And eventually we moved away from the behavior. We're probably in the middle of this curve with social media.

BNR: Part of what creates expertise in a practice or subject is the immersion in the mundane aspects of a given discipline -- its knowledge base at the most detailed level. Is there a risk that cybernetic dependence might affect our ability to spend time within a set of basic tasks and "absorb" them?

CT: Absolutely. For example, research into the effect of calculators on the acquisition of math suggests that if you give a child a calculator too early on in their math-learning curve, they can rely on it too much, and it can interfere with their ability to internalize math. However, if you give it to them after the point where they've acquired a good grounding in basic math, the calculator can be quite beneficial: It lets them muck around with numbers and run fun experiments that reinforce, and extend, their existing number sense. Interestingly, even mental algorithms can shortchange thought. Math teachers now know that the "carry the 1" technique for adding large numbers together actually inhibits people from really understanding the way numbers work; kids who use the carry-the-one approach for adding make weirder and bigger errors than kids who don't, because they're not really thinking of the numbers as quantities -- they're just executing an algorithm to "solve the problem." So yes, using algorithms and tools for our mental work affects how we do our mental work.

That said, our thinking processes are already extremely reliant on tools outside of our heads. They've been that way for thousands of years. We have, as the philosopher Andy Clark puts it, "extended minds." We use paper documents to store knowledge so we can consult and reconsult it, giving us a type of recall impossible with our unaided minds; we use pencils to scratch down material so we can manipulate it in a fashion impossible in our unaided minds. I can't do long division without a pen to write the numbers down; does that mean I'm "stupider" when I don't have a pen and paper around? I can't organize and produce a 5,000-word magazine article without being able to store information on paper, on my computer; does that mean I'm stupider without those tools around?

The only reason we don't notice how absolutely interwoven our thinking processes have become with older technologies -- pencils, paper, electric light, penicillin, fire -- is that they're old, so we've ceased to notice their effects. But take them away, and the caliber and fabric of our thinking changes dramatically -- and, I'd argue, for the worse.

In Western culture, we tend to believe that "thinking" takes place only in the pose of Rodin's Thinker: When you're isolated, alone, and unaided. But that's not at all what thinking looks like in the real world. When we think, muse, wonder, and cogitate, our minds are scaffolded by tools, by our environment, and by each other. We're actually very social thinkers.

So the long answer to your question is that yes, one can indeed get out of the habit of doing some forms of mental activity, when we use a "tool for thought" (to borrow Howard Rheingold's lovely phrase). And we do have to be attentive to the tools we use, to make sure that we're not over-relying on them. But at the same time, "overreliance" is a hard thing to separate out from "use."

BNR: You make the point that we've been "outsourcing" parts of our thinking -- especially memory -- for as long as writing has been widely used, and you suggest that one of the underappreciated effects of the technological changes of the last few decades is the explosion in the amount of writing that many people do, not just in emails and texts but in Tweets, short status updates, and comment threads. Are we in a golden age of conversational writing?

CT: I think so, yes. The amount of writing that people do online is astonishing, and historically unprecedented. This is something that's often hard for journalists and academics to grasp, but as scholars of rhetoric will tell you, before the Internet, most people graduated high school or college and did very little writing for the rest of their lives. When they did write, it was usually just some memos for work or the like. Only very rare people wrote tons of letters or diary entries; most of us wrote nothing. So we're in the middle of this crazy social shift where everyday people regularly wonder about something, or have an idea about something, or have a thought about a movie they've seen or a song they've listened to, and they sit down and write a few sentences about it. When the last episode of the second season of the BBC's Sherlock aired, I was a fan, and I immediately went to a discussion board to see what other fans thought about it -- and the thread was, barely an hour or two after the episode aired, already 10,000 words long, and filled with clever, smart theories about the big mystery. People were alternately offering evidence for their ideas, rebutting each other, flattering each other, disagreeing. It was, as you say, a particularly conversational form of writing -- part barroom debate, part exchange of letters out of the 18th century.

Indeed, the 18th century literary culture in England fascinates me, because it reminds me a lot of today's online culture. It was the early days of journals and newspapers, and the literate folks in London at the time were constantly writing missives to one another publicly, ridiculing each other, arguing and tossing thoughts back and forth. When you go back and read The Rambler and magazines like that, it feels awfully reminiscent of today's online conversational culture. This is something that Tom Standage points out in his terrific new book, Writing on the Wall: Social Media -- the First 2,000 Years. For most of human history, Standage argues, our media were very conversational and participatory. It was only for a relatively short period of a few decades in the 20th century that media became massive, centralized, and something the average person couldn't participate in.

There were clear benefits in the rise of huge, well-funded media organizations -- but something was lost, too, when so many other voices got crowded out. The same balance obtains today, in reverse: We have a lot more voices, but arguably a less centralized social discourse, and plenty of cranks and flat-out abusive speech that wouldn't have been allowed in the big centralized media.

BNR: You talk about the change in the society that comes with an "always on" world -- and suggest that sooner or later we may collectively rethink that posture; as the networked life becomes more and more normal, the anxious and fascinated need to post dinner plates to Instagram or interrupt a conversation to check Twitter may wane. But won't device makers and new platforms always be looking for ways to seduce us back into the pursuit of the new? Is it just the fascination with technology we have to be mindful of, or those who want to sell it to us?

CT: The answer here is yes and yes.

I think many people are already realizing the limits of their appetite for contact. That's why participation in Facebook looks like it's tailing off. Facebook has sort of overdesigned itself: By trying to be the key to all human interactions, it's shoved so much socializing into one place that it is becoming really ungainly. In my three years of research for my book, I talked to tons of people who find Facebook extremely useful -- but few who said they had fun using it. Whereas most people told me they enjoyed simple tools that offered one relatively clear mode of contact: Photos on Instagram, evanescent messages on Snapchat, or a completely public Twitter account. But it's also true that we are, most of us, pretty social people, and so a lot of us -- certainly not everyone -- will probably keep up a reasonable level of ambient contact with each other. The pleasures of it are as real as the annoyances.

Here's a comparison point: Our adjustment online is similar to how European and British societies adjusted to urbanization in the 18th or 19th century. If you moved from a rural area to a city, the new environment was simultaneously stimulating -- it's why cities are such creative and productive hubs -- and enervating. If you wanted to survive in the long run, you had to figure out how to carve out moments and spaces of peace and quiet. Living online is pretty similar in a lot of ways.

But yes, even as people try to adjust to social media -- and figure out its proper role in their daily lives -- high-tech companies will be constantly offering new gewgaws, trying to lure us in so they can sell us ads.

Personally, I'd love to see more social media firms develop business models that aren't reliant on advertising. If you're a social-media firm selling ads, your goal is to get people to interrupt what they're doing all day long so they come and stare at your service as much as possible. It's in your economic self-interest to interrupt your "users" -- a telling word, that -- as much as possible. But social media services that actually charge people for their products no longer have this misalignment with the cognitive needs of their customers. (And they're no longer "users" -- they're "customers.")

I'd be interested to see if pay-for-our-service social networks could survive. Facebook's average revenue per user is a measly $5 annually. So they could charge you a little over 40 cents per month on your mobile-phone bill -- a pretty modest sum in developed countries, though less so globally -- and make all the money they're currently making, while removing all ads from their service, and redesigning it to suit the needs of the paying customers, not the advertisers.

BNR: One of the things I enjoyed most in reading Smarter than You Think was its inclusion of ideas that turn current assumptions about how to use our new computer-and-Internet tools on their heads. I'm thinking particularly of Drop.io, the now-defunct file-sharing service that had a built-in delete function as a default: users were asked to specify how long their uploads would remain before self-destructing. What were the most surprising ideas you encountered as you researched?

CT: I loved Drop.io's model! I was so sad when they were phagocytosed by Facebook. I think the "artificial forgetting" model is one clever and surprising concept that I encountered and continue to encounter. People are realizing that there can be enormous value in not saving copies of their utterances -- that the evanescence of the utterance is part of what makes it valuable. This is precisely what I hear from people who use Snapchat. They like the idea that they're just tossing pictures and witticisms back and forth with friends that aren't for the archives. Sure, they know it's possible for your friend to sneakily save a picture you intended to be transient -- but the whole point of the app is that it establishes a new social contract, and if it's violated, you'll know who violated it. (Whereas with Facebook, people regularly tell me they're baffled by who, precisely, can see what.)

Another idea that surprised and delighted me was "memory engineering" -- i.e., tools for making the vast archives of our lives useful again. If you've used a social network for years, you've effectively done a sort of diarying, but there's usually no way to go back in time easily. So I loved the idea behind "Timehop," an app that you can let into your online archives -- your photos, your updates, your checkins, your texts -- and every day it sends you a little gazette summarizing what you were doing a year ago. It becomes, as the people who used it told me, a sort of daily Proustian cookie, a way to reflect on the shape of their life. They were, in classic "centaur" fashion, using the unique abilities of computers -- their perfect and relentless timekeeping, their capacious storage -- to help tickle and tease their human memories.

BNR: Recently, at the 2013 National Book Awards ceremony, the novelist E. L. Doctorow delivered a speech telegraphing his concern, as a writer, over the digitally-mediated future, specifically with regard to the sense that both powerful states and large corporations have a natural tendency to want to control free expression. You cite a number of powerful examples of autocracies -- like Azerbaijan's government -- learning to use the Internet in ways that belie the "netroots" utopia suggested by early champions. Are we in danger of underestimating the power of regimes to use the networked world to more efficiently hold onto power?

CT: New media have often produced giddily utopian ideas about politics. The telegraph was supposed to make humanity so linked that war would end ("It is impossible that old prejudices and hostilities should longer exist," as Charles F. Briggs and Augustus Maverick wrote in 1858). They said the same thing about radio, and even the airplane. ("Aerial man" would transcend such tribal urges, argued Charlotte Perkins Gilman.) It was the same in the early days of the Internet -- plenty of naïve predictions that governments would be unable to contain speech and activism online.

I think most of that naivety is gone, though. When I talk to political dissidents in other countries, local activists, and political thinkers, they all have a pretty robust understanding of the civic challenges of online life: The fact that governments and spy agencies hoover up utterances, that they quite adeptly filter and block speech they don't like. You don't see much hyperbole about "Twitter revolutions" or "Facebook revolutions" either, thankfully.

In contrast -- and quite hopefully -- I'm seeing more conversation about "what to do about it" -- i.e., what sort of changes are necessary to make online speech less spied-upon, scooped-up, and surveyed ("surveilled"? I've never been clear on that usage). A lot of the big solutions aren't technological. They're political. In the US, activists of all stripes are realizing that the U.S. spy agency's addiction to dragnet spying was created by bad laws -- like the PATRIOT Act -- and is only going to be curtailed with better laws that rein it in. It's an open question as to whether the sclerotic U.S. political system will really take action on this; I'm kind of doubtful. Then again, as Ann Cavoukian, the privacy commissioner for Ontario, pointed out to me this summer, even the mood in the heavily-gerrymandered House is changing. This summer, it nearly voted to defund the NSA program that collects Americans' phone-call info: 205 to 217, with 94 Republicans voting to defund the program. That's kind of amazing. If the Snowden leaks keep coming, and I gather they will, this sort of political headwind will only increase. I hope, anyway!

BNR: And as more Internet conversation moves to branded corporate spaces, should we be doing more to resist the privatization of the sphere in which political and creative exchanges are now taking place? Do you believe we are headed for a place where a so-called "Digital Magna Carta" is going to be necessary?

CT: I totally agree. I think we all need to be more actively trying to pull our social lives out of these few, highly centralized, highly commercialized spaces -- and put them back in the hands of the people. In addition to the political changes I've talked about here (reining in governmental spying in the West), we also ought to start running more of our own DIY online services.

We don't need huge companies like Twitter and Facebook and Pinterest to broker all of our online expression. Publishing stuff online isn't rocket science any more. There's oodles of free software for running one's own discussion forums, like phpBB. There's software like Pogoplug or Tonido that lets you run a "cloud" service for sharing your photos, videos, or whatever from your own laptop at home. There are paid microblogging services that don't rely on advertising, like App.net. And there are public-spirited geeks like the ones at the Freedom Box Foundation who are working on creating social-networking software that, again, lives at home, under your control.

I'm not naïve enough to think people are going to abandon all the big commercial services for these bespoke, roll-your-own ones. Nor do I think they should. There are some big advantages to the huge, commercial spaces. They create very large audiences, which can be great for everyone from artists to activists. They're harder for governments to shut down, as per Ethan Zuckerman's "Cute Cat Theory" (Google it!). But I think everyone should be encouraged to learn how to use DIY tools for networking. Frankly, it should be taught at the high school level, the way we teach things like Home Ec and music. It's a modern part of civics and culture.

The more we carve out noncommercial spaces for online talk, the better!

BNR: Of the technology-based changes in our thinking that you chronicle -- social engagement across networks, collaborative problem-solving, "new literacies" like video, or human-computer cognitive partnerships -- is there one that you think is poised to make the biggest change in our world? Is anything sneaking up on us?

CT: There's nothing I can predict, no.

But that's because I'm not a futurist. I'm a reporter! I don't try to predict the future. I just report on what I see happening in the world and people around me. The downside is I can't see any farther into the future than the average bear. The upside is, hopefully, what I'm describing is more accurate than futuristic predictions, because it's based in actual interviews and stories of real people.

BNR: This is a book that tries to capture a rapidly-changing terrain. Has anything emerged since you completed the book that made you wish you'd had a chance to get in just one more chapter?

CT: There were about 25 chapters I wish I'd had time to write. I'd hoped to write about the idea of creating more noncorporate spaces online. I'd hoped to write more about how books and the act of long-form reading are evolving. I'd hoped to write about the literary style of short-form utterances, and the existential weirdness of the so-called "Internet of Things" (objects that start talking to each other online), and how online maps are becoming sort of like word processors that we use to think geographically.

So I'm shaving all that stuff off into my future journalism.

BNR: In your column for Wired, you've championed the defter-than-usual portrayal of innovative technology on the CBS drama The Good Wife. A recent episode featured rolling "telepresence avatars" like those made by Double Robotics -- a screen showing your face is mounted on a camera-equipped, remote-controlled mini-Segway. Can we expect to see these trundling into our conference rooms any time soon?

CT: Heh -- I doubt it. They're pretty silly. Robots that roll around have serious trouble navigating the human world, which was engineered primarily for people with full use of both legs. I think [series creators] Robert and Michelle King know this, though, which is precisely why they put them in the show: To make fun of them.

On the other hand, flying drones equipped with cameras: Those are real, and they're coming to a corner of the sky near you. They're incredibly cheap, currently pretty unregulated, and for amateurs just looking to horse around, awfully fun to fly. (I've built and flown a DIY drone myself, albeit one that doesn't have a camera.) So everyday drones are going to cause all manner of crazy privacy issues as they start peeking over rooftops, fences, corporate headquarters, major protests, and everywhere else. This is something I'm aiming to write a lot more about. --December 19, 2013
