This Is Running for Your Life: Essays, by Michelle Orange
Michelle Orange uses the lens of pop culture to decode the defining characteristics of our media-drenched times
In This Is Running for Your Life, Michelle Orange takes us from Beirut to Hawaii to her grandmother's retirement home in Canada in her quest to understand how people behave in a world increasingly mediated--for better and for worse--by images and interactivity. Orange's essays range from the critical to the journalistic to the deeply personal; she seamlessly combines stories from her own life with incisive analysis as she explores everything from the intimacies we develop with celebrities and movie characters to the troubled creation of the most recent edition of the Diagnostic and Statistical Manual of Mental Disorders.
With the insight of a young Joan Didion and the empathy of a John Jeremiah Sullivan, Orange dives into popular culture and the status quo and emerges with a persuasive and provocative book about how we live now. Her singular voice will resonate for years to come.
- Farrar, Straus and Giroux
- Product dimensions: 5.10(w) x 7.50(h) x 1.00(d)
Read an Excerpt
The Uses of Nostalgia and Some Thoughts on Ethan Hawke’s Face
Let’s call it the theory of receptivity. It’s the idea, often cited by young people in their case against the relevance of even marginally older people, that one’s taste—in music or film, literature or fine cuisine—petrifies during life’s peak of happiness or nadir of misery. Or maybe it’s not that simple. Maybe a subtler spike on the charts—upward, downward, anomalous points in between—might qualify, so long as it’s formative. Let’s say that receptivity, anyway, can be tied to the moments when, for whatever reason, a person opens herself to the things we can all agree make life worth living in a new and definitive way, whether curiosity has her chasing down the world’s pleasures, or the world has torn a strip from her, exposing raw surface area to the winds.
During these moments—sleepaway camp right before your bar mitzvah; the year you were captain of the hockey team and the baseball team; the time after you got your license and before you totaled the Volvo—you are closely attuned to your culture, reaching out and in to consume it in vast quantities. When this period ends, your senses seal off what they have absorbed and build a sensibility that becomes, for better or worse, definitive: This is the stuff I like. These films/books/artists tell the story of who I am. There is no better-suited hairstyle. This is as good/bad as it gets for me.
The theory suggests that we only get a couple of these moments in life, a couple of sound tracks, and that timing is paramount. If you came of age in the early eighties, for instance, you may hold a relatively shitty cultural moment to be the last time anything was any good simply because that was the last time you were open and engaged with what was happening around you, the last time you felt anything really—appallingly—deeply.
I worry about this theory. I worry because it suggests that receptivity is tied closely to youth, and firsts, and also because as with many otherwise highly rejectable theories—Reaganomics and communism come to mind—there is that insolent nub of truth in it.
* * *
My worry started a couple of years ago, when I felt myself separating, effortlessly and against my better judgment, from what I had unwittingly been a part of for two decades: the Next Generation. It was when I noticed myself taking a step back from the yellow line when the R train blew into the station, rather than a step forward, the way I used to, the way the kids flanking me still did. It was when I began giving more than a passing thought to the age of the people around me—then much more, to the point where I find myself calibrating the age of new acquaintances as a matter of course, ranking a given group in a way that is new and troubling to me, not by interesting eye color or willingness to engage intelligently or suspected willingness to engage carnally or crack comic timing (not actually true; I will always rank by crack comic timing) and not even specifically by age but by age in relation to me.
For me it was the subway thing. For others it’s a first gray hair, death in the family, weeklong hangover, or the moment an indie sensation’s breakthrough single comes to sound like a family of flying squirrels recorded it in a cutlery drawer. I’ve never done it before so I can’t say for sure, but it feels like a peculiar time to greet the first intimations of mortality. Marketing demographics have put us all on a strict schedule, one that ties a person’s relevance to the years when he’s considered most receptive. It may be that Nielsen-style demos are the clearest terms we’ve come up with to gauge social standing between the onset of legal adulthood and retirement age. Taste and other less participatory cultural alignments have come to situate individuals in specific eras, dividing generations by band preference or favorite cereal or national disaster, and creating a powerfully unified sense of time’s passage that is otherwise pretty hard to come by. We’re especially primed, in passing out of that long stretch of peak market desirability, to reexamine our relationship to the culture, which means to examine our relationship with time. But modernity’s strange intermeshing of futurism and nostalgia has made time an elusive, sometimes contradictory source of information.
Looking to your peers for a sense of temporal equilibrium can be equally confounding. It seems to me that, between about age twenty-eight and maybe age forty-three, there now exists a gray area, where anyone within that range could be any age within that range. Rather than relaxing into the shared parameters of a condition traditionally known as adulthood, it is this cohort that gets the most freaky and pedantic about how old they are and how old the people within that same cohort might be. Age panic has an earlier onset but more superficial proportions; it’s more often tied up in the byways of vanity or status insecurity than given to the headlong realization that we’re all going to die die die.
Nothing makes the paradox of the way we now experience time more plain than clock-watching the end of youth. It would appear the clock has come to rule every aspect of who we are and what we do, from daily, micromanaged routines to five-year plans to deciding to marry while you’ll still look good in the pictures. And yet we are utterly careless with time, from passing ungodly stretches of it in a state of many-screened distraction to missing that part where you come into a sense of yourself as an adult reconciled with your own mortality and that of the people you love. Entering the prime of adulthood lends a new and dizzying urgency to the polarity of that relationship. Something important is supposed to be happening, but no one can quite say what, or when—only that it had better happen soon, ideally before it happens to that other guy. Unless it’s bad, in which case reverse that last part. Until whatever it is that’s supposed to happen happens, all we have in the way of orientation to a mean are numbers, and so we look to them.
The generation that felt some pride of ownership over the tech revolution is currently passing through this shadow demographic, eyeballing each other grimly as the teenage natives snicker over our digital accents. By thirty we had gone through four different music formats, which necessitated four different buying cycles, which brought about four different opportunities to revisit the question of where we stand on Alice in Chains. It all feels a little rushed, doesn’t it—the crucible of confronting one’s own taste and the terms on which it was formed? Somehow the decision to purchase Siamese Dream on iTunes—the old CD too scuffed for a laptop’s delicate system—seems fraught with the weight of a larger commitment, like renewing marriage vows, or making some more furtive, less romantic acceptance of the inexorability of the bond. Did I even choose Billy Corgan, or did he choose me?
For all the dorked-out debates about sound quality and texture and ear feel, the music is exactly the same; only time can clarify your relationship to it. But when that happens, we’re meant to feel sheepish about aligning ourselves with the past, as though, despite the fairly obvious contiguousness of our bodies and minds and excellent memories for Beastie Boys lyrics, there’s no viable, meaningful way to tie it with the present. And, more curiously, despite the pathological pseudo-nostalgic recycling that defines modern popular culture. But then maybe the theory of receptivity has pivoted its allegiances toward a technological sensibility, where content is content and it matters less whether you listen to Dead or Alive or deadmau5 than how you listen to the latter’s inevitable remix of the former.
In which case the question of taste and cultural alignment recedes, and examining our relationship with time means examining our relationship with technology—which has come to feel cultish, if not cultural—and vice versa. From a certain angle the entire digital shebang has consisted of dreaming up more and more sophisticated ways to contain and control time. Our devices offered themselves as second or even substitute homes, a place where time was both pliable and strictly monitored. We unfastened our watches, which suffered from always and only moving forward, and adopted new and advanced timekeepers that did whatever we wanted with time: tell it, track it, transcend it, save it, spend it, defy it, kill it, record it, recall it, replay it, reorder it, relive it. Very often they do all of those things at once, in the name of simply staying on top of it.
Keeping impossibly current has become the key selling point of smartphone connectivity. In recent ads for one network provider two men are delivered a seemingly fresh piece of news by a third party as they gaze into their devices. “That’s so [comically small amount of] seconds ago,” the pair smirk each time, to the third party’s great humiliation. To really get the jump on time requires dedication, and if that dedication looks like total enslavement to the uninitiated, let them enjoy the view from the pastures.
Moving at a pace that is not just fast—we’ve been fast—but erratic and discontinuous makes defining a self against some shared sense of time untenable. And yet we persist in seeking a binding cultural memory, a common frame of reference—perhaps out of habit, perhaps because it still feels like the best hope for a stable, unified identity. But modern cultural memory is afflicted by a kind of dementia, its fragments ever floating around us—clearly ordered one moment, hopelessly scrambled the next. It’s easy to feel as though upholding a meaningful continuum between the present and the past might require leaving the culture you mean to restore.
One result of this would seem to be our habit of leaning on fractional age distinctions and investing meaning in invented microgenerations, if only to get clear of the chaos.
People manifest this kind of thing differently. “Your friend mentioned her age a lot,” a friend of mine said after a dinner party recently. “It was weird.” He had just turned thirty-one and hated admitting it. My other friend had just turned thirty-four and couldn’t stop talking about it. “Maybe she’s trying to remind herself,” I said. The numbers only sound more unlikely as they mount. At a different party a few weeks later, I was sent on an El Caminian emotional journey when someone I barely knew asked me, in front of a roomful of people I didn’t know at all, how old I was. Whereas it would have been an ordinarily rude and possibly condescending line of inquiry even, say, five years ago, something about the question and the context had a political, almost passive-aggressive tang. Just what was it he wanted to know, or to tell me? He was thirty-four—older than I was, but only a little, as was everyone else in the room. I know this because most of them were television actresses whose heads swiveled in my direction as the question was being asked, not unkindly but with big, expectant eyes. I want to say—and suspect it could be true—that an actual hush fell upon us.
The dude was history. It was the actresses I looked up the next day. But then I find myself on IMDb fairly regularly lately. I am suddenly in possession of the ages of a number of actors and actresses whom I have been watching for years, in some cases decades. We’ve been the same age the whole time, as it turns out, it just never occurred to me to check, or to care, until now. Movie stars existed in a distant, ageless realm; even those cast to represent a specific stage of life weren’t confined by time in quite the same way. I remember watching E.T. and finding Drew Barrymore’s charming little-girl-ness exotic, as though she were showing me something I wouldn’t otherwise know. But we were almost the same age. I know that now.
One of the great, time-released pleasures of moviegoing is watching the actors of your generation grow older. Maybe pleasures isn’t precisely the right word—but maybe it is. With time comes the impulse to seek out evidence of accrued wisdom, pain, or contentment—the mark of experience—in their faces. This one had a baby; that one just lost her dad. Along with the R-train moment, for me it was watching Ethan Hawke in Before Sunset that left no doubt: this thing was really happening. Life had begun to show itself as more than a series of days, or movies, all in a row, which I might or might not attend.
In Sunset, set and shot nine years after Before Sunrise’s slacker riff on one enchanted, European evening, the characters played by Hawke and Julie Delpy reunite for a late-afternoon walk through Paris, delicately stepping around the last decade’s worth of disappointment and longing. Perhaps the one striking formal difference between the two films is that Sunset takes place in real time, where Sunrise uses elliptical fades to tell the story of an entire night spent wandering around Vienna. Time had become more present, and the present moment more urgent.
Loping through the Latin Quarter in 2003, Hawke appears gaunt and slightly stooped and basically body-slammed by time. But it was his face—with its rough skin, scored forehead and sunken cheeks, and, especially, the deep, exclamatory furrow wedged between his eyes—that transfixed me. Some said he’d come through a divorce, and it had taken its toll; that’s what life does to people. I’d heard about such things but never seen it rendered so plainly, and on the face of someone only a few years older. It was shocking, even a little horrifying. And yet so marvelous to see, so unexpectedly righteous and true. Testify, Ethan Hawke’s Face, I thought. Tell it for real.
If they last long enough and have earned a large enough share of our hearts, movie stars are often cued to acknowledge time’s work on-screen. Traditionally, either a mirror or a younger character reflects the bad news, and we pause to consider it with them. At sixty-one, Katharine Hepburn gave herself a rueful once-over in The Lion in Winter. Forty-eight-year-old Marlon Brando was taunted by his teenage lover in Last Tango in Paris, “You must have been very handsome, years ago.” In The Towering Inferno, Fred Astaire (then seventy-five) gets the former treatment and Paul Newman (then forty-nine) the latter.
Pauline Kael was galled by this kind of thing. “It’s self-exploitation and it’s horrible,” she wrote about Hepburn’s pantomimed requiem for her beauty. But then Kael didn’t foresee the coming rarity of actors aging normally on-screen; nor, of course, the futility of an actress fudging her age on IMDb. Neither character acknowledges Hawke’s transformation in Before Sunset, probably because flashbacks to the previous film, and his previous, almost unrecognizably vernal self, make the point more poignantly than a more direct reference could.
I must admit, I was never much of a fan. I remember finding Hawke too on the nose, somehow, too much the thing he was supposed to be—always an actor first instead of a living, changing, insinuating being, someone who demanded watching. Of the many things I failed to imagine back then, watching Before Sunrise, I could not have conceived of a future in which a reprise of his role would feel like an act of generosity. I could not have fathomed feeling so grateful to Ethan Hawke for lending his face to a handmade, jewel-cut meditation on what life does to people—a slow-cooked sequel to a film about those too young and smitten to be concerned about what life might do to them. And what was life doing to me? I worry.
* * *
I worry, specifically, about 1999.
* * *
The year didn’t register too broadly on my personal barometers—fairly crap, if nowhere near as crap as 2000. But it was an extraordinary year in film. It was, I am prepared to argue, one of the greatest years for film in movie history, and certainly the best since I had been alive. Possibly the best year in the second half of the twentieth century, but there’s a two-drink minimum if you want me to summon the table-rapping righteousness I’d need to go that far. Making this case ten and more years later now, increasingly the rebuttal is written on the impossibly pink and pinker faces of my sparring partners: But you were really young in 1999. That’s the last time you felt anything really—appallingly—deeply. I call the theory of receptivity and now find you slightly sadder than before.
But it’s just not true. I was young, yes, but I was a terrible young person—an embarrassment to my kind, really. And 1999, for me, was a nonstarter; hardly a time, I think, when I found things new and exciting because I was young, or that I now associate with my new and exciting, young world. If anything, I felt old and worn-out and generally skeptical, and it just so happened that the only thing I was good at was spending a lot of time alone, in the dark. I never saw more movies in a single year, it’s true, but it was my great good fortune—and I remember thinking at the time, Can you believe this? Again this week?—that so many important directors of the last generation and the next one seemed to be cramming their best work into the final seconds of the century.
Why the confluence? Hollywood’s obsessive chartings account for little beyond box office, and even some critics get their narrow shoulders up about making lists of a year’s best and worst films. But perhaps patterns should form the beginning of a story, not an end. Perhaps a run as hugely, almost freakishly accomplished as 1999’s holds meaning that we can’t get at any other way. If they’re anything like Ethan Hawke’s face, anyway, the integrity of the patterns formed in our culture can at least remind us of exactly how many miles can be racked up over ten years—what a decade looks and feels like—which is handy information, especially for those who have not yet developed a sense of it for themselves.
Check it out: Leaving aside the big-ticket items that weren’t half-bad, like The Sixth Sense, American Beauty (okay, half-bad), The Insider, and, say, horror kitsch like Sleepy Hollow and The Blair Witch Project, something was up with the year’s output of smaller, largely independent, madly inventive films. An era was either peaking or having an intense, pre-expiration paroxysm. It was a culminating moment, in any case, when there was still something like legitimate independent film, and the people with money weren’t as frightened about taking risks; perhaps the co-opt had begun, but some fight was still left in the independent system. Even the old pros were stepping up their game, trying something new: Spike Lee brought Summer of Sam, David Lynch had The Straight Story, Kubrick unleashed the love-it-or-laugh-at-it Eyes Wide Shut, Woody Allen offered the underrated Sweet and Lowdown, and Pedro Almodóvar broke through with All About My Mother.
The reign of 1999 began with the wide release of Rushmore, in February. Forebear of all things fresh and wonderful, it set the tone and the bar for the year. Then came The Matrix and Office Space in March. I snuck out of my own stultifying office job to see Mike Judge’s second film over a distended lunch. A multiplex was underneath my office building, and another close by; my coworkers and I all cut out for movies now and then, each pretending we were the only one. Then there was Go, Hideous Kinky (oh, Kate!), Open Your Eyes, and eXistenZ, which I zoned out of on a kind of principle.
And then: Election. Glorious Election. After that came Last Night, a scene of which was shot in my apartment building; Buena Vista Social Club; then—yes—the first Austin Powers sequel; Run Lola Run; and the goddamn South Park movie. What a great time—a great summer—to be dating, now that I think of it. With South Park: Bigger, Longer, and Uncut, I could tell in the first fifteen minutes—as I was cramping up from the laughter gushing from some awesome, astonished wellspring inside me, never before tapped in a movie theater—that Mr. Easter Island beside me would hardly do.
Dick! Romance. And then the fall season, where things get ridiculous: Three Kings and the one-two crunch of Boys Don’t Cry and Fight Club, both of which spat me into the streets feeling like someone who’d just been kidnapped, injected with adrenaline and heartbreak, and jack-heeled out of a moving car. American Movie; Dogma; 42 Up; Sunshine; Girl, Interrupted; Holy Smoke (ohhh, Kate); and Mansfield Park are comparatively minor but still pretty fucking good. Then came Being John Malkovich, a true-blue circuit-buster, and The End of the Affair, which I am prepared to fight about, and finally Magnolia and one of my favorite films of all and ever: The Talented Mr. Ripley.
* * *
So, about that. About 1999. Then I had just spent four years studying the film and the literature of the past, as though that is indeed where all good and worthwhile things can be found. Then I was recently out of school and unconcerned with continuity, or the connection of inner scheming with the form a life might take in its fullest expression. Then I answered a relative’s holiday dinner-table question about what was next for me with the only option that seemed both honest and objectively accurate: Oblivion?
There’s a chronic history of that kind of thing, but let’s not get into it. Let’s just say that even when I wasn’t playing the in-house Visigoth at family gatherings, “the future” was a concept that tucked my mind into twilight mode, like a pleasant whiff of ether or the wave of a prestidigitator’s hand. In the fog of youth, all of my ambitions were internal and subject to daily renewal, so that each morning I thrashed my way back into the world, having just found my footing when the sun began to set. All of my longings had in some sense to do with time—for time, against time—and the trip wire that kept me from experiencing the world with any kind of reliable, butt-wagging rhythm. At the end of a century defined by its compressions of distance, space seemed more like a cute theory, a dead question. The quest for a modern self is defined not by a map but a schedule; to lack a clear timeline is to be lost. Often I went to the movies to mess with time, to get it off my back or keep it from staring glumly at me from across the room. Just as often I went to get right with it, to tether myself to the present in a way I couldn’t otherwise manage.
It worked to that weirdo’s advantage, I think, that in 1999, at the movies anyway, so little nostalgia perfumed the air. Through the nineties we dressed like filthy hippies one year and peg-legged mods the next, wearing Goodwill weeds to mourn a time when we didn’t exist. For a while, nostalgia was only acceptable in the very young, as though working forward from our parents’ generation might show us where we fit into the picture. Back then I figured that the mixed-up, inter-era quality of much of what was going on would make it impossible to reheat on the nostalgia-market stove. But then the eighties returned with the millennium, and the aughts had hardly hobbled out before the nineties came pogoing back, and I saw clothes I still had in my closet arranged in store windows for maximum retro cachet. Is it that time already?
Being a practicing nostalgic is no longer a stain on your record; odds are your records will come back into fashion before you begin to miss them. Its prevalence makes it tough to differentiate between meaningful recovery of the past and the perpetuation of a craze. A lot of crap gets unearthed not because it’s good but because ours are kitschy times. Often that’s the same difference: evoke a moment however you like, as long as it’s both past and specific. The yearning behind that impulse has less to do with sensibility than the drive toward a more stable sense of time.
That nostalgia has become an integral part of American culture is odd not least because it was initially considered the exact opposite. In The Future of Nostalgia, Russian-American scholar Svetlana Boym connects the beginning of closely measured, delimited time to the birth of what an eighteenth-century doctor called “the hypochondria of the heart.” Basically, absent the option of sweating the apocalypse, a pile of closely bordered countries were left in a kind of spiritual pickle. According to Boym, it’s no coincidence that, in medieval Europe, the end of the End Times was also the beginning of nostalgia, which is to say of individuals and collectives looking back to sustain a sense of identity; of pooling memory funds from which to draw meaning; and of shrinking time’s unwieldy continuum to reflect a specific situation, specific values, and specific ideals. The more objective our measurements for space and time, the stronger our impulse to transcend them with a sense of personalized order. The longing for a sense of confinement—of home—is both personal and communal, its basis purely human, its paradox even more so.
A Swiss doctor put a name to this longing in 1688, while writing a medical dissertation on a pathology he had observed in Europeans displaced from their homelands—students, servants, and soldiers, mostly. Johannes Hofer described the condition in terms of extreme homesickness, using Greek for street cred: nostos, meaning “to return home,” and algia, “longing, or sickness.” Over the centuries, treatments for nostalgia included leeches and potions and other earthy voodoos, but the first prescription was the most basic: get back where you belong.
Nostalgists were said to see ghosts, hear voices of loved ones, dream themselves home, mistake the imagined for the real, and confuse their tenses. If they had been able to find vintage candy stores or Ramones 45s on eBay, perhaps they would have, and we’d all roll our eyes and think, Let it go, man; live in the now. But in its origins as a mental illness, nostalgia is a fairly pristine metaphor for ambivalence toward modern displacements, the foreboding of the present moment and an untellably gnarly future. Which goes a long way to explaining why the first Americans considered nostalgia to be a weakness of the Old World. They were too far away from all of that, too new and forward-looking for crybaby callbacks to the good old days. Motility was progress, and a national sense of place and pride what they would make it—the future, not the past, was where American dreams were set.
The nostalgia diagnosis had disappeared by the twentieth century (but was revived in Israel, according to Boym), if not before the emergence of suitable heirs like neurasthenia, the diagnosis minted by the American physician George Miller Beard in 1869. Henry James’s brother William nicknamed the complaint, which was described as a kind of exhaustion with the new, “Americanitis,” and pharmacies began stocking elixirs to restore youth and ward off the anxiety and fatigue Beard blamed on the new world’s unwieldy speed and excessive freedoms. Neurasthenic men (including Teddy Roosevelt) were directed to recalibrate themselves against nature’s rhythms, while women (including Charlotte Perkins Gilman) were put to bed with orders to quiet their minds. Lengthening life spans only seemed to intensify a focus on the primacy of youth, with F. Scott Fitzgerald bewailing a generation’s first wrinkle even while the party staggered on.
I think you know the rest: We now live on a global clock, every standardized minute counted off on the screens we stare at all day. Our world has never been so closely observed and recorded and mediated, yet our lives have never seemed more self-contained. Western societies are increasingly a matter of discrete single, couple, or family plots, private spaces designed to sustain themselves apart from any conception of a whole. That tendency toward a discretionary existence accounts for the familiarity of the floating, customized Xanadu of the Internet, as well as the hunger for community it seemed to satisfy. The clock was restarted, and the challenge to scale one’s finite sense of time against an ultimate infinity was compounded by a sense of hair-straightening acceleration—the sudden potential to experience all things, all at once. It became possible—it became progress—to live at a speed and spacelessness that held the present in an exploratory suspension. We could prospect this new world like towheads in Narnia, with the sense that life on the outside was paused where we left it, and that “together” we might invent an end to loneliness.
What nobody told us is that nature may abhor a vacuum, but in its natural state longing is one big sucking sound. Over the last decade, the tightening cycle of nostalgia choking Western culture has proliferated into a kind of fractal loop, and for this we blame each other. But our backward fixations are less a product of the desire to stop the clock or retreat to a more fruitful era than the failure to adjust to a blown-out sense of time. In fact, what we call nostalgia today is too much remembrance of too little. We remember with the totemic shallowness, the emotional stinginess of sentiment. And we experience the present with the same superficial effort. Like overworked busboys gesturally wiping down tables between lunch-rush patrons, we launder the events of the day with the estrangements of irony, the culture’s favored detergent—or dead-earnest ideology, its competing brand—just to get on to the next one.
On the one hand, perhaps all of the world’s longing has led to this moment. Maybe this is what the poets warned about—Werther and Wordsworth and Whitman, all the wigged-out piners down the ages—maybe it wasn’t precisely petunias and print media and high-speed rail they were worried about but this exact moment. This, of course, has been said before. It’s impossible to know how different our concerns are from those of Hofer’s Swiss soldier sulking on the coast of Sweden, or the Dutch student dreaming of her mother’s toast with hagelslag at Oxford. It’s impossible to know how deeply programmed we are to long for different times, places, tastes—the tinted comforts of memory. It’s impossible to know, in a time that is no time and only time and all times, all the time, how that programming might shake out. But one suspects Broadway revivals might be involved.
On the other hand are the human things unchanged by time and technology. Things, perhaps, like the cosmic, wall-eating longing that takes light-years to get to you only to confer—burning past your self and any emotion you have known or might know to your very molecules—the unbearable nothingness from which it came. That stuff’s still kicking around. But again, the vessels we fashion to contain and commodify fathomless emotions now often look a lot like Jersey Boys, or jelly shoes, or the memes that streak across the Internet, fostering what little cultural intimacy is tenable when we are so many and moving so goddamn fast.
If anything could, recalibrating and redistributing the weight of our shared past might begin to restore a sense of pace to the culture, relieve it from the sleeperhold of easy nostalgias, and reroute the collective longing behind those impulses in some more useful direction. Svetlana Boym says nostalgia outbreaks often follow a revolution, and to the Velvet and the French I suppose we must add the Facebook. Though it seems unfair for a fogyish revival to court its constituents as they move through their thirties, which is to say just as the fog is finally clearing.
* * *
I was buying gum and uranium-enriched sunscreen not too long ago when the drugstore clerk was swept into a Proustian vortex by the sight of my gold Motorola RAZR on the counter. “Ohhhh,” she sighed, dumping my purchases into a tiny plastic bag. “I know that phone. I wanted one of those so bad.” She beamed at the memory. “That was the phone.” It was a rare moment of gadget relevance for me, so much so that I didn’t notice her use of the past tense and said something shy about how you don’t see many of the gold ones around. My clerk frowned. “Tsssk—not anymore,” she said. “I’m talkin’ like two years ago. When I was in high school.” I punched in my PIN. “I have a BlackBerry now.” I covered the phone with my palm and slid it off the counter. “You need to upgrade.”
And I did, believe me. Let’s call it the theory of receptivity 2.0.
The problem, if you will permit it to be thought a problem, is that I can already feel myself reaching the point my grandmother hit in her eighties. For her it was the airplane, the car, the telephone, the radio, the movies, um, the atom bomb, television, microwaves, space travel, CD, DVD, ADHD—fine. But the cell phone was a gadget too far; my grandma simply topped out. There’s a limit to the assimilating one person can do in a lifetime, and she reached it with fifteen years to go. I was the teenager who complained about being made to whip cream by hand in her kitchen, the congenitally late but ultimately enthusiastic adopter of everything wireless, compressed, ephemeral, convenient, and generally knuckle-sparing about the digital revolution. And yet every other week now, when I hear of something like Google glasses—which I guess are goggles that annotate the visible world with information about what can be bought, eaten, or sexually enjoyed therein—my first thought is a grandmotherly Aaaaand I’m out.
My knees are still good, my friends! I have perfect eyesight, and you know why? Because I let someone cut into my eyeballs with a laser. I don’t hold biweekly Fight Club vigils in my living room, frost commemorative Tracy Flick cupcakes for friends, or wrap myself around a life-size Ralph Fiennes pillow each night. But the older I get, the more protective I feel of something like 1999, a time that felt interesting even then because it was so firmly allied with the present. The longings I associate with it are longings outside of time, larger than me and the movies both. To experience such a radical burst of cinema in my own time stopped me in my tracks, but hardly permanently. If anything, it kept me seeking that feeling, of being a part of something remarkable, and staying awake enough to know it. If anything, I fear not having it in me to care in that same way about the latest tablet, or to develop strong feelings for what amounts to a delivery system, or to imprint sense memories on a soon-to-be-obsolete aluminum slab. Which is to say I worry less about being left behind than not wanting to board the party bus in the first place.
In a 1968 conversation with Marshall McLuhan, Norman Mailer used the example of plane travel—the latter word, as McLuhan points out, taken from the French verb to work—to illustrate his fear of a poorly inhabited present. “I don’t want to come on and be everybody’s Aunt Sophronia and complain about the good old days, which I never knew either,” Mailer said, by way of qualifying his feeling that flying a thousand miles in an hour means moving through “whole areas of existence which we have not necessarily gained. It may be confounding, it may finally be destructive of what is best in the human spirit.”
McLuhan replied with one of his casually immortal predictions: “If you push that all the way, what it means is that we will increasingly tend to inhabit all of these areas in depth, simultaneously.” Mailer took this like a shiv to the spleen. “But we will not inhabit them well!” he cried. “We will inhabit them with a desperately bad fit!”
I’m not sure I ever knew the good old days either. It’s too soon to tell. And believe me, young people, I know the case against me better than you ever could: I rarely go to shows anymore; I don’t troll the sites I can’t even name for hot new sounds; I never got into Mumblecore; too often I read new books because I’m being paid to; and it’s probably a matter of months before I look in the mirror and see Ethan Hawke staring back. I’m right there with you. But tell me, have you seen 1999? I was young then, but it didn’t mean that much to me. It seems like a while ago, I know, but it won’t be long before you’re standing where I am now, trying to sort your personal history from the stuff that stands alone. Time used to do that work for us, but time’s a little tired these days. Time needs a minute. For those of us born into pieces—you and me both, pal—the challenge is not salvaging a meaningful sense of time but determining how to build one within our current parameters, and then inhabit it well. I guess I can only say you’d be amazed how much the 1999s and the Ethan Hawkes of the world can help you with that, if you let them.
* * *
About two years after moving to New York and not long after the release of Before Sunset, I found myself sharing a room with Ethan Hawke. Should you move to New York and stick around long enough, eventually you will too. A group of us were huddled in a penthouse at the Tribeca Ritz for an informal brunch. Whether genuinely felt or a function of decorum, the hostess showed a helpless ambivalence about the space, which she informed me was bought for a song—and here we bow our heads to consider whatever her version of that tune might be—when the building went condo in late 2001. She and I had paused our tour of the apartment to consider the spare bedroom’s northern exposure when I felt Ethan Hawke draw up to my side.
I could tell you about the way he ate chicken curry with his hands but parsed a cupcake with a knife and fork, or the loud, actorly register of his voice across the room, or the way, during our brief exchange—the hell-or-high-water piece of him every person in that room was going to claim before he got away—he’d only pause from his uncomfortable flitting to grant eye contact when I paid him a compliment. I remember those things, as well as the view—out of the living room’s panoramic, southwesterly window bay—of what seemed to be endless harbor beyond Miss Liberty’s straining arm, and the horror I felt when a puzzled author with a blond bowl-cut approached the glass to ask if the lone child on the premises belonged to me. But beyond them is a memory of accrued and connective meaning. Beyond them is simply the sense of being engaged in a new and yet distinctly familiar way with Ethan Hawke’s face, and the felicities of time.
As a kid, probably around when I first saw that face, I nursed a secret conviction about how the perfect movie could be made. It would tell the story of a life, start to finish, only instead of makeup and lighting and showy acting, a commitment of decades would capture the effects of time. How had no one thought of imprinting the celluloid version of a story that spanned, say, thirty years with the truth by actually filming it over thirty years? It would be a life’s work, and a work shot through with life. If someone just took the time, I felt sure, by at least one measure of default it would be the greatest movie ever made.
I used to refer to myself as a nostalgic child: I mourned our kitchen garbage bin’s transition from paper to plastic as though it were the end of the belle epoque; when my father bought a new car, I took it as a personal insult to the past; and as far back as I can remember, I have been drawn to people and art beyond my lifetime’s reach. Ten was a tough birthday, as I recall; the ascent to double digits felt so total. How different, anyway, would twelve be from twenty-one, or seventy-two from twenty-seven, or thirty-three from thirty-three? Surely a person had to hit one hundred for time’s fullness to register in the same way; the rest is numerical noise.
But it’s something like learning of a movie that matches my ten-year-old specifications almost exactly, a film currently listed on IMDb as the Untitled 12-Year Project, which began shooting in 2002 and has a scheduled release date of 2015—it is finding out that this movie is being directed by Richard Linklater and counts Ethan Hawke among its cast that gives all of my early, instinctive jockeying with time and its tyrannies a kind of amber—not to say golden—glow. Like that drawing of the old lady whose nose becomes a young woman’s chin in another glance, time’s sorceries freeze under close scrutiny and flower when permitted a shift of perspective.
Had I known, had time allowed, I would have asked Ethan Hawke about the untitled twelve-year project, if only to game his attentions with another flattering word.
Copyright © 2013 by Michelle Orange