The Marginalian

Beloved Film Critic Roger Ebert on Writing, Life, and Mortality

“Most people choose to write a blog. I needed to.”

What a cultural loss to bid farewell to beloved critic Roger Ebert at the age of 70, after a long battle with the cancer that first claimed his jaw and, now, his life. Though I’d followed Ebert’s writing for some time, with the sort of detached appreciation one directs at cultural commentators, it wasn’t until I encountered him in the flesh at TED 2011, where he delivered his brave and stirring talk about learning to speak again, that I found myself in sheer awe of his spirit. A few months later, his memoir, Life Itself (public library), was released and I absorbed it voraciously. Today, some of its most resonant parts come back to mind, a bittersweet reminder of the incredible mind we’ve lost.

Roger Ebert (photograph by Anne Ryan, USA Today)

Ebert begins with an apt and beautiful metaphor for his existence:

I was born inside the movie of my life. The visuals were before me, the audio surrounded me, the plot unfolded inevitably but not necessarily. I don’t remember how I got into the movie, but it continues to entertain me.

In recalling the mismatch between his memory of his childhood home and the reality of the house once he returned as an adult, he captures that ineffable feeling of questioning the very fabric of reality:

I got the feeling I sometimes have when reality realigns itself. It’s a tingling sensation moving like a wave through my body. I know the feeling precisely. I doubt I’ve experienced it ten times in my life. I felt it at Smith Drugs when I was seven or eight and opened a nudist magazine and discovered that all women had breasts. I felt it when my father told me he had cancer. I felt it when I proposed marriage. Yes, and I felt it in the old Palais des Festivals at Cannes, when the Ride of the Valkyries played during the helicopter attack in Apocalypse Now.

The shape-shifting quality of memory is something Ebert returns to again and again:

One of the rewards of growing old is that you can truthfully say you lived in the past. … In these years after my illness, when I can no longer speak and am set aside from the daily flow, I live more in my memory and discover that a great many things are safely stored away. It all seems still to be in there somewhere. … You find a moment from your past, undisturbed ever since, still vivid, surprising you. In high school I fell under the spell of Thomas Wolfe: ‘A stone, a leaf, an unfound door; of a stone, a leaf, a door. And of all the forgotten faces.’ Now I feel all the faces returning to memory.

[…]

I remember everything. All my life I’ve been visited by unexpected flashes of memory unrelated to anything taking place at the moment. These retrieved moments I consider and replace on the shelf. When I began writing this book, memories came flooding to the surface, not because of any conscious effort but simply in the stream of writing. I started in a direction and the memories were waiting there, sometimes of things I hadn’t consciously thought about since.

Thomas Wolfe was, in fact, a big part of how Ebert fell in love with reading shortly after his high school graduation:

I read endlessly, often in class, always late at night. There was no pattern; one book led randomly to another. The great influence was Thomas Wolfe, who burned with the need to be a great novelist, and I burned in sympathy. I felt that if I could write like him, I would have nothing more to learn. I began to ride my bike over to campus and steal quietly into the bookstores.

Roger Ebert

Echoing E. B. White’s famous admonition that “a writer who waits for ideal conditions under which to work will die without putting a word on paper” and Isabel Allende’s counsel to “show up, show up, show up, and after a while the muse shows up, too,” Ebert recounts the most useful advice he ever received as a writer:

My colleague late at night, a year or two older, was Bill Lyon, who covered Champaign High School sports and became a columnist for the Philadelphia Inquirer. … Bill and I would labor deep into the night on Fridays, composing our portraits of the [football] games. I was a subscriber to the Great Lead Theory, which teaches that a story must have an opening paragraph so powerful as to leave few readers still standing. … Lyon watched as I ripped one sheet of copy paper after another out of my typewriter and finally gave me the most useful advice I have ever received as a writer: ‘One, don’t wait for inspiration, just start the damn thing. Two, once you begin, keep on until the end. How do you know how the story should begin until you find out where it’s going?’ These rules saved me half a career’s worth of time and gained me a reputation as the fastest writer in town. I’m not faster. I spend less time not writing.

Much like his ability to summon memories without deliberate effort, Ebert’s mastery of the writing process is largely an unconscious act, a state of mesmerism experienced in finding your purpose and doing what you love:

When I write, I fall into the zone many writers, painters, musicians, athletes, and craftsmen of all sorts seem to share: In doing something I enjoy and am expert at, deliberate thought falls aside and it is all just there. I think of the next word no more than the composer thinks of the next note.

He marvels at how the social web, despite his initial skepticism, liberated his impulse for self-expression as his writing took on an autobiographical life of its own:

My blog became my voice, my outlet, my ‘social media’ in a way I couldn’t have dreamed of. Into it I poured my regrets, desires, and memories. Some days I became possessed. The comments were a form of feedback I’d never had before, and I gained a better and deeper understanding of my readers. I made ‘online friends,’ a concept I’d scoffed at. Most people choose to write a blog. I needed to. I didn’t intend for it to drift into autobiography, but in blogging there is a tidal drift that pushes you that way. … the Internet encourages first-person writing, and I’ve always written that way. How can a movie review be written in the third person, as if it were an account of facts? If it isn’t subjective, there’s something false about it.

The blog let loose the flood of memories. Told sometimes that I should write my memoirs, I failed to see how I possibly could. I had memories, I had lived a good life in an interesting time, but I was at a loss to see how I could organize the accumulation of a lifetime. It was the blog that taught me how. It pushed me into first-person confession, it insisted on the personal, it seemed to organize itself in manageable fragments. Some of these words, since rewritten and expanded, first appeared in blog form. Most are here for the first time. They come pouring forth in a flood of relief.

Roger Ebert and wife Chaz, 2011 (photograph by Fred Thornhill/Reuters via The New York Times)

He captures the diverse spectrum of what we call “journalism,” sharply aware of where he plants his own stake:

I used journalism to stay at one remove from my convictions: I wouldn’t risk arrest but would bravely report about those who did. My life has followed that pattern. I observe and describe at a prudent reserve.

At the heart of cinema, Ebert sees a deep resonance with the human condition:

If you pay attention to the movies they will tell you what people desire and fear. Movies are hardly ever about what they seem to be about. Look at a movie that a lot of people love, and you will find something profound, no matter how silly the film may be.

On the art of the interview:

My secret as an interviewer was that I was actually impressed by the people I interviewed … I am beneath everything else a fan. I was fixed in this mode as a young boy and am awed by people who take the risks of performance. I become their advocate and find myself in sympathy.

On writing as a substitute for the human pleasures that were taken from him by his illness:

What’s sad about not eating is the experience, whether at a family reunion or at midnight by yourself in a greasy spoon under the L tracks. The loss of dining, not the loss of food. Unless I’m alone, it doesn’t involve dinner if it doesn’t involve talking. The food and drink I can do without easily. The jokes, gossip, laughs, arguments, and memories I miss. I ran in crowds where anyone was likely to start reciting poetry on a moment’s notice. Me too. But not me anymore. So yes, it’s sad. Maybe that’s why writing has become so important to me. You don’t realize it, but we’re at dinner right now.

On our relationship with mortality, at once rather complex and rather simple:

We’re all dying in increments.

Complement Life Itself with Ebert’s unforgettable TED talk.


Uncreative Writing: Redefining Language and Authorship in the Digital Age

“An updated notion of genius would have to center around one’s mastery of information and its dissemination.”

“And your way, is it really YOUR way?” Henry Miller famously asked. “Substantially all ideas are second-hand, consciously and unconsciously drawn from a million outside sources,” Mark Twain consoled Helen Keller when she was accused of plagiarism. Even our brains might be wired for the necessary forgettings of creativity. What, then, is the value of “originality” — or even its definition?

A recent interview on The Awl reminded me of a wonderful book by Kenneth Goldsmith — MoMA’s first poetry laureate, founder of the massive grassroots audio archive UbuWeb, and professor at my alma mater, UPenn’s Kelly Writers House — titled Uncreative Writing: Managing Language in the Digital Age (public library; UK). Much like Vannevar Bush did in 1945 when he envisioned the future of knowledge and presaged the value of what he poetically termed “trail blazers, those who find delight in the task of establishing useful trails through the enormous mass of the common record,” Goldsmith examines the importance of sorting existing ideas and makes a case for the cultural value of stealing like an artist, particularly as we’re building our new literary canon.

Goldsmith writes in the introduction:

In 1969 the conceptual artist Douglas Huebler wrote, ‘The world is full of objects, more or less interesting; I do not wish to add any more.’ I’ve come to embrace Huebler’s ideas, though it might be retooled as ‘The world is full of texts, more or less interesting; I do not wish to add any more.’ It seems an appropriate response to a new condition in writing today: faced with an unprecedented amount of available text, the problem is not needing to write more of it; instead, we must learn to negotiate the vast quantity that exists. How I make my way through the thicket of information — how I manage it, how I parse it, how I organize and distribute it — is what distinguishes my writing from yours.

He samples a beautiful concept that broadens our definition of genius:

Literary critic Marjorie Perloff has recently begun using the term unoriginal genius to describe this tendency emerging in literature. Her idea is that, because of changes brought on by technology and the Internet, our notion of genius — a romantic isolated figure — is outdated. An updated notion of genius would have to center around one’s mastery of information and its dissemination. Perloff has coined a term, moving information, to signify both the act of pushing language around as well as the act of being emotionally moved by that process. She posits that today’s writer resembles more a programmer than a tortured genius, brilliantly conceptualizing, constructing, executing, and maintaining a writing machine.

(Though, one might argue, information is only valuable when it’s synthesized into knowledge, which is then in turn transmuted into wisdom — so, perhaps, an even better concept would be moving wisdom.)

Goldsmith goes on to examine how technology has sparked a new culture of transformation as authorship:

Today, technology has exacerbated these mechanistic tendencies in writing … inciting younger writers to take their cues from the workings of technology and the Web as ways of constructing literature. As a result, writers are exploring ways of writing that have been thought, traditionally, to be outside the scope of literary practice: word processing, databasing, recycling, appropriation, intentional plagiarism, identity ciphering, and intensive programming, to name but a few.

[…]

There’s been an explosion of writers employing strategies of copying and appropriation over the past few years, with the computer encouraging writers to mimic its workings. When cutting and pasting are integral to the writing process, it would be mad to imagine that writers wouldn’t exploit these functions in extreme ways that weren’t intended by their creators.

Except, of course, none of this is new. We already know that as far back as the Middle Ages, authors were making remarkable florilegia, the Tumblrs of their day, by literally cutting and pasting text from existing manuscripts to create entirely new contexts.

Still, Goldsmith is careful not to disparage traditional literature but laments the stale values it has instilled in us:

I’m not saying that such writing should be discarded. . . . But I’m sensing that literature — infinite in its potential of ranges and expression — is in a rut, tending to hit the same note again and again, confining itself to the narrowest of spectrums, resulting in a practice that has fallen out of step and unable to take part in arguably the most vital and exciting cultural discourses of our time. I find this to be a profoundly sad moment — and a great lost opportunity for literary creativity to revitalize itself in ways it hasn’t imagined.

Perhaps one reason writing is stuck might be the way creative writing is taught. In regard to the many sophisticated ideas concerning media, identity, and sampling developed over the past century, books about how to be a creative writer have completely missed the boat, relying on clichéd notions of what it means to be ‘creative.’

For the past several years, Goldsmith has been teaching a Penn class after which the book is titled, inverting the paradigm of traditional “creative writing” courses. His students are penalized for any semblance of originality and “creativity,” and rewarded for plagiarism, repurposing, sampling, and outright stealing. But as counterproductive and blasphemous as this may sound, it turns out to be a gateway to something unusual yet inevitable, that certain slot machine quality of creativity:

The secret: the suppression of self-expression is impossible. Even when we do something as seemingly ‘uncreative’ as retyping a few pages, we express ourselves in a variety of ways. The act of choosing and reframing tells us as much about ourselves as our story about our mother’s cancer operation. It’s just that we’ve never been taught to value such choices. After a semester of forcibly suppressing a student’s ‘creativity’ by making them plagiarize and transcribe, she will approach me with a sad face at the end of the semester, telling me how disappointed she was because, in fact, what we had accomplished was not uncreative at all; by not being ‘creative,’ she produced the most creative body of writing in her life. By taking an opposite approach to creativity — the most trite, overused, and ill-defined concept in a writer’s training — she had emerged renewed and rejuvenated, on fire and in love again with writing.

Goldsmith echoes legendary designer Charles Eames, who famously advised to “innovate only as a last resort,” and writes:

Having worked in advertising for many years as a ‘creative director,’ I can tell you that, despite what cultural pundits might say, creativity — as [it has] been defined by our culture with its endless parade of formulaic novels, memoirs, and films — is the thing to flee from, not only as a member of the ‘creative class’ but also as a member of the ‘artistic class.’ Living when technology is changing the rules of the game in every aspect of our lives, it’s time to question and tear down such clichés and lay them on the floor in front of us, then reconstruct these smoldering embers into something new, something contemporary, something — finally — relevant.

In addressing the most common contestations to his ideas about accepting all language as poetry by mere reframing — about what happens to the notion of authorship, about how careers and canons are to be established, about whether the heart of literature is reducible to mere algorithms — Goldsmith seconds a sentiment French polymath Henri Poincaré shared more than a century ago when he noted that to create is merely to choose wisely from the existing pool of ideas:

What becomes important is what you — the author — [decide] to choose. Success lies in knowing what to include and — more important — what to leave out. If all language can be transformed into poetry by mere reframing — an exciting possibility — then she who reframes words in the most charged and convincing way will be judged the best. I agree that the moment we throw judgment and quality out the window we’re in trouble. Democracy is fine for YouTube, but it’s generally a recipe for disaster when it comes to art. While all the words may be created equal — and thus treated — the way in which they’re assembled isn’t; it’s impossible to suspend judgment and folly to dismiss quality. Mimesis and replication [don’t] eradicate authorship, rather they simply place new demands on authors who must take these new conditions into account as part and parcel of the landscape when conceiving of a work of art: if you don’t want it copied, don’t put it online.

Ultimately, he argues that all of this is about the evolution — rather than the destruction — of authorship:

In 1959 the poet and artist Brion Gysin claimed that writing was fifty years behind painting. And he might still be right: in the art world, since impressionism, the avant-garde has been the mainstream. Innovation and risk taking have been consistently rewarded. But, in spite of the successes of modernism, literature has remained on two parallel tracks, the mainstream and the avant-garde, with the two rarely intersecting. Yet the conditions of digital culture have unexpectedly forced a collision, scrambling the once-sure footing of both camps. Suddenly, we all find ourselves in the same boat grappling with new questions concerning authorship, originality, and the way meaning is forged.

The rest of Uncreative Writing goes on to explore the history of appropriation in art, the emerging interchangeability between words and images in digital culture, the challenges of defining one’s identity in the vastness of the online environment, and many other pressing facets of what it means to be a writer — or, even more broadly, a creator — in the age of the internet. Complement it with the equally subversive How To Talk About Books You Haven’t Read.

Photographs: Cameron Wittig (top); Grand Life Hotels (bottom)


The Genius of Dogs and How It Expands Our Understanding of Human Intelligence

“Genius means that someone can be gifted with one type of cognition while being average or below average in another.”

For much of modern history, dogs have inspired a wealth of art and literature, profound philosophical meditations, scientific curiosity, deeply personal letters, photographic admiration, and even some cutting-edge data visualization. But what is it that makes dogs so special in and of themselves, and so dear to us?

Despite the mind-numbing title, The Genius of Dogs: How Dogs Are Smarter than You Think (public library) by Brian Hare, evolutionary anthropologist and founder of the Duke Canine Cognition Center, and Vanessa Woods offers a fascinating tour of radical research on canine cognition, from how the self-domestication of dogs gave them a new kind of social intelligence to what the minds of dogs reveal about our own. In fact, one of the most compelling parts of the book has less to do with dogs and more with genius itself.

In examining the definition of genius, Hare echoes British novelist Amelia E. Barr, who wisely noted in 1901 that “genius is nothing more nor less than doing well what anyone can do badly.” Hare points out that standardized tests provide a very narrow — and thus poor — definition of genius:

As you probably remember, tests such as IQ tests, GREs, and SATs focus on basic skills like reading, writing, and analytical abilities. The tests are favored because on average, they predict scholastic success. But they do not measure the full capabilities of each person. They do not explain Ted Turner, Ralph Lauren, Bill Gates, and Mark Zuckerberg, who all dropped out of college and became billionaires.

Instead, Hare offers a conception of genius that borrows from Howard Gardner’s seminal 1983 theory of multiple intelligences:

A cognitive approach is about celebrating different kinds of intelligence. Genius means that someone can be gifted with one type of cognition while being average or below average in another.

For a perfect example, Hare points to animal scientist Temple Grandin:

Temple Grandin, at Colorado State University, is autistic yet is also the author of several books, including Animals Make Us Human, and has done more for animal welfare than almost anyone. Although Grandin struggles to read people’s emotions and social cues, her extraordinary understanding of animals has allowed her to reduce the stress of millions of farm animals.

The cognitive revolution changed the way we think about intelligence. It began in the decade that all social revolutions seemed to have happened, the sixties. Rapid advances in computer technology allowed scientists to think differently about the brain and how it solves problems. Instead of the brain being either more or less full of intelligence, like a glass of wine, the brain is more like a computer, where different parts work together. USB ports, keyboards, and modems bring in new information from the environment; a processor helps digest and alter the information into a usable format, while a hard drive stores important information for later use. Neuroscientists realized that, like a computer, many parts of the brain are specialized for solving different types of problems.

An example of this comes from the study of memory, which we already know is fascinating in its fallibility:

One of the best-studied cognitive abilities is memory. In fact, we usually think of geniuses as people who have an extraordinary memory for facts and figures, since such people often score off the charts on IQ tests. But just as there are different types of intelligence, there are different types of memory. There is memory for events, faces, navigation, things that occurred recently or long ago — the list goes on. If you have a good memory in one of these areas, it does not necessarily mean your other types of memory are equally good.

Ultimately, the notion of multiple intelligences is what informs the research on dog cognition:

There are many definitions of intelligence competing for attention in popular culture. But the definition that has guided my research and that applies throughout the book is a very simple one. The genius of dogs — of all animals, for that matter, including humans — has two criteria:

  1. A mental skill that is strong compared with others, either within your own species or in closely related species.
  2. The ability to spontaneously make inferences.

(This second criterion comes strikingly close to famous definitions of creativity.)

The Genius of Dogs goes on to explore the specific types of intelligence at which dogs excel, including their empathic acumen of taking another’s visual perspective and learning from another’s actions, their ability to interpret and act upon human communicative gestures, and the unique ways in which they go about asking for help. Pair it with John Homans’s indispensable What’s a Dog For?, artist Maira Kalman’s illustrated love letter to our canine companions, and cognitive scientist Alexandra Horowitz on how a dog actually “sees” the world through smell.

Public domain photographs via Flickr Commons


