The Marginalian

2013’s Best Books on Writing and Creativity

Timeless wisdom and practical advice on the pleasures and perils of the written word and the creative life.

After the year’s best books in photography, psychology and philosophy, art and design, history and biography, science and technology, “children’s” (though we all know what that means), and pets and animals, the season’s subjective selection of best-of reading lists concludes with the year’s best reads on writing and creativity.

1. WHY WE WRITE

The question of why writers write holds especial mesmerism, both as a piece of psychological voyeurism and as a beacon of self-conscious hope that if we got a glimpse of the innermost drivers of greats, maybe, just maybe, we might be able to replicate the workings of genius in our own work. So why do great writers write? George Orwell itemized four universal motives. Joan Didion saw it as access to her own mind. For David Foster Wallace, it was about fun. Joy Williams found in it a gateway from the darkness to the light. For Charles Bukowski, it sprang from the soul like a rocket. Italo Calvino found in writing the comfort of belonging to a collective enterprise.

In Why We Write: 20 Acclaimed Authors on How and Why They Do What They Do (public library), editor Meredith Maran seeks out answers on the why and advice on the how of writing from twenty of today’s most acclaimed authors.

Prolific novelist Isabel Allende shares in Kurt Vonnegut’s insistence on rooting storytelling in personal experience and writes:

I need to tell a story. It’s an obsession. Each story is a seed inside of me that starts to grow and grow, like a tumor, and I have to deal with it sooner or later. Why a particular story? I don’t know when I begin. That I learn much later. Over the years I’ve discovered that all the stories I’ve told, all the stories I will ever tell, are connected to me in some way. If I’m talking about a woman in Victorian times who leaves the safety of her home and comes to the Gold Rush in California, I’m really talking about feminism, about liberation, about the process I’ve gone through in my own life, escaping from a Chilean, Catholic, patriarchal, conservative, Victorian family and going out into the world.

Though many famous writers have notoriously deliberate routines and rituals, Allende’s is among the most unusual and rigorous. Ultimately, however, she echoes Chuck Close (“Inspiration is for amateurs — the rest of us just show up and get to work.”), Thomas Edison (“Success is the product of the severest kind of mental and physical application.”), E. B. White (“A writer who waits for ideal conditions under which to work will die without putting a word on paper.”) and Tchaikovsky (“A self-respecting artist must not fold his hands on the pretext that he is not in the mood.”), stressing the importance of work ethic over the proverbial muse:

I start all my books on January eighth. Can you imagine January seventh? It’s hell. Every year on January seventh, I prepare my physical space. I clean up everything from my other books. I just leave my dictionaries, and my first editions, and the research materials for the new one. And then on January eighth I walk seventeen steps from the kitchen to the little pool house that is my office. It’s like a journey to another world. It’s winter, it’s raining usually. I go with my umbrella and the dog following me. From those seventeen steps on, I am in another world and I am another person. I go there scared. And excited. And disappointed — because I have a sort of idea that isn’t really an idea. The first two, three, four weeks are wasted. I just show up in front of the computer. Show up, show up, show up, and after a while the muse shows up, too. If she doesn’t show up invited, eventually she just shows up.

She offers three pieces of advice for aspiring writers:

  • It’s worth the work to find the precise word that will create a feeling or describe a situation. Use a thesaurus, use your imagination, scratch your head until it comes to you, but find the right word.
  • When you feel the story is beginning to pick up rhythm—the characters are shaping up, you can see them, you can hear their voices, and they do things that you haven’t planned, things you couldn’t have imagined—then you know the book is somewhere, and you just have to find it, and bring it, word by word, into this world.
  • When you tell a story in the kitchen to a friend, it’s full of mistakes and repetitions. It’s good to avoid that in literature, but still, a story should feel like a conversation. It’s not a lecture.

Celebrated journalist and New Yorker staff writer Susan Orlean considers the critical difference between fiction and nonfiction, exploring the osmotic balance of escapism and inner stillness:

When it comes to nonfiction, it’s important to note the very significant difference between the two stages of the work. Stage one is reporting. Stage two is writing.

Reporting is like being the new kid in school. You’re scrambling to learn something very quickly, being a detective, figuring out who the people are, dissecting the social structure of the community you’re writing about. Emotionally, it puts you in the place that everybody dreads. You’re the outsider. You can’t give in to your natural impulse to run away from situations and people you don’t know. You can’t retreat to the familiar.

Writing is exactly the opposite. It’s private. The energy of it is so intense and internal, it sometimes makes you feel like you’re going to crumple. A lot of it happens invisibly. When you’re sitting at your desk, it looks like you’re just sitting there, doing nothing.

A necessary antidote to the tortured-genius cultural mythology of the writer, Orlean, like Ray Bradbury, conceives of writing as a source of joy, even when challenging:

Writing gives me great feelings of pleasure. There’s a marvelous sense of mastery that comes with writing a sentence that sounds exactly as you want it to. It’s like trying to write a song, making tiny tweaks, reading it out loud, shifting things to make it sound a certain way. It’s very physical. I get antsy. I jiggle my feet a lot, get up a lot, tap my fingers on the keyboard, check my e-mail. Sometimes it feels like digging out of a hole, but sometimes it feels like flying. When it’s working and the rhythm’s there, it does feel like magic to me.

She ends with four pieces of wisdom for writers:

  • You have to simply love writing, and you have to remind yourself often that you love it.
  • You should read as much as possible. That’s the best way to learn how to write.
  • You have to appreciate the spiritual component of having an opportunity to do something as wondrous as writing. You should be practical and smart and you should have a good agent and you should work really, really hard. But you should also be filled with awe and gratitude about this amazing way to be in the world.
  • Don’t be ashamed to use the thesaurus. I could spend all day reading Roget’s! There’s nothing better when you’re in a hurry and you need the right word right now.

True to Alan Watts’s philosophy and the secret to the life of purpose, Michael Lewis remained uninterested in money as a motive — in fact, he recognized the trap of the hedonic treadmill and got out before it was too late:

Before I wrote my first book in 1989, the sum total of my earnings as a writer, over four years of freelancing, was about three thousand bucks. So it did appear to be financial suicide when I quit my job at Salomon Brothers — where I’d been working for a couple of years, and where I’d just gotten a bonus of $225,000, which they promised they’d double the following year—to take a $40,000 book advance for a book that took a year and a half to write.

My father thought I was crazy. I was twenty-seven years old, and they were throwing all this money at me, and it was going to be an easy career. He said, “Do it another ten years, then you can be a writer.” But I looked around at the people on Wall Street who were ten years older than me, and I didn’t see anyone who could have left. You get trapped by the money. Something dies inside. It’s very hard to preserve the quality in a kid that makes him jump out of a high-paying job to go write a book.

More than a living, Lewis found in writing a true calling — the kind of deep flow that fully absorbs the mind and soul:

There’s no simple explanation for why I write. It changes over time. There’s no hole inside me to fill or anything like that, but once I started doing it, I couldn’t imagine wanting to do anything else for a living. I noticed very quickly that writing was the only way for me to lose track of the time.

[…]

I used to get the total immersion feeling by writing at midnight. The day is not structured to write, and so I unplug the phones. I pull down the blinds. I put my headset on and play the same soundtrack of twenty songs over and over and I don’t hear them. It shuts everything else out. So I don’t hear myself as I’m writing and laughing and talking to myself. I’m not even aware I’m making noise. I’m having a physical reaction to a very engaging experience. It is not a detached process.

“Art suffers the moment other people start paying for it,” Hugh MacLeod famously wrote. It might be an overly cynical notion, one that perpetuates the unjustified yet deep-seated cultural guilt over simultaneously doing good and doing well, but Lewis echoes the sentiment:

Once you have a career, and once you have an audience, once you have paying customers, the motives for doing it just change.

And yet Lewis approaches the friction between intrinsic and extrinsic motivation — one experienced by anyone who loves what they do and takes pride in clarity of editorial vision, but has an audience whose approval or disapproval becomes increasingly challenging to tune out — with extraordinary candor and insight:

Commercial success makes writing books a lot easier to do, and it also creates pressure to be more of a commercial success. If you sold a million books once, your publisher really, really thinks you might sell a million books again. And they really want you to do it.

That dynamic has the possibility of constraining the imagination. There are invisible pressures. There’s a huge incentive to write about things that you know will sell. But I don’t find myself thinking, “I can’t write about that because it won’t sell.” It’s such a pain in the ass to write a book, I can’t imagine writing one if I’m not interested in the subject.

And yet his clarity of vision is still what guides the best of his work:

Those are the best moments, when I’ve got the whale on the line, when I see exactly what it is I’ve got to do.

After that moment there’s always misery. It never goes quite like you think, but that moment is a touchstone, a place to come back to. It gives you a kind of compass to guide you through the story.

That feeling has never done me wrong. Sometimes you don’t understand the misery it will lead to, but it’s always been right to feel it. And it’s a great feeling.

Lewis offers some advice to aspiring writers, adding to the collected wisdom of literary greats with his three guidelines:

  1. It’s always good to have a motive to get you in the chair. If your motive is money, find another one.
  2. I took my biggest risk when I walked away from a lucrative job at age twenty-seven to be a writer. I’m glad I was too young to realize what a dumb decision it seemed to be, because it was the right decision for me.
  3. A lot of my best decisions were made in a state of self-delusion. When you’re trying to create a career as a writer, a little delusional thinking goes a long way.

Sample more of this indispensable compendium here, here, here, and here.

2. MANAGE YOUR DAY-TO-DAY

We seem to have a strange but all too human cultural fixation on the daily routines and daily rituals of famous creators, from Vonnegut to Burroughs to Darwin — as if a glimpse of their day-to-day would somehow magically infuse ours with equal potency, or replicating it would allow us to replicate their genius in turn. And though much of this is mere cultural voyeurism, there is something to be said for the value of a well-engineered daily routine to anchor the creative process. Manage Your Day-to-Day: Build Your Routine, Find Your Focus, and Sharpen Your Creative Mind (public library), edited by Behance’s 99U editor-in-chief Jocelyn Glei and featuring contributions from twenty of today’s most celebrated thinkers and doers, delves into the secrets of this holy grail of creativity.

Reflecting Thomas Edison’s oft-cited proclamation that “genius is one percent inspiration, ninety-nine percent perspiration,” after which 99U is named, the crucial importance of consistent application is a running theme. (Though I prefer to paraphrase Edison to “Genius is one percent inspiration, ninety-nine percent aspiration” — since true aspiration produces effort that feels gratifying rather than merely grueling, enhancing the grit of perspiration with the gift of gratification.)

One of the book’s strongest insights comes from Gretchen Rubin — author of The Happiness Project: Or, Why I Spent a Year Trying to Sing in the Morning, Clean My Closets, Fight Right, Read Aristotle, and Generally Have More Fun, one of these 7 essential books on the art and science of happiness, titled after her fantastic blog of the same name — who points to frequency as the key to creative accomplishment:

We tend to overestimate what we can do in a short period, and underestimate what we can do over a long period, provided we work slowly and consistently. Anthony Trollope, the nineteenth-century writer who managed to be a prolific novelist while also revolutionizing the British postal system, observed, “A small daily task, if it be really daily, will beat the labours of a spasmodic Hercules.” Over the long run, the unglamorous habit of frequency fosters both productivity and creativity.

Frequency, she argues, helps facilitate what Arthur Koestler has famously termed “bisociation” — the crucial ability to link the seemingly unlinkable, which is the defining characteristic of the creative mind. Rubin writes:

You’re much more likely to spot surprising relationships and to see fresh connections among ideas, if your mind is constantly humming with issues related to your work. When I’m deep in a project, everything I experience seems to relate to it in a way that’s absolutely exhilarating. The entire world becomes more interesting. That’s critical, because I have a voracious need for material, and as I become hyperaware of potential fodder, ideas pour in. By contrast, working sporadically makes it hard to keep your focus. It’s easy to become blocked, confused, or distracted, or to forget what you were aiming to accomplish.

[…]

Creativity arises from a constant churn of ideas, and one of the easiest ways to encourage that fertile froth is to keep your mind engaged with your project. When you work regularly, inspiration strikes regularly.

Echoing Alexander Graham Bell, who memorably wrote that “it is the man who carefully advances step by step … who is bound to succeed in the greatest degree,” and Virginia Woolf, who extolled the creative benefits of keeping a diary, Rubin writes:

Step by step, you make your way forward. That’s why practices such as daily writing exercises or keeping a daily blog can be so helpful. You see yourself do the work, which shows you that you can do the work. Progress is reassuring and inspiring; panic and then despair set in when you find yourself getting nothing done day after day. One of the painful ironies of work life is that the anxiety of procrastination often makes people even less likely to buckle down in the future.

Riffing on wisdom from her latest book, Happier at Home: Kiss More, Jump More, Abandon a Project, Read Samuel Johnson, and My Other Experiments in the Practice of Everyday Life, Rubin offers:

I have a long list of “Secrets of Adulthood,” the lessons I’ve learned as I’ve grown up, such as: “It’s the task that’s never started that’s more tiresome,” “The days are long, but the years are short,” and “Always leave plenty of room in the suitcase.” One of my most helpful Secrets is, “What I do every day matters more than what I do once in a while.”

With a sentiment reminiscent of William James’s timeless words on habit, she concludes:

Day by day, we build our lives, and day by day, we can take steps toward making real the magnificent creations of our imaginations.

Entrepreneurship guru and culture-sage Seth Godin seconds Rubin and admonishes against confusing vacant ritualization with creative rituals that actually spur productivity:

Everybody who does creative work has figured out how to deal with their own demons to get their work done. There is no evidence that setting up your easel like Van Gogh makes you paint better. Tactics are idiosyncratic. But strategies are universal, and there are a lot of talented folks who are not succeeding the way they want to because their strategies are broken.

The strategy is simple, I think. The strategy is to have a practice, and what it means to have a practice is to regularly and reliably do the work in a habitual way.

There are many ways you can signify to yourself that you are doing your practice. For example, some people wear a white lab coat or a particular pair of glasses, or always work in a specific place — in doing these things, they are professionalizing their art.

He echoes Chuck Close (“Inspiration is for amateurs — the rest of us just show up and get to work.”), Tchaikovsky (“A self-respecting artist must not fold his hands on the pretext that he is not in the mood.”), E. B. White (“A writer who waits for ideal conditions under which to work will die without putting a word on paper.”), and Isabel Allende (“Show up, show up, show up, and after a while the muse shows up, too.”), observing:

The notion that I do my work here, now, like this, even when I do not feel like it, and especially when I do not feel like it, is very important. Because lots and lots of people are creative when they feel like it, but you are only going to become a professional if you do it when you don’t feel like it. And that emotional waiver is why this is your work and not your hobby.

Originally featured in May — read more here.

3. STILL WRITING

“At its best, the sensation of writing is that of any unmerited grace,” Annie Dillard famously observed, adding the quintessential caveat, “It is handed to you, but only if you look for it. You search, you break your heart, your back, your brain, and then — and only then — it is handed to you.” And yet, Zadie Smith admonished in her 10 rules of writing, it’s perilous to romanticize the “vocation of writing”: “There is no ‘writer’s lifestyle.’ All that matters is what you leave on the page.”

Still, surely there must be more to it than that — whole worlds rise and fall, entire universes blossom and die daily in that enchanted space between the writer’s sensation of writing and the word’s destiny of being written on a page. For all that’s been mulled about the writing life and its perpetual osmosis of everyday triumphs and tragedies, its existential feats and failures, at its heart remains an immutable mystery — how can a calling be at once so transcendent and so soul-crushing, and what is it that enthralls so many souls into its paradoxical grip, into feeling compelled to write “not because they can but because they have to”? That, and oh so much more, is what Dani Shapiro explores in Still Writing: The Pleasures and Perils of a Creative Life (public library) — her magnificent memoir of the writing life, at once disarmingly personal and brimming with widely resonant wisdom on the most universal challenges and joys of writing.

Shapiro opens with the kind of crisp conviction that underpins the entire book:

Everything you need to know about life can be learned from a genuine and ongoing attempt to write.

Book sculpture by an anonymous artist left at Edinburgh’s Filmhouse.

Far from a lazy aphorism, however, this proclamation comes from her own hard-earned experience — fragments of which resonate deeply with most of us, on one level or another — that Shapiro synthesizes beautifully:

When I wasn’t writing, I was reading. And when I wasn’t writing or reading, I was staring out the window, lost in thought. Life was elsewhere — I was sure of it—and writing was what took me there. In my notebooks, I escaped an unhappy and lonely childhood. I tried to make sense of myself. I had no intention of becoming a writer. I didn’t know that becoming a writer was possible. Still, writing was what saved me. It presented me with a window into the infinite. It allowed me to create order out of chaos.

‘Paper Typewriter’ by artist Jennifer Collier.

Above all, however, Shapiro’s core point has to do with courage and the creative life:

The writing life requires courage, patience, persistence, empathy, openness, and the ability to deal with rejection. It requires the willingness to be alone with oneself. To be gentle with oneself. To look at the world without blinders on. To observe and withstand what one sees. To be disciplined, and at the same time, take risks. To be willing to fail — not just once, but again and again, over the course of a lifetime. “Ever tried, ever failed,” Samuel Beckett once wrote. “No matter. Try again. Fail again. Fail better.” It requires what the great editor Ted Solotoroff once called endurability.

In other words, it requires grit — the science of which earned psychologist Angela Duckworth her recent MacArthur “genius” grant and the everyday art of which earns actual geniuses their status.

Writing is also, as Shapiro poetically puts it, a way “to forge a path out of [our] own personal wilderness with words” — a way to both exercise and exorcise our most fundamental insecurities and to practice what Rilke so memorably termed living the questions, the sort of “negative capability” of embracing uncertainty that Keats thought was so fundamental to the creative process. Shapiro echoes that Dillardian insistence on presence as the heart of the creative life:

‘Flights of mind’ by artist Vita Wells.

We are all unsure of ourselves. Every one of us walking the planet wonders, secretly, if we are getting it wrong. We stumble along. We love and we lose. At times, we find unexpected strength, and at other times, we succumb to our fears. We are impatient. We want to know what’s around the corner, and the writing life won’t offer us this. It forces us into the here and now. There is only this moment, when we put pen to page.

[…]

The page is your mirror. What happens inside you is reflected back. You come face-to-face with your own resistance, lack of balance, self-loathing, and insatiable ego—and also with your singular vision, guts, and fortitude. No matter what you’ve achieved the day before, you begin each day at the bottom of the mountain. … Life is usually right there, though, ready to knock us over when we get too sure of ourselves. Fortunately, if we have learned the lessons that years of practice have taught us, when this happens, we endure. We fail better. We sit up, dust ourselves off, and begin again.

In fact, it’s hard not to feel Dillard’s influence and the echo of her voice in Shapiro’s own words as she considers the conflicted yet inexorable mesmerism of writing:

What is it about writing that makes it—for some of us — as necessary as breathing? It is in the thousands of days of trying, failing, sitting, thinking, resisting, dreaming, raveling, unraveling that we are at our most engaged, alert, and alive. Time slips away. The body becomes irrelevant. We are as close to consciousness itself as we will ever be. This begins in the darkness. Beneath the frozen ground, buried deep below anything we can see, something may be taking root. Stay there, if you can. Don’t resist. Don’t force it, but don’t run away. Endure. Be patient. The rewards cannot be measured. Not now. But whatever happens, any writer will tell you: This is the best part.

These rewards manifest not as grand honors and prizes and bestseller rankings — though hardly any writer would deny the warming pleasure of those, however fleeting — but in the cumulative journey of becoming. As Cheryl Strayed put it in her timelessly revisitable meditation on life, “The useless days will add up to something. . . . These things are your becoming.” Ultimately, Shapiro seconds this sentiment by returning to the notion of presence and the art of looking as the centripetal force that summons the scattered fragments of our daily experience into our cumulative muse — a testament to the combinatorial nature of creativity, reassuring us that no bit of life is “useless” and reminding us of the vital importance of what Stephen King has termed the art of “creative sleep”. Shapiro writes:

If I dismiss the ordinary — waiting for the special, the extreme, the extraordinary to happen — I may just miss my life.

[…]

To allow ourselves to spend afternoons watching dancers rehearse, or sit on a stone wall and watch the sunset, or spend the whole weekend rereading Chekhov stories—to know that we are doing what we’re supposed to be doing — is the deepest form of permission in our creative lives. The British author and psychologist Adam Phillips has noted, “When we are inspired, rather like when we are in love, we can feel both unintelligible to ourselves and most truly ourselves.” This is the feeling I think we all yearn for, a kind of hyperreal dream state. We read Emily Dickinson. We watch the dancers. We research a little known piece of history obsessively. We fall in love. We don’t know why, and yet these moments form the source from which all our words will spring.

Originally featured in October — sample it further with Shapiro’s meditation on the perils of plans.

4. ODD TYPE WRITERS

Famous authors are notorious for their daily routines — sometimes outrageous, usually obsessive, invariably peculiar. In Odd Type Writers: From Joyce and Dickens to Wharton and Welty, the Obsessive Habits and Quirky Techniques of Great Authors (public library), Brooklyn-based writer Celia Blue Johnson takes us on a guided tour of great writers’ unusual techniques, prompts, and customs of committing thought to paper, from their ambitious daily word quotas to their superstitions to their inventive procrastination and multitasking methods.

As curious as these habits are, however, Johnson reminds us that public intellectuals often engineer their own myths, which means the quirky behaviors recorded in history’s annals should be taken with a grain of Salinger salt. She offers a necessary disclaimer, enveloped in a thoughtful meta-disclaimer:

One must always keep in mind that these writers and the people around them may have, at some point, embellished the facts. Quirks are great fodder for gossip and can morph into gross exaggeration when passed from one person to the next. There’s also no way to escape the self-mythologizing, particularly when dealing with some of the greatest storytellers that ever lived. Yet even when authors stretch the truth, they reveal something about themselves, whether it is the desire to project a certain image or the need to shy away from one.

Jack Kerouac’s hand-drawn cross-country road trip map from ‘On the Road’

Mode and medium of writing seem to be a recurring theme of personal idiosyncrasy. Wallace Stevens composed his poetry on slips of paper while walking — an activity he, like Maira Kalman, saw as a creative stimulant — then handed them to his secretary to type up. Edgar Allan Poe, champion of marginalia, wrote his final drafts on separate pieces of paper attached into a running scroll with sealing wax. Jack Kerouac was especially partial to scrolling: In 1951, after planning the book for years and amassing ample notes in his journals, he wrote On the Road in one feverish burst, letting it pour onto pages taped together into one enormously long strip of paper — a format he thought lent itself particularly well to his project, since it allowed him to maintain his rapid pace without pausing to reload the typewriter at the end of each page. When he was done, he marched into his editor Robert Giroux’s office and proudly spun out the scroll across the floor. The result, however, was equal parts comical and tragic:

To [Kerouac’s] dismay, Giroux focused on the unusual packaging. He asked, “But Jack, how can you make corrections on a manuscript like that?” Giroux recalled saying, “Jack, you know you have to cut this up. It has to be edited.” Kerouac left the office in a rage. It took several years for Kerouac’s agent, Sterling Lord, to finally find a home for the book, at the Viking Press.

James Joyce in his white coat

James Joyce wrote lying on his stomach in bed, with a large blue pencil, clad in a white coat, and composed most of Finnegans Wake with crayon pieces on cardboard. But this was a matter more of pragmatism than of superstition or vain idiosyncrasy: Of the many outrageously misguided myths about the celebrated author of Ulysses and wordsmith of little-known children’s books, one was actually right: he was nearly blind. His childhood myopia developed into severe eye problems by his twenties. To make matters worse, he developed rheumatic fever when he was twenty-five, which resulted in a painful eye condition called iritis. By 1930, he had undergone twenty-five eye surgeries, none of which improved his sight. The large crayons thus helped him see what he was writing, and the white coat helped reflect more light onto the page at night. (As someone partial to black bedding, not for aesthetic reasons but because I believe it provides a deeper dark at night, I can certainly relate to Joyce’s seemingly arbitrary but actually physics-driven attire choice.)

Virginia Woolf was as opinionated about the right way to write as she was about the right way to read. In her twenties, she spent two and a half hours every morning writing at a three-and-a-half-foot-tall desk with an angled top that allowed her to look at her work both up close and from afar. But according to her nephew and irreverent collaborator, Quentin Bell, Woolf’s prescient version of today’s trendy standing desk was less a practical matter than a symptom of her sibling rivalry with her sister, the Bloomsbury artist Vanessa Bell — the same sibling rivalry that would later inspire a charming picture-book: Vanessa painted standing, and Virginia didn’t want to be outdone by her sister. Johnson cites Quentin, who was known for his wry family humor:

This led Virginia to feel that her own pursuit might appear less arduous than that of her sister unless she set matters on a footing of equality.

Many authors measured the quality of their output by uncompromisingly quantitative metrics like daily word quotas. Jack London wrote 1,000 words a day every single day of his career, and William Golding once declared at a party that he wrote 3,000 words daily, a number Norman Mailer and Arthur Conan Doyle shared. Raymond Chandler, a man of strong opinions on the craft of writing, didn’t subscribe to a specific daily quota but was known to write up to 5,000 words a day at his most productive. Anthony Trollope, who began his day promptly at 5:30 A.M., disciplined himself to write 250 words every 15 minutes, pacing himself with a watch. Stephen King does whatever it takes to reach his daily quota of 2,000 adverbless words, and Thomas Wolfe kept his at 1,800, not letting himself stop until he had reached it.

Flannery O’Connor and her peacocks

We already know how much famous authors loved their pets, but for many their non-human companions were essential to the creative process. Edgar Allan Poe considered his darling tabby named Catterina his literary guardian who “purred as if in complacent approval of the world proceeding under [her] supervision.” Flannery O’Connor developed an early affection for domestic poultry, from her childhood chicken (which, curiously enough, could walk backwards and once ended up in a newsreel clip) to her growing collection of pheasants, ducks, turkeys, and quail. Most famously, however, twenty-something O’Connor mail-ordered six peacocks, a peahen, and four peachicks, which later populated her fiction. But by far the most bizarre pet-related habit comes from Colette, who enlisted her dog in a questionable procrastination mechanism:

Colette would study the fur of her French bulldog, Souci, with a discerning eye. Then she’d pluck a flea from Souci’s back and would continue the hunt until she was ready to write.

But arguably the strangest habit of all comes from Friedrich Schiller, relayed by his friend Goethe:

[Goethe] had dropped by Schiller’s home and, after finding that his friend was out, decided to wait for him to return. Rather than wasting a few spare moments, the productive poet sat down at Schiller’s desk to jot down a few notes. Then a peculiar stench prompted Goethe to pause. Somehow, an oppressive odor had infiltrated the room.

Goethe followed the odor to its origin, which was actually right by where he sat. It was emanating from a drawer in Schiller’s desk. Goethe leaned down, opened the drawer, and found a pile of rotten apples. The smell was so overpowering that he became light-headed. He walked to the window and breathed in a few good doses of fresh air. Goethe was naturally curious about the trove of trash, though Schiller’s wife, Charlotte, could only offer the strange truth: Schiller had deliberately let the apples spoil. The aroma, somehow, inspired him, and according to his spouse, he “could not live or work without it.”

Charles Dickens’s manuscript for ‘Our Mutual Friend.’ Image courtesy of The Morgan Library.

Then there was the color-coding of the muses: In addition to his surprising gastronome streak, Alexandre Dumas was an aesthete: For decades, he penned all of his fiction on a particular shade of blue paper, his poetry on yellow, and his articles on pink; on one occasion, while traveling in Europe, he ran out of his precious blue paper and was forced to write on a cream-colored pad, which he was convinced made his fiction suffer. Charles Dickens was partial to blue ink, but not for superstitious reasons — because it dried faster than other colors, it allowed him to pen his fiction and letters without the drudgery of blotting. Virginia Woolf used different-colored inks in her pens — greens, blues, and purples. Purple was her favorite, reserved for letters (including her love letters to Vita Sackville-West), diary entries, and manuscript drafts. Lewis Carroll also preferred purple ink (and shared with Woolf a penchant for standing desks), but for much more pragmatic reasons: During his years teaching mathematics at Oxford, teachers were expected to use purple ink to correct students’ work — a habit that carried over to Carroll’s fiction.

But lest we hastily surmise that writing in a white coat would make us a Joyce or drowning pages in purple ink a Woolf, Johnson prefaces her exploration with another important, beautifully phrased disclaimer:

That power to mesmerize has an intangible, almost magical quality, one I wouldn’t dare to try to meddle with by attempting to define it. It was never my goal as I wrote this book to discover what made literary geniuses tick. The nuances of any mind are impossible to pinpoint.

[…]

You could adopt one of these practices or, more ambitiously, combine several of them, and chances are you still wouldn’t invoke genius. These tales don’t hold a secret formula for writing a great novel. Rather, the authors in the book prove that the path to great literature is paved with one’s own eccentricities rather than someone else’s.

Originally featured in September — for more quirky habits, read the original article here.

5. MAXIMIZE YOUR POTENTIAL

“You are what you settle for,” Janis Joplin admonished in her final interview. “You are ONLY as much as you settle for.” In Maximize Your Potential: Grow Your Expertise, Take Bold Risks & Build an Incredible Career (public library), which comes on the heels of their indispensable guide to mastering the pace of productivity and honing your creative routine, editor Jocelyn Glei and her team at Behance’s 99U pull together another package of practical wisdom from 21 celebrated creative entrepreneurs. Despite the somewhat self-helpy, SEO-skewing title, this compendium of advice is anything but contrived. Rather, it’s a no-nonsense, experience-tested, life-approved cookbook for creative intelligence, exploring everything from harnessing the power of habit to cultivating meaningful relationships that enrich your work to overcoming the fear of failure.

In the introduction, Glei affirms the idea that, in the age of make-your-own-success and build-your-own-education, the onus and thrill of finding fulfilling work falls squarely on us, not on the “system”:

If the twentieth-century career was a ladder that we climbed from one predictable rung to the next, the twenty-first-century career is more like a broad rock face that we are all free-climbing. There’s no defined route, and we must use our own ingenuity, training, and strength to rise to the top. We must make our own luck.

Stressing the importance of staying open and alert in order to maximize your “luck quotient,” Glei cites Stanford’s Tina Seelig, who writes about the importance of cultivating awareness and embracing the unfamiliar in her book What I Wish I Knew When I Was 20:

Lucky people take advantage of chance occurrences that come their way. Instead of going through life on cruise control, they pay attention to what’s happening around them and, therefore, are able to extract greater value from each situation… Lucky people are also open to novel opportunities and willing to try things outside of their usual experiences. They’re more inclined to pick up a book on an unfamiliar subject, to travel to less familiar destinations, and to interact with people who are different than themselves.

But “luck,” it turns out, is a grab-bag term composed of many interrelated elements, each dissected in a different chapter. In a section on reprogramming your daily habits, Scott H. Young echoes William James and recaps the science of rewiring your “habit loops”, reminding us how routines dictate our days:

If you think hard about it, you’ll notice just how many “automatic” decisions you make each day. But these habits aren’t always as trivial as what you eat for breakfast. Your health, your productivity, and the growth of your career are all shaped by the things you do each day — most by habit, not by choice.

Even the choices you do make consciously are heavily influenced by automatic patterns. Researchers have found that our conscious mind is better understood as an explainer of our actions, not the cause of them. Instead of triggering the action itself, our consciousness tries to explain why we took the action after the fact, with varying degrees of success. This means that even the choices we do appear to make intentionally are at least somewhat influenced by unconscious patterns.

Given this, what you do every day is best seen as an iceberg, with a small fraction of conscious decision sitting atop a much larger foundation of habits and behaviors.

We can’t, however, simply will ourselves into better habits. Since willpower is a limited resource, whenever we’ve overexerted our self-discipline in one domain, a concept known as “ego depletion” kicks in and renders us mindless automata in another. Instead, Young suggests, the key to changing a habit is to invest heavily in the early stages of habit-formation so that the behavior becomes automated and we later default into it rather than exhausting our willpower wrestling with it. Young also cautions that it’s a self-defeating strategy to try changing several habits at once. Rather, he advises, spend one month on each habit alone before moving on to the next — a method reminiscent of the cognitive strategy of “chunking” that allows our brains to commit more new information to memory.

As both a lover of notable diaries and the daily keeper of a very unnotable one, I was especially delighted to find an entire section dedicated to how a diary boosts your creativity — something Virginia Woolf famously championed, later echoed by Anaïs Nin’s case for the diary as a vital sandbox for writing and Joan Didion’s conviction that keeping a notebook gives you better access to yourself.

Though the chapter, penned by Steven Kramer and Teresa Amabile of the Harvard Business School, co-authors of The Progress Principle, along with 13-year IDEO veteran Ela Ben-Ur, frames the primary benefit of a diary as a purely pragmatic record of your workday productivity and progress — while most dedicated diarists would counter that the core benefits are spiritual and psychoemotional — it does offer some valuable insight into the psychology of how journaling elevates our experience of everyday life:

This is one of the most important reasons to keep a diary: it can make you more aware of your own progress, thus becoming a wellspring of joy in your workday.

Citing their research into the journals of more than two hundred creative professionals, the authors point to a pattern that reveals the single most important motivator: palpable progress on meaningful work:

On the days when these professionals saw themselves moving forward on something they cared about — even if the progress was a seemingly incremental “small win” — they were more likely to be happy and deeply engaged in their work. And, being happier and more deeply engaged, they were more likely to come up with new ideas and solve problems creatively.

Even more importantly, however, they argue that a diary offers an invaluable feedback loop:

Although the act of reflecting and writing, in itself, can be beneficial, you’ll multiply the power of your diary if you review it regularly — if you listen to what your life has been telling you. Periodically, maybe once a month, set aside time to get comfortable and read back through your entries. And, on New Year’s Day, make an annual ritual of reading through the previous year.

This, they suggest, can yield profound insights into the inner workings of your own mind — especially if you look for specific clues and patterns, trying to identify the richest sources of meaning in your work and the types of projects that truly make your heart sing. Once you understand what motivates you most powerfully, you’ll be able to prioritize this type of work going forward. Just as important, however, is cultivating a gratitude practice and acknowledging your own accomplishments in the diary:

This is your life; savor it. Hold on to the threads across days that, when woven together, reveal the rich tapestry of what you are achieving and who you are becoming. The best part is that, seeing the story line appearing, you can actively create what it — and you — will become.

The lack of a straight story line, however, might also be a good thing. That’s what Jonathan Fields, author of Uncertainty: Turning Fear and Doubt into Fuel for Brilliance and creator of the wonderful Good Life Project, explores in another chapter:

Every creative endeavor, from writing a book to designing a brand to launching a company, follows what’s known as an Uncertainty Curve. The beginning of a project is defined by maximum freedom, very little constraint, and high levels of uncertainty. Everything is possible; options, paths, ideas, variations, and directions are all on the table. At the same time, nobody knows exactly what the final output or outcome will be. And, at times, even whether it will be. Which is exactly the way it should be.

Echoing John Keats’s assertion that “negative capability” is essential to the creative process, Rilke’s counsel to live the questions, Richard Feynman’s assertion that the role of great scientists is to remain uncertain, and Anaïs Nin’s insistence that inviting the unknown helps us live more richly, Fields reminds us of what Orson Welles so memorably termed “the gift of ignorance”:

Those who are doggedly attached to the idea they began with may well execute on that idea. And do it well and fast. But along the way, they often miss so many unanticipated possibilities, options, alternatives, and paths that would’ve taken them away from that linear focus on executing on the vision, and sent them back into a place of creative dissidence and uncertainty, but also very likely yielded something orders of magnitude better.

All creators need to be able to live in the shade of the big questions long enough for truly revolutionary ideas and insights to emerge. They need to stay and act in that place relentlessly through the first, most obvious wave of ideas.

Fields argues that if we move along the Uncertainty Curve either too fast or too slowly, we risk robbing the project of its creative potential and ending up in mediocrity. Instead, becoming mindful of the psychology of that process allows us to pace ourselves better and master that vital osmosis between freedom and constraint. He sums up both the promise and the peril of this delicate dance beautifully:

Nothing truly innovative, nothing that has advanced art, business, design, or humanity, was ever created in the face of genuine certainty or perfect information. Because the only way to be certain before you begin is if the thing you seek to do has already been done.

Originally featured in October — read more here, then sample more of this invaluable compendium with this short read on the psychology of getting unstuck.

6. MAKE ART MAKE MONEY

“Art suffers the moment other people start paying for it,” Hugh MacLeod proclaimed in Ignore Everybody, echoing a prevalent cultural sentiment. And yet there’s something terribly disheartening and defeatist in the assumption that we’ve created a society in which it’s impossible to both make good art and not worry about money — an assumption that tells us art is necessarily bad if commercially successful, and commercial success necessarily unattainable if the art is any good. But in Make Art Make Money: Lessons from Jim Henson on Fueling Your Creative Career, writer Elizabeth Hyde Stevens sets out to debunk this toxic myth through the life and legacy of the beloved Muppeteer.

The story begins with a skit titled “Business, Business,” which Henson performed on The Ed Sullivan Show in 1968. It tells the story of two conflicting sets of creatures — the slot-machine-eyed, cash-register-voiced corporate heads who talk in business-ese, and the naïve, light-bulb-headed softies who talk of love, joy, and beauty:

Stevens writes:

“Business, Business” implies that business and idealism are diametrically opposed. The idealist is attacked not just by the establishment, but also from within, where greed starts to change one’s motives.

For the most part, money is the enemy of art. … Put simply, great art wants quality, whereas good business wants profit. Quality requires many man-hours to produce, which any accountant will tell you cuts significantly into your profit. Great artists fight for such expenditures, whereas successful businessmen fight against them.

And yet, like most dogmatic dichotomies — take, for instance, science and spirituality — this, too, is invariably reductionistic. Henson’s life and legacy, Stevens argues, is proof that art and business can be — and inherently are — complementary rather than contradictory. Produced only six months after the Summer of Love, “Business, Business” straddled a profound cultural shift as a new generation of “light-bulb idealists” — baby boomers, flower children, and hippies who lived in youth collectives, listened to rock, and championed free love — rejected the material ideals of their parents and embraced the philosophy of Alan Watts. And yet Henson himself was an odd hybrid of these two worlds. When he made “Business, Business,” he was thirty-one, squarely between the boomers and their parents, and lived comfortably in New York City with his wife, having made hundreds of television commercials for everything from lunch meats to computers. In his heart, however, Henson, a self-described Mississippi Tom Sawyer who often went barefoot, was an artist — and he was ready to defend this conviction with the choices he made.

Stevens writes:

Henson was already a capitalist when he made “Business, Business.” And we could even conclude that the skit describes his own conversion from idealism to capitalism. In 1968, he had an agent who got him TV appearances on Ed Sullivan and freelance commercial gigs hawking products as unhippielike as IBM computers and Getty oil.

Yet Jim Henson’s business wasn’t oil — it was art. While today, most artists are too timid to admit it, Henson freely referred to himself as an “artist,” and his agent went even further, calling him “artsy-craftsy.” Henson may have worked in show business, but he’d also traveled in Europe as a young man, sketching pictures of its architecture. He owned a business, but his business rested on the ideas the idealists were shouting—brotherhood, joy, and love. He wore a beard. Biographers would say it was to cover acne scars, but in the context of the late sixties, it aligns Henson with a category of people that is unmistakable. Though a capitalist, he was also a staunch artist.

“It seems to be difficult for any one to take in the idea that two truths cannot conflict,” pioneering astronomer Maria Mitchell wrote in her diaries. And yet what Henson’s case tells us, Stevens suggests in returning to “Business, Business,” is that the very notion of “selling out” is one big non-truth that pits two parallel possibilities against each other:

If art and money are at odds, which side was Jim Henson really on? If you watch the skit, the clue is in the characters’ voices. Of the Slinky-necked business-heads and idealist-heads, Henson was really both and neither, because in “Business, Business,” he parodies both. Locked in conflict, they sound like blowhards and twerps, respectively, but they were both facets of his life. As an employer to two other men, Henson was the boss man — the suit, cash register, and slot machine — who wrote the checks. But he also got together with his friends to sing, laugh, and play with puppets in the kind of collectivism that hippies celebrated.

Jim’s sketches of Rowlf from ‘Imagination Illustrated: The Jim Henson Journal.’ Click image for more.

This cultural ambivalence — which dates at least as far back as Tchaikovsky’s concerns over creative integrity vs. commissioned work and which was famously, beautifully articulated by Calvin & Hobbes creator Bill Watterson in his 1990 Kenyon College commencement address — is arguably exacerbated today. Stevens laments:

Today — especially with Generation X and Millennials — serious artists often refuse contact with business. Large numbers of liberal arts graduates bristle when presented with the corporate world, rejecting its values to protect their ideals. Devoted artists move home to a parent’s basement to complete their masterpieces, while the more pragmatic artists live in cloistered “Neverland” artist collectives, grant-funded arts colonies, and university faculty lounges.

Stevens, however, sees in Henson’s story hope for healing this “split personality” by learning to embrace our inner contradictions, which are core to what it means to be human. Stevens writes:

What is a human being? Complex to the point of absurdity, a whole person is both greedy and generous. It is foolish to think we can’t be both artists and entrepreneurs, especially when Henson was so wildly successful in both categories.

Since he was in college, Jim Henson was a natural capitalist. He owned a printmaking business and made commercials for lunchmeats. In the 1970s, he became a merchandizing millionaire and made Hollywood movies. By 1987, he had shows on all three major networks plus HBO and PBS. … Of course, Henson was not just another Trump. Believe the beard.

[…]

When Henson joined on to the experimental PBS show Sesame Street in 1968, he was underpaid for his services creating Big Bird and Oscar. Yet he spent his free nights in his basement, shooting stop-motion films that taught kids to count. If you watch these counting films, the spirit of Henson’s gift shines through. I think any struggling artist today could count Henson among their ilk. He had all the makings of a tragic starving artist. The only difference between him and us is that he made peace with money. He found a way to make art and money dance.

Jim’s sketches of Kermit and Miss Piggy on bicycles at Battersea Park from ‘Imagination Illustrated: The Jim Henson Journal.’ Click image for more.

The key, of course, is to master this dance with equal parts determination and grace. Riffing off Lewis Hyde’s famous meditation on gift economies in The Gift, where he argues that the artist must first cultivate a protected gift-sphere for making pure art and then make contact with the market, Stevens offers a blueprint:

The dance involves art and money, but not at the same time. In the first stage, it is paramount that the artist “reserves a protected gift-sphere in which the art is created.” He keeps money out of it. But in the next two phases, they can dance. The way I see it, Hyde’s dance steps go a little something like this:

  1. Make art.
  2. Make art make money.
  3. Make money make art.

It is the last step that turns this dance into a waltz — something cyclical so that the money is not the real end. Truly, for Jim Henson, money was a fuel that fed art.

Originally featured in September — read more here.

7. ITALO CALVINO: LETTERS, 1941–1985

Italo Calvino: Letters, 1941-1985 (public library) offers more than four decades of wisdom in 600+ pages of personal correspondence by one of the 20th century’s most enchanting writers and most beautiful minds. In one letter, written on July 27, 1949, Calvino contributes one of his many insights on writing:

To write well about the elegant world you have to know it and experience it to the depths of your being just as Proust, Radiguet and Fitzgerald did: what matters is not whether you love it or hate it, but only to be quite clear about your position regarding it.

In another, he considers the secret of living well:

The inferno of the living is not something that will be; if there is one, it is what is already here, the inferno where we live every day, that we form by being together. There are two ways to escape suffering it. The first is easy for many: accept the inferno and become such a part of it that you can no longer see it. The second is risky and demands constant vigilance and apprehension: seek and learn to recognize who and what, in the midst of inferno, are not inferno, then make them endure, give them space.

In a lengthy letter to literary critic Mario Motta dated January 16, 1950, Calvino addresses the alleged death of the novel, a death toll still nervously resounding today:

There have been so many debates on the novel in the last thirty years, both by those who claimed it was dead and by those who wanted it to be alive in a certain way, that if one conducts the debate without serious preliminary work to establish the terms of the question as it has to be set up and as it has never been set up before, we’ll end up saying and making others say a lot of commonplaces.

Calvino echoes Herbert Spencer’s admonition that “to have a specific style is to be poor in speech” in a March 1950 letter to Elsa Morante, one of the most influential postwar novelists, whom he had befriended:

The fact is that I already feel I am a prisoner of a kind of style and it is essential that I escape from it at all costs: I’m now trying to write a totally different book, but it’s damned difficult; I’m trying to break up the rhythms, the echoes which I feel the sentences I write eventually slide into, as into pre-existing molds, I try to see facts and things and people in the round instead of being drawn in colors that have no shading. For that reason the book I’m going to write interests me infinitely more than the other one.

As dangerous as the blind adhesion to a style, Calvino writes in a May 1959 letter, is the blind reliance on tools, the cult of medium over message — but harnessing the power of tools is one of the craft’s greatest arts:

One should never have taboos about the tools we use, that as long as the thought or images or style one wants to put forward do not become deformed by the medium, one must on the contrary try to make use of the most powerful and most efficient of those tools.

Several years later, Calvino returns to his conception of fiction, this time with more dimension and more sensitivity to the inherent contradictions of literature:

One cannot construct in fiction a harmonious language to express something that is not yet harmonious. We live in a cultural ambience where many different languages and levels of knowledge intersect and contradict each other.

Sample the altogether fantastic volume with Calvino’s advice on writing, his prescient meditation on abortion and the meaning of life, his poetic resume, and his thoughts on America.

8. MAKE GOOD ART

Commencement season is upon us and, after Greil Marcus’s soul-stirring speech on the essence of art at the 2013 School of Visual Arts graduation ceremony, here comes an exceptional adaptation of one of the best commencement addresses ever delivered: In May of 2012, beloved author Neil Gaiman stood up in front of the graduating class at Philadelphia’s University of the Arts and dispensed some timeless advice on the creative life; now, his talk comes to life as a slim but potent book titled Make Good Art (public library).

Best of all, it’s designed by none other than the inimitable Chip Kidd, who has spent the past fifteen years shaping the voice of contemporary cover design with his prolific and consistently stellar output, ranging from bestsellers like cartoonist Chris Ware’s sublime Building Stories and neurologist Oliver Sacks’s The Mind’s Eye to lesser-known gems like The Paris Review‘s Women Writers at Work and The Letter Q, that wonderful anthology of queer writers’ letters to their younger selves. (Fittingly, Kidd also designed the book adaptation of Ann Patchett’s 2006 commencement address.)

When things get tough, this is what you should do: Make good art. I’m serious. Husband runs off with a politician — make good art. Leg crushed and then eaten by a mutated boa constrictor — make good art. IRS on your trail — make good art. Cat exploded — make good art. Someone on the Internet thinks what you’re doing is stupid or evil or it’s all been done before — make good art. Probably things will work out somehow, eventually time will take the sting away, and that doesn’t even matter. Do what only you can do best: Make good art. Make it on the bad days, make it on the good days, too.

A wise woman once said, “If you are not making mistakes, you’re not taking enough risks.” Gaiman articulates the same sentiment with his own brand of exquisite eloquence:

I hope that in this year to come, you make mistakes.

Because if you are making mistakes, then you are making new things, trying new things, learning, living, pushing yourself, changing yourself, changing your world. You’re doing things you’ve never done before, and more importantly, you’re Doing Something.

So that’s my wish for you, and all of us, and my wish for myself. Make New Mistakes. Make glorious, amazing mistakes. Make mistakes nobody’s ever made before. Don’t freeze, don’t stop, don’t worry that it isn’t good enough, or it isn’t perfect, whatever it is: art, or love, or work or family or life.

Whatever it is you’re scared of doing, Do it.

Make your mistakes, next year and forever.

Originally featured in May — read the full article here, along with a video of Gaiman’s original commencement address.


The 13 Best Psychology and Philosophy Books of 2013

How to think like Sherlock Holmes, make better mistakes, master the pace of productivity, find fulfilling work, stay sane, and more.

After the best biographies, memoirs, and history books of 2013, the season’s subjective selection of best-of reading lists continues with the most stimulating psychology and philosophy books published this year. (Catch up on the 2012 roundup here and 2011’s here.)

1. ON LOOKING: ELEVEN WALKS WITH EXPERT EYES

“How we spend our days,” Annie Dillard wrote in her timelessly beautiful meditation on presence over productivity, “is, of course, how we spend our lives.” And nowhere do we fail at the art of presence more miserably and more tragically than in urban life — in the city, high on the cult of productivity, where we float past each other, past the buildings and trees and the little boy in the purple pants, past life itself, cut off from the breathing of the world by iPhone earbuds and solipsism. And yet: “The art of seeing has to be learned,” Marguerite Duras reverberates — and it can be learned, as cognitive scientist Alexandra Horowitz invites us to believe in her breathlessly wonderful On Looking: Eleven Walks with Expert Eyes (public library) — a record of her quest to walk around a city block with eleven different “experts,” from an artist to a geologist to a dog, and emerge with fresh eyes mesmerized by the previously unseen fascinations of a familiar world. It is undoubtedly one of the most stimulating books of the year, if not the decade, and the most enchanting thing I’ve read in ages. In a way, it’s the opposite but equally delightful mirror image of Christoph Niemann’s Abstract City — a concrete, immersive examination of urbanity — blending the mindfulness of Sherlock Holmes with the expansive sensitivity of Thoreau.

Horowitz begins by pointing our attention to the incompleteness of our experience of what we conveniently call “reality”:

Right now, you are missing the vast majority of what is happening around you. You are missing the events unfolding in your body, in the distance, and right in front of you.

By marshaling your attention to these words, helpfully framed in a distinct border of white, you are ignoring an unthinkably large amount of information that continues to bombard all of your senses: the hum of the fluorescent lights, the ambient noise in a large room, the places your chair presses against your legs or back, your tongue touching the roof of your mouth, the tension you are holding in your shoulders or jaw, the map of the cool and warm places on your body, the constant hum of traffic or a distant lawn-mower, the blurred view of your own shoulders and torso in your peripheral vision, a chirp of a bug or whine of a kitchen appliance.

This adaptive ignorance, she argues, is there for a reason — we celebrate it as “concentration” and welcome its way of easing our cognitive overload by allowing us to conserve our precious mental resources only for the stimuli of immediate and vital importance, and to dismiss or entirely miss all else. (“Attention is an intentional, unapologetic discriminator,” Horowitz tells us. “It asks what is relevant right now, and gears us up to notice only that.”) But while this might make us more efficient in our goal-oriented day-to-day, it also makes us inhabit a largely unlived — and unremembered — life, day in and day out.

For Horowitz, the awakening to this incredible, invisible backdrop of life came thanks to Pumpernickel, her “curly haired, sage mixed breed” (who also inspired Horowitz’s first book, the excellent Inside of a Dog: What Dogs See, Smell, and Know), as she found herself taking countless walks around the block, becoming more and more aware of the dramatically different experiences she and her canine companion were having along the exact same route:

Minor clashes between my dog’s preferences as to where and how a walk should proceed and my own indicated that I was experiencing almost an entirely different block than my dog. I was paying so little attention to most of what was right before us that I had become a sleepwalker on the sidewalk. What I saw and attended to was exactly what I expected to see; what my dog showed me was that my attention invited along attention’s companion: inattention to everything else.

The book was her answer to the disconnect, an effort to “attend to that inattention.” It is not, she warns us, “about how to bring more focus to your reading of Tolstoy or how to listen more carefully to your spouse.” Rather, it is an invitation to the art of observation:

Together, we became investigators of the ordinary, considering the block — the street and everything on it — as a living being that could be observed.

In this way, the familiar becomes unfamiliar, and the old the new.

Her approach is based on two osmotic human tendencies: our shared capacity to truly see what is in front of us, despite our conditioned concentration that obscures it, and the power of individual bias in perception — or what we call “expertise,” acquired by passion or training or both — in bringing attention to elements that elude the rest of us. What follows is a whirlwind of endlessly captivating exercises in attentive bias as Horowitz, with her archetypal New Yorker’s “special fascination with the humming life-form that is an urban street,” and her diverse companions take to the city.

First, she takes a walk all by herself, trying to note everything observable, and we quickly realize that besides her deliciously ravenous intellectual curiosity, Horowitz is a rare magician with language. (“The walkers trod silently; the dogs said nothing. The only sound was the hum of air conditioners,” she beholds her own block; passing a pile of trash bags graced by a stray Q-tip, she ponders parenthetically, “how does a Q-tip escape?”; turning her final corner, she gazes at the entrance of a mansion and “its pair of stone lions waiting patiently for royalty that never arrives.” Stunning.)

But as soon as she joins her experts, Horowitz is faced with the grimacing awareness that despite her best, most Sherlockian efforts, she was “missing pretty much everything.” She arrives at a newfound, profound understanding of what William James meant when he wrote, “My experience is what I agree to attend to. Only those items which I notice shape my mind”:

I would find myself at once alarmed, delighted, and humbled at the limitations of my ordinary looking. My consolation is that this deficiency of mine is quite human. We see, but we do not see: we use our eyes, but our gaze is glancing, frivolously considering its object. We see the signs, but not their meanings. We are not blinded, but we have blinders.

Originally featured in August, with a closer look at the expert insights. For another peek at this gem, which is easily among my top three favorite books of the past decade, learn how to do the step-and-slide.

2. TIME WARPED

Given my soft spot for famous diaries, it should come as no surprise that I keep one myself. Perhaps the greatest gift of the practice has been the daily habit of reading what I had written on that day a year earlier; not only is it a remarkable tool of introspection and self-awareness, but it also illustrates that our memory “is never a precise duplicate of the original [but] a continuing act of creation” and how flawed our perception of time is — almost everything that occurred a year ago appears as having taken place either significantly further in the past (“a different lifetime,” I’d often marvel at this time-illusion) or significantly more recently (“this feels like just last month!”). Rather than a personal deficiency afflicting those of us prone to this tendency, however, it turns out to be a defining feature of how the human mind works, the science of which is at first unsettling, then strangely comforting, and altogether intensely interesting.

That’s precisely what acclaimed BBC broadcaster and psychology writer Claudia Hammond explores in Time Warped: Unlocking the Mysteries of Time Perception (public library) — a fascinating foray into the idea that our experience of time is actively created by our own minds and how these sensations of what neuroscientists and psychologists call “mind time” are created. As disorienting as the concept might seem — after all, we’ve been nursed on the belief that time is one of those few utterly reliable and objective things in life — it is also strangely empowering to think that the very phenomenon depicted as the unforgiving dictator of life is something we might be able to shape and benefit from. Hammond writes:

We construct the experience of time in our minds, so it follows that we are able to change the elements we find troubling — whether it’s trying to stop the years racing past, or speeding up time when we’re stuck in a queue, trying to live more in the present, or working out how long ago we last saw our old friends. Time can be a friend, but it can also be an enemy. The trick is to harness it, whether at home, at work, or even in social policy, and to work in line with our conception of time. Time perception matters because it is the experience of time that roots us in our mental reality. Time is not only at the heart of the way we organize life, but the way we experience it.

Discus chronologicus, a depiction of time by German engraver Christoph Weigel, published in the early 1720s; from Cartographies of Time. (Click for details)

Among the most intriguing illustrations of “mind time” is the incredible elasticity of how we experience time. (“Where is it, this present?” William James famously wondered. “It has melted in our grasp, fled ere we could touch it, gone in the instant of becoming.”) For instance, Hammond points out, we slow time down when gripped by mortal fear — the cliché about the slow-motion car crash is, in fact, a cognitive reality. This plays out even in situations that aren’t life-or-death per se but are still associated with strong feelings of fear. Hammond points to a study in which people with arachnophobia were asked to look at spiders — the very object of their intense fear — for 45 seconds, and they overestimated the elapsed time. The same pattern was observed in novice skydivers, who estimated the duration of their peers’ falls as brief while judging their own falls, from the same altitude, to have lasted longer.

Inversely, time seems to speed up as we get older — a phenomenon of which competing theories have attempted to make sense. One, known as the “proportionality theory,” uses pure mathematics, holding that a year feels faster when you’re 40 than when you’re 8 because it only constitutes one fortieth of your life rather than a whole eighth. Among its famous proponents are Vladimir Nabokov and William James. But Hammond remains unconvinced:

The problem with the proportionality theory is that it fails to account for the way we experience time at any one moment. We don’t judge one day in the context of our whole lives. If we did, then for a 40-year-old every single day should flash by because it is less than one fourteen-thousandth of the life they’ve had so far. It should be fleeting and inconsequential, yet if you have nothing to do or an enforced wait at an airport for example, a day at 40 can still feel long and boring and surely longer than a fun day at the seaside packed with adventure for a child. … It ignores attention and emotion, which … can have a considerable impact on time perception.
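The proportionality claim, and Hammond's counterexample, are simple arithmetic: if the felt length of a year scales with its share of life lived so far, a year at 40 carries one-fifth the subjective weight of a year at 8, and a single day at 40 is a vanishing fraction of life. A minimal sketch of that arithmetic, offered purely as illustration and not as a model from Hammond's book:

```python
# Proportionality theory: the felt length of a span of time is taken to be
# its fraction of the life lived so far. Illustrative sketch only; this is
# not a calculation from Hammond's book.

def subjective_weight(age_years: float) -> float:
    """Fraction of life-so-far that one additional year represents."""
    return 1.0 / age_years

# A year at 8 versus a year at 40: the theory says the former should feel
# about five times longer (one eighth of life versus one fortieth).
ratio = subjective_weight(8) / subjective_weight(40)

# Hammond's objection works at the level of a single day: for a 40-year-old,
# one day is less than one fourteen-thousandth of the life lived so far,
# yet a dull day at the airport still feels long.
day_fraction_at_40 = 1 / (40 * 365.25)

print(ratio)                          # roughly 5
print(day_fraction_at_40 < 1 / 14000) # consistent with the quoted figure
```

The point of the sketch is Hammond's: the formula says nothing about attention or emotion, which is why a boring day can still feel endless at any age.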

Another theory suggests that perhaps it is the tempo of life in general that has accelerated, making things from the past appear as slower, including the passage of time itself.

But one definite change does take place with age: As we grow older, we tend to feel like the previous decade elapsed more rapidly, while the earlier decades of our lives seem to have lasted longer. Similarly, we tend to think of events that took place in the past 10 years as having happened more recently than they actually did. (Quick: What year did the devastating Japanese tsunami hit? When did we lose Maurice Sendak?) Conversely, we perceive events that took place more than a decade ago as having happened even longer ago. (When did Princess Diana die? What year was the Chernobyl disaster?) This, Hammond points out, is known as “forward telescoping”:

It is as though time has been compressed and — as if looking through a telescope — things seem closer than they really are. The opposite is called backward or reverse telescoping, also known as time expansion. This is when you guess that events happened longer ago than they really did. This is rare for distant events, but not uncommon for recent weeks.

[…]

The most straightforward explanation for it is called the clarity of memory hypothesis, proposed by the psychologist Norman Bradburn in 1987. This is the simple idea that because we know that memories fade over time, we use the clarity of a memory as a guide to its recency. So if a memory seems unclear we assume it happened longer ago.

Originally featured in July, with a deeper dive into the psychology of why time slows down when we’re afraid, speeds up as we age, and gets warped when we’re on vacation.

3. HOW TO FIND FULFILLING WORK

“If one wanted to crush and destroy a man entirely, to mete out to him the most terrible punishment,” wrote Dostoevsky, “all one would have to do would be to make him do work that was completely and utterly devoid of usefulness and meaning.” Indeed, the quest to avoid such work and make a living doing what you love is a constant conundrum of modern life. In How to Find Fulfilling Work (public library) — the latest installment in The School of Life’s wonderful series reclaiming the traditional self-help genre as intelligent, non-self-helpy, yet immensely helpful guides to modern living, which previously gave us Philippa Perry’s How to Stay Sane and Alain de Botton’s How to Think More About Sex — philosopher Roman Krznaric (remember him?) explores the roots of this contemporary quandary and guides us to its fruitful resolution:

The desire for fulfilling work — a job that provides a deep sense of purpose, and reflects our values, passions and personality — is a modern invention. … For centuries, most inhabitants of the Western world were too busy struggling to meet their subsistence needs to worry about whether they had an exciting career that used their talents and nurtured their wellbeing. But today, the spread of material prosperity has freed our minds to expect much more from the adventure of life.

We have entered a new age of fulfillment, in which the great dream is to trade up from money to meaning.

Krznaric goes on to outline two key afflictions of the modern workplace — “a plague of job dissatisfaction” and “uncertainty about how to choose the right career” — and frames the problem:

Never have so many people felt so unfulfilled in their career roles, and been so unsure what to do about it. Most surveys in the West reveal that at least half the workforce are unhappy in their jobs. One cross-European study showed that 60 per cent of workers would choose a different career if they could start again. In the United States, job satisfaction is at its lowest level — 45 per cent — since record-keeping began over two decades ago.

Of course, Krznaric points out, there’s plenty of cynicism and skepticism to go around, with people questioning whether it’s even possible to find a job in which we thrive and feel complete. He offers an antidote to the default thinking:

There are two broad ways of thinking about these questions. The first is the ‘grin and bear it’ approach. This is the view that we should get our expectations under control and recognize that work, for the vast majority of humanity — including ourselves — is mostly drudgery and always will be. Forget the heady dream of fulfillment and remember Mark Twain’s maxim: “Work is a necessary evil to be avoided.” … The history is captured in the word itself. The Latin labor means drudgery or toil, while the French travail derives from the tripalium, an ancient Roman instrument of torture made of three sticks. … The message of the ‘grin and bear it’ school of thought is that we need to accept the inevitable and put up with whatever job we can get, as long as it meets our financial needs and leaves us enough time to pursue our ‘real life’ outside office hours. The best way to protect ourselves from all the optimistic pundits peddling fulfillment is to develop a hardy philosophy of acceptance, even resignation, and not set our hearts on finding a meaningful career.

I am more hopeful than this, and subscribe to a different approach, which is that it is possible to find work that is life-enhancing, that broadens our horizons and makes us feel more human.

[…]

This is a book for those who are looking for a job that is big enough for their spirit, something more than a ‘day job’ whose main function is to pay the bills.

‘Never have so many people felt so unfulfilled in their career roles, and been so unsure what to do about it.’

Krznaric considers the five keys to making a career meaningful — earning money, achieving status, making a difference, following our passions, and using our talents — but goes on to demonstrate that they aren’t all created equal. In particular, he echoes 1970s Zen pioneer Alan Watts and modern science in arguing that money alone is a poor motivator:

Schopenhauer may have been right that the desire for money is widespread, but he was wrong on the issue of equating money with happiness. Overwhelming evidence has emerged in the last two decades that the pursuit of wealth is an unlikely path to achieving personal wellbeing — the ancient Greek ideal of eudaimonia or ‘the good life.’ The lack of any clear positive relationship between rising income and rising happiness has become one of the most powerful findings in the modern social sciences. Once our income reaches an amount that covers our basic needs, further increases add little, if anything, to our levels of life satisfaction.

The second false prophet of fulfillment, as Y Combinator founder Paul Graham has poignantly cautioned and Debbie Millman has poetically articulated, is prestige. Krznaric admonishes:

We can easily find ourselves pursuing a career that society considers prestigious, but which we are not intrinsically devoted to ourselves — one that does not fulfill us on a day-to-day basis.

Krznaric pits respect, which he defines as “being appreciated for what we personally bring to a job, and being valued for our individual contribution,” as the positive counterpart to prestige and status, arguing that “in our quest for fulfilling work, we should seek a job that offers not just good status prospects, but good respect prospects.”

Rather than hoping to create a harmonious union between the pursuit of money and values, we might have better luck trying to combine values with talents. This idea comes courtesy of Aristotle, who is attributed with saying, ‘Where the needs of the world and your talents cross, there lies your vocation.’

Originally featured in April — read the full article here.

4. INTUITION PUMPS

“If you are not making mistakes, you’re not taking enough risks,” Debbie Millman counseled. “Make New Mistakes. Make glorious, amazing mistakes. Make mistakes nobody’s ever made before,” Neil Gaiman advised young creators. In Intuition Pumps And Other Tools for Thinking (public library), the inimitable Daniel Dennett, one of our greatest living philosophers, offers a set of thinking tools — “handy prosthetic imagination-extenders and focus holders” — that allow us to “think reliably and even gracefully about really hard questions” and enhance our cognitive toolkit. He calls these tools “intuition pumps” — thought experiments designed to stir “a heartfelt, table-thumping intuition” (which we know is a pillar of even the most “rational” of science) about the question at hand, a kind of persuasion tool the reverse-engineering of which enables us to think better about thinking itself. Intuition, of course, is a domain-specific ability that relies on honed critical thinking rather than a mystical quality bestowed by the gods — but that’s precisely Dennett’s point, and his task is to help us hone it.

Though most of his 77 “intuition pumps” address concrete questions, a dozen are “general-purpose” tools that apply deeply and widely, across just about any domain of thinking. The first of them is also arguably the most useful yet most uncomfortable: making mistakes.

Echoing Dorion Sagan’s case for why science and philosophy need each other, Dennett begins with an astute contribution to the best definitions of philosophy, wrapped in a necessary admonition about the value of history:

The history of philosophy is in large measure the history of very smart people making very tempting mistakes, and if you don’t know the history, you are doomed to making the same darn mistakes all over again. … There is no such thing as philosophy-free science, just science that has been conducted without any consideration of its underlying philosophical assumptions.

He speaks for the generative potential of mistakes and their usefulness as an empirical tool:

Sometimes you don’t just want to risk making mistakes; you actually want to make them — if only to give you something clear and detailed to fix.

Therein lies the power of mistakes as a vehicle for, as Rilke famously put it, “living the questions” and thus advancing knowledge in a way that certainty cannot — for, as Richard Feynman memorably noted, the scientist’s job is to remain unsure, and so seems the philosopher’s. Dennett writes:

We philosophers are mistake specialists. … While other disciplines specialize in getting the right answers to their defining questions, we philosophers specialize in all the ways there are of getting things so mixed up, so deeply wrong, that nobody is even sure what the right questions are, let alone the answers. Asking the wrong questions risks setting any inquiry off on the wrong foot. Whenever that happens, this is a job for philosophers! Philosophy — in every field of inquiry — is what you have to do until you figure out what questions you should have been asking in the first place.

[…]

Mistakes are not just opportunities for learning; they are, in an important sense, the only opportunity for learning or making something truly new. Before there can be learning, there must be learners. There are only two non-miraculous ways for learners to come into existence: they must either evolve or be designed and built by learners that evolved. Biological evolution proceeds by a grand, inexorable process of trial and error — and without the errors the trials wouldn’t accomplish anything.

Dennett offers a caveat that at once highlights the importance of acquiring knowledge and reminds us of the power of “chance-opportunism”:

Trials can be either blind or foresighted. You, who know a lot, but not the answer to the question at hand, can take leaps — foresighted leaps. You can look before you leap, and hence be somewhat guided from the outset by what you already know. You need not be guessing at random, but don’t look down your nose at random guesses; among its wonderful products is … you!

And since evolution is the highest epitome of how the process of trial and error drives progress, Dennett makes a case for understanding evolution as a key to understanding everything else we humans value:

Evolution … is the central, enabling process not only of life but also of knowledge and learning and understanding. If you attempt to make sense of the world of ideas and meanings, free will and morality, art and science and even philosophy itself without a sound and quite detailed knowledge of evolution, you have one hand tied behind your back. … For evolution, which knows nothing, the steps into novelty are blindly taken by mutations, which are random copying “errors” in DNA.

Dennett echoes Dostoyevsky (“Above all, don’t lie to yourself. The man who lies to himself and listens to his own lie comes to a point that he cannot distinguish the truth within him, or around him, and so loses all respect for himself and for others.”) and offers the key to making productive mistakes:

The chief trick to making good mistakes is not to hide them — especially not from yourself. Instead of turning away in denial when you make a mistake, you should become a connoisseur of your own mistakes, turning them over in your mind as if they were works of art, which in a way they are. … The trick is to take advantage of the particular details of the mess you’ve made, so that your next attempt will be informed by it and not just another blind stab in the dark.

We have all heard the forlorn refrain “Well, it seemed like a good idea at the time!” This phrase has come to stand for the rueful reflection of an idiot, a sign of stupidity, but in fact we should appreciate it as a pillar of wisdom. Any being, any agent, who can truly say, “Well, it seemed like a good idea at the time!” is standing on the threshold of brilliance.

Originally featured in May — read the full article here.

5. MASTERMIND: HOW TO THINK LIKE SHERLOCK HOLMES

“The habit of mind which leads to a search for relationships between facts,” wrote James Webb Young in his famous 1939 5-step technique for creative problem-solving, “becomes of the highest importance in the production of ideas.” But just how does one acquire those vital cognitive customs? That’s precisely what science writer Maria Konnikova explores in Mastermind: How to Think Like Sherlock Holmes (public library) — an effort to reverse-engineer Holmes’s methodology into actionable insights that help develop “habits of thought that will allow you to engage mindfully with yourself and your world as a matter of course.”

Bridging ample anecdotes from the adventures of Conan Doyle’s beloved detective with psychology studies both classic and cutting-edge, Konnikova builds a compelling case at the intersection of science and secular spiritualism, stressing the power of rigorous observation alongside a Buddhist-like, Cageian emphasis on mindfulness. She writes:

The idea of mindfulness itself is by no means a new one. As early as the end of the nineteenth century, William James, the father of modern psychology, wrote that, ‘The faculty of voluntarily bringing back a wandering attention, over and over again, is the very root of judgment, character, and will. … An education which should improve this faculty would be the education par excellence.’ That faculty, at its core, is the very essence of mindfulness. And the education that James proposes, an education in a mindful approach to life and to thought.

[…]

In recent years, studies have shown that meditation-like thought (an exercise in the very attentional control that forms the center of mindfulness), for as little as fifteen minutes a day, can shift frontal brain activity toward a pattern that has been associated with more positive and more approach-oriented emotional states, and that looking at scenes of nature, for even a short while, can help us become more insightful, more creative, and more productive. We also know, more definitively than we ever have, that our brains are not built for multitasking — something that precludes mindfulness altogether. When we are forced to do multiple things at once, not only do we perform worse on all of them but our memory decreases and our general wellbeing suffers a palpable hit.

But for Sherlock Holmes, mindful presence is just a first step. It’s a means to a far larger, far more practical and practically gratifying goal. Holmes provides precisely what William James had prescribed: an education in improving our faculty of mindful thought and in using it in order to accomplish more, think better, and decide more optimally. In its broadest application, it is a means for improving overall decision making and judgment ability, starting from the most basic building block of your own mind.

But mindfulness, and the related mental powers it bestows upon its master, is a skill acquired with grit and practice, rather than an in-born talent or an easy feat attained with a few half-hearted tries:

It is most difficult to apply Holmes’s logic in those moments that matter the most. And so, all we can do is practice, until our habits are such that even the most severe stressors will bring out the very thought patterns that we’ve worked so hard to master.

Echoing Carl Sagan, Konnikova examines the role of intuition — a grab-bag concept embraced by some of history’s greatest scientific minds, cultural icons, and philosophers — as both a helpful directional signpost of intellectual inquiry and a dangerous blind spot:

Our intuition is shaped by context, and that context is deeply informed by the world we live in. It can thus serve as a blinder — or blind spot — of sorts. … With mindfulness, however, we can strive to find a balance between fact-checking our intuitions and remaining open-minded. We can then make our best judgments, with the information we have and no more, but with, as well, the understanding that time may change the shape and color of that information.

“I consider that a man’s brain originally is like a little empty attic, and you have to stock it with such furniture as you choose,” Holmes famously remarked. Indeed, much like the inventor’s mind, the problem-solver’s mind is the product of that very choice: The details and observations we select to include in our “brain attic” shape and filter our perception of reality. Konnikova writes:

Observation with a capital O — the way Holmes uses the word when he gives his new companion a brief history of his life with a single glance — does entail more than, well, observation (the lowercase kind). It’s not just about the passive process of letting objects enter into your visual field. It is about knowing what and how to observe and directing your attention accordingly: what details do you focus on? What details do you omit? And how do you take in and capture those details that you do choose to zoom in on? In other words, how do you maximize your brain attic’s potential? You don’t just throw any old detail up there, if you remember Holmes’s early admonitions; you want to keep it as clean as possible. Everything we choose to notice has the potential to become a future furnishing of our attics — and what’s more, its addition will mean a change in the attic’s landscape that will affect, in turn, each future addition. So we have to choose wisely.

Choosing wisely means being selective. It means not only looking but looking properly, looking with real thought. It means looking with the full knowledge that what you note — and how you note it — will form the basis of any future deductions you might make. It’s about seeing the full picture, noting the details that matter, and understanding how to contextualize those details within a broader framework of thought.

Originally featured in January — read the full article for more, including Konnikova’s four rules for Sherlockian thinking.

6. MAKE GOOD ART

Commencement season is upon us and, after Greil Marcus’s soul-stirring speech on the essence of art at the 2013 School of Visual Arts graduation ceremony, here comes an exceptional adaptation of one of the best commencement addresses ever delivered: In May of 2012, beloved author Neil Gaiman stood up in front of the graduating class at Philadelphia’s University of the Arts and dispensed some timeless advice on the creative life; now, his talk comes to life as a slim but potent book titled Make Good Art (public library).

Best of all, it’s designed by none other than the inimitable Chip Kidd, who has spent the past fifteen years shaping the voice of contemporary cover design with his prolific and consistently stellar output, ranging from bestsellers like cartoonist Chris Ware’s sublime Building Stories and neurologist Oliver Sacks’s The Mind’s Eye to lesser-known gems like The Paris Review‘s Women Writers at Work and The Letter Q, that wonderful anthology of queer writers’ letters to their younger selves. (Fittingly, Kidd also designed the book adaptation of Ann Patchett’s 2006 commencement address.)

When things get tough, this is what you should do: Make good art. I’m serious. Husband runs off with a politician — make good art. Leg crushed and then eaten by a mutated boa constrictor — make good art. IRS on your trail — make good art. Cat exploded — make good art. Someone on the Internet thinks what you’re doing is stupid or evil or it’s all been done before — make good art. Probably things will work out somehow, eventually time will take the sting away, and that doesn’t even matter. Do what only you can do best: Make good art. Make it on the bad days, make it on the good days, too.

A wise woman once said, “If you are not making mistakes, you’re not taking enough risks.” Gaiman articulates the same sentiment with his own brand of exquisite eloquence:

I hope that in this year to come, you make mistakes.

Because if you are making mistakes, then you are making new things, trying new things, learning, living, pushing yourself, changing yourself, changing your world. You’re doing things you’ve never done before, and more importantly, you’re Doing Something.

So that’s my wish for you, and all of us, and my wish for myself. Make New Mistakes. Make glorious, amazing mistakes. Make mistakes nobody’s ever made before. Don’t freeze, don’t stop, don’t worry that it isn’t good enough, or it isn’t perfect, whatever it is: art, or love, or work or family or life.

Whatever it is you’re scared of doing, Do it.

Make your mistakes, next year and forever.

Originally featured in May — read the full article here, along with a video of Gaiman’s original commencement address.

7. HOW CHILDREN SUCCEED

In How Children Succeed: Grit, Curiosity, and the Hidden Power of Character (public library) — a necessary addition to these fantastic reads on education — Paul Tough, whose writing has appeared in The New Yorker, Slate, Esquire, and The New York Times, sets out to investigate the essential building blocks of character through the findings and practical insight of exceptional educators and bleeding-edge researchers. One of his core arguments is based on the work of pioneering psychologist and 2013 MacArthur “genius” grantee Angela Duckworth, who studied under positive psychology godfather Martin Seligman at my alma mater, the University of Pennsylvania, and has done more than anyone to advance our understanding of how self-control and grit — the relentless work ethic of sustaining your commitments toward a long-term goal — impact success.

Duckworth had come to Penn in 2002, at the age of thirty-two, later in life than a typical graduate student. The daughter of Chinese immigrants, she had been a classic multitasking overachiever in her teens and twenties. After completing her undergraduate degree at Harvard (and starting a summer school for low-income kids in Cambridge in her spare time), she had bounced from one station of the mid-nineties meritocracy to the next: intern in the White House speechwriting office, Marshall scholar at Oxford (where she studied neuroscience), management consultant for McKinsey and Company, charter-school adviser.

Duckworth spent a number of years toying with the idea of starting her own charter school, but eventually concluded that the model didn’t hold much promise for changing the circumstances of children from disadvantaged backgrounds, those whom the education system was failing most tragically. Instead, she decided to pursue a PhD program at Penn. In her application essay, she shared how profoundly the experience of working in schools had changed her view of school reform and wrote:

The problem, I think, is not only the schools but also the students themselves. Here’s why: learning is hard. True, learning is fun, exhilarating and gratifying — but it is also often daunting, exhausting and sometimes discouraging. . . . To help chronically low-performing but intelligent students, educators and parents must first recognize that character is at least as important as intellect.

Duckworth began her graduate work by studying self-discipline. But when she completed her first-year thesis, based on a group of 164 eighth-graders from a Philadelphia middle school, she arrived at a startling discovery that would shape the course of her career: She found that the students’ self-discipline scores were far better predictors of their academic performance than their IQ scores. So she became intensely interested in what strategies and tricks we might develop to maximize our self-control, and whether those strategies can be taught. But self-control, it turned out, was only a good predictor when it came to immediate, concrete goals — like, say, resisting a cookie. Tough writes:

Duckworth finds it useful to divide the mechanics of achievement into two separate dimensions: motivation and volition. Each one, she says, is necessary to achieve long-term goals, but neither is sufficient alone. Most of us are familiar with the experience of possessing motivation but lacking volition: You can be extremely motivated to lose weight, for example, but unless you have the volition — the willpower, the self-control — to put down the cherry Danish and pick up the free weights, you’re not going to succeed. If a child is highly motivated, the self-control techniques and exercises Duckworth tried to teach [the students in her study] might be very helpful. But what if students just aren’t motivated to achieve the goals their teachers or parents want them to achieve? Then, Duckworth acknowledges, all the self-control tricks in the world aren’t going to help.

This is where grit comes in — the X-factor that helps us attain more long-term, abstract goals. To address this, Duckworth and her colleague Chris Peterson developed the Grit Scale — a deceptively simple test, on which you evaluate how much each of twelve statements applies to you, from “I am a hard worker” to “New ideas and projects sometimes distract me from previous ones.” The results are profoundly predictive of success in domains of achievement as wide-ranging as the National Spelling Bee and the West Point military academy. Tough describes the surprising power of this seemingly mundane questionnaire:

For each statement, respondents score themselves on a five-point scale, ranging from 5, “very much like me,” to 1, “not like me at all.” The test takes about three minutes to complete, and it relies entirely on self-report — and yet when Duckworth and Peterson took it out into the field, they found it was remarkably predictive of success. Grit, Duckworth discovered, is only faintly related to IQ — there are smart gritty people and dumb gritty people — but at Penn, high grit scores allowed students who had entered college with relatively low college-board scores to nonetheless achieve high GPAs. At the National Spelling Bee, Duckworth found that children with high grit scores were more likely to survive to the later rounds. Most remarkable, Duckworth and Peterson gave their grit test to more than twelve hundred freshman cadets as they entered the military academy at West Point and embarked on the grueling summer training course known as Beast Barracks. The military has developed its own complex evaluation, called the whole candidate score, to judge incoming cadets and predict which of them will survive the demands of West Point; it includes academic grades, a gauge of physical fitness, and a leadership potential score. But the more accurate predictor of which cadets persisted in Beast Barracks and which ones dropped out turned out to be Duckworth’s simple little twelve-item grit questionnaire.

You can take the Grit Scale here (registration is free).
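The arithmetic behind the questionnaire is as simple as its power is surprising: each of the twelve ratings is averaged into a single score between 1 and 5. A minimal sketch of that scoring, assuming (as on the published scale) that statements like “New ideas and projects sometimes distract me from previous ones” are reverse-scored, so that agreeing with them counts against grit — the index set below is illustrative, not Duckworth and Peterson’s exact key:

```python
def grit_score(responses, reverse_scored):
    """Average twelve 1-5 self-ratings into a single grit score.

    responses: one integer per statement, 1 ("not like me at all")
    to 5 ("very much like me"). reverse_scored: indices of statements
    where agreement indicates LESS grit, so the rating is flipped (1 <-> 5).
    """
    total = 0
    for i, rating in enumerate(responses):
        if not 1 <= rating <= 5:
            raise ValueError(f"rating out of range: {rating}")
        total += (6 - rating) if i in reverse_scored else rating
    return total / len(responses)

# Strongly endorsing every persistence item (even indices) while rejecting
# every distraction item (odd indices) yields the maximum score of 5.0.
ratings = [5, 1] * 6
print(grit_score(ratings, reverse_scored={1, 3, 5, 7, 9, 11}))  # 5.0
```

That a three-minute average of self-reports outpredicts West Point’s composite “whole candidate score” is, of course, the remarkable part — the math itself could hardly be plainer.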

8. THINKING: THE NEW SCIENCE OF DECISION-MAKING, PROBLEM-SOLVING AND PREDICTION

Every year, intellectual impresario and Edge editor John Brockman summons some of our era’s greatest thinkers and unleashes them on one provocative question, whether it’s the single most elegant theory of how the world works or the best way to enhance our cognitive toolkit. This year, he sets out on the most ambitious quest yet, a meta-exploration of thought itself: Thinking: The New Science of Decision-Making, Problem-Solving, and Prediction (public library) collects short essays and lecture adaptations from such celebrated and wide-ranging (though not in gender) minds as Daniel Dennett, Jonathan Haidt, Dan Gilbert, and Timothy Wilson, covering subjects as diverse as morality, essentialism, and the adolescent brain.

One of the most provocative contributions comes from Nobel-winning psychologist Daniel Kahneman — author of the indispensable Thinking, Fast and Slow, one of the best psychology books of 2012 — who examines “the marvels and the flaws of intuitive thinking.”

In the 1970s, Kahneman and his colleague Amos Tversky, self-crowned “prophets of irrationality,” began studying what they called “heuristics and biases” — mental shortcuts we take, which frequently result in cognitive errors. Those errors, however, reveal a great deal about how our minds work:

If you want to characterize how something is done, then one of the most powerful ways of characterizing how the mind does anything is by looking at the errors that the mind produces while it’s doing it because the errors tell you what it is doing. Correct performance tells you much less about the procedure than the errors do.

One of the most fascinating examples of heuristics and biases is what we call intuition — a complex cluster of cognitive processes, sometimes helpful but often misleading. Kahneman notes that thoughts come to mind in one of two ways: either by “orderly computation,” which involves a series of stages of remembering rules and then applying them, or by perception, an evolutionary function that allows us to predict outcomes based on what we’re perceiving. (For instance, seeing a woman’s angry face helps us predict the general sentiment and disposition of what she’s about to say.) It is the latter mode that precipitates intuition. Kahneman explains the interplay:

There is no sharp line between intuition and perception. … Perception is predictive. . . . If you want to understand intuition, it is very useful to understand perception, because so many of the rules that apply to perception apply as well to intuitive thinking. Intuitive thinking is quite different from perception. Intuitive thinking has language. Intuitive thinking has a lot of word knowledge organized in different ways more than mere perception. But some very basic characteristics [of] perception are extended almost directly to intuitive thinking.

He then considers how the two types of mental operations established by modern cognitive science illuminate intuition:

Type 1 is automatic, effortless, often unconscious, and associatively coherent. . . . Type 2 is controlled, effortful, usually conscious, tends to be logically coherent, rule-governed. Perception and intuition are Type 1. … Type 2 is more controlled, slower, is more deliberate. . . . Type 2 is who we think we are. [And yet] if one made a film on this, Type 2 would be a secondary character who thinks that he is the hero because that’s who we think we are, but in fact, it’s Type 1 that does most of the work, and it’s most of the work that is completely hidden from us.

Type 1 also encompasses all of our practiced skills — for instance, driving, speaking, and understanding a language — which after a certain threshold of mastery enter autopilot mode. (Though this presents its own set of problems.) Underpinning that mode of thinking is our associative memory, which Kahneman unpacks:

You have to think of [your associative memory] as a huge repository of ideas, linked to each other in many ways, including causal links and other links, and activation spreading from ideas to other ideas until a small subset of that enormous network is illuminated, and the subset is what’s happening in the mind at the moment. You’re not conscious of it, you’re conscious of very little of it.

The Type 1 modality of thought gives rise to a System 1 of interpretation, which is at the heart of what we call “intuition” — but which is far less accurate and reliable than we like to believe:

System 1 infers and invents causes and intentions. [This] happens automatically. Infants have it. . . . We’re equipped … for the perception of causality.

It neglects ambiguity and suppresses doubt and … exaggerates coherence. Associative coherence [is] in large part where the marvels turn into flaws. We see a world that is vastly more coherent than the world actually is. That’s because of this coherence-creating mechanism that we have. We have a sense-making organ in our heads, and we tend to see things that are emotionally coherent, and that are associatively coherent.

Most treacherous of all is our tendency to use our very confidence — and overconfidence — as evidence itself:

What’s interesting is that many a time people have intuitions that they’re equally confident about except they’re wrong. That happens through the mechanism I call “the mechanism of substitution.” You have been asked a question, and instead you answer another question, but that answer comes by itself with complete confidence, and you’re not aware that you’re doing something that you’re not an expert on because you have one answer. Subjectively, whether it’s right or wrong, it feels exactly the same. Whether it’s based on a lot of information, or a little information, this is something that you may step back and have a look at. But the subjective sense of confidence can be the same for intuition that arrives from expertise, and for intuitions that arise from heuristics. . . .

In other words, intuition, like attention, is “an intentional, unapologetic discriminator [that] asks what is relevant right now, and gears us up to notice only that” — a humbling antidote to our culture’s propensity for self-righteousness, and above all a reminder to allow yourself the uncomfortable luxury of changing your mind.

Originally featured in October — read the full article here.

9. MANAGE YOUR DAY-TO-DAY

We seem to have a strange but all too human cultural fixation on the daily routines and daily rituals of famous creators, from Vonnegut to Burroughs to Darwin — as if a glimpse of their day-to-day would somehow magically infuse ours with equal potency, or replicating it would allow us to replicate their genius in turn. And though much of this is mere cultural voyeurism, there is something to be said for the value of a well-engineered daily routine to anchor the creative process. Manage Your Day-to-Day: Build Your Routine, Find Your Focus, and Sharpen Your Creative Mind (public library), edited by Behance’s 99U editor-in-chief Jocelyn Glei, delves into the secrets of this holy grail of creativity. Twenty of today’s most celebrated thinkers and doers explore such facets of the creative life as optimizing your idea-generation, defying the demons of perfectionism, managing procrastination, and breaking through your creative blocks, with insights from magnificent minds ranging from behavioral economist Dan Ariely to beloved graphic designer Stefan Sagmeister.

In the foreword to the book, Behance founder Scott Belsky, author of the indispensable Making Ideas Happen, points to “reactionary workflow” — our tendency to respond to requests and other stimuli rather than create meaningful work — as today’s biggest problem and propounds a call to arms:

It’s time to stop blaming our surroundings and start taking responsibility. While no workplace is perfect, it turns out that our gravest challenges are a lot more primal and personal. Our individual practices ultimately determine what we do and how well we do it. Specifically, it’s our routine (or lack thereof), our capacity to work proactively rather than reactively, and our ability to systematically optimize our work habits over time that determine our ability to make ideas happen.

[…]

Only by taking charge of your day-to-day can you truly make an impact in what matters most to you. I urge you to build a better routine by stepping outside of it, find your focus by rising above the constant cacophony, and sharpen your creative prowess by analyzing what really matters most when it comes to making your ideas happen.

One of the book’s strongest insights comes from Gretchen Rubin — author of The Happiness Project: Or, Why I Spent a Year Trying to Sing in the Morning, Clean My Closets, Fight Right, Read Aristotle, and Generally Have More Fun, one of these 7 essential books on the art and science of happiness, titled after her fantastic blog of the same name — who points to frequency as the key to creative accomplishment:

We tend to overestimate what we can do in a short period, and underestimate what we can do over a long period, provided we work slowly and consistently. Anthony Trollope, the nineteenth-century writer who managed to be a prolific novelist while also revolutionizing the British postal system, observed, “A small daily task, if it be really daily, will beat the labours of a spasmodic Hercules.” Over the long run, the unglamorous habit of frequency fosters both productivity and creativity.

Frequency, she argues, helps facilitate what Arthur Koestler has famously termed “bisociation” — the crucial ability to link the seemingly unlinkable, which is the defining characteristic of the creative mind. Rubin writes:

You’re much more likely to spot surprising relationships and to see fresh connections among ideas, if your mind is constantly humming with issues related to your work. When I’m deep in a project, everything I experience seems to relate to it in a way that’s absolutely exhilarating. The entire world becomes more interesting. That’s critical, because I have a voracious need for material, and as I become hyperaware of potential fodder, ideas pour in. By contrast, working sporadically makes it hard to keep your focus. It’s easy to become blocked, confused, or distracted, or to forget what you were aiming to accomplish.

[…]

Creativity arises from a constant churn of ideas, and one of the easiest ways to encourage that fertile froth is to keep your mind engaged with your project. When you work regularly, inspiration strikes regularly.

Echoing Alexander Graham Bell, who memorably wrote that “it is the man who carefully advances step by step … who is bound to succeed in the greatest degree,” and Virginia Woolf, who extolled the creative benefits of keeping a diary, Rubin writes:

Step by step, you make your way forward. That’s why practices such as daily writing exercises or keeping a daily blog can be so helpful. You see yourself do the work, which shows you that you can do the work. Progress is reassuring and inspiring; panic and then despair set in when you find yourself getting nothing done day after day. One of the painful ironies of work life is that the anxiety of procrastination often makes people even less likely to buckle down in the future.

Riffing on wisdom from her latest book, Happier at Home: Kiss More, Jump More, Abandon a Project, Read Samuel Johnson, and My Other Experiments in the Practice of Everyday Life, Rubin offers:

I have a long list of “Secrets of Adulthood,” the lessons I’ve learned as I’ve grown up, such as: “It’s the task that’s never started that’s more tiresome,” “The days are long, but the years are short,” and “Always leave plenty of room in the suitcase.” One of my most helpful Secrets is, “What I do every day matters more than what I do once in a while.”

With a sentiment reminiscent of William James’s timeless words on habit, she concludes:

Day by day, we build our lives, and day by day, we can take steps toward making real the magnificent creations of our imaginations.

Entrepreneurship guru and culture-sage Seth Godin seconds Rubin and admonishes against confusing vacant ritualization with creative rituals that actually spur productivity:

Everybody who does creative work has figured out how to deal with their own demons to get their work done. There is no evidence that setting up your easel like Van Gogh makes you paint better. Tactics are idiosyncratic. But strategies are universal, and there are a lot of talented folks who are not succeeding the way they want to because their strategies are broken.

The strategy is simple, I think. The strategy is to have a practice, and what it means to have a practice is to regularly and reliably do the work in a habitual way.

There are many ways you can signify to yourself that you are doing your practice. For example, some people wear a white lab coat or a particular pair of glasses, or always work in a specific place — in doing these things, they are professionalizing their art.

He echoes Chuck Close (“Inspiration is for amateurs — the rest of us just show up and get to work.”), Tchaikovsky (“A self-respecting artist must not fold his hands on the pretext that he is not in the mood.”), E. B. White (“A writer who waits for ideal conditions under which to work will die without putting a word on paper.”), and Isabel Allende (“Show up, show up, show up, and after a while the muse shows up, too.”), observing:

The notion that I do my work here, now, like this, even when I do not feel like it, and especially when I do not feel like it, is very important. Because lots and lots of people are creative when they feel like it, but you are only going to become a professional if you do it when you don’t feel like it. And that emotional waiver is why this is your work and not your hobby.

Originally featured in May — read the full article here. Also of note: 99U’s sequel, Maximize Your Potential, which collects practical wisdom from 21 celebrated creative entrepreneurs.

10. GIVE AND TAKE

“The principle of give and take; that is diplomacy— give one and take ten,” Mark Twain famously smirked. But for every such cynicism, there’s a heartening meditation on the art of asking and the beautiful osmosis of altruism. “The world is just,” Amelia Barr admonished in her rules for success, “it may, it does, patronize quacks; but it never puts them on a level with true men.” After all, it pays to be nice because, as Austin Kleon put it, “the world is a small town,” right?

Well, maybe — maybe not. Just though the world may be, how givers and takers fare in matters of success proves to be more complicated. So argues organizational psychology wunderkind Adam Grant (remember him?), the youngest-tenured and highest-rated Wharton professor at my alma mater, in Give and Take: A Revolutionary Approach to Success (public library).

Grant’s extensive research has shed light on a crucial element of success, debunking some enduring tenets of cultural mythology:

According to conventional wisdom, highly successful people have three things in common: motivation, ability, and opportunity. If we want to succeed, we need a combination of hard work, talent, and luck. [But there is] a fourth ingredient, one that’s critical but often neglected: success depends heavily on how we approach our interactions with other people. Every time we interact with another person at work, we have a choice to make: do we try to claim as much value as we can, or contribute value without worrying about what we receive in return?

At the heart of his insight is a dichotomy of behavioral styles people adopt in pursuing success:

Takers have a distinctive signature: they like to get more than they give. They tilt reciprocity in their own favor, putting their own interests ahead of others’ needs. Takers believe that the world is a competitive, dog-eat-dog place. They feel that to succeed, they need to be better than others. To prove their competence, they self-promote and make sure they get plenty of credit for their efforts. Garden-variety takers aren’t cruel or cutthroat; they’re just cautious and self-protective. “If I don’t look out for myself first,” takers think, “no one will.”

Grant contrasts takers with givers:

In the workplace, givers are a relatively rare breed. They tilt reciprocity in the other direction, preferring to give more than they get. Whereas takers tend to be self-focused, evaluating what other people can offer them, givers are other-focused, paying more attention to what other people need from them. These preferences aren’t about money: givers and takers aren’t distinguished by how much they donate to charity or the compensation that they command from their employers. Rather, givers and takers differ in their attitudes and actions toward other people. If you’re a taker, you help others strategically, when the benefits to you outweigh the personal costs. If you’re a giver, you might use a different cost-benefit analysis: you help whenever the benefits to others exceed the personal costs. Alternatively, you might not think about the personal costs at all, helping others without expecting anything in return. If you’re a giver at work, you simply strive to be generous in sharing your time, energy, knowledge, skills, ideas, and connections with other people who can benefit from them.

Outside the workplace, Grant argues by citing Yale psychologist Margaret Clark’s research, most of us are givers in close relationships like marriages and friendships, contributing without preoccupation with keeping score. In the workplace, however, few of us are purely givers or takers — rather, what dominates is a third style:

We become matchers, striving to preserve an equal balance of giving and getting. Matchers operate on the principle of fairness: when they help others, they protect themselves by seeking reciprocity. If you’re a matcher, you believe in tit for tat, and your relationships are governed by even exchanges of favors.

True to psychologists’ repeated insistence that personality is fluid rather than fixed, Grant notes:

Giving, taking, and matching are three fundamental styles of social interaction, but the lines between them aren’t hard and fast. You might find that you shift from one reciprocity style to another as you travel across different work roles and relationships. It wouldn’t be surprising if you act like a taker when negotiating your salary, a giver when mentoring someone with less experience than you, and a matcher when sharing expertise with a colleague. But evidence shows that at work, the vast majority of people develop a primary reciprocity style, which captures how they approach most of the people most of the time. And this primary style can play as much of a role in our success as hard work, talent, and luck.

Originally featured in April — for a closer look at Grant’s findings on the science of success, read the full article here.

11. THE EXAMINED LIFE

Despite ample evidence and countless testaments to the contrary, there persists a toxic cultural mythology that creative and intellectual excellence is a passive gift bestowed upon the fortunate few by the gods of genius, rather than the product of the active application and consistent cultivation of skill. So what might the root of that stubborn fallacy be? Childhood and upbringing, it turns out, may have a lot to do with it.

In The Examined Life: How We Lose and Find Ourselves (public library), psychoanalyst and University College London professor Stephen Grosz builds on more than 50,000 hours of conversation from his quarter-century experience as a practicing psychoanalyst to explore the machinery of our inner life, with insights that are invariably profound and often provocative — for instance, a section titled “How praise can cause a loss of confidence,” in which Grosz writes:

Nowadays, we lavish praise on our children. Praise, self-confidence and academic performance, it is commonly believed, rise and fall together. But current research suggests otherwise — over the past decade, a number of studies on self-esteem have come to the conclusion that praising a child as ‘clever’ may not help her at school. In fact, it might cause her to under-perform. Often a child will react to praise by quitting — why make a new drawing if you have already made ‘the best’? Or a child may simply repeat the same work — why draw something new, or in a new way, if the old way always gets applause?

Grosz cites psychologists Carol Dweck and Claudia Mueller’s famous 1998 study, which divided 128 children, ages ten and eleven, into two groups. All were asked to solve mathematical problems, but one group was praised for intellect (“You did really well, you’re so clever.”) and the other for effort (“You did really well, you must have tried really hard.”). The kids were then given more complex problems, which those previously praised for their hard work approached with dramatically greater resilience and willingness to try different approaches whenever they reached a dead end. By contrast, those who had been praised for their cleverness were much more anxious about failure, stuck with tasks they had already mastered, and lost tenacity in the face of new problems. Grosz summarizes the now-legendary findings:

Ultimately, the thrill created by being told ‘You’re so clever’ gave way to an increase in anxiety and a drop in self-esteem, motivation and performance. When asked by the researchers to write to children in another school, recounting their experience, some of the ‘clever’ children lied, inflating their scores. In short, all it took to knock these youngsters’ confidence, to make them so unhappy that they lied, was one sentence of praise.

He goes on to admonish against today’s culture of excessive parental praise, which he argues does more for lifting the self-esteem of the parents than for cultivating a healthy one in their children:

Admiring our children may temporarily lift our self-esteem by signaling to those around us what fantastic parents we are and what terrific kids we have — but it isn’t doing much for a child’s sense of self. In trying so hard to be different from our parents, we’re actually doing much the same thing — doling out empty praise the way an earlier generation doled out thoughtless criticism. If we do it to avoid thinking about our child and her world, and about what our child feels, then praise, just like criticism, is ultimately expressing our indifference.

To explore what the healthier substitute for praise might be, he recounts observing an eighty-year-old remedial reading teacher named Charlotte Stiglitz, the mother of the Nobel Prize-winning economist Joseph Stiglitz, who told Grosz of her teaching methodology:

‘I don’t praise a small child for doing what they ought to be able to do,’ she told me. ‘I praise them when they do something really difficult — like sharing a toy or showing patience. I also think it is important to say “thank you”. When I’m slow in getting a snack for a child, or slow to help them and they have been patient, I thank them. But I wouldn’t praise a child who is playing or reading.’

Rather than utilizing the familiar mechanisms of reward and punishment, Grosz observed, Charlotte’s method relied on keen attentiveness to “what a child did and how that child did it.” Presence, he argues, helps build the child’s confidence by way of indicating he is worthy of the observer’s thoughts and attention — its absence, on the other hand, divorces in the child the journey from the destination by instilling a sense that the activity itself is worthless unless it’s a means to obtaining praise. Grosz reminds us how this plays out for all of us, and why it matters throughout life:

Being present, whether with children, with friends, or even with oneself, is always hard work. But isn’t this attentiveness — the feeling that someone is trying to think about us — something we want more than praise?

Originally featured in May — read the full article here.

12. TO SELL IS HUMAN

Whether it’s “selling” your ideas, your writing, or yourself to a potential mate, the art of the sell is crucial to your fulfillment in life, both personal and professional. So argues Dan Pink in To Sell Is Human: The Surprising Truth About Moving Others (public library; UK) — a provocative anatomy of the art-science of “selling” in the broadest possible sense of the word, substantiated by ample research spanning psychology, behavioral economics, and the social sciences.

Pink, wary of the disagreeable twinges accompanying the claim that everyone should self-identify as a salesperson, preemptively counters in the introduction:

I’m convinced we’ve gotten it wrong.

This is a book about sales. But it is unlike any book about sales you have read (or ignored) before. That’s because selling in all its dimensions — whether pushing Buicks on a lot or pitching ideas in a meeting — has changed more in the last ten years than it did over the previous hundred. Most of what we think we understand about selling is constructed atop a foundation of assumptions that have crumbled.

[…]

Selling, I’ve grown to understand, is more urgent, more important, and, in its own sweet way, more beautiful than we realize. The ability to move others to exchange what they have for what we have is crucial to our survival and our happiness. It has helped our species evolve, lifted our living standards, and enhanced our daily lives. The capacity to sell isn’t some unnatural adaptation to the merciless world of commerce. It is part of who we are.

One of Pink’s most fascinating arguments echoes artist Chuck Close, who famously noted that “our whole society is much too problem-solving oriented. It is far more interesting to [participate in] ‘problem creation.’” Pink cites the research of celebrated social scientists Jacob Getzels and Mihaly Csikszentmihalyi, who in the 1960s recruited three dozen fourth-year art students for an experiment. They brought the young artists into a studio with two large tables. The first table displayed 27 eclectic objects that the school used in its drawing classes. The students were instructed to select one or more objects, then arrange a still life on the second table and draw it. What happened next reveals an essential pattern about how creativity works:

The young artists approached their task in two distinct ways. Some examined relatively few objects, outlined their idea swiftly, and moved quickly to draw their still life. Others took their time. They handled more objects, turned them this way and that, rearranged them several times, and needed much longer to complete the drawing. As Csikszentmihalyi saw it, the first group was trying to solve a problem: How can I produce a good drawing? The second was trying to find a problem: What good drawing can I produce?

When Csikszentmihalyi then assembled a group of art experts to evaluate the resulting works, he found that the problem-finders’ drawings had been ranked much higher in creativity than the problem-solvers’. Ten years later, the researchers tracked down these art students, who at that point were working for a living, and found that about half had left the art world, while the other half had gone on to become professional artists. That latter group was composed almost entirely of problem-finders. Another decade later, the researchers checked in again and discovered that the problem-finders were “significantly more successful — by the standards of the artistic community — than their peers.” Getzels concluded:

It is in fact the discovery and creation of problems rather than any superior knowledge, technical skill, or craftsmanship that often sets the creative person apart from others in his field.

Pink summarizes:

The more compelling view of the nature of problems has enormous implications for the new world of selling. Today, both sales and non-sales selling depend more on the creative, heuristic, problem-finding skills of artists than on the reductive, algorithmic, problem-solving skills of technicians.

Another fascinating chapter reveals counterintuitive insights about the competitive advantages of introversion vs. extraversion. While new theories might extol the power of introverts over traditional exaltations of extraversion, the truth turns out to be quite different: Pink turns to the research of social psychologist Adam Grant, management professor at the Wharton School of Business at the University of Pennsylvania (my alma mater).

Grant measured where a sample of call center sales representatives fell on the introversion-extraversion spectrum, then correlated that with their actual sales figures. Unsurprisingly, Grant found that extraverts averaged $125 per hour in revenue, exceeding introverts’ $120. His most surprising finding, however, was that “ambiverts” — those who fell in the middle of the spectrum, “not too hot, not too cold” — performed best of all, with an hourly average of $155. The outliers who brought in an astounding $208 per hour scored a solid 4 on the 1-7 introversion-extraversion scale.

Pink synthesizes the findings into an everyday insight for the rest of us:

The best approach is for the people on the ends to emulate those in the center. As some have noted, introverts are ‘geared to inspect,’ while extraverts are ‘geared to respond.’ Selling of any sort — whether traditional sales or non-sales selling — requires a delicate balance of inspecting and responding. Ambiverts can find that balance. They know when to speak and when to shut up. Their wider repertoires allow them to achieve harmony with a broader range of people and a more varied set of circumstances. Ambiverts are the best movers because they’re the most skilled attuners.

Pink goes on to outline “the new ABCs of moving others” — attunement (“the ability to bring one’s actions and outlook into harmony with other people and with the context you’re [sic] in”), buoyancy (a trifecta of “interrogative self-talk” that moves from making statements to asking questions, contagious “positivity,” and an optimistic “explanatory style” of explaining negative events to yourself), and clarity (“the capacity to help others see their situations in fresh and more revealing ways and to identify problems they didn’t realize they had”).

Originally featured in February — read the full article here, where you can watch the charming companion video.

13. HOW TO STAY SANE

“I pray to Jesus to preserve my sanity,” Jack Kerouac professed in discussing his writing routine. But those of us who fall on the more secular end of the spectrum might need a slightly more potent sanity-preservation tool than prayer. That’s precisely what writer and psychotherapist Philippa Perry offers in How To Stay Sane (public library; UK), part of The School of Life’s wonderful series reclaiming the traditional self-help genre as intelligent, non-self-helpy, yet immensely helpful guides to modern living.

At the heart of Perry’s argument — in line with neurologist Oliver Sacks’s recent meditation on memory and how “narrative truth,” rather than “historical truth,” shapes our impression of the world — is the recognition that stories make us human and learning to reframe our interpretations of reality is key to our experience of life:

Our stories give shape to our inchoate, disparate, fleeting impressions of everyday life. They bring together the past and the future into the present to provide us with structures for working towards our goals. They give us a sense of identity and, most importantly, serve to integrate the feelings of our right brain with the language of our left.

[…]

We are primed to use stories. Part of our survival as a species depended upon listening to the stories of our tribal elders as they shared parables and passed down their experience and the wisdom of those who went before. As we get older it is our short-term memory that fades rather than our long-term memory. Perhaps we have evolved like this so that we are able to tell the younger generation about the stories and experiences that have formed us which may be important to subsequent generations if they are to thrive.

I worry, though, about what might happen to our minds if most of the stories we hear are about greed, war and atrocity.

Perry goes on to cite research indicating that people who watch television for more than four hours a day see themselves as far more likely to fall victim to a violent incident in the forthcoming week than their peers who watch less than two hours a day. Just as E. B. White advocated for the responsibility of the writer “to lift people up, not lower them down,” so too is it our responsibility, as the writers of our own life-stories, to avoid the well-documented negativity bias of modern media — because, as artist Austin Kleon wisely put it, “you are a mashup of what you let into your life.” Perry writes:

Be careful which stories you expose yourself to.

[…]

The meanings you find, and the stories you hear, will have an impact on how optimistic you are: it’s how we evolved. … If you do not know how to draw positive meaning from what happens in life, the neural pathways you need to appreciate good news will never fire up.

[…]

The trouble is, if we do not have a mind that is used to hearing good news, we do not have the neural pathways to process such news.

Yet despite the human brain’s adaptive optimism bias, Perry argues that a positive outlook is a practice — one that requires mastering the art of vulnerability and increasing our essential tolerance for uncertainty:

You may find that you have been telling yourself that practicing optimism is a risk, as though, somehow, a positive attitude will invite disaster and so if you practice optimism it may increase your feelings of vulnerability. The trick is to increase your tolerance for vulnerable feelings, rather than avoid them altogether.

[…]

Optimism does not mean continual happiness, glazed eyes and a fixed grin. When I talk about the desirability of optimism I do not mean that we should delude ourselves about reality. But practicing optimism does mean focusing more on the positive fall-out of an event than on the negative. … I am not advocating the kind of optimism that means you blow all your savings on a horse running at a hundred to one; I am talking about being optimistic enough to sow some seeds in the hope that some of them will germinate and grow into flowers.

Another key obstruction to our sanity is our chronic aversion to being wrong, entwined with our damaging fear of the unfamiliar. Perry cautions:

We all like to think we keep an open mind and can change our opinions in the light of new evidence, but most of us seem to be geared to making up our minds very quickly. Then we process further evidence not with an open mind but with a filter, only acknowledging the evidence that backs up our original impression. It is too easy for us to fall into the trap of believing that being right is more important than being open to what might be.

If we practice detachment from our thoughts we learn to observe them as though we are taking a bird’s eye view of our own thinking. When we do this, we might find that our thinking belongs to an older, and different, story to the one we are now living.

Perry concludes:

We need to look at the repetitions in the stories we tell ourselves [and] at the process of the stories rather than merely their surface content. Then we can begin to experiment with changing the filter through which we look at the world, start to edit the story and thus regain flexibility where we have been getting stuck.

Originally featured in February — read the full article here.


The 13 Best Biographies, Memoirs, and History Books of 2013

From Alan Turing to Susan Sontag, by way of a lost cat, a fierce Victorian lady-journalist, and some very odd creative habits.

It’s that time of year again, the time for those highly subjective, grossly non-exhaustive, yet inevitable and invariably fun best-of reading lists. To kick off the season, here are my thirteen favorite biographies, memoirs, and history books of 2013. (Catch up on last year’s best history books.)

1. LOST CAT

“Dogs are not about something else. Dogs are about dogs,” Malcolm Gladwell asserted indignantly in the introduction to The Big New Yorker Book of Dogs. Though hailed as memetic rulers of the internet, cats have also enjoyed a long history as artistic and literary muses, but never have they been at once more about cats and more about something else than in Lost Cat: A True Story of Love, Desperation, and GPS Technology (public library) by firefighter-turned-writer Caroline Paul and illustrator extraordinaire Wendy MacNaughton, she of many wonderful collaborations — a tender, imaginative memoir infused with equal parts humor and humanity. (You might recall a subtle teaser for this gem in Wendy’s wonderful recent illustration of Gay Talese’s taxonomy of cats.) Though “about” a cat, this heartwarming and heartbreaking tale is really about what it means to be human — about the osmosis of hollowing loneliness and profound attachment, the oscillation between boundless affection and paralyzing fear of abandonment, the unfair promise of loss implicit in every possibility of love.

After Caroline crashes an experimental plane she was piloting, she finds herself severely injured and spiraling into the depths of depression. It both helps and doesn’t that Caroline and Wendy have just fallen in love, soaring in the butterfly heights of new romance, “the phase of love that didn’t obey any known rules of physics,” until the crash pulls them into a place that would challenge even the most seasoned and grounded of relationships. And yet they persevere as Wendy patiently and lovingly takes care of Caroline.

When Caroline returns from the hospital with a shattered ankle, her two thirteen-year-old tabbies — the shy, anxious Tibby (short for Tibia, affectionately — and, in these circumstances, ironically — named after the shinbone) and the sociable, amicable Fibby (short for Fibula, after the calf bone on the lateral side of the tibia) — are, short of Wendy, her only joy and comfort:

Tibia and Fibula meowed happily when I arrived. They were undaunted by my ensuing stupor. In fact they were delighted; suddenly I had become a human who didn’t shout into a small rectangle of lights and plastic in her hand, peer at a computer, or get up and disappear from the vicinity, only to reappear through the front door hours later. Instead, I was completely available to them at all times. Amazed by their good luck, they took full feline advantage. They asked for ear scratches and chin rubs. They rubbed their whiskers along my face. They purred in response to my slurred, affectionate baby talk. But mostly they just settled in and went to sleep. Fibby snored into my neck. Tibby snored on the rug nearby. Meanwhile I lay awake, circling the deep dark hole of depression.

Without my cats, I would have fallen right in.

And then, one day, Tibby disappears.

Wendy and Caroline proceed to flyer the neighborhood, visit every animal shelter in the vicinity, and even, in their desperation, enlist the help of a psychic who specializes in lost pets — but to no avail. Heartbroken, they begin to mourn Tibby’s loss.

And then, one day five weeks later, Tibby reappears. But once the initial elation of the recovery has worn off, Caroline begins to wonder where he’d been and why he’d left. He is now no longer eating at home and regularly leaves the house for extended periods of time — Tibby clearly has a secret place he now returns to. Even more worrisomely, he’s no longer the shy, anxious tabby he’d been for thirteen years — instead, he’s a half pound heavier, chirpy, with “a youthful spring in his step.” But why would a happy cat abandon his loving lifelong companion and find comfort — find himself, even — elsewhere?

When the relief that my cat was safe began to fade, and the joy of his prone, snoring form — sprawled like an athlete after a celebratory night of boozing — started to wear thin, I was left with darker emotions. Confusion. Jealousy. Betrayal. I thought I’d known my cat of thirteen years. But that cat had been anxious and shy. This cat was a swashbuckling adventurer back from the high seas. What siren call could have lured him away? Was he still going to this gilded place, with its overflowing food bowls and endless treats?

There was only one obvious thing left to do: track Tibby on his escapades. So Caroline, despite Wendy’s lovingly suppressed skepticism, heads to a spy store — yes, those exist — and purchases a real-time GPS tracker, complete with a camera that they program to take snapshots every few minutes, which they then attach to Tibby’s collar.

What follows is a wild, hilarious, and sweet tale of tinkering, tracking, and tenderness. Underpinning the obsessive quest is the subtle yet palpable subplot of Wendy and Caroline’s growing love for each other, the deepening of trust and affection that happens when two people share in a special kind of insanity.

“Every quest is a journey, every journey a story. Every story, in turn, has a moral,” writes Caroline in the final chapter, then offers several “possible morals” for the story, the last two of which embody everything that makes Lost Cat an absolute treat from cover to cover:

6. You can never know your cat. In fact, you can never know anyone as completely as you want.

7. But that’s okay, love is better.

Take a closer look here, then hear MacNaughton and Paul in conversation about combining creative collaboration with a romantic relationship.

2. SUSAN SONTAG: THE COMPLETE ROLLING STONE INTERVIEW

In 1978, Rolling Stone contributing editor Jonathan Cott interviewed Susan Sontag in twelve hours of conversation, beginning in Paris and continuing in New York, only a third of which was published in the magazine. More than three decades later and almost a decade after Sontag’s death, the full, wide-ranging magnificence of their tête-à-tête, spanning literature, philosophy, illness, mental health, music, art, and much more, is at last released in Susan Sontag: The Complete Rolling Stone Interview (public library) — a rare glimpse of one of modern history’s greatest minds in her element.

Cott marvels at what made the dialogue especially extraordinary:

Unlike almost any other person whom I’ve ever interviewed — the pianist Glenn Gould is the one other exception — Susan spoke not in sentences but in measured and expansive paragraphs. And what seemed most striking to me was the exactitude and “moral and linguistic fine-tuning” — as she once described Henry James’s writing style — with which she framed and elaborated her thoughts, precisely calibrating her intended meanings with parenthetical remarks and qualifying words (“sometimes,” “occasionally,” “usually,” “for the most part,” “in almost all cases”), the munificence and fluency of her conversation manifesting what the French refer to as an ivresse du discours — an inebriation with the spoken word. “I am hooked on talk as a creative dialogue,” she once remarked in her journals, and added: “For me, it’s the principal medium of my salvation.”

In one segment of the conversation, Sontag discusses how the false divide between “high” and pop culture impoverishes our lives. In another, she makes a beautiful case for the value of history:

I really believe in history, and that’s something people don’t believe in anymore. I know that what we do and think is a historical creation. I have very few beliefs, but this is certainly a real belief: that most everything we think of as natural is historical and has roots — specifically in the late eighteenth and early nineteenth centuries, the so-called Romantic revolutionary period — and we’re essentially still dealing with expectations and feelings that were formulated at that time, like ideas about happiness, individuality, radical social change, and pleasure. We were given a vocabulary that came into existence at a particular historical moment. So when I go to a Patti Smith concert at CBGB, I enjoy, participate, appreciate, and am tuned in better because I’ve read Nietzsche.

In another meditation, she argues for the existential and creative value of presence:

What I want is to be fully present in my life — to be really where you are, contemporary with yourself in your life, giving full attention to the world, which includes you. You are not the world, the world is not identical to you, but you’re in it and paying attention to it. That’s what a writer does — a writer pays attention to the world. Because I’m very against this solipsistic notion that you find it all in your head. You don’t, there really is a world that’s there whether you’re in it or not.

In another passage, she considers how taking responsibility empowers rather than disempowers us:

I want to feel as responsible as I possibly can. As I told you before, I hate feeling like a victim, which not only gives me no pleasure but also makes me feel very uncomfortable. Insofar as it’s possible, and not crazy, I want to enlarge to the furthest extent possible my sense of my own autonomy, so that in friendship and love relationships I’m eager to take responsibility for both the good and the bad things. I don’t want this attitude of “I was so wonderful and that person did me in.” Even when it’s sometimes true, I’ve managed to convince myself that I was at least co-responsible for bad things that have happened to me, because it actually makes me feel stronger and makes me feel that things could perhaps be different.

The conversation, in which Sontag reaches unprecedented depths of self-revelation, also debunks some misconceptions about her public image as an intellectual in the dry, scholarly sense of the term:

Most of what I do, contrary to what people think, is so intuitive and unpremeditated and not at all that kind of cerebral, calculating thing people imagine it to be. I’m just following my instincts and intuitions. […] An argument appears to me much more like the spokes of a wheel than the links of a chain.

Originally featured earlier this month — take a closer look here.

3. MAURICE SENDAK: A CELEBRATION OF THE ARTIST AND HIS WORK

Maurice Sendak is celebrated by many, myself included, as the greatest and most influential children’s book artist of the past century. A year after Sendak’s death comes Maurice Sendak: A Celebration of the Artist and His Work (public library) — the companion volume to the wonderful 2013 exhibition at New York’s Society of Illustrators. From rich essays by historians and artists who contextualize Sendak’s life and legacy to a selection of his best-loved and notable little-known illustrations, the book is a treasure trove of insight on Sendak’s spirit, sensibility, and evolution as an artist.

Dive deeper with excerpts exploring Sendak’s lessons on art and storytelling and his lovely vintage posters celebrating the joy of reading.

4. DIVINE FURY: A HISTORY OF GENIUS

“Genius is nothing more nor less than doing well what anyone can do badly,” celebrated British novelist Amelia E. Barr wrote in her 9 rules for success in 1901. Indeed, the notion of what genius is and isn’t endures as one of our culture’s greatest fixations. We apply the label of “genius” to everyone from our greatest luminaries to exceptional children’s book editors to our dogs, and we even nickname prestigious cultural awards after it. But what, precisely, is genius? Why was the concept of it born in the first place, where did it begin, how did it evolve, and what does it mean today? That’s precisely what historian Darrin M. McMahon explores in Divine Fury: A History of Genius (public library) — a fascinating, first-of-its-kind chronicle of the evolution of genius as a cultural concept, its permutations across millennia of creative history, and its more recent role as a social equalizer and a double-edged sword of democratization.

McMahon begins:

Even today, more than 2,000 years after its first recorded use by the Roman author Plautus, [the word “genius”] continues to resonate with power and allure. The power to create. The power to divine the secrets of the universe. The power to destroy. With its hints of madness and eccentricity, sexual prowess and protean possibility, genius remains a mysterious force, bestowing on those who would assume it superhuman abilities and godlike powers. Genius, conferring privileged access to the hidden workings of the world. Genius, binding us still to the last vestiges of the divine.

Such lofty claims may seem excessive in an age when football coaches and rock stars are frequently described as “geniuses.” The luster of the word — once reserved for a pantheon of eminence, the truly highest of the high — has no doubt faded over time, the result of inflated claims and general overuse. The title of a BBC television documentary on the life of the Nobel Prize-winning physicist Richard Feynman sums up the situation: No Ordinary Genius. There was a time when such a title would have been redundant. That time is no more.

History’s 100 geniuses of literature and language, visualized.

McMahon argues that, in an age where we’re urged to explore the “genius” in all of us, we’ve grown increasingly obsessed with the word and the idea of genius, robbing it of substance in the process. Particularly in the last century, we’ve applied the label of “genius” frivolously and indiscriminately to everyone from rock stars to startup founders to, even, Adolf Hitler, whom TIME magazine crowned “man of the year” in 1938 for his evil genius. And yet the impulse to know — to be — genius is among our greatest, most profound human yearnings for union with divinity, something the legendary literary critic Harold Bloom has explored in his own meditation on genius. For the perfect embodiment of this desire, McMahon points to Albert Einstein, whom he considers “the quintessential modern genius”:

“I want to know how God created the world,” Einstein once observed. “I want to know his thoughts.” It was, to be sure, a manner of speaking, like the physicist’s celebrated line about the universe and dice. Still, the aspiration is telling. For genius, from its earliest origins, was a religious notion, and as such was bound up not only with the superhuman and transcendent, but also with the capacity for violence, destruction, and evil that all religions must confront.

McMahon sets out to unravel this lineage of unexpected associations by tracing the history of genius, both as a concept and as a figure, from antiquity to today, exploring a vibrant spectrum of individuals who both embodied and shaped the label — poets, philosophers, artists, scientists, inventors, composers, military strategists, entrepreneurs, and even a horse. As much a history of ideas as a psychological history of our grasping after the divine, the journey he takes us on is above all one of introspection through the lens of history. Reminding us that, as Toni Morrison memorably wrote, “definitions belong to the definers, not the defined,” McMahon argues for the social construction of genius:

If we wish to appreciate the role that genius has played in the modern world, we must recall the evil with the good, bearing in mind as we do so the uncomfortable thought that genius is ultimately the product of the hopes and longings of ordinary people. We are the ones who marvel and wonder, longing for the salvation genius might bring. We are the ones who pay homage and obeisance. In a very real sense, the creator of genius is us.

Which is not to deny that geniuses almost always possess something special, something real, however elusive that something may be. But it is to recognize the commonsense fact that genius is in part a social creation — what historians like to call a “construction” — and, as such, of service to those who build. That fact reminds us further that for all their originality (and originality is itself a defining feature of genius in its modern form), extraordinary human beings not only define their images but embody them, stepping into molds prepared by the social imaginary and the exemplars who came before. Even outliers as remarkable, as deviant, as Einstein and Hitler are no exceptions to this rule: however inimitable — however unique — their genius was partly prepared for them, worked out over the course of generations.

Originally featured in October — read the full article here.

5. MAD GIRL’S LOVE SONG

Half a century ago this year, Sylvia Plath — celebrated poet, little-known artist, lover of the world — took her own life, leaving behind her husband Ted Hughes and their two children. In the highly anticipated biography Mad Girl’s Love Song: Sylvia Plath and Life Before Ted (public library) — titled after the exquisite Plath poem — Andrew Wilson explores the poorly understood period of Plath’s life before her relationship with Hughes. Diving into the darkest corners of her diaries and letters, as well as previously unavailable archives and direct interviews with those who knew Plath, Wilson sets out to “trace the sources of her mental instabilities and examine how a range of personal, economic, and societal factors — the real disquieting muses — conspired against her.”

He writes in the introduction:

In her journal in 1950 she wrote of how she was living on the ‘edge.’ She was not alone, she added, as all of us were standing on the edge of a precipice looking down into darkness, peering into an unnerving pit below.

This book will show what compelled Plath to peek over the edge and stare into the abyss of the human psyche.

Wilson notes Plath’s chronic dissonance between repression and an insatiable hunger for life:

Plath was an addict of experience, and she could not bear the fact that young women like her were denied something so life-enhancing. In the same letter she goes on to write of her deep envy of males, anger she describes as ‘insidious, malignant, latent.’

Sex — or rather the constraints and repressions surrounding it — played a central role in Plath’s creative and psychological development. She realized, as she wrote in her journal in the autumn of 1950, she was too well brought up to disregard tradition, yet she hated boys who could express themselves sexually while she had no choice but to ‘drag’ herself from one date to the next in ‘soggy desire.’ The system, she added, disgusted her.

But Wilson tends to jump to causality a little too eagerly. As Clay Shirky poignantly pointed out about the tragic loss of Aaron Swartz, “suicide is not only about proximate causes.” Wilson writes:

If too much has been made of the symptoms of Plath’s mental illness, so too little attention has been paid to its possible causes. Sylvia Plath was an angry young woman born in a country and at a time that only exacerbated and intensified her fury. Not only did she feel maddened that she could not express herself sexually, she also was furious that she had not been born into a family of greater means. Her letters and journals are full of references to feeling inferior and self-conscious because of her low status. As a scholarship girl at Smith College — one of America’s top universities for women — she was surrounded by the daughters of the country’s great and the good. She peeled potatoes, chopped vegetables, and waited on tables as a way of reducing her course fees. In order to try and take the burden off her mother — who worked at Boston University’s College of Practical Arts and Letters to pay the shortfall between her daughter’s fees and her scholarship — Sylvia volunteered for extra jobs at the college and, in whatever spare time she had, she wrote poems and stories for money. If she took boys home to her family’s two-bedroom house in Wellesley, Massachusetts — where she was forced to share a room with her mother — she worried that they would see the marks and rips in the wallpaper; on occasions like these, the lights would have to be kept low so as to try and disguise the blemishes. In her first semester at Smith, in the fall of 1950, she wrote in her journal of the arduous transition period between childhood and young adulthood. To help her make sense of this new, troubling reality, she made a list of certain aspects of life that she found difficult, an inventory of notes addressed to herself that she could use to boost her confidence when it was low. One of the sections focuses on her economic position in society. She noted how she knew she would have to compete with other girls who had been born into wealthier families. The Plaths, she realized, were not only of modest means but they didn’t come from a line of well-connected intellectuals. She observed how boys from richer families would often remark, in a casual fashion, of her ‘side of town,’ and although they didn’t mean to be cruel, she felt the comments keenly.

Originally featured in February — read the full article here.

6. DARWIN: A GRAPHIC BIOGRAPHY

Joining other famous graphic biographies of cultural icons like Richard Feynman, Hunter S. Thompson, The Carter Family, and Steve Jobs, Darwin: A Graphic Biography (public library) offers a delightful visual take on the story of the father of evolution, decoder of human emotion, hopeless romantic, and occasional grump.

Written by journalist Eugene Byrne and illustrated by cartoonist Simon Gurr, the story takes us into the life and times of Darwin — from a curious child on a “beeting” expedition to a patient young man persevering through the ups and downs of battling creationist oppression to a worldwide legend — tracing his intellectual adventures amidst the fascinating scientific world of the 1800s.

The best part? This illustrated version of Darwin’s famous balance sheet on the pros and cons of marriage:

Originally featured in February — see more panels here.

7. EIGHTY DAYS

“Anything one man can imagine, other men can make real,” science fiction godfather Jules Verne famously proclaimed. He was right about the general sentiment but oh how very wrong about its gendered language: Sixteen years after Verne’s classic novel Around the World in Eighty Days, his vision for speed-circumnavigation would be made real — but by a woman. On the morning of November 14, 1889, Nellie Bly, an audacious newspaper reporter, set out to outpace Verne’s fictional itinerary by circumnavigating the globe in seventy-five days, thus setting the real-world record for the fastest trip around the world. In Eighty Days: Nellie Bly and Elizabeth Bisland’s History-Making Race Around the World (public library), Matthew Goodman traces the groundbreaking adventure, beginning with a backdrop of Bly’s remarkable journalistic fortitude and contribution to defying our stubbornly enduring biases about women writers:

No female reporter before her had ever seemed quite so audacious, so willing to risk personal safety in pursuit of a story. In her first exposé for The World, Bly had gone undercover … feigning insanity so that she might report firsthand on the mistreatment of the female patients of the Blackwell’s Island Insane Asylum. … Bly trained with the boxing champion John L. Sullivan; she performed, with cheerfulness but not much success, as a chorus girl at the Academy of Music (forgetting the cue to exit, she momentarily found herself all alone onstage). She visited with a remarkable deaf, dumb, and blind nine-year-old girl in Boston by the name of Helen Keller. Once, to expose the workings of New York’s white slave trade, she even bought a baby. Her articles were by turns lighthearted and scolding and indignant, some meant to edify and some merely to entertain, but all were shot through with Bly’s unmistakable passion for a good story and her uncanny ability to capture the public’s imagination, the sheer force of her personality demanding that attention be paid to the plight of the unfortunate, and, not incidentally, to herself.

For all her extraordinary talent and work ethic, Bly’s appearance was decidedly unremarkable — a fact that shouldn’t matter, but one that would be repeatedly remarked upon by her critics and commentators, something we’ve made sad little progress on in discussing women’s professional, intellectual, and creative merit more than a century later. Goodman paints a portrait of Bly:

She was a young woman in a plaid coat and cap, neither tall nor short, dark nor fair, not quite pretty enough to turn a head: the sort of woman who could, if necessary, lose herself in a crowd.

[…]

Her voice rang with the lilt of the hill towns of western Pennsylvania; there was an unusual rising inflection at the ends of her sentences, the vestige of an Elizabethan dialect that had still been spoken in the hills when she was a girl. She had piercing gray eyes, though sometimes they were called green, or blue-green, or hazel. Her nose was broad at its base and delicately upturned at the end — the papers liked to refer to it as a “retroussé” nose — and it was the only feature about which she was at all self-conscious. She had brown hair that she wore in bangs across her forehead. Most of those who knew her considered her pretty, although this was a subject that in the coming months would be hotly debated in the press.

But, as if the ambitious adventure weren’t scintillating enough, the story takes an unexpected turn: That fateful November morning, as Bly was making her way to the journey’s outset at the Hoboken docks, a man named John Brisben Walker passed her on a ferry in the opposite direction, traveling from Jersey City to Lower Manhattan. He was the publisher of a high-brow magazine titled The Cosmopolitan, the same publication that decades later, under the new ownership of William Randolph Hearst, would take a dive for the commercially low-brow. On his ferry ride, Walker skimmed that morning’s edition of The World and paused over the front-page feature announcing Bly’s planned adventure around the world. A seasoned manipulator of the public’s voracious appetite for drama, he instantly hatched an idea that would seize upon a unique publicity opportunity — The Cosmopolitan would send another circumnavigator to race against Bly. To keep things equal, it would have to be a woman. To keep them interesting, she’d travel in the opposite direction.

And so it went:

Elizabeth Bisland was twenty-eight years old, and after nearly a decade of freelance writing she had recently obtained a job as literary editor of The Cosmopolitan, for which she wrote a monthly review of recently published books entitled “In the Library.” Born into a Louisiana plantation family ruined by the Civil War and its aftermath, at the age of twenty she had moved to New Orleans and then, a few years later, to New York, where she contributed to a variety of magazines and was regularly referred to as the most beautiful woman in metropolitan journalism. Bisland was tall, with an elegant, almost imperious bearing that accentuated her height; she had large dark eyes and luminous pale skin and spoke in a low, gentle voice. She reveled in gracious hospitality and smart conversation, both of which were regularly on display in the literary salon that she hosted in the little apartment she shared with her sister on Fourth Avenue, where members of New York’s creative set, writers and painters and actors, gathered to discuss the artistic issues of the day. Bisland’s particular combination of beauty, charm, and erudition seems to have been nothing short of bewitching.

But Bisland was no literary bombshell. Wary of beauty’s fleeting and superficial nature — she once lamented, “After the period of sex-attraction has passed, women have no power in America” — she blended Edison’s circadian relentlessness and Tchaikovsky’s work ethic:

She took pride in the fact that she had arrived in New York with only fifty dollars in her pocket, and that the thousands of dollars now in her bank account had come by virtue of her own pen. Capable of working for eighteen hours at a stretch, she wrote book reviews, essays, feature articles, and poetry in the classical vein. She was a believer, more than anything else, in the joys of literature, which she had first experienced as a girl in ancient volumes of Shakespeare and Cervantes that she found in the library of her family’s plantation house. (She taught herself French while she churned butter, so that she might read Rousseau’s Confessions in the original — a book, as it turned out, that she hated.) She cared nothing for fame, and indeed found the prospect of it distasteful.

And yet, despite their competitive circumstances and seemingly divergent dispositions, something greater bound the two women together, some ineffable force of culture that quietly united them in a bold defiance of their era’s normative biases:

On the surface the two women … were about as different as could be: one woman a Northerner, the other from the South; one a scrappy, hard-driving crusader, the other priding herself on her gentility; one seeking out the most sensational of news stories, the other preferring novels and poetry and disdaining much newspaper writing as “a wild, crooked, shrieking hodge-podge,” a “caricature of life.” Elizabeth Bisland hosted tea parties; Nellie Bly was known to frequent O’Rourke’s saloon on the Bowery. But each of them was acutely conscious of the unequal position of women in America. Each had grown up without much money and had come to New York to make a place for herself in big-city journalism, achieving a hard-won success in what was still, unquestionably, a man’s world.

Originally featured in May — read the full article, including Bly’s entertaining illustrated packing list, here.

8. ODD TYPE WRITERS

Famous authors are notorious for their daily routines — sometimes outrageous, usually obsessive, invariably peculiar. In Odd Type Writers: From Joyce and Dickens to Wharton and Welty, the Obsessive Habits and Quirky Techniques of Great Authors (public library), Brooklyn-based writer Celia Blue Johnson takes us on a guided tour of great writers’ unusual techniques, prompts, and customs of committing thought to paper, from their ambitious daily word quotas to their superstitions to their inventive procrastination and multitasking methods.

As curious as these habits are, however, Johnson reminds us that public intellectuals often engineer their own myths, which means the quirky behaviors recorded in history’s annals should be taken with a grain of Salinger salt. She offers a necessary disclaimer, enveloped in a thoughtful meta-disclaimer:

One must always keep in mind that these writers and the people around them may have, at some point, embellished the facts. Quirks are great fodder for gossip and can morph into gross exaggeration when passed from one person to the next. There’s also no way to escape the self-mythologizing, particularly when dealing with some of the greatest storytellers that ever lived. Yet even when authors stretch the truth, they reveal something about themselves, whether it is the desire to project a certain image or the need to shy away from one.

Jack Kerouac’s hand-drawn cross-country road trip map from ‘On the Road’

Mode and medium of writing seem to be a recurring theme of personal idiosyncrasy. Wallace Stevens composed his poetry on slips of paper while walking — an activity he, like Maira Kalman, saw as a creative stimulant — then handed them to his secretary to type up. Edgar Allan Poe, champion of marginalia, wrote his final drafts on separate pieces of paper attached into a running scroll with sealing wax. Jack Kerouac was especially partial to scrolling: In 1951, after planning the book for years and amassing ample notes in his journals, he wrote On The Road in one feverish burst, letting it pour onto pages taped together into one enormously long strip of paper — a format he thought lent itself particularly well to his project, since it allowed him to maintain his rapid pace without pausing to reload the typewriter at the end of each page. When he was done, he marched into his editor Robert Giroux’s office and proudly spun out the scroll across the floor. The result, however, was equal parts comical and tragic:

To [Kerouac’s] dismay, Giroux focused on the unusual packaging. He asked, “But Jack, how can you make corrections on a manuscript like that?” Giroux recalled saying, “Jack, you know you have to cut this up. It has to be edited.” Kerouac left the office in a rage. It took several years for Kerouac’s agent, Sterling Lord, to finally find a home for the book, at the Viking Press.

James Joyce in his white coat

James Joyce wrote lying on his stomach in bed, with a large blue pencil, clad in a white coat, and composed most of Finnegans Wake with crayon pieces on cardboard. But this was a matter more of pragmatism than of superstition or vain idiosyncrasy: Of the many outrageously misguided myths about the celebrated author of Ulysses and wordsmith of little-known children’s books, one was actually right: he was nearly blind. His childhood myopia developed into severe eye problems by his twenties. To make matters worse, he developed rheumatic fever when he was twenty-five, which resulted in a painful eye condition called iritis. By 1930, he had undergone twenty-five eye surgeries, none of which improved his sight. The large crayons thus helped him see what he was writing, and the white coat helped reflect more light onto the page at night. (As someone partial to black bedding, not for aesthetic reasons but because I believe it provides a deeper dark at night, I can certainly relate to Joyce’s seemingly arbitrary but actually physics-driven attire choice.)

Virginia Woolf was as opinionated about the right way to write as she was about the right way to read. In her twenties, she spent two and a half hours every morning writing, at a three-and-a-half-foot-tall desk with an angled top that allowed her to look at her work both up-close and from afar. But according to her nephew and irreverent collaborator, Quentin Bell, Woolf’s prescient version of today’s trendy standing desk was less a practical matter than a symptom of her sibling rivalry with her sister, the Bloomsbury artist Vanessa Bell — the same sibling rivalry that would later inspire a charming picture-book: Vanessa painted standing, and Virginia didn’t want to be outdone by her sister. Johnson cites Quentin, who was known for his wry family humor:

This led Virginia to feel that her own pursuit might appear less arduous than that of her sister unless she set matters on a footing of equality.

Many authors measured the quality of their output by uncompromisingly quantitative metrics like daily word quotas. Jack London wrote 1,000 words a day every single day of his career, and William Golding once declared at a party that he wrote 3,000 words daily, a number Norman Mailer and Arthur Conan Doyle shared. Raymond Chandler, a man of strong opinions on the craft of writing, didn’t subscribe to a specific daily quota, but was known to write up to 5,000 words a day at his most productive. Anthony Trollope, who began his day promptly at 5:30 A.M., disciplined himself to write 250 words every 15 minutes, pacing himself with a watch. Stephen King does whatever it takes to reach his daily quota of 2,000 adverbless words, and Thomas Wolfe kept his at 1,800, not letting himself stop until he had reached it.

Flannery O’Connor and her peacocks

We already know how much famous authors loved their pets, but for many their non-human companions were essential to the creative process. Edgar Allan Poe considered his darling tabby, Catterina, his literary guardian, who “purred as if in complacent approval of the world proceeding under [her] supervision.” Flannery O’Connor developed an early affection for domestic poultry, from her childhood chicken (which, curiously enough, could walk backwards and once ended up in a newsreel clip) to her growing collection of pheasants, ducks, turkeys, and quail. Most famously, however, twenty-something O’Connor mail-ordered six peacocks, a peahen, and four peachicks, which later populated her fiction. But by far the most bizarre pet-related habit comes from Colette, who enlisted her dog in a questionable procrastination mechanism:

Colette would study the fur of her French bulldog, Souci, with a discerning eye. Then she’d pluck a flea from Souci’s back and would continue the hunt until she was ready to write.

But arguably the strangest habit of all comes from Friedrich Schiller, relayed by his friend Goethe:

[Goethe] had dropped by Schiller’s home and, after finding that his friend was out, decided to wait for him to return. Rather than wasting a few spare moments, the productive poet sat down at Schiller’s desk to jot down a few notes. Then a peculiar stench prompted Goethe to pause. Somehow, an oppressive odor had infiltrated the room.

Goethe followed the odor to its origin, which was actually right by where he sat. It was emanating from a drawer in Schiller’s desk. Goethe leaned down, opened the drawer, and found a pile of rotten apples. The smell was so overpowering that he became light-headed. He walked to the window and breathed in a few good doses of fresh air. Goethe was naturally curious about the trove of trash, though Schiller’s wife, Charlotte, could only offer the strange truth: Schiller had deliberately let the apples spoil. The aroma, somehow, inspired him, and according to his spouse, he “could not live or work without it.”

Charles Dickens’s manuscript for ‘Our Mutual Friend.’ Image courtesy of The Morgan Library.

Then there was the color-coding of the muses: In addition to his surprising gastronome streak, Alexandre Dumas was an aesthete: For decades, he penned all of his fiction on a particular shade of blue paper, his poetry on yellow, and his articles on pink; on one occasion, while traveling in Europe, he ran out of his precious blue paper and was forced to write on a cream-colored pad, which he was convinced made his fiction suffer. Charles Dickens was partial to blue ink, but not for superstitious reasons — because it dried faster than other colors, it allowed him to pen his fiction and letters without the drudgery of blotting. Virginia Woolf used different-colored inks in her pens — greens, blues, and purples. Purple was her favorite, reserved for letters (including her love letters to Vita Sackville-West), diary entries, and manuscript drafts. Lewis Carroll also preferred purple ink (and shared with Woolf a penchant for standing desks), but for much more pragmatic reasons: During his years teaching mathematics at Oxford, he, like all teachers there, was expected to use purple ink to correct students’ work — a habit that carried over to his fiction.

But lest we hastily surmise that writing in a white coat would make us a Joyce or drowning pages in purple ink a Woolf, Johnson prefaces her exploration with another important, beautifully phrased disclaimer:

That power to mesmerize has an intangible, almost magical quality, one I wouldn’t dare to try to meddle with by attempting to define it. It was never my goal as I wrote this book to discover what made literary geniuses tick. The nuances of any mind are impossible to pinpoint.

[…]

You could adopt one of these practices or, more ambitiously, combine several of them, and chances are you still wouldn’t invoke genius. These tales don’t hold a secret formula for writing a great novel. Rather, the authors in the book prove that the path to great literature is paved with one’s own eccentricities rather than someone else’s.

Originally featured in September — for more quirky habits, read the original article here. Runner up: Mason Currey’s Daily Rituals: How Artists Work.

9. ITALO CALVINO: LETTERS, 1941–1985

Italo Calvino: Letters, 1941–1985 (public library) offers more than four decades of wisdom in 600+ pages of personal correspondence by one of the 20th century’s most enchanting writers and most beautiful minds. In one letter, written on July 27, 1949, Calvino contributes one of his many insights on writing:

To write well about the elegant world you have to know it and experience it to the depths of your being just as Proust, Radiguet and Fitzgerald did: what matters is not whether you love it or hate it, but only to be quite clear about your position regarding it.

In another, he considers the secret of living well:

The inferno of the living is not something that will be; if there is one, it is what is already here, the inferno where we live every day, that we form by being together. There are two ways to escape suffering it. The first is easy for many: accept the inferno and become such a part of it that you can no longer see it. The second is risky and demands constant vigilance and apprehension: seek and learn to recognize who and what, in the midst of inferno, are not inferno, then make them endure, give them space.

Sample the altogether fantastic volume with Calvino’s advice on writing, his prescient meditation on abortion and the meaning of life, his poetic resume, and his thoughts on America.

10. DUKE: A LIFE OF DUKE ELLINGTON

Much like Freud engineered his own myth and Salinger crafted his personal legend, jazz legend Duke Ellington — whose funeral was witnessed by 10,000 people in the pews at the Cathedral Church of St. John the Divine, another 2,500 listening outside via loudspeakers, and thousands more tuned into the live radio broadcast, even prompting President Nixon to take a timeout from Watergate and praise “America’s foremost composer” — sculpted his public image with meticulous, obsessive, almost paranoid precision. In Duke: A Life of Duke Ellington (public library), writer, playwright, librettist, and Wall Street Journal theater critic Terry Teachout sets out to lift the veneer of Ellington’s polished public persona and uncover the mysterious complexity of Duke’s private person. Though Teachout — who also penned Pops, the excellent 2009 biography of Louis Armstrong — calls his biography “not so much a work of scholarship as an act of synthesis” for its collaging of existing research, interviews, and materials, don’t let his humility deceive you: This is a masterwork of dimensional insight into an icon who sought to flatten and flatter himself as much as possible and to shroud his exceptional artistry in exceptional artifice, a man woven of paradoxes, who, despite his chronic failings of private self-control, exerted his every faculty on controlling his public image. And yet, somehow, Teachout manages to peel away these protective layers and expose the flawed human being beneath them by elevating rather than diminishing Ellington’s humanity, enriching rather than discrediting his legacy.

Despite surrounding himself with a formidable entourage of deft PR custodians, he was ultimately his own best publicist — a man who employed the same charisma that made him an incredible entertainer in making his off-stage image as credible as possible, despite its assiduous artifice and methodical manipulation. Teachout writes:

That was Ellington’s way. He talked not to explain himself but to conceal himself. Even Ruth, his adoring younger sister, said that he “definitely wasn’t direct. He wasn’t direct with anybody about anything.” Yet he talked so fluently and impressively that nearly everyone believed him, save for those who had reason to know better.

Behind closed doors: composing at the Dorchester, his favorite London hotel, in 1963. Unposed offstage photos of Ellington are comparatively rare. He went out of his way to shape his public image to his liking — and to keep his private life out of the papers.

His publicists — who dubbed him “Harlem’s Aristocrat of Jazz” — took great care to echo and amplify the image Duke himself was projecting, pitching him not only as a mere jazzman but as a true artist bearing the seal of approval of the era’s glitterati. They issued actual publicity manuals that were sent out to the managers of theaters and ballrooms where Ellington performed. One read:

Sell Ellington as a great artist, a musical genius whose unique style and individual theories of harmony have created a new music. . . . Ellington’s genius as a composer, arranger and musician has won him the respect and admiration of such authorities as Percy Grainger, head of the department of music at the New York University; Basil Cameron, conductor of the Seattle Symphony Orchestra; Leopold Stokowski, famed conductor of the celebrated Philadelphia Orchestra; Paul Whiteman, whose name is synonymous with jazz, and many others.

Ellington was especially attached to the idea of serving as a spokesperson for African Americans — an aspiration admirable enough on the surface, but only if unencumbered by ego and self-inflation, something of which Ellington was far from innocent given the amount of personal publicity he poured into his objective. To support this goal, another publicity pamphlet emphasized his presentability in addition to his talent:

He is as genial as he is intelligent, always creates a good impression upon newspaper people with whom he comes in contact and invariably supplies them with good copy for their stories.

Ellington’s lifelong desire to “act on behalf of the race,” as he himself put it, was an expression of his own life’s contradictions — the son of a butler and the grandson of a slave, he carried himself with an air of regality; a high school dropout, he made a special effort to teach himself the etiquette and manners of high society. Teachout notes the effect of this deliberate application:

For all his polish, it was his artistry, not his personality, that was the source of his enduring appeal. But it was the personality that made white people who might not otherwise have done so give him a second glance, and in time it opened doors of opportunity through which few other blacks had been allowed to pass.

Arguably the most accurate, succinctly eloquent description of Ellington’s elusive personhood comes from Rex Stewart, cornetist of the Duke Ellington Orchestra:

Ellington is the most complex and paradoxical individual that I’ve ever known . . . a combination of Sir Galahad, Scrooge, Don Quixote, and God knows what other saints and sinners that were apt to pop out of his ever-changing personality.

Indeed, Ellington was a bundle of inner contradictions — the kind we all grapple with by virtue of being human, only his were far more numerous, more entangled, and more full of friction than average. Teachout writes:

He was at once deeply (if superstitiously) religious and a tireless philanderer who, in the words of an admiring friend, had the sexual appetite of “a romping, stomping alley cat.” He pretended to be a devoted family man for the benefit of the ever-vigilant press, but he deserted Edna, his first and only wife, later settling into a long-term relationship with a Cotton Club showgirl whom he chose not to marry (he never divorced Edna) and on whom he cheated as often as he liked.

In fact, one of Ellington’s most pressing publicity concerns was keeping his affairs out of the papers — information he felt would greatly compromise the very presentability and wholesomeness he worked so hard to craft in order to feel like he belonged in high society. As Teachout observes, he went to great lengths to make sure “his fans saw only what he wished them to see, and nothing more.” At one point, he even went as far as paying off gossip columnists and placing expensive ads in newspapers to prevent his relationship with Evie from being reported.

Teachout, however, takes great care not to dim the enormity of Ellington’s talent in light of his immutable imperfection, noting instead that he used the former as a vehicle for both exorcising and tucking away the latter:

He was, like Chopin, Paul Klee, Jorge Luis Borges, and Flannery O’Connor, a disciplined lyric miniaturist who knew how to express the grandest of emotions on the smallest of scales, and who needed no more room in which to suggest his immortal longings.

Originally featured in October — take a deeper dive here, then also see this fascinating excerpt on Duke’s diet.

Complement with Teachout’s Design Matters interview, where he talks to Debbie Millman about the book and Duke’s elusive “immortal longings”:

11. TURING: PIONEER OF THE INFORMATION AGE

It is to Alan Turing — godfather of the digital universe, voracious reader, tragic hero of his era’s inhumane bigotry — that we owe an enormous amount of today’s givens, including my writing this very sentence and your reading it. In Turing: Pioneer of the Information Age (public library), philosophy professor and Turing Archive for the History of Computing director B. Jack Copeland turns to conversations and correspondence with some of Turing’s closest friends and collaborators to explore the life and legacy of this man of uncommon genius with unprecedented depth and insight, from the invention of the Universal Turing Machine — the granddaddy of the modern stored-program computer — to Turing’s codebreaking feats during WWII to the tragic and mysterious circumstances of his death.

The first personal computer (Image courtesy Harry Huskey)

Copeland succinctly captures the magnitude of Turing’s contribution to contemporary life:

To Turing we owe the brilliant innovation of storing applications, and all the other programs necessary for computers to do our bidding, inside the computer’s memory, ready to be opened when we wish. We take for granted that we use the same slab of hardware to shop, manage our finances, type our memoirs, play our favorite music and videos, and send instant messages across the street or around the world. Like many great ideas, this one now seems as obvious as the wheel and the arch, but with this single invention — the stored-program universal computer — Turing changed the way we live.

Alan Turing

Indeed, it took an exceptional mind — one inhabiting the outermost fringes of the obvious, in every imaginable way — to conceive of such world-changing technology. Copeland goes on to paint a portrait of Turing more dimensional and moving than ever before:

He was a Spartan in all things, inner and outer, and had no time for pleasing decor, soft furnishings, superfluous embellishment, or unnecessary words. To him what mattered was the truth. Everything else was mere froth.

[…]

What would it have been like to meet him? Turing was tallish (5 feet 10 inches) and broadly built. He looked strong and fit. You might have mistaken his age, as he always seemed younger than he was. He was good-looking but strange. If you came across him at a party, you would certainly notice him. In fact, you might ask, ‘Who on earth is that?’ It wasn’t just his shabby clothes or dirty fingernails. It was the whole package. Part of it was the unusual noise he made. This has often been described as a stammer, but it wasn’t. It was his way of preventing people from interrupting him, while he thought out what he was trying to say. ‘Ah… Ah… Ah… Ah… Ah.’ He did it loudly.

If you crossed the room to talk to him, you would have probably found him gauche and rather reserved. He was decidedly lah-di-dah, but the reserve wasn’t standoffishness. He was shy, a man of few words. Polite small talk did not come easily to him. He might — if you were lucky — smile engagingly, his blue eyes twinkling, and come out with something quirky that would make you laugh. If conversation developed, you’d probably find him vivid and funny. He might ask you, in his rather high-pitched voice, whether you think a computer could ever enjoy strawberries and cream or could make you fall in love with it.

[…]

Like everyone else, Turing craved affection and company, but he never seemed to quite fit in anywhere. He was bothered by his own social strangeness — although, like his hair, it was a force of nature he could do little about. Occasionally he could be very rude. If he thought that someone wasn’t listening to him with sufficient attention, he would simply walk away. Turing was the sort of man who, usually unintentionally, ruffled people’s feathers — especially pompous people, people in authority, and scientific poseurs. … Beneath the cranky, craggy, irreverent exterior there was an unworldly innocence, though, as well as sensitivity and modesty.

Originally featured in April — read the original article here.

12. AUTOBIOGRAPHY OF MARK TWAIN, VOLUME 2

Autobiography of Mark Twain, Volume 2: The Complete and Authoritative Edition (public library) — the highly anticipated sequel to the excellent first installment — reveals previously unknown facets of the greatest American satirist, celebrated as “the Lincoln of literature.” A large part of what made Twain Twain was his capacity for cultural nitpicking, from his irreverent advice to little girls to his critique of the press to his snarky commentary on the outrageous requests he received, but one subject to which Twain applied his exquisite satire with absolute seriousness was religion — something that comes fully ablaze in this new volume.

In April of 1906, Twain — who famously believed that any claim of originality was merely misguided narcissism — offers this humorous lament on religion as a manifestation of human egotism:

The human race … sits up nine nights in the week to admire its own originality. The race has always been able to think well of itself, and it doesn’t like people who throw bricks at its naïve self-appreciation. It is sensitive upon this point. The other day I furnished a sentiment in response to a man’s request — to wit:

“The noblest work of God?” Man.

“Who found it out?” Man.

I thought it was very good, and smart, but the other person didn’t.

Twain treated all forms of dogmatic authority, from religious to parental, with equal irreverence. Spread from his ‘Advice to Little Girls’ illustrated by Vladimir Rudinsky. Click image for more.

In another meditation, dictated in 1906 and posthumously published in 1963 in the Hudson Review under the title “Reflections on Religion,” then eventually included in the altogether excellent The Bible According to Mark Twain: Irreverent Writings on Eden, Heaven, and the Flood by America’s Master Satirist, Twain revisits the subject of evidence-free idolatry of deistic character:

We deal in a curious and laughable confusion of notions concerning God. We divide Him in two, bring half of Him down to an obscure and infinitesimal corner of the world to confer salvation upon a little colony of Jews — and only Jews, no one else — and leave the other half of Him throned in heaven and looking down and eagerly and anxiously watching for results. We reverently study the history of the earthly half, and deduce from it the conviction that the earthly half has reformed, is equipped with morals and virtues, and in no way resembles the abandoned, malignant half that abides upon the throne. We conceive that the earthly half is just, merciful, charitable, benevolent, forgiving, and full of sympathy for the sufferings of mankind and anxious to remove them.

Apparently we deduce this character not by examining facts, but by diligently declining to search them, measure them, and weigh them. The earthly half requires us to be merciful, and sets us an example by inventing a lake of fire and brimstone in which all of us who fail to recognize and worship Him as God are to be burned through all eternity. And not only we, who are offered these terms, are to be thus burned if we neglect them, but also the earlier billions of human beings are to suffer this awful fate, although they all lived and died without ever having heard of Him or the terms at all. This exhibition of mercifulness may be called gorgeous. We have nothing approaching it among human savages, nor among the wild beasts of the jungle.

‘All gods are better than their reputation,’ inscription dated December 23, 1902 from a first edition of ‘A Double-Barrelled Detective Story’ (Kevin MacDonnell Collection)

An early proponent of the conviction that evidence should outweigh mythology, he continues:

There is no evidence that there is to be a Heaven hereafter. … Heaven exists solely upon hearsay evidence — evidence furnished by unknown persons; persons who did not prove that they had ever been there.

[…]

According to the hearsay evidence the character of every conspicuous god is made up of love, justice, compassion, forgiveness, sorrow for all suffering and desire to extinguish it. Opposed to this beautiful character — built wholly upon valueless hearsay evidence — it is the absolute authentic evidence furnished us every day in the year, and verifiable by our eyes and our other senses, that the real character of these gods is destitute of love, mercy, compassion, justice and other gentle and excellent qualities, and is made up of all imaginable cruelties, persecutions and injustices. The hearsay character rests upon evidence only — exceedingly doubtful evidence. The real character rests upon proof — proof unassailable.

Twain then traces the evolution — or, as it were, devolution — of religion over the course of human history, considering Christianity’s odds for survival:

Do I think the Christian religion is here to stay? Why should I think so? There had been a thousand religions before it was born. They are all dead. There had been millions of gods before ours was invented. Swarms of them are dead and forgotten long ago. Ours is by long odds the worst God that the ingenuity of man has begotten from his insane imagination — and shall He and his Christianity be immortal against the great array of probabilities furnished by the theological history of the past? No. I think that Christianity and its God must follow the rule. They must pass on in their turn and make room for another God and a stupider religion. Or perhaps a better [one] than this? No. That is not likely. History shows that in the matter of religions we progress backward and not the other way.

(More than a century later, legendary atheist Richard Dawkins would come to echo this sentiment in his newly published memoir, writing: “I learned from my mother that Christianity was one of many religions and they contradicted each other. They couldn’t all be right, so why believe the one in which, by sheer accident of birth, I happened to be brought up?”)

Originally featured in October — full article here.

13. THE SECRET HISTORY OF VLADIMIR NABOKOV

Vladimir Nabokov — beloved author, butterfly-lover, no-bullshit lecturer, hater of clichés, man of strong opinions — endures as Russia’s most revered literary émigré export. While his journey to cultural acclaim in America was in many ways a story of hope, it was also one underpinned by profound sadness and loss that would come to permeate his work. After the Bolshevik Revolution, when Nabokov was only eighteen, his family was forced to flee their hometown of St. Petersburg. As refugees in nomadic exile, they finally settled in Berlin in 1920. Two years later, Nabokov’s father, who had become secretary of the Russian Provisional Government, was killed by accident while trying to shield the real target of a political assassination. Shortly thereafter, Nabokov’s mother and sister moved to Prague, but he remained in Berlin and garnered considerable recognition as a poet. In 1923, he met Véra Evseyevna Slonim, the Jewish-Russian love of his life, with whom he’d remain for the rest of his days.

In The Secret History of Vladimir Nabokov (public library), Andrea Pitzer, founder of Harvard’s narrative nonfiction site Nieman Storyboard, shines an unprecedented, kaleidoscopic spotlight on the author’s largely enigmatic life and its complex political context. What few realize — and what Pitzer reveals through newly-declassified intelligence files and rigorously researched military records — is that Nabokov wove serious and unsettling political history into the fabric of his fiction, which had gone undetected for decades: until now.

Vladimir Nabokov’s United States Certificate of Naturalization
(Image courtesy Andrea Pitzer)

Originally featured in April, with the fascinating story of Nabokov’s travails with homeland security.

* * *

Honorable mentions: An Appetite for Wonder: The Making of a Scientist by Richard Dawkins, Salinger: The Private War of J.D. Salinger by David Shields and Shane Salerno, The Girls of Atomic City: The Untold Story of the Women Who Helped Win World War II by Denise Kiernan, and The Selected Letters of Willa Cather, edited by Andrew Jewell and Janis Stout.

BP

Helen Keller on Optimism

“The struggle which evil necessitates is one of the greatest blessings. It makes us strong, patient, helpful men and women. It lets us into the soul of things and teaches us that although the world is full of suffering, it is full also of the overcoming of it.”

Decades before the dawn of the positive psychology movement and a century before what neuroscience has taught us about the benefits of optimism, Helen Keller (June 27, 1880–June 1, 1968) — the remarkable woman who grew up without sight and hearing until, with the help of her teacher Annie Sullivan, she learned to speak, read, write, and inhabit the life of the mind with such grace and fierceness as to make her one of history’s most inspired intellectual heroes — penned a timeless treatise on optimism as a philosophy of life. Simply titled Optimism (public library | free ebook), it was originally published in 1903 and written — a moment of pause here — after Keller learned to write on a grooved board over a sheet of paper, using the grooves and her index finger to guide her pencil.

She opens the first half of the book, Optimism Within, by reflecting on the universal quest for happiness, that alluring and often elusive art-science at the heart of all human aspiration:

Could we choose our environment, and were desire in human undertakings synonymous with endowment, all men would, I suppose, be optimists. Certainly most of us regard happiness as the proper end of all earthly enterprise. The will to be happy animates alike the philosopher, the prince and the chimney-sweep. No matter how dull, or how mean, or how wise a man is, he feels that happiness is his indisputable right.

But Keller admonishes against the “what-if” mentality that pegs our happiness on the attainment of material possession, which always proves vacant, rather than on accessing a deeper sense of purpose:

Most people measure their happiness in terms of physical pleasure and material possession. Could they win some visible goal which they have set on the horizon, how happy they could be! Lacking this gift or that circumstance, they would be miserable. If happiness is to be so measured, I who cannot hear or see have every reason to sit in a corner with folded hands and weep. If I am happy in spite of my deprivations, if my happiness is so deep that it is a faith, so thoughtful that it becomes a philosophy of life, — if, in short, I am an optimist, my testimony to the creed of optimism is worth hearing.

Recounting her own miraculous blossoming from the inner captivity of a deaf-mute to the intellectual height of a cultural luminary, she brings exquisite earnestness to this rhetorical question:

Once I knew only darkness and stillness. Now I know hope and joy. Once I fretted and beat myself against the wall that shut me in. Now I rejoice in the consciousness that I can think, act and attain heaven. … Can anyone who escaped such captivity, who has felt the thrill and glory of freedom, be a pessimist?
My early experience was thus a leap from bad to good. If I tried, I could not check the momentum of my first leap out of the dark; to move breast forward as a habit learned suddenly at that first moment of release and rush into the light. With the first word I used intelligently, I learned to live, to think, to hope.

Still, Keller is careful to distinguish between intelligent and reckless optimism:

Optimism that does not count the cost is like a house builded on sand. A man must understand evil and be acquainted with sorrow before he can write himself an optimist and expect others to believe that he has reason for the faith that is in him.

Reflecting once again on her own experience, she argues that, much like the habits of mind William James advocated for as the secret of life, optimism is a choice:

I know what evil is. Once or twice I have wrestled with it, and for a time felt its chilling touch on my life; so I speak with knowledge when I say that evil is of no consequence, except as a sort of mental gymnastic. For the very reason that I have come in contact with it, I am more truly an optimist. I can say with conviction that the struggle which evil necessitates is one of the greatest blessings. It makes us strong, patient, helpful men and women. It lets us into the soul of things and teaches us that although the world is full of suffering, it is full also of the overcoming of it. My optimism, then, does not rest on the absence of evil, but on a glad belief in the preponderance of good and a willing effort always to cooperate with the good, that it may prevail. I try to increase the power God has given me to see the best in everything and every one, and make that Best a part of my life. The world is sown with good; but unless I turn my glad thoughts into practical living and till my own field, I cannot reap a kernel of the good.

Keller explores the two anchors of optimism — one’s inner life and the outer world — and admonishes against the toxic nature of doubt:

I demand that the world be good, and lo, it obeys. I proclaim the world good, and facts range themselves to prove my proclamation overwhelmingly true. To what good I open the doors of my being, and jealously shut them against what is bad. Such is the force of this beautiful and willful conviction, it carries itself in the face of all opposition. I am never discouraged by absence of good. I never can be argued into hopelessness. Doubt and mistrust are the mere panic of timid imagination, which the steadfast heart will conquer, and the large mind transcend.

Like Isabel Allende, who sees creativity as order to the chaos of life, Keller riffs on Carlyle and argues for creative enterprise as a source of optimism:

Work, production, brings life out of chaos, makes the individual a world, an order; and order is optimism.

And yet she is sure to caution against the cult of productivity, a reminder all the timelier today as we often squander presence in favor of productivity, and uses Darwin’s famed daily routine to make her point:

Darwin could work only half an hour at a time; yet in many diligent half-hours he laid anew the foundations of philosophy. I long to accomplish a great and noble task; but it is my chief duty and joy to accomplish humble tasks as though they were great and noble. It is my service to think how I can best fulfill the demands that each day makes upon me, and to rejoice that others can do what I cannot.

She sees optimism, like Italo Calvino did literature, as a collective enterprise:

I love the good that others do; for their activity is an assurance that whether I can help or not, the true and the good will stand sure.

Though her tone at times may appear to be overly religious on the surface, Keller’s skew is rather philosophical, demonstrating that, much as science has a spiritual quality, optimism is a kind of secular religion:

I trust, and nothing that happens disturbs my trust. I recognize the beneficence of the power which we all worship as supreme — Order, Fate, the Great Spirit, Nature, God. I recognize this power in the sun that makes all things grow and keeps life afoot. I make a friend of this indefinable force, and straightway I feel glad, brave and ready for any lot Heaven may decree for me. This is my religion of optimism.

[…]

Deep, solemn optimism, it seems to me, should spring from this firm belief in the presence of God in the individual; not a remote, unapproachable governor of the universe, but a God who is very near every one of us, who is present not only in earth, sea and sky, but also in every pure and noble impulse of our hearts, “the source and centre of all minds, their only point of rest.”

In the second half of the book, Optimism Without, she makes an eloquent addition to these notable definitions of philosophy and touches on the ancient quandary of whether what we perceive as external reality might be an illusion:

Philosophy is the history of a deaf-blind person writ large. From the talks of Socrates up through Plato, Berkeley and Kant, philosophy records the efforts of human intelligence to be free of the clogging material world and fly forth into a universe of pure idea. A deaf-blind person ought to find special meaning in Plato’s Ideal World. These things which you see and hear and touch are not the reality of realities, but imperfect manifestations of the Idea, the Principle, the Spiritual; the Idea is the truth, the rest is delusion.

Much like legendary filmmaker Andrei Tarkovsky advised the young to learn to enjoy their own company, Keller argues for philosophy as the gateway to finding richness in life without leaving one’s self — an art all the more important in the age of living alone. She writes:

My brethren who enjoy the fullest use of the senses are not aware of any reality which may not equally well be in reach of my mind. Philosophy gives to the mind the prerogative of seeing truth, and bears us into a realm where I, who am blind, am not different from you who see. … It seemed to me that philosophy had been written for my special consolation, whereby I get even with some modern philosophers who apparently think that I was intended as an experimental case for their special instruction! But in a little measure my small voice of individual experience does join in the declaration of philosophy that the good is the only world, and that world is a world of spirit. It is also a universe where order is All, where an unbroken logic holds the parts together, where distance defines itself as non-existence, where evil, as St. Augustine held, is delusion, and therefore is not. The meaning of philosophy to me is not only in its principles, but also in the happy isolation of its great expounders. They were seldom of the world, even when like Plato and Leibnitz they moved in its courts and drawing rooms. To the tumult of life they were deaf, and they were blind to its distraction and perplexing diversities. Sitting alone, but not in darkness, they learned to find everything in themselves…

In a sentiment Neil deGrasse Tyson would come to echo more than a century later in his articulate case for why our smallness amidst the cosmos should be a source of assurance rather than anxiety, Keller observes:

Thus from the philosophy I learn that we see only shadows and know only in part, and that all things change; but the mind, the unconquerable mind, compasses all truth, embraces the universe as it is, converts the shadows to realities and makes tumultuous changes seem but moments in an eternal silence, or short lines in the infinite theme of perfection, and the evil but “a halt on the way to good.” Though with my hand I grasp only a small part of the universe, with my spirit I see the whole, and in my thought I can compass the beneficent laws by which it is governed. The confidence and trust which these conceptions inspire teach me to rest safe in my life as in a fate, and protect me from spectral doubts and fears.

Keller regards America as a mecca of optimism. And yet, as hearteningly patriotic as her case may be, a look at the present plight of marriage equality, the gaping wound of income inequality, and the indignity of immigrants’ struggles (of whom I am one) reveals how much further we have to go to live up to this optimistic ideal:

It is true, America has devoted herself largely to the solution of material problems — breaking the fields, opening mines, irrigating the deserts, spanning the continent with railroads; but she is doing these things in a new way, by educating her people, by placing at the service of every man’s need every resource of human skill. She is transmuting her industrial wealth into the education of her workmen, so that unskilled people shall have no place in American life, so that all men shall bring mind and soul to the control of matter. Her children are not drudges and slaves. The Constitution has declared it, and the spirit of our institutions has confirmed it. The best the land can teach them they shall know. They shall learn that there is no upper class in their country, and no lower, and they shall understand how it is that God and His world are for everybody.

America might do all this, and still be selfish, still be a worshipper of Mammon. But America is the home of charity as well as commerce. … Who shall measure the sympathy, skill and intelligence with which she ministers to all who come to her, and lessens the ever-swelling tide of poverty, misery and degradation which every year rolls against her gates from all the nations? When I reflect on all these facts, I cannot but think that, Tolstoi and other theorists to the contrary, it is a splendid thing to be an American. In America the optimist finds abundant reason for confidence in the present and hope for the future, and this hope, this confidence, may well extend over all the great nations of the earth.

Further on, she adds, “It is significant that the foundation of that law is optimistic” — and yet what more pessimistic a law than an immigration policy based on the assumption that if left to their own devices, more immigrants would do harm than would do good, what sadder than a policy built on the belief that affording love the freedom of equality would result in destruction rather than dignity?

Still, some of Keller’s seemingly over-optimistic contentions have been since confirmed by modern science — for instance, the decline of violence, which she rightly observes:

If we compare our own time with the past, we find in modern statistics a solid foundation for a confident and buoyant world-optimism. Beneath the doubt, the unrest, the materialism, which surround us still glows and burns at the world’s best life a steadfast faith.

[…]

During the past fifty years crime has decreased. True, the records of to-day contain a longer list of crime. But our statistics are more complete and accurate than the statistics of times past. Besides, there are many offenses on the list which half a century ago would not have been thought of as crimes. This shows that the public conscience is more sensitive than it ever was.

Our definition of crime has grown stricter,* our punishment of it more lenient and intelligent. The old feeling of revenge has largely disappeared. It is no longer an eye for an eye, a tooth for a tooth. The criminal is treated as one who is diseased. He is confined not merely for punishment, but because he is a menace to society. While he is under restraint, he is treated with humane care and disciplined so that his mind shall be cured of its disease, and he shall be restored to society able to do his part of its work.

* Though this may be mostly true on a theoretical level, practical disgraces to democracy like the epidemic of rape in the military offer a tragic counterpoint.

In reflecting on the relationship between education and the good life, Keller argues for the broadening of education from an industrial model of rote memorization to fostering “scholars who can link the unlinkable”. Though this ideal, too, is a long way from reality today, Keller’s words shine as a timeless guiding light to aspire toward:

Education broadens to include all men, and deepens to teach all truths. Scholars are no longer confined to Greek, Latin and mathematics, but they also study the sciences. Science converts the dreams of the poet, the theory of the mathematician and the fiction of the economist into ships, hospitals and instruments that enable one skilled hand to perform the work of a thousand. The student of to-day is not asked if he has learned his grammar. Is he a mere grammar machine, a dry catalogue of scientific facts, or has he acquired the qualities of manliness? His supreme lesson is to grapple with great public questions, to keep his mind hospitable to new ideas and new views of truth, to restore the finer ideals that are lost sight of in the struggle for wealth and to promote justice between man and man. He learns that there may be substitutes for human labor — horse-power and machinery and books; but “there are no substitutes for common sense, patience, integrity, courage.”

In a sentiment philosopher Judith Butler would come to second in her fantastic recent commencement address on the value of the humanities as a tool of empathy, Keller argues:

The highest result of education is tolerance. Long ago men fought and died for their faith; but it took ages to teach them the other kind of courage — the courage to recognize the faiths of their brethren and their rights of conscience. Tolerance is the first principle of community; it is the spirit which conserves the best that all men think. No loss by flood and lightning, no destruction of cities and temples by the hostile forces of nature, has deprived man of so many noble lives and impulses as those which his intolerance has destroyed.

“However vast the darkness, we must supply our own light,” Stanley Kubrick memorably asserted, and it’s hard not to see in his words an echo of Keller’s legacy. She presages the kernel of Martin Seligman’s seminal concept of learned optimism and writes:

The test of all beliefs is their practical effect in life. If it be true that optimism compels the world forward, and pessimism retards it, then it is dangerous to propagate a pessimistic philosophy. One who believes that the pain in the world outweighs the joy, and expresses that unhappy conviction, only adds to the pain. … Life is a fair field, and the right will prosper if we stand by our guns.

Let pessimism once take hold of the mind, and life is all topsy-turvy, all vanity and vexation of spirit. … If I regarded my life from the point of view of the pessimist, I should be undone. I should seek in vain for the light that does not visit my eyes and the music that does not ring in my ears. I should beg night and day and never be satisfied. I should sit apart in awful solitude, a prey to fear and despair. But since I consider it a duty to myself and to others to be happy, I escape a misery worse than any physical deprivation.

In the final and most practical part of the book, The Practice of Optimism, Keller urges:

Who shall dare let his incapacity for hope or goodness cast a shadow upon the courage of those who bear their burdens as if they were privileges? The optimist cannot fall back, cannot falter; for he knows his neighbor will be hindered by his failure to keep in line. He will therefore hold his place fearlessly and remember the duty of silence. Sufficient unto each heart is its own sorrow. He will take the iron claws of circumstance in his hand and use them as tools to break away the obstacles that block his path. He will work as if upon him alone depended the establishment of heaven and earth.

She once again returns to the notion of optimism as a collective good rather than merely an individual choice, even a national asset:

Every optimist moves along with progress and hastens it, while every pessimist would keep the world at a standstill. The consequence of pessimism in the life of a nation is the same as in the life of the individual. Pessimism kills the instinct that urges men to struggle against poverty, ignorance and crime, and dries up all the fountains of joy in the world.

[…]

Optimism is the faith that leads to achievement; nothing can be done without hope.

In an ever-timelier remark in our age of fear-mongering sensationalism in the news — a remark E. B. White would come to second decades later in arguing that a writer “should tend to lift people up, not lower them down” — Keller points to the responsibility of the press in upholding its share of this collective enterprise:

Our newspapers should remember this. The press is the pulpit of the modern world, and on the preachers who fill it much depends. If the protest of the press against unrighteous measures is to avail, then for ninety-nine days the word of the preacher should be buoyant and of good cheer, so that on the hundredth day the voice of censure may be a hundred times strong.

Keller ends on a note of inextinguishable faith in the human spirit and timeless hope for the future of our world:

As I stand in the sunshine of a sincere and earnest optimism, my imagination “paints yet more glorious triumphs on the cloud-curtain of the future.” Out of the fierce struggle and turmoil of contending systems and powers I see a brighter spiritual era slowly emerge — an era in which there shall be no England, no France, no Germany, no America, no this people or that, but one family, the human race; one law, peace; one need, harmony; one means, labor…

Pair Optimism — which is available as a free download in multiple formats from Project Gutenberg — with these 7 heartening reads on the subject, then revisit Keller’s stirring first experience of dance and her memorable meeting with Mark Twain, who later became her creative champion and confidant.

BP
