“Do you have the courage to bring forth the treasures that are hidden within you?”
By Maria Popova
“When you’re an artist,” Amanda Palmer wrote in her magnificent manifesto for the creative life, “nobody ever tells you or hits you with the magic wand of legitimacy. You have to hit your own head with your own handmade wand.” The craftsmanship of that wand, which is perhaps the most terrifying and thrilling task of the creative person in any domain of endeavor, is what Elizabeth Gilbert explores in Big Magic: Creative Living Beyond Fear (public library) — a lucid and luminous inquiry into the relationship between human beings and the mysteries of the creative experience, as defined by Gilbert’s beautifully broad notion of “living a life that is driven more strongly by curiosity than by fear.” It’s an expansive definition that cracks open the possibilities within any human life, whether you’re a particle physicist or a postal worker or a poet — and the pursuit of possibility is very much at the heart of Gilbert’s mission to empower us to enter into creative endeavor the way one enters into a monastic order: “as a devotional practice, as an act of love, and as a lifelong commitment to the search for grace and transcendence.”
This, I believe, is the central question upon which all creative living hinges: Do you have the courage to bring forth the treasures that are hidden within you?
Surely something wonderful is sheltered inside you. I say this with all confidence, because I happen to believe we are all walking repositories of buried treasure. I believe this is one of the oldest and most generous tricks the universe plays on us human beings, both for its own amusement and for ours: The universe buries strange jewels deep within us all, and then stands back to see if we can find them.
The hunt to uncover those jewels — that’s creative living.
The courage to go on that hunt in the first place — that’s what separates a mundane existence from a more enchanted one.
The often surprising results of that hunt — that’s what I call Big Magic.
That notion of summoning the courage to bring forth one’s hidden treasures is one Gilbert borrowed from Jack Gilbert — a brilliant poet to whom she is related not by genealogy but by creative kinship, graced with the astonishing coincidence of their last names and a university teaching position they both occupied a generation apart. She reflects on the poet’s unusual creative ethos:
“We must risk delight,” he wrote. “We must have the stubbornness to accept our gladness in the ruthless furnace of this world.”
He seemed to live in a state of uninterrupted marvel, and he encouraged [his students] to do the same. He didn’t so much teach them how to write poetry, they said, but why: because of delight. Because of stubborn gladness. He told them that they must live their most creative lives as a means of fighting back against the ruthless furnace of this world.
Most of all, though, he asked his students to be brave. Without bravery, he instructed, they would never be able to realize the vaulting scope of their own capacities. Without bravery, they would never know the world as richly as it longs to be known. Without bravery, their lives would remain small — far smaller than they probably wanted their lives to be.
But this notion of bravery seeds a common confusion, which Gilbert takes care to dispel:
We all know that when courage dies, creativity dies with it. We all know that fear is a desolate boneyard where our dreams go to desiccate in the hot sun. This is common knowledge; sometimes we just don’t know what to do about it.
Creativity is a path for the brave, yes, but it is not a path for the fearless, and it’s important to recognize the distinction.
If your goal in life is to become fearless, then I believe you’re already on the wrong path, because the only truly fearless people I’ve ever met were straight-up sociopaths and a few exceptionally reckless three-year-olds — and those aren’t good role models for anyone.
Bravery, Gilbert suggests, is the product of a certain kind of obstinacy in the face of fear — and that obstinacy, rather than one’s occupation, is what defines the creative life:
While the paths and outcomes of creative living will vary wildly from person to person, I can guarantee you this: A creative life is an amplified life. It’s a bigger life, a happier life, an expanded life, and a hell of a lot more interesting life. Living in this manner — continually and stubbornly bringing forth the jewels that are hidden within you — is a fine art, in and of itself.
To be sure, Gilbert — whose writing lives in the Venn diagram of Brené Brown, Dani Shapiro, Cheryl Strayed, and David Whyte — is a far cry from the self-help canon of authoritarian advice dictated by a detached expert. What makes her book so immensely helpful is precisely its lived and living nature. She writes:
The only reason I can speak so authoritatively about fear is that I know it so intimately. I know every inch of fear, from head to toe. I’ve been a frightened person my entire life. I was born terrified. I’m not exaggerating; you can ask anyone in my family, and they’ll confirm that, yes, I was an exceptionally freaked-out child. My earliest memories are of fear, as are pretty much all the memories that come after my earliest memories.
Growing up, I was afraid not only of all the commonly recognized and legitimate childhood dangers (the dark, strangers, the deep end of the swimming pool), but I was also afraid of an extensive list of completely benign things (snow, perfectly nice babysitters, cars, playgrounds, stairs, Sesame Street, the telephone, board games, the grocery store, sharp blades of grass, any new situation whatsoever, anything that dared to move, etc., etc., etc.).
I was a sensitive and easily traumatized creature who would fall into fits of weeping at any disturbance in her force field. My father, exasperated, used to call me Pitiful Pearl. We went to the Delaware shore one summer when I was eight years old, and the ocean upset me so much that I tried to get my parents to stop all the people on the beach from going into the surf.
I can’t help but see in this tragicomic anecdote a magnificent metaphor for the psychology of trolling. The impulse to attack others who have dared to put themselves and their art into the world springs from the same fear-seed. What is trolling, after all, if not a concentrated effort to stop others from going into the surf — not because trolls try to protect the rest of the world from the perils of bad art but because they seek to protect themselves from the fear that if they dare plunge into the surf, their own art might wash ashore lifeless.
Kierkegaard knew this when he contemplated the psychology of trolling two centuries ago, and Neil Gaiman knew it when he delivered his spectacular speech on courage and the creative life. All of us know this on some primordial level when we contemplate the metaphorical surf, for every time we decide to swim we must also allow for the possibility of sinking, which would seem decidedly less mortifying if there weren’t other people swimming while we sink.
Gilbert considers the somewhat mysterious, somewhat perfectly sensical stimulus that eventually sent her on a life-path of plunging into the surf:
Over the years, I’ve often wondered what finally made me stop playing the role of Pitiful Pearl, almost overnight. Surely there were many factors involved in that evolution (the tough-mom factor, the growing-up factor), but mostly I think it was just this: I finally realized that my fear was boring.
Around the age of fifteen, I somehow figured out that my fear had no variety to it, no depth, no substance, no texture. I noticed that my fear never changed, never delighted, never offered a surprise twist or an unexpected ending. My fear was a song with only one note — only one word, actually — and that word was “STOP!” My fear never had anything more interesting or subtle to offer than that one emphatic word, repeated at full volume on an endless loop: “STOP, STOP, STOP, STOP!”
I also realized that my fear was boring because it was identical to everyone else’s fear. I figured out that everyone’s song of fear has exactly that same tedious lyric: “STOP, STOP, STOP, STOP!” True, the volume may vary from person to person, but the song itself never changes, because all of us humans were equipped with the same basic fear package when we were being knitted in our mothers’ wombs.
Far from being a uniquely human faculty, this fear is there for a reason — an evolutionary mechanism that aided us in our survival, much as it has aided every living creature that made it to this point of evolutionary history. Gilbert writes:
If you pass your hand over a petri dish containing a tadpole, the tadpole will flinch beneath your shadow. That tadpole cannot write poetry, and it cannot sing, and it will never know love or jealousy or triumph, and it has a brain the size of a punctuation mark, but it damn sure knows how to be afraid of the unknown.
And yet the human gift, Gilbert reminds us, is the willingness to march forward — in terror and transcendence, and often alone — even though we too flinch beneath the shadow of the unknown:
Creativity is sacred, and it is not sacred.
What we make matters enormously, and it doesn’t matter at all.
We toil alone, and we are accompanied by spirits.
We are terrified, and we are brave.
Art is a crushing chore and a wonderful privilege.
Only when we are at our most playful can divinity finally get serious with us.
“It’s easy to be a hero when you’re only risking your life.”
By Maria Popova
“To be an artist,” Louise Bourgeois wrote in her diary, “is a guarantee to your fellow humans that the wear and tear of living will not let you become a murderer.” But how is an artist to be when the world itself becomes murderous? Nowhere is the wear and tear of living more trying for the creative spirit — for the human spirit — than in war. And yet generations of artists have not only persevered through some of the ugliest, most gruesome periods in our civilizational history but have helped the rest of us endure by bolstering the spirit of their fellow humans. Bourgeois herself lived through two world wars, as did the artist she considered the greatest master: Pablo Picasso (October 25, 1881–April 8, 1973), a man whose unflinching creative courage became nothing short of heroism under the duress of war.
Despite frequent harassment by the Gestapo, Picasso refused to leave Nazi-occupied Paris. He was forbidden from exhibiting or publishing, all of his books were banned, and even the reproduction of his work was prohibited — but he continued to make art. When the Germans outlawed bronze casting, he went on making sculptures with bronze smuggled by the French Resistance — a symbolic act which the deflated creative community saw as an emboldening beam of hope.
In Conversations with Picasso (public library) — the same vintage treasure that gave us the artist’s views on success, where ideas come from, and the glory of dust — Hungarian photographer Brassaï recounts a 1943 conversation with the French poet Jacques Prévert, another friend of Picasso’s. Brassaï, who had been visiting the artist’s studio and interviewing him for over a decade, reflects on the bravery with which Picasso had withstood the occupation of Paris since the Nazis had taken over three years earlier:
At the time of the invasion, he could have left if he had wanted, could have gone anywhere he wished, to Mexico, Brazil, the United States. He didn’t lack for money or opportunities or invitations. Even during the Occupation, the United States consul requested several times that he leave France. But he stayed. His presence among us is a comfort and a spur, not only for those of us who are his friends, but even for those who don’t know him.
Prévert agrees and considers the true meaning of heroism:
It was an act of courage. The man is not a hero. He is afraid, just like anyone who has something to say or defend. It’s easy to be a hero when you’re only risking your life. For his part, he could, and still can, lose everything. Who knows what turn the war will take? Paris may be destroyed. He’s got a bad record with the Nazis, and could be interned, deported, taken hostage. Even his works — “degenerate” art, “Bolshevik” art — have already been condemned and may be burned at the stake. No one in the world, not the pope or the Holy Ghost, could prevent such an auto-da-fé. And the more desperate Hitler and his acolytes become, the more dangerous, deadly, and destructive their rage may be. Can Picasso guess how they might react? He has assumed the risk. He has come back to occupied Paris. He is with us. Picasso is a great guy.
Brassaï reflects on the larger payoff of Picasso’s courageous resistance, recounting a very different kind of joyous invasion:
From one day to the next, Picasso’s studio was invaded. His courageous attitude made him a standard-bearer, and the whole world wanted to salute him as the symbol of recovered freedom. Poets, painters, art critics, museum directors, writers dressed in the uniform of Allied armies, officers or simple soldiers, climbed the steep staircase in a compact mob. There was a crush of people at his place. He has become just as popular in Red China, in Soviet Russia, as he was in the United States after his major exhibition in New York. And, for months, Picasso good-naturedly relished universal glory, graciously made himself available to journalists, to photographers, and even to the curious who wanted to see him “in the flesh.”
Conversations with Picasso is an indispensable read in its totality, full of the great artist’s ideas on every aspect of art and life, and strewn with cameos and recollections by some of his most influential contemporaries, including Henri Matisse, Simone de Beauvoir, Jean-Paul Sartre, and Henry Miller.
Joining the canon of insightful meta-diarists is Sarah Manguso with Ongoingness: The End of a Diary (public library) — a collection of fragmentary, piercing meditations on time, memory, the nature of the self, and the sometimes glorious, sometimes harrowing endeavor of filling each moment with maximum aliveness while simultaneously celebrating its presence and grieving its passage.
Looking back on the 800,000 words she produced over a quarter-century of journaling, Manguso offers an unusual meta-reflection exuding the concise sagacity of Zen teachings and the penetrating insight of Marshall McLuhan’s “probes.” She becomes, in fact, a kind of McLuhan of the self, probing not the collective conscience but the individual psyche, yet extracting widely resonant human truth and transmuting it into enormously expansive wisdom.
Manguso traces the roots of her diaristic journey, which began as an almost compulsive hedge against forgetting, against becoming an absentee in her own life, against the anguishing anxiety that time was slipping from her grip:
I wrote so I could say I was truly paying attention. Experience in itself wasn’t enough. The diary was my defense against waking up at the end of my life and realizing I’d missed it.
The trouble was that I failed to record so much.
I’d write about a few moments, but the surrounding time — there was so much of it! So much apparent nothing I ignored, that I treated as empty time between the memorable moments.
I tried to record each moment, but time isn’t made of moments; it contains moments. There is more to it than moments.
So I tried to pay close attention to what seemed like empty time.
I wanted to comprehend my own position in time so I could use my evolving self as completely and as usefully as possible. I didn’t want to go lurching around, half-awake, unaware of the work I owed the world, work I didn’t want to live without doing.
And yet this process of chronicling her orientation to the moment soon revealed that the recording itself was an editorial act — choosing which moments to record and which to omit is, as Susan Sontag observed of the fiction writer’s task of choosing which story to tell from among all the stories that could be told, a matter of becoming a storyteller of one’s own life; synthesizing the robust fact of time into a fragmentary selection of moments invariably produces a work of fiction. As Manguso puts it, the diary becomes “a series of choices about what to omit, what to forget.”
But alongside this pursuit of the fullness of the moment Manguso found a dark underbelly — a kind of leaning forward into the next moment before this one has come to completion. This particularly Western affliction has immensely varied symptoms, but Manguso found that in her own life its most perilous manifestation was the tendency to hop from one romantic relationship to another, oscillating between beginnings and endings, unable to inhabit the stillness of the middles. She writes:
I’d become intolerant of waiting. My forward momentum barely stopped for the length of the touch.
I thought my momentum led to the next person, but in fact it only led away from the last person.
My behavior was an attempt to stop time before it swept me up. It was an attempt to stay safe, free to detach before life and time became too intertwined for me to write down, as a detached observer, what had happened.
Once I understood what I was doing, with each commitment I wakened slightly more from my dream of pure potential.
It was a failure of my imagination that made me keep leaving people. All I could see in the world were beginnings and endings: moments to survive, record, and, once recorded, safely forget.
I knew I was getting somewhere when I began losing interest in the beginnings and the ends of things.
As her relationship to these markers of time changed, she became interested not in the “short tragic love stories” that had once bewitched her but in “the kind of love to which the person dedicates herself for so long, she no longer remembers quite how it began.” Eventually, she got married. Echoing Wendell Berry’s memorable meditation on marriage and freedom, she writes:
Marriage isn’t a fixed experience. It’s a continuous one. It changes form but is still always there, a rivulet under a frozen stream. Now, when I feel a break in the continuity of till death do us part, I think to myself, Get back in the river.
In a significant way, the stability of time inherent to such continuity was an experience foreign to Manguso and counter to the flow of impermanence that her diary recorded. This was a whole new way of measuring life not by its constant changes but by its unchanging constants:
In my diary I recorded what had changed since the previous day, but sometimes I wondered: What if I recorded only what hadn’t changed? Weather still fair. Cat still sweet. Cook oats in same pot. Continue reading same book. Make bed in same way, put on same blue jeans, water garden in same order … Would that be a better, truer record?
The record-keeping of truth, of course, is the domain of memory — and yet our memory is not an accurate recording device but, as legendary neurologist Oliver Sacks has pointed out over and over, a perpetually self-revising dossier. Manguso considers what full attentiveness to the present might look like when unimpeded by the tyranny of memory:
The least contaminated memory might exist in the brain of a patient with amnesia — in the brain of someone who cannot contaminate it by remembering it. With each recollection, the memory of it further degrades. The memory and maybe the fact of every kiss start disappearing the moment the two mouths part.
When I was twelve I realized that photographs were ruining my memory. I’d study the photos from an event and gradually forget everything that had happened between the shutter openings. I couldn’t tolerate so much lost memory, and I didn’t want to spectate my life through a viewfinder, so I stopped taking photographs. All the snapshots of my life for the next twenty years were shot by someone else. There aren’t many, but there are enough.
For Manguso, memory and its resulting record became stubborn self-defense not only against forgetting but also against being forgotten — a special case of our general lifelong confrontation with mortality:
My life, which exists mostly in the memories of the people I’ve known, is deteriorating at the rate of physiological decay. A color, a sensation, the way someone said a single word — soon it will all be gone. In a hundred and fifty years no one alive will ever have known me.
Being forgotten like that, entering that great and ongoing blank, seems more like death than death.
I assumed that maximizing the breadth and depth of my autobiographical memory would be good for me, force me to write and live with greater care, but in the last thing one writer ever published, when he was almost ninety years old, he wrote a terrible warning.
He said he’d liked remembering almost as much as he’d liked living but that in his old age, if he indulged in certain nostalgias, he would get lost in his memories. He’d have to wander them all night until morning.
He responded to my fan letter when he was ninety. When he was ninety-one, he died.
I just wanted to retain the whole memory of my life, to control the itinerary of my visitations, and to forget what I wanted to forget.
Good luck with that, whispered the dead.
Upon arriving at a view of death reminiscent of Alan Watts’s, Manguso revisits the limiting fragmentation of life’s ongoingness into beginnings and endings:
The experiences that demanded I yield control to a force greater than my will — diagnoses, deaths, unbreakable vows — weren’t the beginnings or the ends of anything. They were the moments when I was forced to admit that beginnings and ends are illusory. That history doesn’t begin or end, but it continues.
For just a moment, with great effort, I could imagine my will as a force that would not disappear but redistribute when I died, and that all life contained the same force, and that I needn’t worry about my impending death because the great responsibility of my life was to contain the force for a while and then relinquish it.
Then something happened — something utterly ordinary in the grand human scheme that had an extraordinary impact on Manguso’s private dance with memory and mortality: she became a mother. She writes:
I began to inhabit time differently.
I used to exist against the continuity of time. Then I became the baby’s continuity, a background of ongoing time for him to live against. I was the warmth and milk that was always there for him, the agent of comfort that was always there for him.
My body, my life, became the landscape of my son’s life. I am no longer merely a thing living in the world; I am a world.
Time kept reminding me that I merely inhabit it, but it began reminding me more gently.
As she awoke to this immutable continuity of life, Manguso became more acutely aware of those bewitched by beginnings. There is, of course, a certain beauty — necessity, even — to that beginner’s refusal to determine what is impossible before it is even possible. She writes:
My students still don’t know what they will never be. Their hope is so bright I can almost see it.
I used to value the truth of whether this student or that one would achieve the desired thing. I don’t value that truth anymore as much as I value their untested hope. I don’t care that one in two hundred of them will ever become what they feel they must become. I care only that I am able to witness their faith in what’s coming next.
But even that enlivening “untested hope” is a dialogic function of time and impermanence. Manguso captures the central challenge of memory, of attentiveness to life, of the diary itself:
The essential problem of ongoingness is that one must contemplate time as that very time, that very subject of one’s contemplation, disappears.
Left alone in time, memories harden into summaries. The originals become almost irretrievable.
Occasionally, a memory retains its stark original reality. Manguso recalls one particular incident from her son’s early childhood:
One day the baby gently sat his little blue dog in his booster seat and offered it a piece of pancake.
The memory should already be fading, but when I bring it up I almost choke on it — an incapacitating sweetness.
The memory throbs. Left alone in time, it is growing stronger.
The baby had never seen anyone feed a toy a pancake. He invented it. Think of the love necessary to invent that… An unbearable sweetness.
The feeling strengthens the more I remember it. It isn’t wearing smooth. It’s getting bigger, an outgrowth of new love.
Perhaps there is an element of “untested hope” in journaling itself — we are drawn to the practice because we hope that the diary will safe-keep precisely such throbbing, self-strengthening memories; that, in recording the unfolding ways in which we invent ourselves into personhood, it will become a constant reassurance of our own realness, a grownup version of The Velveteen Rabbit, reminding us that “real isn’t how you are made [but] a thing that happens to you.” Bearing witness to the happening itself, without trying to fragment it into beginnings and endings, is both the task of living and the anguish of the liver.
Manguso captures this elegantly:
Perhaps all anxiety might derive from a fixation on moments — an inability to accept life as ongoing.
The best thing about time passing is the privilege of running out of it, of watching the wave of mortality break over me and everyone I know. No more time, no more potential. The privilege of ruling things out. Finishing. Knowing I’m finished. And knowing time will go on without me.
Look at me, dancing my little dance for a few moments against the background of eternity.
She revisits her original tussle with time, memory, beginnings, and endings:
How ridiculous to believe myself powerful enough to stop time just by thinking.
Often I believe I’m working toward a result, but always, once I reach the result, I realize all the pleasure was in planning and executing the path to that result.
It comforts me that endings are thus formally unappealing to me — that more than beginning or ending, I enjoy continuing.
Seen in this way, the diary becomes not a bastion of memory but a white flag to forgetting, extended not in resignation but in celebration. Manguso writes:
I came to understand that the forgotten moments are the price of continued participation in life, a force indifferent to time.
Now I consider the diary a compilation of moments I’ll forget, their record finished in language as well as I could finish it — which is to say imperfectly.
Someday I might read about some of the moments I’ve forgotten, moments I’ve allowed myself to forget, that my brain was designed to forget, that I’ll be glad to have forgotten and be glad to rediscover as writing. The experience is no longer experience. It is writing. I am still writing.
And I’m forgetting everything. My goal now is to forget it all so that I’m clean for death. Just the vaguest memory of love, of participation in the great unity.
Time punishes us by taking everything, but it also saves us — by taking everything.
How to be alone, wake up from illusion, master the art of asking, fathom your place in the universe, and more.
By Maria Popova
After the year’s most intelligent and imaginative children’s books and best science books, here are my favorite books on psychology and philosophy published this year, along with the occasional letter and personal essay — genres that, at their most excellent, offer hearty helpings of both disciplines. Perhaps more precisely, these are the year’s finest books on how to live sane, creative, meaningful lives. (And since the subject is of the most timeless kind, revisit the selections from 2013, 2012, and 2011.)
1. A GUIDE FOR THE PERPLEXED
Werner Herzog is celebrated as one of the most influential and innovative filmmakers of our time, but his ascent to acclaim was far from a straight trajectory from privilege to power. Abandoned by his father at an early age, Herzog survived a WWII bombing that demolished the house next door to his childhood home and was raised by a single mother in near-poverty. He found his calling in filmmaking after reading an encyclopedia entry on the subject as a teenager and took a job as a welder in a steel factory in his late teens to fund his first films. These building blocks of his character — tenacity, self-reliance, imaginative curiosity — shine with blinding brilliance in the richest and most revealing of Herzog’s interviews. Werner Herzog: A Guide for the Perplexed (public library) — not to be confused with E.F. Schumacher’s excellent 1977 philosophy book of the same title — presents the director’s extensive, wide-ranging conversation with writer and filmmaker Paul Cronin. His answers are unfiltered and to-the-point, often poignant but always unsentimental, not rude but refusing to infest the garden of honest human communication with the Victorian-seeded, American-sprouted weed of pointless politeness.
Herzog’s insights coalesce into a kind of manifesto for following one’s particular calling, a form of intelligent, irreverent self-help for the modern creative spirit — indeed, even though Herzog is a humanist fully detached from religion, there is a strong spiritual undertone to his wisdom, rooted in what Cronin calls “unadulterated intuition” and spanning everything from what it really means to find your purpose and do what you love to the psychology and practicalities of worrying less about money to the art of living with presence in an age of productivity. As Cronin points out in the introduction, Herzog’s thoughts collected in the book are “a decades-long outpouring, a response to the clarion call, to the fervent requests for guidance.”
And yet in many ways, A Guide for the Perplexed could well have been titled A Guide from the Perplexed, for Herzog is as much a product of his “cumulative humiliations and defeats,” as he himself phrases it, as of his own “chronic perplexity,” to borrow E.B. White’s unforgettable term — Herzog possesses that rare, paradoxical combination of absolute clarity of conviction and wholehearted willingness to inhabit his own inner contradictions, to pursue life’s open-endedness with equal parts focus of vision and nimbleness of navigation.
A certain self-reliance permeates his films and his mind — a refusal to let the fear of failure inhibit trying, a sensibility the voiceover in the final scene of Herzog’s The Unprecedented Defence of the Fortress Deutschkreuz captures perfectly: “Even a defeat is better than nothing at all.”
2. HOW TO BE ALONE
If the odds of finding one’s soul mate are so dreadfully dismal and the secret of lasting love is largely a matter of concession, is it any wonder that a growing number of people choose to go solo? The choice of solitude, of active aloneness, has relevance not only to romance but to all human bonds — even Emerson, perhaps the most eloquent champion of friendship in the English language, lived a significant portion of his life in active solitude, the very state that enabled him to produce his enduring essays and journals. And yet that choice is one our culture treats with equal parts apprehension and contempt, particularly in our age of fetishistic connectivity. Hemingway’s famous assertion that solitude is essential for creative work is perhaps so oft-cited precisely because it is so radical and unnerving in its proposition. It is this choice that Sara Maitland examines in How to Be Alone (public library).
While Maitland lives in a region of Scotland with one of the lowest population densities in Europe, where the nearest supermarket is more than twenty miles away and there is no cell service (pause on that for a moment), she wasn’t always a loner — she grew up in a big, close-knit family as one of six children. It was only when she became transfixed by the notion of silence, the subject of her previous book, that she arrived, obliquely, at solitude. She writes:
I got fascinated by silence; by what happens to the human spirit, to identity and personality when the talking stops, when you press the off button, when you venture out into that enormous emptiness. I was interested in silence as a lost cultural phenomenon, as a thing of beauty and as a space that had been explored and used over and over again by different individuals, for different reasons and with wildly differing results. I began to use my own life as a sort of laboratory to test some ideas and to find out what it felt like. Almost to my surprise, I found I loved silence. It suited me. I got greedy for more. In my hunt for more silence, I found this valley and built a house here, on the ruins of an old shepherd’s cottage.
Maitland’s interest in solitude, however, is somewhat different from that in silence — while private in its origin, it springs from a public-facing concern about the need to address “a serious social and psychological problem around solitude,” a desire to “allay people’s fears and then help them actively enjoy time spent in solitude.” And so she does, posing the central, “slippery” question of this predicament:
Being alone in our present society raises an important question about identity and well-being.
How have we arrived, in the relatively prosperous developed world, at least, at a cultural moment which values autonomy, personal freedom, fulfillment and human rights, and above all individualism, more highly than they have ever been valued before in human history, but at the same time these autonomous, free, self-fulfilling individuals are terrified of being alone with themselves?
We live in a society which sees high self-esteem as a proof of well-being, but we do not want to be intimate with this admirable and desirable person.
We see moral and social conventions as inhibitions on our personal freedoms, and yet we are frightened of anyone who goes away from the crowd and develops “eccentric” habits.
We believe that everyone has a singular personal “voice” and is, moreover, unquestionably creative, but we treat with dark suspicion (at best) anyone who uses one of the most clearly established methods of developing that creativity — solitude.
We think we are unique, special and deserving of happiness, but we are terrified of being alone.
We are supposed now to seek our own fulfillment, to act on our feelings, to achieve authenticity and personal happiness — but mysteriously not do it on our own.
Today, more than ever, the charge carries both moral judgement and weak logic.
Maitland goes on to explore the underlying psychology of our unease from the fall of the Roman Empire to the rise of the “male spinster” and how to cultivate the five deepest rewards of solitude. Read more here.
3. WAKING UP
Nietzsche’s famous proclamation that “God is dead” is among modern history’s most oft-cited aphorisms, and yet, as is so often the case with such aphorisms, the quotation misses the broader context in a way that bespeaks the lazy reductionism with which we tend to approach questions of spirituality today. Nietzsche himself clarified the full dimension of his statement six years later, in a passage from Twilight of the Idols, where he explained that “God” simply signified the supersensory realm, or “true world,” and wrote: “We have abolished the true world. What has remained? The apparent one perhaps? Oh no! With the true world we have also abolished the apparent one.”
In Waking Up: A Guide to Spirituality Without Religion (public library | IndieBound), philosopher, neuroscientist, and mindful skeptic Sam Harris offers a contemporary addition to this lineage of human inquiry — an extraordinary and ambitious masterwork of integration between science and spirituality, which Harris himself describes as “by turns a seeker’s memoir, an introduction to the brain, a manual of contemplative instruction, and a philosophical unraveling of what most people consider to be the center of their inner lives.” Or, perhaps most aptly, an effort “to pluck the diamond from the dunghill of esoteric religion.”
Harris begins by recounting an experience he had at age sixteen — a three-day wilderness retreat designed to spur spiritual awakening of some sort, which instead left young Harris feeling like the contemplation of the existential mystery in the presence of his own company was “a source of perfect misery.” This frustrating experience became “a sufficient provocation” that launched him into a lifelong pursuit of the kinds of transcendent experiences that gave rise to the world’s major spiritual traditions, examining them instead with a scientist’s vital blend of skepticism and openness and a philosopher’s aspiration to be “scrupulously truthful.”
Our minds are all we have. They are all we have ever had. And they are all we can offer others… Every experience you have ever had has been shaped by your mind. Every relationship is as good or as bad as it is because of the minds involved.
Noting that the entirety of our experience, as well as our satisfaction with that experience, is filtered through our minds — “If you are perpetually angry, depressed, confused, and unloving, or your attention is elsewhere, it won’t matter how successful you become or who is in your life — you won’t enjoy any of it.” — Harris sets out to reconcile the quest to achieve one’s goals with a deeper longing, a recognition, perhaps, that presence is far more rewarding than productivity. He writes:
Most of us spend our time seeking happiness and security without acknowledging the underlying purpose of our search. Each of us is looking for a path back to the present: We are trying to find good enough reasons to be satisfied now.
Acknowledging that this is the structure of the game we are playing allows us to play it differently. How we pay attention to the present moment largely determines the character of our experience and, therefore, the quality of our lives.
4. LETTERS OF NOTE
Virginia Woolf called letter-writing “the humane art” — an epithet only amplified today, in an age when we so frequently mistake reaction for response and succumb to expectations of immediacy that render impossible the beautiful, contemplative mutuality at the heart of the notion of co-respondence. This, perhaps, is why yesteryear’s greatest letters appeal to us more irrepressibly than ever.
For years, Shaun Usher has been unearthing and highlighting brilliant, funny, poignant, exquisitely human letters from luminaries and ordinary people alike on his magnificent website. This year, the best of them were released in Letters of Note: Correspondence Deserving of a Wider Audience (public library | IndieBound) — the aptly titled, superb collection featuring contributions from such cultural icons as Virginia Woolf, Roald Dahl, Louis Armstrong, Kurt Vonnegut, Nick Cave, Richard Feynman, Jack Kerouac, and more.
5. THE RISE
“You gotta be willing to fail… if you’re afraid of failing, you won’t get very far,” Steve Jobs cautioned. “There is no such thing as failure — failure is just life trying to move us in another direction,” Oprah counseled new Harvard graduates. In his wonderfully heartening letter of fatherly advice, F. Scott Fitzgerald gave his young daughter Scottie a list of things to worry and not worry about in life; among the unworriables, he listed failure, “unless it comes through your own fault.” And yet, as Debbie Millman observed in Fail Safe, her magnificent illustrated-essay-turned-commencement-address, most of us “like to operate within our abilities” — stepping outside of them risks failure, and we do worry about it, very much. How, then, can we transcend that mental block, that existential worry, that keeps us from the very capacity for creative crash that keeps us growing and innovating?
That’s precisely what curator and art historian Sarah Lewis, who has under her belt degrees from Harvard and Oxford, curatorial positions at the Tate Modern and the MoMA, and an appointment on President Obama’s Arts Policy Committee, examines in The Rise: Creativity, the Gift of Failure, and the Search for Mastery (public library | IndieBound) — an exploration of how “discoveries, innovations, and creative endeavors often, perhaps even only, come from uncommon ground” and why this “improbable ground of creative endeavor” is an enormous source of advantages on the path to self-actualization and fulfillment, brought to life through a tapestry of tribulations turned triumphs by such diverse modern heroes as legendary polar explorer Captain Scott, dance icon Paul Taylor, and pioneering social reformer Frederick Douglass. Lewis, driven by her lifelong “magpie curiosity about how we become,” crafts her argument slowly, meticulously, stepping away from it like a sculptor gaining perspective on her sculpture and examining it through other eyes, other experiences, other particularities, which she weaves together into an intricate tapestry of “magpielike borrowings” filtered through the sieve of her own point of view.
Lewis begins with a visit with the women of Columbia University’s varsity archery team, who spend countless hours practicing a sport that requires equal parts impeccable precision of one’s aim and a level of comfort with the uncontrollable — all the environmental interferences, everything that could happen between the time the arrow leaves the bow and the time it lands on the target, having followed its inevitably curved line. From this unusual sport Lewis draws a metaphor for the core of human achievement:
There is little that is vocational about [contemporary] culture anymore, so it is rare to see what doggedness looks like with this level of exactitude… To spend so many hours with a bow and arrow is a kind of marginality combined with a seriousness of purpose rarely seen.
In the archers’ doggedness Lewis finds the central distinction that serves as a backbone of her book — far more important than success (hitting the bull’s-eye) is the attainment of mastery (“knowing it means nothing if you can’t do it again and again”), and in bridging the former with the latter lives the substance of true achievement. (The distinction isn’t unlike what psychologist Carol Dweck found in her pioneering work on the difference between “fixed” and “growth” mindsets.) Lewis writes:
Mastery requires endurance. Mastery, a word we don’t use often, is not the equivalent of what we might consider its cognate — perfectionism — an inhuman aim motivated by a concern with how others view us. Mastery is also not the same as success — an event-based victory based on a peak point, a punctuated moment in time. Mastery is not merely a commitment to a goal, but to a curved-line, constant pursuit.
This is why, Lewis argues, a centerpiece of mastery is the notion of failure. She cites Edison, who famously said of his countless fruitless attempts to create a feasible lightbulb: “I have not failed, I’ve just found 10,000 ways that won’t work.” In fact, Lewis points out that embedded in the very word “failure” — a word originally synonymous with bankruptcy, devised to assess creditworthiness in the 19th century, “a seeming dead end forced to fit human worth” — is the bias of our limited understanding of its value:
The word failure is imperfect. Once we begin to transform it, it ceases to be that any longer. The term is always slipping off the edges of our vision, not simply because it’s hard to see without wincing, but because once we are ready to talk about it, we often call the event something else — a learning experience, a trial, a reinvention — no longer the static concept of failure.
In its stead, Lewis offers another 19th-century alternative: “blankness,” which beautifully captures the wide-open field of possibility for renewal, for starting from scratch, after an unsuccessful attempt. Still, she considers the challenge of pinning down into plain language a concept so complex and fluid — even fashionable concepts like grit fail to do failure justice:
Trying to find a precise word to describe the dynamic is fleeting, like attempting to locate francium, an alkali metal measured but never isolated in any weighted quantity or seen in a way that the eye can detect — one of the most unstable, enigmatic elements on the Earth. No one knows what it looks like in an appreciable form, but there it is, scattered throughout ores in the Earth’s crust. Many of us have a similar sense that these implausible rises must be possible, but the stories tend to stay strewn throughout our lives, never coalescing into a single dynamic concept… The phenomenon remains hidden, and little discussed. Partial ideas do exist — resilience, reinvention, and grit — but there’s no one word to describe the passing yet vital, constant truth that just when it looks like winter, it is spring.
When we don’t have a word for an inherently fleeting idea, we speak about it differently, if at all. There are all sorts of generative circumstances — flops, folds, wipeouts, and hiccups — yet the dynamism it inspires is internal, personal, and often invisible… It is a cliché to say simply that we learn the most from failure. It is also not exactly true. Transformation comes from how we choose to speak about it in the context of story, whether self-stated or aloud.
One essential element of understanding the value of failure is the notion of the “deliberate incomplete.” (Cue in Marie Curie, who famously noted in a letter to her brother: “One never notices what has been done; one can only see what remains to be done.”) Lewis writes:
We thrive, in part, when we have purpose, when we still have more to do. The deliberate incomplete has long been a central part of creation myths themselves. In Navajo culture, some craftsmen and women sought imperfection, giving their textiles and ceramics an intended flaw called a “spirit line” so that there is a forward thrust, a reason to continue making work. Nearly a quarter of twentieth century Navajo rugs have these contrasting-color threads that run out from the inner pattern to just beyond the border that contains it; Navajo baskets and often pottery have an equivalent line called a “heart line” or a “spirit break.” The undone pattern is meant to give the weaver’s spirit a way out, to prevent it from getting trapped and reaching what we sense is an unnatural end.
There is an inevitable incompletion that comes with mastery. It occurs because the greater our proficiency, the more smooth our current path, the more clearly we may spot the mountain that hovers in our gaze. “What would you say increases with knowledge?” Jordan Elgrably once asked James Baldwin. “You learn how little you know,” Baldwin said.
A related concept is that of the “near win” — those moments when we come so close to our aim, yet miss it by a hair:
At the point of mastery, when there seems nothing left to move beyond, we find a way to move beyond ourselves. Success motivates. Yet the near win — the constant auto-correct of a curved-line path — can propel us in an ongoing quest. We see it whenever we aim, climb, or create with mastery as our aim, when the outcome is determined by what happens at the margins.
Lewis goes on to illustrate these concepts with living examples from the stories of such pioneering figures as the great polar explorer Captain Scott, dance icon Paul Taylor, and pioneering social reformer Frederick Douglass. Read more here.
6. THE ACCIDENTAL UNIVERSE
It says something about physicist and writer Alan Lightman — the very first person to receive dual appointments in science and the humanities at MIT — that a book of his is not only among the best science books of the year, but also a masterwork of philosophy. But that is precisely what The Accidental Universe: The World You Thought You Knew (public library | IndieBound) is — a spectacular journey to the frontiers of theoretical physics, exploring how the possibility of multiple universes illuminates the heart of the human experience and our quest for Beauty, Truth, and Meaning. Lightman’s enchanting writing reveals him not only as a scientist of towering expertise, but also as an insightful philosopher and poet of the cosmos, partway between Seneca and Carl Sagan.
In the foreword, Lightman recounts attending a lecture by the Dalai Lama at MIT, “one of the world’s spiritual leaders sitting cross-legged in a modern temple of science,” and hearing about the Buddhist concept of sunyata, translated as “emptiness” — the notion that objects in the physical universe are vacant of inherent meaning and that we imbue them with meaning and value with the thoughts of our own minds. From this, Lightman argues, arises a central challenge of the human condition, one whose articulation adds to history’s finest definitions of science:
As a scientist, I firmly believe that atoms and molecules are real (even if mostly empty space) and exist independently of our minds. On the other hand, I have witnessed firsthand how distressed I become when I experience anger or jealousy or insult, all emotional states manufactured by my own mind. The mind is certainly its own cosmos. As Milton wrote in Paradise Lost, “[The mind] can make a heaven of hell or a hell of heaven.” In our constant search for meaning in this baffling and temporary existence, trapped as we are within our three pounds of neurons, it is sometimes hard to tell what is real. We often invent what isn’t there. Or ignore what is. We try to impose order, both in our minds and in our conceptions of external reality. We try to connect. We try to find truth. We dream and we hope. And underneath all of these strivings, we are haunted by the suspicion that what we see and understand of the world is only a tiny piece of the whole.
Science does not reveal the meaning of our existence, but it does draw back some of the veils.
7. SMALL VICTORIES
Amid Anne Lamott’s moving reflections on grief, grace, and gratitude is one especially enchanting essay titled “The Book of Welcome,” in which she considers the uncomfortable art of letting yourself be seen:
Trappings and charm wear off… Let people see you. They see your upper arms are beautiful, soft and clean and warm, and then they will see this about their own, some of the time. It’s called having friends, choosing each other, getting found, being fished out of the rubble. It blows you away, how this wonderful event ever happened — me in your life, you in mine.
Two parts fit together. This hadn’t occurred all that often, but now that it does, it’s the wildest experience. It could almost make a believer out of you. Of course, life will randomly go to hell every so often, too. Cold winds arrive and prick you; the rain falls down your neck; darkness comes. But now there are two of you: Holy Moly.
8. THE TRUTH ABOUT TRUST
Unlike many other puzzles we confront, questions of trust don’t just involve attempting to grasp and analyze a perplexing concept. They all share another characteristic: risk. So while it’s true that we turn our attention to many complex problems throughout our lives, finding the answers to most doesn’t usually involve navigating the treacherous landscape of our own and others’ competing desires.
Trust implies a seeming unknowable — a bet of sorts, if you will. At its base is a delicate problem centered on the balance between two dynamic and often opposing desires — a desire for someone else to meet your needs and his desire to meet his own.
But despite what pop culture may tell us, decades’ worth of attempts to decode the signals of trustworthiness — sought in everything from facial expression to voice to handwriting — have proven virtually useless, and the last five years of research have rendered previous assertions about certain nonverbal cues wrong. (No, a sideways glance doesn’t automatically indicate that the person is lying to you.) As DeSteno wryly observes, “If polygraphs were foolproof, we wouldn’t need juries.” He explains what makes measures of trust especially complicated:
Unlike many forms of communication, issues of trust are often characterized by a competition or battle…. It’s not always an adaptive strategy to be an open book to others, or even to ourselves. Consequently, trying to discern if someone can be trusted is fundamentally different from trying to assess characteristics like mathematical ability. … Deciding to be trustworthy depends on the momentary balance between competing mental forces pushing us in opposite directions, and being able to predict which of those forces is going to prevail in any one instance is a complicated business.
Contrary to long-held doctrine, isolated gestures and expressions aren’t reliable indicators of what a person feels or intends to do. Two types of context — what I call configural and situational — are essential for correct interpretation. And they’ve been missing in most attempts to discover what trustworthiness and its opposite look like.
To figure out this multifaceted puzzle, DeSteno, whose lab studies how emotional states shape our social and moral behavior, took a cross-disciplinary approach, turning to the work of economists, computer scientists, security officers, physiologists and other psychologists, and enlisting the direct help of social psychologist David Pizarro and economist Robert Frank. With combined expertise spanning behavioral economics, evolutionary biology, nonverbal behavior, and emotional biases in decision making, they built, with equal parts rigor and humility, the richest framework for understanding trust that science has yet produced. Specifically, they focused on the two main components of trust — how it works and whether we’re able to predict who deserves it. DeSteno writes:
In the end, what emerged are not only new insights into how to detect the trustworthiness of others, but also an entirely new way to think about how trust influences our lives, our success, and our interactions with those around us.
9. THE ART OF ASKING
“Have compassion for everyone you meet, even if they don’t want it,” Lucinda Williams sang from my headphones into my heart one rainy October morning on the train to Hudson. “What seems cynicism is always a sign, always a sign…” I was headed to Hudson for a conversation with a very different but no less brilliant musician, and a longtime kindred spirit — the talented and kind Amanda Palmer. In an abandoned schoolhouse across the street from her host’s home, we sat down to talk about her magnificent and culturally necessary new book, The Art of Asking: How I Learned to Stop Worrying and Let People Help (public library | IndieBound) — a beautifully written inquiry into why we have such a hard time accepting compassion in all of its permutations, from love to what it takes to make a living, what lies behind our cynicism in refusing it, and how learning to accept it makes possible the greatest gifts of our shared humanity.
I am partial, perhaps, because my own sustenance depends on accepting help. But I also deeply believe and actively partake in both the yin and the yang of that vitalizing osmosis of giving and receiving that keeps today’s creative economy alive, binding artists and audiences, writers and readers, musicians and fans, into the shared cause of creative culture. “It’s only when we demand that we are hurt,” Henry Miller wrote in contemplating the circles of giving and receiving in 1942, but we still seem woefully caught in the paradoxical trap of too much entitlement to what we feel we want and too little capacity to accept what we truly need. The unhinging of that trap is what Amanda explores with equal parts deep personal vulnerability, profound insight into the private and public lives of art, and courageous conviction about the future of creative culture.
The most urgent clarion call echoing throughout the book, which builds on Amanda’s terrific TED talk, is for loosening our harsh and narrow criteria for what it means to be an artist, and, most of all, for undoing our punishing ideas about what renders one a not-artist, or — worse yet — a not-artist-enough. Amanda writes of the anguishing Impostor Syndrome epidemic such limiting notions spawn:
People working in the arts engage in street combat with The Fraud Police on a daily basis, because much of our work is new and not readily or conventionally categorized. When you’re an artist, nobody ever tells you or hits you with the magic wand of legitimacy. You have to hit your own head with your own handmade wand. And you feel stupid doing it.
There’s no “correct path” to becoming a real artist. You might think you’ll gain legitimacy by going to university, getting published, getting signed to a record label. But it’s all bullshit, and it’s all in your head. You’re an artist when you say you are. And you’re a good artist when you make somebody else experience or feel something deep or unexpected.
But in the history of creative genius, this pathology appears to be a rather recent development — the struggle to be an artist, of course, is nothing new, but the struggle to believe that one is an artist seems to be a uniquely modern malady. In one of the most revelatory passages in the book, Amanda points out a little-known biographical detail about the life of Henry David Thoreau — he who decided to live the self-reliant life by Walden Pond and memorably proclaimed: “If the day and the night are such that you greet them with joy, and life emits a fragrance like flowers and sweet-scented herbs, is more elastic, more starry, more immortal — that is your success.” It is a detail that, today, would undoubtedly render Thoreau the target of that automatic privilege narrative as we point a finger and call him a “poser”:
Thoreau wrote in painstaking detail about how he chose to remove himself from society to live “by his own means” in a little 10-foot x 15-foot hand-hewn cabin on the side of a pond. What he left out of Walden, though, was the fact that the land he built on was borrowed from his wealthy neighbor, that his pal Ralph Waldo Emerson had him over for dinner all the time, and that every Sunday, Thoreau’s mother and sister brought over a basket of freshly-baked goods for him, including donuts.
The idea of Thoreau gazing thoughtfully over the expanse of transcendental Walden Pond, a bluebird alighting onto his threadbare shoe, all the while eating donuts that his mom brought him just doesn’t jibe with most people’s picture of him as a self-reliant, noble, marrow-sucking back-to-the-woods folk-hero.
If Thoreau lived today, steeped in a culture that tells him taking the donuts chips away at his credibility, would he have taken them? And why don’t we? Amanda writes:
Taking the donuts is hard for a lot of people.
It’s not the act of taking that’s so difficult, it’s more the fear of what other people are going to think when they see us slaving away at our manuscript about the pure transcendence of nature and the importance of self-reliance and simplicity. While munching on someone else’s donut.
Maybe it comes back to that same old issue: we just can’t see what we do as important enough to merit the help, the love.
Try to picture getting angry at Einstein devouring a donut brought to him by his assistant, while he sat slaving on the theory of relativity. Try to picture getting angry at Florence Nightingale for snacking on a donut while taking a break from tirelessly helping the sick.
To the artists, creators, scientists, non-profit-runners, librarians, strange-thinkers, start-uppers and inventors, to all people everywhere who are afraid to accept the help, in whatever form it’s appearing,
Please, take the donuts.
To the guy in my opening band who was too ashamed to go out into the crowd and accept money for his band,
Take the donuts.
To the girl who spent her twenties as a street performer and stripper living on less than $700 a month who went on to marry a best-selling author who she loves, unquestioningly, but even that massive love can’t break her unwillingness to accept his financial help, please….
Just take the fucking donuts.
But Thoreau, it turns out, got one thing right in his definition of success, which echoes through Amanda’s words a century and a half later:
The happiest artists I know are generally the ones who can manage to make a reasonable living from their art without having to worry too much about the next paycheck. Not to say that every artist who sits around the campfire, or plays in tiny bars, is “happier” than those singing in stadiums — but more isn’t always better. If feeling the connection between yourself and others is the ultimate goal, it can be harder when you are separated from the crowd by a 30-foot barrier. And it can be easier to do — though riskier — when they’re sitting right beside you. The ideal sweet spot is the one in which the artist can freely share their talents and directly feel the reverberations of their artistic gifts to their community. In other words, it works best when everybody feels seen.
As artists, and as humans: If your fear is scarcity, the solution isn’t necessarily abundance.
Read more and watch my conversation with Palmer here.
10. LEONARDO’S BRAIN
One September day in 2008, Leonard Shlain found himself having trouble buttoning his shirt with his right hand. He was admitted into the emergency room, diagnosed with Stage 4 brain cancer, and given nine months to live. Shlain — a surgeon by training and a self-described “synthesizer by nature” with an intense interest in the ennobling intersection of art and science, author of the now-legendary Art & Physics — had spent the previous seven years working on what he considered his magnum opus: a sort of postmortem brain scan of Leonardo da Vinci, performed six centuries after his death and fused with a detective story about his life, exploring what the unique neuroanatomy of the man commonly considered humanity’s greatest creative genius might reveal about the essence of creativity itself.
Shlain finished the book on May 3, 2009. He died a week later. His three children — Kimberly, Jordan, and filmmaker Tiffany Shlain — spent the next five years bringing their father’s final legacy to life. The result is Leonardo’s Brain: Understanding Da Vinci’s Creative Genius (public library | IndieBound) — an astonishing intellectual, and at times spiritual, journey into the center of human creativity via the particular brain of one undereducated, left-handed, nearly ambidextrous, vegetarian, pacifist, gay, singularly creative Renaissance male, who Shlain proposes was able to attain a different state of consciousness than “practically all other humans.”
Noting that “a writer is always refining his ideas,” Shlain points out that the book is a synthesis of his three previous books, and an effort to live up to Kafka’s famous proclamation that “a book must be the axe for the frozen sea inside us.” It is also a beautiful celebration of the idea that art and science belong together and enrich one another whenever they converge.
Shlain argues that Leonardo — who painted the eternally mysterious Mona Lisa, created visionary anatomical drawings long before medical anatomy existed, made observations of bird flight in greater detail than any previous scientist, mastered engineering, architecture, mathematics, botany, and cartography, might be considered history’s first true scientist long before the word was coined for Mary Somerville, presaged Newton’s Third Law, Bernoulli’s law, and elements of chaos theory, and was a deft composer who sang “divinely,” among countless other domains of mastery — is the individual most worthy of the title “genius” in both science and art:
The divergent flow of art and science in the historical record provides evidence of a distinct compartmentalization of genius. The river of art rarely intersected with the meander of science.
Although both art and science require a high degree of creativity, the difference between them is stark. For visionaries to change the domain of art, they must make a breakthrough that can only be judged through the lens of posterity. Great science, on the other hand, must be able to predict the future. If a scientist’s hypotheses cannot be turned into a law that can be verified by future investigators, it is not scientifically sound. Another contrast: Art and science represent the difference between “being” and “doing.” Art’s raison d’être is to evoke an emotion. Science seeks to solve problems by advancing knowledge.
Leonardo’s story continues to compel because he represents the highest excellence all of us lesser mortals strive to achieve — to be intellectually, creatively, and emotionally well-rounded. No other individual in the known history of the human species attained such distinction both in science and art as the hyper-curious, undereducated, illegitimate country boy from Vinci.
Using a wealth of available information from Leonardo’s notebooks, various biographical resources, and some well-reasoned speculation, Shlain goes on to perform a “posthumous brain scan” seeking to illuminate the unique wiring of Da Vinci’s brain and how it explains his unparalleled creativity.
“Faith is the ability to honor stillness at some moments,” Alan Lightman wrote in his sublime meditation on science and spirituality, “and at others to ride the passion and exuberance.” In his conversation with E.O. Wilson, the poet Robert Hass described beauty as a “paradox of stillness and motion.” But in our Productivity Age of perpetual motion, it’s increasingly hard — yet increasingly imperative — to honor stillness, to build pockets of it into our lives, so that our faith in beauty doesn’t become half-hearted, lopsided, crippled. The delicate bridling of that paradox is what novelist and essayist Pico Iyer explores in The Art of Stillness: Adventures in Going Nowhere (public library | IndieBound) — a beautifully argued case for the unexpected pleasures of “sitting still as a way of falling in love with the world and everything in it,” revealed through one man’s sincere record of learning to “take care of his loved ones, do his job, and hold on to some direction in a madly accelerating world.”
Iyer begins by recounting a snaking drive up the San Gabriel Mountains outside Los Angeles to visit his boyhood hero — legendary singer-songwriter Leonard Cohen. In 1994, shortly after the most revealing interview he ever gave, Cohen had moved to the Mt. Baldy Zen Center to embark on five years of seclusion, serving as personal assistant to the great Japanese Zen teacher Kyozan Joshu Sasaki, then in his late eighties. Midway through his time at the Zen Center, Cohen was ordained as a Rinzai Zen Buddhist monk and given the Dharma name Jikan — Pali for “silence.” Iyer writes:
I’d come up here in order to write about my host’s near-silent, anonymous life on the mountain, but for the moment I lost all sense of where I was. I could hardly believe that this rabbinical-seeming gentleman in wire-rimmed glasses and wool cap was in truth the singer and poet who’d been renowned for thirty years as an international heartthrob, a constant traveler, and an Armani-clad man of the world.
Cohen, who once described the hubbub of his ordinary state of mind as “very much like the waiting room at the DMV,” had sought in the sequestered Zen community a more extreme, more committed version of a respite most of us long for in the midst of modern life — at least at times, at least on some level, and often wholeheartedly, achingly. Iyer reflects on Cohen’s particular impulse and what it reveals about our shared yearning:
Leonard Cohen had come to this Old World redoubt to make a life — an art — out of stillness. And he was working on simplifying himself as fiercely as he might on the verses of one of his songs, which he spends more than ten years polishing to perfection. The week I was visiting, he was essentially spending seven days and nights in a bare meditation hall, sitting stock-still. His name in the monastery, Jikan, referred to the silence between two thoughts.
One evening — four in the morning, the end of December — Cohen took time out from his meditations to walk down to my cabin and try to explain what he was doing here.
Sitting still, he said with unexpected passion, was “the real deep entertainment” he had found in his sixty-one years on the planet. “Real profound and voluptuous and delicious entertainment. The real feast that is available within this activity.”
Was he kidding? Cohen is famous for his mischief and ironies.
He wasn’t, I realized as he went on. “What else would I be doing?” he asked. “Would I be starting a new marriage with a young woman and raising another family? Finding new drugs, buying more expensive wine? I don’t know. This seems to me the most luxurious and sumptuous response to the emptiness of my own existence.”
Typically lofty and pitiless words; living on such close terms with silence clearly hadn’t diminished his gift for golden sentences. But the words carried weight when coming from one who seemed to have tasted all the pleasures that the world has to offer.
Iyer beholds his encounter with Cohen with the same incredulous amazement that most of us modern cynics experience, at first reluctantly, when confronted with something or someone incomprehensibly earnest, for nothing dissolves snark like unflinching sincerity. For Cohen, Iyer observes, the Zen practice was not a matter of “piety or purity” but of practical salvation and refuge from “the confusion and terror that had long been his bedfellows.” Iyer writes:
Sitting still with his aged Japanese friend, sipping Courvoisier, and listening to the crickets deep into the night, was the closest he’d come to finding lasting happiness, the kind that doesn’t change even when life throws up one of its regular challenges and disruptions.
“Nothing touches it,” Cohen said, as the light came into the cabin, of sitting still… Going nowhere, as Cohen described it, was the grand adventure that makes sense of everywhere else.
We’ve lost our Sundays, our weekends, our nights off — our holy days, as some would have it; our bosses, junk mailers, our parents can find us wherever we are, at any time of day or night. More and more of us feel like emergency-room physicians, permanently on call, required to heal ourselves but unable to find the prescription for all the clutter on our desk.
Not many years ago, it was access to information and movement that seemed our greatest luxury; nowadays it’s often freedom from information, the chance to sit still, that feels like the ultimate prize. Stillness is not just an indulgence for those with enough resources — it’s a necessity for anyone who wishes to gather less visible resources. Going nowhere, as Cohen had shown me, is not about austerity so much as about coming closer to one’s senses.
If the notion of mental illness in animals seems like far-fetched anthropocentrism, a field of science that has been gathering momentum for more than 150 years strongly suggests otherwise. That’s precisely what Senior TED Fellow Laurel Braitman explores in Animal Madness: How Anxious Dogs, Compulsive Parrots, and Elephants in Recovery Help Us Understand Ourselves (public library | IndieBound). Braitman, who holds a Ph.D. in history and anthropology of science from MIT, argues that we humans are far from unique in our capacity for “emotional thunderstorms that make our lives more difficult” and that nonhuman animals are bedeviled by varieties of mental illness strikingly similar to our own. With equal parts rigor and compassion, she examines evidence from veterinary science, psychology and pharmacology research, first-hand accounts by neuroscientists, zoologists, animal trainers, and other experts, the work of legendary scientists and philosophers like Charles Darwin and René Descartes, and her own experience with dozens of animals spanning a multitude of species and mental health issues, from depressed dogs to self-harming dolphins to canine Alzheimer’s and PTSD.
Braitman’s journey begins with one particularly troubled nonhuman animal — Oliver, the Bernese Mountain Dog she adopted, whose “extreme fear, anxiety, and compulsions” prompted her, in the way that a concerned parent on the verge of despair grasps for answers, to explore whether and how other animals could be mentally ill. Considering the tapestry of evidence threads she uncovered during her research, she writes:
Humans and other animals are more similar than many of us might think when it comes to mental states and behaviors gone awry — experiencing churning fear, for example, in situations that don’t call for it, feeling unable to shake a paralyzing sadness, or being haunted by a ceaseless compulsion to wash our hands or paws. Abnormal behaviors like these tip into the territory of mental illness when they keep creatures — human or not — from engaging in what is normal for them. This is true for a dog single-mindedly focused on licking his tail until it’s bare and oozy, a sea lion fixated on swimming in endless circles, a gorilla too sad and withdrawn to play with her troop members, or a human so petrified of escalators he avoids department stores.
Every animal with a mind has the capacity to lose hold of it from time to time. Sometimes the trigger is abuse or mistreatment, but not always. I’ve come across depressed and anxious gorillas, compulsive horses, rats, donkeys, and seals, obsessive parrots, self-harming dolphins, and dogs with dementia, many of whom share their exhibits, homes, or habitats with other creatures who don’t suffer from the same problems. I’ve also gotten to know curious whales, confident bonobos, thrilled elephants, contented tigers, and grateful orangutans. There is plenty of abnormal behavior in the animal world, captive, domestic, and wild, and plenty of evidence of recovery; you simply need to know where and how to find it.
Braitman is careful to acknowledge that such a notion is likely to unnerve our notions of human exceptionalism and offers a wise caveat:
Acknowledging parallels between human and other animal mental health is a bit like recognizing capacities for language, tool use, and culture in other creatures. That is, it’s a blow to the idea that humans are the only animals to feel or express emotion in complex and surprising ways. It is also anthropomorphic, the projection of human emotions, characteristics, and desires onto nonhuman beings or things. We can choose, though, to anthropomorphize well and, by doing so, make more accurate interpretations of animals’ behavior and emotional lives. Instead of self-centered projection, anthropomorphism can be a recognition of bits and pieces of our human selves in other animals and vice versa.
Braitman goes on to trace how our evolving understanding of animal psychology, from Charles Darwin to Jane Goodall, sheds invaluable light on things of deep concern to us humans — notions like anxiety, altruism, depression, and happiness. Read more here.
Slingerland frames the paradoxical premise at the heart of his book with an illustrative example: a game called Mindball at his local science museum in Vancouver, in which two players sit opposite one another, each wearing an electrode-equipped headband that registers general activity in the brain, and try to mentally push a metal ball from the center of the table to the other player; whoever does this first wins. There is, of course, a rub:
The motive force — measured by each player’s electrodes, and conveyed to the ball by a magnet hidden underneath the table — is the combination of alpha and theta waves produced by the brain when it’s relaxed: the more alpha and theta waves you produce, the more force you mentally exert on the ball. Essentially, Mindball is a contest of who can be the most calm. It’s fun to watch. The players visibly struggle to relax, closing their eyes, breathing deeply, adopting vaguely yogic postures. The panic they begin to feel as the ball approaches their end of the table is usually balanced out by the overeagerness of their opponent, both players alternately losing their cool as the big metal ball rolls back and forth. You couldn’t wish for a better, more condensed illustration of how difficult it is to try not to try.
Our lives, Slingerland argues, are often like “a massive game of Mindball,” when we find ourselves continually caught in this loop of trying so hard that we stymie our own efforts. Like in Mindball, where victory only comes when the player relaxes and stops trying to win, we spend our lives “preoccupied with effort, the importance of working, striving, and trying,” only to find that the more we try to will things into manifesting, the more elusive they become. Slingerland writes:
Our excessive focus in the modern world on the power of conscious thought and the benefits of willpower and self-control causes us to overlook the pervasive importance of what might be called “body thinking”: tacit, fast, and semiautomatic behavior that flows from the unconscious with little or no conscious interference. The result is that we too often devote ourselves to pushing harder or moving faster in areas of our life where effort and striving are, in fact, profoundly counterproductive.
Some of the most elusive objects of our incessant pursuits are happiness and spontaneity, both of which are strikingly resistant to conscious pursuit. Two ancient Chinese concepts might be our most powerful tools for resolving this paradox — wu-wei (pronounced oooo-way) and de (pronounced duh). Slingerland explains:
Wu-wei literally translates as “no trying” or “no doing,” but it’s not at all about dull inaction. In fact, it refers to the dynamic, effortless, and unselfconscious state of mind of a person who is optimally active and effective. People in wu-wei feel as if they are doing nothing, while at the same time they might be creating a brilliant work of art, smoothly negotiating a complex social situation, or even bringing the entire world into harmonious order. For a person in wu-wei, proper and effective conduct follows as automatically as the body gives in to the seductive rhythm of a song. This state of harmony is both complex and holistic, involving as it does the integration of the body, the emotions, and the mind. If we have to translate it, wu-wei is probably best rendered as something like “effortless action” or “spontaneous action.” Being in wu-wei is relaxing and enjoyable, but in a deeply rewarding way that distinguishes it from cruder or more mundane pleasures.
“Anxiety … makes others feel as you might when a drowning man holds on to you,” Anaïs Nin wrote. “Anxiety may be compared with dizziness. He whose eye happens to look down the yawning abyss becomes dizzy,” Kierkegaard observed. “There is no question that the problem of anxiety is a nodal point at which the most various and important questions converge, a riddle whose solution would be bound to throw a flood of light on our whole mental existence,” Freud proclaimed in his classic introductory lectures on psychoanalysis. And yet the riddle of anxiety is far from solved — rather, it has swelled into a social malady pulling countless among us underwater daily. Among those most mercilessly fettered by anxiety’s grip is Scott Stossel, familiar to most as the editor of The Atlantic. In his superb mental health memoir, My Age of Anxiety: Fear, Hope, Dread, and the Search for Peace of Mind (public library | IndieBound), Stossel follows in the tradition of Montaigne to use the lens of his own experience as a prism for illuminating insight on the quintessence of our shared struggles with anxiety. From his personal memoir he weaves a cultural one, painting a portrait of anxiety through history, philosophy, religion, popular culture, literature, and a wealth of groundbreaking research in psychology and neuroscience.
Why? Because anxiety and its related psychoemotional disorders turn out to be the most prevalent and undertreated form of clinically classified mental illness today, even more common than depression. Stossel contextualizes the issue with some striking statistics that reveal the cost — both financial and social — of anxiety:
According to the National Institute of Mental Health, some forty million Americans, nearly one in seven of us, are suffering from some kind of anxiety disorder at any given time, accounting for 31 percent of the expenditures on mental health care in the United States. According to recent epidemiological data, the “lifetime incidence” of anxiety disorder is more than 25 percent — which, if true, means that one in four of us can expect to be stricken by debilitating anxiety at some point in our lifetimes. And it is debilitating: Recent academic papers have argued that the psychic and physical impairment tied to living with an anxiety disorder is equivalent to living with diabetes — usually manageable, sometimes fatal, and always a pain to deal with. A study published in The American Journal of Psychiatry in 2006 found that Americans lose a collective 321 million days of work because of anxiety and depression each year, costing the economy $50 billion annually; a 2001 paper published by the U.S. Bureau of Labor Statistics once estimated that the median number of days missed each year by American workers who suffer from anxiety or stress disorders is twenty-five. In 2005 — three years before the recent economic crisis hit — Americans filled fifty-three million prescriptions for just two antianxiety drugs: Ativan and Xanax. (In the weeks after 9/11, Xanax prescriptions jumped 9 percent nationally — and by 22 percent in New York City.) In September 2008, the economic crash caused prescriptions in New York City to spike: as banks went belly up and the stock market went into free fall, prescriptions for anti-depressant and antianxiety medications increased 9 percent over the year before, while prescriptions for sleeping pills increased 11 percent.
Few people today would dispute that chronic stress is a hallmark of our times or that anxiety has become a kind of cultural condition of modernity. We live, as has been said many times since the dawn of the atomic era, in an age of anxiety — and that, cliché though it may be, seems only to have become more true in recent years as America has been assaulted in short order by terrorism, economic calamity and disruption, and widespread social transformation.
Fittingly, Alan Watts’s The Wisdom of Insecurity: A Message for an Age of Anxiety, written in the very atomic era that sparked the dawn of our present predicament, remains one of the best meditations on the subject. But, as Stossel points out, the notion of anxiety as a clinical category only appeared as recently as thirty years ago. He traces anxiety’s rise to cultural fame through the annals of academic history, pointing out that there were only three academic papers published on the subject in 1927, only fourteen in 1941, and thirty-seven in 1950. It wasn’t until psychologist Rollo May published his influential treatise on anxiety in 1950 that academia paid heed. Today, a simple Google Scholar search returns nearly three million results, and entire academic journals are dedicated to anxiety.
But despite anxiety’s catapulting into cultural concern, our understanding of it — especially as far as mental health stereotypes are concerned — remains developmentally stunted, having evolved very little since the time of seventeenth-century Jewish-Dutch philosopher Baruch Spinoza, who asserted that anxiety was a mere problem of logic and could thus be resolved with tools of reason. Stossel counters such oversimplification with a case for layered, complex causality of the disorder:
The truth is that anxiety is at once a function of biology and philosophy, body and mind, instinct and reason, personality and culture. Even as anxiety is experienced at a spiritual and psychological level, it is scientifically measurable at the molecular level and the physiological level. It is produced by nature and it is produced by nurture. It’s a psychological phenomenon and a sociological phenomenon. In computer terms, it’s both a hardware problem (I’m wired badly) and a software problem (I run faulty logic programs that make me think anxious thoughts). The origins of a temperament are many faceted; emotional dispositions that may seem to have a simple, single source — a bad gene, say, or a childhood trauma — may not.