“Oftentimes a word shall speak what accumulated volumes have labored in vain to utter: there may be years of crowded passion in a word, and half a life in a sentence.”
By Maria Popova
“You can never be sure / you die without knowing / whether anything you wrote was any good / if you have to be sure don’t write,” W.S. Merwin wrote in his gorgeous poem encapsulating his greatest mentor’s advice. No one has embodied this ethos more fully than Emily Dickinson (December 10, 1830–May 15, 1886), who lived and died a century earlier never knowing whether anything she wrote was any good, never knowing whether and how and that her body of work would revolutionize literature and rewrite the common record of human thought and feeling.
In her thirty-first year, on the pages of a national magazine, Dickinson — a central figure in Figuring, from which this essay is adapted — encountered the person who would become the closest thing she ever had to a literary mentor.
In the spring of 1862, exactly four decades ahead of Rilke’s Letters to a Young Poet, The Atlantic Monthly published a twenty-page piece titled “A Letter to a Young Contributor” by the abolitionist and women’s rights advocate Thomas Wentworth Higginson (December 22, 1823–May 9, 1911).
Addressing young writers — primarily the many women who sent the Atlantic manuscripts for consideration under male pseudonyms — the thirty-nine-year-old Higginson writes:
No editor can ever afford the rejection of a good thing, and no author the publication of a bad one. The only difficulty lies in drawing the line.
A good editor, Higginson asserts, has learned to draw that line by having “educated his eye till it has become microscopic, like a naturalist’s, and can classify nine out of ten specimens by one glance at a scale or a feather.” He chooses a strangely morbid metaphor to illustrate the editorial challenge and thrill of finding that rare undiscovered genius among “the vast range of mediocrity”:
To take the lead in bringing forward a new genius is as fascinating a privilege as that of the physician who boasted to Sir Henry Halford of having been the first man to discover the Asiatic cholera and to communicate it to the public.
He goes on to offer a bundle of advice on how an aspiring writer is to court her prospective editor: Revise amply before sending in your manuscript; write legibly with “good pens, black ink, nice white paper and plenty of it”; develop a style of expression not “polite and prosaic” but “so saturated with warm life and delicious association that every sentence shall palpitate and thrill with the mere fascination of the syllables”; counterbalance profundity of sentiment with levity of style; know that “there is no severer test of literary training than in the power to prune out your most cherished sentence, when you find that the sacrifice will help the symmetry or vigor of the whole”; don’t show off your erudition but showcase its fruits; and remember that “a phrase may outweigh a library.” He writes:
There may be phrases which shall be palaces to dwell in, treasure-houses to explore; a single word may be a window from which one may perceive all the kingdoms of the earth and the glory of them. Oftentimes a word shall speak what accumulated volumes have labored in vain to utter: there may be years of crowded passion in a word, and half a life in a sentence… Labor, therefore, not in thought alone, but in utterance; clothe and reclothe your grand conception twenty times, if need be, until you find some phrase that with its grandeur shall be lucid also.
In a sun-filled bedroom fifty miles to the west, a woman who had crowded lifetimes of passion into her thirty-one years and corked it up in the volcanic bosom of her being devoured the piece — a woman who would boldly defy Higginson’s indictment that a writer should use dashes only in “short allowance” or else they “will lose all their proper power,” a woman whose reclusive genius would become his choleric discovery.
For more than a decade, Dickinson had been welding her words to her experience with white heat in the private furnace of her being, sharing her poems only with her intimates. Now she felt beckoned to step across the threshold of the door Higginson had set ajar with his open letter inviting unknown writers into the public life of literature.
On April 16, 1862, Emily Dickinson sent Thomas Wentworth Higginson four of her poems, along with a short, arresting note in the slanted swoop of her barely decipherable hand, stripped of the era’s epistolary etiquette. “Mr. Higginson,” she addressed him bluntly, with no formal salutation, “Are you too deeply occupied to say if my Verse is alive?” She was likely making an allusion, whether conscious or not, to her revered Aurora Leigh, in which Elizabeth Barrett Browning’s heroine exults in her calling while struggling to become a published poet:
My heart’s life throbbing in my verse to show
And then Dickinson added:
The Mind is so near itself — it cannot see, distinctly — and I have none to ask. Should you think it breathed — and had you the leisure to tell me, I should feel quick gratitude.
She didn’t sign the letter, either, but instead enclosed a smaller sealed envelope with her name inscribed in pencil on a cream-colored notecard — a choice that would still puzzle Higginson thirty years later.
Two more letters followed shortly. Dickinson ended the third with the come-hither of a bespoke verse, then asked seductively: “Will you be my Preceptor, Mr. Higginson?” He would, and he did, commencing a correspondence that would last the poet’s lifetime.
But although Dickinson had so insistently enlisted Higginson as her “Preceptor,” again and again she would reject his efforts to tame and commercialize her poetry, to make it “more orderly,” buoyed by a quiet confidence in the integrity of her unorthodox verse. “Could you tell me how to grow,” she implored in her third letter to Higginson, “or is it unconveyed — like Melody — or Witchcraft?” When he offered criticism, then worried that he might have been too harsh, she assured him with humility and aplomb that it was all welcome: “Men do not call the surgeon, to commend — the Bone, but to set it, Sir, and fracture within, is more critical.” And then she promptly sent him four more poems, unheeding of his editorial suggestions.
Over the years, Dickinson would fracture Higginson’s stiff understanding of art, and through the cracks a new kind of light would flood his world. “There is always one thing to be grateful for — that one is one’s self & not somebody else,” she would tell him. Here stood a writer who was unassailably her own self. Between her unruly punctuation, Higginson would eventually find “flashes of wholly original and profound insight into nature and life,” language ablaze with “an extraordinary vividness of descriptive and imaginative power.” When her poems finally entered the world on November 12, 1890 — four years after her death — Higginson exulted in the preface:
In many cases these verses will seem to the reader like poetry torn up by the roots, with rain and dew and earth still clinging to them, giving a freshness and a fragrance not otherwise to be conveyed. In other cases, as in the few poems of shipwreck or of mental conflict, we can only wonder at the gift of vivid imagination by which this recluse woman can delineate, by a few touches, the very crises of physical or mental conflict… But the main quality of these poems is that of extraordinary grasp and insight, uttered with an uneven vigor sometimes exasperating, seemingly wayward, but really unsought and inevitable.
The volume was an astonishing success, much to the chagrin of Houghton Mifflin, who had originally rejected it. Five hundred copies vanished from the shelves on the first day of publication. Within the first year, the book had gone through eleven printings, and nearly eleven thousand copies had been absorbed into the body of culture.
That year, as the rapids of Dickinson’s verse sprang into the world, William James’s groundbreaking Principles of Psychology coined the notion of stream of consciousness. Soon, as English reviewers launched upon Dickinson attacks unequaled since those on Shelley and Keats a century earlier, Alice James — William James’s brilliant bedridden sister — would write wryly in her diary, itself an unheralded triumph of literature:
It is reassuring to hear the English pronouncement that Emily Dickinson is fifth-rate, they have such a capacity for missing quality; the robust evades them equally with the subtle… What tome of philosophy resumes the cheap farce or expresses the highest point of view of the aspiring soul more completely than the following —
How dreary to be somebody
How public, like a frog
To tell your name the livelong day
To an admiring bog.
“To come close to art means to come close to life, and if an appreciation of the dignity of man is the moral definition of democracy, then its psychological definition arises out of its determination to reconcile and combine knowledge and art, mind and life, thought and deed.”
“Progress is never permanent, will always be threatened, must be redoubled, restated and reimagined if it is to survive,” Zadie Smith wrote in her stirring essay on optimism and despair. But what does the reinvention, reassertion, and survival of progress look like when the basic fabric of democracy is under claw?
That is what Thomas Mann (June 6, 1875–August 12, 1955) examined on the cusp of World War II with a prescience that bellows across the decades to speak to our own epoch and to every epoch that will succeed us.
When Hitler seized power in 1933, the 58-year-old Mann, who had won the Nobel Prize in Literature five years earlier, went into exile in Switzerland. The following year, he visited America for the first time. He returned each year thereafter, until he finally emigrated permanently in 1938 and became one of a handful of German expatriates in the United States to vocally oppose Nazism and fascism. Between February and May 1938, just before the outbreak of the war, Mann gave a series of poignant and rousing lectures across America, published later that year as The Coming Victory of Democracy (public library) — a spirited insistence that “we must not be afraid to attempt a reform of freedom,” and a clarion call for the urgent work of continually renewing and reasserting democracy as menacing ideologies rise and fall against it.
America needs no instruction in the things that concern democracy. But instruction is one thing — and another is memory, reflection, re-examination, the recall to consciousness of a spiritual and moral possession of which it would be dangerous to feel too secure and too confident. No worth-while possession can be neglected. Even physical things die off, disappear, are lost, if they are not cared for, if they do not feel the eye and hand of the owner and are lost to sight because their possession is taken for granted. Throughout the world it has become precarious to take democracy for granted — even in America… Even America feels today that democracy is not an assured possession, that it has enemies, that it is threatened from within and from without, that it has once more become a problem. America is aware that the time has come for democracy to take stock of itself, for recollection and restatement and conscious consideration, in a word, for its renewal in thought and feeling.
In a sentiment that calls to mind Martha Graham’s notion of “divine dissatisfaction” as the motive force of all creative work, Mann notes that a certain restlessness about the state of the world and our place in it is inherent to the human animal:
It is the fate of man in no condition and under no circumstances ever to be entirely at ease upon this earth; no form of life is wholly suitable nor wholly satisfactory to him. Why this should be so, why there should always remain upon earth for this creature a modicum of insufficiency, of dissatisfaction and suffering, is a mystery — a mystery that may be a very honourable one for man, but also a very painful one; in any case it has this consequence: that humanity, in small things as in great, strives for variety, change, for the new, because it promises him an amelioration and an alleviation of his eternally semi-painful condition.
The greatest threat to democracy, Mann argues, comes from demagogues who prey on this restlessness with dangerous ideologies whose chief appeal is “the charm of novelty” — the exploitive promise of a new world order that allays some degree of dissatisfaction for some number of people, at a gruesome cost to the rest of humanity. To counter this perilous tendency, democracy must continually regenerate itself. Mann writes:
Daring and clever as fascism is in exploiting human weakness, it succeeds in meeting to some extent humanity’s painful eagerness for novelty… And what seems to me necessary is that democracy should answer this fascist strategy with a rediscovery of itself, which can give it the same charm of novelty — yes, a much higher one than that which fascism seeks to exert. It should put aside the habit of taking itself for granted, of self-forgetfulness. It should use this wholly unexpected situation — the fact, namely, that it has again become problematical — to renew and rejuvenate itself by again becoming aware of itself. For democracy’s resources of vitality and youthfulness cannot be overestimated… Fascism is a child of the times — a very offensive child — and draws whatever youth it possesses out of the times. But democracy is timelessly human, and timelessness always implies a certain amount of potential youthfulness, which need only be realized in thought and feeling in order to excel, by far, all merely transitory youthfulness in charms of every sort, in the charm of life and in the charm of beauty.
That particular strain of fascism was endemic to Mann’s time, but it has manifested in myriad guises countless times before and since. In a letter penned at the peak of the war Mann was hoping to prevent with this humanistic shift in consciousness, John Steinbeck would capture these cycles chillingly: “All the goodness and the heroisms will rise up again, then be cut down again and rise up. It isn’t that the evil thing wins — it never will — but that it doesn’t die.”
Mann considers the idea of justice as elemental to our humanity, locating in it the wellspring of our dignity:
It is a singular thing, this human nature, and distinguished from the rest of nature by the very fact that it has been endowed with the idea, is dominated by the idea, and cannot exist without it, since human nature is what it is because of the idea. The idea is a specific and essential attribute of man, that which makes him human. It is within him a real and natural fact, so impossible of neglect that those who do not respect human nature’s participation in the ideal — as force certainly does not — commit the clumsiest and, in the long run, the most disastrous mistakes. But the word “justice” is only one name for the idea — only one; there are other names which can be substituted that are equally strong, by no means lacking in vitality; on the contrary, even rather terrifying — for example, freedom and truth. It is impossible to decide which one should take precedence, which is the greatest. For each one expresses the idea in its totality, and one stands for the others. If we say truth, we also say freedom and justice; if we speak of freedom and justice, we mean truth. It is a complex of an indivisible kind, freighted with spirituality and elementary dynamic force. We call it the absolute. To man has been given the absolute — be it a curse or a blessing, it is a fact. He is pledged to it, his inner being is conditioned by it, and in the human sphere a force which is opposed to truth, hostile to freedom, and lacking in justice, acts in so low and contemptible a manner because it is devoid of feeling and understanding for the relationship between man and the absolute and without comprehension of the inviolable human dignity which grows out of this relationship.
A quarter century before the pioneering social scientist John Gardner penned his influential treatise on self-renewal, Mann calls for a reinvention of democracy that places human dignity at the heart of its political and civic ideals:
We must reach higher and envisage the whole. We must define democracy as that form of government and of society which is inspired above every other with the feeling and consciousness of the dignity of man.
Echoing Theodore Roosevelt’s admonition against the cowardice of cynicism as one of the greatest obstacles to a flourishing society, Mann calls for relinquishing our reflexive cynicism about human nature:
The dignity of man — do we not feel alarmed and somewhat ridiculous at the mention of these words? Do they not savour of optimism grown feeble and stuffy — of after-dinner oratory, which scarcely harmonizes with the bitter, harsh, everyday truth about human beings? We know it — this truth. We are well aware of the nature of man, or, to be more accurate, the nature of men — and we are far from entertaining any illusions on the subject… Yes, yes, humanity — its injustice, malice, cruelty, its average stupidity and blindness are amply demonstrated, its egoism is crass, its deceitfulness, cowardice, its antisocial instincts, constitute our everyday experience; the iron pressure of disciplinary constraint is necessary to keep it under any reasonable control. Who cannot embroider upon the depravity of this strange creature called man, who does not often despair over his future… And yet it is a fact — more true today than ever — that we cannot allow ourselves, because of so much all too well-founded skepticism, to despise humanity. Despite so much ridiculous depravity, we cannot forget the great and the honourable in man, which manifest themselves as art and science, as passion for truth, creation of beauty and the idea of justice; and it is also true that insensitiveness to the great mystery which we touch upon when we say “man” or “humanity” signifies spiritual death. That is not a truth of yesterday or the day before yesterday, antiquated, unattractive, and feeble. It is the new and necessary truth of today and tomorrow, the truth which has life and youth on its side in opposition to the false and withering youthfulness of certain theories and truths of the moment.
It is only a difference of degree, not of kind, between this ordinary cynical contempt for human goodness and the most extreme acts of evil. Mann writes:
Terror destroys people, that is clear. It corrupts character, releases every evil impulse, turns them into cowardly hypocrites and shameless informers. It makes them contemptible — that is the reason why these contemners of humanity love terrorism.
Democracy wishes to elevate mankind, to teach it to think, to set it free. It seeks to remove from culture the stamp of privilege and disseminate it among the people — in a word, it aims at education. Education is an optimistic and humane concept; and respect for humanity is inseparable from it. Hostile to mankind and contemptuous of it is the opposing concept called propaganda, which tries to stultify, stupefy, level, or regiment men for the purpose of military efficiency and, above all, to keep the dictatorial system in power.
Democracy being a fertile ground for intellect and literature, for the perception of psychological truth and the search for it, contradicts itself inasmuch as it has an acute appreciation and makes a critical analysis of the absurd wickedness of man, but nevertheless insists resolutely upon the dignity of man and the possibility of educating him.
To come close to art means to come close to life, and if an appreciation of the dignity of man is the moral definition of democracy, then its psychological definition arises out of its determination to reconcile and combine knowledge and art, mind and life, thought and deed.
The anatomy of feeling, the science of psychedelics, Ursula K. Le Guin’s final poetry collection, arresting essays by Zadie Smith, Rebecca Solnit, Anne Lamott, and Audre Lorde, a physicist’s lyrical meditation on science and spirituality, and more.
I treat my annual best-of reading lists as Old Year’s resolutions in reverse — unlike traditional resolutions, which frame aspirational priorities for the new year, they present a record of the reading that merited priority over the year past. In consequence, they are invariably subjective and incomplete — a shelf’s worth of books that I, one person, read and enjoyed in the time given, with the sensibility I have. Since this year I finished writing one book and putting together another, my reading time for new releases has been especially limited, which means these annual selections are especially subjective — no doubt I missed a great many worthy and wonderful books. But of those I did read, here — in excerpts from the pieces I originally wrote about them earlier in the year — are the ones I loved with all my heart and mind:
SO FAR SO GOOD
In November of 2014, the wise and wonderful Ursula K. Le Guin (October 21, 1929–January 22, 2018) — one of the great losses of 2018 — accepted the National Book Award with a stunning speech that quickly became our era’s supreme manifesto for protecting the art of the written word from the assault of the market. In consonance with her conviction, Le Guin sent the manuscript of her final poetry collection to an independent nonprofit poetry publisher, Copper Canyon Press, who turned directly to her readers to bring it to life. And oh how alive So Far So Good (public library) is — a sort of existential atlas, traversing bordering territories of meditations, incantations, and divinations on subjects like time, impermanence, and the splendors of uncertainty. Undergirding the verses is Le Guin’s largehearted generosity of spirit — toward the reader, toward nature and reality, toward the intertwined natures of life and art.
In the vast abyss before time, self
is not, and soul commingles
with mist, and rock, and light. In time,
soul brings the misty self to be.
Then slow time hardens self to stone
while ever lightening the soul,
till soul can loose its hold of self
and both are free and can return
to vastness and dissolve in light,
the long light after time.
In one of the most arresting essays, titled “On Optimism and Despair,” Smith takes on an eternal question that has bared its sharpest edges in our cultural moment — the question John Steinbeck tussled with when he wrote to his best friend at the peak of WWII: “All the goodness and the heroisms will rise up again, then be cut down again and rise up. It isn’t that the evil thing wins — it never will — but that it doesn’t die.”
Caught in the maelstrom of the moment, we forget this cyclical nature of history — history being, as I wrote in Figuring, not what happened, but what survives the shipwrecks of judgment and chance. We forget that the present always looks different from the inside than it does from the outside — something James Baldwin knew when, in considering why Shakespeare endures, he observed: “It is said that his time was easier than ours, but I doubt it — no time can be easy if one is living through it.” We forget that our particular moment, with all its tribulations and triumphs, is not neatly islanded in the river of time but swept afloat by massive cultural currents that have raged long before it and will rage long after.
Two days after the 2016 American presidential election, Smith — a black Englishwoman living in the freshly sundered United States — was invited to give a speech upon receiving a literary award in Germany. Traveling from a country on the brink of one catastrophic political regime to a country that has survived another, Smith took the opportunity to unmoor the despair of the present from the shallow waters of the cultural moment and cast it into the oceanic context of humanity’s pasts, aswirl with examples and counterexamples of progress, with ideals attained and shattered, with abiding assurance that we shape tomorrow by how we navigate our parallel potentialities for moral ruin and moral redemption today.
Nearly half a century after the German humanistic philosopher Erich Fromm asserted that “optimism is an alienated form of faith, pessimism an alienated form of despair” and a turn of the cycle after Rebecca Solnit contemplated our grounds for hope in dark times, Smith addresses a question frequently posed before her — why her earlier novels are aglow with optimism, while her later writing “tinged with despair” — a question implying that the arc of her body of work inclines toward an admission of the failure of its central animating forces: diversity, multiculturalism, the polyphony of perspectives. With an eye to “what the ancient Greeks did to each other, and the Romans, and the seventeenth-century British, and the nineteenth-century Americans,” Smith offers a corrective that stretches the ahistorical arc of that assumption:
My best friend during my youth — now my husband — is himself from Northern Ireland, an area where people who look absolutely identical to each other, eat the same food, pray to the same God, read the same holy book, wear the same clothes and celebrate the same holidays have yet spent four hundred years at war over a relatively minor doctrinal difference they later allowed to morph into an all-encompassing argument over land, government and national identity. Racial homogeneity is no guarantor of peace, any more than racial heterogeneity is fated to fail.
Speaking from the German stage, Smith recounts visiting the country during her first European book tour in her early twenties, traveling with her father, who had been there in 1945 as a young soldier in the reconstruction:
We made a funny pair on that tour, I’m sure: a young black girl and her elderly white father, clutching our guidebooks and seeking those spots in Berlin that my father had visited almost fifty years earlier. It is from him that I have inherited both my optimism and my despair, for he had been among the liberators at Belsen and therefore seen the worst this world has to offer, but had, from there, gone forward, with a sufficiently open heart and mind, striding into one failed marriage and then another, marrying both times across various lines of class, color and temperament, and yet still found in life reasons to be cheerful, reasons even for joy.
He was a member of the white working class, a man often afflicted by despair who still managed to retain a core optimism. Perhaps in a different time under different cultural influences living in a different society he would have become one of the rabid old angry white men of whom the present left is so afeared. As it was, born in 1925 and dying in 2006, he saw his children benefit from the civilized postwar protections of free education and free health care, and felt he had many reasons to be grateful.
This is the world I knew. Things have changed, but history is not erased by change, and the examples of the past still hold out new possibilities for all of us, opportunities to remake, for a new generation, the conditions from which we ourselves have benefited… Progress is never permanent, will always be threatened, must be redoubled, restated and reimagined if it is to survive.
It is, of course, an abiding question, as old as consciousness — we are material creatures that live in a material universe, yet we are capable of experiences that transcend what we can atomize into physical facts: love, joy, the full-being gladness of a Beethoven symphony on a midsummer’s night.
The Nobel-winning physicist Niels Bohr articulated the basic paradox of living with and within such a duality: “The fact that religions through the ages have spoken in images, parables, and paradoxes means simply that there are no other ways of grasping the reality to which they refer. But that does not mean that it is not a genuine reality. And splitting this reality into an objective and a subjective side won’t get us very far.”
Nearly a century after Bohr, the physicist and writer Alan Lightman takes us further, beyond these limiting dichotomies, in Searching for Stars on an Island in Maine (public library) — a lyrical and illuminating inquiry into our dual impulse for belief in the unprovable and for trust in truth affirmed by physical evidence. Through the lens of his personal experience as a working scientist and a human being with uncommon receptivity to the poetic dimensions of life, Lightman traces our longing for absolutes in a relative world from Galileo to Van Gogh, from Descartes to Dickinson, emerging with that rare miracle of insight at the meeting point of the lucid and the luminous.
Lightman, who has previously written beautifully about his transcendent experience facing a young osprey, relays a parallel experience he had one summer night on an island off the coast of Maine, where he and his wife have been going for a quarter century. On this small, remote speck of land, severed from the mainland without ferries or bridges, each of the six families has had to learn to cross the ocean by small boat — a task particularly challenging at night. Lightman recounts the unbidden revelation of one such nocturnal crossing:
No one was out on the water but me. It was a moonless night, and quiet. The only sound I could hear was the soft churning of the engine of my boat. Far from the distracting lights of the mainland, the sky vibrated with stars. Taking a chance, I turned off my running lights, and it got even darker. Then I turned off my engine. I lay down in the boat and looked up. A very dark night sky seen from the ocean is a mystical experience. After a few minutes, my world had dissolved into that star-littered sky. The boat disappeared. My body disappeared. And I found myself falling into infinity. A feeling came over me I’d not experienced before… I felt an overwhelming connection to the stars, as if I were part of them. And the vast expanse of time — extending from the far distant past long before I was born and then into the far distant future long after I will die — seemed compressed to a dot. I felt connected not only to the stars but to all of nature, and to the entire cosmos. I felt a merging with something far larger than myself, a grand and eternal unity, a hint of something absolute. After a time, I sat up and started the engine again. I had no idea how long I’d been lying there looking up.
Lightman — the first professor at MIT to receive a dual faculty appointment in science and the humanities — syncopates this numinous experience with the reality of his lifelong devotion to science:
I have worked as a physicist for many years, and I have always held a purely scientific view of the world. By that, I mean that the universe is made of material and nothing more, that the universe is governed exclusively by a small number of fundamental forces and laws, and that all composite things in the world, including humans and stars, eventually disintegrate and return to their component parts. Even at the age of twelve or thirteen, I was impressed by the logic and materiality of the world. I built my own laboratory and stocked it with test tubes and petri dishes, Bunsen burners, resistors and capacitors, coils of electrical wire. Among other projects, I began making pendulums by tying a fishing weight to the end of a string. I’d read in Popular Science or some similar magazine that the time for a pendulum to make a complete swing was proportional to the square root of the length of the string. With the help of a stopwatch and ruler, I verified this wonderful law. Logic and pattern. Cause and effect. As far as I could tell, everything was subject to numerical analysis and quantitative test. I saw no reason to believe in God, or in any other unprovable hypotheses.
Yet after my experience in that boat many years later… I understood the powerful allure of the Absolutes — ethereal things that are all-encompassing, unchangeable, eternal, sacred. At the same time, and perhaps paradoxically, I remained a scientist. I remained committed to the material world.
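The proportionality the young Lightman verified with stopwatch and ruler follows from the small-angle pendulum formula T = 2π√(L/g). A minimal sketch of that same check in code (the string lengths and the value of g here are illustrative assumptions, not details from the text):

```python
import math

def pendulum_period(length_m, g=9.81):
    """Small-angle period of a simple pendulum: T = 2 * pi * sqrt(L / g)."""
    return 2 * math.pi * math.sqrt(length_m / g)

# Quadrupling the string length should double the period,
# since the period grows with the square root of the length.
t_short = pendulum_period(0.5)   # half-meter string
t_long = pendulum_period(2.0)    # four times as long
print(round(t_long / t_short, 3))  # prints 2.0
```

The same square-root scaling is what a stopwatch and ruler reveal: timing many swings of strings of different lengths and comparing the ratios.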
Against our human finitude, temporality, and imperfection, these “Absolutes” offer infinity, eternity, perfection. Lightman defines them as concepts and beliefs that “refer to an enduring and fixed reference point that can anchor and guide us through our temporary lives” — notions like constancy, immortality, permanence, the soul, “God”; notions unprovable by the scientific method. Conversely, however, notions that belong to this realm of Absolutes fall apart when they make claims in the realm of science — claims disproven by the facts of the material world. With an eye to how the discoveries of modern science — from heliocentricity to evolution to the chemical composition of the universe — have challenged many of these Absolutes, Lightman writes:
Nothing in the physical world seems to be constant or permanent. Stars burn out. Atoms disintegrate. Species evolve. Motion is relative. Even other universes might exist, many without life. Unity has given way to multiplicity. I say that the Absolutes have been challenged rather than disproved, because the notions of the Absolutes cannot be disproved any more than they can be proved. The Absolutes are ideals, entities, beliefs in things that lie beyond the physical world. Some may be true and some false, but the truth or falsity cannot be proven.
From all the physical and sociological evidence, the world appears to run not on absolutes but on relatives, context, change, impermanence, and multiplicity. Nothing is fixed. All is in flux.
On the one hand, such an onslaught of discovery presents a cause for celebration… Is it not a testament to our minds that we little human beings with our limited sensory apparatus and brief lifespans, stuck on our one planet in space, have been able to uncover so much of the workings of nature? On the other hand, we have found no physical evidence for the Absolutes. And just the opposite. All of the new findings suggest that we live in a world of multiplicities, relativities, change, and impermanence. In the physical realm, nothing persists. Nothing lasts. Nothing is indivisible. Even the subatomic particles found in the twentieth century are now thought to be made of even smaller “strings” of energy, in a continuing regression of subatomic Russian dolls. Nothing is a whole. Nothing is indestructible. Nothing is still. If the physical world were a novel, with the business of examining evil and good, it would not have the clear lines of Dickens but the shadowy ambiguities of Dostoevsky.
“To be a good human being,” philosopher Martha Nussbaum observed, “is to have a kind of openness to the world, an ability to trust uncertain things beyond your own control” — to have, that is, a willingness to regard with an openhearted curiosity what is other than ourselves and therefore strange, discomfiting, difficult to fathom and relate to, difficult at first to love, for we cannot love what we do not understand. Out of such regard arises the awareness at the heart of Lucille Clifton’s lovely poem “cutting greens” — a recognition of “the bond of live things everywhere,” among which we are only a small part of a vast and miraculous world, and from which we can learn a great deal about being better versions of ourselves.
That is what naturalist and author Sy Montgomery, one of the most poetic science writers of our time, explores in How to Be a Good Creature: A Memoir in Thirteen Animals (public library), illustrated by artist Rebecca Green — an autobiographical adventure into the wilderness of our common humanity, where the world of science and the legacy of Aesop converge into an existential expedition to uncover the elemental truth that “knowing someone who belongs to another species can enlarge your soul in surprising ways.”
Looking back on her unusual and passionate life of swimming with electric eels, digging for mistletoe seeds in emu droppings, and communing with giant octopuses, Montgomery reflects on what she learned about leadership from an emu, about ferocity and forgiveness from an ermine, about living with a sense of wholeness despite imperfection from a one-eyed dog named Thurber (after the great New Yorker cartoonist and essayist James Thurber, who was blinded in one eye by an arrow as a child), and about what it takes for the heart to be “stretched wide with awe.”
At the New England Aquarium, Montgomery gets to know one of Earth’s most alien creatures — the subject of her exquisite book The Soul of an Octopus. She writes:
Reading an octopus’s intentions is not like reading, for instance, a dog’s. I could read [my dog] Sally’s feelings in a glance, even if the only part of her I could see was her tail, or one ear. But Sally was family, and in more than one sense. Dogs, like all placental mammals, share 90 percent of our genetic material. Dogs evolved with humans. Octavia and I were separated by half a billion years of evolution. We were as different as land from sea. Was it even possible for a human to understand the emotions of a creature as different from us as an octopus?
As Octavia slowly allows this improbable and almost miraculous cross-species creaturely connection, Montgomery reflects on the insight attributed to the ancient Greek philosopher Thales of Miletus — “The universe is alive, and has fire in it, and is full of gods.” — and writes:
Being friends with an octopus — whatever that friendship meant to her — has shown me that our world, and the worlds around and within it, is aflame with shades of brilliance we cannot fathom — and is far more vibrant, far more holy, than we could ever imagine.
“There is no time for despair, no place for self-pity, no need for silence, no room for fear,” Toni Morrison exhorted in considering the artist’s task in troubled times. In our interior experience as individuals, as in the public forum of our shared experience as a culture, our courage lives in the same room as our fear — it is in troubled times, in despairing times, that we find out who we are and what we are capable of.
That is what the great poet, essayist, feminist, and civil rights champion Audre Lorde (February 18, 1934–November 17, 1992) explores with exquisite self-possession and might of character in a series of diary entries included in A Burst of Light: and Other Essays (public library).
Seventeen days before she turned fifty, and six years after she underwent a mastectomy for breast cancer, Lorde was told she had liver cancer. She declined surgery and even a biopsy, choosing instead to go on living her life and her purpose, exploring alternative treatments as she proceeded with her planned teaching trip to Europe. In a diary entry penned on her fiftieth birthday, Lorde reckons with the sudden call to confront the ultimate fear:
I want to write down everything I know about being afraid, but I’d probably never have enough time to write anything else. Afraid is a country where they issue us passports at birth and hope we never seek citizenship in any other country. The face of afraid keeps changing constantly, and I can count on that change. I need to travel light and fast, and there’s a lot of baggage I’m going to have to leave behind me. Jettison cargo.
“Not every man knows what he shall sing at the end,” the poet Mark Strand, born within weeks of Lorde, wrote in his stunning ode to mortality. Exactly a month after her diagnosis, with the medical establishment providing more confusion than clarity as she confronts her mortality, Lorde resolves in her journal:
Dear goddess! Face-up again against the renewal of vows. Do not let me die a coward, mother. Nor forget how to sing. Nor forget song is a part of mourning as light is a part of sun.
By the spring, she had lost nearly fifty pounds. But she was brimming with a crystalline determination to do the work of visibility and kinship across difference. She taught in Germany, immersed herself in the international communities of the African Diaspora, and traveled to the world’s first Feminist Book Fair in London. “I may be too thin, but I can still dance!” she exults in her diary on the first day of June. She dances with her fear in an entry penned six days later:
I am listening to what fear teaches. I will never be gone. I am a scar, a report from the frontlines, a talisman, a resurrection. A rough place on the chin of complacency.
“Finding the words is another step in learning to see,” bryologist Robin Wall Kimmerer wrote in reflecting on what her Native American tradition and her training as a scientist taught her about how naming confers dignity upon life. If to name is to see and reveal — to remove the veil of blindness, willful or manipulated, and expose things as they really are — then it is in turn another step in remaking the world, another form of resistance to the damaging dominant narratives that go unquestioned. Walt Whitman knew this when he contemplated our greatest civic might: “I can conceive of no better service… than boldly exposing the weakness, liabilities and infinite corruptions of democracy.”
A century and a half after Whitman, Rebecca Solnit — one of our own era’s boldest public defenders of democracy, and one of the most poetic — explores this crucial causal link between the stories we tell and the world we build in Call Them by Their True Names (public library) — a collection of her essays at the nexus of politics, philosophy, and the selective record of personal and political choices we call history. Composed in response to more than a decade’s worth of cultural crises and triumphs, the pieces in the book furnish an extraordinarily lucid yet hopeful lens on the present and a boldly uncynical telescopic perspective on the future.
Solnit writes in the preface:
One of the folktale archetypes, according to the Aarne-Thompson classification of these stories, tells of how “a mysterious or threatening helper is defeated when the hero or heroine discovers his name.” In the deep past, people knew names had power. Some still do. Calling things by their true names cuts through the lies that excuse, buffer, muddle, disguise, avoid, or encourage inaction, indifference, obliviousness. It’s not all there is to changing the world, but it’s a key step.
When the subject is grim, I think of the act of naming as diagnosis. Though not all diagnosed diseases are curable, once you know what you’re facing, you’re far better equipped to know what you can do about it. Research, support, and effective treatment, as well as possibly redefining the disease and what it means, can proceed from this first step. Once you name a disorder, you may be able to connect to the community afflicted with it, or build one. And sometimes what’s diagnosed can be cured.
That, indeed, is what the philosopher and Trappist monk Thomas Merton celebrated in his beautiful fan letter to Rachel Carson after she catalyzed the modern environmental movement by speaking inconvenient truth to power in her exposé of pesticides, marketed at the time as harmless helpers to humanity — an act Merton considered “contributing a most valuable and essential piece of evidence for the diagnosis of the ills of our civilization.” Such naming of wrongs, betrayals, and corruptions unweaves the very fabric of the status quo. It is, Solnit argues, “the first step in the process of liberation” and often leads to shifts in the power system itself. In the age of “alternative facts,” when language is used as a weapon of oppression and manipulation, her words reverberate with the irrepressible, unsilenceable urgency of truth:
To name something truly is to lay bare what may be brutal or corrupt — or important or possible — and key to the work of changing the world is changing the story.
Chance and choice converge to make us who we are, and although we may mistake chance for choice, our choices are the cobblestones, hard and uneven, that pave our destiny. They are ultimately all we can answer for and point to in the architecture of our character. Joan Didion captured this with searing lucidity in defining character as “the willingness to accept responsibility for one’s own life” and locating in that willingness the root of self-respect.
A century before Didion, Friedrich Nietzsche (October 15, 1844–August 25, 1900) composed the score for harmonizing our choices and our contentment with the life they garner us. Nietzsche, who greatly admired Emerson’s ethos of nonconformity and self-reliant individualism, wrote fervently, almost frenetically, about how to find yourself and what it means to be a free spirit. He saw the process of becoming oneself as governed by the willingness to own one’s choices and their consequences — a difficult willingness, yet one that promises the antidote to existential hopelessness, complacency, and anguish.
The legacy of that deceptively simple yet profound proposition is what philosopher John J. Kaag explores in Hiking with Nietzsche: On Becoming Who You Are (public library) — part masterwork of poetic scholarship, part contemplative memoir concerned with the most fundamental question of human life: What gives our existence meaning?
The answer, Kaag suggests in drawing on Nietzsche’s most timeless ideas, challenges our ordinary understanding of selfhood and its cascading implications for happiness, fulfillment, and the building blocks of existential contentment. He writes:
The self is not a hermetically sealed, unitary actor (Nietzsche knew this well), but its flourishing depends on two things: first, that it can choose its own way to the greatest extent possible, and then, when it fails, that it can embrace the fate that befalls it.
At the center of Nietzsche’s philosophy is the idea of eternal return — the ultimate embrace of responsibility that comes from accepting the consequences, good or bad, of one’s willful action. Embedded in it is an urgent exhortation to calibrate our actions in such a way as to make their consequences bearable, livable with, in a hypothetical perpetuity. Nietzsche illustrates the concept with a simple, stirring thought experiment in The Gay Science:
What if some day or night a demon were to steal into your loneliest loneliness and say to you: “This life as you now live and have lived it you will have to live once again and innumerable times again; and there will be nothing new in it, but every pain and every joy and every thought and sigh and everything unspeakably small or great in your life must return to you, all in the same succession and sequence — even this spider and this moonlight between the trees, and even this moment and I myself…”
“Attention is an intentional, unapologetic discriminator. It asks what is relevant right now, and gears us up to notice only that,” cognitive scientist Alexandra Horowitz wrote in her inquiry into how our conditioned way of looking narrows the lens of our perception. Attention, after all, is the handmaiden of consciousness, and consciousness the central fact and the central mystery of our creaturely experience. From the days of Plato’s cave to the birth of neuroscience, we have endeavored to fathom its nature. But it is a mystery that only seems to deepen with each increment of approach. “Our normal waking consciousness,” William James wrote in his landmark 1902 treatise on spirituality, “is but one special type of consciousness, whilst all about it, parted from it by the filmiest of screens, there lie potential forms of consciousness entirely different… No account of the universe in its totality can be final which leaves these other forms of consciousness quite disregarded.”
Half a century after James, two new molecules punctured the filmy screen to unlatch a portal to a wholly novel universe of consciousness, shaking up our most elemental assumptions about the nature of the mind, our orientation toward mortality, and the foundations of our social, political, and cultural constructs. One of these molecules — lysergic acid diethylamide, or LSD — was a triumph of twentieth-century science, somewhat accidentally synthesized by the Swiss chemist Albert Hofmann in the year physicist Lise Meitner discovered nuclear fission. The other — the compound psilocybin, known among the Aztecs as “flesh of the gods” — was the rediscovery of a substance produced by a humble brown mushroom, which indigenous cultures across eras and civilizations had been incorporating into their spiritual rituals since ancient times, and which the Roman Catholic Church had violently suppressed and buried during the Spanish conquest of the Americas.
Together, these two molecules commenced the psychedelic revolution of the 1950s and 1960s, frothing the stream of consciousness — a term James coined — into a turbulent existential rapids. Their proselytes included artists, scientists, political leaders, and ordinary people of all stripes. Their most ardent champions were the psychiatrists and physicians who lauded them as miracle drugs for salving psychic maladies as wide-ranging as anxiety, addiction, and clinical depression. Their cultural consequence was likened to that of the era’s other cataclysmic disruptor: the atomic bomb.
And then — much thanks to Timothy Leary’s reckless handling of his Harvard psilocybin studies that landed him in prison, where Carl Sagan sent him cosmic poetry — a landslide of moral panic and political backlash outlawed psychedelics, shut down clinical studies of their medical and psychiatric uses, and drove them into the underground. For decades, academic research into their potential for human flourishing languished and nearly perished. But a small subset of scientists, psychiatrists, and amateur explorers refused to relinquish their curiosity about that potential.
The 1990s brought a quiet groundswell of second-wave interest in psychedelics — a resurgence that culminated with a 2006 paper reporting on studies at Johns Hopkins, which had found that psilocybin had occasioned “mystical-type experiences having substantial and sustained personal meaning and significance” for terminally ill cancer patients — experiences from which they “return with a new perspective and profound acceptance.” In other words, the humble mushroom compound had helped people face the ultimate frontier of existence — their own mortality — with unparalleled equanimity. The basis of the experience, researchers found, was a sense of the dissolution of the personal ego, followed by a sense of becoming one with the universe — a notion strikingly similar to Bertrand Russell’s insistence that a fulfilling life and a rewarding old age are a matter of “[making] your interests gradually wider and more impersonal, until bit by bit the walls of the ego recede, and your life becomes increasingly merged in the universal life.”
More clinical experiments followed at UCLA, NYU, and other leading universities, demonstrating that this psilocybin-induced dissolution of the ego, extremely difficult if not impossible to achieve in our ordinary consciousness, has profound benefits in rewiring the faulty mental mechanisms responsible for disorders like alcoholism, anxiety, and depression.
One good way to understand a complex system is to disturb it and then see what happens. By smashing atoms, a particle accelerator forces them to yield their secrets. By administering psychedelics in carefully calibrated doses, neuroscientists can profoundly disturb the normal waking consciousness of volunteers, dissolving the structures of the self and occasioning what can be described as a mystical experience. While this is happening, imaging tools can observe the changes in the brain’s activity and patterns of connection. Already this work is yielding surprising insights into the “neural correlates” of the sense of self and spiritual experience.
Michael Pollan examines the psilocybin studies of cancer patients, which reignited scientific interest in psychedelics, and the profound results of subsequent studies exploring the use of psychedelics in treating mental illness, including addiction, depression, and obsessive-compulsive disorder. He approaches his subject as a science writer and a skeptic endowed with equal parts rigorous critical thinking and open-minded curiosity. In a sentiment evocative of physicist Alan Lightman’s elegant braiding of the numinous and the scientific, he echoes Carl Sagan’s views on the mystery of reality and examines his own lens:
My default perspective is that of the philosophical materialist, who believes that matter is the fundamental substance of the world and the physical laws it obeys should be able to explain everything that happens. I start from the assumption that nature is all that there is and gravitate toward scientific explanations of phenomena. That said, I’m also sensitive to the limitations of the scientific-materialist perspective and believe that nature (including the human mind) still holds deep mysteries toward which science can sometimes seem arrogant and unjustifiably dismissive.
Was it possible that a single psychedelic experience — something that turned on nothing more than the ingestion of a pill or square of blotter paper — could put a big dent in such a worldview? Shift how one thought about mortality? Actually change one’s mind in enduring ways?
The idea took hold of me. It was a little like being shown a door in a familiar room — the room of your own mind — that you had somehow never noticed before and being told by people you trusted (scientists!) that a whole other way of thinking — of being! — lay waiting on the other side. All you had to do was turn the knob and enter. Who wouldn’t be curious? I might not have been looking to change my life, but the idea of learning something new about it, and of shining a fresh light on this old world, began to occupy my thoughts. Maybe there was something missing from my life, something I just hadn’t named.
We go through life seeing reality not as it really is, in its unfathomable depths of complexity and contradiction, but as we hope or fear or expect it to be. Too often, we confuse certainty for truth and the strength of our beliefs for the strength of the evidence. When we collide with the unexpected, with the antipode to our hopes, we are plunged into bewildered despair. We rise from the pit only by love. Perhaps Keats had it slightly wrong — perhaps truth is love and love is truth.
In general, it doesn’t feel like the light is making a lot of progress. It feels like death by annoyance. At the same time, the truth is that we are beloved, even in our current condition, by someone; we have loved and been loved. We have also known the abyss of love lost to death or rejection, and that it somehow leads to new life. We have been redeemed and saved by love, even as a few times we have been nearly destroyed, and worse, seen our children nearly destroyed. We are who we love, we are one, and we are autonomous.
She turns to the greatest paradox of the human heart — our parallel capacities for the perpendiculars of immense love and immense despair:
Love has bridged the high-rises of despair we were about to fall between. Love has been a penlight in the blackest, bleakest nights. Love has been a wild animal, a poultice, a dinghy, a coat. Love is why we have hope.
So why have some of us felt like jumping off tall buildings ever since we can remember, even those of us who do not struggle with clinical depression? Why have we repeatedly imagined turning the wheels of our cars into oncoming trucks?
We just do.
To me, this is very natural. It is hard here.
And yet, in the wreckage of this hardship, we find our most redemptive potentialities:
There is the absolute hopelessness we face that everyone we love will die, even our newborn granddaughter, even as we trust and know that love will give rise to growth, miracles, and resurrection. Love and goodness and the world’s beauty and humanity are the reasons we have hope. Yet no matter how much we recycle, believe in our Priuses, and abide by our local laws, we see that our beauty is being destroyed, crushed by greed and cruel stupidity. And we also see love and tender hearts carry the day. Fear, against all odds, leads to community, to bravery and right action, and these give us hope.
In a sentiment that calls to mind what psychologists call “the vampire problem” — the limiting loop by which we fail to imagine transformation because the very faculty doing the imagining can only be informed by the already transformed self — Lamott adds:
We can change. People say we can’t, but we do when the stakes or the pain is high enough. And when we do, life can change. It offers more of itself when we agree to give up our busyness.
“Life and Reality are not things you can have for yourself unless you accord them to all others,” Alan Watts wrote in the early 1950s, nearly a quarter century before Thomas Nagel’s landmark essay “What Is It Like to Be a Bat?” unlatched the study of other consciousnesses and seeded the disorienting awareness that other beings — “beings who walk other spheres,” to borrow Whitman’s wonderful term — experience this world we share in ways thoroughly alien to our own.
Today, we know that we need not step across the boundary of species to encounter such alien-seeming ways of inhabiting the world. There are innumerable ways of being human — we each experience life and reality in radically different ways merely by our way of seeing, but these differences are accentuated to an extreme when mental illness alters the elemental interiority of a consciousness. In these extreme cases, it can become impossible for even the most empathic imagination to grasp — not only cerebrally but with an embodied understanding — the slippery reality of an anguished consciousness so different from one’s own. Conversely, it can become impossible for those who share that anguish to articulate it, effecting an overwhelming sense of alienation and the false conviction that one is alone in one’s suffering. To convey that reality to those unbedeviled by such mental anguish, and to wrap language around its ineffable interiority for others who suffer silently from the same, is therefore a creative feat and existential service of the highest caliber.
That is what author, Happy Ending Music & Reading Series host, and my dear friend Amanda Stern accomplishes in Little Panic: Dispatches from an Anxious Life (public library) — part-memoir and part-portrait of a cruelly egalitarian affliction that cuts across all borders of age, gender, race, and class, clutching one’s entire reality and sense of self in a stranglehold that squeezes life out. What emerges is a sort of literary laboratory of consciousness, anatomizing an all-consuming yet elusive feeling-pattern to explore what it takes to break the tyranny of worry and what it means to feel at home in oneself.
Part of the splendor of the book is the way Stern unspools the thread of being to the very beginning, all the way to the small child predating conscious memory. In consonance with Maurice Sendak, who so passionately believed that a centerpiece of healthy adulthood is “having your child self intact and alive and something to be proud of,” the child-Amanda emerges from the pages alive and real to articulate in that simple, profound way only children have what the yet-undiagnosed acute anxiety disorder actually feels like from the inside:
Whenever I am afraid, worry sounds itself as sixty, seventy radio channels playing at the same time inside my head. Refrains loop around and around my brain like fast jabber and I cannot get any of it to stop. I know there is something wrong with me, but no one knows how to fix me. Not anyone outside my body, and definitely not me. Eddie [Stern’s older brother] says a body is blood and bones and skin, and when everything falls off you’re a skeleton, but I am air pressure and tingly dots; energy and everything. I am air and nothing.
My breath flips on its side, horizontal and too wide to go through my lungs.
The grave paradox of mental illness and mental health is that, despite what we now know about how profoundly our emotions affect our physical wellbeing, these terms sever the head from the body — the physical body and the emotional body. A century after William James proclaimed that “a purely disembodied human emotion is a nonentity,” Stern offers a powerful corrective for our ongoing cultural Cartesianism. Her vivid prose, pulsating with a life in language, invites the reader into the interiority of a deeply embodied mind that experiences and comprehends the world somatically. “I was born with a basketball net slung over my top ribs, where the world dunks its balls of dread,” she writes as she channels her young self’s budding awareness that something is terribly, fundamentally wrong with her:
I am a growing constellation of errors. I don’t know what’s wrong with me, only that something is, and it must be too shameful to divulge, or so rare that even the doctors are stumped.
At the end of the book, Stern considers the centrality of anxiety in her own blink of existence and telescopes to a larger truth about this widespread yet largely invisible affliction that seems a fundamental feature of being human:
When did it start? It started before I was born. It started before my mother was born. It started when friction created the world. When does anything start? It doesn’t, it just grows, sometimes to unmanageable heights, and then, when you’re at the very edge, it becomes clear: something must be done.
Left untreated, anxiety disorders, like fingernails, grow with a person. The longer they go untended, the more mangled and painful they become. Often, they spiral, straight out of control, splitting and splintering into other disorders, like depression, social anxiety, agoraphobia. A merry-go-round of features we rise and fall upon. Separation anxiety handicaps its captives, preventing them from leaving bad relationships, moving far from home, going on trips, to parties, applying for jobs, having children, getting married, seeing friends, or falling asleep. Some people are so crippled by their anxiety they have panic attacks in anticipation of having a panic attack.
I’ve had panic attacks in nearly every part of New York City, even on Staten Island. I’ve had them in taxis, on subways, public bathrooms, banks, street corners, in Washington Square Park, on multiple piers, the Manhattan Bridge, Chinatown, the East Village, the Upper East Side, Central Park, Lincoln Center, the dressing room at Urban Outfitters, Mamoun’s Falafel, the Bobst library, the Mid-Manhattan Library, the main library branch, the Brooklyn Library, the Fort Greene Farmer’s Market, laundromats, book kiosks, in the entrance of FAO Schwarz, at the post office, the steps of the Met, on stoops, at the Brooklyn Flea, in bars, at friends’ houses, on stage, in the shower, in queen-sized beds, double beds, twin beds, in my crib.
I’ve grown so expert at hiding them, most people would never even know that I’m suffering. How, after all, do you explain that a restaurant’s decision to dim their lights swelled your throat shut, and that’s why you must leave immediately, not just the restaurant, but the neighborhood? If you cannot point to something, then it is invisible. Like a cult leader, anxiety traps you and convinces you that you’re the only one it sees.
For better or worse, we can only teach others what we understand… Each person begins, after all, as a story other people tell. And when we fall outside the confines of our common standards, we will assume our deficits define us.
My fear and my conviction were the same: that I was the flaw in the universe; the wrongly circled letter in our multiple-choice world. This terrible truth binds us all: fear there’s a single, unattainable, correct way to be human.
“A purely disembodied human emotion is a nonentity,” William James wrote in his pioneering 1884 theory of how our bodies affect our feelings. In the century-some since, breakthroughs in neurology, psychobiology, and neuroscience have contributed leaps of layered (though still incomplete) understanding of the relationship between the physical body and our emotional experience. That tessellated relationship is what neuroscientist Antonio Damasio examines in The Strange Order of Things: Life, Feeling, and the Making of Cultures (public library) — a title inspired by the disorienting fact that several billion years ago, single-cell organisms began exhibiting behaviors strikingly analogous to certain human social behaviors and 100 million years ago insects developed interactions, instruments, and cooperative strategies that we might call cultural. That such sociocultural behaviors long predate the development of the human brain casts new light on the ancient mind-body problem and offers a radical revision of how we understand mind, feeling, consciousness, and the construction of cultures.
Two decades after his landmark exploration of how the relationship between the body and the mind shapes our conscious experience, Damasio draws a visionary link between biology and social science in a fascinating investigation of homeostasis — the delicate balance that underpins our physical existence, ensures our survival, and defines our flourishing. At the heart of his inquiry is his lifelong interest in the nature of human affect — why we feel what we feel, how we use emotions to construct selfhood, what makes our intentions and our feelings so frequently contradictory, how the body and the mind conspire in the inception of emotional reality. What emerges is not an arsenal of certitudes and answers but a celebration of curiosity and a reminder that intelligent, informed speculation is how we expand the territory of knowledge by moving the boundary of the knowable further into the unknown.
Feelings, Damasio argues, are the unheralded germinators of human culture:
Human beings have distinguished themselves from all other beings by creating a spectacular collection of objects, practices, and ideas, collectively known as cultures. The collection includes the arts, philosophical inquiry, moral systems and religious beliefs, justice, governance, economic institutions, and technology and science.
Language, sociality, knowledge, and reason are the inventors and executors of these complicated processes. But feelings get to motivate them and stay on to check the results… Cultural activity began and remains deeply embedded in feeling. The favorable and unfavorable interplay of feeling and reason must be acknowledged if we are to understand the conflicts and contradictions of the human condition.
The modern-day universe of codes and ciphers began in a cottage on the prairie, with a pair of young lovers smiling at each other across a table and a rich man urging them to be spectacular.
The two young lovers were Elizebeth Smith and William Friedman, and the rich man, the eccentric textile tycoon George Fabyan.
The youngest of nine children raised in a modest Quaker home, Elizebeth was born in an era when fewer than four percent of American women graduated from college. Four years after earning her degree in Greek and English literature, she still felt like “a quivering, keenly alive, restless, mental question mark.” The following year, 1916, she began her improbable career at Riverbank Laboratories — Fabyan’s Wonderland-like estate, where the tycoon had hired Elizebeth to work on the cipher at the heart of a literary conspiracy theory claiming that Francis Bacon was the true author of Shakespeare’s works. At Riverbank, she met William, a young geneticist living in a windmill — one of the many fanciful fixtures of Riverbank — and studying seeds in order to infuse crops with optimal properties as a kind of proto-genetic engineering. Over long walks, animated by parallel intellectual voraciousness and shared skepticism of the Bacon cipher conspiracy, the two fell in love.
William and Elizebeth were married at Riverbank, where they had begun collaborating on cryptographic work. The papers on the subject they wrote together — though always published under William’s name alone — soon spread their reputation beyond Riverbank. Cryptography was new then, new and thrilling and full of unmined possibility for government intelligence, and so the U.S. Navy eventually recruited the Friedmans. Fagone writes:
The savaging of Nazis, the birth of a science: It begins on the day when a twenty-three-year-old American woman decides to trust her doubt and dig with her own mind.
The room is dark but her pencil is sharp. An envelope of puzzles arrives from Washington, sent by men who have the largest of responsibilities and the tiniest of clues. With William she examines the puzzles. He is game, he looks at her with eyes like little bonfires, he is in love with her. She is not in love yet but she would not be ashamed to fall in love with such a bright and kind person. She stares at the odd blocks of text and starts to flip and stack and rearrange them on a scratch pad, a kindling of letters, a friction of alphabets hot to the touch, and then a flame catches and then catches again, until she understands that she can ignite whenever she wants, that a power is there for the taking, for her and for anyone, and nothing will ever be the same. The ribs of a pattern shine through. Something rises at the nib of her pencil and her heart whomps away. The skeletons of words leap out and make her jump.
At twenty-two, physicist Freeman Dyson (b. December 15, 1923) ascended to a position Newton had held a quarter millennium earlier at Trinity College, where Dyson lived in a room just below Ludwig Wittgenstein’s. Nearly a century later, Dyson remains one of the preeminent scientific minds of our time and a rare witness of a great many cultural milestones, triumphs, and tragedies that have shaped modern life as we know it — landmark discoveries like cosmic microwave background radiation and the double helix structure of DNA, which have profoundly changed our understanding of the universe; the invention of the atomic bomb and the scarring brutality of a World War; the rise of the Internet. He has seen the stars of countless political regimes, scientific theories, and ideologies rise and fall. In Maker of Patterns: An Autobiography Through Letters (public library), Dyson unleashes his warm wisdom and unboastful wit on subjects as varied as politics, the enchantment of science, the vacuity of celebrity, the value of the immigrant perspective, his vibrant friendship with Richard Feynman, and the complexities of being human. He recounts “a flash of illumination” on the Greyhound bus that revealed to him the nature of creativity and composes a singularly delightful account of meeting the great, troubled logician Kurt Gödel at a farewell party for T.S. Eliot at the Princeton home of Robert Oppenheimer. What emerges is not only the fascinating memoir of an uncommon genius, composed of Dyson’s letters to his loved ones, but an invaluable time-capsule of collective memory.
The rewards and redemptions of that elemental yet endangered response are what British naturalist and environmental writer Michael McCarthy, a modern-day Carson, explores in The Moth Snowstorm: Nature and Joy (public library) — part memoir and part manifesto, a work of philosophy rooted in environmental science and buoyed by a soaring poetic imagination.
The natural world can offer us more than the means to survive, on the one hand, or mortal risks to be avoided, on the other: it can offer us joy.
There can be occasions when we suddenly and involuntarily find ourselves loving the natural world with a startling intensity, in a burst of emotion which we may not fully understand, and the only word that seems to me to be appropriate for this feeling is joy.
Referring to it as joy may not facilitate its immediate comprehension either, not least because joy is not a concept, nor indeed a word, that we are entirely comfortable with, in the present age. The idea seems out of step with a time whose characteristic notes are mordant and mocking, and whose preferred emotion is irony. Joy hints at an unrestrained enthusiasm which may be thought uncool… It reeks of the Romantic movement. Yet it is there. Being unfashionable has no effect on its existence… What it denotes is a happiness with an overtone of something more, which we might term an elevated or, indeed, a spiritual quality.
A century and a half after Thoreau extolled nature as a form of prayer and an antidote to the smallening of spirit amid the ego-maelstrom we call society — “In the street and in society I am almost invariably cheap and dissipated, my life is unspeakably mean,” he lamented in his journal — McCarthy considers the role of the transcendent feelings nature can stir in us in a secular world:
They are surely very old, these feelings. They are lodged deep in our tissues and emerge to surprise us. For we forget our origins; in our towns and cities, staring into our screens, we need constantly reminding that we have been operators of computers for a single generation and workers in neon-lit offices for three or four, but we were farmers for five hundred generations, and before that hunter-gatherers for perhaps fifty thousand or more, living with the natural world as part of it as we evolved, and the legacy cannot be done away with.
Having devoted eight years of my life to it, and having a heart swelling with gratitude to the legion of writers and artists who contributed original letters and illustrations for this monumental labor of love, I must proudly include A Velocity of Being: Letters to a Young Reader (public library) — a collection of original letters to the children of today and tomorrow about why we read and what books do for the human spirit, composed by 121 of the most interesting and inspiring humans in our world: Jane Goodall, Yo-Yo Ma, Jacqueline Woodson, Ursula K. Le Guin, Mary Oliver, Neil Gaiman, Amanda Palmer, Rebecca Solnit, Elizabeth Gilbert, Shonda Rhimes, Alain de Botton, James Gleick, Anne Lamott, Diane Ackerman, Judy Blume, Eve Ensler, David Byrne, Sylvia Earle, Richard Branson, Daniel Handler, Marina Abramović, Regina Spektor, Elizabeth Alexander, Adam Gopnik, Debbie Millman, Dani Shapiro, Tim Ferriss, Ann Patchett, a 98-year-old Holocaust survivor, Italy’s first woman in space, and many more immensely accomplished and largehearted artists, writers, scientists, philosophers, entrepreneurs, musicians, and adventurers whose character has been shaped by a life of reading.
Accompanying each letter is an original illustration by a prominent artist in response to the text — including beloved children’s book illustrators like Sophie Blackall, Oliver Jeffers, Isabelle Arsenault, Jon Klassen, Shaun Tan, Olivier Tallec, Christian Robinson, Marianne Dubuc, Lisa Brown, Carson Ellis, Mo Willems, Peter Brown, and Maira Kalman.
Because this project was born of a deep concern for the future of books and a love of literature as a pillar of democratic society, we are donating 100% of proceeds from the book to the New York Public Library system in gratitude for their noble work in stewarding literature and democratizing access to the written record of human experience. The gesture is inspired in large part by James Baldwin’s moving recollection of how he used the library to read his way from Harlem to the literary pantheon and Ursula K. Le Guin’s insistence that “a great library is freedom.” (Le Guin is one of four contributors we lost between the outset of the project and its completion, for all of whom their letter is their last published work.)
“This capacity to wonder at trifles — no matter the imminent peril — these asides of the spirit, these footnotes in the volume of life are the highest forms of consciousness, and it is in this childishly speculative state of mind, so different from commonsense and its logic, that we know the world to be good.”
By Maria Popova
“Once we leave those domains of human experience, there’s no reason to expect the laws of nature to continue to obey our expectations, since our expectations are dependent on a limited set of experiences,” Carl Sagan observed in considering how common sense blinds us to the reality of the universe. Perhaps worse yet — worse than the wrong beliefs we held for millennia about our planet’s shape, motion, and position in the cosmos, just because it feels flat and steady beneath our feet and is the center of everything we know — common sense often blinds us to the reality of our own interior world. It impoverishes our experience of the uncommonest, most delicate, most beautiful aspects of being and leads us, as I wrote in the prelude to Figuring, to mistake our labels and models of things for the things themselves.
How to lift the blinders of common sense that unfit us for seeing wonder is what Vladimir Nabokov (April 22, 1899–July 2, 1977) explores with uncommon wisdom, wit, and splendor of sentiment in a lecture he delivered at Wellesley College in 1941, titled “The Art of Literature and Commonsense” and later included in the superb posthumous 1980 volume Lectures on Literature (public library).
In the fall of 1811 Noah Webster, working steadily through the C’s, defined commonsense as “good sound ordinary sense . . . free from emotional bias or intellectual subtlety… horse sense.” This is rather a flattering view of the creature, for the biography of commonsense makes nasty reading. Commonsense has trampled down many a gentle genius whose eyes had delighted in a too early moonbeam of some too early truth; commonsense has back-kicked dirt at the loveliest of queer paintings because a blue tree seemed madness to its well-meaning hoof; commonsense has prompted ugly but strong nations to crush their fair but frail neighbors the moment a gap in history offered a chance that it would have been ridiculous not to exploit. Commonsense is fundamentally immoral, for the natural morals of mankind are as irrational as the magic rites that they evolved since the immemorial dimness of time. Commonsense at its worst is sense made common, and so everything is comfortably cheapened by its touch. Commonsense is square whereas all the most essential visions and values of life are beautifully round, as round as the universe or the eyes of a child at its first circus show.
This “sense made common” is, of course, the seedbed of so many of our social and civilizational biases — from the dogmatic geocentrism that nearly cost Galileo his life to the mindless majority rule against which James Baldwin so fervently admonished. It is the seedbed, therefore, of conformity and thus the enemy of a society’s progress, which presupposes that we rise above the common lot of beliefs and mores to imagine the uncommon, the alternative — an act so countercultural that, throughout history, those who have dared undertake it have been punished or ostracized. Kierkegaard knew this when he contemplated why we conform and asserted that “truth always rests with the minority, and the minority is always stronger than the majority, because the minority is generally formed by those who really have an opinion, while the strength of a majority is illusory, formed by the gangs who have no opinion.” Ben Shahn knew it when he observed in his fantastic Norton lectures at Harvard that “without the nonconformist, any society of whatever degree of perfection must fall into decay.”
With an eye to the innumerable offenses against sanity and justice perpetrated by an unquestioning adherence to so-called common sense, Nabokov adds:
It is instructive to think that there is not a single person in this room, or for that matter in any room in the world, who, at some nicely chosen point in historical space-time would not be put to death there and then, here and now, by a commonsensical majority in righteous rage. The color of one’s creed, neckties, eyes, thoughts, manners, speech, is sure to meet somewhere in time or space with a fatal objection from a mob that hates that particular tone. And the more brilliant, the more unusual the man, the nearer he is to the stake. Stranger always rhymes with danger. The meek prophet, the enchanter in his cave, the indignant artist, the nonconforming little schoolboy, all share in the same sacred danger. And this being so, let us bless them, let us bless the freak; for in the natural evolution of things, the ape would perhaps never have become man had not a freak appeared in the family. Anybody whose mind is proud enough not to breed true, secretly carries a bomb at the back of his brain; and so I suggest, just for the fun of the thing, taking that private bomb and carefully dropping it upon the model city of commonsense. In the brilliant light of the ensuing explosion many curious things will appear; our rarer senses will supplant for a brief spell the dominant vulgarian that squeezes Sinbad’s neck in the catch-as-catch-can match between the adopted self and the inner one. I am triumphantly mixing metaphors because that is exactly what they are intended for when they follow the course of their secret connections — which from a writer’s point of view is the first positive result of the defeat of commonsense.
The second result is that the irrational belief in the goodness of man… becomes something much more than the wobbly basis of idealistic philosophies. It becomes a solid and iridescent truth. This means that goodness becomes a central and tangible part of one’s world, which world at first sight seems hard to identify with the modern one of newspaper editors and other bright pessimists, who will tell you that it is, mildly speaking, illogical to applaud the supremacy of good at a time when something called the police state, or communism, is trying to turn the globe into five million square miles of terror, stupidity, and barbed wire. And they may add that it is one thing to beam at one’s private universe in the snuggest nook of an unshelled and well-fed country and quite another to try and keep sane among crashing buildings in the roaring and whining night. But within the emphatically and unshakably illogical world which I am advertising as a home for the spirit, war gods are unreal not because they are conveniently remote in physical space from the reality of a reading lamp and the solidity of a fountain pen, but because I cannot imagine (and that is saying a good deal) such circumstances as might impinge upon the lovely and lovable world which quietly persists, whereas I can very well imagine that my fellow dreamers, thousands of whom roam the earth, keep to these same irrational and divine standards during the darkest and most dazzling hours of physical danger, pain, dust, death.
Nabokov locates the antipode of common sense in “the supremacy of the detail over the general, of the part that is more alive than the whole, of the little thing which a man observes and greets with a friendly nod of the spirit while the crowd around him is being driven by some common impulse to some common goal.” Speaking at the peak of WWII, as John Steinbeck is writing on the other side of the continent that “all the goodness and the heroisms will rise up again, then be cut down again and rise up,” Nabokov offers:
I take my hat off to the hero who dashes into a burning house and saves his neighbor’s child; but I shake his hand if he has risked squandering a precious five seconds to find and save, together with the child, its favorite toy. I remember a cartoon depicting a chimney sweep falling from the roof of a tall building and noticing on the way that a sign-board had one word spelled wrong, and wondering in his headlong flight why nobody had thought of correcting it. In a sense, we all are crashing to our death from the top story of our birth to the flat stones of the churchyard and wondering with an immortal Alice in Wonderland at the patterns of the passing wall. This capacity to wonder at trifles — no matter the imminent peril — these asides of the spirit, these footnotes in the volume of life are the highest forms of consciousness, and it is in this childishly speculative state of mind, so different from commonsense and its logic, that we know the world to be good.