The Marginalian

A Stop-Motion Love Letter to the Power of Curiosity

“The more you know, the more you want to know… the more connections you can make between the different bits of knowledge… the more ideas you have, which is why curiosity is really the wellspring of creativity.”

“It is in our nature to explore, to reach out into the unknown,” wrote pioneering polar explorer Ernest Shackleton in reflecting on the feat that nearly took his life, adding: “The only true failure would be not to explore at all.” This vitalizing power of exploration applies as much to the exterior world we inhabit as it does to the interior. Upon turning eighty and looking back on his extraordinary life, Henry Miller observed: “Perhaps it is curiosity — about anything and everything — that made me the writer I am. It has never left me.” And yet in the century since Shackleton and the decades since Miller, despite the proliferation of access to knowledge, we seem to have lost our appetite for this singular human faculty that propels us forward. We’ve lulled ourselves into a kind of complacency, where too often we’d rather be right than uncertain or — worse yet — wrong, forgetting that “useful ignorance,” to borrow Thoreau’s beautiful term, is precisely what helps us transcend the limits of our knowledge and stretch our ability.

That vital force of self-transcendence is what Arts University Bournemouth student and self-taught animator Georgina Venning explores in her immeasurably delightful stop-motion animation of an excerpt from Ian Leslie’s RSA talk, based on his book Curious: The Desire to Know and Why Your Future Depends on It (public library).

The piece is one of the winners in the Moving Pictures category of the 2015 RSA Student Design Awards, which invite emerging designers and artists to examine social, environmental, and economic issues through compelling visual communication driven by design thinking. The category itself is an offshoot of RSA’s existing series of animated shorts, which has previously given us such gems as Susan Cain on the power of introverts and Brené Brown on vulnerability and the difference between empathy and sympathy.

Venning’s film is impressively meticulous beyond the beautiful papercraft — in order to create consistent natural light throughout the animation, she filmed one frame per day, at the exact same time of day.

Curiosity is a muscle — use it or lose it. It’s something that we consciously have to nurture in ourselves, in our families, in classrooms, at work.

Sometimes I hear that curiosity and creativity are killed by too many facts — but, actually, the opposite is true: The more you know, the more you want to know. Not only that, but the more you know, the more connections you can make between the different bits of knowledge that you have in your head and therefore the more ideas you have, which is why curiosity is really the wellspring of creativity.

Technology is replacing routine work — and that’s what technology replaces first and has done throughout history. So intellectually curious people — people who are capable of learning throughout their career, of asking questions (good questions), of adapting and collaborating with others from different disciplines; people who are capable of really thriving in this world of non-routine work, in other words — are the people who are going to do better.

In the introduction to the book, Leslie considers humanity’s historically contentious relationship with curiosity and writes:

Our oldest stories about curiosity are warnings: Adam and Eve and the apple of knowledge, Icarus and the sun, Pandora’s box. Early Christian theologians railed against curiosity: Saint Augustine claimed that “God fashioned hell for the inquisitive.” Even humanist philosopher Erasmus suggested that curiosity was greed by a different name. For most of Western history, it has been regarded as at best a distraction, at worst a poison, corrosive to the soul and to society.

There’s a reason for this. Curiosity is unruly. It doesn’t like rules, or, at least, it assumes that all rules are provisional, subject to the laceration of a smart question nobody has yet thought to ask. It disdains the approved pathways, preferring diversions, unplanned excursions, impulsive left turns. In short, curiosity is deviant. Pursuing it is liable to bring you into conflict with authority at some point, as everyone from Galileo to Charles Darwin to Steve Jobs could have attested.

A society that values order above all else will seek to suppress curiosity. But a society that believes in progress, innovation, and creativity will cultivate it, recognizing that the inquiring minds of its people constitute its most valuable asset. In medieval Europe, the inquiring mind — especially if it inquired too closely into the edicts of church or state — was stigmatized. During the Renaissance and Reformation, received wisdoms began to be interrogated, and by the time of the Enlightenment, European societies started to see that their future lay with the curious and encouraged probing questions rather than stamping on them. The result was the biggest explosion of new ideas and scientific advances in history.

The great unlocking of curiosity translated into a cascade of prosperity for the nations that precipitated it. Today, we cannot know for sure if we are in the middle of this golden period or at the end of it. But we are, at the very least, in a lull.

In the remainder of Curious, Leslie goes on to explore our best strategies for jolting ourselves out of that lull by cultivating more diverse modes of curiosity that ensure our flourishing in an increasingly complex world. Complement it with Isaac Asimov on curiosity and risk-taking and Marie Curie on curiosity, wonder, and the spirit of adventure in science.


Creative Courage for Young Hearts: 15 Emboldening Picture Books Celebrating the Lives of Great Artists, Writers, and Scientists

Jane Goodall, Julia Child, Pablo Neruda, Marie Curie, E.E. Cummings, Albert Einstein, Ella Fitzgerald, Antoine de Saint-Exupéry, Frida Kahlo, and more.

UPDATE: Since the publication of this piece in 2015, I have written about other excellent picture-book biographies of Edwin Hubble, Corita Kent, Keith Haring, Maria Mitchell, Ada Lovelace, Louise Bourgeois, Wangari Maathai, Virginia Woolf, Galileo, Nellie Bly, Paul Erdos, Louis Braille, Mary Lou Williams, John Lewis, Muddy Waters, Paul Gauguin, and Jane Jacobs.

Margaret Mead extolled the value of “spiritual and mental ancestors” in how we form our identity — those people to whom we aren’t related but whose values we try to cultivate in ourselves; role models we seek out not from our immediate genetic pool but from the pool of culture that surrounds us, past and present. Seneca saw in reading — one of the oldest and most reliable ways to identify and contact these cultural ancestors — a way of being adopted into the “households of the noblest intellects.” And what better time to meet such admirable models of personhood than in childhood, that fertile seedbed for the flowering of values and identity?

Collected here are fifteen wonderful picture-books celebrating such worthwhile “spiritual and mental ancestors.” It is, of course, an incomplete reading list, yet it is a deliberate one — a great many such books exist, but few feature the trifecta of wonderfulness: a cultural icon notable for his or her lasting contribution to humanity beyond mere fame; an intelligent and nuanced life-story lovingly told; and beautiful, imaginative illustrations rewarding in their own right. Please enjoy.

JANE GOODALL

“One should want only one thing and want it constantly,” young André Gide half-observed, half-resolved in his journal. “Then one is sure of getting it.” More than a century later, Werner Herzog wrote passionately of the “uninvited duty” that a sense of purpose plants in the heart, leaving one with “no choice but to push on.” That combination of desiring something with inextinguishable intensity — which begins with letting your life speak and daring to listen — and pursuing it with steadfast doggedness is perhaps the single common thread in the lives of those we most admire as luminaries of enduring genius. It is also at the heart of what it means to find your purpose and live it.

In Me…Jane (public library), celebrated cartoonist, author, and animal rights advocate Patrick McDonnell chronicles the early life of pioneering primatologist Jane Goodall (b. April 3, 1934) and tells the heartening story of how the seed planted by a childhood dream blossomed, under the generous beams of deep dedication, into the reality of a purposeful life.

McDonnell’s protagonist is not Jane Goodall the widely influential and wildly revered sage of science and the human spirit — one of a handful of people in history to hold both the titles Dame and Doctor — but little Jane, the ten-year-old girl who decided that she was going to work with animals in Africa when she grew up and, despite her family’s poverty, despite living in an era when girls were not encouraged to live the life of science or adventure, despite nearly everyone telling her that it was impossible, turned her dream into reality.

With simple, enormously expressive illustrations and an eloquent economy of words, McDonnell — creator of the beloved MUTTS comic strip — begins at the very beginning: that fateful day when little Jane was given a stuffed monkey named Jubilee.

Jane and Jubilee became inseparable, and she shared with him everything she loved — especially the outdoors. Together, they watched the birds and the spiders and the squirrels fill the backyard with aliveness.

At night, Jane and Jubilee read books to better understand what they saw.

One day, tickled to find out where eggs came from, they snuck into grandma’s chicken coop and observed the miracle of life.

It was a magical world full of joy and wonder, and Jane felt very much a part of it.

Jane liked to climb her beloved beech tree with Jubilee on her back, then sit perched on its branches reading and rereading Tarzan, imagining herself in place of that other Jane, wild and filled with wonder amid the jungles of Africa.

That dream soon became an all-consuming desire not just to go to Africa but to live there, trying to understand the animals and help them.

Every night Jane tucked Jubilee into bed and fell asleep with that dream, until one day — and such is the genius of McDonnell’s elegantly simple message of the dreamer’s doggedness — she awakes in a tent in the Gombe, the seedbed of what would become a remarkable career and an extraordinary life of purpose.

Goodall herself — who founded the heartening youth-led learning and community action initiative Roots & Shoots — writes in the afterword:

We cannot live through a single day without making an impact on the world around us — and we have a choice as to what sort of difference we make… Children are motivated when they can see the positive results their hard work can have.

See more, including a wonderful jazz tribute to Goodall, here.

PABLO NERUDA

Nobel laureate Pablo Neruda was not only one of the greatest poets in human history, but also a man of extraordinary insight into the human experience and the creative impulse — take, for instance, his remarkable reflection on what a childhood encounter taught him about why we make art, quite possibly the most beautiful metaphor for the creative impulse ever committed to paper.

His story and spirit spring alive in Pablo Neruda: Poet of the People (public library) by writer Monica Brown, with absolutely stunning illustrations and hand-lettering by artist Julie Paschkis.

The story begins with the poet’s birth in Chile in 1904 with the given name of Ricardo Eliecer Neftalí Reyes Basoalto — to evade his father’s disapproval of his poetry, he came up with the pen name “Pablo Neruda” at the age of sixteen when he first began publishing his work — and traces his evolution as a writer, his political awakening as an activist, his deep love of people and language and the luminosity of life.

Neftalí wasn’t very good at soccer or at throwing acorns like his friends, but he loved to read and discovered magic between the pages.

Embedded in the story is a sweet reminder of what books do for the soul and a heartening assurance that creative genius isn’t the product of conforming to common standards of excellence but of finding one’s element.

In fact, the book is as much a celebration of Neruda as it is a love letter to language itself — swirling through Paschkis’s vibrant illustrations are words both English and Spanish, beautiful words like “fathom” and “plummet” and “flicker” and “sigh” and “azul.”

Originally featured here.

E.E. CUMMINGS

“In a Cummings poem,” Susan Cheever wrote in her spectacular biography of E. E. Cummings, “the reader must often pick his way toward comprehension, which comes, when it does, in a burst of delight and recognition.” Such a burst is what rewards the reader, whatever his or her age, in Enormous Smallness: A Story of E. E. Cummings (public library) — an uncommonly delightful picture-book celebration of Cummings’s life by Brooklyn-based poet Matthew Burgess, illustrated by Kris Di Giacomo (the artist behind the wonderful alphabet book Take Away the A).

To reimagine the beloved poet’s life in a tango of word and image is quite befitting — unbeknownst to many, Cummings had a passion for drawing and once described himself as “an author of pictures, a draughtsman of words.”

The project comes from Brooklyn-based indie powerhouse Enchanted Lion Books — publisher of some of the most daring and tender children’s books of our time — and was first envisioned by ELB founder Claudia Zoe Bedrick, who approached Burgess about writing a children’s biography of Cummings. Miraculously, Burgess had visited Cummings’s home at 4 Patchin Place in New York City three years earlier, after a serendipitous encounter with the current resident — an experience that had planted a seed of quietly germinating obsession with the legendary poet’s life.

And so the collaboration stretched between them, as Cummings might say, like “a pleasant song” — Burgess and Bedrick worked side by side for four years to bring this wonder of a book to life.

The story begins with Cummings, already known as “E. E.” and living in his New York City home where he spent the last forty years of his life, typing away as the love of his life, the fashion model and photographer Marion Morehouse, summons him to tea-time with an elephant-shaped bell.

From there, Burgess takes the reader on an affectionate biographical detective story, tracing how Edward Estlin became E. E., what brought him to Manhattan from his native Cambridge, and how elephants (and trees, and birds) became his lifelong creative companions in the circus of his imagination.

Young Estlin’s first poem “poured out of his mouth when he was only three.”

With the loving support of the unsung champions with whom the history of creative culture is strewn — the mother who began recording his spontaneous recitations in a little book titled “Estlin’s Original Poems”; the father who stomped on his hands and knees, play-pretending into existence the mighty elephant that was little Estlin’s creative muse; the teacher who encouraged him to pursue his love of words; the uncle who gave him a book on how to write poetry — he eventually made it to Harvard.

There, he came upon the words of his favorite poet, John Keats — “I am certain of nothing but the holiness of the Heart’s affections and the truth of the Imagination” — which awakened young Estlin’s creative courage. After graduation, he began experimenting with poetry and moved to New York City, falling in love with its “irresistibly stupendous newness.”

But then World War I struck and Estlin went to France, volunteering as an ambulance-driver. While working in the French countryside, he was mistaken for a spy and sent to prison for several months.

When the war ended, he wrote a book about his experience, titled The Enormous Room. Estlin was reborn as E. E.

The following year, he published his first book of poems, Tulips & Chimneys.

Burgess writes:

Using a style all his own,
e. e. put lowercase letters where capitals normally go,
and his playful punctuation grabbed readers’ attention.

His poems were alive with experimentation
and surprise!

And because of his love for lowercase letters,
his name began to appear with two little e’s (& a little c, too).

But his expansive experimentation was too much for the small-minded literary pantheon:

Some people criticized him for painting with words.
Others said his poems were
too strange
too small.
Some said they were
no good at all.

And yet Cummings, who viewed society’s criteria for what it means to be a successful artist with mischievous wryness, was undeterred. A century before Neil Gaiman’s memorable advice that the artist’s only appropriate response to criticism is to make good art, Cummings embodied this ethos. Burgess captures this spirit with quiet elegance, weaving one of Cummings’s poems into the story:

But no matter what the world was giving or taking,
E. E. went right on dreaming and making.
For inside, he knew his poems were new and true.

love is a place

love is a place
& through this place of
love move
(with brightness of peace)
all places

yes is a world
& in this world of
yes live
(skillfully curled)
all worlds.

His poems were his way
of saying YES.

YES to the heart
and the roundness of the moon,
to birds, elephants, trees,
and everything he loved.

YES to spring, too
which always brought him back
to childhood, when the first
sign of his favorite season
was the whistling arrival
of the balloon man.

The book’s epigraph is a celebration of this unflinching yes-saying: “It takes courage to grow up and become who you really are.”

With that courage he catapulted himself into the open arms of those who also hungered for beauty and meaning, and became one of the world’s most beloved poets — a capital-A Artist of his own lowercase making.

Originally featured here.

ALBERT EINSTEIN

Albert Einstein (March 14, 1879–April 18, 1955) may have eventually bequeathed some excellent advice on the secret to learning anything, but the great scientist himself didn’t learn one of the most basic human skills — speaking — until he was nearly four years old. On a Beam of Light: A Story of Albert Einstein (public library) by Jennifer Berne, illustrated by Vladimir Radunsky — the talent behind Mark Twain’s irreverent Advice to Little Girls — tells the tale of how an unusual and awkward child blossomed into becoming “the quintessential modern genius” by the sheer virtue of his unrelenting curiosity.

The story begins with Albert’s birth — a beautiful but odd baby boy who turns one and doesn’t say a word, turns two, then three, and nary a word.

Instead, he “just looked around with his big curious eyes,” wondering about the world. His parents worried that there might be something wrong, but loved him unconditionally. And then:

One day, when Albert was sick in bed, his father brought him a compass — a small round case with a magnetic needle inside. No matter which way Albert turned the compass, the needle always pointed north, as if held by an invisible hand. Albert was so amazed his body trembled.

Suddenly, he knew there were mysteries in the world — hidden and silent, unknown and unseen. He wanted, more than anything, to understand those mysteries.

This was that pivotal spark of curiosity that catapulted his young mind into a lifetime of exploring those mysteries. (One can’t help but wonder whether a similar child, today, would have a similar awakening of mind while beholding a smartphone’s fully automated GPS map. But, perhaps, that modern child would be developing a wholly different type of intelligence.)

Young Albert began asking countless questions at home and at school — so much so, that his teachers chastised him for being a disturbance, admonishing the little boy that he would get nowhere in life unless he learned to follow the rules and behave like the other kids. And yet the mysteries of the universe drew Albert deeper into inquiry.

One day, while riding his bicycle, he gazed at the rays of sunlight beaming from the Sun to the Earth and wondered what it would be like to ride on them, transporting himself into that fantasy:

It was the biggest, most exciting thought Albert had ever had. And it filled his mind with questions.

So he set out to answer them by burying himself in books, reading and discovering the poetry of numbers, that special secret language for decoding the mysteries of the universe.

Once he graduated from college, unable to find a teaching position, he settled for a low-key, quiet government job that allowed him to spend plenty of time with his thoughts and his mathematical explorations, pondering the everyday enigmas of life, until his thoughts coalesced into ideas that made sense of it all — ideas about atoms and motion and space and time. Soon, Albert became an internationally celebrated man of genius.

But with that came the necessary amount of eccentricity — or at least what seemed eccentric from the outside, but is in fact a vital part of any creative mind. Albert, for instance, liked to play his violin when he was having a hard time solving a particularly tricky problem — a perfect way to engage the incubation stage of the creative process, wherein the mind, engulfed in unconscious processing, makes “no effort of a direct nature” in order to later arrive at “sudden illumination.”

Some of his habits, however, were decidedly, and charmingly, quirky: He regularly wandered around town eating an ice-cream cone, and he preferred to wear no socks — not because he tried to be a pseudo-nonconformist, but because he “even chose his clothes for thinking,” often clad in his signature “comfy, old saggy-baggy sweaters and pants.”

Still, everywhere he went, he remained mesmerized by the mysteries of the universe, and the echoes of his thoughts framed much of our modern understanding of the world:

Albert’s ideas helped build spaceships and satellites that travel to the moon and beyond. His thinking helped us understand the universe as no one ever had before.

And yet the central message of this altogether wonderful picture-book is that despite his genius — or, perhaps, precisely because of it — Einstein’s greatest legacy to us isn’t all the answers he bequeathed but all the open questions he left for today’s young minds to grow up pondering. Because, after all, it is “thoroughly conscious ignorance” that drives science and our understanding of life.

The final spread, reminiscent of these illustrated morphologies of Susan Sontag’s favorite things and Roland Barthes’s likes and dislikes, captures Einstein’s life in eight essentials:

Originally featured here.

ELLA FITZGERALD

From writer Roxanne Orgill and mixed-media artist Sean Qualls comes Skit-Scat Raggedy Cat: Ella Fitzgerald (public library) — the wonderfully illustrated rags-to-riches story of how The First Lady of Song sang her way from the streets of Yonkers to the cultural hall of fame, with a National Medal of Arts, a Presidential Medal of Freedom, and thirteen Grammys, including one for Lifetime Achievement.

From how she cranked the phonograph as a little girl to hear the Boswell Sisters’ honey-voices to how she saved her nickels to take the train to Harlem “forty-five minutes and a world away” for an audition to how her early passion for dancing became a lifelong love affair with song, the story captures not only her journey to public stardom but also the private gleam of this beautiful soul’s inner starlight.

For a touch of loveliness, interwoven throughout the biographical narrative are snippets of Fitzgerald’s most celebrated songs, extending to kids a warm invitation to discover the wonders of jazz — a modern-day counterpart to Langston Hughes’s vintage treasure The First Book of Jazz.

HENRI MATISSE

At 8PM on the last day of 1869, a little boy named Henri entered the world in a gray textile-mill town in the north of France, in a rundown two-room cottage with a leaky roof. He didn’t have much materially, but he was blessed with perhaps the greatest gift a child could have — an unconditionally loving, relentlessly supportive mother. Like many creative icons whose destinies were shaped by the unflinching encouragement of loved ones, little Henri became the great Henri Matisse thanks to his mother’s staunch support, which began with an unusual ignition spark: At the age of twenty, Henri was hospitalized for appendicitis and his mother brought him a set of art supplies with which to occupy his recovery. “From the moment I held the box of colors in my hands,” Matisse recounted, “I knew this was my life. I threw myself into it like a beast that plunges towards the thing it loves.” And that thing flowed from love, too — it was Matisse’s mother who encouraged her son, like E.E. Cummings encouraged all aspiring artists, to disregard the formal rules of art and instead paint from the heart. “My mother loved everything I did,” he asserted. Decades later, thanks to Gertrude Stein’s patronage, which catalyzed his career and sparked his friendship with Picasso, the world too would come to love what Matisse did.

In The Iridescence of Birds: A Book About Henri Matisse (public library), writer Patricia MacLachlan and illustrator Hadley Hooper tell the heartening story of young Henri’s childhood and how it shaped his artistic path long before he began painting — how his mother, in an attempt to brighten the drab and sunless days, put bright red rugs on the floors and painted colorful plates to hang on the walls, letting little Henri mix the paints; how his father gave him pigeons, whose iridescent plumage the boy observed with endless fascination; how the beautiful silks woven by the townspeople beguiled him with their bright patterns.

With a gentle sidewise gleam, the story offers a nuanced answer to the eternal nature-versus-nurture question of whether genius is born or made. Embedded in it is a wonderful testament to the idea that attentive presence rather than praise is the key to great parenting, especially when it comes to nurturing young talent. (Indeed, such maternal presence is what legendary editor Ursula Nordstrom provided for many of the young authors and artists — including, most notably, Maurice Sendak — whom she nurtured over the course of her reign as the twentieth century’s greatest patron saint of children’s books.)

For a delightful touch of empathy via a twist of perspective, MacLachlan places the reader in little Henri’s shoes:

If you were a boy named Henri Matisse who lived in a dreary town in northern France where the skies were gray

And the days were cold

And you wanted color and light

And sun,

And your mother, to brighten your days,

Painted plates to hang on the walls

With pictures of meadows and trees,

Rivers and birds,

And she let you mix the colors of paint…

… And you raised Pigeons

Watching their sharp eyes
And red feet,

And their colors that changed with the light
As they moved…

… Would it be a surprise that you became
A fine painter who painted
Light
and
Movement

And the iridescence of birds?

Beneath the biographical particulars of the story itself is MacLachlan’s larger inquiry into the enduring question of whether artists draw what they see or what they feel and remember — Matisse’s life, she writes in the afterword, attests to the fact that the two are inextricably entwined: “He painted his feelings and he painted his childhood.”

Hooper’s illustrations are themselves a masterwork of artistry, scholarship, and creative ingenuity. She spent considerable time studying Matisse’s sensibility and colors in reproductions of his drawings, cutouts, and paintings, then researched textile patterns from the era of his childhood and even used Google Maps to picture the actual streets that he walked as a little boy. The result is not imitation but dimensional celebration. Hooper reflects on the unusual and inventive technique she chose:

I decided to try relief printing, which forced me to simplify my shapes and allowed me to focus on the color and composition. I cut the characters and backgrounds out of stiff foam and cardboard, inked them up, made prints, and scanned the results into Photoshop. The approach felt right.

Originally featured here.

MARIE CURIE

Marie Curie (November 7, 1867–July 4, 1934) is one of the most extraordinary figures in the history of science and a tireless champion of curiosity and wonder. A pioneer in researching radioactivity, a field whose very name she coined, she was not only the first woman to win a Nobel Prize but also the first person to win two Nobel Prizes, in two different sciences: chemistry and physics. In Radioactive: Marie & Pierre Curie: A Tale of Love and Fallout (public library), artist Lauren Redniss tells the story of Curie through the two invisible but immensely powerful forces that guided her life: radioactivity and love. It’s a turbulent story — a passionate romance with Pierre Curie (honeymoon on bicycles!), the epic discovery of radium and polonium, Pierre’s sudden death in a freak accident in 1906, Marie’s affair with physicist Paul Langevin, her coveted second Nobel Prize — under which lie poignant reflections on the implications of Curie’s work more than a century later as we face ethically polarized issues like nuclear energy, radiation therapy in medicine, nuclear weapons, and more.

Most remarkable of all, however, is the thoughtfulness with which Redniss tailored her medium to her message, turning the book into a work of art in and of itself, every detail meticulously moulded to fit the essence of the narrative.

To stay true to Curie’s spirit and legacy, Redniss rendered her poetic artwork in an early-20th-century image printing process called cyanotype, critical to the discovery of both X-rays and radioactivity itself — a cameraless photographic technique in which paper is coated with light-sensitive chemicals. Once exposed to the sun’s UV rays, this chemically-treated paper turns a deep blue color. The text in the book is a unique typeface Redniss designed using the title pages of 18th- and 19th-century manuscripts from the New York Public Library archive. She named it Eusapia LR, for the croquet-playing, sexually ravenous Italian Spiritualist medium whose séances the Curies used to attend. The book’s cover is printed in glow-in-the-dark ink.

See more, including a behind-the-scenes look at Redniss’s impressive creative process, here.

HARVEY MILK

“Injustice anywhere is a threat to justice everywhere,” Martin Luther King, Jr. wrote in his indispensable 1963 letter from Birmingham City Jail. “We are caught in an inescapable network of mutuality.” One rainy January Sunday fifteen years later, long before Edie Windsor catalyzed the triumph of marriage equality, Harvey Milk (May 22, 1930–November 27, 1978) was sworn into office on the steps of San Francisco’s City Hall and became the first openly gay elected city official in America. His assassination eleven months later devastated millions and rendered him modernity’s great secular martyr for love. His tenure, however tragically brief, forever changed the landscape of civil rights.

In The Harvey Milk Story (public library) — a wonderful addition to the best LGBT children’s books — writer Kari Krakow and artist David Gardner tell the heartening and heartbreaking story of how a little boy with big ears grew up to hear the cry for social justice and how he answered it with a groundbreaking clarion call for equality in the kingdom of love.

Harvey was born the second child of a middle-class Jewish family in upstate New York. He was a boy at once brimming with joy, frequently entertaining the family by conducting an invisible orchestra in the living room, and full of deep sensitivity to the suffering of others.

He was deeply moved when his mother, Minnie, told him the story of the Warsaw Ghetto Jews who courageously defended themselves even as the Nazis outnumbered them — a story that imprinted him with a profound empathy for the oppressed even before he had a clear sense that he would grow up to be one of them.

Although Harvey was athletic and popular in school, he anguished under the burden of a deep wistfulness — by the time he was fourteen, he knew he was gay, but like many queer people of his time, he kept this centerpiece of identity a closely guarded secret for a great many years to come.

He came of age, after all, in an era when queer couples celebrated their love only in private and when geniuses as vital to humanity as computing pioneer Alan Turing were driven to suicide after being criminally prosecuted by the government for being gay.

After graduating from college, Harvey joined the Navy, becoming an expert deep-sea diver and ascending through the ranks until he came to head a submarine rescue vessel.

When he went to his brother Robert’s wedding, he looked so handsome in his navy uniform that his family and friends all wondered when he would settle down and get married to the “right girl.”

But instead, like the hero of the heartwarming King & King fairy tale, Harvey fell in love and settled down with the right boy, a young man named Joe.

They moved together to a little town in New York, where Harvey became a high school math and science teacher. But after six years, Harvey and Joe separated — as Krakow points out, the pressure to hide their relationship in fear of losing their jobs put an undue strain on their love. Weary of hiding his identity, Harvey moved to San Francisco’s gay-friendly Castro neighborhood — where queer couples walked down the street holding hands like any other couple would in any other city — and he fell in love again.

Together with Scott, his new partner, Harvey opened a small store called Castro Camera, which soon turned into a community center as Harvey became a one-man Craigslist, counseling neighbors on everything from finding apartments to applying for jobs.

The more Harvey listened to the people, the more he sensed that they needed a leader — not only an informal one, but one who fought on their behalf in the eyes of the law, standing up to the police who harassed them constantly and fighting against the daily indignities of discrimination, from which the political system failed to protect them. Harvey saw only one course of action — to run for office. His customers and the community embraced his campaign and volunteered their time.

Eleven-year-old Medora Payne came every day after school to lick envelopes and hand out brochures for Harvey. She organized a fundraiser at her school, earning $39.28 for his campaign.

Bigots believed that it wasn’t right or even possible for an openly gay candidate to be elected. Indeed, Harvey lost three consecutive election cycles between 1973 and 1976, but didn’t lose faith. He remained emboldened by the unflinching conviction that the rights of minorities — not only the LGBT community, but also African Americans, Asian Americans, senior citizens, and the disabled — weren’t adequately represented in and protected by the government. His people loved him for his dedication.

At last, in 1977, he was elected to the city’s Board of Supervisors and sworn into office the following January as Supervisor Milk. He immediately set out to champion greater quality of life for the people of the city — a kind of Robert Moses without the evil genius, bolstering the city’s parks, schools, and police protection. Eventually, he introduced a pioneering gay bill of rights. After ten of the city’s eleven supervisors voted for it, Mayor George Moscone signed it into law, proclaiming with gusto as Milk stood by his side:

I don’t do this enough, taking swift and unambiguous action on a substantial move for civil rights.

It was a historic moment, marked by a moving speech Milk made in front of City Hall, calling for a gay rights march in Washington.

But as the city celebrated, one man sat consumed with hateful bigotry and personal jealousy — Dan White, the only Supervisor who hadn’t voted for Milk’s bill and who had resigned from office in a petty act of protest, only to ask for his job back ten days later. Sensing his ill will, Mayor Moscone had refused to hire him back.

On a gloomy November morning, White crept into City Hall through a basement window, with a loaded gun. He barged into Moscone’s office and shot the mayor, promptly reloading his gun and heading down the hall to Harvey Milk’s office. Five shots echoed through the marble building.

Harvey Milk was dead.

People everywhere were stunned by the news of the double assassination. They left their homes, jobs, and schools to mourn the loss of these two great leaders. Crowds began forming in front of City Hall. By nightfall, thousands filled the mile-long stretch running from the Castro to City Hall. They stood in silence, carrying candles. That night the people of San Francisco wept.

Harvey Milk was gone, but his legacy only gained momentum in the fight for civil rights. The following October, a hundred thousand people brought his dream to life and took to the streets of Washington in the capital’s first-ever Gay Pride March, many carrying portraits of the slain San Francisco hero.

Thirty-four years later, one brave woman picked up where he left off and made possible a dream even Milk didn’t dare to dream — one which the president himself proclaimed “a victory for American democracy,” and whose triumphant road Milk had paved.

Originally featured here.

MARIA MERIAN

Inspired children’s books about science are woefully rare in our culture — as rare, perhaps, as are homages to pioneering female scientists and celebrations of the intersection of art and science. The confluence of these three rarities is what makes Summer Birds: The Butterflies of Maria Merian (public library) — a young-readers counterpart to Taschen’s lavish volume Maria Sibylla Merian: Insects of Surinam — so wonderful. Writer Margarita Engle and artist Julie Paschkis tell the story of 17th-century German naturalist and illustrator Maria Merian, whose studies of butterfly metamorphosis are among the most important contributions to the field of entomology in the history of science and forever transformed natural history illustration.

There are many ennobling and empowering threads to the story of Merian’s life — how she began studying insects as a young girl, two centuries before the dawn of science education for women; how she trained tirelessly in art, then brought those skills to illuminating science, all while raising her daughters; how she traveled to South America with her young daughter in an era when women had practically no agency of mobility; how she continued to work even after a stroke left her paralyzed.

But perhaps most pause-giving of all is the reminder of just how much superstition early scientists had to overcome in the service of simple truth: In Merian’s time, people considered insects evil and found the “supernatural” process of metamorphosis particularly ominous, believing it was witchcraft that transformed the insect from one state to another.

By meticulous and attentive observation, Merian proved that the process was very much a natural one, and beautifully so. She was only thirteen. Her groundbreaking work was a prescient testament to Richard Feynman’s famous assertion that science only adds to the mystery and the awe of the natural world.

When people understand the life cycles of creatures that change forms, they will stop calling small animals evil. They will learn, as I have, by seeing a wingless caterpillar turn into a flying summer bird.

On her site, Paschkis shares her research process and offers a fascinating history of insect illustration.

Originally featured here.

ANTOINE DE SAINT EXUPÉRY

“The Little Prince will shine upon children with a sidewise gleam. It will strike them in some place that is not the mind and glow there until the time comes for them to comprehend it.” So sang a 1943 review of The Little Prince, published the year before the beloved book’s author disappeared over the Mediterranean, never to return. But though it ultimately became the cause of his tragic death, Antoine de Saint-Exupéry’s experience as a pilot also informed the richness of his life and the expansive reach of his spirit, from his reflection on what his time in the Sahara desert taught him about the meaning of life to his beautiful meditation on the life-saving potential of a human smile. It was at the root of his identity and his imagination, and as such inspired the inception of The Little Prince.

That interplay between Saint-Exupéry the pilot and Saint-Exupéry the imaginative creator of a cultural classic is what celebrated Czech-born American children’s book author and illustrator Peter Sís explores in the beautiful graphic biography The Pilot and the Little Prince (public library) — a sensitive account of Saint-Exupéry’s life, underpinned by a fascinating chronicle of how aviation came to change humanity and a poignant undercurrent of political history, absolutely magical in its harmonized entirety.

Saint-Exupéry was born in 1900, in a golden age of discovery, just as the first airplanes were being invented in France and the dawn of aviation was radiating an exhilarating spirit of exploration and invention. Young Antoine quickly became enchanted with that exhilaration and at the age of twelve, he built a makeshift flying machine.

Sís writes:

It did not take off, but this didn’t discourage him.

That summer, he rode his bike to a nearby airfield every day to watch the pilots test planes. He told them he had permission from his mother to fly, so one pilot took him up in the air. His mother was not happy. Antoine couldn’t wait to go up again.

The obsession had permanently lodged itself into his psyche. When the war came and he was summoned to military duty, young Saint-Exupéry requested the air force but was assigned to the ground crew. Again, he remained unperturbed. Two years later, when he heard about a new airline operated by the postal service to deliver the mail, he got himself hired — first as a mechanic, and soon as a test pilot, eventually learning to fly by accompanying other pilots on mail routes. Sís writes:

One day, he heard the news he had been waiting for: he would fly the mail from France to Spain by himself. Henri Guillaumet, another pilot and later Antoine’s good friend, told him not just to depend on the map but to follow the face of the landscape.

Saint-Exupéry was living his dream, flying in Europe and West Africa. Eventually, the airline assigned him to an airfield in Cape Juby in southern Morocco, and the two years he spent in the desert were among the happiest in his life, a period he would go on to cherish with beautiful and bittersweet wistfulness for the rest of his days. Sís captures the romantic poetics of the experience:

He lived in a wooden shack and had few belongings and fewer visitors. With an ocean on one side and desert everywhere else, it seemed like one of the loneliest places in the world. But he loved the solitude and being under millions of stars.

The locals came to call him Captain of the Birds as he rescued stranded pilots and appeased hostile nomads who had shot down planes and kidnapped flyers. His time in the desert became powerful fuel for his writing and the raw inspiration for The Little Prince. But the skies remained his greatest love. Sís traces the trajectory of Saint-Exupéry’s travels and passions:

Eager to explore other skies, Antoine joined his fellow aviators in creating new mail routes in South America. Nothing could stop them as they crossed glaciers, rain forests, and mountain peaks, battling fierce winds and wild storms.

Antoine spent more time in the air here than anywhere else because the pilots now also flew at night. With stars above and lights below, his world felt both immense and small.

Upon returning to France, Saint-Exupéry fell in love, got married, and reached significant fame as both a pilot and an author. But driven by his chronic adventurer’s restlessness, he continued to dream up expeditions that came to border on stunts. In one, he competed for a prize for the fastest flight between Paris and Saigon, but he and his copilot crashed in North Africa, surviving by a hair and wandering the desert for days before being rescued. In another, he set out to become the first French pilot to fly from New York to the tip of South America. The plane crashed near Guatemala City but, miraculously, he survived once more.

As World War II engulfed Europe, Saint-Exupéry was called for military duty once more, this time as a pilot, observing from high in the skies the atrocities the Germans inflicted all over. Once his war service ended, he decided he couldn’t continue to live in France under German occupation and fled to Portugal on a ship — a trip that would stir the very foundations of his soul and inspire his magnificent Letter to a Hostage — eventually ending up in New York, where he found himself lonesome and alienated.

After writing Flight to Arras and sending a copy to President Roosevelt with the inscription “For President Franklin Roosevelt, whose country is taking on the heavy burden of saving the world,” Saint-Exupéry bought a set of watercolor paints and began working on the illustrations for the story that would become The Little Prince. Sís captures the layered message of the book, informed both by Saint-Exupéry’s passions and his forlorn homesickness, with beautiful simplicity:

He described a planet more innocent than his own, with a boy who ventured far from home, questioned how things worked, and searched for answers.

But the author grew increasingly restless once more. Longing to fly again and to see his family, who had remained in France, he rejoined his old squadron in North Africa, requesting flights that would take him back to France. Sís captures the tragic bluntness of how Saint-Exupéry’s story ended, at once almost sterile in its abruptness and richly poetic in the context of his lifelong obsession:

On July 31, 1944, at 8:45am, he took off from Borgo, Corsica, to photograph enemy positions east of Lyon. It was a beautiful day. He was due back at 12:30.

But he never returned. Some say he forgot his oxygen mask and vanished at sea.

Maybe Antoine found his own glittering planet next to the stars.

Originally featured here.

IBN SINA

Humanity’s millennia-old quest to understand the human body is strewn with medical history milestones, but few individual figures merit as much credit as Persian prodigy-turned-polymath Ibn Sina (c. 980–1037 CE), commonly known in the West as Avicenna — one of the most influential thinkers in our civilization’s unfolding story. He authored 450 known works spanning physics, philosophy, astronomy, mathematics, logic, poetry, and medicine, including the seminal encyclopedia The Canon of Medicine, which forever changed our understanding of the human body and its inner workings. This masterwork of science and philosophy — or metaphysics, as it was then called — remained in use as a centerpiece of medieval medical education until six hundred years after Ibn Sina’s death.

His story comes to life in The Amazing Discoveries of Ibn Sina (public library) by Lebanese writer Fatima Sharafeddine, Iran-based Iraqi illustrator Intelaq Mohammed Ali, and Canadian indie powerhouse Groundwood Books — a fine addition to the loveliest children’s books celebrating science.

In stunning illustrations reminiscent of ancient Islamic manuscript paintings, this lyrical first-person biography traces Ibn Sina’s life from his childhood as a voracious reader to his numerous scientific discoveries to his lifelong project of advancing the art of healing.

A universal celebration of curiosity and the unrelenting pursuit of knowledge, the story is doubly delightful for adding a sorely needed touch of diversity to the homogenous landscape of both science history and contemporary children’s books — here are two Middle Eastern women, telling the story of a pioneering scientist from the Islamic Golden Age.

Originally featured here.

FRIDA KAHLO

Mexican painter Frida Kahlo (July 6, 1907–July 13, 1954) was a woman of vibrantly tenacious spirit who overcame an unfair share of adversity to become one of humanity’s most remarkable artists and a wholehearted human being out of whom poured passionate love letters and compassionate friend-letters.

The polio she contracted as a child left her right leg underdeveloped — an imperfection she’d later come to disguise with her famous colorful skirts. As a teenager, having just become one of only thirty-five female students at Mexico’s prestigious Preparatoria school, Kahlo was in a serious traffic accident that sent an iron rod through her stomach and uterus. She spent three months in a full-body cast and even though the doctors didn’t believe it possible, she willed her way to walking again. Although the remainder of her life was strewn with relapses of extreme pain, frequent hospital visits, and more than thirty operations, that initial recovery period was a crucial part of her creative journey.

True to Roald Dahl’s conviction that illness emboldens creativity, Kahlo made her first strides in painting while bedridden, as a way of occupying herself, painting mostly her own image. Today, she remains best-known for her vibrant self-portraits, which comprise more than a third of her paintings, blending motifs from traditional Mexican art with a surrealist aesthetic. Above all, she became a testament to the notion that we can transcend external limitations to define our scope of possibility.

Kahlo’s singular spirit and story spring to life in the immeasurably wonderful Viva Frida (public library) by writer/illustrator Yuyi Morales and photographer Tim O’Meara.

In simple, lyrical words and enchanting photo-illustrations, this dreamlike bilingual beauty tells the story of an uncommon Alice in a luminous Wonderland of her own making.

Morales, who painstakingly handcrafted all the figurines and props and staged each vignette, writes in the afterword:

When I think of Frida Kahlo, I think of orgullo, pride. Growing up in Mexico, I wanted to know more about this woman with her mustache and unibrow. Who was this artist who had unapologetically filled her paintings with old and new symbols of Mexican culture in order to tell her own story?

I wasn’t always so taken by Frida. When I was younger, I often found her paintings tortuous and difficult to understand. The more I learned about Frida’s life, the more her paintings began to take on new light for me. I finally saw that what had terrified me about Frida’s images was actually her way of expressing the things she felt, feared, and wanted.

[…]

Her work was proud and unafraid and introduced the world to a side of Mexican culture that had been hidden from view.

As a child, while learning to draw, I would often study my own reflection in the mirror and think about Frida. Did she know how many artists she influenced with her courage and her ability to overcome her own limitations?

See more, including a behind-the-scenes look at Morales’s meticulous craftsmanship and creative process, here.

ERNEST SHACKLETON

In August of 1914, legendary British explorer Ernest Shackleton led his brave crew of men and dogs on a journey to the end of the world — the enigmatic continent of Antarctica. That voyage — monumental both historically and scientifically — would become the last expedition of the Heroic Age of Antarctic Exploration, which stretched from 1888 to 1914. From Flying Eye Books — the children’s book imprint of British indie press Nobrow, which gave us Freud’s comic biography, Blexbolex’s brilliant No Man’s Land and some gorgeous illustrated histories of aviation and the Space Race — comes Shackleton’s Journey (public library), a magnificent chronicle by emerging illustrator William Grill, whose affectionate and enchanting colored-pencil drawings bring to life the legendary explorer and his historic expedition.

As Grill tells us in the introduction, Shackleton was a rather extraordinary character:

Shackleton was the second of ten children. From a young age, Shackleton complained about teachers, but he had a keen interest in books, especially poetry — years later, on expeditions, he would read to his crew to lift their spirits. Always restless, the young Ernest left school at 16 to go to sea. After working his way up the ranks, he told his friends, “I think I can do something better, I want to make a name for myself.”

And make it he did. Reflecting on the inescapable allure of exploration, which carried him through his life of adventurous purpose, Shackleton once remarked:

I felt strangely drawn to the mysterious south. I vowed to myself that some day I would go to the region of ice and snow, and go on and on ’til I came to one of the poles of the Earth, the end of the axis on which this great round ball turns.

From the funding and recruitment of the famed expedition, to the pioneering engineering of the Endurance ship, to the taxonomy of crew members, dogs, and supplies, Grill traces Shackleton’s tumultuous journey from the moment the crew set sail to their misfortune-induced change of plans and soul-wrenching isolation “500 miles away from the nearest civilization” to their eventual escape from their icy prison and salvation on the shores of Elephant Island.

As a lover of dogs and visual lists, especially illustrated lists and dog-themed illustrations, I was especially taken with Grill’s visual inventories of equipment and dogs:

Despite the gargantuan challenges and life-threatening curveballs, Shackleton’s expedition drew to a heroic close without the loss of a single life. It is a story of unrelenting ambition to change the course of history, unflinching courage in the face of formidable setbacks, and above all optimism against all odds — the same optimism that emanates with incredible warmth from Grill’s tender illustrations.

Years later, Shackleton himself captured the spirit that carried them:

I chose life over death for myself and my friends… I believe it is in our nature to explore, to reach out into the unknown. The only true failure would be not to explore at all.

Originally featured here.

JULIA CHILD

Legendary chef Julia Child (August 15, 1912–August 13, 2004) not only revolutionized the world of cookbooks but was also a remarkable beacon of entrepreneurship and perseverance more than a decade before women started raising their voices in the media world. Her unrelenting spirit and generous heart cast her as one of modern history’s most timeless role models, and that’s precisely what writer and illustrator Jessie Hartland celebrates in the endlessly wonderful Bon Appetit! The Delicious Life of Julia Child (public library) — a heartening illustrated biography of the beloved chef, intended to enchant young readers with her story but certain to delight all of us. Hartland’s vibrant drawings — somewhere between Maira Kalman, Wendy MacNaughton, and Vladimir Radunsky — exude the very charisma that made Child an icon, and infuse her legacy with fresh joy.

Amidst the beautiful illustrations are practical glimpses of Child’s culinary tricks and the context of her recipes:

At the end of the story, as at the end of her life, Child emerges not only as a masterful cook but also as a fierce entrepreneur, a humble human, and a restlessly creative soul.

Originally featured here.

HENRI ROUSSEAU

“People working in the arts engage in street combat with The Fraud Police on a daily basis,” Amanda Palmer wrote in her fantastic manifesto for the creative life, one of the best books of the year, “because much of our work is new and not readily or conventionally categorized.” Few artists in history have lived through this street combat with more dignity and resilience of spirit than French Post-Impressionist painter Henri Rousseau (May 21, 1844–September 2, 1910). Long before history came to celebrate him as one of the greatest artists of his era, long before he was honored with major retrospectives at such iconic institutions as MoMA and the Tate, long before Sylvia Plath began weaving homages to him into her poetry, he spent a lifetime being not merely dismissed but ridiculed. And yet Rousseau — who was born into poverty, began working alongside his plumber father as a young boy, still worked as a toll collector by the age of forty, and was entirely self-taught in painting — withstood the unending barrage of harsh criticism with which his art was met during his entire life, and continued to paint from a deep place of creative conviction, with an irrepressible impulse to make art anyway.

In The Fantastic Jungles of Henri Rousseau (public library), writer Michelle Markel and illustrator Amanda Hall tell an emboldening real-life story, and a stunningly illustrated one, of remarkable resilience and optimism in the face of public criticism; of cultivating a center so solid and a creative vision so unflinching that no outside attack can demolish it and obstruct its transmutation into greatness; of embodying Ray Bradbury’s capacity for weathering the storm of rejection and Picasso’s conviction about never compromising in one’s art.

Henri Rousseau wants to be an artist.
Not a single person has ever told him he is talented.
He’s a toll collector.
He’s forty years old.

But he buys some canvas, paint, and brushes, and starts painting anyway.

Rousseau’s impulse for art sprang from his deep love of nature — a manifestation of the very thing that seventeen-year-old Virginia Woolf intuited when she wrote in her diary that the arts “imitate as far as they can the one great truth that all can see.”

Unable to afford art lessons, Rousseau educated himself by going to the Louvre to study the paintings of his favorite artists and examining photographs, magazines, and catalogs to learn about the anatomy of the human body.

At the age of forty-one, he showed his work as part of a big art exhibition, but his art — vibrant, flat, seemingly childish — was met, as Markel writes, with “only mean things.” Even so, Rousseau saved the reviews and pasted them into his scrapbook.

With his voracious appetite for inspiration, Rousseau visited the World’s Fair, where he was especially enchanted by the exhibits of exotic lands. “They remind him of adventure stories he loved when he was a boy,” Markel writes. The vivid images haunted him for days, until he finally turned to the easel to exorcise his restless imagination.

He holds his paintbrush to the canvas. A tiger crawls out. Lightning strikes, and wind whips the jungle grass.

Sometimes Henri is so startled by what he paints that he has to open the window to let in some air.

But for all his earnest creative exuberance, he is met with derision.

Every year Henri goes back to the art exhibition to show new paintings. He fusses over the canvases and retouches them until the last minute.

And every year the art experts make fun of him. They say it looks like he closed his eyes and painted with his feet.

And yet Rousseau manages to embody Georgia O’Keeffe’s credo that “whether you succeed or not is irrelevant… making your unknown known is the important thing” — he continues to paint, to study nature, and to rejoice in the process itself.

One night, he dreams up a painting of which he is especially proud, depicting a lion looking over a sleeping gypsy with friendly curiosity.

Once again he takes his work to the art show. This time, perhaps, he’ll please the experts. His pulse races.

The experts say he paints like a child. “If you want to have a good laugh,” one of them writes, “go see the paintings by Henri Rousseau.”

By now Henri is used to the nasty critics. He knows his shapes are simpler and flatter than everyone else’s, but he thinks that makes them lovely.

Everything he earns by giving music lessons, he spends on art supplies. But he lives by Thoreau’s definition of success.

His home is a shabby little studio, where one pot of stew must last the whole week. But every morning he wakes up and smiles at his pictures.

At sixty-one, Rousseau is still living in poverty, but happily paints his jubilant junglescapes. He continues to hope for critical acclaim and continues to be denied it, cruelly, by the “experts,” one of whom even says that “only cavemen would be impressed by his art.”

At last, Rousseau, already an old man, gets a break — but the recognition comes from a new generation of younger artists, who befriend him and come to admire his work. More than his talent and his stomach for criticism, however, one comes to admire his immensely kind and generous heart.

Whenever Henri has money to spare, he stages a concert in his little studio, and all the artists come. Along with the grocer, locksmith, and other folks from the neighborhood, they listen to Henri’s students and friends play their musical instruments. Henri gives the shiniest, reddest apples to the children.

Eventually, even Picasso pays heed and throws old Henri a banquet, at which “the old man sits upon a makeshift throne” playing his violin as people dance and celebrate around him, his heart floating “like a hot-air balloon above the fields.”

At the end of his life, Rousseau paints his masterwork “The Dream” and finally becomes successful by a public standard as the critics, at last, grant him acclaim. But the beautiful irony and the ennobling message of the story is that he was successful all along, for he had found his purpose — a feat with which even Van Gogh struggled for years — and filled each day with the invigorating joy of making his unknown known.

A hundred years later, the flowers still blossom, the monkeys still frolic, and the snakes keep slithering through Henri’s hot jungles. His paintings now hang in museums all over the world. And do you think experts call them “foolish,” “clumsy,” or “monstrous”? Mais non! They call them works of art.

By an old man,
by a onetime toll collector,
by one of the most gifted self-taught artists in history:
Henri Rousseau

Originally featured here.

* * *

For a different, more grownup celebration of notable lives, complement these children’s-books treasures with the graphic-novel biographies of Sigmund Freud, Salvador Dalí, Karl Marx, Robert Moses, Andy Warhol, Charles Darwin, Francis Bacon, Richard Feynman, Steve Jobs, and Hunter S. Thompson.

BP

This Idea Must Die: Some of the World’s Greatest Thinkers Each Select a Major Misconception Holding Us Back

From the self to left brain vs. right brain to romantic love, a catalog of broken theories that hold us back from the conquest of Truth.

“To kill an error is as good a service as, and sometimes even better than, the establishing of a new truth or fact,” asserted Charles Darwin in one of the eleven rules for critical thinking known as Prospero’s Precepts. If science and human knowledge progress in leaps and bounds of ignorance, then the recognition of error and the transcendence of falsehood are the springboard for the leaps of progress. That’s the premise behind This Idea Must Die: Scientific Theories That Are Blocking Progress (public library) — a compendium of answers Edge founder John Brockman collected by posing his annual question — “What scientific idea is ready for retirement?” — to 175 of the world’s greatest scientists, philosophers, and writers. Among them are Nobel laureates, MacArthur geniuses, and celebrated minds like theoretical physicist and mathematician Freeman Dyson, biological anthropologist Helen Fisher, cognitive scientist and linguist Steven Pinker, media theorist Douglas Rushkoff, philosopher Rebecca Newberger Goldstein, psychologist Howard Gardner, social scientist and technology scholar Sherry Turkle, actor and author Alan Alda, futurist and Wired founding editor Kevin Kelly, and novelist, essayist, and screenwriter Ian McEwan.

Brockman paints the backdrop for the inquiry:

Science advances by discovering new things and developing new ideas. Few truly new ideas are developed without abandoning old ones first. As theoretical physicist Max Planck (1858–1947) noted, “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.” In other words, science advances by a series of funerals.

Many of the answers are redundant — but this is a glorious feature rather than a bug of Brockman’s series, for its chief reward is precisely this cumulative effect of discerning the zeitgeist of ideas with which some of our era’s greatest minds are tussling in synchronicity. They point to such retirement-ready ideas as IQ, the self, race, the left brain vs. right brain divide, human nature and essentialism, free will, and even science itself. What emerges is the very thing Carl Sagan deemed vital to truth in his Baloney Detection Kit — a “substantive debate on the evidence by knowledgeable proponents of all points of view.”

Illustration by Lizi Boyd from ‘Flashlight.’

One of the most profound undercurrents across the answers has to do with our relationship with knowledge, certainty, and science itself. And one of the most profound contributions in that regard comes from MacArthur fellow Rebecca Newberger Goldstein, a philosopher who thinks deeply and dimensionally about some of the most complex questions of existence. Assailing the idea that science makes philosophy obsolete — that science is the transformation of “philosophy’s vagaries into empirically testable theories” and philosophy merely the “cold-storage room in which questions are shelved until the sciences get around to handling them” — Goldstein writes:

The obsolescence of philosophy is often taken to be a consequence of science. After all, science has a history of repeatedly inheriting — and definitively answering — questions over which philosophers have futilely hemmed and hawed for unconscionable amounts of time.

The gravest problem with this theory, Goldstein notes, is its internal incoherence:

You can’t argue for science making philosophy obsolete without indulging in philosophical arguments… When pressed for an answer to the so-called demarcation problem, scientists almost automatically reach for the notion of “falsifiability” first proposed by Karl Popper. His profession? Philosophy. But whatever criterion you offer, its defense is going to implicate you in philosophy.

This is something that Dorion Sagan, Carl Sagan’s son, has previously addressed, but Goldstein brings to it unparalleled elegance of thought and eloquence of expression:

A triumphalist scientism needs philosophy to support itself. And the lesson here should be generalized. Philosophy is joined to science in reason’s project. Its mandate is to render our views and our attitudes maximally coherent.

In doing so, she argues, philosophy provides “the reasoning that science requires in order to claim its image as descriptive.” As a proponent of the vital difference between information and wisdom — the former being the material of science, the latter the product of philosophy, and knowledge the change agent that transmutes one into the other — I find the provocative genius of Goldstein’s conclusion enormously invigorating:

What idea should science retire? The idea of “science” itself. Let’s retire it in favor of the more inclusive “knowledge.”

Neuroscientist Sam Harris, author of the indispensable Waking Up: A Guide to Spirituality Without Religion, echoes this by choosing our narrow definition of “science” as the idea to be put to rest:

Search your mind, or pay attention to the conversations you have with other people, and you’ll discover that there are no real boundaries between science and philosophy — or between those disciplines and any other that attempts to make valid claims about the world on the basis of evidence and logic. When such claims and their methods of verification admit of experiment and/or mathematical description, we tend to say our concerns are “scientific”; when they relate to matters more abstract, or to the consistency of our thinking itself, we often say we’re being “philosophical”; when we merely want to know how people behaved in the past, we dub our interests “historical” or “journalistic”; and when a person’s commitment to evidence and logic grows dangerously thin or simply snaps under the burden of fear, wishful thinking, tribalism, or ecstasy, we recognize that he’s being “religious.”

The boundaries between true intellectual disciplines are currently enforced by little more than university budgets and architecture… The real distinction we should care about — the observation of which is the sine qua non of the scientific attitude — is between demanding good reasons for what one believes and being satisfied with bad ones.

In a sentiment that calls to mind both Richard Feynman’s spectacular ode to a flower and Carl Sagan’s enduring wisdom on our search for meaning, Harris applies this model of knowledge to one of the great mysteries of science and philosophy — consciousness:

Even if one thinks the human mind is entirely the product of physics, the reality of consciousness becomes no less wondrous, and the difference between happiness and suffering no less important. Nor does such a view suggest that we’ll ever find the emergence of mind from matter fully intelligible; consciousness may always seem like a miracle. In philosophical circles, this is known as “the hard problem of consciousness” — some of us agree that this problem exists, some of us don’t. Should consciousness prove conceptually irreducible, remaining the mysterious ground for all we can conceivably experience or value, the rest of the scientific worldview would remain perfectly intact.

The remedy for all this confusion is simple: We must abandon the idea that science is distinct from the rest of human rationality. When you are adhering to the highest standards of logic and evidence, you are thinking scientifically. And when you’re not, you’re not.

Illustration from ‘Once Upon an Alphabet’ by Oliver Jeffers.

Psychologist Susan Blackmore, who studies this very problem — famously termed “the hard problem of consciousness” by philosopher David Chalmers in 1995 — benches the idea that there are neural correlates to consciousness. Tempting as it may be to interpret neural activity as the wellspring of that special something we call “consciousness” or “subjective experience,” as opposed to the “unconscious” rest of the brain, Blackmore admonishes that such dualism is past its cultural expiration date:

Dualist thinking comes naturally to us. We feel as though our conscious experiences were of a different order from the physical world. But this is the same intuition that leads to the hard problem seeming hard. It’s the same intuition that produces the philosopher’s zombie — a creature identical to me in every way except that it has no consciousness. It’s the same intuition that leads people to write, apparently unproblematically, about brain processes being either conscious or unconscious… Intuitively plausible as it is, this is a magic difference. Consciousness is not some weird and wonderful product of some brain processes but not others. Rather, it’s an illusion constructed by a clever brain and body in a complex social world. We can speak, think, refer to ourselves as agents, and so build up the false idea of a persisting self that has consciousness and free will.

Much of the allure of identifying such neural correlates of consciousness, Blackmore argues, lies in cultural mythologies rooted in fantasy rather than fact:

While people are awake they must always be conscious of something or other. And that leads along the slippery path to the idea that if we knew what to look for, we could peer inside someone’s brain and find out which processes were the conscious ones and which the unconscious ones. But this is all nonsense. All we’ll ever find are the neural correlates of thoughts, perceptions, memories, and the verbal and attentional processes that lead us to think we’re conscious.

When we finally have a better theory of consciousness to replace these popular delusions, we’ll see that there’s no hard problem, no magic difference, and no NCCs.

Illustration by Rob Hunter from ‘A Graphic Cosmogony.’

In a related grievance, social psychologist Bruce Hood — author of the uncomfortable yet strangely comforting The Self Illusion — does away with the notion of the self. Half a century after Alan Watts enlisted Eastern philosophy in this mission, Hood presents a necessary integration of science and philosophy:

It seems almost redundant to call for the retirement of the free willing self, as the idea is neither scientific nor is this the first time the concept has been dismissed for lack of empirical support. The self did not have to be discovered; it’s the default assumption most of us experience, so it wasn’t really revealed by methods of scientific inquiry.

[…]

Yet the self, like a conceptual zombie, refuses to die. It crops up again and again in recent theories of decision making, as an entity with free will which can be depleted. It reappears as an interpreter in cognitive neuroscience, as able to integrate parallel streams of information arising from separable neural substrates. Even if these appearances of the self are understood to be convenient ways of discussing the emergent output of multiple parallel processes, students of the mind continue to implicitly endorse the idea that there’s a decision maker, an experiencer, a point of origin.

We know the self is constructed because it can be so easily deconstructed — through damage, disease, and drugs. It must be an emergent property of a parallel system processing input, output, and internal representations. It’s an illusion because it feels so real, but that experience is not what it seems. The same is true for free will. Although we can experience the mental anguish of making a decision… the choices and decisions we make are based on situations that impose on us. We don’t have the free will to choose the experiences that have shaped our decisions.

[…]

By abandoning the free willing self, we’re forced to reexamine the factors that are truly behind our thoughts and behavior and the way they interact, balance, override, and cancel out. Only then will we begin to make progress in understanding how we really operate.

Illustration by Ben Newman from ‘A Graphic Cosmogony.’

Among the most provocative answers, in fact, is one examining the factors that underlie one of the most complex and seemingly human of our experiences: love. Biological anthropologist Helen Fisher, who studies the brain on love, points to romantic love and addiction as two concepts in need of serious reformulation and reframing — one best accomplished by understanding the intersection of the two. Fisher argues that we ought to broaden the definition of addiction and do away with science’s staunch notion that all addiction is harmful. Love, she argues with a wealth of neurobiological evidence in hand, is in fact a state that closely resembles that of addiction in terms of what happens in the brain during it — and yet love, anguishing as it may be at times, is universally recognized as the height of positive experience. In that respect, it presents a case of “positive addiction.” Fisher writes:

Love-besotted men and women show all the basic symptoms of addiction. Foremost, the lover is stiletto-focused on his/her drug of choice, the love object. The lover thinks obsessively about him or her (intrusive thinking), and often compulsively calls, writes, or stays in touch. Paramount in this experience is intense motivation to win one’s sweetheart, not unlike the substance abuser fixated on the drug. Impassioned lovers distort reality, change their priorities and daily habits to accommodate the beloved, experience personality changes (affect disturbance), and sometimes do inappropriate or risky things to impress this special other. Many are willing to sacrifice, even die for, “him” or “her.” The lover craves emotional and physical union with the beloved (dependence). And like addicts who suffer when they can’t get their drug, the lover suffers when apart from the beloved (separation anxiety). Adversity and social barriers even heighten this longing (frustration attraction).

In fact, besotted lovers express all four of the basic traits of addiction: craving, tolerance, withdrawal, and relapse. They feel a “rush” of exhilaration when they’re with their beloved (intoxication). As their tolerance builds, they seek to interact with the beloved more and more (intensification). If the love object breaks off the relationship, the lover experiences signs of drug withdrawal, including protest, crying spells, lethargy, anxiety, insomnia or hypersomnia, loss of appetite or binge eating, irritability, and loneliness. Lovers, like addicts, also often go to extremes, sometimes doing degrading or physically dangerous things to win back the beloved. And lovers relapse the way drug addicts do. Long after the relationship is over, events, people, places, songs, or other external cues associated with their abandoning sweetheart can trigger memories and renewed craving.

Fisher points to fMRI studies that have shown intense romantic love to trigger the brain’s reward system and the dopamine pathways responsible for “energy, focus, motivation, ecstasy, despair, and craving,” as well as the brain regions most closely associated with addiction and substance abuse. In shedding light on the neurochemical machinery of romantic love, Fisher argues, science reveals it to be a “profoundly powerful, natural, often positive addiction.”

Illustration by Christine Rösch from ‘The Mathematics of Love.’

Astrophysicist Marcelo Gleiser, who has written beautifully about the necessary duality of knowledge and mystery, wants to do away with “the venerable notion of Unification.” He points out that smaller acts of unification and simplification are core to the scientific process — from the laws of thermodynamics to Newton’s law of universal gravitation — but simplification as sweeping as reducing the world to a single Theory of Everything is misplaced:

The trouble starts when we take this idea too far and search for the Über-unification, the Theory of Everything, the arch-reductionist notion that all forces of nature are merely manifestations of a single force. This is the idea that needs to go.

Noting that at some point along the way, “math became equated with beauty and beauty with truth,” Gleiser writes:

The impulse to unify it all runs deep in the souls of mathematicians and theoretical physicists, from the Langlands program to superstring theory. But here’s the rub: Pure mathematics isn’t physics. The power of mathematics comes precisely from its detachment from physical reality. A mathematician can create any universe she wants and play all sorts of games with it. A physicist can’t; his job is to describe nature as we perceive it. Nevertheless, the unification game has been an integral part of physics since Galileo and has produced what it should: approximate unifications.

And yet this unification game, as integral as it may be to science, is also antithetical to it in the long run:

The scientific impulse to unify is crypto-religious… There’s something deeply appealing in equating all of nature to a single creative principle: To decipher the “mind of God” is to be special, is to answer to a higher calling. Pure mathematicians who believe in the reality of mathematical truths are monks of a secret order, open only to the initiated. In the case of high energy physics, all unification theories rely on sophisticated mathematics related to pure geometric structures: The belief is that nature’s ultimate code exists in the ethereal world of mathematical truths and that we can decipher it.

Echoing Richard Feynman’s spectacular commencement address admonishing against “cargo cult science,” Gleiser adds:

Recent experimental data has been devastating to such belief — no trace of supersymmetric particles, of extra dimensions, or of dark matter of any sort, all long-awaited signatures of unification physics. Maybe something will come up; to find, we must search. The trouble with unification in high energy physics is that you can always push it beyond the experimental range. “The Large Hadron Collider got to 7 TeV and found nothing? No problem! Who said nature should opt for the simplest versions of unification? Maybe it’s all happening at much higher energies, well beyond our reach.”

There’s nothing wrong with this kind of position. You can believe it until you die, and die happy. Or you can conclude that what we do best is construct approximate models of how nature works and that the symmetries we find are only descriptions of what really goes on. Perfection is too hard a burden to impose on nature.

People often see this kind of argument as defeatist, as coming from someone who got frustrated and gave up. (As in “He lost his faith.”) Big mistake. To search for simplicity is essential to what scientists do. It’s what I do. There are essential organizing principles in nature, and the laws we find are excellent ways to describe them. But the laws are many, not one. We’re successful pattern-seeking rational mammals. That alone is cause for celebration. However, let’s not confuse our descriptions and models with reality. We may hold perfection in our mind’s eye as a sort of ethereal muse. Meanwhile nature is out there doing its thing. That we manage to catch a glimpse of its inner workings is nothing short of wonderful. And that should be good enough.

Ceramic tile by Debbie Millman courtesy of the artist

Science writer Amanda Gefter takes issue with one particular manifestation of our propensity for oversimplification — the notion of the universe. She writes:

Physics has a time-honored tradition of laughing in the face of our most basic intuitions. Einstein’s relativity forced us to retire our notions of absolute space and time, while quantum mechanics forced us to retire our notions of pretty much everything else. Still, one stubborn idea has stood steadfast through it all: the universe.

[…]

In recent years, however, the concept of a single shared spacetime has sent physics spiraling into paradox. The first sign that something was amiss came from Stephen Hawking’s landmark work in the 1970s showing that black holes radiate and evaporate, disappearing from the universe and purportedly taking some quantum information with them. Quantum mechanics, however, is predicated upon the principle that information can never be lost.

Gefter points to recent breakthroughs in physics that produced one particularly puzzling such paradox, known as the “firewall paradox,” solved by the idea that spacetime is divided not by horizons but by the reference frames of the observers, “as if each observer had his or her own universe.”

But the solution isn’t a multiverse theory:

Yes, there are multiple observers, and yes, any observer’s universe is as good as any other’s. But if you want to stay on the right side of the laws of physics, you can talk only about one at a time. Which means, really, that only one exists at a time. It’s cosmic solipsism.

Here, psychology, philosophy, and cosmology converge, for what such theories suggest is what we already know about the human psyche — as I’ve put it elsewhere, the stories that we tell ourselves, whether they be false or true, are always real. Gefter concludes:

Adjusting our intuitions and adapting to the strange truths uncovered by physics is never easy. But we may just have to come around to the notion that there’s my universe and there’s your universe — but there’s no such thing as the universe.

Biological anthropologist Nina Jablonski points to the notion of race as urgently retirement-ready. Pointing out that it has always been a “vague and slippery concept,” she traces its origins to Hume and Kant — the first to divide humanity into geographic groupings called “races” — and the pseudoscientific seeds of racism this division planted:

Skin color, as the most noticeable racial characteristic, was associated with a nebulous assemblage of opinions and hearsay about the inherent natures of the various races. Skin color stood for morality, character, and the capacity for civilization; it became a meme.

Even though the atrocious “race science” that emerged in the 19th and early 20th century didn’t hold up — whenever scientists looked for actual sharp boundaries between groups, none came up — and race came to be something people identify themselves with as a shared category of experiences and social bonds, Jablonski argues that the toxic aftershocks of pseudoscience still poison culture:

Even after it has been shown that many diseases (adult-onset diabetes, alcoholism, high blood pressure, to name a few) show apparent racial patterns because people share similar environmental conditions, groupings by race are maintained. The use of racial self-categorization in epidemiological studies is defended and even encouraged. Medical studies of health disparities between “races” become meaningless when sufficient variables — such as differences in class, ethnic social practices, and attitudes — are taken into account.

Half a century after the ever-prescient Margaret Mead made the same point, Jablonski urges:

Race has a hold on history but no longer has a place in science. The sheer instability and potential for misinterpretation render race useless as a scientific concept. Inventing new vocabularies to deal with human diversity and inequity won’t be easy, but it must be done.

Psychologist Jonathan Gottschall, who has previously explored why storytelling is so central to the human experience, argues against the notion that there can be no science of art. With an eye to our civilization’s long struggle to define art, he writes:

We don’t even have a good definition, in truth, for what art is. In short, there’s nothing so central to human life that’s so incompletely understood.

Granted, Gottschall is only partly right, for there are some excellent definitions of art — take, for instance, Jeanette Winterson’s or Leo Tolstoy’s — but the fact that they don’t come from scientists only speaks to his larger point. He argues that rather than being unfit to shed light on the role of art in human life, science simply hasn’t applied itself to the problem adequately:

Scientific work in the humanities has mainly been scattered, preliminary, and desultory. It doesn’t constitute a research program.

If we want better answers to fundamental questions about art, science must jump into the game with both feet. Going it alone, humanities scholars can tell intriguing stories about the origins and significance of art, but they don’t have the tools to patiently winnow the field of competing ideas. That’s what the scientific method is for — separating the more accurate stories from the less accurate stories. But a strong science of art will require both the thick, granular expertise of humanities scholars and the clever hypothesis-testing of scientists. I’m not calling for a scientific takeover of the arts, I’m calling for a partnership.

[…]

The Delphic admonition “Know thyself” still rings out as the great prime directive of intellectual inquiry, and there will always be a gaping hole in human self-knowledge until we develop a science of art.

In a further testament to the zeitgeist-illuminating nature of the project, actor, author, and science-lover Alan Alda makes a passionate case for the same concept:

The trouble with truth is that not only is the notion of eternal, universal truth highly questionable, but simple, local truths are subject to refinement as well. Up is up and down is down, of course. Except under special circumstances. Is the North Pole up and the South Pole down? Is someone standing at one of the poles right-side up or upside-down? Kind of depends on your perspective.

When I studied how to think in school, I was taught that the first rule of logic was that a thing cannot both be and not be at the same time and in the same respect. That last note, “in the same respect,” says a lot. As soon as you change the frame of reference, you’ve changed the truthiness of a once immutable fact.

[…]

This is not to say that nothing is true or that everything is possible — just that it might not be so helpful for things to be known as true for all time, without a disclaimer… I wonder — and this is just a modest proposal — whether scientific truth should be identified in a way acknowledging that it’s something we know and understand for now, and in a certain way.

[…]

Facts, it seems to me, are workable units, useful in a given frame or context. They should be as exact and irrefutable as possible, tested by experiment to the fullest extent. When the frame changes, they don’t need to be discarded as untrue but respected as still useful within their domain. Most people who work with facts accept this, but I don’t think the public fully gets it.

That’s why I hope for more wariness about implying we know something to be true or false for all time and for everywhere in the cosmos.

Illustration from ‘Once Upon an Alphabet’ by Oliver Jeffers.

And indeed this elasticity of truth across time is at the heart of what I find to be the most beautiful and culturally essential contribution to the collection. As someone who believes that the stewardship of enduring ideas is at least as important as the genesis of new ones — not only because past ideas are the combinatorial building blocks of future ones but also because in order to move forward we always need a backdrop against which to paint the contrast of progress and improvement — I was most bewitched by writer Ian McEwan’s admonition against the arrogance of retiring any idea as an impediment to progress:

Beware of arrogance! Retire nothing! A great and rich scientific tradition should hang onto everything it has. Truth is not the only measure. There are ways of being wrong that help others to be right. Some are wrong, but brilliantly so. Some are wrong but contribute to method. Some are wrong but help found a discipline. Aristotle ranged over the whole of human knowledge and was wrong about much. But his invention of zoology alone was priceless. Would you cast him aside? You never know when you might need an old idea. It could rise again one day to enhance a perspective the present cannot imagine. It would not be available to us if it were fully retired.

To appreciate McEwan’s point, one need only look at something like Bertrand Russell’s timely thoughts on boredom, penned in 1930 and yet astoundingly resonant with our present anxieties about the societal side effects of current technology. McEwan captures this beautifully:

Every last serious and systematic speculation about the world deserves to be preserved. We need to remember how we got to where we are, and we’d like the future not to retire us. Science should look to literature and maintain a vibrant living history as a monument to ingenuity and persistence. We won’t retire Shakespeare. Nor should we Bacon.

Complement This Idea Must Die, the entirety of which weaves a mind-stretching mesh of complementary and contradictory perspectives on our relationship with knowledge, with some stimulating answers to previous editions of Brockman’s annual question, exploring the only thing worth worrying about (2013), the single most elegant theory of how the world works (2012), and the best way to make ourselves smarter (2011).


Neil deGrasse Tyson Selects the Eight Books Every Intelligent Person on the Planet Should Read

How to “glean profound insight into most of what has driven the history of the western world.”

In December of 2011, Neil deGrasse Tyson — champion of science, celebrator of the cosmic perspective, master of the soundbite — participated in Reddit’s Ask Me Anything series of public questions and answers. One reader posed the following question: “Which books should be read by every single intelligent person on the planet?” Adding to history’s notable reading lists — including those by Leo Tolstoy, Alan Turing, Brian Eno, David Bowie, Stewart Brand, and Carl Sagan — Tyson offers the following eight essentials, each followed by a short, and sometimes wry, statement about “how the book’s content influenced the behavior of people who shaped the western world”:

  1. The Bible (public library; free ebook), to learn that it’s easier to be told by others what to think and believe than it is to think for yourself
  2. The System of the World (public library; free ebook) by Isaac Newton, to learn that the universe is a knowable place
  3. On the Origin of Species (public library; free ebook) by Charles Darwin, to learn of our kinship with all other life on Earth
  4. Gulliver’s Travels (public library; free ebook) by Jonathan Swift, to learn, among other satirical lessons, that most of the time humans are Yahoos
  5. The Age of Reason (public library; free ebook) by Thomas Paine, to learn how the power of rational thought is the primary source of freedom in the world
  6. The Wealth of Nations (public library; free ebook) by Adam Smith, to learn that capitalism is an economy of greed, a force of nature unto itself
  7. The Art of War (public library; free ebook) by Sun Tzu, to learn that the act of killing fellow humans can be raised to an art
  8. The Prince (public library; free ebook) by Machiavelli, to learn that people not in power will do all they can to acquire it, and people in power will do all they can to keep it

Tyson adds:

If you read all of the above works you will glean profound insight into most of what has driven the history of the western world.

(What has driven it, evidently, is also the systematic exclusion of the female perspective. The prototypical “intelligent person” would be remiss not to also read, at the very least, Margaret Fuller’s foundational text Woman in the Nineteenth Century, which is even available as a free ebook, and Betty Friedan’s The Feminine Mystique. But, of course, the question of diversity is an infinite one and any list is bound to be pathologically unrepresentative of all of humanity — a challenge I’ve addressed elsewhere — so Tyson’s selections remain indispensable despite their chromosomal lopsidedness. My hope, meanwhile, is that we’ll begin to see more such reading lists by prominent female scientists, philosophers, artists, or writers of the past and present; to my knowledge, none have been made public as of yet — except perhaps Susan Sontag’s diary, which is essentially a lifelong reading list.)

Complement with Nabokov on the six short stories every writer should read, then revisit Tyson on genius and the most humbling fact about the universe.


