The Marginalian

The Greatest Science Books of 2016

From the sound of spacetime to time travel to the microbiome, by way of polar bears, dogs, and trees.

I have long believed that E.B. White’s abiding wisdom on children’s books — “Anyone who writes down to children is simply wasting his time. You have to write up, not down.” — is equally true of science books. The question of what makes a great book of any kind is, of course, a slippery one, but I recently endeavored to synthesize my intuitive system for assessing science books that write up to the reader in a taxonomy of explanation, elucidation, and enchantment.

Gathered here are exceptional books that accomplish at least two of the three, assembled in the spirit of my annual best-of reading lists, which I continue to consider Old Year’s resolutions in reverse — not a list of priorities for the year ahead, but a reflection on the reading most worth prioritizing in the year being left behind.

BLACK HOLE BLUES

In Black Hole Blues and Other Songs from Outer Space (public library), cosmologist, novelist, and unparalleled enchanter of science Janna Levin tells the story of the century-long vision, originated by Einstein, and the half-century experimental quest to hear the sound of spacetime by detecting a gravitational wave. This book remains one of the most intensely interesting and beautifully written books I’ve ever encountered — the kind that comes about once a generation if we’re lucky.

Everything we know about the universe so far comes from four centuries of sight — from peering into space with our eyes and their prosthetic extension, the telescope. Now commences a new mode of knowing the cosmos through sound. The detection of gravitational waves is one of the most significant discoveries in the entire history of physics, marking the dawn of a new era as we begin listening to the sound of space — the probable portal to mysteries as unimaginable to us today as galaxies and nebulae and pulsars and other cosmic wonders were to the first astronomers. Gravitational astronomy, as Levin elegantly puts it, promises a “score to accompany the silent movie humanity has compiled of the history of the universe from still images of the sky, a series of frozen snapshots captured over the past four hundred years since Galileo first pointed a crude telescope at the Sun.”

Astonishingly enough, Levin wrote the book before the Laser Interferometer Gravitational-Wave Observatory (LIGO) — the monumental instrument at the center of the story, decades in the making — made the actual detection of a ripple in the fabric of spacetime caused by the collision of two black holes in the autumn of 2015, exactly a century after Einstein first envisioned the possibility of gravitational waves. So the story she tells is not that of the triumph but that of the climb, which renders it all the more enchanting — because it is ultimately a story about the human spirit and its incredible tenacity, about why human beings choose to devote their entire lives to pursuits strewn with unimaginable obstacles and bedeviled by frequent failure, uncertain rewards, and meager public recognition.

Indeed, what makes the book interesting is that it tells the story of this monumental discovery, but what makes it enchanting is that Levin comes at it from a rather unusual perspective. She is a working astrophysicist who studies black holes, but she is also an incredibly gifted novelist — an artist whose medium is language and thought itself. This is no popular science book but something many orders of magnitude higher in its artistic vision, the impeccable craftsmanship of language, and the sheer pleasure of the prose. The story is structured almost as a series of short, integrated novels, with each chapter devoted to one of the key scientists involved in LIGO. With Dostoyevskian insight and nuance, Levin paints a psychological, even philosophical portrait of each protagonist, revealing how intricately interwoven the genius and the foibles are in the fabric of personhood and what a profoundly human endeavor science ultimately is.

She writes:

Scientists are like those levers or knobs or those boulders helpfully screwed into a climbing wall. Like the wall is some cemented material made by mixing knowledge, which is a purely human construct, with reality, which we can only access through the filter of our minds. There’s an important pursuit of objectivity in science and nature and mathematics, but still the only way up the wall is through the individual people, and they come in specifics… So the climb is personal, a truly human endeavor, and the real expedition pixelates into individuals, not Platonic forms.

For a taste of this uncategorizably wonderful book, see Levin on the story of the tragic hero who pioneered gravitational astronomy and how astronomer Jocelyn Bell discovered pulsars.

TIME TRAVEL

Time Travel: A History (public library) by science historian and writer extraordinaire James Gleick, another rare enchanter of science, is not a “science book” per se, in that although it draws heavily on the history of twentieth-century science and quantum physics in particular (as well as on millennia of philosophy), it is a decidedly literary inquiry into our temporal imagination — why we think about time, why its directionality troubles us so, and what asking these questions at all reveals about the deepest mysteries of our consciousness. I consider it a grand thought experiment, using physics and philosophy as the active agents, and literature as the catalyst.

Gleick, who examined the origin of our modern anxiety about time with remarkable prescience nearly two decades ago, traces the invention of the notion of time travel to H.G. Wells’s 1895 masterpiece The Time Machine. Although Wells — like Gleick, like any reputable physicist — knew that time travel was a scientific impossibility, he created an aesthetic of thought which never previously existed and which has since shaped the modern consciousness. Gleick argues that the art this aesthetic produced — an entire canon of time travel literature and film — not only permeated popular culture but even influenced some of the greatest scientific minds of the past century, including Stephen Hawking, who once cleverly hosted a party for time travelers and, when no one showed up, considered the impossibility of time travel proven, and John Archibald Wheeler, who popularized the term “black hole” and coined “wormhole,” both key tropes of time travel literature.

Gleick considers how a scientific impossibility can become such fertile ground for the artistic imagination:

Why do we need time travel, when we already travel through space so far and fast? For history. For mystery. For nostalgia. For hope. To examine our potential and explore our memories. To counter regret for the life we lived, the only life, one dimension, beginning to end.

Wells’s Time Machine revealed a turning in the road, an alteration in the human relationship with time. New technologies and ideas reinforced one another: the electric telegraph, the steam railroad, the earth science of Lyell and the life science of Darwin, the rise of archeology out of antiquarianism, and the perfection of clocks. When the nineteenth century turned to the twentieth, scientists and philosophers were primed to understand time in a new way. And so were we all. Time travel bloomed in the culture, its loops and twists and paradoxes.

I wrote about Gleick’s uncommonly pleasurable book at length here.

FELT TIME

A very different take on time, not as cultural phenomenon but as individual psychological interiority, comes from German psychologist Marc Wittmann in Felt Time: The Psychology of How We Perceive Time (public library) — a fascinating inquiry into how our subjective experience of time’s passage shapes everything from our emotional memory to our sense of self. Bridging disciplines as wide-ranging as neuroscience and philosophy, Wittmann examines questions of consciousness, identity, happiness, boredom, money, and aging, exposing the centrality of time in each of them. What emerges is the disorienting sense that time isn’t something which happens to us — rather, we are time.

One of Wittmann’s most pause-giving points has to do with how temporality mediates the mind-body problem. He writes:

Presence means becoming aware of a physical and psychic self that is temporally extended. To be self-conscious is to recognize oneself as something that persists through time and is embodied.

In a sense, time is a construction of our consciousness. Two generations after Hannah Arendt observed in her brilliant meditation on time that “it is the insertion of man with his limited life span that transforms the continuously flowing stream of sheer change … into time as we know it,” Wittmann writes:

Self-consciousness — achieving awareness of one’s own self — emerges on the basis of temporally enduring perception of bodily states that are tied to neural activity in the brain’s insular lobe. The self and time prove to be especially present in boredom. They go missing in the hustle and bustle of everyday life, which results from the acceleration of social processes. Through mindfulness and emotional control, the tempo of life that we experience can be reduced, and we can regain time for ourselves and others.

Perception necessarily encompasses the individual who is doing the perceiving. It is I who perceives. This might seem self-evident. Perception of myself, my ego, occurs naturally when I consider myself. I “feel” and think about myself. But who is the subject if I am the object of my own attention? When I observe myself, after all, I become the object of observation. Clearly, this intangibility of the subject as a subject — and not an object — poses a philosophical problem: as soon as I observe myself, I have already become the object of my observation.

More here.

WHEN BREATH BECOMES AIR

All life is lived in the shadow of its own finitude, of which we are always aware — an awareness we systematically blunt through the daily distraction of living. But when that finitude is made suddenly imminent, one collides with an awareness so acute that it leaves no choice but to fill the shadow with as much light as a human being can generate — the sort of inner illumination we call meaning: the meaning of life.

That tumultuous turning point is what neurosurgeon Paul Kalanithi chronicles in When Breath Becomes Air (public library) — his piercing memoir of being diagnosed with terminal cancer at the peak of a career bursting with potential and a life exploding with aliveness. Partway between Montaigne and Oliver Sacks, Kalanithi weaves together philosophical reflections on his personal journey with stories of his patients to illuminate the only thing we have in common — our mortality — and how it spurs all of us, in ways both minute and monumental, to pursue a life of meaning.

What emerges is an uncommonly insightful, sincere, and sobering revelation of how much our sense of self is tied up with our sense of potential and possibility — the selves we would like to become, those we work tirelessly toward becoming. Who are we, then, and what remains of “us” when that possibility is suddenly snipped?

Paul Kalanithi in 2014 (Photograph: Norbert von der Groeben/Stanford Hospital and Clinics)

A generation after surgeon Sherwin Nuland’s foundational text on confronting the meaning of life while dying, Kalanithi sets out to answer these questions and their myriad fractal implications. He writes:

At age thirty-six, I had reached the mountaintop; I could see the Promised Land, from Gilead to Jericho to the Mediterranean Sea. I could see a nice catamaran on that sea that Lucy, our hypothetical children, and I would take out on weekends. I could see the tension in my back unwinding as my work schedule eased and life became more manageable. I could see myself finally becoming the husband I’d promised to be.

And then the unthinkable happens. He recounts one of the first incidents in which his former identity and his future fate collided with jarring violence:

My back stiffened terribly during the flight, and by the time I made it to Grand Central to catch a train to my friends’ place upstate, my body was rippling with pain. Over the past few months, I’d had back spasms of varying ferocity, from simple ignorable pain, to pain that made me forsake speech to grind my teeth, to pain so severe I curled up on the floor, screaming. This pain was toward the more severe end of the spectrum. I lay down on a hard bench in the waiting area, feeling my back muscles contort, breathing to control the pain — the ibuprofen wasn’t touching this — and naming each muscle as it spasmed to stave off tears: erector spinae, rhomboid, latissimus, piriformis…

A security guard approached. “Sir, you can’t lie down here.”

“I’m sorry,” I said, gasping out the words. “Bad … back … spasms.”

“You still can’t lie down here.”

[…]

I pulled myself up and hobbled to the platform.

Like the book itself, the anecdote speaks to something larger and far more powerful than the particular story — in this case, our cultural attitude toward what we consider the failings of our bodies: pain and, in the ultimate extreme, death. We try to dictate the terms on which these perceived failings may occur; to make them conform to wished-for realities; to subvert them by will and witless denial. All this we do because, at bottom, we deem them impermissible — in ourselves and in each other.

I wrote about the book at length here.

THE CONFIDENCE GAME

“Try not to get overly attached to a hypothesis just because it’s yours,” Carl Sagan urged in his excellent Baloney Detection Kit — and yet our tendency is to do just that, becoming increasingly attached to what we’ve come to believe because the belief has sprung from our own glorious, brilliant, fool-proof minds. How con artists take advantage of this human hubris is what New Yorker columnist and psychology writer Maria Konnikova explores in The Confidence Game: Why We Fall for It … Every Time (public library) — a thrilling psychological detective story investigating how con artists, the supreme masterminds of malevolent reality-manipulation, prey on our hopes, our fears, and our propensity for believing what we wish were true. Through a tapestry of riveting real-life con artist profiles interwoven with decades of psychology experiments, Konnikova illuminates the inner workings of trust and deception in our everyday lives.

She writes:

It’s the oldest story ever told. The story of belief — of the basic, irresistible, universal human need to believe in something that gives life meaning, something that reaffirms our view of ourselves, the world, and our place in it… For our minds are built for stories. We crave them, and, when there aren’t ready ones available, we create them. Stories about our origins. Our purpose. The reasons the world is the way it is. Human beings don’t like to exist in a state of uncertainty or ambiguity. When something doesn’t make sense, we want to supply the missing link. When we don’t understand what or why or how something happened, we want to find the explanation. A confidence artist is only too happy to comply — and the well-crafted narrative is his absolute forte.

Konnikova describes the basic elements of the con and the psychological susceptibility into which each of them plays:

The confidence game starts with basic human psychology. From the artist’s perspective, it’s a question of identifying the victim (the put-up): who is he, what does he want, and how can I play on that desire to achieve what I want? It requires the creation of empathy and rapport (the play): an emotional foundation must be laid before any scheme is proposed, any game set in motion. Only then does it move to logic and persuasion (the rope): the scheme (the tale), the evidence and the way it will work to your benefit (the convincer), the show of actual profits. And like a fly caught in a spider’s web, the more we struggle, the less able to extricate ourselves we become (the breakdown). By the time things begin to look dicey, we tend to be so invested, emotionally and often physically, that we do most of the persuasion ourselves. We may even choose to up our involvement ourselves, even as things turn south (the send), so that by the time we’re completely fleeced (the touch), we don’t quite know what hit us. The con artist may not even need to convince us to stay quiet (the blow-off and fix); we are more likely than not to do so ourselves. We are, after all, the best deceivers of our own minds. At each step of the game, con artists draw from a seemingly endless toolbox of ways to manipulate our belief. And as we become more committed, with every step we give them more psychological material to work with.

Needless to say, the book bears remarkable relevance to the recent turn of events in American politics and its ripples in the mass manipulation machine known as the media.

More here.

THE GENE

“This is the entire essence of life: Who are you? What are you?” young Leo Tolstoy wrote in his diary. For Tolstoy, this was a philosophical inquiry — or a metaphysical one, as it would have been called in his day. But between his time and ours, science has unraveled the inescapable physical dimensions of this elemental question, rendering the already disorienting attempt at an answer all the more complex and confounding.

In The Gene: An Intimate History (public library), physician and Pulitzer-winning author Siddhartha Mukherjee offers a rigorously researched, beautifully written detective story about the genetic components of what we experience as the self, rooted in Mukherjee’s own painful family history of mental illness and radiating a larger inquiry into how genetics illuminates the future of our species.

Mukherjee writes:

Three profoundly destabilizing scientific ideas ricochet through the twentieth century, trisecting it into three unequal parts: the atom, the byte, the gene. Each is foreshadowed by an earlier century, but dazzles into full prominence in the twentieth. Each begins its life as a rather abstract scientific concept, but grows to invade multiple human discourses — thereby transforming culture, society, politics, and language. But the most crucial parallel between the three ideas, by far, is conceptual: each represents the irreducible unit — the building block, the basic organizational unit — of a larger whole: the atom, of matter; the byte (or “bit”), of digitized information; the gene, of heredity and biological information.

Why does this property — being the least divisible unit of a larger form — imbue these particular ideas with such potency and force? The simple answer is that matter, information, and biology are inherently hierarchically organized: understanding that smallest part is crucial to understanding the whole.

Among the book’s most fascinating threads is Mukherjee’s nuanced, necessary discussion of intelligence and the dark side of IQ.

THE POLAR BEAR

“In wildness is the preservation of the world,” Thoreau wrote 150 years ago in his ode to the spirit of sauntering. But in a world increasingly unwild, where we are in touch with nature only occasionally and only in fragments, how are we to nurture the preservation of our Pale Blue Dot?

That’s what London-based illustrator and Sendak Fellow Jenni Desmond explores in The Polar Bear (public library) — the follow-up to Desmond’s serenade to the science and life of Earth’s largest-hearted creature, The Blue Whale, which was among the best science books of 2015.

The story follows a little girl who, in a delightful meta-touch, pulls this very book off the bookshelf and begins learning about the strange and wonderful world of the polar bear, its life, and the science behind it — its love of solitude, the black skin that hides beneath its yellowish-white fur, the built-in sunglasses protecting its eyes from the harsh Arctic light, why it evolved to have an unusually long neck and slightly inward paws, how it maintains the same temperature as us despite living in such extreme cold, why it doesn’t hibernate.

Beyond its sheer loveliness, the book is suddenly imbued with a new layer of urgency. At a time when we can no longer count on politicians to protect the planet and educate the next generations about preserving it, the task falls solely on parents and educators. Desmond’s wonderful project alleviates that task by offering a warm, empathic invitation to care about, which is the gateway to caring for, one of the creatures most vulnerable to our changing climate and most needful of our protection.

Look closer here.

THE BIG PICTURE

“We are — as far as we know — the only part of the universe that’s self-conscious,” the poet Mark Strand marveled in his beautiful meditation on the artist’s task to bear witness to existence, adding: “We could even be the universe’s form of consciousness. We might have come along so that the universe could look at itself… It’s such a lucky accident, having been born, that we’re almost obliged to pay attention.” Scientists are rightfully reluctant to ascribe a purpose or meaning to the universe itself but, as physicist Lisa Randall has pointed out, “an unconcerned universe is not a bad thing — or a good one for that matter.” Where poets and scientists converge is on the idea that while the universe itself isn’t inherently imbued with meaning, it is in this self-conscious human act of paying attention that meaning arises.

Physicist Sean Carroll terms this view poetic naturalism and examines its rewards in The Big Picture: On the Origins of Life, Meaning, and the Universe Itself (public library) — a nuanced inquiry into “how our desire to matter fits in with the nature of reality at its deepest levels,” in which Carroll offers an assuring dose of what he calls “existential therapy” reconciling the various and often seemingly contradictory dimensions of our experience.

With an eye to his life’s work of studying the nature of the universe — an expanse of space and time against the incomprehensibly enormous backdrop of which the dramas of a single human life claim no more than a photon of the spotlight — Carroll offers a counterpoint to our intuitive cowering before such magnitudes of matter and mattering:

I like to think that our lives do matter, even if the universe would trundle along without us.

[…]

I want to argue that, though we are part of a universe that runs according to impersonal underlying laws, we nevertheless matter. This isn’t a scientific question — there isn’t data we can collect by doing experiments that could possibly measure the extent to which a life matters. It’s at heart a philosophical problem, one that demands that we discard the way that we’ve been thinking about our lives and their meaning for thousands of years. By the old way of thinking, human life couldn’t possibly be meaningful if we are “just” collections of atoms moving around in accordance with the laws of physics. That’s exactly what we are, but it’s not the only way of thinking about what we are. We are collections of atoms, operating independently of any immaterial spirits or influences, and we are thinking and feeling people who bring meaning into existence by the way we live our lives.

Carroll’s captivating term poetic naturalism builds on a worldview that has been around for centuries, dating back at least to the Scottish philosopher David Hume. It fuses naturalism — the idea that the reality of the natural world is the only reality, that it operates according to consistent patterns, and that those patterns can be studied — with the poetic notion that there are multiple ways of talking about the world and of framing the questions that arise from nature’s elemental laws.

I wrote about the book at length here.

THE HIDDEN LIFE OF TREES

Trees dominate the ranks of the world’s oldest living organisms. Since the dawn of our species, they have been our silent companions, permeating our most enduring tales and never ceasing to inspire fantastical cosmogonies. Hermann Hesse called them “the most penetrating of preachers.” A forgotten seventeenth-century English gardener wrote of how they “speak to the mind, and tell us many things, and teach us many good lessons.”

But trees might be among our lushest metaphors and sensemaking frameworks for knowledge precisely because the richness of what they say is more than metaphorical — they speak a sophisticated silent language, communicating complex information via smell, taste, and electrical impulses. This fascinating secret world of signals is what German forester Peter Wohlleben explores in The Hidden Life of Trees: What They Feel, How They Communicate (public library).

Illustration by Arthur Rackham for a rare 1917 edition of the Brothers Grimm fairy tales

Wohlleben chronicles what his own experience of managing a forest in the Eifel mountains in Germany has taught him about the astonishing language of trees and how trailblazing arboreal research from scientists around the world reveals “the role forests play in making our world the kind of place where we want to live.” As we’re only just beginning to understand nonhuman consciousnesses, what emerges from Wohlleben’s revelatory reframing of our oldest companions is an invitation to see anew what we have spent eons taking for granted and, in this act of seeing, to care more deeply about these remarkable beings that make life on this planet we call home not only infinitely more pleasurable, but possible at all.

Read more here.

BEING A DOG

“The act of smelling something, anything, is remarkably like the act of thinking itself,” the great science storyteller Lewis Thomas wrote in his beautiful 1985 meditation on the poetics of smell as a mode of knowledge. But, like the conditioned consciousness out of which our thoughts arise, our olfactory perception is beholden to our cognitive, cultural, and biological limitations. The 438 cubic feet of air we inhale each day are loaded with an extraordinary richness of information, but we are able to access and decipher only a fraction. And yet we know, on some deep creaturely level, just how powerful and enlivening the world of smell is, how intimately connected with our ability to savor life. “Get a life in which you notice the smell of salt water pushing itself on a breeze over the dunes,” Anna Quindlen advised in her indispensable Short Guide to a Happy Life — but the noticing eclipses the getting, for the salt water breeze is lost on any life devoid of this sensorial perception.

Dogs, who “see” the world through smell, can teach us a great deal about that springlike sensorial aliveness which E.E. Cummings termed “smelloftheworld.” So argues cognitive scientist and writer Alexandra Horowitz, director of the Dog Cognition Lab at Barnard College, in Being a Dog: Following the Dog Into a World of Smell (public library) — a fascinating tour of what Horowitz calls the “surprising and sometimes alarming feats of olfactory perception” that dogs perform daily, and what they can teach us about swinging open the doors of our own perception by relearning some of our long-lost olfactory skills that grant us access to hidden layers of reality.

Art by Maira Kalman from Beloved Dog

The book is a natural extension of Horowitz’s two previous books, exploring the subjective reality of the dog and how our human perceptions shape our own subjective reality. She writes:

I am besotted with dogs, and to know a dog is to be interested in what it’s like to be a dog. And that all begins with the nose.

What the dog sees and knows comes through his nose, and the information that every dog — the tracking dog, of course, but also the dog lying next to you, snoring, on the couch — has about the world based on smell is unthinkably rich. It is rich in a way we humans once knew about, once acted on, but have since neglected.

Savor more of the wonderland of canine olfaction here.

I CONTAIN MULTITUDES

“I have observed many tiny animals with great admiration,” Galileo marveled as he peered through his microscope — a tool that, like the telescope, he didn’t invent himself but used in such a visionary way as to render it revolutionary. The revelatory discoveries he made in the universe within the cell are increasingly proving to be as significant as his telescopic discoveries in the universe without — a significance humanity has been even slower and more reluctant to accept than his radical revision of the cosmos.

That multilayered significance is what English science writer and microbiology elucidator Ed Yong explores in I Contain Multitudes: The Microbes Within Us and a Grander View of Life (public library) — a book so fascinating and elegantly written as to be worthy of its Whitman reference, in which Yong peels the veneer of the visible to reveal the astonishing complexity of life thriving beneath and within the crude confines of our perception.

Early-twentieth-century drawing of Radiolarians, some of the first microorganisms, by Ernst Haeckel

Artist Agnes Martin memorably observed that “the best things in life happen to you when you’re alone,” but Yong offers a biopoetic counterpoint in the fact that we are never truly alone. He writes:

Even when we are alone, we are never alone. We exist in symbiosis — a wonderful term that refers to different organisms living together. Some animals are colonised by microbes while they are still unfertilised eggs; others pick up their first partners at the moment of birth. We then proceed through our lives in their presence. When we eat, so do they. When we travel, they come along. When we die, they consume us. Every one of us is a zoo in our own right — a colony enclosed within a single body. A multi-species collective. An entire world.

[…]

All zoology is really ecology. We cannot fully understand the lives of animals without understanding our microbes and our symbioses with them. And we cannot fully appreciate our own microbiome without appreciating how those of our fellow species enrich and influence their lives. We need to zoom out to the entire animal kingdom, while zooming in to see the hidden ecosystems that exist in every creature. When we look at beetles and elephants, sea urchins and earthworms, parents and friends, we see individuals, working their way through life as a bunch of cells in a single body, driven by a single brain, and operating with a single genome. This is a pleasant fiction. In fact, we are legion, each and every one of us. Always a “we” and never a “me.”

There are ample reasons to admire and appreciate microbes, well beyond the already impressive facts that they ruled “our” Earth for the vast majority of its 4.54-billion-year history and that we ourselves evolved from them. By pioneering photosynthesis, they became the first organisms capable of making their own food. They dictate the planet’s carbon, nitrogen, sulphur, and phosphorus cycles. They can survive anywhere and populate just about every corner of the Earth, from the hydrothermal vents at the bottom of the ocean to the loftiest clouds. They are so diverse that the microbes on your left hand are different from those on your right.

But perhaps most impressively — for we are, after all, the solipsistic species — they influence innumerable aspects of our biological and even psychological lives. Yong offers a cross-section of this microbial dominion:

The microbiome is infinitely more versatile than any of our familiar body parts. Your cells carry between 20,000 and 25,000 genes, but it is estimated that the microbes inside you wield around 500 times more. This genetic wealth, combined with their rapid evolution, makes them virtuosos of biochemistry, able to adapt to any possible challenge. They help to digest our food, releasing otherwise inaccessible nutrients. They produce vitamins and minerals that are missing from our diet. They break down toxins and hazardous chemicals. They protect us from disease by crowding out more dangerous microbes or killing them directly with antimicrobial chemicals. They produce substances that affect the way we smell. They are such an inevitable presence that we have outsourced surprising aspects of our lives to them. They guide the construction of our bodies, releasing molecules and signals that steer the growth of our organs. They educate our immune system, teaching it to tell friend from foe. They affect the development of the nervous system, and perhaps even influence our behaviour. They contribute to our lives in profound and wide-ranging ways; no corner of our biology is untouched. If we ignore them, we are looking at our lives through a keyhole.

In August, I wrote about one particularly fascinating aspect of Yong’s book — the relationship between mental health, free will, and your microbiome.

HIDDEN FIGURES

“No woman should say, ‘I am but a woman!’ But a woman! What more can you ask to be?” astronomer Maria Mitchell, who paved the way for women in American science, admonished the first class of female astronomers at Vassar in 1876. By the middle of the next century, a team of unheralded women scientists and engineers were powering space exploration at NASA’s Jet Propulsion Laboratory.

Meanwhile, across the continent and in what was practically another country, a parallel but very different revolution was taking place: In the segregated South, a growing number of black female mathematicians, scientists, and engineers were steering early space exploration and helping America win the Cold War at NASA’s Langley Research Center in Hampton, Virginia.

Long before the term “computer” came to signify the machine that dictates our lives, these remarkable women were working as human “computers” — highly skilled professional reckoners, who thought mathematically and computationally for their living and for their country. When Neil Armstrong set his foot on the moon, his “giant leap for mankind” had been powered by womankind, particularly by Katherine Johnson — the “computer” who calculated Apollo 11’s launch windows and who was awarded the Presidential Medal of Freedom by President Obama at age 97 in 2015, three years after the accolade was conferred upon John Glenn, the astronaut whose flight trajectory Johnson had made possible.

Katherine Johnson at her Langley desk with a globe, or “Celestial Training Device,” 1960 (Photographs: NASA)

In Hidden Figures: The Story of the African-American Women Who Helped Win the Space Race (public library), Margot Lee Shetterly tells the untold story of these brilliant women, once on the frontlines of our cultural leaps and since sidelined by the selective collective memory we call history.

She writes:

Just as islands — isolated places with unique, rich biodiversity — have relevance for the ecosystems everywhere, so does studying seemingly isolated or overlooked people and events from the past turn up unexpected connections and insights to modern life.

Against a sobering cultural backdrop, Shetterly captures the enormous cognitive dissonance the very notion of these black female mathematicians evokes:

Before a computer became an inanimate object, and before Mission Control landed in Houston; before Sputnik changed the course of history, and before the NACA became NASA; before the Supreme Court case Brown v. Board of Education of Topeka established that separate was in fact not equal, and before the poetry of Martin Luther King Jr.’s “I Have a Dream” speech rang out over the steps of the Lincoln Memorial, Langley’s West Computers were helping America dominate aeronautics, space research, and computer technology, carving out a place for themselves as female mathematicians who were also black, black mathematicians who were also female.

Shetterly herself grew up in Hampton, which dubbed itself “Spacetown USA,” amid this archipelago of women who were her neighbors and teachers. Her father, who had built his first rocket in his early teens after seeing the Sputnik launch, was one of Langley’s African American scientists in an era when words we now shudder to hear were used instead of “African American.” Like him, the first five black women who joined Langley’s research staff in 1943 entered a segregated workplace — even though, as Shetterly points out, the agency was among the most inclusive in the country, with more than four times the national average’s percentage of black scientists and engineers.

Over the next forty years, the number of these trailblazing black women mushroomed to more than fifty, revealing the mycelia of a significant groundswell. Shetterly’s favorite Sunday school teacher had been one of the early computers — a retired NASA mathematician named Kathleen Land. And so Shetterly, who considers herself “as much a product of NASA as the Moon landing,” grew up believing that black women simply belonged in science and space exploration as a matter of course — after all, they populated her father’s workplace and her town, a town whose church “abounded with mathematicians.”

Embodying astronomer Vera Rubin’s wisdom on how modeling expands children’s scope of possibility, Shetterly reflects on this normalizing and rousing power of example:

Building 1236, my father’s daily destination, contained a byzantine complex of government-gray cubicles, perfumed with the grown-up smells of coffee and stale cigarette smoke. His engineering colleagues with their rumpled style and distracted manner seemed like exotic birds in a sanctuary. They gave us kids stacks of discarded 11×14 continuous-form computer paper, printed on one side with cryptic arrays of numbers, the blank side a canvas for crayon masterpieces. Women occupied many of the cubicles; they answered phones and sat in front of typewriters, but they also made hieroglyphic marks on transparent slides and conferred with my father and other men in the office on the stacks of documents that littered their desks. That so many of them were African American, many of them my grandmother’s age, struck me as simply a part of the natural order of things: growing up in Hampton, the face of science was brown like mine.

[…]

The community certainly included black English professors, like my mother, as well as black doctors and dentists, black mechanics, janitors, and contractors, black cobblers, wedding planners, real estate agents, and undertakers, several black lawyers, and a handful of black Mary Kay salespeople. As a child, however, I knew so many African Americans working in science, math, and engineering that I thought that’s just what black folks did.

Katherine Johnson, age 98 (Photograph: Annie Leibovitz for Vanity Fair)

But despite the opportunities at NASA, almost countercultural in their contrast to the norms of the time, life for these courageous and brilliant women was no idyll — persons and polities are invariably products of their time and place. Shetterly captures the sundering paradoxes of the early computers’ experience:

I interviewed Mrs. Land about the early days of Langley’s computing pool, when part of her job responsibility was knowing which bathroom was marked for “colored” employees. And less than a week later I was sitting on the couch in Katherine Johnson’s living room, under a framed American flag that had been to the Moon, listening to a ninety-three-year-old with a memory sharper than mine recall segregated buses, years of teaching and raising a family, and working out the trajectory for John Glenn’s spaceflight. I listened to Christine Darden’s stories of long years spent as a data analyst, waiting for the chance to prove herself as an engineer. Even as a professional in an integrated world, I had been the only black woman in enough drawing rooms and boardrooms to have an inkling of the chutzpah it took for an African American woman in a segregated southern workplace to tell her bosses she was sure her calculations would put a man on the Moon.

[…]

And while the black women are the most hidden of the mathematicians who worked at the NACA, the National Advisory Committee for Aeronautics, and later at NASA, they were not sitting alone in the shadows: the white women who made up the majority of Langley’s computing workforce over the years have hardly been recognized for their contributions to the agency’s long-term success. Virginia Biggins worked the Langley beat for the Daily Press newspaper, covering the space program starting in 1958. “Everyone said, ‘This is a scientist, this is an engineer,’ and it was always a man,” she said in a 1990 panel on Langley’s human computers. She never got to meet any of the women. “I just assumed they were all secretaries,” she said.

These women’s often impossible dual task of preserving their own sanity and dignity while pushing culture forward is perhaps best captured in the words of African American NASA mathematician Dorothy Vaughan:

What I changed, I could; what I couldn’t, I endured.

Dive in here.

THE GLASS UNIVERSE

Predating NASA’s women mathematicians by more than half a century was a devoted team of female amateur astronomers — “amateur” being a reflection not of their skill but of the dearth of academic accreditation available to women at the time — who came together at the Harvard Observatory at the end of the nineteenth century around an unprecedented quest to catalog the cosmos by classifying the stars and their spectra.

Decades before they were allowed to vote, these women, who came to be known as the “Harvard computers,” classified hundreds of thousands of stars according to a system they invented, which astronomers continue to use today. Their calculations became the basis for the discovery that the universe is expanding. Their spirit of selfless pursuit of truth and knowledge stands as a timeless testament to pioneering physicist Lise Meitner’s definition of the true scientist.

The "Harvard Computers" at work, circa 1890.
The “Harvard computers” at work, circa 1890.

Science historian Dava Sobel, author of Galileo’s Daughter, chronicles their unsung story and lasting legacy in The Glass Universe: How the Ladies of the Harvard Observatory Took the Measure of the Stars (public library).

Sobel, who takes on the role of rigorous reporter and storyteller bent on preserving the unvarnished historical integrity of the story, paints the backdrop:

A little piece of heaven. That was one way to look at the sheet of glass propped up in front of her. It measured about the same dimensions as a picture frame, eight inches by ten, and no thicker than a windowpane. It was coated on one side with a fine layer of photographic emulsion, which now held several thousand stars fixed in place, like tiny insects trapped in amber. One of the men had stood outside all night, guiding the telescope to capture this image, along with another dozen in the pile of glass plates that awaited her when she reached the observatory at 9 a.m. Warm and dry indoors in her long woolen dress, she threaded her way among the stars. She ascertained their positions on the dome of the sky, gauged their relative brightness, studied their light for changes over time, extracted clues to their chemical content, and occasionally made a discovery that got touted in the press. Seated all around her, another twenty women did the same.

The “computers” working at the Harvard Observatory, with Williamina Fleming (standing) supervising. (Harvard University Archives)

Among the “Harvard computers” were Antonia Maury, who had graduated from Maria Mitchell’s program at Vassar; Annie Jump Cannon, who catalogued more than 20,000 variable stars in a short period after joining the observatory; Henrietta Swan Leavitt, a Radcliffe alumna whose discoveries later became the basis for Hubble’s Law demonstrating the expansion of the universe and whose work was so valued that she was paid 30 cents an hour, five cents above the computers’ standard wage; and Cecilia Helena Payne-Gaposchkin, who became not only the first woman but the first person of any gender to earn a Ph.D. in astronomy from Radcliffe College.

Helming the team was Williamina Fleming — a Scotswoman whom Edward Charles Pickering, the thirty-something director of the observatory, first hired as a second maid at his residence in 1879 before recognizing her mathematical talents and assigning her the role of part-time computer.

Dive into their story here.

WOMEN IN SCIENCE

For a lighter companion to the two books above, one aimed at younger readers, artist and author Rachel Ignotofsky offers Women in Science: 50 Fearless Pioneers Who Changed the World (public library) — an illustrated encyclopedia of fifty influential and inspiring women in STEM since long before we acronymized the conquest of curiosity through discovery and invention, ranging from the ancient astronomer, mathematician, and philosopher Hypatia in the fourth century to Iranian mathematician Maryam Mirzakhani, born in 1977.

True as it may be that being an outsider is an advantage in science and life, modeling furnishes young hearts with the assurance that people who are in some way like them can belong and shine in fields composed primarily of people drastically unlike them. It is this ethos that Ignotofsky embraces by being deliberate in ensuring that the scientists included come from a vast variety of ethnic backgrounds, nationalities, orientations, and cultural traditions.

There are the expected trailblazers who have stood as beacons of possibility for decades, even centuries: Ada Lovelace, who became the world’s first de facto computer programmer; Marie Curie, the first woman to win a Nobel Prize and to this day the only person awarded a Nobel in two different sciences; Jocelyn Bell Burnell, who once elicited the exclamation “Miss Bell, you have made the greatest astronomical discovery of the twentieth century!” (and was subsequently excluded from the Nobel she deserved); Maria Sibylla Merian, the seventeenth-century German naturalist whose studies of butterfly metamorphosis revolutionized entomology and natural history illustration; and Jane Goodall — another pioneer who turned her childhood dream into reality against tremendous odds and went on to do more for the understanding of nonhuman consciousness than any scientist before or since.

Take a closer look here.

* * *

On December 2, I joined Science Friday alongside Scientific American editor Lee Billings to discuss some of our favorite science books of 2016.

Step into the cultural time machine with selections for the best science books of 2015, 2014, 2013, 2012, and 2011.

A Cry of Gratitude: Baudelaire’s Magnificent Fan Mail to Wagner

“However used to fame a great artist may be, he cannot be insensible to a sincere compliment, especially when that compliment is like a cry of gratitude.”

“After silence that which comes nearest to expressing the inexpressible is music,” Aldous Huxley wrote in his enthralling meditation on the power of music, adding: “When the inexpressible had to be expressed, Shakespeare laid down his pen and called for music.” Nietzsche believed that “without music life would be a mistake” — perhaps the most extreme addition to the extensive canon of great writers extolling the power of music.

But perhaps the sincerest, most beautiful, most touching testament to the power of music came not in the form of a pointed essay but in a letter — that most intimate and immediate packet of sentiment.

A century before Albert Camus reached across the Iron Curtain to Boris Pasternak for a warm embrace of appreciation, his compatriot Charles Baudelaire (April 9, 1821–August 31, 1867) reached across another cultural divide to express his gratitude for the transcendent gift of music. A thirty-eight-year-old Baudelaire, not yet celebrated as one of the finest poet-philosophers who ever lived, sent the great German composer and conductor Richard Wagner (May 22, 1813–February 13, 1883) a largehearted letter of appreciation, later included in The Conquest of Solitude: Selected Letters of Charles Baudelaire (public library). It stands as a touching testament to both the power of music and the power of a generous gesture.

Baudelaire writes on February 17, 1860:

Dear Sir:

I have always imagined that however used to fame a great artist may be, he cannot be insensible to a sincere compliment, especially when that compliment is like a cry of gratitude; and finally that this cry could acquire a singular kind of value when it came from a Frenchman, which is to say from a man little disposed to be enthusiastic, and born, moreover, in a country where people hardly understand painting and poetry any better than they do music. First of all, I want to tell you that I owe you the greatest musical pleasure I have ever experienced. I have reached an age when one no longer makes it a pastime to write letters to celebrities, and I should have hesitated a long time before writing to express my admiration for you, if I did not daily come across shameless and ridiculous articles in which every effort is made to libel your genius. You are not the first man, sir, about whom I have suffered and blushed for my country. At length indignation impelled me to give you an earnest of my gratitude; I said to myself, “I want to stand out from all those imbeciles.”

In a sentiment that calls to mind Anthony Burgess’s recollection of how he fell in love with music as a little boy — Burgess wrote of “a psychedelic moment… an instant of recognition of verbally inexpressible spiritual realities” — Baudelaire recounts how Wagner first enchanted him:

The first time I went to the Italian Theatre in order to hear your works, I was rather unfavorably disposed and indeed, I must admit, full of nasty prejudices, but I have an excuse: I have been so often duped; I have heard so much music by pretentious charlatans. But you conquered me at once. What I felt is beyond description, and if you will be kind enough not to laugh, I shall try to interpret it for you. At the outset it seemed to me that I knew this new music, and later, on thinking it over, I understood whence came this mirage; it seemed to me that this music was mine, and I recognized it in the way that any man recognizes the things he is destined to love. To anybody but an intelligent man, this statement would be immensely ridiculous, especially when it comes from one who, like me, does not know music.

Baudelaire is echoing Tolstoy’s famous theory of art, in which the great Russian author asserted that emotional “infectiousness” is the single most significant criterion for great art. But he is also far ahead of his time — the word “empathy” only entered the popular lexicon half a century later and was first used to describe the imaginative act of projecting oneself into a work of art, inhabiting it, possessing it whilst being possessed by it in much the same way that Baudelaire did Wagner’s music.

The Gnomes: "He played until the room was entirely filled with gnomes."
One of Arthur Rackham’s rare 1917 illustrations for the fairy tales of the Brothers Grimm

In language strikingly reminiscent of how astronauts describe the cosmic awe of “the overview effect,” Baudelaire articulates the transcendent effect Wagner’s music had on him:

The thing that struck me the most was the character of grandeur. It depicts what is grand and incites to grandeur. Throughout your works I found again the solemnity of the grand sounds of Nature in her grandest aspects, as well as the solemnity of the grand passions of man. One feels immediately carried away and dominated.

Writing a generation before Oscar Wilde made his memorable case for why a “temperament of receptivity” is essential for savoring art, Baudelaire attests to how this willing receptivity, which great art unleashes in its audience, swings open the doors of perception:

Quite often I experienced a sensation of a rather bizarre nature, which was the pride and the joy of understanding, of letting myself be penetrated and invaded — a really sensual delight that resembles that of rising in the air or tossing upon the sea. And the music at the same time would now and then resound with the pride of life. Generally these profound harmonies seemed to me like those stimulants that quicken the pulse of the imagination… There is everywhere something rapt and enthralling, something aspiring to mount higher, something excessive and superlative. For example, if I may make analogies with painting, let me suppose I have before me a vast expanse of dark red. If this red stands for passion, I see it gradually passing through all the transitions of red and pink to the incandescent glow of a furnace. It would seem difficult, impossible even, to reach anything more glowing; and yet a last fuse comes and traces a whiter streak on the white of the background. This will signify, if you will, the supreme utterance of a soul at its highest paroxysm.

Baudelaire ends by reminding Wagner of the grand and heartbreaking truth of artistic appreciation: that most of it happens in the shadows, in the sinews, in the silence of private experiences, and that artists only ever glimpse a fraction through the grateful words of the few willing to return the gift by articulating those ineffable but transformative experiences that the artist has afforded them. Baudelaire writes:

From the day when I heard your music, I have said to myself endlessly, and especially at bad times, “If I only could hear a little Wagner tonight!” There are doubtless other men constituted like myself… Once again, sir, I thank you; you brought me back to myself and to what is great, in some unhappy moments.

Ch. Baudelaire

I do not set down my address because you might think I wanted something from you.

Complement with Baudelaire on the genius of childhood, beauty and strangeness, and his increasingly timely open letter to the privileged about the political power of art, then revisit other luminous beams of appreciation between great minds: Isaac Asimov’s fan mail to young Carl Sagan, Charles Dickens’s letter of admiration to George Eliot, teenage James Joyce’s grateful letter to Ibsen, his great hero, Darwin’s touching letter of appreciation to his best friend and greatest supporter, and the virtuous cycle of mutual admiration between Thomas Mann and Hermann Hesse.

The Best Science Books of 2015

From Earth’s largest-hearted creature to the interconnectedness of the universe, by way of Einstein and artificial intelligence.

“Anyone who writes down to children is simply wasting his time,” E.B. White observed in a wonderful 1969 interview. “You have to write up, not down.” What’s true of great children’s books is true of great science books, which must do three things for the reader — explain, enchant, and elevate. They must tell you what something is and why it matters, captivate you to care about it and tickle you into taking pleasure in understanding it, and leave you in a higher state of awareness regarding whatever subtle or monumental aspect of the world the book had made its subject.

After the best art books of the year, here are the most stimulating science books of 2015, possessing this trifecta of merit.

1. ON THE MOVE

“I have been able to see my life as from a great altitude, as a sort of landscape, and with a deepening sense of the connection of all its parts,” Oliver Sacks wrote in his poignant, beautiful, and courageous farewell to life. In one final gesture of generosity, this cartographer of the mind and its meaning mapped the landscape of his remarkable character and career in On the Move: A Life (public library) — an uncommonly moving autobiography, titled after a line from a poem by his dear friend Thom Gunn: “At worst,” wrote Gunn, “one is in motion; and at best, / Reaching no absolute, in which to rest, / One is always nearer by not keeping still.” Sacks’s unstillness is that of a life defined by a compassionate curiosity — about the human mind, about the human spirit, about the invisibilia of our inner lives.

Oliver Sacks (Photograph: Nicholas Naylor-Leland)

The book, made all the more poignant by Dr. Sacks’s death shortly after its release, is not so much an autobiography in the strict sense as a dialogue with time on the simultaneous scales of the personal (going from world-champion weightlifter to world-renowned neurologist), the cultural (being a gay man looking for true love in the 1960s was nothing like it is in our post-DOMA, beTindered present), and the civilizational (watching horseshoe crabs mate on the beaches of City Island exactly as they did 400 million years ago on the shores of Earth’s primordial seas). This record of time pouring through the unclenched fingers of the mind’s most magnanimous patron saint has become one of the most rewarding reading experiences of my life — one I came to with deep reverence for Dr. Sacks’s intellectual footprint and left with deep love for his soul.

Dr. Sacks on the set of the cinematic adaptation of his book Awakenings, with Robin Williams, 1989 (Courtesy of Oliver Sacks)

Like Marie Curie, whose wounds and power sprang from the same source, Dr. Sacks’s character springs from the common root of his pain and his pleasure. At eighty, he reflects on a defining feature of his interior landscape:

I am shy in ordinary social contexts; I am not able to “chat” with any ease; I have difficulty recognizing people (this is lifelong, though worse now my eyesight is impaired); I have little knowledge of and little interest in current affairs, whether political, social, or sexual. Now, additionally, I am hard of hearing, a polite term for deepening deafness. Given all this, I tend to retreat into a corner, to look invisible, to hope I am passed over. This was incapacitating in the 1960s, when I went to gay bars to meet people; I would agonize, wedged into a corner, and leave after an hour, alone, sad, but somehow relieved. But if I find someone, at a party or elsewhere, who shares some of my own (usually scientific) interests — volcanoes, jellyfish, gravitational waves, whatever — then I am immediately drawn into animated conversation…

But Dr. Sacks’s intense introversion is also what made him such an astute listener and observer — the very quality that rendered him humanity’s most steadfast sherpa into the strange landscape of how minds other than our own experience the seething cauldron of mystery we call life.

On one particular occasion, the thrill of observation swelled to such proportions that it eclipsed his chronic introversion. He recounts:

I almost never speak to people in the street. But some years ago, there was a lunar eclipse, and I went outside to view it with my little 20x telescope. Everyone else on the busy sidewalk seemed oblivious to the extraordinary celestial happening above them, so I stopped people, saying, “Look! Look what’s happening to the moon!” and pressing my telescope into their hands. People were taken aback at being approached in this way, but, intrigued by my manifestly innocent enthusiasm, they raised the telescope to their eyes, “wowed,” and handed it back. “Hey, man, thanks for letting me look at that,” or “Gee, thanks for showing me.”

In a sense, Dr. Sacks has spent half a century pushing a telescope into our hands and inviting us, with the same innocent and infectious enthusiasm, to peer into an object even more remote and mysterious — the human mindscape — until we wow. And although he may paint himself as a comically clumsy genius — there he is, dropping hamburger crumbs into sophisticated lab equipment; there he is, committing “a veritable genocide of earthworms” in an experiment gone awry; there he is, watching nine months of painstaking research fly off the back of his motorcycle into New York’s densest traffic — make no mistake: This is a man of enormous charisma and grace, revealed as much by the details of his life as by the delight of his writing.

Dr. Sacks’s official portrait as a UCLA resident, taken at the neuropathology lab in 1964 (Courtesy of Oliver Sacks)

Dive deeper into this enormously rewarding book here.

2. ALEXANDER VON HUMBOLDT AND THE INVENTION OF NATURE

No thinker has shaped our understanding of the astounding interconnectedness of the universe more profoundly than the great Prussian naturalist, explorer, and geographer Alexander von Humboldt (September 14, 1769–May 6, 1859), who pioneered the notion that the natural world is a web of intricately entwined elements, each in constant dynamic dialogue with every other — a concept a century ahead of its time. His legacy isn’t so much any single discovery — although he did discover the magnetic equator, invent isotherms, and come up with climate zones — as it is a mindset, a worldview, a singular sensemaking sublimity.

Alexander von Humboldt by Friedrich Georg Weitsch, 1806

Goethe, in his conversations with Eckermann, remarked that a single day with Humboldt enriched him more than years spent alone, enthusing:

What a man he is! … He has not his equal in knowledge and living wisdom. Then he has a many-sidedness such as I have found nowhere else. On whatever point you approach him, he is at home, and lavishes upon us his intellectual treasures. He is like a fountain with many pipes, under which you need only hold a vessel, and from which refreshing and inexhaustible streams are ever flowing.

Darwin asserted that Humboldt’s writings kindled in him a zeal without which he wouldn’t have boarded the Beagle or written On the Origin of Species. Thoreau was an ardent admirer of Humboldt’s “habit of close observation,” without the influence of which there might have been no Walden. Trailblazing astronomer Maria Mitchell, who met Humboldt weeks before his death, marveled in her diary that “no young aspirant in science ever left Humboldt’s presence uncheered,” and his ideas reverberate through her famous assertion that science is “not all mathematics, nor all logic, but it is somewhat beauty and poetry.” Emerson, in his essays and lectures, called Humboldt “a man whose eyes, ears, and mind are armed by all the science, arts, and implements which mankind have anywhere accumulated” and saw him as living proof that “a certain vastness of learning, or quasi omnipresence of the human soul in nature, is possible.”

Goethe’s diagram of the comparative elevations of the Old and New World, inspired by Humboldt

In informing and impressing the greatest minds of his time, Humboldt invariably influenced the course of science and its intercourse with the rest of culture in ways innumerable, enduring, and profound. His visionary understanding of nature’s interconnectedness sparked the basic ecological awareness that gave rise to the environmental movement. His integrated approach to science, incorporating elements of art, philosophy, poetry, politics, and history, provided the last bold counterpoint to the disconnected and dysfunctional “villages” of specialization into which science would fragment a mere generation later. And yet Humboldt, despite his enormous contribution to our most fundamental understanding of life, is largely forgotten today.

In The Invention of Nature: Alexander von Humboldt’s New World (public library), London-based design historian and writer Andrea Wulf sets out to liberate this extraordinary man’s legacy from the grip of obscurity and short-termism, illuminating the myriad threads of influence through which he continues to shape our present thinking about science, society, and life itself.

Alexander von Humboldt in his home library at 67 Oranienburger Strasse, Berlin. Chromolithograph copy of watercolor drawing by Eduard Hildebrant, 1856.

Wulf paints the backdrop for Humboldt’s enduring genius:

Described by his contemporaries as the most famous man in the world after Napoleon, Humboldt was one of the most captivating and inspiring men of his time. Born in 1769 into a wealthy Prussian aristocratic family, he discarded a life of privilege to discover for himself how the world worked. As a young man he set out on a five-year exploration to Latin America, risking his life many times and returning with a new sense of the world. It was a journey that shaped his life and thinking, and that made him legendary across the globe. He lived in cities such as Paris and Berlin, but was equally at home on the most remote branches of the Orinoco River or in the Kazakh Steppe at Russia’s Mongolian border. During much of his long life, he was the nexus of the scientific world, writing some 50,000 letters and receiving at least double that number. Knowledge, Humboldt believed, had to be shared, exchanged and made available to everybody.

But knowledge, for Humboldt, wasn’t merely an intellectual faculty — it was an embodied, holistic presence with life in all of its dimensions. A rock-climber, volcano-diver, and tireless hiker well into his eighties, Humboldt saw observation as an active endeavor and continually tested the limits of his body in his scientific pursuits. For him, mind, body, and spirit were all instruments of inquiry into the nature of the world. Two centuries before Carl Sagan sold us on the idea that “science invariably elicits a sense of reverence and awe,” Humboldt advocated for this then-radical notion amid a culture that drew a thick line between reason and emotion.

Wulf writes:

Fascinated by scientific instruments, measurements and observations, he was driven by a sense of wonder as well. Of course nature had to be measured and analysed, but he also believed that a great part of our response to the natural world should be based on the senses and emotions. He wanted to excite a “love of nature.” At a time when other scientists were searching for universal laws, Humboldt wrote that nature had to be experienced through feelings.

Out of this integrated approach to knowledge sprang Humboldt’s revolutionary view of life — the scientifically informed counterpart to Ada Lovelace’s famous assertion that “everything is naturally related and interconnected.” Wulf captures his greatest legacy:

Humboldt revolutionized the way we see the natural world. He found connections everywhere. Nothing, not even the tiniest organism, was looked at on its own. “In this great chain of causes and effects,” Humboldt said, “no single fact can be considered in isolation.” With this insight, he invented the web of life, the concept of nature as we know it today.

When nature is perceived as a web, its vulnerability also becomes obvious. Everything hangs together. If one thread is pulled, the whole tapestry may unravel.

Read more here.

3. DARK MATTER AND THE DINOSAURS

Every successful technology of thought, be it science or philosophy, is a time machine — it peers into the past in order to disassemble the building blocks of how we got to the present, then reassembles them into a sensemaking mechanism for where the future might take us. That’s what Harvard particle physicist and cosmologist Lisa Randall accomplishes in Dark Matter and the Dinosaurs: The Astounding Interconnectedness of the Universe (public library) — an intellectually thrilling exploration of how the universe evolved, what made our very existence possible, and how dark matter illuminates our planet’s relationship to its cosmic environment across past, present, and future.

Randall starts with a fascinating speculative theory linking dark matter to the extinction of the dinosaurs — positing that an event in the outermost reaches of the Solar System sixty-six million years ago catalyzed an earthly catastrophe without which we wouldn’t have come to exist. What makes her theory so striking is that it contrasts the most invisible aspects of the universe with the most dramatic events of our world while linking the two in a causal dance, reminding us just how limited our perception of reality really is — we are, after all, sensorial creatures blinded by our inability to detect the myriad complex and fascinating processes that play out behind the doors of perception.

Randall writes:

The Universe contains a great deal that we have never seen — and likely never will.

A 17th-century conception of non-space by the English physician and cosmologist Robert Fludd, found in Cosmigraphics: Picturing Space Through Time

In Humboldt’s tradition of interconnectedness, Randall weaves together a number of different disciplines — cosmology, particle physics, evolutionary biology, environmental science, geology, and even social science — to tell a larger story of the universe, our galaxy, and the Solar System. In one of several perceptive social analogies, she likens dark matter — which comprises 85% of matter in the universe, interacts with gravity, but, unlike the ordinary matter we can see and touch, doesn’t interact with light — to the invisible but instrumental factions of human society:

Even though it is unseen and unfelt, dark matter played a pivotal role in forming the Universe’s structure. Dark matter can be compared to the under-appreciated rank and file of society. Even when invisible to the elite decision makers, the many workers who built pyramids or highways or assembled electronics were crucial to the development of their civilizations. Like other unnoticed populations in our midst, dark matter was essential to our world.

But the theory itself, original and interesting as it may be, is merely a clever excuse to do two more important things: tell an expansive and exhilarating story of how the universe as we know it came to exist, and invite us to transcend the limits of our temporal imagination and our delusions of omnipotence. How humbling to consider that a tiny twitch caused by an invisible force in the far reaches of the cosmos millions of years ago hurled at our unremarkable piece of rock a meteoroid three times the width of Manhattan, which produced the most massive and destructive earthquake of all time, decimating three quarters of all living creatures on Earth. Had the dinosaurs not died, large mammals might never have come to dominate the planet and humanity wouldn’t be here to contemplate the complexities of the cosmos. And yet in a few billion years, the Sun will retire into the red giant phase of its stellar lifetime and eventually burn out, extinguishing our biosphere and Blake and Bach and every human notion of truth and beauty. Stardust to stardust.

Read more here.

4. THE THRILLING ADVENTURES OF LOVELACE AND BABBAGE

In 1843, Ada Lovelace — the only legitimate child of the poet Lord Byron — translated a scientific paper by Italian military engineer Luigi Menabrea titled Sketch of the Analytical Engine, adding seven footnotes to it. Together, they measured 65 pages — two and a half times the length of Menabrea’s original text — and included the earliest complete computer program, becoming the first true paper on computer science and rendering Lovelace the world’s first computer programmer. She was twenty-seven.

About a decade earlier, Lovelace had met the brilliant and eccentric British mathematician Charles Babbage who, when he wasn’t busy teaming up with Dickens to wage a war on street music, was working on strange inventions that would one day prompt posterity to call him the father of the computer. (Well, sort of.) The lifelong friendship that ensued between 17-year-old Lovelace and 41-year-old Babbage sparked an invaluable union of software and hardware to which we owe enormous swaths of modern life — including the very act of reading these words on this screen.

The unusual story of this Victorian power-duo is what graphic artist and animator Sydney Padua explores in the immensely delightful and illuminating The Thrilling Adventures of Lovelace and Babbage: The (Mostly) True Story of the First Computer (public library), itself a masterwork of combinatorial genius and a poetic analog to its subject matter — rigorously researched, it has approximately the same footnote-to-comic ratio as Lovelace’s trailblazing paper. The footnote, after all, is proto-hypertext linking one set of ideas to another, and in these analog hyperlinks, Padua draws on an impressive wealth of historical materials — from the duo’s scientific writings and lectures to Lovelace’s letters to Babbage’s autobiography to various accounts by their contemporaries.

Padua begins at the beginning, with Lovelace’s unusual upbringing as the daughter of Lord Byron, a “radical, adventurer, pan-amorist, and poet,” and Anne Isabella Milbanke, a “deeply moral Evangelical Christian and prominent anti-slavery campaigner.”

Determined to shield young Ada from any expression of her father’s dangerous “poetical” influence, her mother instructed the young girl’s nurse:

Be most careful always to speak the truth to her … take care not to tell her any nonsensical stories that will put fancies into her head.

She wasn’t spared the Victorian era’s brutal control mechanisms of women’s minds and bodies. Padua footnotes:

Ada’s upbringing was strict and lonely. She was given lessons while lying on a “reclining board” to perfect her posture. If she fidgeted, even with her fingers, her hands were tied in black bags and she was shut in a closet. She was five years old.

But the best control strategy for the disorderly tendencies of the poetical mind, it was determined, was thorough immersion in mathematics — which worked, but only to a degree.

Lovelace was eventually introduced to Babbage by the great Scottish mathematician, science writer, and polymath Mary Somerville — for whom, incidentally, the word “scientist” was coined.

And so one of history’s most paradigm-shifting encounters took place.

Implicit to the story is also a reminder that genius is as much the product of an individual’s exceptional nature as it is of the culture in which that individual is nourished. Genius leaps from the improbable into the possible — the courage of the leap is the function of individual temperament, but the horizons of possibility are to a large extent determined by the culture and the era.

Lovelace lived in an age when it was not only uncommon but even discouraged for women to engage in science, let alone author scientific papers themselves. In another illuminating footnote, Padua quotes from Babbage’s autobiography, capturing Lovelace’s dance with this duality of possibility and limitation perfectly:

The late Countess of Lovelace informed me that she had translated the memoir of Menabrea. I asked why she had not herself written an original paper on a subject with which she was so intimately acquainted? To this Lady Lovelace replied that the thought had not occurred to her.

And yet groundbreaking thoughts that hadn’t occurred to others did occur to Lovelace.

See more here.

5. THE BLUE WHALE

“The world is blue at its edges and in its depths,” Rebecca Solnit wrote in her beautiful meditation on the color of distance and desire. No creature compresses the edgeless grandeur of our Pale Blue Dot into a single body as perfectly as the blue whale — an animal absolutely awesome in the true sense of the word. That awe-striking being is what London-based illustrator Jenni Desmond celebrates in the marvelous nonfiction children’s book The Blue Whale (public library) — a loving science lullaby about our planet’s biggest creature, and a beautiful addition to the finest children’s books celebrating science.

Alongside Desmond’s immeasurably warm and largehearted illustrations is her simply worded, deeply intelligent synthesis of what marine biologists know about this extraordinary mammal — in fact, she worked closely with Diane Gendron, a marine biologist who studies blue whales. At the heart of the book is a compassionate curiosity about the beings with whom we share this world, effecting what the great Mary Oliver called a “sudden awareness of the citizenry of all things within one world.”

Indeed, despite the gaping disparity of scales, we have more in common with this gentle giant of the ocean than we realize — the blue whale, like us, is a highly intelligent mammal and one of the few creatures with a lifespan comparable to our own.

There is a charming meta touch to the story — the protagonist, a little boy with a crown that evokes Maurice Sendak’s Max, is learning and dreaming about blue whales by reading this very book, which he is seen holding in a number of the scenes.

Although the whaling industry of yore may have inspired some legendary art, more than 360,000 blue whales were killed in the first half of the twentieth century as these magnificent creatures were being reduced to oil, blubber, baleen, and meat. A global ban on whale hunting made them a protected species in 1966, but other forms of our arrogant anthropocentrism are putting them in danger anew as our commercial fishing entangles them in its indiscriminate nets, our passenger ships pollute their habitats, and our general human activity continues to raise ocean temperatures.

And yet it isn’t with alarmism or bitter lamentation but with love befitting this largest-hearted of earthly creatures — its heart alone weighs around 1,300 pounds — that Desmond invites us into the world of the blue whale. She writes in the preface:

Blue whales are magnificent and intelligent creatures, and like all of the natural world they deserve our admiration and care. It is only then that they will flourish and multiply in their native ocean home.

And so it is with admiration and care that Desmond opens our eyes to the glory of this beautiful and intelligent creature — a creature whose own eye measures only six inches wide.

Look closer here.

6. THE PHYSICIST & THE PHILOSOPHER

“It is the insertion of man with his limited life span that transforms the continuously flowing stream of sheer change … into time as we know it,” Hannah Arendt wrote in her brilliant inquiry into time, space, and our thinking ego. A generation earlier, Virginia Woolf contemplated how this insertion engenders the astonishing elasticity of time; a generation later, Patti Smith pondered the subjectivity of how we experience time’s continuous flow. These reflections, once so radical and now so woven into the cultural fabric, wouldn’t have been possible without a fateful conversation that took place on April 6, 1922, which steered the course of twentieth-century science and shaped our experience of time.

So argues science historian Jimena Canales in The Physicist and the Philosopher: Einstein, Bergson, and the Debate That Changed Our Understanding of Time (public library) — a masterwork of cultural forensics, dissecting the many dimensions of the landmark conversation between Albert Einstein and Henri Bergson.


What makes the encounter particularly notable is that unlike the canon of great public conversations between intellectual titans — including those between David Bohm and Jiddu Krishnamurti, Margaret Mead and James Baldwin, and Matthieu Ricard and Jean-François Revel — where surface disagreements are undergirded by and ultimately reveal a larger shared ethos, Einstein and Bergson clashed completely and vehemently on the subject of their conversation: the nature of time. Einstein insisted that only two types of time existed: physical, the kind measured by clocks, and psychological, the subjective kind Virginia Woolf would later observe. For Bergson, this was a barbaric and reductionist perspective robbing time of the philosophical dimension that permeates nearly every aspect of how we experience its flow.

The debris of that disagreement became the foundation of our present ideas about the fabric of existence.

Art by Lisbeth Zwerger for a rare edition of Alice in Wonderland

What the encounter also reveals is the astounding amount of humanity upon which science, with all of its presumed rationalism and universal objectivity, is built. How pause-giving to think that our present understanding of time is largely the function of the personal differences between two men. Canales writes:

While Einstein searched for consistency and simplicity, Bergson focused on inconsistencies and complexities.

[…]

Bergson was the paradigmatic philosopher of memories, dreams, and laughter.

[…]

Time, he argued, was not something out there, separate from those who perceived it. It did not exist independently from us. It involved us at every level.

Bergson found Einstein’s definition of time in terms of clocks completely aberrant. The philosopher did not understand why one would opt to describe the timing of a significant event, such as the arrival of a train, in terms of how that event matched against a watch. He did not understand why Einstein tried to establish this particular procedure as a privileged way to determine simultaneity. Bergson searched for a more basic definition of simultaneity, one that would not stop at the watch but that would explain why clocks were used in the first place.

At that point, Einstein was busy rattling our understanding of time with his relativity theory. Bergson, one of the most prominent philosophers of the century and a major influence on such luminaries as Virginia Woolf, Gertrude Stein, T.S. Eliot, and William Faulkner, had advanced a theory of time that explained what the mechanics of clock-time could not, from the malleability of memory to the perplexities of premonitions. A staunch defender of intuition over the intellect, Bergson was sometimes accused, most famously by Bertrand Russell, of anti-intellectualism — but he was undeniably one of the most intelligent and incisive minds of his time. Although today Einstein is the better-known of the two, the opposite was true at the time of their confrontation, the consequences of which were profound and rippled out not only across the scientific community but across all of culture.

Read more here.

7. WHAT TO THINK ABOUT MACHINES THAT THINK

When Ada Lovelace and Charles Babbage invented the world’s first computer, their “Analytical Engine” became the evolutionary progenitor of a new class of human extensions — machines that think. A generation later, Alan Turing picked up where they left off and, in laying the foundations of artificial intelligence with his Turing Test, famously posed the techno-philosophical question of whether a computer could ever enjoy strawberries and cream or compel you to fall in love with it.

From its very outset, this new branch of human-machine evolution made it clear that any answer to these questions would invariably alter how we answer the most fundamental questions of what it means to be human.

That’s what Edge founder John Brockman explores in the 2015 edition of his annual question, inviting 192 of today’s most prominent thinkers to tussle with these core questions of artificial intelligence and its undergirding human dilemmas. The answers, collected in What to Think About Machines That Think: Today’s Leading Thinkers on the Age of Machine Intelligence (public library), come from such diverse contributors as physicist and mathematician Freeman Dyson, music pioneer Brian Eno, biological anthropologist Helen Fisher, Positive Psychology founding father Martin Seligman, computer scientist and inventor Danny Hillis, TED curator Chris Anderson, neuroscientist Sam Harris, legendary curator Hans Ulrich Obrist, media theorist Douglas Rushkoff, cognitive scientist and linguist Steven Pinker, and yours truly.

Art from Neurocomic, a graphic novel about how the brain works

The answers are strewn with a handful of common threads, a major one being the idea that artificial intelligence isn’t some futuristic abstraction but a palpably present reality with which we’re already living.

Beloved musician and prolific reader Brian Eno looks at the many elements of his day, from cooking porridge to switching on the radio, that work seamlessly thanks to an invisible mesh of connected human intelligence — a Rube Goldberg machine of micro-expertise that makes it possible for the energy in a distant oil field to power the stove built in a foreign factory out of components made by scattered manufacturers, and ultimately cook his porridge. In a sentiment that calls to mind I, Pencil — that magnificent vintage allegory of how everything is connected — Eno explains why he sees artificial intelligence not as a protagonist in a techno-dystopian future but as an indelible and fruitful part of our past and present:

My untroubled attitude results from my almost absolute faith in the reliability of the vast supercomputer I’m permanently plugged into. It was built with the intelligence of thousands of generations of human minds, and they’re still working at it now. All that human intelligence remains alive, in the form of the supercomputer of tools, theories, technologies, crafts, sciences, disciplines, customs, rituals, rules of thumb, arts, systems of belief, superstitions, work-arounds, and observations that we call Global Civilization.

Global Civilization is something we humans created, though none of us really know how. It’s out of the individual control of any of us — a seething synergy of embodied intelligence that we’re all plugged into. None of us understands more than a tiny sliver of it, but by and large we aren’t paralyzed or terrorized by that fact — we still live in it and make use of it. We feed it problems — such as “I want some porridge” — and it miraculously offers us solutions that we don’t really understand.

[…]

We’ve been living happily with artificial intelligence for thousands of years.

Art by Laura Carlin for The Iron Giant by Ted Hughes.

In one of the volume’s most optimistic essays, TED curator Chris Anderson, who belongs to the increasingly endangered tribe of public idealists, considers how this “hive mind” of semi-artificial intelligence could provide a counterpoint to some of our worst human tendencies and amplify our collective potential for good:

We all know how flawed humans are. How greedy, irrational, and limited in our ability to act collectively for the common good. We’re in danger of wrecking the planet. Does anyone thoughtful really want humanity to be evolution’s final word?

[…]

Intelligence doesn’t reach its full power in small units. Every additional connection and resource can help expand its power. A person can be smart, but a society can be smarter still…

By that logic, intelligent machines of the future wouldn’t destroy humans. Instead, they would tap into the unique contributions that humans make. The future would be one of ever richer intermingling of human and machine capabilities. I’ll take that route. It’s the best of those available.

[…]

Together we’re semiunconsciously creating a hive mind of vastly greater power than this planet has ever seen — and vastly less power than it will soon see.

“Us versus the machines” is the wrong mental model. There’s only one machine that really counts. Like it or not, we’re all — us and our machines — becoming part of it: an immense connected brain. Once we had neurons. Now we’re becoming the neurons.

Astrophysicist and philosopher Marcelo Gleiser, who has written beautifully about how to live with mystery in a culture obsessed with knowledge, echoes this idea by pointing out the myriad mundane ways in which “machines that think” already permeate our daily lives:

We define ourselves through our techno-gadgets, create fictitious personas with weird names, doctor pictures to appear better or at least different in Facebook pages, create a different self to interact with others. We exist on an information cloud, digitized, remote, and omnipresent. We have titanium implants in our joints, pacemakers and hearing aids, devices that redefine and extend our minds and bodies. If you’re a handicapped athlete, your carbon-fiber legs can propel you forward with ease. If you’re a scientist, computers can help you extend your brainpower to create well beyond what was possible a few decades back. New problems that once were impossible to contemplate, or even formulate, come around every day. The pace of scientific progress is a direct correlate of our alliance with digital machines.

We’re reinventing the human race right now.

Another common thread running across a number of the answers is the question of what constitutes “artificial” intelligence in the first place and how we draw the line between machine thought and human thought. Caltech theoretical physicist and cosmologist Sean Carroll performs elegant semantic acrobatics to invert the question:

We are all machines that think, and the distinction between different types of machines is eroding.

We pay a lot of attention these days, with good reason, to “artificial” machines and intelligences — ones constructed by human ingenuity. But the “natural” ones that have evolved through natural selection, like you and me, are still around. And one of the most exciting frontiers in technology and cognition is the increasingly permeable boundary between the two categories.

Art from Alice in Quantumland by Robert Gilmore, an allegory of quantum physics inspired by Alice in Wonderland

In my own contribution to the volume, I consider the question of “thinking machines” from the standpoint of what thought itself is and how our human solipsism is limiting our ability to envision and recognize other species of thinking:

Thinking isn’t mere computation — it’s also cognition and contemplation, which inevitably lead to imagination. Imagination is how we elevate the real toward the ideal, and this requires a moral framework of what is ideal. Morality is predicated on consciousness and on having a self-conscious inner life rich enough to contemplate the question of what is ideal. The famous aphorism attributed to Einstein — “Imagination is more important than knowledge” — is interesting only because it exposes the real question worth contemplating: not that of artificial intelligence but of artificial imagination.

Of course, imagination is always “artificial,” in the sense of being concerned with the unreal or trans-real — of transcending reality to envision alternatives to it — and this requires a capacity for accepting uncertainty. But the algorithms driving machine computation thrive on goal-oriented executions in which there’s no room for uncertainty. “If this, then that” is the antithesis of imagination, which lives in the unanswered, and often vitally unanswerable, realm of “What if?” As Hannah Arendt once wrote, losing our capacity for asking such unanswerable questions would be to “lose not only the ability to produce those thought-things that we call works of art but also the capacity to ask all the unanswerable questions upon which every civilization is founded.”

[…]

Will machines ever be moral, imaginative? It’s likely that if and when they reach that point, theirs will be a consciousness that isn’t beholden to human standards. Their ideals will not be our ideals, but they will be ideals nonetheless. Whether or not we recognize those processes as thinking will be determined by the limitations of human thought in understanding different — perhaps wildly, unimaginably different — modalities of thought itself.

See more responses here.

BONUS: THUNDER & LIGHTNING

Although this gem is among the best art books of the year, it is also a project of significant scientific scholarship, so it warrants inclusion among the year’s best science books as well.

“Sailors have an expression about the weather: they say, the weather is a great bluffer,” E.B. White wrote in his elevating letter of assurance to a man who had lost faith in humanity, adding: “I guess the same is true of our human society — things can look dark, then a break shows in the clouds, and all is changed, sometimes rather suddenly.” Our most steadfast companion since the dawn of our species, the weather seeded our earliest myths, inspired some of our greatest art, affects the way we think, and continues to lend itself to such apt metaphors for the human experience. Its reliable inconstancy constantly assures us that neither storm nor sunshine lasts forever; that however thick the gloom which shrouds today, the sun always rises tomorrow.

That abiding and dimensional relationship with the weather is what artist, Guggenheim Fellow, and American Museum of Natural History artist-in-residence Lauren Redniss explores in the beguiling Thunder & Lightning: Weather Past, Present, Future (public library).

Part encyclopedia and part almanac, the book is a tapestry of narrative threads highlighting various weather-related curiosities, from Eskimo dream mythology to the science of lightning to the economics of hurricanes to Benjamin Franklin’s inclination for “air baths.” Although Redniss’s selections might give the impression of trivia at first brush, make no mistake — these are not random factlets that trivialize their subject but an intentional kaleidoscopic gleam that shines the light of attention onto some of the most esoteric and enchanting aspects of the weather.

Like Redniss’s previous book — her astonishing visual biography of Marie Curie — this project is enormously ambitious both conceptually and in its execution. Redniss created her illustrations using copperplate etching, an early printmaking technique popular prior to 1820, and typeset the text in an original font she designed herself, which she titled Qanec LR after the Eskimo word for “falling snow.”

Take a closer look here.

Step into the cultural time machine with selections for the best science books of 2014, 2013, 2012, and 2011.

BP

My Favorite Things: Maira Kalman’s Illustrated Catalog of Unusual Objects, Memories, and Delight

“Go out and walk. That is the glory of life.”

Four decades after Barthes listed his favorite things, which prompted Susan Sontag to list hers, Maira Kalman — one of the most enchanting, influential, and unusual creative voices today, and a woman of piercing insight — does something very similar and very different in her magnificent book My Favorite Things (public library).

Kalman not only lives her one human life with remarkable open-heartedness, but also draws from its private humanity warm and witty wisdom on our shared human experience. There is a spartan sincerity to her work, an elegantly choreographed spontaneity — words meticulously chosen to be as simple as possible, yet impossibly expressive; drawings that invoke childhood yet brim with the complex awarenesses of a life lived long and wide. She looks at the same world we all look at but sees what no one else sees — that magical stuff of “the moments inside the moments inside the moments.” Here, her many-petaled mind blossoms in its full idiosyncratic whimsy as she catalogs the “personal micro-culture” of her inner life — her personal set of the objects and people and fragments of experience that constitute the ever-shifting assemblage we call a Self.

The book began as a companion to an exhibition Kalman curated to celebrate the anticipated reopening of the Cooper Hewitt, Smithsonian Design Museum. But it is also a kind of visual catalog wrapped inside a memoir, reminding us that our experience of art is laced with the minute details and monumental moments of our personal histories and is invariably shaped by them. Between Kalman’s original paintings and photographs based on her selections from the museum’s sweeping collection — the buttons and bathtubs, dogs and dandies, first editions of Winnie the Pooh and Alice’s Adventures in Wonderland and Proust’s letters — are also her childhood memories, her quirky personal collections, and her beautiful meditations on life.

Kalman writes in the introduction:

The pieces that I chose were based on one thing only — a gasp of DELIGHT.

Isn’t that the only way to curate a life? To live among things that make you gasp with delight?

And gasp one does, over and over. As Kalman makes her way through the vast Cooper Hewitt collection, her immeasurably lyrical interweavings of private and public expose that special way in which museums not only serve as temples to collective memory but also invite us to reopen the Proustian jars of our own memories with interest and aliveness and a capacity to gasp.

“Whoever invented the bed was a genius,” Kalman writes in her simple homage, inspired by a trading card ad from 1909. “When you get up from bed, get dressed in pants and socks.” The pants: French silk and linen breeches from 1750–1770; the socks: French knitted silk stockings from 1850–1900.

Her painting of a pair of yellow American slippers from the 1830s is really a love letter to walking, something Kalman sees as an existential activity and a creative device:

The ability to walk from one point to the next point, that is half the battle won.

Go out and walk.

That is the glory of life.

Beneath her painting of a quilted and embroidered silk Egyptian cap from the late 13th or early 14th century, Kalman hand-letters the perfect pairing — Pablo Neruda’s 1959 poem “Ode to Things”:

I love crazy things,
crazily.

I enjoy
tongs,
scissors.

I adore
cups,
rings,
soup spoons,
not to mention,
of course,
the hat.

As an enormous lover of Alice in Wonderland, I was particularly bewitched by Kalman’s painting of a photograph by Lewis Carroll, which calls to mind the real-life Alice who inspired his Wonderland.

There is also Kalman’s wink at Darwin’s despondent letter.

Painting a set of dolls made by Mexican nuns, Kalman notes in her singular style of wry awe:

The nuns have sensational fashion sense.

Emanating from the entire project is Kalman’s ability to witness life with equal parts humor and humility, and to always find the lyrical — as in her exquisite pairing of this early nineteenth-century European mount and a Lydia Davis poem.

The objects Kalman selects ultimately become a springboard for leaping into the things that move her most — like her great love of books, woven with such gentleness and subtlety into a French lamp shade from 1935:

The book. Calming object. Held in the hand.

Indeed, the screen does no justice to the magnificent object that is My Favorite Things, an object to be held in the hand and the heart. It follows Kalman’s equally enchanting The Principles of Uncertainty and Various Illuminations (Of a Crazy World), which she has complemented with such wonderful side projects as her illustrations for Strunk and White’s The Elements of Style and Michael Pollan’s Food Rules.

For a dimensional tour of Kalman’s mind and spirit, see Gael Towey’s wonderful short documentary.

Illustrations courtesy of Maira Kalman / HarperCollins; photographs my own

BP

