
The Paradox of Intellectual Promiscuity: Stephen Jay Gould on What Nabokov’s Butterfly Studies Reveal About the Unity of Creativity

“There is no science without fancy, and no art without facts.”

The history of human culture is rife with creators hailed as geniuses in one domain who also had a notable but lesser-known talent in another — take, for instance, Richard Feynman’s sketches, J.R.R. Tolkien’s illustrations, Sylvia Plath’s drawings, William Faulkner’s Jazz Age illustrations, Flannery O’Connor’s cartoons, David Lynch’s conceptual art, and Zelda Fitzgerald’s watercolors. Only rarely, however, do we encounter a person who has contributed to culture in a significant way in both art and science.

No one, argues Stephen Jay Gould (September 10, 1941–May 20, 2002) — perhaps the greatest science-storyteller humanity has ever had, a man of uncommon genius in the art of dot-connecting — better merits recognition for such duality of genius than Vladimir Nabokov, a titan of literary storytelling and a formidable lepidopterist who studied, classified, and drew a major group of butterflies, and even served as unofficial curator of lepidoptery at Harvard’s Museum of Comparative Zoology.

In a spectacular essay titled “The Paradox of Intellectual Promiscuity,” found in his altogether indispensable final essay collection I Have Landed: The End of a Beginning in Natural History (public library), Gould uses Nabokov’s example to make a beautiful and urgently necessary broader case against our culture’s chronic tendency to pit art and science against one another — “We have been befogged by a set of stereotypes about conflict and difference between these two great domains of human understanding,” he laments — and to assume that if a person has talent and passion for both areas, he or she can achieve greatness in only one and is necessarily a mere hobbyist in the other.

Gould writes:

We tend toward benign toleration when great thinkers and artists pursue disparate activities as a harmless hobby, robbing little time from their fundamental achievements… We grieve when we sense that a subsidiary interest stole precious items from a primary enterprise of great value… When we recognize that a secondary passion took substantial time from a primary source of fame, we try to assuage our grief over lost novels, symphonies, or discoveries by convincing ourselves that a hero’s subsidiary love must have informed or enriched his primary activity — in other words, that the loss in quantity might be recompensed by a gain in quality.

Nabokov’s drawing of ‘Eugenia onegini,’ named for Aleksandr Pushkin’s novel in verse, Eugene Onegin, which Nabokov translated. The illustration appeared on the endpaper of ‘Conclusive Evidence,’ Nabokov’s autobiography.

But Gould argues that neither lamentation of such “intellectual promiscuity” detracting from the primary endeavor nor the manufactured comfort of believing that one domain enriched the other is an appropriate response to Nabokov’s two great loves, literature and butterflies. Gould unambiguously annihilates a common misconception about the great author:

Nabokov was no amateur (in the pejorative sense of the term), but a fully qualified, clearly talented, duly employed professional taxonomist, with recognized “world class” expertise in the biology and classification of a major group, the Latin American Polyommatini, popularly known to butterfly aficionados as “blues.”

No passion burned longer, or more deeply, in Nabokov’s life than his love for the natural history and taxonomy of butterflies. He began in early childhood, encouraged by a traditional interest in natural history among the upper-class intelligentsia of Russia (not to mention the attendant economic advantages of time, resources, and opportunity).

[…]

The reasons often given for attributing to Nabokov either an amateur, or even only a dilettante’s, status arise from simple ignorance of accepted definitions for professionalism in this field.

[…]

Nabokov loved his butterflies as much as his literature. He worked for years as a fully professional taxonomist, publishing more than a dozen papers that have stood the test of substantial time.

That he received an annual salary of merely a thousand dollars during his six years at Harvard’s zoology museum and worked under the vague title Research Fellow shouldn’t be used as evidence of Nabokov’s amateurishness — in making a larger point about the rich history of people working on what they love for little or no pay, Gould points out that several esteemed curators at the museum during his own tenure worked as volunteers for the symbolic annual salary of one dollar. In one of his many spectacular, almost outraged asides — Gould’s signature intelligent zingers — he drives home the point that there is little correlation between merit and prestige:

Every field includes some clunkers and nitwits, even in high positions!

Returning to the two camps of explaining Nabokov’s dual giftedness — and parallel talents in general — Gould writes:

In seeking some explanation for legitimate grief, we may find solace in claiming that Nabokov’s transcendent genius permitted him to make as uniquely innovative and distinctive a contribution to lepidoptery as to literature. However much we may wish that he had chosen a different distribution for his time, we can at least, with appropriate generosity, grant his equal impact and benefit upon natural history… However, no natural historian has ever viewed Nabokov as an innovator, or as an inhabitant of what humanists call the “vanguard” (not to mention the avant-garde) and scientists the “cutting edge.” Nabokov may have been a major general of literature, but he can only be ranked as a trustworthy, highly trained career infantryman in natural history.

Even Nabokov’s butterfly drawings, Gould points out, were great but far from masterworks of natural history illustration, especially in comparison to the work of such visionaries as butterfly-drawing grand dame Maria Merian.

One of Maria Merian’s pioneering butterfly drawings.

Here, we are reminded of another perilous pathology of our culture — in the cult of genius, as in any cult, we leave no room for nuance; mere greatness is not good enough — one must lay claim to grandeur. This is perhaps the most extreme testament to how perfectionism thwarts creativity.

But despite his mere greatness at lepidoptery, Nabokov regarded his time at the zoology museum as the most “delightful and thrilling” in his adult life — so creatively electrified was the author there that his years at Harvard even produced history’s most epic and entertaining account of food poisoning. But his love of butterflies began much earlier. In fact, one of the very first things Nabokov wrote in English, at the age of twelve, was a paper on Lepidoptera. The only reason it wasn’t published was that it turned out the butterfly in question had already been described by someone else.

This remark, which Gould makes rather in passing, made me wonder whether the incident instilled in young Vladimir an early reverence for attribution of discovery. As Gould later notes in another passing mention, Nabokov frequently voiced annoyance with scientists and science writers who fail to attribute discovery — who neglect to acknowledge the person who discovered and named a butterfly species. Therein lies a broader, and rather timely, lament about our culture’s failure to honor discovery as a creative act and a subset of scholarship — such a scientist, after all, doesn’t invent a species, for it already exists in nature, but discovers it, names it, and contextualizes it in the canon of natural history. It is no coincidence that Nabokov’s own role at the Harvard Museum of Comparative Zoology was that of curator, for this is the task of the curator — to describe, arrange, and contextualize what already exists in such a way as to shed new light on its meaning, to discover and un-cover its significance and place in the canon of ideas.

Embedded in this act is also a lineage of discovery, similar to the “links in a chain” metaphor Pete Seeger used for creativity: I learned of Nabokov’s pet peeve about discovery thanks to Stephen Jay Gould — perhaps the greatest curator of scientific ideas the world has ever known, the greatest contextualizer of such ideas in the popular imagination — and you learned of it via me, and the person you tell about this will learn of it via you. All of us are links in the evolutionary chain of ideas, much like each butterfly species discovered is a link in the evolutionary chain of natural history. This is why Richard Dawkins, in coining the word meme, used a metaphor from evolutionary biology to describe how ideas spread: “Just as genes propagate themselves in the gene pool by leaping from body to body via sperms or eggs, so memes propagate themselves in the meme pool by leaping from brain to brain.”

But back to Nabokov: His dedication to the integrity of discovery prompted him to write a short poem titled “A Discovery” in 1943:

Dark pictures, thrones, the stones that pilgrims kiss,
Poems that take a thousand years to die
But ape the immortality of this
Red label on a little butterfly.

It might also be why he was so passionate about the integrity of detail. In a motto that calls to mind Susan Sontag’s memorable assertion that “a writer is a professional observer,” Nabokov instructed his literature students:

Caress the details, the divine details. In high art and pure science, detail is everything.

Butterfly drawing by Nabokov, August 1958 (Courtesy of Nabokov Museum)

In this, Gould finds the reconciliatory unity between Nabokov’s two great loves and how they communed with one another:

Although time spent on lepidoptery almost surely decreased his literary output, the specific knowledge and the philosophical view of life that Nabokov gained from his scientific career directly forged (or at least strongly contributed to) his unique literary style and excellence… Perhaps the major linkage of science and literature lies in some distinctive, underlying approach that Nabokov applied equally to both domains — a procedure that conferred the same special features upon all his efforts.

[…]

Among great twentieth-century thinkers, I know no better case than Nabokov’s for testing the hypothesis that an underlying unity of mental style (at a level clearly meriting the accolade of genius) can explain one man’s success in extensive and fully professional work in two disciplines conventionally viewed as maximally different, if not truly opposed. If we can validate this model for attributing interdisciplinary success to a coordinating and underlying mental uniqueness, rather than invoking the conventional argument about overt influence of one field upon another, then Nabokov’s story may teach us something important about the unity of creativity, and the falsity (or at least the contingency) of our traditional separation, usually in mutual recrimination, of art from science.

Therein, Gould argues, lies the only true solace in the accusation of “intellectual promiscuity.” Debunking the two false explanations of the Nabokov paradox — that “lepidoptery represented a harmless private passion, robbing no substantial time from his literary output” and that “his general genius at least made his lepidoptery as distinctive and as worthy as his literature” — Gould writes:

Nabokov’s two apparently disparate careers therefore find their common ground on the most distinctive feature of his unusual intellect and uncanny skill — the almost obsessive attention to meticulous and accurate detail that served both his literary productions and his taxonomic descriptions so well, and that defined his uncompromising commitment to factuality as both a principle of morality and a guarantor and primary guide to aesthetic quality.

Science and literature therefore gain their union on the most palpable territory of concrete things, and on the value we attribute to accuracy, even in smallest details, as a guide and an anchor for our lives, our loves, and our senses of worth… Of all scientific subfields, none raises the importance of intricate detail to such a plateau of importance as Nabokov’s chosen profession of taxonomic description for small and complex organisms. To function as a competent professional in the systematics of Lepidoptera, Nabokov really had no choice but to embrace such attention to detail, and to develop such respect for nature’s endless variety.

[…]

The universal and defining excellence of a professional taxonomist built a substrate for the uncommon, and (in Nabokov’s case) transcendent, excellence of a writer.

Young Vladimir and Véra Nabokov by Thomas Doyle from ‘The Who, the What, and the When: 65 Artists Illustrate the Secret Sidekicks of History.’

But Gould’s most important point of all has little to do with Nabokov and everything to do with the toxic mythologies of creativity to which we, as a culture and as individuals, subscribe:

An ancient, and basically anti-intellectual, current in the creative arts has now begun to flow more strongly than ever before in recent memory — the tempting Siren song of a claim that the spirit of human creativity stands in direct opposition to the rigor in education and observation that breeds both our love for factual detail and our gain of sufficient knowledge and understanding to utilize this record of human achievement and natural wonder.

No more harmful nonsense exists than this common supposition that deepest insight into great questions about the meaning of life or the structure of reality emerges most readily when a free, undisciplined, and uncluttered (read, rather, ignorant and uneducated) mind soars above mere earthly knowledge and concern. The primary reason for emphasizing the supreme aesthetic and moral value of detailed factual accuracy, as Nabokov understood so well, lies in our need to combat this alluring brand of philistinism if we wish to maintain artistic excellence as both a craft and an inspiration.

[…]

If we assign too much of our total allotment to the mastery of detail, we will have nothing left for general theory and integrative wonder. But such a silly model of mental functioning can only arise from a false metaphorical comparison of human creativity with irrelevant systems based on fixed and filled containers — pennies in a piggy bank or cookies in a jar.

Gould ends by exhorting us:

Let us celebrate Nabokov’s excellence in natural history, and let us also rejoice that he could use the same mental skills and inclinations to follow another form of bliss.

[…]

Human creativity seems to work much as a coordinated and complex piece, whatever the different emphases demanded by disparate subjects — and we will miss the underlying commonality if we only stress the distinctions of external subjects and ignore the unities of internal procedure. If we do not recognize the common concerns and characteristics of all creative human activity, we will fail to grasp several important aspects of intellectual excellence — including the necessary interplay of imagination and observation (theory and empirics) as an intellectual theme, and the confluence of beauty and factuality as a psychological theme — because one field or the other traditionally downplays one side of a requisite duality.

[…]

I cannot imagine a better test case for extracting the universals of human creativity than the study of deep similarities in intellectual procedure between the arts and sciences.

No one grasped the extent of this underlying unity better than Vladimir Nabokov, who worked with different excellences as a complete professional in both domains.

[…]

Nabokov broke the boundaries of art and science by stating that the most precious desideratum of each domain must also characterize any excellence in the other — for, after all, truth is beauty, and beauty truth.

Gould seals this beautiful truth with a line — an exquisite, ennobling, oft-cited line — from one of Nabokov’s interviews:

There is no science without fancy, and no art without facts.

I Have Landed remains one of the finest tapestries of thought ever woven in the history of science storytelling. Complement this particular thread with Nabokov on inspiration, censorship and solidarity, what makes a great writer, what makes a great reader, and his sublime love letters to his wife.


The Relationship Between Creativity and Mental Illness

The science behind the “tortured genius” myth and what it reveals about how the creative mind actually works.

“I think I’ve only spent about ten percent of my energies on writing,” Pulitzer Prize-winning writer Katherine Anne Porter confessed in a 1963 interview. “The other ninety percent went to keeping my head above water.” While art may be a form of therapy for the rest of us, Porter’s is a sentiment far from uncommon among the creatively gifted who make that art. Why?

When Nancy Andreasen took a standard IQ test in kindergarten, she was declared a “genius.” But she was born in the late 1930s, an era when her own mother warned her that no one would marry a woman with a Ph.D. Still, she became a psychiatrist and a neuroscientist, and made understanding the brain’s creative capacity her life’s work. Having grown up steeped in ambivalence about her “diagnosis” of extraordinary intellectual and creative ability, Andreasen wondered about the social forces at work in the nature-nurture osmosis of genius — about how many people of natural genius were born throughout history whose gifts were never manifested, suppressed by lack of nurture. “Half of the human beings in history are women,” she noted, “but we have had so few women recognized for their genius. How many were held back by societal influences, similar to the ones I encountered and dared to ignore?” (One need only look at the case of Benjamin Franklin and his sister to see Andreasen’s point.)

Andreasen didn’t heed her mother’s warning and went on to become a pioneer of the neuroimaging revolution, setting out to understand how “genius” came to be and whether its manifestation could be actively nurtured — how we, as individuals and as a society, could put an end to wasting human gifts. She did get a Ph.D., too, but in Renaissance English literature rather than biochemistry — a multidisciplinary path that lends her approach a unique vantage at that fertile intersection of science and the humanities.

Neuroimaging of an axon from Portraits of the Mind: Visualizing the Brain from Antiquity to the 21st Century

In The Creating Brain: The Neuroscience of Genius (public library), Andreasen — whom Vonnegut once called “our leading authority on creativity” — crystallizes more than three decades of her work at the intersection of neuroscience, psychology, and cultural history.

One of the most interesting chapters in the book deals with the correlation between creativity and mental illness, bringing scientific rigor to such classic anecdotal examples as Van Gogh’s letters or Sylvia Plath’s journals or Leo Tolstoy’s diary of depression or Virginia Woolf’s suicide note. Having long opposed the toxic “tortured genius” myth of creativity, I was instantly intrigued by Andreasen’s inquiry, the backdrop of which she paints elegantly:

Did mental illness facilitate [these creators’] unique abilities, whether it be to play a concerto or to perceive a novel mathematical relationship? Or did mental illness impair their creativity after its initial meteoric burst in their twenties? Or is the relationship more complex than a simple one of cause and effect, in either direction?

She cites the work of Havelock Ellis, one of the earliest scholars of creativity, a Victorian physician, writer, and social reformer ahead of his time. In 1926, in his late sixties, he published A Study of British Genius, an effort to provide a scientific assessment of the link between genius and psychopathology by studying a sample of people found in the British Dictionary of National Biography — a compendium of about 30,000 eminent public figures, whom he sifted through a set of criteria to identify the 1,030 who displayed “any very transcendent degree of native ability.” Andreasen recounts his findings:

The rate of “insanity” noted by Ellis is certainly higher than is usually recorded for the general population, for which the current base rate is 1 percent for schizophrenia and 1 percent for mania. These are the two most common psychotic illnesses. The rate of melancholia — or what we currently call depression — is similar to current lifetime population rates of approximately 10 to 20 percent.

Once she became a psychiatrist, having come from a literary world “well populated with people who had vividly described symptoms of mental illness,” Andreasen decided to apply everything science had uncovered in the decades since Ellis’s work and design a rigorous study on the relationship between creativity and mental illness. She had attended the University of Iowa Medical School and completed her residency in psychiatry there — a fortuitous circumstance that presented her with an ideal sample pool for her study: the Iowa Writers’ Workshop, one of the most prestigious creative-writing programs in the world, whose faculty since its founding in 1936 has included such distinguished writers as Kurt Vonnegut and Annie Dillard.

Kurt Vonnegut was one of the authors Andreasen studied.

Andreasen’s study had a couple of crucial points of differentiation from Ellis’s work and other previous efforts: rather than relying on anecdotal accounts in biographies of her subjects, she employed structured, first-person interviews; she then applied rigorous diagnostic criteria to the responses, based on the Diagnostic and Statistical Manual of Mental Disorders, the bible of modern psychiatry. Andreasen writes:

In addition to incorporating diagnostic criteria, the Iowa Writers’ Workshop Study also improved on its predecessors by including a group of educationally matched controls. The Writers’ Workshop has a limited number of permanent faculty members (typically two poets and two prose writers). The remainder of the faculty in any given year consists of visiting writers who come to Iowa, drawn by its pastoral tranquility and an opportunity to be “far from the madding crowd” for a time of introspection, incubation, and isolation.

[…]

I began the study with a perfectly reasonable working hypothesis. I anticipated that the writers would be, in general, psychologically healthy, but that they would have an increased rate of schizophrenia in their family members. This hunch made good sense, based on the information that I had at that time. I was influenced by my knowledge about people such as James Joyce, Bertrand Russell, and Albert Einstein, all of whom had family members with schizophrenia.

But as she began administering the interviews and applying to them the diagnostic criteria, her working hypothesis quickly crumbled: To her bewilderment, the majority of the writers “described significant histories of mood disorder that met diagnostic criteria for either bipolar illness or unipolar depression.” Most had received treatment for it — some with hospitalization, some with outpatient therapy and medication. Perhaps the most startling contrast with her initial hunch was the fact that not a single writer displayed any symptoms of schizophrenia.

And this is where the monumental importance of her study shines: what Andreasen found wasn’t confirmation of the “tortured genius” myth — the idea that a great artist must have some dark, tragic pathology in order to create — but quite the opposite. These women and men had become successful writers not because of their troubled mental health but despite it.

Andreasen reflects on the findings:

Although many writers had had periods of significant depression, mania, or hypomania, they were consistently appealing, entertaining, and interesting people. They had led interesting lives, and they enjoyed telling me about them as much as I enjoyed hearing about them. Mood disorders tend to be episodic, characterized by relatively brief periods of low or high mood lasting weeks to months, interspersed with long periods of normal mood (known as euthymia to us psychiatrists). All the writers were euthymic at the time that I interviewed them, and so they could look back on their periods of depression or mania with considerable detachment. They were also able to describe how abnormalities in mood state affected their creativity. Consistently, they indicated that they were unable to be creative when either depressed or manic.

The sleep habits vs. creative output of famous writers.

More than that, her study confirmed two pervasive yet conflicting ideas about the relationship between creativity and mental illness:

One point of view … is that gifted people are in fact supernormal or superior in many ways. My writers certainly were. They were charming, fun, articulate, and disciplined. They typically followed very similar schedules, getting up in the morning and allocating a large chunk of time to writing during the earlier part of the day. They would rarely let a day go by without writing. In general, they had a close relationship with friends and family. They manifested the Freudian definition of health: lieben und arbeiten, “to love and to work.” On the other hand, they also manifested the alternative common point of view about the nature of genius: that it is “to madness near allied.” Many definitely had experienced periods of significant mood disorder. Importantly, though handicapping creativity when they occurred, these periods of mood disorder were not permanent or long-lived. In some instances, they may even have provided powerful material upon which the writer could later draw, as a Wordsworthian “emotion recollected in tranquility.”

Andreasen’s seminal study inspired a wave of related research, most notably a project by psychologist Kay Redfield Jamison, who examined 47 prominent British poets, playwrights, novelists, biographers, and artists and found that a significant portion of them had mood disorders. Harvard psychiatrist Joseph Schildkraut found even starker evidence of the same tendency in a study of 15 mid-century abstract expressionists — about half had “some form of psychopathology, which was predominantly mood disorder.”

Andreasen returns to the question of why mood disorders are so common among writers, but schizophrenia — which she initially expected to find — is not:

The evidence supporting an association between artistic creativity and mood disorder is quite solid, as is the absence of an association with schizophrenia. The nature of artistic creativity, particularly literary creativity, is probably not compatible with the presence of an illness like schizophrenia, which causes many of its victims to be socially withdrawn and cognitively disorganized. An activity such as writing a novel or a play requires sustained attention for long periods of time and an ability to hold a complex group of characters and a plot line “in the brain” for as long as one or two years while the novel or play is being designed, written, and rewritten. This type of sustained concentration is extremely difficult for people suffering from schizophrenia.

Creativity in other fields may, however, be compatible with an illness like schizophrenia, particularly those fields in which the creative moment is achieved by flashes of insight about complex relationships or by exploring hunches and intuitions that ordinary folk might find strange or even bizarre.

(The famed Russian composer Tchaikovsky, who some scholars have speculated had symptoms of schizophrenia, articulated those “flashes of insight” spectacularly in his 1876 letter on the “immeasurable bliss” of creativity.)

Andreasen considers the unique psychoemotional constitution of the highly creative person, both its blessing and its curse:

Many personality characteristics of creative people … make them more vulnerable, including openness to new experiences, a tolerance for ambiguity, and an approach to life and the world that is relatively free of preconceptions. This flexibility permits them to perceive things in a fresh and novel way, which is an important basis for creativity. But it also means that their inner world is complex, ambiguous, and filled with shades of gray rather than black and white. It is a world filled with many questions and few easy answers. While less creative people can quickly respond to situations based on what they have been told by people in authority — parents, teachers, pastors, rabbis, or priests — the creative person lives in a more fluid and nebulous world. He or she may have to confront criticism or rejection for being too questioning, or too unconventional. Such traits can lead to feelings of depression or social alienation. A highly original person may seem odd or strange to others. Too much openness means living on the edge. Sometimes the person may drop over the edge… into depression, mania, or perhaps schizophrenia.

She considers the cognitive machinery common to both creative thinking and mental turmoil:

Creative ideas probably occur as part of a potentially dangerous mental process, when associations in the brain are flying freely during unconscious mental states — how thoughts must become momentarily disorganized prior to organizing. Such a process is very similar to that which occurs during psychotic states of mania, depression, or schizophrenia. In fact, the great Swiss psychiatrist Eugen Bleuler, who gave schizophrenia its name, described a “loosening of associations” as its most characteristic feature: “Of the thousands of associative threads that guide our thinking, this disease seems to interrupt, quite haphazardly, sometimes single threads, sometimes a whole group, and sometimes whole segments of them.”

Of course, we now know that this crossing of the wires that combines seemingly unrelated concepts is also the essence of creativity — or what Einstein once described as the “combinatory play” at the heart of ideation — and why dot-connecting is vital for great art. Andreasen writes:

When the associations flying through the brain self-organize to form a new idea, the result is creativity. But if they either fail to self-organize, or if they self-organize to create an erroneous idea, the result is psychosis. Sometimes both occur in the same person, and the result is a creative person who is also psychotic. As [schizophrenic mathematician John] Nash [who inspired the film A Beautiful Mind] once said: “the ideas I have about supernatural beings came to me the same way that my mathematical ideas did, so I took them seriously.”

This failure to self-organize stems from what cognitive scientists call input dysfunction — a glitch in the filtering system we use to tune out the vast majority of what is going on around us. Andreasen explains:

All human beings (and their brains) have to cope with the fact that their five senses gather more information than even the magnificent human brain is able to process. To put this another way: we need to be able to ignore a lot of what is happening around us — the smell of pizza baking, the sound of the cat meowing, or the sight of birds flying outside the window — if we are going to focus our attention and concentrate on what we are doing (in your case, for example, reading this book). Our ability to filter out unnecessary stimuli and focus our attention is mediated by brain mechanisms in regions known as the thalamus and the reticular activating system.

Creative people, Andreasen notes, can be more easily overwhelmed by stimuli and become distracted. Some of the writers in her study, upon realizing they had a tendency to be too sociable, employed various strategies for keeping themselves isolated from human contact for sizable stretches of time in order to create. (Victor Hugo famously locked away all his clothes to avoid the temptation of going out while completing The Hunchback of Notre Dame in 1830, writing at his desk wearing nothing but a large gray shawl.) And yet for all its capacity to overwhelm, the creative mind remains above all a spectacular blessing:

Our ability to use our brains to get “outside” our relatively limited personal perspectives and circumstances, and to see something other than the “objective” world, is a powerful gift. Many people fail to realize that they even have this gift, and most who do rarely use it.

The Creating Brain is a fascinating read in its entirety. Complement it with a brief cultural history of “genius,” Bob Dylan on creativity and the unconscious mind, the psychology of how mind-wandering and “positive constructive daydreaming” boost creativity, and Carole King on overcoming creative block.


Big Thinkers on the Only Things Worth Worrying About

A cross-disciplinary kaleidoscope of intelligent concerns for the self and the species.

In his famous and wonderfully heartening letter of fatherly advice, F. Scott Fitzgerald gave his young daughter Scottie a list of things to worry and not worry about in life. Among the unworriables, he named popular opinion, the past, the future, triumph, and failure “unless it comes through your own fault.” Among the worry-worthy, courage, cleanliness, and efficiency. What Fitzgerald touched on, of course, is the quintessential anxiety of the human condition, which drives us to worry about things big and small, mundane and monumental, often confusing the two classes. It was this “worryability” that young Italo Calvino resolved to shake from his life. A wonderful 1934 book classified all of our worries in five general categories that endure with astounding prescience and precision, but we still struggle to identify the things truly worth worrying about — and, implicitly, working to resolve — versus those that only strain our psychoemotional capacity with the deathly grip of anxiety.

‘My Wheel of Worry’ by Andrew Kuo, depicting his inner worries, arguments, counterarguments, and obsessions in the form of charts and graphs.

In What Should We Be Worried About? (public library), intellectual jockey and Edge founder John Brockman tackles this issue with his annual question — which has previously explored such conundrums as the single most elegant theory of how the world works (2012) and the best way to make ourselves smarter (2011) — and asks some of our era’s greatest thinkers in science, psychology, technology, philosophy, and more to each contribute one valid “worry” about our shared future. Rather than alarmist anxiety-slinging, however, the ethos of the project is quite the opposite — to put in perspective the things we worry about but shouldn’t, whether by our own volition or thanks to ample media manipulation, and contrast them with issues of actual concern, at which we ought to aim our collective attention and efforts in order to ensure humanity’s progress and survival.

Behavioral neuroscientist Kate Jeffery offers one of the most interesting answers, reminiscent of Alan Watts’s assertion that “without birth and death … the world would be static, rhythm-less, undancing, mummified,” exploring our mortality paradox and pointing to the loss of death as a thing to worry about:

Every generation our species distills the best of itself, packages it up and passes it on, shedding the dross and creating a fresher, newer, shinier generation. We have been doing this now for four billion years, and in doing so have transmogrified from unicellular microorganisms that do little more than cling to rocks and photosynthesize, to creatures of boundless energy and imagination who write poetry, make music, love each other and work hard to decipher the secrets of themselves and their universe.

And then they die.

Death is what makes this cyclical renewal and steady advance in organisms possible. Discovered by living things millions of years ago, aging and death permit a species to grow and flourish. Because natural selection ensures that the child-who-survives-to-reproduce is better than the parent (albeit infinitesimally so, for that is how evolution works), it is better for many species that the parent step out of the way and allow its (superior) child to succeed in its place. Put more simply, death stops a parent from competing with its children and grandchildren for the same limited resources. So important is death that we have, wired into our genes, a self-destruct senescence program that shuts down operations once we have successfully reproduced, so that we eventually die, leaving our children—the fresher, newer, shinier versions of ourselves—to carry on with the best of what we have given them: the best genes, the best art, and the best ideas. Four billion years of death has served us well.

Now, all this may be coming to an end, for one of the things we humans, with our evolved intelligence, are working hard at is trying to eradicate death. This is an understandable enterprise, for nobody wants to die—genes for wanting to die rarely last long in a species. For millennia, human thinkers have dreamed of conquering old age and death: the fight against it permeates our art and culture, and much of our science. We personify death as a specter and loathe it, fear it and associate it with all that is bad in the world. If we could conquer it, how much better life would become.

Celebrated filmmaker Terry Gilliam leans toward the philosophical with an answer somewhere between John Cage and Yoda:

I’ve given up asking questions. I merely float on a tsunami of acceptance of anything life throws at me… and marvel stupidly.

Music pioneer Brian Eno, a man of strong opinions on art and unconventional approaches to creativity, is concerned that we see politics, a force that impacts our daily lives on nearly every level, as something other people do:

Most of the smart people I know want nothing to do with politics. We avoid it like the plague — like Edge avoids it, in fact. Is this because we feel that politics isn’t where anything significant happens? Or because we’re too taken up with what we’re doing, be it Quantum Physics or Statistical Genomics or Generative Music? Or because we’re too polite to get into arguments with people? Or because we just think that things will work out fine if we let them be — that The Invisible Hand or The Technosphere will mysteriously sort them out?

Whatever the reasons for our quiescence, politics is still being done — just not by us. It’s politics that gave us Iraq and Afghanistan and a few hundred thousand casualties. It’s politics that’s bleeding the poorer nations for the debts of their former dictators. It’s politics that allows special interests to run the country. It’s politics that helped the banks wreck the economy. It’s politics that prohibits gay marriage and stem cell research but nurtures Gaza and Guantanamo.

But we don’t do politics. We expect other people to do it for us, and grumble when they get it wrong. We feel that our responsibility stops at the ballot box, if we even get that far. After that we’re as laissez-faire as we can get away with.

What worries me is that while we’re laissez-ing, someone else is faire-ing.

Barbara Strauch, science editor of The New York Times, echoes Richard Feynman’s lament about the general public’s scientific ignorance — not the good kind, but the kind that leads to the resurgence of preventable diseases — as well as the dismal state of science education. She sees oases of hope in that desert of ignorance but finds the disconnect worrisome:

Something quite serious has been lost. . . . This decline in general-interest science coverage comes at a time of divergent directions in the general public. At one level, there seems to be increasing ignorance. After all, it’s not just science news coverage that has suffered, but also the teaching of science in schools. And we just went through a political season that saw how all this can play out, with major political figures spouting off one silly statement after another, particularly about women’s health. . . .

But something else is going on, as well. Even as we have in some pockets what seems like increasing ignorance of science, we have at the same time, a growing interest of many. It’s easy to see, from where I sit, how high that interest is. Articles about anything scientific, from the current findings in human evolution to the latest rover landing on Mars, not to mention new genetic approaches to cancer — and yes, even the Higgs boson — zoom to the top of our newspaper’s most emailed list.

We know our readers love science and cannot get enough of it. And it’s not just our readers. As the rover Curiosity approached Mars, people of all ages in all parts of the country had “Curiosity parties” to watch news of the landing. Mars parties! Social media, too, has shown us how much interest there is across the board, with YouTube videos and tweets on science often becoming instant megahits.

So what we have is a high interest and a lot of misinformation floating around. And we have fewer and fewer places that provide real information to a general audience that is understandable, at least by those of us who do not yet have our doctorates in astrophysics. The disconnect is what we should all be worried about.

Nicholas Carr, author of the techno-dystopian The Shallows: What the Internet Is Doing to Our Brains, considers the effects that digital communication might be having on our intricate internal clocks and the strange ways in which our brains warp time:

I’m concerned about time — the way we’re warping it and it’s warping us. Human beings, like other animals, seem to have remarkably accurate internal clocks. Take away our wristwatches and our cell phones, and we can still make pretty good estimates about time intervals. But that faculty can also be easily distorted. Our perception of time is subjective; it changes with our circumstances and our experiences. When things are happening quickly all around us, delays that would otherwise seem brief begin to feel interminable. Seconds stretch out. Minutes go on forever. . . .

Given what we know about the variability of our time sense, it seems clear that information and communication technologies would have a particularly strong effect on personal time perception. After all, they often determine the pace of the events we experience, the speed with which we’re presented with new information and stimuli, and even the rhythm of our social interactions. That’s been true for a long time, but the influence must be particularly strong now that we carry powerful and extraordinarily fast computers around with us all day long. Our gadgets train us to expect near-instantaneous responses to our actions, and we quickly get frustrated and annoyed at even brief delays.

I know that my own perception of time has been changed by technology. . . .

As we experience faster flows of information online, we become, in other words, less patient people. But it’s not just a network effect. The phenomenon is amplified by the constant buzz of Facebook, Twitter, texting, and social networking in general. Society’s “activity rhythm” has never been so harried. Impatience is a contagion spread from gadget to gadget.

One of the gravest yet most lucid and important admonitions comes from classicist-turned-technologist Tim O’Reilly, who echoes Susan Sontag’s concerns about anti-intellectualism and cautions that the plague of ignorance might spread far enough to drive our civilization into extinction:

For so many in the techno-elite, even those who don’t entirely subscribe to the unlimited optimism of the Singularity, the notion of perpetual progress and economic growth is somehow taken for granted. As a former classicist turned technologist, I’ve always lived with the shadow of the fall of Rome, the failure of its intellectual culture, and the stasis that gripped the Western world for the better part of a thousand years. What I fear most is that we will lack the will and the foresight to face the world’s problems squarely, but will instead retreat from them into superstition and ignorance.

[…]

History teaches us that conservative, backward-looking movements often arise under conditions of economic stress. As the world faces problems ranging from climate change to the demographic cliff of aging populations, it’s wise to imagine widely divergent futures.

Yes, we may find technological solutions that propel us into a new golden age of robots, collective intelligence, and an economy built around “the creative class.” But it’s at least as probable that as we fail to find those solutions quickly enough, the world falls into apathy, disbelief in science and progress, and after a melancholy decline, a new dark age.

Civilizations do fail. We have never yet seen one that hasn’t. The difference is that the torch of progress has in the past always passed to another region of the world. But we’ve now, for the first time, got a single global civilization. If it fails, we all fail together.

Biological anthropologist Helen Fisher, who studies the brain on love and whose Why We Love remains indispensable, worries that we misunderstand men. She cites her research for some findings that counter common misconceptions and illustrate how gender stereotypes limit us:

Men fall in love faster too — perhaps because they are more visual. Men experience love at first sight more regularly; and men fall in love just as often. Indeed, men are just as physiologically passionate. When my colleagues and I have scanned men’s brains (using fMRI), we have found that they show just as much activity as women in neural regions linked with feelings of intense romantic love. Interestingly, in the 2011 sample, I also found that when men fall in love, they are faster to introduce their new partner to friends and parents, more eager to kiss in public, and want to “live together” sooner. Then, when they are settled in, men have more intimate conversations with their wives than women do with their husbands—because women have many of their intimate conversations with their girlfriends. Last, men are just as likely to believe you can stay married to the same person forever (76% of both sexes). And other data show that after a break up, men are 2.5 times more likely to kill themselves.

[…]

In the Iliad, Homer called love “magic to make the sanest man go mad.” This brain system lives in both sexes. And I believe we’ll make better partnerships if we embrace the facts: men love — just as powerfully as women.

David Rowan, editor of Wired UK and scholar of the secrets of entrepreneurship, worries about the growing disconnect between the data-rich and the data-poor:

Each day, according to IBM, we collectively generate 2.5 quintillion bytes — a tsunami of structured and unstructured data that’s growing, in IDC’s reckoning, at 60 per cent a year. Walmart drags a million hourly retail transactions into a database that long ago passed 2.5 petabytes; Facebook processes 2.5 billion pieces of content and 500 terabytes of data each day; and Google, whose YouTube division alone gains 72 hours of new video every minute, accumulates 24 petabytes of data in a single day. . . . Certainly there are vast public benefits in the smart processing of these zetta- and yottabytes of previously unconstrained zeroes and ones. . . .

Yet as our lives are swept unstoppably into the data-driven world, such benefits are being denied to a fast-emerging data underclass. Any citizen lacking a basic understanding of, and at least minimal access to, the new algorithmic tools will increasingly be disadvantaged in vast areas of economic, political and social participation. The data disenfranchised will find it harder to establish personal creditworthiness or political influence; they will be discriminated against by stock markets and by social networks. We need to start seeing data literacy as a requisite, fundamental skill in a 21st-century democracy, and to campaign — and perhaps even to legislate — to protect the interests of those being left behind.

Some, like social and cognitive scientist Dan Sperber, go meta, admonishing that our worries about worrying are ushering in a new age of anxiety, the consequences of which are debilitating:

Worrying is an investment of cognitive resources laced with emotions from the anxiety spectrum and aimed at solving some specific problem. It has its costs and benefits, and so does not worrying. Worrying for a few minutes about what to serve for dinner in order to please one’s guests may be a sound investment of resources. Worrying about what will happen to your soul after death is a total waste. Human ancestors and other animals with foresight may have only worried about genuine and pressing problems such as not finding food or being eaten. Ever since they became much more imaginative and began feeding their imagination with rich cultural inputs — that is, for at least 40,000 years (possibly much more) — humans have also worried about improving their lot individually and collectively — sensible worries — and about the evil eye, the displeasure of dead ancestors, the purity of their blood — misplaced worries.

A new kind of misplaced worries is likely to become more and more common. The ever-accelerating current scientific and technological revolution results in a flow of problems and opportunities that presents unprecedented cognitive and decisional challenges. Our capacity to anticipate these problems and opportunities is swamped by their number, novelty, speed of arrival, and complexity.

[…]

What I am particularly worried about is that humans will be less and less able to appreciate what they should really be worrying about and that their worries will do more harm than good. Maybe, just as on a boat in rapids, one should try not to slow anything down but just to optimize a trajectory one does not really control, not because safety is guaranteed and optimism is justified — the worst could happen — but because there is no better option than hope.

Mathematician and economist Eric R. Weinstein considers our conventional wisdom on what it takes to cultivate genius, including the myth of the 10,000-hour rule, and argues instead that the pursuit of excellence is a social malady that gets us nowhere meaningful:

We cannot excel our way out of modern problems. Within the same century, we have unlocked the twin nuclei of both cell and atom and created the conditions for synthetic biological and even digital life with computer programs that can spawn with both descent and variation on which selection can now act. We are in genuinely novel territory which we have little reason to think we can control; only the excellent would compare these recent achievements to harmless variations on the invention of the compass or steam engine. So surviving our newfound god-like powers will require modes that lie well outside expertise, excellence, and mastery.

Going back to Sewall Wright’s theory of adaptive landscapes of fitness, we see four modes of human achievement paired with what might be considered their more familiar accompanying archetypes:

A) Climbing—Expertise: Moving up the path of steepest ascent towards excellence for admission into a community that holds and defends a local maximum of fitness.

B) Crossing—Genius: Crossing the ‘Adaptive Valley’ to an unknown and unoccupied even higher maximum level of fitness.

C) Moving—Heroism: Moving ‘mountains of fitness’ for one’s group.

D) Shaking—Rebellion: Leveling peaks and filling valleys for the purpose of changing the landscape to be more even.

The essence of genius as a modality is that it seems to reverse the logic of excellence.

He adds the famous anecdote of Feynman’s Challenger testimony:

In the wake of the Challenger disaster, Richard Feynman was mistakenly asked to become part of the Rogers Commission investigating the accident. In a moment of candor, Chairman Rogers turned to Neil Armstrong in a men’s room and said, “Feynman is becoming a real pain.” Such is ever the verdict pronounced by steady hands over great spirits. But the scariest part of this anecdote is not the story itself but the fact that we are, in the modern era, now so dependent on old Feynman stories, having no living heroes with whom to replace him: the ultimate tragic triumph of runaway excellence.

This view, however, is remarkably narrow and defeatist. As Voltaire memorably remarked, “Appreciation is a wonderful thing: It makes what is excellent in others belong to us as well.” Without appreciation for the Feynmans of the past we duly don our presentism blinders and refuse to acknowledge the fact that genius is a timeless quality that belongs to all ages, not a cultural commodity of the present. Many of our present concerns have been addressed with enormous prescience in the past, often providing more thoughtful and richer answers than we are able to today, whether it comes to the value of space exploration or the economics of media or the essence of creativity or even the grand question of how to live. Having “living heroes” is an admirable aspiration, but they should never replace — only enhance and complement — the legacy and learnings of those who came before.

Indeed, this presentism bias is precisely what Noga Arikha, historian of ideas and author of Passions and Tempers: A History of the Humours, points to as her greatest worry in one of the most compelling answers. It’s something I’ve voiced as well in a recent interview with the Guardian. Arikha writes:

I worry about the prospect of collective amnesia.

While access to information has never been so universal as it is now — thanks to the Internet — the total sum of knowledge of anything beyond the present seems to be dwindling among those people who came of age with the Internet. Anything beyond 1945, if then, is a messy, remote landscape; the centuries melt into each other in an insignificant magma. Famous names are flickers on a screen, their dates irrelevant, their epochs dusty. Everything is equalized.

She points to a necessary antidote to this shallowing of our cultural hindsight:

There is a way out: by integrating the teaching of history within the curricula of all subjects—using whatever digital or other means we have to redirect attention to slow reading and old sources. Otherwise we will be condemned to living without perspective, robbed of the wisdom and experience with which to build for the future, confined by the arrogance of our presentism to repeating history without noticing it.

Berkeley developmental psychologist Alison Gopnik, author of The Philosophical Baby: What Children’s Minds Tell Us About Truth, Love, and the Meaning of Life, worries that much of modern parenting is concerned with the wrong things — particularly the push for overachievement — when evidence strongly indicates that the art of presence is the most important gift a parent can bestow upon a child:

Thinking about children, as I do for a living, and worrying go hand in hand. There is nothing in human life so important and urgent as raising the next generation, and yet it also feels as if we have very little control over the outcome. . . .

[But] “parenting” worries focus on relatively small variations in what parents and children do — co-sleeping or crying it out, playing with one kind of toy rather than another, more homework or less. There is very little evidence that any of this makes much difference to the way that children turn out in the long run. There is even less evidence that there is any magic formula for making one well-loved and financially supported child any smarter or happier or more successful as an adult than another.

Instead, she argues, it is neglect that parents should be most worried about — a moral intuition as old as the world, yet one lamentably diluted by modern parents’ misguided concerns:

More recently research into epigenetics has helped demonstrate just how the mechanisms of care and neglect work. Research in sociology and economics has shown empirically just how significant the consequences of early experience actually can be. The small variations in middle-class “parenting” make very little difference. But providing high-quality early childhood care to children who would otherwise not receive it makes an enormous and continuing difference up through adulthood. In fact, the evidence suggests that this isn’t just a matter of teaching children particular skills or kinds of knowledge—a sort of broader institutional version of “parenting.” Instead, children who have a stable, nurturing, varied early environment thrive in a wide range of ways, from better health to less crime to more successful marriages. That’s just what we’d expect from the evolutionary story. I worry more and more about what will happen to the generations of children who don’t have the uniquely human gift of a long, protected, stable childhood.

Journalist Rolf Dobelli, author of The Art of Thinking Clearly, offers an almost Alan Wattsian concern about the paradox of material progress:

As mammals, we are status seekers. Non-status seeking animals don’t attract suitable mating partners and eventually exit the gene pool. Thus goods that convey high status remain extremely important, yet out of reach for most of us. Nothing technology brings about will change that. Yes, one day we might re-engineer our cognition to reduce or eliminate status competition. But until that point, most people will have to live with the frustrations of technology’s broken promise. That is, goods and services will be available to everybody at virtually no cost. But at the same time, status-conveying goods will inch even further out of reach. That’s a paradox of material progress.

Columbia biologist Stuart Firestein, author of the fantastic Ignorance: How It Drives Science and champion of “thoroughly conscious ignorance,” worries about our unreasonable expectations of science:

Much of science is failure, but it is a productive failure. This is a crucial distinction in how we think about failure. More important is that not all wrong science is bad science. As with the exaggerated expectations of scientific progress, expectations about the validity of scientific results have simply become overblown. Scientific “facts” are all provisional, all needing revision or sometimes even outright upending. But this is not bad; indeed it is critical to continued progress. Granted it’s difficult, because you can’t just believe everything you read. But let’s grow up and recognize that undeniable fact of life. . . .

So what’s the worry? That we will become irrationally impatient with science, with its wrong turns and occasional blind alleys, with its temporary results that need constant revision. And that we will lose our trust and belief in science as the single best way to understand the physical universe. . . . From a historical perspective the path to discovery may seem clear, but the reality is that there are twists and turns and reversals and failures and cul-de-sacs all along the path to any discovery. Facts are not immutable and discoveries are provisional. This is the messy process of science. We should worry that our unrealistic expectations will destroy this amazing mess.

Neuroscientist Sam Harris, who has previously explored the psychology of lying, is concerned about bad incentives that bring out the worst in us, as individuals and as a society:

We need systems that are wiser than we are. We need institutions and cultural norms that make us better than we tend to be. It seems to me that the greatest challenge we now face is to build them.

Writer Douglas Rushkoff, author of Present Shock: When Everything Happens Now, offers a poignant and beautifully phrased, if exceedingly anthropocentric, concern:

We should worry less about our species losing its biosphere than losing its soul.

Our collective perceptions and cognition are our greatest evolutionary achievement. This is the activity that gives biology its meaning. Our human neural network is in the process of deteriorating and our perceptions are becoming skewed — both involuntarily and by our own hand — and all that most of us in the greater scientific community can do is hope that somehow technology picks up the slack, providing more accurate sensors, faster networks, and a new virtual home for complexity.

We should worry such networks won’t be able to function without us; we should also worry that they will.

Harvard’s Lisa Randall, one of the world’s leading theoretical physicists and the author of, most recently, Knocking on Heaven’s Door: How Physics and Scientific Thinking Illuminate the Universe and the Modern World, worries about the decline of major long-term investments in research, the kind that made the Large Hadron Collider possible, a decline that would in turn diminish our capacity for exploring the most intensely fascinating aspects of the unknown:

I’m worried I won’t know the answer to questions I care deeply about. Theoretical research (what I do) can of course be done more cheaply. A pencil and paper and even a computer are pretty cheap. But without experiments, or the hope of experiments, theoretical science can’t truly advance either.

One of the most poignant answers comes from psychologist Susan Blackmore, author of Consciousness: An Introduction, who admonishes that we’re disconnecting our heads from our hands by outsourcing so much of our manual humanity to machines, in the process amputating the present for the sake of some potential future. She writes:

What should worry us is that we seem to be worrying more about the possible disasters that might befall us than who we are becoming right now.

From ‘Things I have learned in my life so far’ by Stefan Sagmeister.

What Should We Be Worried About? is an awakening read in its entirety. For more of Brockman’s editorial-curatorial mastery, revisit the Edge Question compendiums from 2013 and 2012, and see Nobel-winning behavioral economist Daniel Kahneman on the marvels and flaws of our intuition.


Things I Have Learned in My Life So Far: Sagmeister’s Typographic Maxims on Life, Updated

Lived wisdom in living lettering.

About a decade ago, Stefan Sagmeister, one of the most celebrated and influential designers of our time, began keeping a running list of life-learnings in his diary. Eventually, he translated these private thoughts into a series of typographic artworks and public installations at the intersection of the personal and the philosophical, creating a new genre of metaphoric lettering, which ended up among the 100 ideas that changed graphic design and which he collected in a gorgeous artifact of a book in 2006.

A new, updated edition of Things I Have Learned in My Life So Far (public library), published by Abrams, capitalizes on the “so far” portion of the premise by complementing all of Sagmeister’s original learnings with 48 additional pages exploring new ones that touch on everything from obsession to confidence to love, contextualized by a triumvirate of great minds: design critic extraordinaire Steven Heller, psychologist Daniel Nettle, and Guggenheim curator Nancy Spector.

What makes Sagmeister’s maxims so beautiful and so moving is that, rather than mindless aphorisms dispensed as vacant cultural currency, they are the lived and living truths of a man who approaches his life with equal parts humor and humility, vigor and vulnerability.

One explores our shared propensity to worry (especially about sensitive subjects like money) and the immutable human desire to, as Italo Calvino memorably put it, lower our “worryability.” Sagmeister writes:

I used to lie awake at night brooding over problems that came up during the day. It kept me from sleeping, it was not enjoyable, and most importantly, I never arrived at a solution for anything — a remarkably effective way to be miserable.

(Cue in this great read on what the psychology of suicide-prevention teaches us about controlling our everyday worries.)

Another, a collaboration with Japanese illustrator Yuko Shimizu, enlists the whimsical to reveal the real:

Then there is the burden of vanity, the most extreme cultural symptoms of which are nothing short of heartbreaking:

Another captures the way in which the same compulsive drive that propels our most successful work poisons our inner lives:

Sagmeister writes:

I rarely obsess about things in my private life. I fail to care about the right shade of green for the couch, the sexual adventures of an ex-lover, or the correct setting for the meeting room air conditioner. I am not someone who misses things that aren’t already there.

However, I do sometimes obsess over the studio’s work and think that a number of our better projects come out of such obsessions.

Accompanying the artwork are entertaining anecdotes that form an additional narrative about the unpredictability of life, perhaps a meta-maxim about how trying to control life into order only produces more chaos. Case in point: this coin project, inspired by the natural grid of the stone plates covering the urban plaza that the city of Amsterdam lent Sagmeister for the endeavor. The installation consists of 650,000 euro cents and took 100 volunteers a week to assemble. The real beauty of it, though, was a subtle experiment in psychology and behavioral economics: Sagmeister and his team had intended to leave the finished piece in the plaza unguarded, waiting for passers-by to disassemble it; they had painted one side of each coin a distinctive bright blue so they could track, on a special site dedicated to the project, how the money traveled across Europe. (Inspired, perhaps, by the Follow the Money project.)

But everything, true to the workings of the human condition, took an unexpected turn: having witnessed the laborious weeklong assembly, people in the neighborhood grew so fond of the project that they took it upon themselves to guard the plaza from potential coin-takers. When a man attempted to carry away a bag full of coins, a neighbor immediately called the Dutch police, who proceeded to sweep in and sweep all the coins into buckets, transporting them to the Amsterdam police headquarters. The completed piece didn’t even last until sunrise.

But as a lover of diaries, a proponent of Joan Didion’s conviction that keeping a journal enriches the soul, and a dedicated diarist myself, I find this one most captivating — doubly so for its beautiful symmetry with how the project began:

Sagmeister, who has kept a diary since almost as early an age as Anaïs Nin, explains:

I have kept a diary since I was twelve years old. . . . I do use the diary to go back and reread certain passages, to see what my thinking was, and, most importantly, to discover things I feel need changing: When I have repeatedly described a circumstance or character trait of mine that I dislike, I eventually wind up doing something about it.

Things I Have Learned in My Life So Far features 18 gorgeous unbound signatures, tucked into a laser-cut slipcase. Complement it with Sagmeister on the fear of failure and how to sustain creativity.

All images copyright © 2013 Stefan Sagmeister courtesy of Abrams
