The Marginalian

Search results for “mark twain”

Joan Didion on Learning Not to Mistake Self-Righteousness for Morality

“When we start deceiving ourselves into thinking not that we want something or need something … but that it is a moral imperative that we have it, then is when we join the fashionable madmen.”


“To be a moral human being is to pay, be obliged to pay, certain kinds of attention,” Susan Sontag wrote in what remains some of the finest advice on writing and life. But if beneath the word “morality,” as James Baldwin asserted, “we are confronted with the way we treat each other,” then to be a moral human being requires an especial attentiveness to other human beings and their subjective realities. In consequence, any true morality is the diametric opposite of self-righteousness — the very thing that so often masquerades as morality.

That paradox is what Joan Didion (b. December 5, 1934), a writer who has spent a lifetime mirroring us back to ourselves, examines with characteristic incisiveness in a short 1965 essay titled “On Morality,” found in Slouching Towards Bethlehem (public library) — the classic 1968 essay collection that gave us Didion on keeping a notebook and her timeless meditation on self-respect.

Joan Didion, 1977 (Photograph: Mary Lloyd Estrin)

With an eye to our tendency to mistake for morality what is indeed a “monstrous perversion” of the ego, Didion writes:

“I followed my own conscience.” “I did what I thought was right.” How many madmen have said it and meant it? How many murderers? Klaus Fuchs said it, and the men who committed the Mountain Meadows Massacre said it, and Alfred Rosenberg said it. And, as we are rotely and rather presumptuously reminded by those who would say it now, Jesus said it. Maybe we have all said it, and maybe we have been wrong. Except on that most primitive level — our loyalties to those we love — what could be more arrogant than to claim the primacy of personal conscience?

Half a century later, Didion’s point seems all the more disquieting amid our present culture, where the filter bubble of our loyalties has rendered in-group/out-group divisiveness all the more primitive and where we combat our constant terror of coming unmoored from our certitudes by succumbing to unbridled self-righteousness under the pretext of morality. Didion considers how this tendency has made us less moral rather than more:

You see I want to be quite obstinate about insisting that we have no way of knowing — beyond that fundamental loyalty to the social code — what is “right” and what is “wrong,” what is “good” and what “evil.” I dwell so upon this because the most disturbing aspect of “morality” seems to me to be the frequency with which the word now appears; in the press, on television, in the most perfunctory kinds of conversation. Questions of straightforward power (or survival) politics, questions of quite indifferent public policy, questions of almost anything: they are all assigned these factitious moral burdens. There is something facile going on, some self-indulgence at work.

In a passage of excruciating timeliness today, as we fling our self-righteousnesses at each other from the two-finger slingshot of what was once the peace sign, Didion adds:

Of course we would all like to “believe” in something, like to assuage our private guilts in public causes, like to lose our tiresome selves; like, perhaps, to transform the white flag of defeat at home into the brave white banner of battle away from home. And of course it is all right to do that; that is how, immemorially, things have gotten done. But I think it is all right only so long as we do not delude ourselves about what we are doing, and why. It is all right only so long as we remember that all the ad hoc committees, all the picket lines, all the brave signatures in The New York Times, all the tools of agitprop straight across the spectrum, do not confer upon anyone any ipso facto virtue. It is all right only so long as we recognize that the end may or may not be expedient, may or may not be a good idea, but in any case has nothing to do with “morality.” Because when we start deceiving ourselves into thinking not that we want something or need something, not that it is a pragmatic necessity for us to have it, but that it is a moral imperative that we have it, then is when we join the fashionable madmen, and then is when the thin whine of hysteria is heard in the land, and then is when we are in bad trouble.

Slouching Towards Bethlehem remains an indispensable read. Complement this particular fragment with Mark Twain on morality vs. the intellect, artist Anne Truitt on the cure for our chronic self-righteousness, and James Baldwin and Chinua Achebe’s forgotten conversation about morality, then revisit Didion on grief, Hollywood’s diversity problem, and her all-time favorite books.

BP

Genes and the Holy G: Siddhartha Mukherjee on the Dark Cultural History of IQ and Why We Can’t Measure Intelligence

“If the history of medical genetics teaches us one lesson, it is [that] genes cannot tell us how to categorize or comprehend human diversity; environments can, cultures can, geographies can, histories can.”


Intelligence, Simone de Beauvoir argued, is not a ready-made quality “but a way of casting oneself into the world and of disclosing being.” Like the rest of De Beauvoir’s socially wakeful ideas, this was a courageously countercultural proposition — she lived in the heyday of the IQ craze, which sought to codify into static and measurable components the complex and dynamic mode of being we call “intelligence.” Even today, as we contemplate the nebulous future of artificial intelligence, we find ourselves stymied by the same core problem — how are we to synthesize and engineer intelligence if we are unable to even define it in its full dimension?

How the emergence of IQ tests contracted our understanding of intelligence rather than expanding it and what we can do to transcend their perilous cultural legacy is what practicing physician, research scientist, and Pulitzer-winning author Siddhartha Mukherjee explores throughout The Gene: An Intimate History (public library) — a rigorously researched, beautifully written detective story about the genetic components of what we experience as the self, rooted in Mukherjee’s own painful family history of mental illness and radiating a larger inquiry into how genetics illuminates the future of our species.

Siddhartha Mukherjee (Photograph: Deborah Feingold)

A crucial agent in our limiting definition of intelligence, which has a dark heritage in nineteenth-century biometrics and eugenics, was the British psychologist and statistician Charles Spearman, who became interested in the strong correlation between an individual’s performance on tests assessing very different mental abilities. He surmised that human intelligence is a function not of specific knowledge but of the individual’s ability to manipulate abstract knowledge across a variety of domains. Spearman called this ability “general intelligence,” shorthanded g. Mukherjee chronicles the monumental and rather grim impact of this theory on modern society:

By the early twentieth century, g had caught the imagination of the public. First, it captivated early eugenicists. In 1916, the Stanford psychologist Lewis Terman, an avid supporter of the American eugenics movement, created a standardized test to rapidly and quantitatively assess general intelligence, hoping to use the test to select more intelligent humans for eugenic breeding. Recognizing that this measurement varied with age during childhood development, Terman advocated a new metric to quantify age-specific intelligence. If a subject’s “mental age” was the same as his or her physical age, their “intelligence quotient,” or IQ, was defined as exactly 100. If a subject lagged in mental age compared to physical age, the IQ was less than a hundred; if she was more mentally advanced, she was assigned an IQ above 100.

A numerical measure of intelligence was also particularly suited to the demands of the First and Second World Wars, during which recruits had to be assigned to wartime tasks requiring diverse skills based on rapid, quantitative assessments. When veterans returned to civilian life after the wars, they found their lives dominated by intelligence testing.
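In plainer arithmetic, the metric Terman championed is a simple ratio: divide mental age by chronological age and multiply by one hundred. By that formula (a gloss of mine, not Mukherjee’s notation), a ten-year-old who reasons like a typical twelve-year-old scores 120, while one who reasons like a typical eight-year-old scores 80.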

Illustration by Emily Hughes from Wild

Because categories, measurements, and labels help us navigate the world and, in Umberto Eco’s undying words, “make infinity comprehensible,” IQ metrics enchanted the popular imagination with the convenient illusion of neat categorization. Like any fad that offers a shortcut for something difficult to achieve, they spread like wildfire across the societal landscape. Mukherjee writes:

By the early 1940s, such tests had become accepted as an inherent part of American culture. IQ tests were used to rank job applicants, place children in school, and recruit agents for the Secret Service. In the 1950s, Americans commonly listed their IQs on their résumés, submitted the results of a test for a job application, or even chose their spouses based on the test. IQ scores were pinned on the babies who were on display in Better Babies contests (although how IQ was measured in a two-year-old remained mysterious).

These rhetorical and historical shifts in the concept of intelligence are worth noting, for we will return to them in a few paragraphs. General intelligence (g) originated as a statistical correlation between tests given under particular circumstances to particular individuals. It morphed into the notion of “general intelligence” because of a hypothesis concerning the nature of human knowledge acquisition. And it was codified into “IQ” to serve the particular exigencies of war. In a cultural sense, the definition of g was an exquisitely self-reinforcing phenomenon: those who possessed it, anointed as “intelligent” and given the arbitration of the quality, had every incentive in the world to propagate its definition.

With an eye to evolutionary biologist Richard Dawkins’s culture-shaping coinage of the word “meme” (“Just as genes propagate themselves in the gene pool by leaping from body to body via sperms or eggs,” Dawkins wrote in his 1976 classic The Selfish Gene, “so memes propagate themselves in the meme pool by leaping from brain to brain.”) — Mukherjee argues that g became a self-propagating unit worthy of being thought of as “selfish g.” He writes:

It takes counterculture to counter culture — and it was only inevitable, perhaps, that the sweeping political movements that gripped America in the 1960s and 1970s would shake the notions of general intelligence and IQ by their very roots. As the civil rights movement and feminism highlighted chronic political and social inequalities in America, it became evident that biological and psychological features were not just inborn but likely to be deeply influenced by context and environment. The dogma of a single form of intelligence was also challenged by scientific evidence.

Illustration by Vladimir Radunsky for On a Beam of Light: A Story of Albert Einstein by Jennifer Berne

Along came social scientists like Howard Gardner, whose germinal 1983 Theory of Multiple Intelligences set out to upend the tyranny of “selfish g” by demonstrating that human acumen exists along varied dimensions, subtler and more context-specific, not necessarily correlated with one another — those who score high on logical/mathematical intelligence, for instance, may not score high on bodily/kinesthetic intelligence, and vice versa. Mukherjee considers the layered implications for g and its active agents:

Is g heritable? In a certain sense, yes. In the 1950s, a series of reports suggested a strong genetic component. Of these, twin studies were the most definitive. When identical twins who had been reared together — i.e., with shared genes and shared environments — were tested in the early fifties, psychologists had found a striking degree of concordance in their IQs, with a correlation value of 0.86. In the late eighties, when identical twins who were separated at birth and reared separately were tested, the correlation fell to 0.74 — still a striking number.

But the heritability of a trait, no matter how strong, may be the result of multiple genes, each exerting a relatively minor effect. If that was the case, identical twins would show strong correlations in g, but parents and children would be far less concordant. IQ followed this pattern. The correlation between parents and children living together, for instance, fell to 0.42. With parents and children living apart, the correlation collapsed to 0.22. Whatever the IQ test was measuring, it was a heritable factor, but one also influenced by many genes and possibly strongly modified by environment — part nature and part nurture.

The most logical conclusion from these facts is that while some combination of genes and environments can strongly influence g, this combination will rarely be passed, intact, from parents to their children. Mendel’s laws virtually guarantee that the particular permutation of genes will scatter apart in every generation. And environmental interactions are so difficult to capture and predict that they cannot be reproduced over time. Intelligence, in short, is heritable (i.e., influenced by genes), but not easily inheritable (i.e., moved down intact from one generation to the next).
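A quick note on those numbers, since the argument turns on them: a correlation of 1.0 would mean two sets of scores rise and fall in perfect lockstep, while 0 would mean no relationship at all. The slide from 0.86 for identical twins reared together to 0.22 for parents and children living apart is, in effect, the nature-and-nurture entanglement compressed into two decimals.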

And yet the quest for the mythic holy grail of general intelligence persisted and took us down paths not only questionable but morally abhorrent by our present standards. In the 1980s, scientists conducted numerous studies demonstrating a discrepancy in IQ across the races, with white children scoring higher than their black peers. While the controversial results initially provided ample fodder for racists, they also gave scientists an incentive to do what scientists must — question the validity of their own methods. In a testament to trailblazing philosopher Susanne Langer’s assertion that the way we frame our questions shapes our answers, it soon became clear that these IQ tests weren’t measuring the mythic g but, rather, reflected the effects of contextual circumstances like poverty, illness, hunger, and educational opportunity. Mukherjee explains:

It is easy to demonstrate an analogous effect in a lab: If you raise two plant strains — one tall and one short — in undernourished circumstances, then both plants grow short regardless of intrinsic genetic drive. In contrast, when nutrients are no longer limiting, the tall plant grows to its full height. Whether genes or environment — nature or nurture — dominates in influence depends on context. When environments are constraining, they exert a disproportionate influence. When the constraints are removed, genes become ascendant.

[…]

If the history of medical genetics teaches us one lesson, it is to be wary of precisely such slips between biology and culture. Humans, we now know, are largely similar in genetic terms — but with enough variation within us to represent true diversity. Or, perhaps more accurately, we are culturally or biologically inclined to magnify variations, even if they are minor in the larger scheme of the genome. Tests that are explicitly designed to capture variances in abilities will likely capture variances in abilities — and these variations may well track along racial lines. But to call the score in such a test “intelligence,” especially when the score is uniquely sensitive to the configuration of the test, is to insult the very quality it sets out to measure.

Genes cannot tell us how to categorize or comprehend human diversity; environments can, cultures can, geographies can, histories can. Our language sputters in its attempt to capture this slip. When a genetic variation is statistically the most common, we call it normal — a word that implies not just superior statistical representation but qualitative or even moral superiority… When the variation is rare, it is termed a mutant — a word that implies not just statistical uncommonness, but qualitative inferiority, or even moral repugnance.

And so it goes, interposing linguistic discrimination on genetic variation, mixing biology and desire.

Illustration by Lisbeth Zwerger for a special edition of the fairy tales of the Brothers Grimm

Intelligence, it turns out, is as integrated and indivisible as what we call identity, which the great Lebanese-born French writer Amin Maalouf likened to an intricate pattern drawn on a tightly stretched drumhead. “Touch just one part of it, just one allegiance,” he wrote, “and the whole person will react, the whole drum will sound.” Indeed, it is to identity that Mukherjee points as an object of inquiry far apter than intelligence in understanding personhood. In a passage emblematic of the elegance with which he fuses science, cultural history, and lyrical prose, Mukherjee writes:

Like the English novel, or the face, say, the human genome can be lumped or split in a million different ways. But whether to split or lump, to categorize or synthesize, is a choice. When a distinct, heritable biological feature, such as a genetic illness (e.g., sickle-cell anemia), is the ascendant concern, then examining the genome to identify the locus of that feature makes absolute sense. The narrower the definition of the heritable feature or the trait, the more likely we will find a genetic locus for that trait, and the more likely that the trait will segregate within some human subpopulation (Ashkenazi Jews in the case of Tay-Sachs disease, or Afro-Caribbeans for sickle-cell anemia). There’s a reason that marathon running, for instance, is becoming a genetic sport: runners from Kenya and Ethiopia, a narrow eastern wedge of one continent, dominate the race not just because of talent and training, but also because the marathon is a narrowly defined test for a certain form of extreme fortitude. Genes that enable this fortitude (e.g., particular combinations of gene variants that produce distinct forms of anatomy, physiology, and metabolism) will be naturally selected.

Conversely, the more we widen the definition of a feature or trait (say, intelligence, or temperament), the less likely that the trait will correlate with single genes — and, by extension, with races, tribes, or subpopulations. Intelligence and temperament are not marathon races: there are no fixed criteria for success, no start or finish lines — and running sideways or backward might secure victory. The narrowness, or breadth, of the definition of a feature is, in fact, a question of identity — i.e., how we define, categorize, and understand humans (ourselves) in a cultural, social, and political sense. The crucial missing element in our blurred conversation on the definition of race, then, is a conversation on the definition of identity.

Complement this particular portion of the wholly fascinating The Gene with young Barack Obama on identity and the search for a coherent self and Mark Twain on intelligence vs. morality, then revisit Schopenhauer on what makes a genius.

BP

The Emperor of Time: A Dreamlike Short Film About Motion Picture Pioneer Eadweard Muybridge

A foundational story of modern culture, told from the point of view of an abandoned son and viewed through an antiquated device.

“Muybridge was a doorway, a pivot between that old world and ours, and to follow him is to follow the choices that got us here,” Rebecca Solnit wrote in her masterwork about how Eadweard Muybridge annihilated space and time by inventing motion pictures.

That weird and wondrous doorway is what filmmaker and animator Drew Christie (who previously animated my essay on wisdom in the age of information) explores in The Emperor of Time — a dreamlike short film, part fiction and part nonfiction, about Muybridge’s odd and momentous life, told from the point of view of his abandoned son and viewed entirely through a mutoscope, an early hand-cranked motion picture device invented by Herman Casler in the 1890s.

I’ve heard it said that in Ancient China, the emperor owned time — for all his subjects, their time was not their own but his. He could set the calendar to whatever he wanted, make hours or days as short or as long as he wanted.

My father was also an emperor of time. He was the first man who stared at time itself and said: “Stop.”

Complement with Solnit’s indispensable treatise on Muybridge and his cultural legacy, then revisit Christie’s lovely animated short film about Mark Twain and the illusion of originality.

BP

Don’t Heed the Haters: Albert Einstein’s Wonderful Letter of Support to Marie Curie in the Midst of Scandal

“If the rabble continues to occupy itself with you, then simply don’t read that hogwash, but rather leave it to the reptile for whom it has been fabricated.”


Few things are more disheartening to witness than the bile which small-spirited people of inferior talent often direct at those endowed with genius. And few things are more heartening to witness than the solidarity and support which kindred spirits of goodwill extend to those targeted by such loathsome attacks.

In 1903, Marie Curie (November 7, 1867–July 4, 1934) became the first woman to win the Nobel Prize. It was awarded jointly to her and her husband, Pierre, for their pioneering research on radioactivity. On April 19, 1906, she was widowed by an accident all the more tragic for its improbability. While crossing a busy Parisian street on a rainy night, Pierre slipped, fell under a horse-drawn cart, and was killed instantly. Curie grieved for years. In 1910, she found solace in Pierre’s protégé — a young physics professor named Paul Langevin, married to but separated from a woman who physically abused him. They became lovers. Enraged, Langevin’s wife hired someone to break into the apartment where the two met and steal their love letters, which she promptly leaked to the so-called press. The press eviscerated Curie and portrayed her as “a foreign Jewish homewrecker.”

Upon returning from a historic invitation-only science conference in Brussels, where she had met Albert Einstein (March 14, 1879–April 18, 1955), Curie found an angry mob in front of her home in Paris. She and her daughters were forced to stay with a family friend.

At the 1911 Solvay Conference. Curie leaning on table. Einstein second from right. Also in attendance: Max Planck, Henri Poincaré, and Ernest Rutherford.

Einstein considered Curie “an unpretentious honest person” with a “sparkling intelligence.” When he got news of the scandal, he was outraged by the tastelessness and cruelty of the press — the tabloids had stripped a private situation of all humanity and nuance, and brought it into the public realm with the deliberate intention of destroying Curie’s scientific reputation.

A master of beautiful consolatory letters and a champion of kindness as a central animating motive of life, Einstein wrote to Curie with wholehearted solidarity and support, encouraging her not to give any credence to the hateful commentaries in the press. The letter, found in Walter Isaacson’s terrific biography Einstein: His Life and Universe (public library), is a testament to the generosity of spirit that accompanied Einstein’s unparalleled intellect — a masterwork of what he himself termed “spiritual genius.”


Einstein, who would later remark that “Marie Curie is, of all celebrated beings, the only one whom fame has not corrupted,” writes:

Highly esteemed Mrs. Curie,

Do not laugh at me for writing you without having anything sensible to say. But I am so enraged by the base manner in which the public is presently daring to concern itself with you that I absolutely must give vent to this feeling. However, I am convinced that you consistently despise this rabble, whether it obsequiously lavishes respect on you or whether it attempts to satiate its lust for sensationalism! I am impelled to tell you how much I have come to admire your intellect, your drive, and your honesty, and that I consider myself lucky to have made your personal acquaintance in Brussels. Anyone who does not number among these reptiles is certainly happy, now as before, that we have such personages among us as you, and Langevin too, real people with whom one feels privileged to be in contact. If the rabble continues to occupy itself with you, then simply don’t read that hogwash, but rather leave it to the reptile for whom it has been fabricated.

With most amicable regards to you, Langevin, and Perrin, yours very truly,

A. Einstein

Shortly after the scandal, Curie received her second Nobel Prize — this time in chemistry, for her discovery of the elements radium and polonium. To this day the only person awarded a Nobel Prize in two different sciences, she endures as one of humanity’s most visionary and beloved minds. The journalists who showered her with bile are known to none and deplored by all.

Complement with Kierkegaard on why haters hate and Anne Lamott’s definitive manifesto for how to handle them, then revisit Mark Twain’s witty and wise letter of support to Helen Keller when she was wrongly accused of plagiarism and Frida Kahlo’s compassionate letter to Georgia O’Keeffe after the American painter was hospitalized with a nervous breakdown.

BP

