May 11th, 2013


To Divine Is Human

 

Nancy Mattina, Prescott College

 

Abstract: The human capacity for scientific thinking is an innate one that coexists with our ability to intuit, believe, and invent. In crafting engaging narratives that urge our readers or students to think and act rationally on behalf of our imperiled biosphere, writers who are not scientists should take care not to sustain negative stereotypes of science and scientists in their commentary, even if some of our greatest storytellers have done so.

Keywords: science, scientists, trial and error, human error, environmental literature, popular science writing

 

Despite all the ardent prose glowing from the electronic gadgets that surround me, I still find myself browsing my undusted shelves for something to read. I rarely buy bound books anymore, which is why my collection of mostly paperback editions reflects the quirky canon I came of age on: Henry Miller, Kazantzakis, Joyce Carol Oates, James, Zola, Gordimer, Bellow, Steinbeck, Austen, Heinlein, Flaubert, Dostoyevsky, and the like. These decorated sentinels have long lined my walls, the listing pillars of my literary crèche, the ones who expected me to think about the world as it was and might be. I don’t sell them off even though the stories they tell have since ascended spotlessly to the digital cloud.

 

Truth be told, in my late twenties I stopped reading novels. Fiction seemed to have betrayed me. Trying to live the examined life through imagined others left me churlish and awkward however adroitly I parsed the lives of each flawed protagonist. Or maybe deconstructionism, for a spell the opium of the literate, was the spoiler. Either way, although literature had been my undergraduate religion, I began to lean toward scientific explanation, like a fatted calf bunting for mother’s milk. An artfully told yarn continues to please. But recognizing that my species threatens the existence of all others has reshaped my reading priorities as well as my judgment.

 

For example, the other day I ran a bitten finger down the bright orange spine of a Penguin paperback containing D.H. Lawrence’s essays, Etruscan Places. Lawrence wrote them not long before he died, in the spring of 1927, the same season in which he finished writing Lady Chatterley’s Lover, the novel that Anaïs Nin would defend as “artistically…his best novel…our only complete modern love story.” Lawrence’s reputation has since run the gamut of visionary, pornographer, radical, and chauvinist on the loop of popular opinion, but most would grant that he sought to be a truth-teller; his famous dictum “Never trust the artist. Trust the tale” is a case in point.

 

As a social critic, Lawrence analyzed the human condition with tools we call biases now. In his non-fiction he never shied from announcing his conclusions in a studied, sometimes antic voice. He begins “Cerveteri”, the first Etruscan essay, with a tone as wry as any blogger’s.

 

“The Etruscans, as everyone knows, were the people who occupied the middle of Italy in early Roman days, and whom the Romans, in their usual neighborly fashion, wiped out entirely in order to make room for Rome with a very big R. They couldn’t have wiped them all out, there were too many of them. But they did wipe out the Etruscan existence as a nation and a people. However, this seems to be the inevitable result of expansion with a big E, which is the sole raison d’être of people like the Romans.”

 

This opening volley against empire rolls straight across the weedy hush of Mussolini’s back forty to rest at Lawrence’s favorite angle of critical repose—our preference for Enlightenment pieties like rationalism and progress as substitutes for passionate connection with our sacred selves, what William Blake dubbed “blood-consciousness.” We’re quite wrong to overestimate the power of reason and science, says Lawrence.

 

“The science of augury certainly was not exact science. But it was as exact as our sciences of psychology or political economy. And the augurs were as clever as our politicians, who also must practice divination, if ever they are to do anything worth the name. There is no other way when you are dealing with life. And if you live by the cosmos, you look in the cosmos for your clue. If you live by a personal god, you pray to him. If you are rational, you think things over. But it all amounts to the same thing in the end. Prayer, or thought, or studying the stars, or watching the flight of birds, or studying the entrails of the sacrifice, it is all the same process ultimately: of divination. All it depends on is the amount of true, sincere, religious concentration you can bring to bear on your object.”

 

If this seems a verbal hug of a kind rarely gleaned from today’s commentariat, then readers familiar with Lawrence’s essentialism will suspect this isn’t the end of his homily. And it isn’t.

 

“Whatever object will bring the consciousness into a state of pure attention, in a time of perplexity, will also give back an answer to the perplexity,” he continues. “But it is truly a question of divination,” he insists.

 

“As soon as there is any pretence of infallibility, and pure scientific calculation, the whole thing becomes a fraud and a jugglery. But the same is true not only of augury and astrology, but also of prayer and of pure reason, and even of the discoveries of the great laws and principles of science. Every great discovery or decision comes by an act of divination. Facts are fitted round afterwards.”

 

On rereading these paragraphs, I feel the old arguments rise like tumuli from the Etrurian heartland: Are mystical and scientific pursuits one and the same? Is science just another form of human make-believe?

 

Scientists as soothsayers, lab-coated haruspices, groping animal entrails and reciting received ideas. No, that literary invention will not fly. Any bench scientist will tell you that fitting facts round your intuitions or convictions rather than enacting a measured, public, reproducible, controlled experiment will get you fired, reviled, or, worst of all, ignored. And Lawrence is equally mistaken about science being a pretender to infallibility. Scientific seeking is predicated on our capacity for error, not a claim to infallibility or its doppelgänger, perfectibility.

“Mistakes are at the very base of human thought, embedded there, feeding the structure like root nodules. If we were not provided with the knack of being wrong, we could never get anything useful done.” So writes Lewis Thomas, physician, immunologist, poet, and columnist for The New England Journal of Medicine, in his essay titled “To Err is Human.”

“We think our way along by choosing between right and wrong alternatives, and the wrong choices have to be made as frequently as the right ones. We get along in life this way. We are built to make mistakes, coded for error.”

Far from presuming human perfectibility—a conceit that Lawrence roasts in a raucous essay on Benjamin Franklin in Studies in Classic American Literature—science as a way of knowing disrupts the ego, calling into question our every perception. Thomas again:

“If we had only a single center in our brains, capable of responding only when a correct decision was to be made, instead of the jumble of different credulous, easily conned clusters of neurons that provide for being flung off into blind alleys…we could only stay the way we are today, stuck fast.”

Trial with error “open[s] the way” toward truth marbled with new errors.

There’s an emotional upside to consciously embracing our capacity for error and gullibility. Thomas calls it being “at our human finest, dancing with our minds.” But there’s also a tremendous evolutionary advantage conferred by it. “What is needed, for progress to be made,” writes Thomas, “is the move based on the error.” Only the scientific method generates an infinite number of questions, falsifiable claims, ambiguous evidence, and troves of fact. We use these to constantly redraw the topography of human ignorance on behalf of our species. Two steps forward, one back—then off in a new direction. The answers offered by augury and religion are designed to resist entropy, arising as they do in a closed system. By contrast, science thrives on accidents, revisions, and change. Doing science is adaptive behavior that may account for our success as a species more than any other human capacity we’ve exploited, after language. Like art, science draws us into a poignant tango with a carnal universe.

People who don’t think of themselves as scientists may become impatient with the pace of science because they demand answers—too often billed as sensational discoveries by lone diviners—as the only excuse for science. And helpful answers do ensue. Lawrence’s short life transpired between a brief published note in 1875 on the antibacterial effects of household fungi and the first mass production of penicillin in 1944, in time to save thousands of souls off the beaches of Normandy. We can’t know if his view of scientific calculation might have changed had he witnessed how hundreds of prepared people formed a community that would learn to convert a series of mishaps, false starts, and a perfectly moldy cantaloupe into a painless cure for infections once as deadly as the tuberculosis that killed him. This cultural adaptation to the daily threat of infectious diseases exploited our innate capacity for cooperation and altruism, two necessary (if not sufficient) human talents spiritualists praise but often fail to evoke.

And let’s not forget that science, despite its reputation as an elite calling, is essentially egalitarian. “Science belongs to everybody,” naturalist and writer E.O. Wilson explains. “Its constituent parts can be challenged by anybody in the world who has sufficient information to do so.” There is no scientific truth until it has been patiently sought, tested, and reproduced by a community of skeptics keen to find the error in a fellow scientist’s work.

True, much wrongdoing is attributed to science by our writers, including the industrialized warfare that Lawrence sought to counter with the trope of divination. Weaponized nuclear fission, thalidomide, Agent Orange, Roundup®, armed drones, and toxically engineered crops stand as symbols for the grave moral errors of our times. Instead of insisting at every opportunity that the crimes against the biosphere we read of daily stem from institutionalized greed, malice, intolerance, or pride, contemporary social critics tend to demonize science itself rather than the systems of counterfactual belief and power we humans deploy to do harm. Meanwhile, to critique religion and the occult is blasphemy in the mouths of mainstream media figures. It is even fashionable to repeat that science and technology threaten us, quoting narratives queasily parallel to those describing nature as our primeval foe. In the tabloids, scientific debate is remade as gossip; scientists earn fame as conspirators. Yet the broad sweep of chronicled time shows us that the scientific truths we have stumbled upon are not “a fraud and a jugglery” by clever initiates. Be they handy or horrible, scientific truths juggle us. They force us to decide between doing right and serving a few. Our collective struggle to be moral finds an easy scapegoat in science dismally applied.

Advances in allopathic medicine garner popular approval, but many other fields of science improve us as well. As cognitive scientist Steven Pinker points out, “The X-ray vision of the molecular geneticist reveals the unity of our species,” in spite of our perception that skin color divides us. With a single book, Silent Spring, marine biologist Rachel Carson revolutionized popular notions of water and air, substances we thought we knew from long familiarity. Her science-telling showed clearly that one group’s conveniences spelled doom for another. Because polar caps don’t calve in most people’s backyards and declining birth weights in endangered species easily elude our notice, we need scientists across the globe to patiently observe without help from the supernatural, gathering and charting the data so we can see what we didn’t suspect was true.

 

Science-telling, however, is only as good as its readers. More Americans accept as fact the existence of angels, the efficacy of prayer, or the predictive power of the zodiac than they do the science of Darwinian evolution or anthropogenic climate change. We still teach students of all ages to read, write, and critique texts chiefly through the study of literature, poetry, and scripture, even though (or because?) we have reasons to suspect that more than two-thirds of literate adult Americans cannot understand the science section of The New York Times. Around the college seminar table, I’ve seen humanism taught as the opposite of science (holism good, dualism bad). Words like objectivity and critical analysis are tainted with the odor of heartlessness when not fringed with air quotes. Within earshot, the word technology, shorthand for all things digital, is too often pronounced as if spitting out a fallen eyelash.

 

But inventing and (mis)applying technology through trial and error is what we humans have always done to deal with life. I often wonder how many writers sit down to pour their convictions onto electronic pages without recognizing that writing itself is a technology, a human invention and not a biological imperative like language. That the invention of writing more than five thousand years ago gradually altered the behavior of Homo sapiens as profoundly as the microprocessor has in the last fifty years. That writing, originating in the counting of sheep and jars of oil, was born the servant of numeracy. It took thousands of years for scratches in stone or bone to name a personal belief. By the time the early Romans were dematerializing the Etruscans, writing was already fulfilling what the ethnologist Claude Lévi-Strauss declared its primary function: to facilitate slavery. Ask any tribal historian which of her people’s inalienable rights were extinguished first at the end of a pen, and only later by the six-shooter.

 

To balance my students’ natural appetite for human-centered drama, I assign Thomas’ essay “The World’s Biggest Membrane” in my first-year college writing course. Few of the students in the class are science majors, but most have an interest in Planet Earth strong enough to attract them to the little eco-minded college where I teach. Some admit they aren’t sure they get Thomas’ essay. Photolysis and chloroplasts appear in it. Despite the essay’s surprising buoyancy, which many of them remark on, it doesn’t occur to them to read it repeatedly and look up the new words that would help unlock its meaning. They are content to look for the plot and, finding none, send me quizzical looks.

 

My wish is that by the end of the semester they will start to recover their scientific natures from the sediments laid down by well-intentioned mentors, many of whom were themselves suspicious of science and scientists. Perhaps our reading scientific writers together without prejudice will prevent a few students from remaining self-centered diarists as they write their way through college. Maybe more than a few of them will awaken to the fact that humankind’s creation story is written in the ancient rocks and protoplasm all around us rather than on sacred hides, impossible to revise. My writing course is not the right place to confide in them, let alone to reaffirm their allegiance to story for its own sake at all costs. That’s why I don’t tell them that for this reader, who remembers leaning time and again into the sweet breath of my sleeping newborns, Thomas’ account of the Earth’s atmosphere evokes emotions as elemental as sexual love.

 

Beyond the classroom, crediting a scientific mindset has become a kind of ethnic marker defining a minor, under-articulated constituency. During the run-up to the last Presidential election I watched Rachel Maddow, the sure-footed political analyst and author, being interviewed on a popular late night talk show. Why, she was asked, are the candidates silent on the issue of climate change this political season? Maddow skipped the ritual excuses and offered a non-scientific opinion. “I think we need to stop thinking of science as the enemy,” she began.

 

Amen to that. We consume media in a world loud with vociferous science-deniers who would delight in Lawrence’s mischaracterization of scientists as nothing more than a gang of sanctimonious diviners with blood on their hands. Our actions over eons show us that each of us has a scientific nature alive in a brain that craves opportunities to be surprised as much as it loves certainty. We writers and educators who use scientific evidence to implore our listeners to act sustainably need to work out our own quarrels with science and come clean. We should refuse to force science-telling into roman à clef tales that displace the worth, not to mention the deep humanism, of scientific thinking. If there must be demonizing, expose the corporations, the nation-states, the churches, and the tribes that crush the fruits of science for their own violent toasts. Better to defend basic and blue-sky research not as plot devices for a remake of “Dr. Strangelove” but as the real deal, like conservancies for the mind, a sublime expression of our first scientist, African Eve. That’s how our audiences will surmise that long-term human survival is less about the animals we slaughter than about our willingness to sacrifice the sacred cows in the stories we write for each other. Of this I feel sure Lawrence would have approved.

 

 

 
