EVIDENCE

by William H. Benson

April 29, 2010

     An English word that has drifted out of common usage is the word “pseudodox,” meaning a false idea or untrue opinion, but perhaps it is time to bring the word back.

     Today, April 29, 2010, marks the anniversary of the passing of Dionysius Lardner, an Irish scholar who died on this date in 1859. Known for editing the massive 132-volume encyclopedia named the Cabinet Cyclopaedia, Lardner was dismissed later in his life as one who was less than fully committed to facts. Charles Dickens labeled him “that prince of humbugs,” and another denounced him as “an ignorant and impudent empiric.”

     Despite his lofty knowledge of scientific and technical matters at the time, Lardner had a difficulty: he would issue blanket declarations that others, through experimentation, later disproved. In other words, he proclaimed pseudodox.

      For example, he cautioned that trains must not be allowed to travel more than thirty miles per hour because they would asphyxiate passengers. Then, in 1835, he calculated and “proved that a steamship could never cross the Atlantic because it would need more coal than it could possibly carry without sinking.” There were others: all pseudodox.

     The question for today is “how do we know what to believe?” Or, “what is true?” Or, “how do we separate the true opinion from the false?” The historian would say that it all is derived from facts, those provable pieces of information supported by evidence.      

     This past week it was announced that two clerks in the Bisbee, Arizona courthouse were reorganizing files in a storage room when they stumbled upon a manila envelope marked “keep” that contained the original transcripts of the eyewitness accounts of the gunfight at the OK Corral.

     At 3:00 p.m. on October 26, 1881, at least five cowboys had ridden into the town of Tombstone and met up with the lawmen: the brothers Virgil, Morgan, and Wyatt Earp and also Doc Holliday. Three of the cowboys ended up dead that afternoon, and in the inquest conducted that same day, eyewitnesses gave their testimonies, which were transcribed onto pages of paper, the same pages found this week.

     The envelope’s contents were turned over to Arizona’s archivists, who had the duty of restoring the documents, which, after 129 years, are faded and “as brittle as potato chips.” Henceforth, they will remain on a shelf in the state’s archives.

     Occasionally, biographies magically appear, such as Clifford Irving’s of Howard Hughes, which Hughes himself in 1972 declared a forgery via a televised telephone conference. Then, in 1983, the German news magazine Stern paid $6 million in U.S. dollars to the journalist Gerd Heidemann for Adolf Hitler’s supposed diaries, conveniently found in an East German barn. A scholar later proved that they too were a hoax: the ink, only recently applied to the paper, was not manufactured before 1950.

     How do we—those who are privileged enough to be alive today—know that there was once a gunfight at the OK Corral in Tombstone; that there once lived a reclusive billionaire named Howard Hughes; and that there was once, which is more than enough, a tyrant, a murdering madman named Adolf Hitler?

     Again, the answer is historical evidence: written documents, such as texts, chronicles, genealogies, diaries, and memoirs; oral interviews; works of art, such as paintings, coins, portraits, and films; human remains; business, military, and government records; language; and artifacts, such as tools and pottery shards.

     Supported by the physical evidence, facts become stubborn things, for they repeatedly puncture holes in our opinions, in our prejudices, and in our cherished and most recently adopted political / religious / cultural package of beliefs.

     Robert J. Shafer, a historian, said it best: “Many statesmen have testified that the study of history prepared them for what to expect: from human greed, cruelty, and folly, and from nobility and courage and wisdom. It is clear that men who are ignorant of history are apt to make superficial judgments. . . .

     “[In our open society] we must expect unreasonable charges founded on ignorance, emotionalism, idiocy, vague fears, lust for attention, and hope of pecuniary and political profit.” In other words, we can and must anticipate a steady stream of pseudodox, and the best way to block it is by holding fast to the facts, all supported by genuine evidence.

CRITICISM

by William H. Benson

April 15, 2010

     Another prize-winning author once said of William Faulkner: “Well, he never faced any criticism,” a statement that I find odd, even inexplicable, for it seems to me that in America for at least the past three decades, everyone who has achieved a level of distinction or of leadership suffers the most bruising kind of criticism.

     Someone dares to suggest an idea, write a book, paint a picture, coach a ball team, record a song, film a movie, sponsor legislation, and coming fast upon his or her heels is the omnipresent critic, who jumps up to complain and to point out the fallacy of what that someone has done or offered to the world. And more often than not, that critic has never produced or done anything even remotely daring or worthwhile. If the critic ever applauds an achievement, it is negligible and infrequent.

     Alan Simpson, the former Senator from Wyoming, who was recently appointed to co-chair the deficit commission, said this of Americans today: “No one forgives anyone for anything anymore. People get angry just for disagreeing with them. . . . Look at what happens at the State of the Union address. There is a lot of whooping and jumping up and pointing. . . . I wonder sometimes what’s happened to simple tolerance.”

     The only realms of thought where criticism has achieved a heightened degree of respectability are books and film. Literary criticism has a lengthy and rich tradition, one in which the critic attempts to find the good and the beautiful in that “slush-pile” of manuscripts submitted by aspiring and would-be authors.

     Harold Bloom, the Yale scholar, said, “Literary criticism, as I have learned to practice it, relies finally upon an irreducibly aesthetic dimension in plays, poems, and narratives.” In a world of ugly, beauty is hard to find, and once found, it cannot be reduced.

     As for film, the best-known of the critics were Gene Siskel and Roger Ebert, who had their own television show, “At the Movies,” which began in the 1970’s. Each week Siskel and Ebert would play clips of the newly released movies, vote with either a thumb up or down, and then briefly argue about a particular movie’s merit. After they left the show, two other film critics eventually replaced them: Michael Phillips of the Chicago Tribune and A. O. Scott of the New York Times.

     Last week, the powers-that-be announced that they had decided to give a thumbs down to “At the Movies,” and ax it from their lineup at the end of this season, sometime in mid-August. A. O. Scott wrote in last Sunday’s New York Times: “I am sorry ‘At the Movies’ is over. I had a good time doing it and wish it could have kept going, but I have no scores to settle, no blame to assign, no might-have-beens to explore.”

     Some areas of society have institutionalized their criticism, and most deservedly: Congress, for example, where ideas are transformed into laws and transactions that can and do profoundly affect people’s lives in countless ways, often not for the better. Criticizing the administration in power is the American pastime, not discussing the weather.

     Other areas are relatively free of criticism, such as religion. People in America do not ordinarily badmouth in public a denomination other than their own—its founder, its packaged set of beliefs, its rituals.

     I know of only one religious critic: again, Harold Bloom, who wrote in The American Religion: “[R]eligious criticism must seek for the irreducibly spiritual dimension in religious matters or phenomena of any kind. . . . A nation obsessed with religion rather desperately needs a religious criticism, whether or not it is prepared to receive any commentary whatsoever on . . . a question as [to] the individual’s relation to group persuasions.” Frequently in his book, Bloom would ask, “Where is the appeal in such a religious imagination?”

     Criticism, for all of its faults, does prod people to listen, to talk, to read, and to think. A. O. Scott wrote: “And that kind of provocation, that spur to further discourse, is all criticism has ever been. . . . As such, it is always apt to be misunderstood, undervalued and at odds with itself. . . . The future of criticism is the same as it ever was. Miserable and full of possibility. The world is always falling down. The news is always very sad. The time is always late.”

     Let’s hear from the critic.

NORMAL ACCIDENTS

by William H. Benson

April 1, 2010

     Disasters surround us. They inundate our news-saturated minds, pierce our sense of safety, and disturb our sleep. That we live in a world filled with awe-inspiring technologies makes our lives vastly more comfortable and entertaining, but at the cost of heightened and frightening risks. Certain “technological systems” are prone to error and subject to a possible accident and a catastrophe that can and will kill the innocent, despite the presence of sufficient back-up plans and alternate routes.

     Most people fully recognize the danger inherent in certain systems, such as in aircraft and airways, nuclear power plants, petrochemical plants, space travel, bridges, dams, supertankers, nuclear weapons, and biotechnology. Yet, we feel helpless to do anything; we rely upon the technology, but we are powerless to control any phase of it.

     For example, late last week a South Korean naval ship, for an unknown reason, exploded, splitting the vessel in two and drowning nearly half of the 104 seamen. Truly tragic it was, but not that uncommon on the high seas, where marine disasters have plagued men and women throughout humanity’s history.

     Homer said it best, that “humans tempt the gods when they plow these green and undulating fields,” called the seas.

     Also, out on that open sea in an ocean lane, “non-collision course collisions” actually do occur. “One would not think that ships could pile up as if they were on the Long Island Expressway, but they do.”

     There was the Titanic disaster on the night of April 14-15, 1912, when 1503 people perished “partly as a result of an overconfident captain imperturbably sailing into a field of icebergs at night, thinking he had an unsinkable ship.” The iceberg that the ship hit sliced open five watertight compartments: “[T]he designers had assumed that no more than three could ever be damaged at once.” And they were wrong.

     The worst commercial nuclear accident in the U.S. occurred on March 28, 1979, 31 years ago, at Three Mile Island, near Middletown, Pennsylvania, when operators made a crucial error at the same time that the equipment failed—a valve was closed when it should have been opened. Despite Jane Fonda’s frightening portrayal in the movie The China Syndrome of what may have happened had Three Mile Island exploded, no lives were lost, fortunately.

     History’s worst nuclear accident, which resulted in an explosion and a fire, happened on April 26, 1986, at Chernobyl, near Kiev in the USSR, now Ukraine; at least 31 people were immediately killed, and radioactive winds swept across Europe.

     But the worst industrial accident in terms of lives lost—3,849—was the toxic gas leak at a pesticide plant in Bhopal, India, on December 3, 1984.

     The truly terrifying, and perhaps the next big accident, will be that of the supertanker loaded with chemicals. “A fair bit of the world’s sulfuric acid, vinylidene chloride, acetaldehyde, and trichloroethylene, etc. move by tankers that are unregulated, and often old, in poor condition, and traveling through winter storms.”

     Most experts attribute the cause of explosions and accidents to “operator error” or to “engineering and design flaws,” and that is mainly true. However, one writer named Charles Perrow has looked deeper into these accidents, and in his book Normal Accidents, Perrow identified at least two common factors among them: 1) the tight coupling and 2) the sheer complexity of the pieces that have evolved into a system.

     Perrow defines a “normal or a system accident” as that where “given the system characteristics, multiple and unexpected interactions of failures are inevitable.” This definition goes well beyond simple pilot error or an engineer seated at a drawing-table miscalculating a length of a line on a blueprint.

     The system itself can create the accident with its hidden interconnectedness, its fair chance of multiple failures, and its utterly incomprehensible design, for no one solitary person can know all the moving pieces and how they will interact when put into service.

     Another thinker, Dr. Andrew W. Lo, has added to Perrow’s two conditions a third: that of “the absence of normal feedback over an extended period of time.” When incidents rarely if ever become accidents, the notion spreads that the system has become less risky. When people with Type A personalities think that they are insulated from risk, they will take more risks. Accidents will happen.

     This is the world we live in; one that is fraught with real dangers due to our technological successes, in spite of our innocence and our noble intentions.  

EDUCATION REFORM

by William H. Benson

March 18, 2010

     The numbers are discouraging. In the first decade of the new millennium, only 68% of freshmen could expect to graduate from high school four years later: 72% of the girls and only 64.1% of the boys. As of 2003-2004, Nebraska ranked number one in its graduation rate, 87.6%, and Nevada ranked 51st, 57.4%, lower than Washington D.C.

     And if a student happens by luck or pluck to graduate from high school, he or she faces an uphill battle at a college where graduation rates are even more abysmal: only 56% of the incoming white freshman students will graduate six years later, versus 43% of the Hispanic students and 40% of the African-American students. Because so many bright students lack the funds and the emotional support, in absolute discouragement, they quit.

     These numbers underline U. S. education’s dismal state. Kate Walsh, the president of the National Council on Teacher Quality, said, “there was such a dramatic achievement gap in the United States, far larger than in other countries, between socioeconomic classes and races. It was a scandal of monumental proportions, that there were two distinct school systems in the U.S., one for the middle class and one for the poor.”

     Last week, it was announced that twenty-nine of Kansas City’s sixty-one schools will be closed by next fall in order to stave off the district’s bankruptcy.

     On February 23, in Rhode Island, Superintendent Frances Gallo, who oversees Central Falls High School, fired all 74 of the school’s teachers. She had asked them “to work 25 minutes longer each day, eat lunch with the students once a week, and agree to be evaluated by a third party.” Because the teachers refused her requests—unless they were paid additional money, at a rate of $90 an hour—she fired them all, and received the commendation of both the Secretary of Education and President Obama.

     Sometimes, not always but sometimes, it helps to trace our way back to a difficulty’s origin to see how we arrived at where we are. In 1647, the General Court of Massachusetts provided for the establishment of reading schools because it was “one chief project of that old deluder, Satan, to keep men from the knowledge of the Scriptures.” Deferring the issue of Puritan theology until later, the obvious point is that the colonial government decided then to pay for the business of educating children.

     Nothing of the same was decided for the colony’s other businesses—farming, husbandry, the retail trade, construction, or, most importantly, health care. Other than midwifery and prayer, there was little health care in early New England, but it must be underscored that the doctors were not taxpayer-supported then and would not enjoy such a privilege until the 1960’s with Lyndon Johnson’s Great Society.

     But what if those early Puritan legislators had elected to earmark taxpayer’s funds for their citizens’ health care, and not for education?

     The teachers would work as the doctors do today. Students would have to schedule appointments with a teacher weeks in advance for a fifteen-minute office visit and a scrawled prescription to buy a book. Self-study would be the rule, but if a student demonstrated a skill or an aptitude, the doctor would order further testing and a stay in a dormitory.

     Students would pay for each office visit, and parents would pay for education insurance in case of a child’s catastrophic desire for more education. Nurses would oversee the student’s progress. The teachers would work hard all day—ten to fifteen hours or more—but then receive an excellent income.

     And the doctors would work as the teachers do now. All the sick would gather every morning in a hospital room where the doctor, with chalk and lesson plans, would teach about good health and then give out an assignment. After an hour, the sick would be shuffled off to the next room for an additional lecture and assignment. Doctors would dispense medicines freely in every classroom, and all would take the same prescription.

     The doctors would face constant discipline problems because the sick would not want to sit at a desk all day, but for all their efforts, the doctors would receive a far lower income than now yet work fewer hours, perhaps, with weekends and summers off.

     Far-fetched though these scenarios may be, something like them might have evolved if health care had been publicly supported initially and education had been kept private.

     The question of the day is, “Should the government now expand its control of the health care business?” The question is a profound and polarizing one, and the wisest of the thinkers are struggling mightily for the correct answer. People’s futures, human talent, and achievement are all at risk, whichever direction they choose.

ON CRIME & PUNISHMENT

by William H. Benson

March 5, 2010

     In Boston, on March 5, 1770, on a cold, moonlit evening, when a foot of snow lay on the ground, a gang of several hundred hostile Americans approached eight British soldiers, the hated Redcoats, sometimes called lobster-backs, and began to taunt them. The crowd then turned ugly and began shouting at the soldiers, cursing them, and throwing snowballs, chunks of ice, oyster shells, stones, sticks, anything they could pick up and throw. The cornered, terrified soldiers opened fire and killed five American men.

     The Boston Massacre, as it has since been called, was the opening salvo in what became the American Revolution. John Adams, a young attorney from nearby Braintree, Massachusetts, agreed to defend those eight British soldiers. Although he would later serve as a leader in America’s revolt against the British crown, Adams believed that it was his duty then to defend those soldiers from what appeared a most certain hanging.

     In preparation for the trial, Adams read the eighteenth-century Italian thinker Cesare, Marchese di Beccaria, who had written an early work on the punishment of crime, titled On Crimes and Punishments. Beccaria had argued against the death penalty and torture, and had instead advocated major reform of the criminal legal system, insisting that criminal justice should conform to rational forms of punishment.

     He presented certain enlightened principles: that punishment should deter further crimes rather than serve as a form of retribution; that it should be proportionate to the crime committed; that the certainty and promptness of the punishment, rather than its severity, would achieve the preventive effect; and that the proceedings should be made public.

     Beccaria’s ideas were revolutionary in a day when capital punishment by hanging or firing squads was the rule.

     Standing before the jury, Adams argued that the British soldiers had acted in their own self-defense when they had fired their guns into the crazed mob. “Do you expect he should behave like a stoic philosopher, lost in apathy?” Adams asked. “Facts are stubborn things, and whatever may be our wishes, our inclinations, or the dictates of our passions, they cannot alter the state of facts and evidence.”

     The jury acquitted six of the soldiers, and the other two, the jury determined, were guilty of manslaughter. But instead of being walked to the gallows, the two received a branding on their thumbs. For the rest of Adams’s life, he was pleased with his defense of those soldiers, calling it “one of the most gallant, generous, manly, and disinterested actions of my whole life, and one of the best pieces of service I ever rendered my country.”

      When he wrote the Bill of Rights, James Madison included Amendment VIII: “Excessive bail shall not be required, nor excessive fines imposed, nor cruel and unusual punishments inflicted.” That amendment spared any U.S. citizen from ever facing the more sadistic forms of torture: the rack, head screws, the use of boiling water, mutilations by cutting off tongues and ears and hands, and the locking of people’s feet and hands in stocks.

     Thirty years ago, a notorious white-collar criminal and con artist named Frank W. Abagnale, Jr., wrote a book about his life of writing phony checks all over the world. When finally caught, he was living in France; he was tried and sentenced to six months in the Perpignan prison. He was thrown naked into a darkened stone hole five feet wide and tall, with only a bucket that was emptied whenever the guards wanted to.

     No letters, no visitors, no communication was allowed, no haircuts, no chance to shave or cut his fingernails, no light, and nothing to read. He slept on the stone floor. Food was scant. Insects and bacteria took up residence on his skin, and soon scabs ulcerated his body. The stench from the filth was suffocating.

     Six months later, the Swedes had their turn at Abagnale. He was tried and sentenced to six months in a Swedish prison, and was told, “You will find it very different from the prisons in France. In fact it is very different from any of your American prisons.” Frank was permitted to live in a dormitory at Lund University, where he could take classes or work at a job. He could wear his own clothes, in what he learned was a coed prison. He realized that “Swedish prisons actually attempt to rehabilitate a criminal.”

 

     Which system really works to deter the criminal—the barbaric French dungeon, the soft Swedish dorm room, or the cramped American cell with steel doors and bars? Or is branding a thumb a better option? The answer does not instantly appear, because governments and judges have learned that matching the punishment to the crime, and thus preventing other crimes, is an art not quickly mastered.

HUCKLEBERRY FINN

by William H. Benson

February 18, 2010

     “You don’t know about me without you have read a book by the name of The Adventures of Tom Sawyer; but that ain’t no matter. That book was made by Mr. Mark Twain, and he told the truth, mainly. There was things which he stretched, but mainly he told the truth.”

     And with that brief introduction, Mark Twain began his story, The Adventures of Huckleberry Finn, publishing it in 1885, on this date, February 18. No one knew then that this story about Huck Finn, a fourteen-year-old boy, and Jim, a runaway slave, floating on a raft down the Mississippi River, would earn its esteemed position among America’s literary works.

     Ernest Hemingway said, “All modern American literature comes from one book by Mark Twain called Huckleberry Finn. . . . It’s the best book we’ve had.” And Harold Bloom, the literary critic, said that the first seven paragraphs of the book’s 19th chapter contain “the most beautifully written descriptive passage in all of American literature.”

     You may very well ask, “Why read and then reread this classic? or any classic? Why read literature at all? What is the benefit?” Mark Twain, invariably the humorist, defined the classics as “those books everybody says you should read but nobody does.”

     Students, especially the English majors, read them. Almost twenty years ago, the film critic David Denby decided that he would sit again in a student’s desk to read and discuss the classics. “In the fall of 1991, thirty years after entering Columbia University for the first time, I went back to school and sat with eighteen-year-olds and read the same books that they read. Not just any books. Together we read Homer, Plato, Sophocles, Augustine, Kant, Hegel, Marx, and Virginia Woolf. Those books. Those courses.”

     Except for the David Denbys, deep reading of the classics is a talent few wish to master, and so it remains underappreciated, dismissed as a useless waste of our time. Unlike basketball stars, the most expansive readers are not presented with championship trophies, not even one who can boast, as H. L. Mencken did, that in his teens he “read like an athlete.”

     Listening to the television is the skill most resorted to today; people listen to the sports newscaster, to the talking heads in a news studio, and to the empty heads who act and speak in some simple-minded television comedy or mystery. Conversely, to read is to tune in, listen, and catch the dazzling and daring ideas of the best thinkers from the past. 

     Yes, some of the great books, such as Hegel’s works, are exceedingly dreary, unintelligible, unfulfilling, and boring. Yet, only by reading them again and again can a student gain that critical eye, discerning the good from the bad, the intelligent from the stupid, and the correct from the flawed. Nietzsche warned us to ask always, “Who is the interpreter and what power does he seek to gain over the text?”

     Sometimes the writer’s research leads him or her into areas that we, the readers, do not want to go, and so the texts swamp our very minds. A Newsweek writer wrote this past week, “Some works are overwhelming. We see authors struggling to create masterpieces that on almost every page threaten to collapse under their own weight. . . . But who wouldn’t rather read a flawed masterpiece than none at all. After all, not even Twain could figure out the right way to end Huckleberry Finn.”

     We read every day in the newspapers sorry tales of people who committed themselves to the most foolish choices that left them so vulnerable, so easily persuaded and intimidated, and so misled and gullible. They failed to ask beforehand Nietzsche’s question, “Who is the interpreter and what power does he seek over the text?”

     Certain texts will never go away. Harold Bloom said that “Freud, genius that he was, failed to apprehend the permanent power of texts that cannot vanish: the Tanakh, the New Testament, and the Qur’an. If asked the desert island question, I would have to take Shakespeare, but the world continues drowning in the blood-dimmed tide of its Scriptures, whether it reads them or not.”

     He also argued that “reading well is one of the great pleasures that solitude can afford you, because it is, at least in my experience, the most healing of pleasures.”

     The next Great American Novel, that work which will eclipse Huckleberry Finn, is just ahead of us, perhaps around the corner, but few will recognize it when it appears, for genius is transcendent, beyond the limits of our thoughts, or as William Blake said, “The ages are all equal, but genius is always above its age.” Read well from those authors, such as Mark Twain, who dazzle us, our great cloud of witnesses to the human condition.