STEPHEN KING & JFK

by William H. Benson

December 1, 2011

     Stephen King is not one of my favorite authors, not even close. I have yet to finish anything he has written, mainly because he restricts himself to two genres that fail to satisfy me—horror and science-fiction. And yet I bought his new book, 11/22/63, after reading a positive review in The New York Times Book Review. It is a work of fiction, but historical, with a pleasant-feeling romance and a measure of science-fiction mixed in. A promising setup, even exciting.

     The book’s main character, Jake Epping, is divorced and teaching English composition in a small school in Maine, when Al, the proprietor of the local diner, explains privately to Jake that there is a spot in the back of his diner where, if he walks into that spot, he can travel back to a single point in time: “11:58 A.M. on the morning of September ninth, 1958. Every trip is the first trip.” Whenever Al returns from this hole into the past, however long he has stayed there, he has only been gone two minutes.

     Because Al is dying of lung cancer from too many cigarettes, he asks Jake to go back to 1958, stay there for five years, and stop Lee Harvey Oswald from assassinating President Kennedy. “You can change history, Jake,” Al tells him. “Do you understand that? John Kennedy can live.”

     Jake agrees, gets himself to Dallas, locates Lee and Marina Oswald, rents the room underneath the Oswalds’ room at 214 West Neely Street, spies on them, falls in love with a Texas beauty named Sadie Dunhill, and struggles with all his might to stop Oswald. That is the setup.

     In 11/22/63 Stephen King splices together two shop-worn subjects: time travel and Kennedy’s assassination. Time travel, as a literary topic, began with H. G. Wells’ The Time Machine and Edward Bellamy’s Looking Backward. It was advanced, according to King in his “Afterword,” by Jack Finney in his wonderful story Time and Again. Hollywood, too, loves a good time-travel story, whether backward or forward. Consider Frank Capra’s It’s a Wonderful Life or Warren Beatty’s Heaven Can Wait.

     Kennedy’s assassination is a nest of ideas that can be neatly divided into two camps: those who subscribe to a conspiracy theory and those who point only to Lee Harvey Oswald. A host of books describe in vivid detail the trajectory of the third bullet, the shooter on the grassy knoll, and the claims that the FBI, the CIA, the State Department, the Soviet Union, or Fidel Castro was working in conjunction with Oswald.

     One could lean toward a conspiracy after considering what Oswald did in his twenty-four years: he joined the Marines, but then defected to Russia; married Marina Prusakova, a Russian girl with whom he had two daughters; returned to the U.S., but took a bus to Mexico City, where he appealed for a visa to Cuba; and handed out pro-Marxist pamphlets on New Orleans’s streets, urging Fair Play for Cuba.

     Then, there is The Warren Commission Report, all twenty-six volumes of it, plus William Manchester’s The Death of a President, Gerald Posner’s Case Closed, and Norman Mailer’s Oswald’s Tale, and each concluded that Lee Harvey Oswald acted alone. They consider it coincidental that on October 20, 1963, he found a job at Dallas’ Texas School Book Depository, which just happened to be on the route of Kennedy’s motorcade, and that the previous March he had purchased a Carcano rifle by mail order.

     Norman Mailer began his book convinced that there was a conspiracy, but after thoroughly examining the evidence, he wisely concluded that Oswald acted alone. Mailer said, “It is virtually not assimilable to our reason that a small lonely man felled a giant in the midst of his limousines, his legions, his throng, and his security.” Yet he did, and Stephen King agrees.

     “It is very, very difficult,” he wrote, “for a reasonable person to believe otherwise. Occam’s Razor—the simplest explanation is usually the right one.” The assassination is “the same simple American story: here was a dangerous little fame-junkie who found himself in just the right place to get lucky.”

     Simple it may be, but complexity is what the human mind craves and automatically resorts to when seeking better explanations. Oswald’s last words were, “I’m a patsy,” words that have sent countless people on a frenzied pursuit to name those who they believe pulled Oswald’s strings.

     The wound our nation suffered that Friday, November 22, 1963, on a street in Dallas, Texas, has never completely healed. I still remember the day. I was ten years old, in the fourth grade. Our teacher told us the sad news after our lunch break: our young President was most certainly dead; someone had shot him while he was riding in a car. A mean thing to do, I thought then and still do.

     On a note of horror, Stephen King’s supreme area of expertise, he ends his book, and so he gives the reader his opinion on gun control, on the necessity of keeping guns out of the zealots’ hands, and on the danger of holding too tight to crazed opinions: “If you want to know what political extremism can lead to, look at the Zapruder film. Take particular note of frame 313, where Kennedy’s head explodes.”

JUMPING THE SHARK

by William H. Benson

November 24, 2011

     The expression “jumping the shark” refers to that Happy Days episode that aired on September 20, 1977, when Fonzie, on water-skis and dressed in trunks and his trademark leather jacket, jumped over a penned-up shark. All television series seem to run through a cycle: they begin with a set of fresh ideas, imaginative characters, and entertaining plots, but over time, as the writers lose their edge, they begin to grasp at silly plots that fail to make any sense, resorting to “jumping the shark” scenes.

     The idiom has subsequently broadened beyond television and now refers to “any enterprise whose best ideas are behind it,” to that “moment when brand, design, or creative effort moves beyond the essential qualities that initially defined its success, beyond relevance.” In a word, to the “absurd.”

     An audience, or a group of voters, citizens, readers, or members belonging to any organization can reach that moment of “jumping the shark,” when the absurd stands up and stares at all those fools who are watching. People suddenly feel exhausted by the trite, disappointed in the poor quality, bored with the commonplace, and annoyed by the hackneyed, the banal, and the routine. They are jaded.

     Speaking of Sports commentator Howard Cosell arrived at that point in 1983, after thirteen years on Monday Night Football. If I remember the quote correctly, he announced that professional football had become for him “an incredible bore.” As for college sports, we witnessed last week an absurd, even tragic, moment when the news broke from Penn State. Buzz Bissinger, writing in Newsweek, stated it best: “Too many universities are sports factories posing as academic institutions.”

     By their very nature, political campaigns bore even the most excitable people. The Iowa Caucuses are set for January 12, 2012, seven weeks from today, and Rick Perry cannot recall “the third of three federal departments he proposes to shut down” if elected. Paul Begala, also writing in Newsweek, labeled the Republicans “The Stupid Party,” for they know “they can earn applause by denouncing science,” especially evolution. They too have resorted to “jumping the shark.”

     The New York Times writer, Ross Douthat, said, “It will do America no good to replace the arrogant with the ignorant, the overconfident with the incompetent. In place of reckless meritocrats, we don’t need feckless know-nothings. We need intelligent leaders with a sense of their own limits, experienced people whose lives have taught them caution.” In other words, people who will not “jump the shark.”

     Now we hear rumors of a potential attack upon Iran, for that country, we are now told, has weapons of mass destruction. Did we not watch that show once before? I cannot imagine that many Americans would want to watch a repeat of that White House episode, and I fail to see any reason we should ever jump that shark again. Ron Paul, the Texas Congressman now running for President, argues that fears about Iran’s nuclear program have been “blown out of proportion,” and that “tough penalties are a mistake because they only hurt the local people and still pave a path to war.”

     The Christmas shopping season arrives this week. Basic faith and love in an eternal and divine being have been dwarfed by a frenzied adherence to obligations, want-satisfaction, duties, and a truckload of extensions that reside a fair distance from the faith from which it all originated. Even so, we dare not lose Christmas, the best holiday of the year.

     Certain human values rarely go out of style or lose their freshness, and, like a magnet, they attract others. Values such as graciousness, an appreciation of others’ efforts, patience, loyalty, dedication, determination, and, yes, thankfulness still shine bright. But can they be pushed to the point that they too are absurd? Of course! Consider Charles Dickens’ Uriah Heep and his insincere “humbleness.” What a boring world it would be indeed if everyone were just nice.

     President Lyndon Baines Johnson used to say that “where flattery is necessary, there is no excess.” But then he was a master of manipulation: he jumped the shark daily. Simple appreciation of others he lacked, and those Congressmen he approached knew he wanted one thing: their vote.

     In a list of human traits, gratitude and thankfulness do not rank as high as, say, ambition or trustworthiness, and perhaps for that reason it is wise that we celebrate Thanksgiving only once a year. Celebrated more often, it too would transform into an absurdity. The rest of the days of the year, we, the hearty members of the American workforce, are expected to go to work, an expectation that can cause us to feel like an absurdity, as if we are jumping the shark every day.

     A rabbi named Tarphon used to say, “It is not necessary for you to complete the work, but neither are you free to desist from it.” We work five days a week, enjoy our weekends, and then in late November every year, we stop work one day and give thanks. Nothing absurd about that. No “jumping the shark.” Have a great Thanksgiving!

THE HUNDRED MOST

by William H. Benson

November 10, 2011

     Last Monday the British Museum published in the United States its book A History of the World in 100 Objects. Over a four-year stretch a hundred curators at the London museum selected a hundred objects from their expansive collection, objects they believed were representative of the world’s history.

     Number one is an odd one—the mummified body of Hornedjitef, an Egyptian priest, which was placed in an inner and outer coffin in preparation for his travel to the next world. Numbers two and three make more sense: an Olduvai stone chopping tool from Tanzania, the oldest artifact in the collection, and an Olduvai hand axe. Number five is a Clovis spear point, “the first evidence of human activity in North America,” used for hunting large game, such as mammoths, to the point of extinction.

     Number six is a bird-shaped pestle found in New Guinea, dated to as far back as 6000 B.C. The museum’s director, Neil MacGregor, wrote, “The history of our most modern cereals and vegetables begins around 10,000 years ago. It was a time of newly domesticated animals, powerful gods, dangerous weather, and better food.”

     Number ninety-five is a defaced penny. The suffragettes of the early twentieth century, those women who campaigned for the right to vote, would stamp a multitude of pennies with the words “VOTES FOR WOMEN.” Ninety-eight is the “Throne of Weapons,” a chair composed of assault rifles and ammunition. Ninety-nine is a plastic credit card, and the final one is a solar panel and lamp.

     Eight hours in the bright sun will yield one hundred hours of lamplight, enough to light a single room. Today 1.6 billion people live without access to electricity, and this device “allows people to study, work, and socialize outside daylight hours, vastly improving the quality of many lives.”

     Others have pointed out that although objects are of crucial importance in the history of the world, so are individuals, and so are authors.

     In 1978 a writer named Michael Hart ranked the 100 most influential persons in history, according to his own criteria. Into the number one slot, he placed the prophet of Islam, Muhammad, “because he was the only man in history who was supremely successful on both the religious and secular levels.”

     Into the number two position, he put Isaac Newton, quoting Voltaire who said of Newton, “It is to him who masters our minds by the force of truth, and not to those who enslave them by violence, that we owe our reverence.” Three, four, five, and six were founders of the world’s other primary religions: Jesus, Buddha, Confucius, and St. Paul.

     William Shakespeare is well down the list, at number thirty-six. Hart argues, “I have ranked Shakespeare this low because of my belief that literary and artistic figures have had comparatively little influence on human history.” Huh? Really? I would disagree. Writers have always influenced those in positions of power and will continue to do so. Shakespeare ranks first among the world’s writers, which is where Daniel S. Burt ranked him in his book The Literary 100.

     Last week the movie Anonymous was released, and it supports the fallacious and oft-peddled idea that Shakespeare did not write his plays, but that the seventeenth Earl of Oxford, Edward de Vere, did. People who subscribe to this argument are called anti-Stratfordians. In my ranking of the 100 worst ideas of all time, I would put this idea in the top twenty, for the evidence is overwhelming that Shakespeare wrote the plays bearing his name.

     At the top of that same list, I would place, in order, Nazism, Communism, war, slavery, and racial segregation. Further down, I would put the idea that the American nation is washed up, soon to be relegated to the dust-bin of history. To that I say, “Never bet against the American people, their economy, their ingenuity, or their will to triumph over the worst difficulties.”

     Bad ideas continually surface. Stephen Marche wrote last week in The New York Times: “Our politeness has actually led us to believe that everybody deserves a say. The problem is that not everybody does deserve a say. Just because an opinion exists does not mean that the opinion is worthy of respect. Some people deserve to be marginalized and excluded.”

     As for the 100 best ideas of all time, I would place in the top ten the following: the domestication of plants and animals, self-government, written laws, the concept that humans possess rights and equality under law, the transmission of knowledge through written books, and the construction of libraries and museums.

CHARACTER

by William H. Benson

October 27, 2011

     In its September 18th edition, the New York Times Magazine ran an article on an interesting subject: developing students’ character. Dominic Randolph, the headmaster at New York City’s Riverdale Country School, suggested that in life character can be of far greater importance than intelligence.

     “There was this idea in America that if you worked hard and you showed real grit, that you could be successful. Strangely, we’ve now forgotten that. People who have an easy time of things, who get 800s on their SAT’s, I worry that those people get feedback that everything they’re doing is great.”

     “And I think as a result, we are actually setting them up for long-term failure. When that person suddenly has to face up to a difficult moment, then I think they’re ruined. I don’t think they’ve grown the capacities to be able to handle that.”

     Another educator, David Levin, superintendent of the KIPP (Knowledge Is Power Program) charter schools in New York City, has tried to incorporate character training into his schools’ curriculum. KIPP’s slogans are “Work Hard,” “Be Nice,” “There Are No Shortcuts,” and “Climb the Mountain to College,” and its goals are “that 75 percent of KIPP alumni will graduate from a four-year college, and 100 percent will be prepared for a stable career.” Currently, only 33 percent graduate from college.

     Why is character so important? Since humanity began its slow rise, societies have paid homage to moral and religious laws that guided people toward better behavior. For example, consider the Ten Commandments. Character also has a practical benefit: “cultivating these strengths represented a reliable path to the ‘good life,’ a life that was not just happy but also meaningful and fulfilling.”

     Randolph and Levin eventually came across the work of Angela Duckworth, Assistant Professor of Psychology at the University of Pennsylvania, who wrote, “I study competencies other than general intelligence that predict academic and professional achievement. My research centers on self-control and grit.” Grit, she determined, has little to do with intelligence: the two do not correlate positively.

     She wrote, “The problem, I think, is not only the schools but also the students themselves. Here’s why: learning is hard. True, learning is fun, exhilarating and gratifying—but it is also often daunting, exhausting and sometimes discouraging. To help chronically low-performing but intelligent students, educators and parents must first recognize that character is at least as important as intellect.”

     Duckworth developed a twelve-question test to determine a student’s degree of grit. For example, the final six statements are: “I have achieved a goal that took years of effort,” “I have overcome setbacks to conquer an important challenge,” “I finish whatever I begin,” “Setbacks don’t discourage me,” “I am a hard worker,” and “I am diligent.” Students rate themselves.

     What is remarkable, even amazing, about Duckworth’s three-minute test is that she discovered that it is a better predictor of success than IQ scores, SAT or ACT results, or a battery of other tests. For example, she administered the test to 139 Ivy League undergraduates in the fall of 2002 at the beginning of their freshman year and found that those who scored the highest grit factor on her test had the higher Grade Point Averages four years later.

     She also gave the test to incoming cadets at the U. S. Military Academy at West Point in 2004 and again in 2006 and determined that those who had scored higher on her grit survey were mainly those who stayed the course and graduated in 2008 and in 2010. She then gave her survey to spelling bee contestants, and it predicted who would make the final round.

     Duckworth noticed that self-control over one’s emotions and behavior is important but that the “people who accomplished great things often combined a passion for a single mission with an unswerving dedication to achieve that mission.” In other words, grit. True Grit.

     So, Levin and Randolph incorporated grit and self-control in their list of character traits for their students to focus upon, plus zest, social intelligence, gratitude, optimism, and curiosity.

     Intelligence is only one component of a student’s person, and people demonstrate various quantities of it. Yet, the same is true of ambition, hustle, and drive; of consideration for others’ feelings, kindness, generosity, and compassion; of reading other peoples’ thoughts and understanding interpersonal dynamics; and of passionate interest in a worthwhile subject. Every individual is unique.

 

     If only a store would sell boxes of grit and self-control; stock its shelves with gratitude, curiosity, and ambition; and put price tags on zest and perseverance. Would they sell out? Would parents buy them at Christmas to wrap and give to their children? Perhaps, but maybe not. Some people seem content with where they are in life. Label me pessimistic, not an admirable character trait.

CHRISTOPHER COLUMBUS

by William H. Benson

October 13, 2011

     Christopher Columbus completed four voyages to the Western Hemisphere. It was a tribute to his single-minded focus, his daring, his exemplary seamanship skills, his persuasive ability, and a dose of luck that he completed each of the four.

     After each return to Spain, he would report to the royal couple, Queen Isabella and King Ferdinand, approach their throne, and tell them what he had seen and done. They believed Columbus. They stared in dumbfounded amazement at the Indians that he brought back, and at the cages that held exotic birds and parrots. Each time they would choose to finance another larger and more grandiose voyage.

     For his first voyage, he departed the Canary Islands off the coast of Africa on September 8, 1492, and gave the order for the course: “West, nothing to the north, nothing to the south.” And so his three ships held a westerly course along 28 degrees north latitude. The primitive maps of his day indicated that Japan was at the same latitude as the Canary Islands, and Columbus believed that if he followed a direct westerly course, he would arrive at the Japanese islands and eventually see China.

     He was obsessed with getting to the Orient, for he was convinced that once there he would find mountains of gold and gems waiting for him to take back to Spain.

     Instead of Japan, though, he and his men came upon a small island in the Bahamas, due east of the tip of Florida, on October 12. And instead of finding gold, everywhere that Christopher and his fellow Spaniards sailed in the Bahamas they encountered only naked Indians. Cuba, they found, had no gold, but on the larger island of Hispaniola he saw some evidence of it.

     He departed Hispaniola on January 15, 1493, and struck land at Portugal’s coast on March 3.

     On November 1, 1493, he sailed west on his second voyage commanding a flotilla of seventeen ships that the royal couple had provided him. In the Caribbean, he discovered the Lesser Antilles, the Virgin Islands, and Jamaica, and established a Spanish colony at Santo Domingo on Hispaniola.

     However, problems overwhelmed him and his colony. Gold, he learned, was difficult to mine. The Indians were resentful of Columbus’ and his sailors’ very presence. They did not want to continue to feed these bearded white men, and certainly they did not want to dig gold for them. The Spaniards themselves were uncooperative with Columbus. They rebelled and would not work in the fields to provide food for themselves.

     He could not understand why so many people turned on him, and why they would lie to the King and Queen about him. Columbus returned to Spain in June of 1496.

     He should have quit then, but he was so obsessed with finding the treasures of the Indies that he went on two more voyages. The third was to the south, to the coast of South America, to Venezuela and to the mouth of the Orinoco River, and the fourth was to the west, to Central America, to the modern states of Honduras, Nicaragua, Costa Rica, Panama, and Colombia. On neither of these voyages did he find any evidence that he was near Japan, China, India, or the Malay Peninsula.

    During his third trip, Ferdinand and Isabella heard of Columbus’ exploitation of the natives and of his dictatorial manners toward the Spanish settlers on Hispaniola. The royal couple listened and sent Francisco de Bobadilla to investigate, and upon his arrival, he promptly arrested Columbus, put him in chains, and ordered him returned to Spain. Christopher departed Santo Domingo in early October 1500, and when given a chance to have his chains removed, Columbus replied, “I have been placed in chains by order of the sovereigns, and I shall wear them until the sovereigns themselves order them removed.”

     Ferdinand and Isabella listened to Columbus’ side of the story and concluded that when on board a sailing vessel he was an excellent mariner, but once on land he was not a good administrator. Because of that, the royal couple agreed to finance his fourth voyage, but only on the condition that he explore.

     After that final voyage, Christopher Columbus’ body was a wreck. He suffered from gout. Malaria he had caught in the New World drained him of all energy. Most painful was the rheumatoid arthritis he endured during the final years of his life. Just to move hurt him. Some scholars believe that because he had consumed river water on his voyages, he may have contracted shigellosis, a disease caused by a bacterium found in the American tropics. If untreated, it can lead to Reiter’s syndrome, in which the victim feels that parts of the body are swollen and inflamed, symptoms that Columbus complained of.

 

     He died on May 20, 1506, at the age of fifty-five. Up to the end, he was demanding that King Ferdinand pay him all that the crown had promised him and give him all the honors to which he was entitled. He received some, but not nearly all that he wanted. He died not fully understanding that his four voyages were to a new hemisphere, and that he had discovered two new continents: North and South America.

ERROL MORRIS’S “BELIEVING IS SEEING”

by William H. Benson

September 29, 2011

     In the September 4th edition of the New York Times Book Review, I happened to read an interesting review of a new book, Errol Morris’ Believing is Seeing: The Mysteries of Photography. Before Morris achieved distinction as a movie director, he worked as a detective, and so he approaches the business of understanding the details hidden inside a photograph, as a detective might, with bullheaded determination, sheer doggedness, and unflagging tenacity. His book is an unswerving search for truth.

     One primary photograph anchors each of his six chapters.

     The first is actually Roger Fenton’s two photographs, taken in 1855, of the same scene in the Valley of the Shadow of Death, so named for the ghastly battle fought there during the Crimean War. The first photograph shows the road without cannonballs, and the second shows it with them. Ever since, people have looked at the two photos and accused Fenton of posing, of placing those cannonballs there in the second in order to heighten the effect, an error in journalistic judgment.

     Morris, though, makes a persuasive argument that Fenton did not place the cannonballs there on the road, but that he took the two photographs at different times on the same day. Morris explained, “I wanted to experiment with lighting the cannonballs from various directions, replicating the directions of the sun and time of day.” From the shadows the sun cast, Morris determined that the cannonballs were simply more visible in the second photograph than in the first.

     I wonder why Morris finds this so important. So what? Why would he take seventy-one pages to arrive at such a simple idea? He explains that he is urging his readers to begin “thinking about some of the most vexing issues in photography—about posing, about the intentions of the photographer, about the nature of photographic evidence, and about the relationship between photographs and reality.”

     The second is that of the Hooded Man, standing atop a box with a poncho-like blanket draped over his thin frame, his fingertips wired to the electrical cables stretching up the wall behind him. It is one of those ugly photos taken in the Abu Ghraib prison in Iraq on the night of November 4-5, 2003.

     The third is that of Sabrina Harman, a young female U.S. Army soldier stationed at Abu Ghraib prison, who is smiling, looking up at the cameraman, her thumb raised. What makes the picture unique is that she is posed beside the corpse of an Iraqi man, whom Morris identified as Manadel al-Jamadi, who was killed during the course of an interrogation by CIA officials in the prison’s shower on the morning of November 4, 2003. Sabrina Harman, Morris determined, had nothing to do with the murder.

     The fourth is that of a steer’s skull lying bleached, white, and bone-dry on a sun-baked landscape in South Dakota in the summer of 1936, during “one of the worst droughts in American history.” The fifth is a Mickey Mouse toy lying atop the rubble after an Israeli air strike on southern Lebanon in 2006, and the sixth is that of a photograph of three children that Amos Humiston, a Union soldier, was holding in his hand on the day he was killed, July 1, 1863, at Gettysburg.

     For each of the six, Morris slowly peels away, like an onion, the series of mysteries that surround each photograph, digging deeper, taking it down to another level. Who was the photographer? What were his or her intentions? Was posing involved? What is the truth here? I assume that the lesson Morris wants to leave with his readers is that we are not to trust our eyes, that what we see in a photograph is not always what we should believe, even though he entitles his book Believing is Seeing.

     Posing can mean excluding something of importance from a photo. For example, what if an elephant had been on that Crimean road and Fenton had waited until it had passed? If that had been the case, Morris writes, “he posed the photograph by excluding something. . . . But how would you know? . . . Isn’t there always a possible elephant lurking just at the edge of the frame?”

     A photograph can be our sole frame of reference, as can a literary work, a painting, a job, a career, an education, or a speech. Those who exist within that single frame will not often permit anything new or different to enter into that frame. Such people succumb to what Morris labels “cognitive dissonance: when we embrace one theory, we will firmly reject beliefs that are incompatible with it. Faced with evidence that is incompatible with a theory, we will throw away the evidence, rather than the theory.”

     With photographs, there is always the potential that one has been faked or doctored, or that the photographer staged it, posing the scene, waiting for the elephant to pass out of or into the frame. Photos, videos, books, the news, speeches, and other frames of reference are all suspect, and they demand our scrutiny and best judgment to discover the truth.

     Yet, people are credulous; they want to believe what they see. I have said it before here, but it bears repeating now: “Believe only half of what you see, and none of what you hear.” I would question the word “half.” At the most, believe only ten percent of what you see.