
THOUGHTS ON PLANET EARTH


by William H. Benson

September 3, 2009

     Over the eons, our planet Earth has staggered violently between eras of sweltering heat and brutal chill. Cycles of global cooling followed by global warming have been the predominant pattern of Earth’s geological history. The last Ice Age began to end only about twelve thousand years ago, but at its peak, around twenty thousand years ago, about 30 percent of the Earth’s land surface was under ice, whereas today it is 10 percent.

     The Wisconsinan ice sheet covered much of North America then, while a similar sheet blanketed northern Europe, and in certain places the ice was half a mile thick. Standing at its leading edge, a person would have gaped in astonishment at a towering wall of glacial ice over two thousand feet high, and life in any form atop those millions of square miles of ice had to have been a struggle.

     The size of the Great Lakes and Hudson Bay, as well as the thousands of lakes spread across Canada and Minnesota, are evidence that those glaciers were indeed colossal. That they hold water today and not ice is a fact we owe to “global warming.”

     Bill Bryson suggested in his 2003 book, A Short History of Nearly Everything, that “ice will be a long-term part of our future.” Ice, rather than global warming.

     What causes an Ice Age, and what reverses it, remain largely unknown, even though theories abound: perhaps the ellipse that the Earth traces around the sun changes shape, or perhaps the tilt of the Earth’s axis wobbles.

     What is most astonishing is that on this, at times, bitter cold planet, life began, and, according to Bryson, “it happened just once. That is the most extraordinary fact in biology, perhaps the most extraordinary fact we know.”

     Matt Ridley, a biologist, said that, “Wherever you go in the world, whatever animal, plant, bug, or blob you look at, if it is alive, it will use the same dictionary and know the same code. All life is one.” And life’s beginning happened a long time ago.

     A hundred years ago this week, early in September of 1909, at the end of the fossil-finding season, an American paleontologist named Charles Doolittle Walcott stumbled upon the Burgess Shale 8,000 feet high in the mountains of British Columbia, above timberline in steep country, a hundred miles west of Calgary.

     His was an extraordinary and quite lucky find, for the fossils he found that day were later determined to be extremely ancient, from about 540 million years ago, and originated from what is called the Cambrian Explosion, when life was then quite young.

     The fossils were imprinted upon shale rock, and he noted that they were “bizarrely different,” for there were animals with five eyes, others that were shaped like a pineapple slice, and even others with stilt-like legs. Animal life, it seemed to Walcott, was then exuberant and daring, trying out all kinds of forms and experimenting.

     The Burgess Shale, wrote Stephen Jay Gould in his popular book Wonderful Life, was “our sole vista upon the inception of modern life in all its fullness.”

     However, the fossils that Walcott found were of the small and innocent-looking variety, such as minute oceanic crabs, but life moved on, drifting toward the big, mean, and fierce variety, the dinosaurs, which ruled the Earth, in a multitude of forms, for millions of years, while the cycles of hot and cold raged on about them.

     And then, about 65 million years ago, the dinosaurs and about half the world’s species were obliterated, it is believed, by the devastation following an asteroid or a comet striking the Earth, a catastrophe that gave our age, the Age of Mammals, a chance to begin.

     Based upon all of this, Bryson suggested four points: “Life wants to be; life does not always want to be much; life goes on; and life from time to time goes extinct.”

     “It is a curious fact,” wrote Bryson, “that on Earth, species death is, in the most literal sense, a way of life.” Of all the species of life forms that have ever lived—an estimated thirty billion to perhaps as many as 4,000 billion—99.99 percent are now extinct. “To a first approximation,” said David Raup at the University of Chicago, “all species are extinct.”

     The environment’s conditions may change—from glacial ice to burning desert sand, from a sea to a high mountain range, from a temperate climate to a cold and inhospitable one—and yet life continues everywhere on planet Earth, undaunted and daring to be.

CARTER’S NATIONAL MALAISE SPEECH


by William H. Benson

August 20, 2009

     The 1970s seemed a disaster for our country. It was a time of faltering American military power relative to the Soviets, productivity declines in our nation’s industries, galloping inflation, and high unemployment. The causes for this disaster were mainly political: a failure to control the money supply, excessive taxation, and government intervention. In short, it was a weak federal government headed by Jimmy Carter.

     By 1979, Carter’s approval rating had dropped to 25%, which was lower than Nixon’s during the Watergate scandal.

     If it was Carter’s job to restore faith in the nation’s political system, “here was his greatest failure—solving the economic problems of the American people.”

     Then, when it seemed it could get no worse, during the last week of June of 1979, OPEC, the oil-producing cartel, announced a series of oil price increases. Gas prices skyrocketed, severe shortages resulted in long lines and waits at the pumps, and the people were mad, venting their outrage over a seemingly endless economic decline.

     Beginning on July 4, 1979, Carter disappeared for the next ten days to his Presidential retreat at Camp David, and there, he listened to teachers, preachers, businessmen, and government officials—people whom he had invited—as they explained to him what was wrong with America, and what he should do about the energy crisis.

     Millions tuned in to listen to his televised speech on July 15. Since then, it has been called his “national malaise speech,” even though he never used the word “malaise.” People had expected him to talk about his energy legislation, and he did, but only briefly at the tail end of his speech. Instead, he chose to talk mainly about the American people’s lack of confidence, even though Walter Mondale, his Vice President, had warned him against doing so.

     A frowning Carter quoted from those who were at Camp David: “Mr. President, you are not leading this nation—you’re just managing the government.” “You don’t see the people enough any more.” “Some of your Cabinet members don’t seem loyal. There is not enough discipline among your disciples.” “If you lead, we will follow.”

     Then, he began to sound as if he was giving a sermon, which in a sense he was: “We are losing our confidence in the future.” “Too many of us now tend to worship self-indulgence and consumption.” “We’ve learned that piling up material goods cannot fill the emptiness of lives which have no confidence or purpose.”

     “This crisis of the American spirit is all around us.” “There is growing disrespect for government and for churches and for schools.” “Often you see paralysis and stagnation and drift.” “What you see too often in Washington and elsewhere is a system of government that seems incapable of action.”

     These were harsh words, and the people who heard them resented them. One commentator suggested that it was as if Carter had told the American public that they had bad breath, as if he had pointed an accusing finger at them and said, “It’s all your fault!”

     Two days later, on July 17, at a Cabinet meeting, Carter requested resignations from all the heads of the various departments. He only accepted five of them, but the damage was done: the public believed that the Carter Presidency was unraveling, and it was.

     Months later, the former Governor of California, sixty-eight-year-old Ronald Reagan, announced that he would run for President, and at that time he said, “I find no national malaise. I find nothing wrong with the American people!” Carter’s exact opposite, Reagan smiled, laughed, and joked: “A recession is when your neighbor loses his job. A depression is when you lose yours. And recovery is when Jimmy Carter loses his.” Reagan’s jaunty affability and folksy charm shone like sunlight upon the dark and gloomy mood, this so-called national malaise that Carter had dared to suggest.

     In a new book, What the Heck Are You Up To, Mr. President?, the historian Kevin Mattson wrote that both “Carter and Reagan sprang from the same ideological impulse; both responded mainly to moral decline, but where Carter saw a need for humility and limits, Reagan saw a vacuum of optimism and determination. Reagan carried the day.”

      “The new President’s self-confidence, and confidence in America, soon communicated itself. It was not long before the American public began to sense that the dark days of the 1970s were over, and that the country was being led again.”

     The economic recovery that began under Reagan continued well into the 1990s, “the longest continual economic expansion in American history.”

ALZHEIMER’S DISEASE


by William H. Benson

August 6, 2009

     “Alzheimer’s disease is the number one neurological disorder in the United States today,” said Jeanette Norden, a neuroscientist at Vanderbilt. One in eight people over the age of 65 show symptoms of the disease, and for those over 85, it approaches one in two.

     In 1906, Alois Alzheimer, a German doctor and medical researcher, gave a lecture in which he described a woman named Auguste D. who showed an “unusual disease of the cerebral cortex.” When just 51, Auguste began to suffer from memory loss and faulty judgment, and then she died at the premature age of 55. Dr. Alzheimer was surprised when the autopsy revealed that her cortex was thin, and that plaques and neurofibrillary tangles were deposited within her brain, typical of people much older.

     Alzheimer’s disease is characterized primarily by the loss of neurons in the brain, especially in the neocortical region, that area associated with higher-order thinking and emotional processing. Victims of the disease can walk, talk, eat, and function normally, but what is missing is the human thinking apparatus. This loss contributes to behavioral changes that are quite sharp and that profoundly affect their family members.

     Dr. Norden says that “there is no direct cause, and no known cure, and that it can only be diagnosed by an autopsy.”

     Researchers have, though, identified certain factors that increase a person’s risk: age, a severe head injury, inheriting the E4/E4 alleles for apolipoprotein E, a high-fat diet, elevated cholesterol levels, obesity, diabetes, hypertension, smoking, low blood levels of folic acid, and high levels of homocysteine.

     We have no control over some of the risk factors, such as our age or the alleles we inherit, but several factors we can control. In fact, Professor Norden commented that the risk factors for heart disease and stroke overlap with those for Alzheimer’s. A researcher said, “Anything that increases your chances of developing a stroke or a heart attack also increases your chances of developing Alzheimer’s.”

     The factors that decrease a person’s risk include: a low-fat diet, including omega-3 fats in the diet, such as found in cold water fish; maintaining a healthy weight; exercise; drinking fruit and vegetable juices; and continually finding mental challenges.

     These factors were, in part, determined from a research project undertaken by David Snowdon, a neuroscientist at the University of Kentucky. In 1986, he approached a convent of nuns, the School Sisters of Notre Dame in Mankato, Minnesota, and eventually 678 members agreed to allow Snowdon to observe their lifestyles. The nuns further surprised Snowdon when they agreed to donate their brains upon their passing.

     Religious groups are excellent for studying because of the members’ similarities, allowing a researcher to focus upon their minute differences. Snowdon understood that the School Sisters were noted for their longevity and a low incidence of Alzheimer’s.

     Snowdon continues his study today, and has written a delightful book—Aging With Grace: What the Nun Study Teaches Us About Leading Longer, Healthier, and More Meaningful Lives. Of the 678 nuns, there are now 61 survivors, but some lived well beyond 100. They led low-stress lives, they had many friends with whom they shared similar ideas, they ate small portions of a low-fat diet rich in fruits and vegetables, they did yoga daily, and they exercised several times every week.

     But above all else, they kept themselves mentally challenged, learning foreign languages, reading philosophy, engaging in stimulating conversations, and writing to their Congressmen. From his nun study, Snowdon concluded that normal aging need not be associated with major cognitive decline; much depends on lifestyle.

     One of Snowdon’s fascinating insights came from the nuns’ journals, which they had begun in their teens when they first entered the convent. Snowdon read their journals and concluded that those nuns who displayed greater skill with written language—a complexity of ideas and emotional content—at an early age “were significantly less likely to be diagnosed with Alzheimer’s” than those who displayed a deficiency.

     Dr. Norden suggests certain healthier habits. Avoid the sedentary lifestyle. Exercise every day. Learn new things. Keep friendships strong and make new friends. Reduce stress. Eat smaller portions of a low-fat diet. Keep love strong, and laugh often.

MOONWALK


by William H. Benson

July 23, 2009

     “Houston, Tranquility Base here. The Eagle has landed,” said Neil Armstrong from within the Lunar Module at 4:17 p.m. EDT, Sunday, July 20, 1969, the moment of touchdown on the moon. To his fellow astronaut Buzz Aldrin seated beside him, he said, “So far, so good.” He then turned back to his checklist and said, “Okay, let’s get on with it.” As Apollo 11’s mission commander, Armstrong was unemotional, professional, and all about getting the job done correctly.

     Meanwhile on Planet Earth jubilation broke out, for people everywhere were elated by the news: two Americans were on the moon. Walter Cronkite, the television newscaster, was struck by the magnitude of the moment and said, “Whew, boy! Man on the Moon!”

     Four days earlier, on Wednesday, July 16, from Kennedy Space Center, a Saturn V rocket had launched Apollo 11 into orbit, with Michael Collins, Edwin “Buzz” Aldrin Jr., and Neil Armstrong seated within the command module.

     The entire Apollo mission involved a number of discrete phases, each with a checklist. “You have to do things right away,” Armstrong explained, “and do them properly, so that was the focus. It was a complete concentration on getting through each phase and being ready to do the proper thing if anything went wrong in the next phase.”

     After one and a half orbits around the earth, the astronauts fired an engine that sent their spacecraft on a trajectory toward a rendezvous with the moon. Three days later, on July 19, they entered lunar orbit, and the next day Eagle, the lunar module, separated from Columbia, the command module. Collins remained alone in the command module while Armstrong and Aldrin directed the Eagle down to the moon.

     After resting and then slipping into their space suits, Armstrong and Aldrin prepared to descend the nine rungs of the Eagle’s ladder to the moon’s surface. First, Armstrong pulled a D-ring that released the MESA table from under the lunar module that then focused a television camera upon the ladder, and Aldrin flipped the circuit breaker that started the black and white camera rolling. The image was beamed back to Earth, and there people saw upon their television sets Armstrong as he stepped down the ladder.

     On Sunday, July 20, at 10:57 p.m. EDT, Armstrong’s foot touched the lunar crust, and he uttered his now eternally famous words, “That’s one small step for man, one giant leap for mankind.” No one had prompted him, and he had spent little time thinking about what he would say, so focused was he. But he had thought of the words, and he said them to a world of people anxiously watching their televisions 244,000 miles away.

     Aldrin followed Armstrong outside, and for the next two and a half hours, the two astronauts walked upon the moon’s surface. Aldrin called the scene before him “magnificent desolation.” Armstrong took photographs. They stuck a flagpole bearing a U.S. flag into the moon’s dust. They collected rock and dust samples, and then they listened as President Nixon from the Oval Office telephoned his congratulations. Continually they glanced up at “Spaceship Earth” hovering overhead in the sky.

     At 1:45 p.m. on Monday, July 21, the Eagle blasted off from the moon’s surface, without difficulty. “There was no time to sightsee,” Aldrin said. “I was concentrating intently on the computers, and Neil was studying the attitude indicator, but I looked up long enough to see the American flag fall over.” Seven minutes later, they had achieved a lunar orbit, and at 4:38 p.m. the Eagle docked with Columbia. Two hours later Armstrong and a smiling Aldrin joined a very pleased Collins in the command module. The men then cut loose Eagle. “It was a fond farewell,” Armstrong remembers.    

     Two and a half days later, on July 24, at 11:35 a.m. CDT, the command module slammed into the first fringes of air at some 400,000 feet above Earth; splashdown came minutes later at 11:51. Armstrong then radioed Houston, “Everyone okay inside. Our checklist is complete. Awaiting swimmers.”

     Forty years have elapsed since Apollo 11’s stunning success. Cronkite is now gone. Armstrong will turn 79 on August 5, and since 1972 no human has traveled beyond low-Earth orbit. “The Apollo program is the greatest technical achievement of mankind to date. Nothing since Apollo has come close to the excitement that was generated by those astronauts—Armstrong, Aldrin, and the ten others who followed them.”

HIV / AIDS


by William H. Benson

July 9, 2009

     I first heard about it on the radio in perhaps 1981. Paul Harvey reported that a number of young men in California were suffering from a variety of health problems with strange-sounding names: Kaposi’s sarcoma, candidiasis, pneumocystis, toxoplasmosis, leukoencephalopathy, and mycobacterium. At the time, this strange debilitating disease did not have a name or a diagnosis, but eventually, someone tagged it AIDS.

     Researchers went to work and discovered that those stricken with Acquired Immunodeficiency Syndrome may demonstrate different “opportunistic infections,” but they all shared a common trait—they were low on CD4 cells, below a level of 200 when a normal CD4 count is between 600 and 1000. Dr. Michael Gottlieb of the University of California said in early 1981, “They were virtually devoid of T-helper cells.” Their immune system had been profoundly compromised and was deficient.

     The medical detectives discovered that the AIDS victims seemed to belong to one of four groups—the so-called Four H Club: homosexuals, hemophiliacs, heroin-users, and Haitians. In those early years, the members of this club suffered most unfairly from severe discrimination and stigmatization within the emotionally-charged culture of blame and prejudice that surrounded all discussions of AIDS.

     Evidence pointed the medical detectives toward a virus as the culprit behind AIDS, and it was given the name HIV, Human Immunodeficiency Virus, specifically “HIV-1, group M, subtype B,” the most common strain found in the U.S.

     But then it was discovered that people outside those four groups were also infected with HIV—those who had received blood transfusions between 1981 and 1986, before laboratories tested for HIV. Donors of blood were often people desperate for money, such as intravenous drug users who perhaps shared contaminated needles, and their tainted blood entered the blood bank supply and then was sent worldwide.

     Most of those people who received infected blood donations during surgery went on to become HIV positive themselves. Two of those so unlucky included Isaac Asimov, the science fiction writer, and Arthur Ashe, the tennis champion. Ashe was the first African-American to compete in the international sport of tennis at the highest level of the game. Asimov died on April 6, 1992, and Ashe on February 6, 1993.

     But what about the Haitians? Why did AIDS show up in Haiti? Researchers are now confident, “99.7% certain,” that HIV came to the U.S. by way of Haiti, and that it had arrived on the Caribbean island from Haitians who had worked in the Congo in Africa.

     The “Hunter Theory” holds that in Africa, in south Cameroon, hunters shot chimpanzees that carried the Simian Immunodeficiency Virus. Hunters with cuts, scrapes, and wounds on their hands and arms came into contact with the chimpanzees’ blood, and SIV adapted itself within its new host, human beings, as HIV. The dots connect from the U.S. back to African chimpanzees by way of Haiti.

     The statistics are sobering. An estimated 33 million people living today are infected, 22 million of those live in sub-Saharan Africa, and approximately 25% of those infected are unaware that they are, because they have never been tested. Since 1981, more than 25 million people have died from AIDS.

     All this human suffering and devastation has originated from strands of DNA and RNA packaged within a membrane studded with protein—a virus. War, famine, and pestilence have harassed human progress for centuries, and pestilence seems the worst.

      Treatment for those infected has evolved into a combination of antiretroviral drugs that only prolong the lives of those infected, for there is no cure. Prevention has included stern warnings about a more watchful, guarded, and safer lifestyle.

     Ultimately, what humans need is a vaccine, and so on May 18, 1997, President Bill Clinton challenged medical researchers to develop an HIV vaccine, the best hope for ending AIDS. A dozen years later, the goal remains unfulfilled, even though the need is urgent: AIDS is the world’s number-four killer, but number one in sub-Saharan Africa.

 

     July 10th would have been Arthur Ashe’s 66th birthday, and if given a chance, he, I am sure, would have liked to watch Serena Williams beat her sister Venus, in the women’s finals match last Saturday at Wimbledon, where he won the men’s finals in 1975.