
ALZHEIMER’S DISEASE

by William H. Benson

August 6, 2009

     “Alzheimer’s disease is the number one neurological disorder in the United States today,” said Jeanette Worden, a neuroscientist at Vanderbilt. One in eight people over the age of 65 shows symptoms of the disease, and for those over 85, the figure approaches one in two.

     In 1906, Alois Alzheimer, a German doctor and medical researcher, gave a lecture in which he described a woman named Auguste D. who showed an “unusual disease of the cerebral cortex.” When just 51, Auguste began to suffer from memory loss and faulty judgment, and she died at the premature age of 55. Dr. Alzheimer was surprised when the autopsy revealed that her cortex was thin and that plaques and neurofibrillary tangles were deposited within her brain, findings typical of people much older.

     Alzheimer’s disease is characterized primarily by the loss of neurons in the brain, especially in the neocortical region, the area associated with higher-order thinking and emotional processing. Victims of the disease can walk, talk, eat, and function normally, but what is missing is the human thinking apparatus. This loss produces sharp behavioral changes that profoundly affect their family members.

     Dr. Worden says that there is “no direct cause, and no known cure,” and that the disease can be diagnosed only by an autopsy.

     Researchers have, though, identified certain factors that increase a person’s risk: age, a severe head injury, inheriting the E4/E4 alleles for apolipoprotein E, a high-fat diet, elevated cholesterol levels, obesity, diabetes, hypertension, smoking, low blood levels of folic acid, and high levels of homocysteine.

     We have no control over some of the risk factors, such as our age or the alleles we inherit, but several factors we can control. In fact, Professor Worden commented that the risk factors for heart disease and stroke are the same as those for Alzheimer’s; they overlap. A researcher said, “Anything that increases your chances of developing a stroke or a heart attack also increases your chances of developing Alzheimer’s.”

     The factors that decrease a person’s risk include: a low-fat diet that includes omega-3 fats, such as those found in cold-water fish; maintaining a healthy weight; exercise; drinking fruit and vegetable juices; and continually finding mental challenges.

     These factors were, in part, determined from a research project undertaken by David Snowden, a neuroscientist at the University of Kentucky. In 1986, he approached a convent of nuns, the School Sisters of Notre Dame in Mankato, Minnesota, and eventually 678 members agreed to allow Snowden to observe their lifestyles. The nuns further surprised Snowden when they agreed to donate their brains upon their passing.

     Religious groups are excellent subjects for study because of their members’ similarities, which allow a researcher to focus upon minute differences. Snowden understood that the School Sisters were noted for their longevity and a low incidence of Alzheimer’s.

     Snowden continues his study today, and he has written a delightful book—Aging With Grace: What the Nun Study Teaches Us About Leading Longer, Healthier, and More Meaningful Lives. Of the 678 nuns, 61 now survive, and some lived well beyond 100. They led low-stress lives, they had many friends with whom they shared similar ideas, they ate small portions of a low-fat diet rich in fruits and vegetables, they did yoga daily, and they exercised often each week.

     But above all else, they kept themselves mentally challenged, learning foreign languages, reading philosophy, engaging in stimulating conversations, and writing to their Congressmen. From his nun study, Snowden concluded that normal aging does not have to be associated with major cognitive decline, and that cognitive health is often related to lifestyle.

     One of Snowden’s fascinating insights came from the nuns’ journals, which they had begun in their teens when they first entered the convent. Snowden read their journals and concluded that those nuns who displayed greater skill with written language—a complexity of ideas and emotional content—at an early age “were significantly less likely to be diagnosed with Alzheimer’s” than those who displayed a deficiency.

     Dr. Worden suggests certain healthier habits. Avoid the sedentary lifestyle. Exercise every day. Learn new things. Keep friendships strong and make new friends. Reduce stress. Eat smaller portions of a low-fat diet. Keep love strong, and laugh often.

MOONWALK

by William H. Benson

July 23, 2009

     “Houston, Tranquility Base here. The Eagle has landed,” said Neil Armstrong from within the lunar module at 4:17 p.m. EDT, Sunday, July 20, 1969, the moment of touchdown on the moon. To his fellow astronaut Buzz Aldrin seated beside him, he said, “So far, so good.” He then turned back to his checklist and said, “Okay, let’s get on with it.” As Apollo 11’s mission commander, Armstrong was unemotional, professional, and all about getting the job done correctly.

     Meanwhile on Planet Earth jubilation broke out, for people everywhere were elated by the news: two Americans were on the moon. Walter Cronkite, the television newscaster, was struck by the magnitude of the moment and said, “Whew, boy! Man on the Moon!”

     Four days earlier, on Wednesday, July 16, from Kennedy Space Center, a Saturn V rocket had launched Apollo 11 into orbit, with Michael Collins, Edwin “Buzz” Aldrin Jr., and Neil Armstrong seated within the command module.

     The entire Apollo mission involved a number of discrete phases, each with a checklist. “You have to do things right away,” Armstrong explained, “and do them properly, so that was the focus. It was a complete concentration on getting through each phase and being ready to do the proper thing if anything went wrong in the next phase.”

     After one and a half orbits around the earth, the astronauts fired an engine that sent their spacecraft on a trajectory toward a rendezvous with the moon. Three days later, on July 19, they entered lunar orbit, and the next day Eagle, the lunar module, separated from Columbia, the command module. Collins remained alone in the command module while Armstrong and Aldrin directed the Eagle down to the moon.

     After resting and then slipping into their space suits, Armstrong and Aldrin prepared to descend the nine rungs of the Eagle’s ladder to the moon’s surface. First, Armstrong pulled a D-ring that released the MESA table from under the lunar module and focused a television camera upon the ladder, and Aldrin flipped the circuit breaker that started the black-and-white camera rolling. The image was beamed back to Earth, where people watched on their television sets as Armstrong stepped down the ladder.

     On Sunday, July 20, at 10:57 p.m. EDT, Armstrong’s foot touched the lunar crust, and he uttered his now eternally famous words, “That’s one small step for man, one giant leap for mankind.” No one had prompted him, and he had spent little time thinking about what he would say, so focused was he. But he had thought of the words and said them to a world of people anxiously watching their televisions 244,000 miles away.

     Aldrin followed Armstrong outside, and for the next two and a half hours, the two astronauts walked upon the moon’s surface. Aldrin called the scene before him “magnificent desolation.” Armstrong took photographs. They stuck a flagpole bearing a U.S. flag into the moon’s dust. They collected rock and dust samples, and then they listened as President Nixon telephoned his congratulations from the Oval Office. Continually they glanced up at “Spaceship Earth” hovering overhead in the sky.

     At 1:45 p.m. on Monday, July 21, the Eagle blasted off from the moon’s surface without difficulty. “There was no time to sightsee,” Aldrin said. “I was concentrating intently on the computers, and Neil was studying the attitude indicator, but I looked up long enough to see the American flag fall over.” Seven minutes later, they had achieved lunar orbit, and at 4:38 p.m. the Eagle docked with Columbia. Two hours later Armstrong and a smiling Aldrin joined a very pleased Collins in the command module. The men then cut the Eagle loose. “It was a fond farewell,” Armstrong remembers.

     Two and a half days later, on July 24, at 11:35 a.m. CDT, the command module slammed into the first fringes of air at some 400,000 feet above Earth; splashdown came minutes later at 11:51. Armstrong then radioed Houston, “Everyone okay inside. Our checklist is complete. Awaiting swimmers.”

     Forty years have elapsed since Apollo 11’s stunning success. Cronkite is now gone. Armstrong will turn 79 on August 5, and since 1972 no human has traveled beyond low-Earth orbit. “The Apollo program is the greatest technical achievement of mankind to date. Nothing since Apollo has come close to the excitement that was generated by those astronauts—Armstrong, Aldrin, and the ten others who followed them.”

HIV / AIDS

by William H. Benson

July 9, 2009

     I first heard about it on the radio in perhaps 1981. Paul Harvey reported that a number of young men in California were suffering from a variety of health problems with strange-sounding names: Kaposi’s sarcoma, candidiasis, pneumocystis, toxoplasmosis, leukoencephalopathy, and mycobacterium. At the time, this strange debilitating disease did not have a name or a diagnosis, but eventually, someone tagged it AIDS.

     Researchers went to work and discovered that those stricken with Acquired Immunodeficiency Syndrome might demonstrate different “opportunistic infections,” but they all shared a common trait—they were low on CD4 cells, below a count of 200, when a normal CD4 count is between 600 and 1,000. Dr. Michael Gottlieb of the University of California said in early 1981, “They were virtually devoid of T-helper cells.” Their immune system had been profoundly compromised and was deficient.

     The medical detectives discovered that the AIDS victims seemed to belong to one of four groups—the so-called Four H Club: homosexuals, hemophiliacs, heroin users, and Haitians. In those early years, the members of this club suffered most unfairly from severe discrimination and stigmatization within the emotionally charged culture of blame and prejudice that surrounded all discussions of AIDS.

     Evidence pointed the medical detectives toward a virus as the culprit behind AIDS, and it was given the name HIV, Human Immunodeficiency Virus, specifically “HIV-1, group M, subtype B,” the most common strain found in the U.S.

     But then it was discovered that people outside those four social groups were also infected with HIV—those who had received blood transfusions between 1981 and 1986, before laboratories tested for HIV. Donors of blood were often people desperate for money, such as intravenous drug users who perhaps shared contaminated needles, and their tainted blood entered the blood bank supply and was then sent worldwide.

     Most of those who received infected blood during surgery went on to become HIV positive themselves. Two of those so unlucky were Isaac Asimov, the science fiction writer, and Arthur Ashe, the tennis champion. Ashe was the first African-American to compete in the international sport of tennis at the highest level of the game. Asimov died on April 6, 1992, and Ashe on February 6, 1993.

     But what about the Haitians? Why did AIDS show up in Haiti? Researchers are now confident, “99.7% certain,” that HIV came to the U.S. by way of Haiti, and that it had arrived on the Caribbean island from Haitians who had worked in the Congo in Africa.

     The “Hunter Theory” holds that, in south Cameroon in Africa, hunters had shot chimpanzees that carried the Simian Immunodeficiency Virus. Hunters with cuts, scrapes, and wounds on their hands and arms had come into contact with the chimps’ blood, and SIV had adapted itself within its new host, human beings, as HIV. The dots connected from the U.S. back to African chimpanzees by way of Haiti.

     The statistics are sobering. An estimated 33 million people living today are infected, 22 million of them in sub-Saharan Africa, and approximately 25% of those infected are unaware that they are, because they have never been tested. Since 1981, more than 25 million people have died from AIDS.

     All this human suffering and devastation has originated from strands of genetic material, RNA in HIV’s case, packaged within a membrane studded with protein—a virus. War, famine, and pestilence have harassed human progress for centuries, and pestilence seems the worst.

      Treatment for those infected has evolved into a combination of antiretroviral drugs that only prolong life, for there is no cure. Prevention has included stern warnings about a more watchful, guarded, and safer lifestyle.

     Ultimately, what humans need is a vaccine, and so on May 18, 1997, President Bill Clinton challenged medical researchers to develop an HIV vaccine, the best hope for ending AIDS. A dozen years later, the goal remains unfulfilled, even though the need is urgent: AIDS is the world’s number 4 killer, but it is number 1 in sub-Saharan Africa.

     July 10th would have been Arthur Ashe’s 66th birthday, and if given the chance, he, I am sure, would have liked to watch Serena Williams beat her sister Venus in the women’s final last Saturday at Wimbledon, where he won the men’s final in 1975.

THOUGHTS ON CUSTER

by William H. Benson

June 25, 2009

     “The defeat of General Custer and the Seventh Cavalry on June 25, 1876, with a loss of 265 men killed and 52 wounded, was the most sensational battle of the western Indian wars.” So wrote George Bird Grinnell in his chapter “The Custer Battle, 1876” in his classic work, The Fighting Cheyennes.

     Grinnell was most unusual among western historians: he spent time with the Native Americans, interviewed them, especially the Cheyennes, and then recorded their oral histories of what had happened at their battles with the white soldiers between 1856 and 1879. Most historians dismissed the Native Americans’ accounts as untrustworthy, but Grinnell wrote them down as he heard them, “without comment.”

     “Grinnell found among the Cheyennes that their leaders were men of honor and veracity, honest and guarded in their statements, avoiding hearsay.”

     By the mid-1870s, certain of the Plains Indian tribes resented the Americans’ invasion of their lands, and they refused to accept the white men’s treaties.

     Chief among these “non-treaty” Sioux was the Hunkpapa Sioux medicine man, Sitting Bull, who had experienced a vision that the gods had granted him power over the white soldiers. In the spring of 1876, in the Rosebud River country of southern Montana, he gathered to him thousands from the tribes of the Sioux nations: the Hunkpapa, Brule, Ogalalla, Minneconjou, Santee, and the Yanktonnais. Joining these Sioux were also the Northern Cheyennes and the Arapahoe.

     The U.S. government responded to Sitting Bull’s defiance and dispatched three armies toward the Rosebud, one of which was commanded by General Alfred Terry, along with his lieutenant colonel, George Armstrong Custer, from Fort Lincoln near Bismarck, North Dakota. The three armies intended to converge on the Rosebud.

     Early on Sunday morning, June 25, 1876, Custer of the 7th Cavalry was on the divide between the Rosebud and the Little Big Horn River when he received his scouts’ report that the Plains Indians were situated on the west side of the Little Big Horn, north and west of his current location. Custer decided to attack and divided his army into three units.

     Major Marcus A. Reno attacked the south end of the Indian camp, but when Reno saw the size of the camp, he decided not to continue his attack, though he was only a quarter of a mile from the village. A mistake. He and his troops retreated to a grove of timber, but, panic-stricken, his 112 soldiers then bolted from the woods’ protection, crossed the Little Big Horn, and sought refuge upon a bluff.

     The Cheyenne later told Grinnell that they thought the white soldiers’ behavior was bizarre. “We could never understand why the soldiers left the timber, for if they had stayed there, the Indians could not have killed them.” More than 40 soldiers were killed.

     Captain Frederick Benteen, who commanded Custer’s second unit, arrived on the bluffs just as Reno’s men were scaling them. Benteen asked Reno, “Where is Custer?” But Reno did not know. Custer had led the third unit himself, five miles to the north, hoping to flank the entire Indian camp in a surprise attack. Suddenly, Custer too saw the immensity of the village, between 10,000 and 12,000 Indians, of whom some three thousand were warriors.

     Custer chose not to attack. A mistake. But the Indian warriors, led by the chiefs Gall, Crazy Horse and Two Moon, recognized Custer’s indecision and attacked, while Custer and his soldiers tried to stand them off.

     “Indians and troopers fought in a big tangle. Everyone fired wildly. No one stopped to see who was who. They were so densely packed that the Indians were shooting each other. . . . Many of the Indians hacked away with clubs or hatchets. Soldiers died. It was every man for himself.” The fight was soon over, and Custer’s entire unit was wiped out.

     Years later, the Cheyenne warriors who fought there that day admitted to George Grinnell “that if Reno and Custer had kept on and charged through the village from opposite ends, the Indians would have scattered and there would have been no disaster.”

     That may have been so, but at the age of thirty-six, George Armstrong Custer, the most flamboyant and sensational Civil War general and Indian fighter that America had ever produced, was gone. The Indians had called Custer “Long Hair” or sometimes “Son of the Morning Sun.” The latter was a name filled with homage, for the Native Americans revered the sunrise, that eastern sky where the sun is born again and again.

TELEVISION AND TIANANMEN SQUARE

by William H. Benson

June 11, 2009

     Television audiences all over the world were focused upon events playing out in Tiananmen Square in Beijing, China, in 1989, twenty years ago last week. Angry students had taken to the Square, where for days they had marched in protest against the communist government, shouting themselves hoarse as they demanded greater freedoms.

     Never to be forgotten, though, was the scene of a single college student standing before a line of army tanks, defying them to roll over him. Television cameras captured it all.

     The government declared martial law and sent in the army, and on June 4, with guns blazing, the soldiers and tanks did roll over the students. “Amid the gunfire, the protests quickly subsided.” The students fled, but the government rounded them up, and many were executed. For weeks afterwards, Chinese television reporters repeatedly broadcast shots of despairing students on the wanted list, handcuffed and being taken into custody.

     And then one day the picture on the television screen changed. Suddenly, there were scenes of prosperity and reports on progress in China. The government had instantly swept the Tiananmen Square demonstrators from the public’s eye. As a result, young Chinese, those under the age of thirty, know little about what happened at Tiananmen Square because the government changed the channel, permanently.

     Eleven months later, on May 3, 1990, an author named Bill McKibben, then living in upstate New York, decided he would watch television on all the channels of his cable company for the twenty-four hours of that day. To do this, he enlisted the help of about a hundred of his friends, each to record one of the channels on videocassette. Then, for weeks afterwards, he watched each of those channels.

     McKibben, in his book The Age of Missing Information, asked a question: “Does having access to more information than ever by way of television mean that we know more than ever?” McKibben did not believe so. “We say information reverently, as if it meant understanding or wisdom, but it doesn’t. Sometimes it even gets in the way.”

     Television directs a fire hose of information and entertainment into our homes. Tiananmen Square gave way to the Persian Gulf War, to the Clinton presidency and his impeachment, to 9/11, to Bush’s wars in Afghanistan and Iraq, and finally to President Obama speaking in Cairo last week. It is all instantaneous and live, and it is not old information but new: after all, it is the news.

     The television is ubiquitous, standing in our living rooms, available 24 hours a day, and it is repetitive: who has not seen a rerun of Andy Griffith?

     UNESCO in 1953 investigated the new phenomenon of television and concluded, “Television constitutes an enormous drain on creative talent. It is difficult if not impossible to provide every day of the week good dramatic shows, good entertainment, good educational broadcasting, and good children’s programmes. The result of long hours of programmes is bound to be that quality becomes the exception.”

     Fred Friendly, a network executive, once said, “Television at its very worst makes so much money. Why would it ever want to be its very best?”

     And the people rule; television watchers determine the content. “It’s what you want,” McKibben wrote. The chairman of MTV networks once said, “The consumer is our God. Millions of dollars are spent to find out what the viewers want to see.” What counts are the people, and the networks ceaselessly curtsey and curry our favor. This inevitably distorts our view of the world; the thought is that we, each of us, are the most important point in the universe. “We do it all for you,” and “Have it your way,” are TV slogans.

     The truth is that the world does not care. It is actually an exceedingly dangerous place for people, and there are forces ready to roll over us without pausing.

     “The stars in the night sky hold no interest to advertisers,” McKibben wrote, “for they don’t reflect us. The stars that do reflect us are the kind that appear on talk shows.” Television creates “celebrities who are celebrated almost entirely for being celebrities.”

     Chairman Mao insisted that citizens “serve the people.” And a constant refrain was, “Since 1949, the people are the masters.” The newspaper is the People’s Daily, and the country is the People’s Republic of China.

      For the people, the Chinese government changed the channel in 1989, and so Tiananmen Square is “the Forgotten Revolution.” In America, “We the people” can change the channel whenever we choose, but one channel is no different than another, an easily “forgotten” comedy or drama or news story. A hundred channels, but nothing worth watching.

CHOOSING A MAJOR

by William H. Benson

May 28, 2009

     The graduation season is now behind us, and a new school year and college loom ahead for the graduates, some weeks hence. This summer the new crop of high school graduates will arrive at a decision they have been mulling over for months—which subject they will major in. The act of choosing one major precludes choosing all the others: in order to enroll at the university, we are compelled to be selective.

     This decision has an overpowering impact upon students’ lives, one that carries forward for decades. One student who majored in psychology thirty-five years ago has struggled ever since with a succession of jobs—at a big discount store, as a truck driver, and now as a custodian at the post office. Another enrolled in physical education classes, never located a job in a school, and now barely survives as a handyman. Another studied English literature, has taught English in a Denver metro high school ever since, and now counts the years and days until she can retire.

     Choose wisely, perhaps a subject with broad commercial appeal, for your choice will echo forward through the decades of your life and your children’s lives, in terms of earning potential, personal gratification versus the misery index, and where you can live.

     There are few rocket scientists in rural America, only a smattering of brain surgeons there, but most small towns have schools that need teachers, banks that need bankers, and offices that require attorneys or realtors or accountants. If you want to live in a small town, you might plan your major around the jobs that exist there.

     The job market degrades us all; it forces us to accept the job that is available where we find ourselves, not necessarily that job for which the university has trained us.

     The universities and colleges of America are world class, and they do an excellent job of preparing people for the professions. Yet they do an even better job of pulling open the doors marked wisdom and pointing to the treasures contained within the world. The university is a repository of the knowledge that the wisest humans have accumulated painstakingly over the ages, and it is available for the asking.

     The perceptive student asks, “Where else can I go and read all day?” Others show up the first day at college with the contrary attitude. “What I don’t know isn’t worth knowing,” they might say, and then defy anyone to teach them a thing. But at college, curiosity is a prerequisite.

     “There are whole areas of knowledge which, if we have any curiosity or sense of life at all, we crave to explore.” Each area of knowledge is a new world—a series of caves that opens wider and wider.

     Sometimes the research may lead us into places we may not want to go. We are forced to visualize, to see with new eyes, for example, the sweeping changes of the historical past, the intricacies contained within the living cell, the power of great literary works, the symmetry of a chemical molecule, and the transformation of a language over time.   

     The world in which we live is an incredibly complicated one. If a student declares a major in biology, the first question is, “Which aspect of biology?” For there is cellular biology, biochemistry, botany, zoology, microbiology, anatomy, ecosystems, and a series of other subjects that all fit under the header of biology.

      If you were to study history, you might well ask “what era and what geographical region?” History includes ancient, European, American, Asian, or even African histories. European history is further divided into British, French, German, Dutch, Italian, and Russian, and these are each divided into centuries, and those centuries into topics.

     The same can be said for chemistry, mathematics, astronomy, religion, physics, medicine, and business, for within any broad subject there is layer upon layer of knowledge that has been carefully researched, documented, and then written into the textbooks. A careful student will limit his or her scope.

     Henry David Thoreau said, “Simplify. Simplify. Simplify.” He followed his own advice by living in a cabin near Walden Pond, but I instead say to you, future college student, “Outline. Outline. Outline.” It is a proven method for organizing a mountain of mind-twisting information, much like a wall of pigeonholes, so that you can retain that information, perhaps withdraw it later, and then use it for your own betterment.

     To choose a college major wisely means selecting where you want to live, factoring in your own innate curiosity, expanding your talent to visualize, limiting your scope, and finally adding in a measure of patience. Wait and allow your major to reveal itself to you. “Patience and patience,” said Emerson, “we shall win at last.”