Movie review: Dr. Strangelove

With Vladimir Putin rattling the Russian sabers last week, it seemed time to watch again the classic Cold War movie Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb. Made in 1964, Dr. Strangelove depicts the possibility of the world’s superpowers going to war because of the belligerence of one United States general.

The movie opens with a comforting statement from the United States Air Force that the events depicted in the movie could not possibly happen in real life. Yet the rules and regulations used by Brigadier General Jack D. Ripper seem entirely reasonable and likely in the context of the film. Usually described as a black comedy, the script contains remarkably few laugh-out-loud lines. (“Gentlemen, you can’t fight in here—this is the War Room,” is one of the few.) The humor consists rather in situational comedy and irony bordering on parody: an Air Force pilot replaces his regulation helmet with a cowboy hat after receiving the order to bomb targets in the Soviet Union; a military officer with the code that can call off the attack attempts to reach the President and his advisors from a pay phone but does not have enough spare change to place the call.

Dr. Strangelove combines the extemporaneous comedy of Peter Sellers with the micromanaging direction of Stanley Kubrick. Sellers is one of the very few actors who have had a major role in more than one Kubrick film. This improbable pairing shows the enormous respect the two professionals held for one another. The cast also includes Sterling Hayden as General Ripper, George C. Scott as General Turgidson (a gung-ho, gum-chomping general who must explain to the President and his advisors what is happening and why—the gravelly voice of Scott’s future portrayal of General Patton can be heard from time to time), Slim Pickens as the Air Force pilot, and James Earl Jones as a member of his crew. Sellers is given three roles: the title character, the American President, and an RAF officer assigned to General Ripper’s staff.

The title character, Dr. Strangelove, is meant to portray German scientists like Wernher von Braun, who were brought to the United States after World War II to assist the military and the space program. As portrayed by Sellers, he is uncannily reminiscent of a then-unknown Harvard Professor of Government named Henry Kissinger. Of his three characters, Sellers spends the least time on the screen as Strangelove. His portrayal of President Merkin Muffley—said to be based on unsuccessful presidential candidate Adlai Stevenson—makes the character a single voice of calm and reason surrounded by insanity, yet Sellers’ comedic genius shines in his telephone conversations (during which only his words are heard) with the Soviet Premier. Group Captain Lionel Mandrake is also, for Sellers, an understated character, played against the madness of General Ripper. Yet his efforts to wheedle the call-back code from the general, along with his scene in the telephone booth, are among the highlights of the movie.

Kubrick based the movie on a serious novel, Peter George’s Red Alert, and only realized along the way that the movie would play better as a comedy than as a serious war film. The foolishness of a Mutually Assured Destruction policy, followed by both the United States and the Soviet Union during the 1960s, is skillfully portrayed in the film. This movie may have helped lead to the turn toward détente that both governments attempted in the 1970s. Peter Sellers was the first actor to be nominated for an Academy Award for a film in which he portrayed more than one character. The movie was nominated for Best Picture (and remains the longest-titled movie to be so honored) along with Zorba the Greek, Becket, and Mary Poppins, but they all lost to My Fair Lady.

Much has changed in the world since 1964, but Putin’s boasts last week about Russian weaponry remind us that much has also stayed the same. It may be only the grace of God that has spared the world thus far from the incredible damage humanity is capable of causing, whether through a deliberate act of hate or through mere carelessness and stupidity. For this divine protection we should be thankful every day. J.

The true beginning of spring

The beginning and end of the seasons are matters for some dispute. Makers of almanacs and calendars proclaim changes of season on the equinoxes and the solstices. The spring equinox this year will take place at 11:55 a.m. Central Daylight Time. At that moment, the earth’s axis will be tilted neither toward nor away from the sun, and the sunlight will strike directly upon the equator. As a result, on that day all parts of the earth will experience twelve hours of daylight and twelve hours of nighttime—hence the term “equinox.”

Yet in the United States the social calendar does not reflect the calendar of equinoxes and solstices. Summer traditionally begins on Memorial Day weekend and traditionally ends on Labor Day weekend. A holiday season begins when stores start displaying their Christmas decorations and advertising their Christmas sales—recently, this has happened around the end of October. The same holiday season ends with the celebration of the New Year, and then comes a dark and dismal season punctuated by a series of minor holidays including Dr. King’s birthday, Super Bowl Sunday, St. Valentine’s Day, and Presidents’ Day.

But when does winter end and spring begin? One theory holds that winter ends if the groundhog emerges from his burrow on February 2nd and does not see his shadow. If he sees his shadow, he returns to his burrow and we have six more weeks of winter (putting the start of spring shortly before the equinox). Other people make the celebration of Easter the beginning of spring, putting the start of the season anywhere between March 22 and April 25.

For three reasons, I place the start of spring at the beginning of March. First, this division nicely breaks the year into two halves. From March to August we write the full names of months, using three to six letters. From Sept. to Feb. we abbreviate the names of the months, using three or four letters. In my opinion, that distinguishes the times of the year as well as any other measurement.

Moreover, this plan provides nearly three full months of spring before the summer social calendar kicks in on Memorial Day weekend. Following this pattern, summer ends on Labor Day weekend, and the start of winter can be placed around the beginning of December.

But the best way to identify the beginning of spring is to consult the lyrics of Lerner and Loewe’s classic Broadway musical Camelot. In this idealized world, as young King Arthur assures his future bride Guinevere, even the weather is subject to royal decree. Among the commands that the weather must follow are these stipulations: “The winter is forbidden ‘til December, and exits March the second on the dot.” Following this command of the king, the Salvageable household invariably acknowledges the beginning of spring on the second day of March.

Blessings to you on all your spring activities. J.

All things considered, I’d rather have the flu

Some days I could almost—almost, I say—envy those people who came down with the flu this winter. And I could almost—almost, I say—offer to exchange troubles with them if that were possible, and if they were willing to make the trade.

With the flu, a person has measurable symptoms such as a fever and a cough. With anxiety and depression, few if any symptoms can be perceived from outside. When one has the fever and cough of the flu, other people are willing to believe what one says about aches and weariness. When one has anxiety and depression, other people are more likely to say, “It’s all in your head,” meaning, “Nothing is really wrong with you.” A few people might think of the flu virus as imaginary and the flu as something that can be conquered by positive thinking. Many more treat anxiety and depression as imaginary, as things that can be conquered by positive thinking.

When someone calls in with the flu, the advice given is usually gentle, kind, and considerate: “Take care of yourself, get plenty of rest, drink lots of fluids, and don’t push yourself—don’t try to come back until you are sure you are better.” With anxiety and depression, the advice is usually less helpful: “Don’t mope; don’t feel sorry for yourself; think about other people and their problems; get active and keep yourself busy and your problems won’t seem so big.” This advice shows a fundamental misunderstanding of what anxiety and depression are doing to a person. It is as if a well-meaning person walked up to a paralyzed man lying on a stretcher and said, “Get up and take a walk—it’ll make you feel better.” Only our Lord Jesus Christ was able to cure paralysis by telling people to get up and walk. The rest of us can only make that victim feel worse by encouraging him to do exactly what he wants to do and cannot do. The person battling anxiety and depression is equally vulnerable. He or she wants to cheer up, wants to be active doing useful things, and wants to feel better. Those are exactly the things he or she cannot do. Encouraging such a response to anxiety and depression is like rubbing salt into a wound.

How can you support a person battling anxiety and depression? Let them know that you care. Be available for them, even if they do not seem to want anything from you. Avoid advice about how to handle their problems, unless you are a qualified counselor or physician. Do not criticize them for taking medication or seeking counseling to help with their problems—don’t criticize, even if you are convinced in your own mind that such medicines and counseling services are a fraud and a rip-off. Above all, avoid blaming them even indirectly for their problems. Don’t tell them that their fears and their sorrows are signs that they do not believe God’s promises. Pray for them, wish them peace and calm, and keep on loving them—even when their struggles and their means of coping with those struggles make them seem unlovable.

The American landscape has become friendlier towards people who have limited mobility. It has become kinder towards people who have limited intelligence. Insults are still spoken, and sometimes people resist the facilities that accommodate people with various challenges. We still have a long way to go toward accepting and helping those with emotional challenges. That journey begins with genuine kindness and compassion. J.

Memories, like the corners of my head

My memory is not what it used to be.

Then, again, it never was.

Somehow this winter I’ve wandered the internet, chasing down rabbit holes about memory. The news is not good. According to various studies, human memory is not as reliable as we would like to believe. In fact, our memories can be changed merely by the way people ask us questions about what we have seen or heard.

Example #1: Show a film of two cars colliding to various individuals. Afterward, ask them several questions about the filmed collision. Ask some of them to judge how fast the cars were traveling when they collided, but ask others who saw the same film how fast the cars were traveling when they crashed. Those who hear the word “crash” in the question generally will remember the cars moving faster than those who hear the word “collide.”

Example #2 is similar. Ask people to study a photograph of two cars that have collided. After taking away the photograph, ask them several questions about what they saw. Ask if they remember seeing broken glass by the cars. (There was no broken glass in the photograph.) Those who are asked about the cars that crashed are more likely to remember broken glass than those who are asked about the cars that collided.

Example #3: Show individuals a group of photographs of various people. Afterward, ask them questions about the photographs. When you ask how tall the basketball player was, the average answer will be several inches greater than when you ask others how short the basketball player was.

Can it be that we were both so simple then?

Most people are willing to admit that their memory is less than perfect. Some people go to great lengths to deny faults in their memory. Fiona Broome is an extreme example of this tendency. She was surprised to hear that Nelson Mandela was still alive in 2010 because she remembered his funeral taking place in the 1980s. When she asked other people about Nelson Mandela, a few others agreed that they thought he had died in the 1980s. Rather than confessing that their memories were wrong, Broome affirmed that they were remembering the death and burial of Nelson Mandela correctly—they had somehow traveled from an alternate reality to this reality while keeping a few memories of the former reality.

I am not making this up! Some people consider the Mandela Effect proof of alternate universes and of jumps between them. Billy Graham’s funeral is remembered by some people as having already taken place in past years. They have vivid memories of television and internet coverage of his funeral, and they insist that this funeral must have happened in an alternate reality. In addition to funerals, people remember variant spellings of certain names and different corporate logos that have never existed. There’s nothing wrong with their memory, they say—they are merely victims of the Mandela Effect.

I remember noticing twenty years ago that most people misspelled the Berenstain Bears. But some people insist that, when they were younger, the name was spelled Berenstein. They insist that their bologna’s second name is M-E-Y-E-R, whereas I thought the commercial of the little boy singing about his bologna’s name would help people remember that it’s M-A-Y-E-R. There has always been a “k” in Chick-fil-A, no matter what people say they remember. And the United States has never had more than fifty states, even though some people remember learning that there are fifty-two.

Apparently the Mandela Effect has also relocated New Zealand, which some people claim used to be west of Australia. I have had the same experience with Japan, which I picture being much closer to Taiwan than to Korea. But I don’t believe in an alternate universe where island nations are relocated.

As if the Mandela Effect were not already far-fetched, some people go a step further and insist that the Mandela Effect is the result of CERN’s experiments to understand subatomic particles. If I follow the argument correctly, the work with high-energy particles has either caused alternate worlds to begin existing or has increased unintended travel between alternate worlds. People really believe these theories, and somehow they also believe that the world of the Berenstein Bears and of a New Zealand west of Australia is a better world than the one we live in now.

At least I think I remember someone making that claim. J.

Early medieval Christian writers

Pseudo-Dionysius; John Scotus Eriugena; John Climacus: the names may be unfamiliar, but the writings of these men have shaped the course of Christianity from the earlier Middle Ages to the present.

Western civilization in general and Protestant Christianity in particular perpetuate an image of Europe’s Dark Ages—the Roman Empire fell, and until the Renaissance a thousand years later, Europe stagnated in a miasma of superstition and barbarism. This myth was encouraged by thinkers of the so-called Enlightenment (a label they chose for themselves); following the religious wars of the Reformation, Europe was allegedly ready to abandon the blind prejudices of religion and emerge into the light of science, reason, and humanistic philosophy. Because of this attitude, many of the treasures of the Middle Ages were buried in libraries and museums. Condemned with labels like “Gothic,” the advances of European civilization during these centuries were set aside as a bypath to oblivion, a barbarism from which the fragile flame of the Renaissance and the more robust furnace of the Enlightenment supposedly rescued western civilization.

Even the Great Books of the Western World series acknowledges only three writers from the Middle Ages—Chaucer, Aquinas, and Dante. All three are undeniably great, but they could anchor a new set of books that might be called Great Books of the Western Middle Ages. That set would also include Pseudo-Dionysius, John Scotus Eriugena, and John Climacus.

Pseudo-Dionysius is an anonymous writer of the fifth or sixth century who represented himself as the man named Dionysius who heard Paul preach in Athens and became a Christian (Acts 17:34). His surviving writings include “The Divine Names,” “The Mystical Theology,” “The Celestial Hierarchy,” and “The Ecclesiastical Hierarchy.” As these titles suggest, the writer organizes the known universe into levels of power and authority, reaching from the lowest forms of created being to the one Uncreated Being, God Himself. Pseudo-Dionysius is known for organizing the angels of heaven into nine levels—three sets of three—and also for describing the levels of church leadership that existed in his time and place. More important, Pseudo-Dionysius recommended humility in the believer who would approach God. The Lord of the universe is far beyond human understanding, and we know him only through what He has told us about himself in the Bible.

Pseudo-Dionysius wrote, “Let us hold on to the scriptural rule ‘not in the plausible words of human wisdom, but in demonstration of the power granted by the Holy Spirit’ (I Corinthians 2:4) to the scripture writers, a power by which, in a manner surpassing speech and knowledge, we reach a union superior to anything available to us by way of our own abilities or activities in the realm of discourse or of intellect. This is why we must not dare to resort to words or conceptions concerning that hidden divinity which transcends being, apart from what the sacred scriptures have divinely revealed. Since the unknowing of what is beyond being is something above and beyond speech, mind, or being itself, one should ascribe to it an understanding beyond being. Let us therefore look as far upward as the light of sacred scripture will allow, and, in our reverent awe of what is divine, let us be drawn together toward the divine splendor.”

John Scotus Eriugena was a theologian, philosopher, and scientist of the early ninth century who lived in the British Isles. He preserved and commented upon the writings of Pseudo-Dionysius, and also wrote a profound commentary on the Gospel according to John. As a scientist, Eriugena continued the tradition of ancient Greek and Roman science, bridging the time between ancient civilization and the scientists of the High Middle Ages such as Roger Bacon and Nicholas of Cusa. The work of Copernicus, Galileo, Newton, and their heirs would have been impossible without the contributions of men like Eriugena and Roger Bacon. Yet medieval European science was always grounded in the truth of God’s Word, finding meaning and purpose for all creation in the messages from God which communicate the thoughts he wants known by human beings.

Commenting on the opening verses of the Gospel according to John, Eriugena wrote, “When humanity abandoned God, the light of divine knowledge receded from the world. Since then, the eternal light reveals itself in a two-fold manner through Scripture and through creation. Divine knowledge may be renewed in us no other way, but through the letters of Scripture and the species of creature. Learn, therefore, to understand these divine modes of expression and to conceive their meanings in your soul, for therein you will know the Word.”

John Climacus was a monk who lived in a monastery near Mount Sinai at the beginning of the seventh century. His last name refers to his most famous writing, “The Ladder of Divine Ascent,” which describes the Christian life in terms of gaining virtues and dispelling vices. One of the virtues recommended by Climacus is apathy or dispassion, detachment from the things of this world. This may reflect a Buddhist influence upon Christian monasticism in west Asia, unsurprising in the centuries before the rise of Islam in that part of the world. John’s description of the ladder, based loosely on Jacob’s dream, was a deep influence on the writings of the Greek Orthodox and Russian Orthodox churches, lasting until the present. John was himself deeply influenced by the Desert Fathers, the early monks of Egypt and the surrounding area, extending back in time to Saint Anthony. Though John’s writings appear to tilt toward legalism, that is largely because he was more interested in prescribing rules for life in a monastery than in speaking of the grace of God and the unearned redemption that belongs to all Christians.

John wrote, “We should love the Lord as we do our friends. Many a time I have seen people bring grief to God, without being bothered about it, and I have seen these very same people resort to every device, plan, pressure, pleas from themselves and their friends, and every gift, simply to restore an old relationship upset by some minor grievance…. In this world, when an emperor summons us to obedience, we leave everything aside and answer the call at once without delays or hanging back or excuses. We had better be careful then not to refuse, through laziness or inertia, the call to heavenly life in the service of the King of kings, the Lord of lords, the God of gods…. Some people living carelessly in the world put a question to me: ‘How can we who are married and living among public cares aspire to the monastic life?’ I answered: ‘Do whatever good you may. Speak evil of no one. Rob no one. Tell no lie. Despise no one and carry no hate. Do not separate yourself from the church assemblies. Show compassion to the needy. Do not be a cause of scandal to anyone. Stay away from the bed of another, and be satisfied with what your own wives can provide you. If you do all this, you will not be far from the kingdom of heaven.’”

Far from being mired in any dark ages, these writers show themselves to be as intelligent and as relevant as any of our contemporary Christian authors. J.

Ash Wednesday is St. Valentine’s Day

This winter contains several odd conjunctions. January ended with a Super/Blue/Blood moon. February has no full moon, something which happens roughly every nineteen years. March will have a blue moon. And in the middle of February, St. Valentine’s Day will fall on Ash Wednesday, the first day of the season of Lent.

At least two, and possibly three, Christian martyrs named Valentine are remembered on February 14. Popular tradition associates one of them with messages about God’s love, but evidence of such letters does not exist. Probably the romantic aspect of St. Valentine’s Day reflects pre-Christian celebrations in Europe. Already in mid-February the new life of spring can be felt or anticipated. Birds gather to migrate north. Early flowers begin to sprout through the snow. Spring training camps open to get ready for the baseball season. No matter what the groundhog said on February 2, by the 14th the world is ready for spring.

From early times, Christians have used the last weeks of winter as a time to prepare for the observance of Good Friday and the celebration of Easter. The season of Lent consists of forty days plus six Sundays—each Sunday being a weekly reminder of the resurrection and so not counted among the forty days of Lent. Traditional churches treat Lent as a time of somber reflection and repentance. Christians remember that Jesus suffered and died on a cross to pay for our sins. Thinking about his sacrifice and our sins during Lent, traditional Christians change even Sunday worship. Praise songs are replaced with Lenten hymns. Flowers on the altar and other decorations are eliminated or reduced. Additional services are added to the schedule, often with a theme that prepares for the coming of Holy Week.

Many Christians choose to fast during Lent. They voluntarily surrender some usual pleasure during the forty days and six Sundays of Lent. Some give up candy. Some give up alcohol. Some give up video games. Fasting is not intended for self-improvement in a worldly sense, although giving up certain foods and beverages might have that effect. Fasting does not force God to provide blessings that he has withheld. Instead, fasting shows dedication to God. It provides evidence that God is more important than worldly pleasures. Fasting teaches self-control. When a Christian can say no to candy or video games for six-and-a-half weeks, that Christian is made stronger, able to say no to temptations to sin. Fasting also teaches compassion. When we go without luxuries, we understand how it feels to live without those luxuries because of poverty rather than choice.

The sinful world can take even the most noble customs of the church and pervert them into something twisted and strange. Plans to fast during Lent lead to a desire to use up the luxury before it is forbidden. What was once a simple matter of eating the last butter and eggs in the kitchen, or having one last piece of candy or one last martini, has become Mardi Gras and Carnival—riotous celebrations of worldliness that have more to do with darkness than with light. Perhaps those people who take part in Mardi Gras are more inclined to repent when they awaken on Ash Wednesday than their sober neighbors. All the same, a day and a season focused on repentance is not intended to encourage greater sin in advance, even if that does offer more reason to repent.

Setting aside the excesses of Mardi Gras, the odd conjunction of February 14 leads to a dilemma. Should one offer candy and other goodies to one’s family and one’s coworkers to honor St. Valentine’s Day, or should one consider the possibility that a person might be starting a fast on that day, choosing not to eat sweets until Easter? The Valentine treats should probably be shared earlier, to avoid the risk of undermining a time of fasting at its very beginning.

And, speaking of odd conjunctions, Easter Sunday this year will be observed on April Fools’ Day. J.

Fact-checking, and another “Who Said That?”

One of my previous job titles was “fact-checker.” Yes, I checked facts. Instead of doing my own writing, I read what other people had written, checked their sources, consulted other sources, corrected wrong information, and made sure the company would publish something that was accurate and reliable. The writers were paid five cents a word. I was paid an hourly rate. I probably earned more checking facts than the writers earned for their work.

When I was in college, one of the assigned texts that every student read was a small book, How to Lie with Statistics. I checked Amazon this morning and saw that the book is still available. Its only fault is that the examples all date from the 1950s. Aside from that, the book is wonderfully readable and extremely helpful. The title is, of course, a joke. The book does not teach the reader how to lie with statistics; it shows how other people lie with statistics and teaches the reader how to evaluate the data that others use to prove their points.

For example: on another blog last week a commenter asserted that 91 percent of scientists are atheists and 97 percent of biologists are atheists. I refrained from commenting (not wanting to enter the conversation), but I had many significant questions to ask. Who conducted the survey? How did they choose their respondents? How did they define atheism? The numbers quoted are so incredible (meaning unbelievable) that the survey is almost certainly skewed.

Perhaps they surveyed science professors in secular European colleges and universities. Perhaps the survey was conducted through a periodical whose readers are mostly secular scientists. Perhaps the survey was mailed in a package that Christians and Jews and Muslims would be likely to disregard. For that matter, perhaps the survey was designed to lump agnostics and deists into the category of atheists. If they intended from the start to demonstrate that most scientists do not believe in God, they had several ways to achieve that goal, and more than likely they used all of them.

A fact-checker like me easily becomes a curmudgeon (and when I place a post in the category “curmudgeon,” you can be sure I am not taking myself very seriously). Inaccurate information rankles me. A few years ago I was part of a trivia contest conducted as a fundraiser for a Christian camp. One of the first questions was, “Who wrote the poem that begins, ‘I think that I shall never see a poem lovely as a tree’?” The correct answer is Joyce Kilmer, but the judges of the contest insisted that the author was James Joyce. I shrugged off the mistake and continued in the competition that evening, but I have not returned to the annual event since that year. (This paragraph actually belongs in Saturday’s post, but it slipped my mind then as I was writing.)

Some of my crabbiness probably stems from being done with winter and ready for spring. Some comes from stress helping my daughter deal with a broken phone, a broken car, difficulty at her workplace, and the last semester of college. Some stems from hope and uncertainty about my own future. I appreciate the patience and support of all of you. J.

Who said that?

“Don’t believe everything you see on the Internet.” –Abraham Lincoln

At least I think Lincoln said it… I saw it on the Internet.

In a world of fake news and other misinformation, historical facts and quotes are as unreliable as any other information. Type the name of any famous person into Google, and among the results will be a page of quotes. But on most of those pages, the quotes will be listed without any source, and in some cases the creator of the site was misinformed. As a result, if you type a famous line into Google, you may see it attributed to any number of people. Google does not know everything—it links your search to other people’s assertions, and some of those assertions are wrong.

Often a profound or clever line from an obscure person is credited to a more famous person. In his book Funny People, Steve Allen provides a list of quips that he suggests were said by Groucho Marx. He then reveals that he, Steve Allen, was responsible for every one of them. He admits, though, that they sound funnier coming from Groucho. Many statements about liberty and the danger of government oppression have been attributed to George Washington, Thomas Jefferson, Benjamin Franklin, and other founding fathers of the United States. In more than a few cases, the lines were written and attributed to them long after they had died.

Ancient figures, including Socrates and Cicero, are sometimes quoted as deploring the state of their times, with young people failing to respect their elders, social morals on the decline, and a general lack of trustworthiness among the population. The point of the citation is to show that these problems have always been around, but, alas, the quote is a recent invention and was never said by Socrates, Cicero, or any other ancient sage.

I’ve approached people quoting Martin Luther, Soren Kierkegaard, and other religious people of note, asking them in which document they found their quote. They admitted that they didn’t know where it was written, but they thought so-and-so had written it. In the case of Kierkegaard, the speaker told an audience that Kierkegaard had described Christian worship as a performance in which we are the actors and God is the spectator. When I spoke privately with the speaker, he admitted that he didn’t know where Kierkegaard had written that line, and I suggested that the speaker was probably misinformed. Years later I came across something similar in one of Kierkegaard’s works, with an important difference: Kierkegaard was discussing, not Christian worship, but Christian good deeds, which is an entirely different matter altogether.

(“Which is an entirely different matter.”)

Quoting out of context is as bad as inventing or misattributing a quote. Graduation speeches and other motivational talks often refer to Robert Frost’s poem “The Road Not Taken.” They cite the last three lines of the poem: “Two roads diverged in a wood, and I—I took the one less traveled by. And that has made all the difference.” Such speeches suggest that Frost encourages us to be unique individuals, to dare to be different, to refuse to follow the crowd, and so on. Frost recommends no such thing in this poem. Given the title, the poem could easily be read as an expression of regret rather than a suggestion that we all should take the road less traveled.

Read carefully the following paragraph taken from a book written a little more than one hundred years ago. The book is called The Life of Reason, and the paragraph is found on pages 82 and 83; the entire book is nearly 500 pages long. The paragraph is part of a chapter called “Flux and Constancy in Human Nature,” the last of ten chapters in the section of the book entitled, “Reason in Common Sense.” Here is the paragraph:

“Progress, far from consisting in change, depends on retentiveness. When change is absolute there remains no being to improve and no direction is set for possible improvement; and when experience is not retained, as among savages, infancy is perpetual. Those who cannot remember the past are condemned to repeat it. In the first stage of life the mind is frivolous and easily distracted; it misses progress by failing in consecutiveness and persistence. This is the condition of children and barbarians, in whom instinct has learned nothing from experience. In a second stage men are docile to events, plastic to new habits and suggestions, yet able to graft them on original instincts, which they thus bring to fuller satisfaction. This is the plane of manhood and true progress. Last comes a stage when retentiveness is exhausted and all that happens is at once forgotten; a vain, because unpractical, repetition of the past takes the place of plasticity and fertile readaptation. In a moving world readaptation is the price of longevity. The hard shell, far from protecting the vital principle, condemns it to die down slowly and be gradually chilled; immortality in such a case must have been secured earlier, by giving birth to a generation plastic to the contemporary world and able to retain its lessons. Thus old age is as forgetful as youth, and more incorrigible; it displays the same inattentiveness to conditions; its memory becomes self-repeating and degenerates into an instinctive reaction, like a bird’s chirp.”

I cite this paragraph, not to agree with it—I particularly dislike the disparaging remarks about savages and barbarians—but to see if the third sentence jumped out at you as something familiar. Poor George Santayana, who is remembered for only one sentence (which yesterday I saw attributed to Edmund Burke—that is what started me down this road). Some years ago I heard the same sentence quoted by three different speakers at the dedication of a library building, and I doubt any of those speakers could have said who Santayana was, when and where he lived, or what he meant by that sentence. The sentence fits on a T-shirt or a bumper sticker; the rest of the paragraph does not, and hardly anyone would care to advertise Santayana’s larger sentiment that most people are either too young or too old to learn.

Or, as Julius Caesar once said, “There’s hardly any point in speaking, when people are going to remember it wrong as soon as tomorrow dawns.” J.

Candlemas (Groundhog Day)

Most people, whether believers or unbelievers, are familiar with the Christian holidays of Christmas and Easter. Far fewer are aware of the minor festivals of the Christian calendar, such as Candlemas, which is observed every year on the second day of February.

As Christians in the Roman Empire chose to celebrate the Nativity of Jesus (that is to say, his birthday) at the same time that Romans and Celts and Germans were celebrating various Yuletide observances, so Christians also chose to celebrate the Presentation of Jesus at the same time that Celts were observing a holiday they called Imbolc. This holiday falls halfway between the winter solstice near the end of December and the spring equinox near the end of March. In Ireland, some of the old customs of Imbolc have been blended into St. Brigid’s Day on February 1, but for most other European Christians and their descendants around the world, Candlemas has received the attention formerly given to Imbolc.

The second chapter of the Gospel according to Luke describes the birth and childhood of Jesus. The familiar account of the birth of Jesus in Bethlehem, including the announcement by the angel to shepherds and their visit, comes from Luke. Luke also wrote that Jesus was circumcised on the eighth day from his birth and was presented to God on the fortieth day from his birth. Celebrating the birth of Jesus on December 25 puts the anniversary of his circumcision on January 1 and his presentation on February 2.

What is the significance of the presentation of Jesus? As at his circumcision, Jesus was fulfilling the Law for the benefit of all his people. The Law of God, given through Moses, required every firstborn son to be offered to God and purchased from God with a sacrifice. This presentation and purchase of the firstborn son reminded God’s people of the tenth plague upon Egypt, when God’s angel killed the firstborn son of every family in Egypt except for those who obeyed God, marking their houses with the blood of a lamb. The details of the plague, the Passover, and the remembrance are filled with images of Jesus and his sacrifice—the death of a firstborn son picturing the death on the cross of God’s only-begotten Son, the substitution of a lamb for some sons (and the use of the lamb’s blood to identify those who were protected) showing Jesus as the Lamb of God taking the place of sinners, and the purchase of the firstborn son in following generations showing the price Jesus paid on the cross to cover the debt of sinners. Because Jesus, on the fortieth day from his birth, was already obeying the commands of God, Christians are credited with his righteousness. We are free to approach the throne of God and even to call him our Father. Jesus took our place in this sinful world so we can take his place in God’s Kingdom.

Bonfires were lit in Europe on Imbolc night as part of the celebration of the holiday. Christian churches chose to replace the bonfires with many candles, filling the church with light to remember Jesus, the Light of the world. From that custom comes the name, Candlemas. I first encountered that name in the stories of King Arthur, for he and his knights would gather on Candlemas, as they did on Christmas and Easter, to celebrate and to await the beginning of new adventures. The king would not allow his court to eat the feast until some odd event had taken place, sending at least one knight off on a mission to rescue some victim or defeat some enemy.

Before the establishment of the National Weather Service or the invention of Doppler Radar, European Christians often trusted traditions about the holidays to make long-term forecasts of the coming weather. St. Swithin’s Day (July 15) in the British Isles was thought to set the pattern for the next forty days—either it would remain dry for forty days or it would rain for forty days, depending upon whether or not it rained that day. In Hungary the weather on St. Martin’s Day (November 11) predicted the kind of winter that was coming: “If St. Martin arrives on a white horse, it will be a mild winter—if he arrives on a brown horse, it will be a cold and snowy winter.” In other words, snow on November 11 promised a mild winter. So also, the weather on Candlemas was thought to predict the next forty days of weather: a clear and sunny Candlemas meant winter was only half over, but a cloud-filled sky on Candlemas morning meant that winter was over and spring was about to begin.

In Germany bears often took a break from hibernation around the beginning of February to check out conditions and get a bite to eat. The weather tradition for Candlemas became associated with the emergence of the bear and the question of whether it cast a shadow. German settlers in North America adapted the tradition to local wildlife, and thus began the tradition of Groundhog Day.

Ironically, more Americans are aware of Groundhog Day than of Candlemas. The fame of Groundhog Day increased in 1993 with the release of the movie Groundhog Day starring Bill Murray. The movie has little connection to Christian beliefs. It is more suited to explaining the idea of samsara, found in Hindu and Buddhist beliefs. Samsara is the cycle of lifetimes in which one’s atman (roughly analogous to spirit or soul, but not exactly the same thing) keeps returning to this world until it has learned all it needs to know and is fully enlightened.

On Groundhog Day I check for shadows as I bring in the morning paper. This year, I will also remember to light a candle or two and celebrate the feast of Candlemas. J.

(Reposted from February 2, 2016)

Book report

I recently finished reading a science fiction novel; portions of it contained black comedy of a sort. In the plot, the United States has just emerged from a horrible and destructive war. The survivors of the war decide to find a new use for the technology that was developed to fight the war. After brief consideration, they decide to use this new technology to explore outer space.

Of course, if this novel had been written any time after 1960, the plot would be a retelling of current events. Rocket technology was developed by the Germans during World War II to bombard the United Kingdom. At the end of the war, Soviet forces and American forces both sought to capture the German scientists who had developed those rockets. At first the technology was improved only to prepare for another war, as the Cold War was intensifying. By the 1960s, though, both sides were seeing nonmilitary advantages to their respective space programs. In particular, the United States chose the challenge of bringing a man to the moon and returning him to the earth, aiming to achieve that goal before the 1960s ended. In July 1969, Neil Armstrong, Buzz Aldrin, and Michael Collins made that historic journey, lifting off in their rocket from the Florida coast and traveling all the way to the moon. Armstrong and Aldrin both walked on the moon, conducted scientific experiments, and commemorated their achievement. They even spoke with President Nixon, who joked about the longest long-distance phone call in history.

The novel I read, From the Earth to the Moon, was written and published by Jules Verne in 1865. The war in question was the Civil War, and the technology he described was an enormous and powerful cannon. The Baltimore Gun Club resolves to fire a giant cannon ball at the moon. As plans are made for the cannon and cannonball, a French poet volunteers to be a passenger inside the missile. In the end, three men encase themselves in the cannonball, which is gently lowered into a specially built cannon, located on the Florida coast, and the three of them are shot to the moon.

Jules Verne was one of science fiction’s earliest authors. He liked to write travel novels. (His best is Around the World in Eighty Days.) When describing voyages that had never been attempted, such as one to the moon, he carefully worked out just how the trip could be done, down to the smallest details. He had no conception of liquid-fueled rockets like those that would be used by Soviet and American explorers. Verne’s giant cannon and cannonball would not have worked. In many other aspects of his story, though, Verne captured a historic event and described it well… one hundred years before it took place. J.