For its Easter Holy Week edition, Newsweek ran a cover story entitled "The End of Christian America," which concluded that America is still a Christian nation, just not as strongly or as uniformly as it once was. Of course, many Americans assert that the United States is a Christian nation, and many foreigners view it that way as well. In fact, sixty-two percent of Americans still believe the United States is one. However, the United States, sadly, is not, never has been, and probably never will be a Christian nation.
Many conservative Christians assert that America was founded as a Christian nation. This is patently false. Christianity was important both socially and politically in early America, particularly in the Northeast. However, almost all of the founding fathers (and their wives) were deists. Some (most notably Thomas Paine and Thomas Jefferson) had some very nasty things to say about organized Christianity and actively fought against the fusion of polity and religion. Jefferson edited the Declaration of Independence to take out Christian language. Even the founders who were more orthodox in their views (such as John Jay, the first Supreme Court Chief Justice) practiced a Calvinist brand of Christianity that is very different from most forms practiced today.
More important than how the country was founded is how it has functioned. For the first several decades of its existence, the United States' economy was powered by the enslavement of an entire race of people, especially after Eli Whitney invented the cotton gin. The United States government gained millions of acres of land by violently stealing it from Native Americans, murdering and displacing thousands in the process. The Cherokee "Trail of Tears" alone resulted in the starvation and death of over 4,000 Native Americans. The "peculiar institution" of slavery, as well as the United States government's active policy toward Native Americans, sharply contrasted with Christian morals.
America has also often turned to aggression and violence, wielding its military as a weapon of destruction. Sherman's March to the Sea is one easy example. The U.S. also employed scorched-earth tactics against Native Americans and in the Philippine-American War, the Spanish-American War, and the Vietnam War. The United States built concentration camps during the Philippine-American War, where the form of torture known as the "water cure" was popular. (The water cure forces a person to drink great amounts of water, or another liquid such as urine, until the stomach nearly bursts and/or the person vomits, often repeatedly and sometimes fatally. Teddy Roosevelt acknowledged its widespread use during the war.) The Civil War certainly doesn't seem like a very Christian way to solve a dispute. America has used violence and sacrificed many lives to satisfy imperialistic goals (see the Mexican-American War, the Philippine-American War, the Spanish-American War, and any of the quasi-wars between the U.S. government and Native Americans). America certainly has often exploited its military power. More recently, the scandal at Abu Ghraib, Guantanamo Bay, and waterboarding have been less than Christian.
Discrimination has also been a hallmark of American culture. America is still trying to reconcile its past treatment of blacks; segregation and disenfranchisement are well documented and needn't be discussed here. Other acute forms of discrimination, such as the internment of Japanese-Americans after Pearl Harbor or the Chinese Exclusion Act, are talked about less, but they occurred.
American culture certainly isn't Christian, and examples of un-Christian behavior abound: America's lack of engagement in several genocides; its indifference toward poverty; the abortion of forty-five million babies since 1973; the culture of extreme materialism and consumerism; the pollution and destruction of the environment; the high divorce and infidelity rates; the obesity epidemic (I believe obesity is a "sin" for several reasons); America's high rate of debt; even the popularization of rap music, which generally espouses a worldview that is the polar opposite of Christianity.
America has never been a Christian nation. There never was a "Christian America". America has done many, many, many, MANY positive, good things. It is generous. The government doesn't actively try to hurt or exploit its people. The United States has often promoted peace and freedom. It does often try to help the underprivileged in its own country and around the world. The very fact that it debates and considers issues such as waterboarding or torture means it is not evil. I have emphasized the negative things, the dark moments, in American history and culture to illustrate its failure to live up to Christian ideals. Christianity and Judeo-Christian values have played a very important and special role in America. As a Christian, I think this is a good thing. America is changing religiously, in both positive and negative ways. I believe Christian leaders should focus not so much on how many Christians there are in America, but instead on how those who are Christian display the message of Jesus.
Notes:
For more information on the religion(s) of the founding fathers, I would recommend Faiths of the Founding Fathers.
Most of the founding fathers were deists (or at least leaned toward deism) because they studied at or had connections to the College of William and Mary.
(For those interested, and I'm sure someone will be, Jon Stewart graduated from William and Mary in 1984.)
Along with most of the founding fathers, many historians believe that Abraham Lincoln was also a deist.
While a few historians still debate it, most do consider George Washington a deist.
1 comment:
I think you bring up some good points. There is a lot of equivocation with the term "Christian America" and the many meanings it has.