Is the West Becoming Pagan Again?

This year, at the height of what used to be called the Christmas season, a Pew Research Center poll on religion revealed that only slightly more Americans described themselves as Roman Catholics (21 percent) than as believers in “nothing in particular” (20 percent). The millennial generation, which includes most adult Americans under 40, is the first one in which Christians are a minority.

Many Americans have a sense that their country is less religious than it used to be. But is it really? The interplay among institutions, behaviors and beliefs is notoriously hard to chart. Even if we could determine that religious sentiment was in flux, it would be hard to say whether we were talking about this year’s fad or this century’s trend.

Or perhaps we are dealing with an even deeper process. That is the argument of a much-discussed book published in Paris this fall. In it, the French political theorist Chantal Delsol contends that we are living through the end of Christian civilization — a civilization that began (roughly) with the Roman rout of pagan holdouts in the late fourth century and ended (roughly) with Pope John XXIII’s embrace of religious pluralism and the West’s legalization of abortion.

The book is called “La Fin de la Chrétienté,” which might be translated as “The End of the Christian World.” Ms. Delsol is quite clear that what is ending is not the Christian faith, with its rites and dogmas, but only Christian culture — the way Christian societies are governed and the art, philosophy and lore that have arisen under Christianity’s influence.

That is still quite a lot. In the West, Christian society is the source of our cultural norms and moral proscriptions, not to mention the territory of our present-day culture wars, with their strident arguments over pronouns and statues and gay bridegrooms and pedophile priests.