Are Belief in God and Christianity Really Dying in America?

What’s really happening to religion in America? Plainly stated: it’s complicated.

Perhaps the title of the latest Pew Research Center report — “U.S. Public Becoming Less Religious” — provides the most concise overview, though there’s some debate over what, exactly, is going on beneath the numbers when it comes to religious adherence and practice.

This was the second of two extensive religion reports Pew released this year; its data offers a snapshot of the beliefs and practices of the American populace. The first report, “America’s Changing Religious Landscape,” was released in May and focused mainly on overarching demographic changes.

The takeaway from both reports was that the American populace is becoming less religiously devout, but answering the “how” and “why” gets dicier, as pastors, faith leaders, and sociologists all have theories about what’s really happening, culturally speaking.

The United States remains a majority-Christian country, with 70.6 percent of the population falling under the Christian umbrella in 2014. That is a decrease of nearly eight percentage points from 2007, when the study found that 78.4 percent of the nation embraced Christianity.