How Will The Shocking Decline Of Christianity In America Affect The Future Of This Nation?

EndOfTheAmericanDream.com
Is Christianity in decline in America? When you examine the cold, hard numbers, it is simply not possible to come to any other conclusion. Over the past few decades, the percentage of Christians in America has been steadily declining, and this has especially been true among young people. As you will see later in this article, there has been a mass exodus of teens and young adults out of U.S. churches. In addition, what “Christianity” means to American Christians today is often far different from what it meant to their parents and grandparents. Millions upon millions of Christians in the United States simply no longer believe many of the fundamental principles of the Christian faith. Without a doubt, America is becoming a less “Christian” nation, and this has staggering implications for the future of the country.

The United States was founded primarily by Christians who were seeking to escape religious persecution. For those early settlers, the Christian faith was the very center of their lives, and it deeply shaped the laws they made and the governmental structures they established. So what is the future of America going to look like if we totally reject the principles that this nation was founded on?
Overall, Christianity is still the largest religion in the world by far. According to the Pew Forum on Religion & Public Life, there are currently 2.2 billion Christians in the world. So Christianity is not in danger of disappearing any time soon. In fact, in some areas of the globe it is experiencing absolutely explosive growth.
But in the United States, things are different. Churches are shrinking, skepticism is growing and apathy about spiritual matters seems to be at an all-time high.