Division, political strife, and infighting have left many Americans disillusioned with Christianity and the Church. A recent Gallup poll found that, for the first time in roughly 80 years of polling, more than 50% of adults in the United States do not belong to any church or house of worship. The number is even higher among millennials.
American Christianity these days seems to be very good at promoting anything but the actual Gospel of Christ Jesus: namely, that Christ died for your sin and rose for your justification. It can feel like this anti-Gospel infection is only getting worse as more and more people walk away from their faith.
America is experiencing a crisis of hope. People are frantically looking for something certain to put their hope in as careers, political figures, and even physical health prove to be fragile.
And too often, Americans are not receiving hope and certainty from the church.
Instead of preaching the promises of God to their congregants, many Christian leaders are trading in the Gospel for an unbearable checklist of ways to earn God’s favor (i.e., ever more ‘to-dos’).
Do you believe that, to preserve the church in America and faith in God, it is essential for people to know and proclaim the actual Gospel, namely that the death of Christ saves sinners?
Are churches and religious organizations in your state teaching the center of the Christian Scriptures, Christ Jesus crucified and risen for sinners, or are they majoring in the minors?