Majority Of White Evangelicals Believe You Have To Be Christian To Be Truly American.

Readers, 57% of white evangelicals say being a Christian is very important for being truly American. Christianity is not exclusive to white people, nor is it an American religion. We are made up of a whole bunch of different races and religions, and there is nothing wrong with that. While being a Christian may be very important to many Americans, people must understand that you can be an American and not be a Christian. After all, an American is defined as an inhabitant or native of the United States, or someone born to at least one American parent.
