Majority Of White Evangelicals Believe You Have To Be Christian To Be Truly American.
So 57% of white evangelicals say being a Christian is very important for being truly American. Christianity is not exclusive to white people, nor is it an exclusively American religion. The United States is made up of many different races and religions, and there is nothing wrong with that. While being a Christian may be very important to many Americans, people must understand that you can be an American without being a Christian. After all, an American is defined as a native or inhabitant of the United States, or someone born to at least one American parent.