On Facebook the other day, I noticed that someone posted this question: "Should America be considered a Christian nation?" That question got me thinking: why can't Christianity be recognized as the predominant religion in America? Other countries refer to themselves by their religion. For example: the Jewish state of Israel...the Muslim countries of Saudi Arabia, Iran, and Iraq...the Hindu nation of India.
I'm in no way saying that a belief in a Christian America means others can't follow the faith of their choice...It just means that the underlying foundations of our country were said to be built on Christian principles...So why can't Christianity be the predominant religion in America? If Christian culture and teachings are such a heavy influence here, then why can't America be known as a Christian nation?