
Christian Nationalism

A Dishonor to Christianity

Photo by Cas Holmes on Unsplash

There seems to be a whole lot of talk about Christian Nationalism these days. You know, the idea that the United States should be declared a Christian nation, with Christian symbolism displayed everywhere and Christian values taught in schools, Creation included; that we should ‘take back the country for God’.

I find it all very ignorant, arrogant, and self-serving. Ignorant because, although most if not all of our ‘founding fathers’ were Christian, the United States was the first country to deliberately decline to proclaim a state religion. Why? Because they founded the country on the basis of religious freedom, among other things. Did you know that the words “under God” were not added to the Pledge of Allegiance until 1954? “In God we trust” wasn’t seen on coins until the Civil War era. So, to say that the country was founded on Christianity is ignorant.

Arrogance comes into play because these folks think their religion is better than all others, in fact the only true religion. They’re certainly entitled to that belief (thanks to the freedom of religion guaranteed by the Constitution), but believing it makes them so superior to everyone else that others should be forced to follow their religious beliefs (contrary to the Constitution) is not only arrogant but absurd. I mean, if you want to believe that only…

Written by Jodie Helm www.asktheangels222.com

4X Top Writer, Archangel channel, Reiki Master, Bridge. I share the messages I receive from my guides here. My only religion is Love. asktheangels222@gmail.com
