Ten Political Truths in the USA
I believe these are things that should matter to everyone.
This is not my usual kind of post, but I cannot remain silent. I feel the need to share what I believe are truths that ought to be taken into account, especially given current events.
First — The United States is NOT a Christian nation. Period. Every single religion in the world is accepted and practiced in this nation.