This is not my usual post, but I cannot remain silent. I feel the need to share what I believe are truths that need to be taken into account, especially given current events.
First — The United States is NOT a Christian nation. Period. Every single religion in the world is accepted and practiced in this nation.