I've been thinking about this a lot lately, and I'd like to hear what you all think. Is America a Christian nation founded on Christian ideals, or is it a secular republic that happens to have Christians in it? Please back up your thoughts with evidence where you can. I also have a fair amount of evidence for both sides on hand to help move the discussion along if needed.