Just a quick history of who the Puritans were and what they did, to start out with. Essentially, Puritans were members of the extreme religious movement of Puritanism who broke away from the Church of England because they believed it had become too elaborate. They believed that church and religion shouldn't be about fancy buildings, but about the actual teachings of God. They also felt that positions in the Church's hierarchy were often abused, and that people only joined for the power that came with them. As a result of these beliefs, they moved to America to start a new colony where they could reform the Church of England. In the end, however, they broke away entirely.
So that sounds fine and great, but when you dig a bit deeper into the history of what they did, you might not like them so much anymore.
First off, their hypocrisy. Isn't it ironic that the strictest, most close-minded form of Christianity was trying to gain religious freedom while denying that same right to everyone else? They believed that their way was the only way, and that anyone who thought differently could burn in Hell. Not only that, but if a woman was found guilty of immodest dress, she would be stripped to the waist, tied to a cart, and whipped until her back was bloody. So let's get this straight: she was found guilty of "immodest dress," and the appropriate punishment was to strip her to the waist.
Secondly, there was absolutely no separation between Church and State whatsoever. All forms of entertainment were illegal (as they would distract one's thoughts from God), and if you spoke against the religion, you ran the risk of having a scorching awl driven through your tongue, your ear cut off, or a letter branded onto your forehead symbolizing the crime you had committed.
Lastly, the Salem witch trials. Must I say more?
I find it ridiculous that, despite all this, historians still speak of the Puritans as if they were heroes. And you know, the Pilgrims weren't much better. England even took over after a while because of how out of hand these religion-based colonies were getting.
I could rant all day about colonial America, mercantilism, Puritanism, etc., but I want your views on the subject.
They encountered Natives, which was a culture shock they were completely unprepared for.
The English knew that the area they were going to was inhabited by Natives.
The country was founded by religious people, so it was only fitting that the country would be religious in the same way.
The English originally came to America because they were hoping it would be full of gold. Basically, it was a come, get the gold, and leave type of thing. After they realized there wasn't any gold, though, they decided to stay and see what else there was. Then, eventually, the religious people seeking freedom started to come. America was founded on greed.
And this is totally random, but they established public schools.
And Harvard University. Quite ironic, really. However, back then all the education was strictly religious, so it wasn't exactly like it is now.