I guess many users here are from the United States.
I'm from Germany, and a few days ago some friends and I were talking about America's view of Germany. One of them said that when she was in America, people asked her whether she liked the way Hitler governs the country.
I hope I don't have to explain that Germany is a democratic country and that Hitler died 70 years ago.
My question: what does America actually think about Germany? Are the prejudices true, and do Americans really believe Hitler is still alive, or is it something else?
Holy **** what the hell!? You should slap anybody who thinks Hitler still runs Germany, teach them German history, then beat them over the head with the textbook.
I have no problem with Germany, but you don't seem to have been doing much lately, or maybe I just haven't heard about it.