Well, we all know the Twilight saga has raked in massive profits, and it's like Harry Potter all over again. A huge number of people love it and almost as many hate it (research indicates the top reason is the fans). And quite a few copycat series have sprung up out of nowhere in its wake.
But I find the books (and movies) have changed everyone's opinion of one thing: vampires. Fans of the books now see the 'Cold One' as appealing and sexy, while haters... drop dead.
I'm just curious how people feel about this. Do you like the fact that vampires have been changed so dramatically? If so, why?
It made me appreciate vampires who are actually dangerous, who actually kill people, vampires we're scared of, the way it's supposed to be, instead of whiny teen vampires.
It made me appreciate Dracula a lot more; I never really thought much of him before he was "threatened" by the Twilight crew. Those losers aren't vampires! They're just cannibals who happen to be practically invincible.
Remember the days when vampires sucked blood and not pe-nis?
Good days...
I usually don't agree with you, samy, but dammit, you're right.
Vampires are not glitter-covered, perfect-in-every-way, thin little white emo kids. They are evil night-walkers who drink human blood and have a phobia of garlic and holy water.