The Walking Dead: Do Zombies Go Forever?
By Bert Ehrmann
Check out The Dangerous Universe Website!
Fort Wayne Reader
I honestly can't remember a time when there have been as many interesting new sci-fi/horror TV series airing as there are now. Last summer, the first season of the alien invasion drama Falling Skies debuted on TNT. A few weeks back, the man-vs.-dinosaurs time travel epic Terra Nova premiered on FOX. And lately even MTV has gotten in on the fun with Death Valley, a comedy/drama pitting vampires, werewolves and zombies against cops.
But the absolute “king” of the modern sci-fi/horror genre series has to be The Walking Dead, the second season of which recently began airing on AMC.
The Walking Dead, based on the comic series of the same name, follows a group of survivors of a zombie apocalypse who must face a changed world where governments have fallen and the only thing that matters is staying one step ahead of the undead for a while longer. The series begins with police officer Rick Grimes (Andrew Lincoln), shot during an arrest, who awakens several weeks later in a hospital after zombies have stormed through and ruined civilization.
Rick, not totally grasping this new reality, takes a hellish tour of the South along with a group of survivors including his family, all of whom are unsure what to do next.
When the first season of The Walking Dead premiered last fall, it was a bit of a shock to me. Until then, I hadn't seen anything as intense as that series on AMC or any other cable channel, basic or premium. In The Walking Dead, there is gore, terror and shock in just about every episode. Sometimes important characters die and return as members of the undead, and one never quite feels that the cast is safe, as it seems at any moment a zombie might stumble into view to cause mayhem and horror.
At the same time, though, The Walking Dead can be oddly uplifting with characters banding together, protecting each other and carving out niches of civilization within a ruined America.
All of which is exactly what makes The Walking Dead so great; I'm not sure another series focusing on the same subject matter would be willing to go to the places, story-wise, and take the chances that the creators of The Walking Dead have taken.
Unfortunately, though, everything I've described above, in my opinion anyway, will eventually lead directly to the demise of the series. (Spoiler alert.) So far, The Walking Dead has shared many of the same story points/elements as the comic book series, and it appears as if these links will continue as the series progresses.
As the comic series developed, characters were killed, and new ones were introduced who sometimes died too. Rick's band continued their trek away from the zombies, and though they did find some respite now and then, for the most part the group was constantly under threat from the zombies, from other bands of survivors, and sometimes from within.
And all this never stops.
Over the last eight years of the comics, help has never come to the survivors, and the group has never found a base of permanent safety from the undead. All of which became a bit too much for me to handle, so I gave up on the comic series a few years ago. I'm not criticizing this style of story, as it has produced volumes of great and important work. It's just that after a while it all becomes overwhelming and depressing.
Which is why I'm worried about the TV series going down this same path. How many seasons/years will an audience stick with a show that's essentially a chronicle of very bad things happening to good people?
Still, for me at least, that point is hopefully several years away. So, until the time comes that The Walking Dead becomes too much for me to bear, I'll be there watching new episodes Sunday nights at 10 on AMC.
The first season of The Walking Dead is available via digital download, DVD and Blu-ray. Visit me online at AlphaEcho.com.