Saturday, November 12, 2011

The Walking Dead Examines Faith

As most anyone who follows the series "The Walking Dead" knows, the title refers not to the zombies but to the survivors. The show's intent is to explore how people respond in a world where civilization is stripped away and animal instincts move to the fore. This article discusses religion in The Walking Dead based on events in the last couple of episodes.
