The show is really about the anarchy brought on by a zombie apocalypse. Everyone's fighting for food, shelter, and domination.
I know, and zombies were a niche thing a couple of decades ago.
Why is WALKING DEAD the highest rated show on TV?
It's a reflection of the culture we live in.