Well, The Hunger Games taught children about children hunting children.
True, but at least it didn’t glorify it. It portrayed the Games as a harrowing, terrible ordeal, not something to be sought out or justified.