“The Owners” is supposed to be a negative term in America?
Hell, that’s what this country promised, and what everyone who’s come here has dreamed of being. The promise of America is that you too can own: a house, your own business, property, etc.
Just because that outcome isn’t guaranteed, or underwritten by those who succeed at it, doesn’t mean someone stole it from someone else. Plenty of our ancestors (mostly white European ones) weren’t that successful at it, but they didn’t bitch that the government wasn’t taking enough from those who made it. (Well, the lefties always did, but I’m not counting them.)
Well, I think the problem is that a significant part of our population (and the media) is at war with the American Dream. They don't like it. They don't want you to have it. They oppose Christianity. They curse George Washington. They spit on the Constitution. And they want to brainwash your children.
There's a line in an old Nazi play, Hanns Johst's Schlageter: "When I hear the word 'culture,' I release the safety on my Browning."
Well, the concept of 'ownership' makes these people think about shooting you and your family.