Agree, but apparently Palantir takes it to another level. It is directing actions to be taken, or the resolution of conflicts. So apparently, we are now getting into a huge "ethics and morals" dilemma unlike before. It's one thing to predict, another to take "action" on something that has not occurred. Very futuristic for us.
Are you thinking Palantir is AI?
If Palantir were seeing anything correctly, it would have foreseen HRC's loss, if it's an aggregate behavior predictor. That was the MOST IMPORTANT factor in swamp intel's future, even just to pay the rent.