I was in Britain last week and this subject came up. The person I was talking to said that the white farmers stole the land from the blacks in the first place, and he appeared to think there was a certain justice in what was happening over there. I don't know much about the history of the place, but it seemed to me to be on the same level as the argument that the Europeans stole America from the Indians, or that the U.S. stole California from Mexico. (Have I just opened a can of worms here?) Whatever the case, I hardly think that what happened in the past justifies what is happening there now.
I'm going to do a little research on the subject, but for now, can anyone here give a brief synopsis of whether the Brit's remarks are true?
Many thanks.
Your analogy is correct; it's the same as in other places where ownership changed hands through a combination of conquest and barter.
I would have loved to have been there to tell this Saxon fraud to go back to Germany. After all, HIS people stole Britain from the Celts. So he should pack his 22 kg suitcase and start walking the Chunnel to the Continent.
Let's face it... everybody stole something from everybody else. Various Indian tribes in the U.S. would hunt down other tribes, kill them, and then claim the region as their hunting ground. The same was true in Africa in the 1700s, with various tribes killing off other tribes. It's the way of man. As for the white farmers... it's true that they came and simply took the land. But they also developed the entire business of farming in the region and earned the country tens of millions in the 1970s and 1980s. Now it's gone, and it will not come back.