Can you imagine the outcry if Spain took over North America instead of those benevolent and gentle English Protestants?
Spain did take over part of North America before the English. It was called New Spain (Spanish Territory), and it covered what is now the American Southwest: Texas, New Mexico, and Arizona. Spain also colonized California, parts of Florida, Colorado, and several other Southern states, and Spain was not benevolent at all...