My boss went to England in April (a business/pleasure trip). I asked how she enjoyed her trip. Her answer was "Never, never, never, never again. I hate the British people!" As more and more Americans come home from Europe after bad experiences, they are going to stop going. I wouldn't walk across the street to visit people who hate me; why would I cross an ocean? The only view most of us get of Europe is the never-ending criticism and disdain. It's taking a toll. Sad.
Sorry about that; it wasn't London, was it?