Thailand has never been colonized. Western influence, however, is a possible explanation.
More probable than possible. We Americans have many fine qualities, but the crude aspects of our modern popular culture that other peoples pick up are not among them.