No. In fact, the best evidence that Christianity has the best hold on truth is the quality of life that the Christian West has produced. It is unrivalled in the known universe.
Creation is the main function of God as people generally understand Him. However, even the Christian West can't seem to agree on the best approach to understanding His creation.
The populations of humans who were stomped on by Western imperialism probably wouldn't share your rosy view of Western moral superiority. "The Christian West" is growing up and becoming the post-Christian West, and we will all be the better for it.