Was there some reason why Britain had more "right" to have an Empire than did Germany?
Granted the Kaiser was not a lovable character, and the maladroit Zimmerman Note was no help. But WWI in the best interests of the US? No way.
Certainly the Zimmerman Note, passed to the German Embassy over cables routed through the US, was not worth a war by itself.
The US had a great deal of trade at the time, most of it carried by ship. Germany's submarine attacks, in violation of the Hague convention rules, threatened that trade. Germany had agreed to the Hague convention, and then, after the Lusitania, agreed again to an interpretation of the Hague Convention that would forbid their use of U-boats.
And then, after making such agreements with the US, they violated them.
WWI was not about the German empire, except that empire was one way to attack Germany without shedding oceans of blood in France. I don’t wonder that Britain tried that approach first.
Germany was treaty-bound to honor the neutrality of Belgium. Its failure to do so brought in Britain.
Rule of thumb: breaking the rules of civilized warfare gives a temporary advantage, but it also gives you more enemies.
Completely off-topic, I suppose, but I'll bite anyway:
For the US, WWI was not about who had more "right" to Empire.
In those days, everyone had empires, even the US.
The key fact to remember about WWI is that US investors had sunk $billions into Britain & France, and didn't want to see that money lost.
And with the impending collapse of Russia, Germany looked like the winner, unless the US intervened.
So the US did intervene; Germany was defeated, and US prosperity was protected, at least for another ten years.
What happened next is a different subject, but my point is, without US intervention in WWI, our Great Depression would have begun in 1919.