With the exception of characterizing Britain as a mature nation, I respectfully disagree.
The sun did not truly begin to set on the British Empire until after the turn of the century. I would probably be safe in claiming the decline came part and parcel with WWI. Britain declared war on Germany in the name of the Empire, but the individual dominions signed the Treaty of Versailles under their own signatures and joined the League of Nations as independent states.
All through the 19th century the empire kept expanding. It was pretty much a flower in full bloom for a very long time.