really only since WWII
I’m a tremendous supporter of the US-UK alliance, but people are myopic about history (or simply don’t know it). The US and UK were never ‘allies’ until late in WWI, and even then it was an awkward and uneasy partnership, which dissolved back into US neutrality in the 1920s and 30s. Only with Pearl Harbor and the US entrance into WWII did the warm, close alliance that we all assume today really begin to develop.
Throughout the 19th century the US and Brits were fairly unfriendly competitors, and the Brits considered supporting the Confederacy in our Civil War. Certainly the USA never supported any of their wars of empire. We mostly just ignored them when we weren’t competing on trade.
America and Britain are bound by language and culture, the common-law system, the idea of constitutional guarantees and rights, and (to a lesser extent these days) common religious views.
It may seem a bit of an exaggeration to describe 1775, 1812, and the friction with Britain during the Civil War of 1861-1865 as family quarrels, but compared to the fundamental things that bind us together, it's not far off.