I’m sure all the thermometers were perfectly accurate to a fraction of a degree 125 years ago.
I do not believe it has anything to do with the accuracy or calibration of thermometers. Rather, according to a college friend (a geologist), the organizations that record and report on weather/climate are using a different method to determine temperatures. With regard to heat, the previous method was to measure the temperature of the air at a particular height above ground (no idea what that height was, but perhaps 5', 6', 7', etc. above ground), but now they measure temperatures at ground level, or no more than 12"-24" above ground, which gives a much higher reading.
Consider a piece of metal that has been sitting in direct sunlight for a few hours: put your hand on it and you will get one hell of a burn. Hold a thermometer 1" above that same piece of metal and it will record a much higher temperature than if you held it 24" above it. The ground absorbs energy/heat from the sun and then radiates it back into the air, so the closer you are to the hot surface, the higher the reading. They do this to demonstrate or prove that the climate is heating up and thus justify the claim and con game of "Climate Change".
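To make the height argument concrete, here is a purely illustrative toy model (all numbers are made up for the sake of the example, not real station data or a real measurement standard): assume the air temperature falls off exponentially from a hot surface toward the ambient free-air temperature as the sensor is raised.

```python
import math

# Hypothetical values for illustration only:
SURFACE_TEMP_F = 140.0   # assumed sun-heated surface temperature
AMBIENT_TEMP_F = 90.0    # assumed free-air temperature well above ground
DECAY_SCALE_IN = 12.0    # assumed e-folding scale of the gradient, in inches

def reading_at_height(height_in: float) -> float:
    """Toy estimate of what a thermometer would read at a given height
    above the surface, under the exponential-decay assumption."""
    excess = SURFACE_TEMP_F - AMBIENT_TEMP_F
    return AMBIENT_TEMP_F + excess * math.exp(-height_in / DECAY_SCALE_IN)

for h in (1, 12, 24, 60):
    print(f'{h:>3}" above surface: {reading_at_height(h):.1f} F')
```

Under these assumed numbers the 1" reading comes out well above the 24" reading, which is the effect being described: the same air mass yields different numbers depending on sensor height.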