That's unlike my studying trends of temperatures in my area (downloaded in CSV format from https://www.ncdc.noaa.gov/cdo-web/search). Every day there's a high temperature and a low temperature, so every year I have 365 highs and 365 lows. Looking across 10 years, I could determine whether it was worth going solar in my area (not for the grid, that's dumb, but for me to create enough homemade power to reduce my power bill enough that I don't worry about the Dims' warmageddon cult energy policies draining my retirement finances just to keep the house cool and warm).
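To give a feel for the kind of analysis I mean: a minimal sketch of computing annual mean temperatures from daily highs/lows and fitting a linear trend. The (date, high, low) row shape is an assumption loosely modeled on NOAA's CSV export, not my actual pipeline, and the demo data below is synthetic.

```python
from collections import defaultdict
from statistics import mean

def annual_means(rows):
    """rows: iterable of (date_str 'YYYY-MM-DD', high, low) in °F.
    Returns {year: mean of daily midpoint temps}, in year order."""
    by_year = defaultdict(list)
    for date, high, low in rows:
        by_year[int(date[:4])].append((high + low) / 2.0)
    return {yr: mean(v) for yr, v in sorted(by_year.items())}

def linear_slope(xs, ys):
    """Ordinary least-squares slope (degrees per year)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Synthetic 10-year example where the mean climbs 0.2 °F per year;
# with real data, rows would come from csv.reader over the NOAA file.
rows = [(f"{2014 + y}-06-{d + 1:02d}", 90.0 + 0.2 * y, 60.0 + 0.2 * y)
        for y in range(10) for d in range(28)]
means = annual_means(rows)
slope = linear_slope(list(means), list(means.values()))
print(round(slope, 2))  # → 0.2
```

With a decade of daily data that's ~3,650 points per series, which is what makes a trend like this worth acting on.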
I've had solar for 3 years now, and my inverters record telemetry in 5-minute candles. I can download that, import it into a SQL Server database, and study how much solar power was coming in at any point in time, how much power my electrical panels consumed at that time (my load), how much excess power was stored to the batteries, how much power was pulled from the batteries, and how much power was either pulled from the grid (buying power) or exported to the grid (selling power). Two years ago I used that level of detail to determine whether it was worth upgrading the solar and, if so, which parts (i.e., more solar panels and/or more inverter capacity and/or more battery storage), and whether it was worth getting an EV, since it was time to replace my wife's car anyway. With each of those decisions I was able to determine how much to invest in each component: enough to take advantage of economies of scale (more is better ROI), but not so much that I'd be fighting the law of diminishing returns (past a point, more is more expensive with less ROI). The same went for other decisions, like whether it was worth replacing my standard AC and gas furnace with a variable-speed heat pump and heat strips, and replacing my gas water heater with a hybrid water heater (built-in heat pump, or it can use normal heat strips).
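The basic aggregation step here can be sketched as follows: each 5-minute candle of instantaneous watts contributes watts × (5/60) hours of energy, and the grid channel splits into import vs. export by sign. The field names and sign convention are assumptions for illustration, not any vendor's actual schema (I do this in SQL Server; this is a simplified Python stand-in).

```python
from collections import defaultdict

HOURS_PER_SAMPLE = 5 / 60  # 5-minute candles

def daily_energy(samples):
    """samples: (date_str, solar_w, load_w, grid_w), grid_w > 0 = import.
    Returns {date: {"solar_kwh", "load_kwh", "import_kwh", "export_kwh"}}."""
    days = defaultdict(lambda: {"solar_kwh": 0.0, "load_kwh": 0.0,
                                "import_kwh": 0.0, "export_kwh": 0.0})
    for date, solar_w, load_w, grid_w in samples:
        d = days[date]
        d["solar_kwh"] += solar_w * HOURS_PER_SAMPLE / 1000
        d["load_kwh"] += load_w * HOURS_PER_SAMPLE / 1000
        if grid_w > 0:
            d["import_kwh"] += grid_w * HOURS_PER_SAMPLE / 1000
        else:
            d["export_kwh"] += -grid_w * HOURS_PER_SAMPLE / 1000
    return dict(days)

# One hour of candles: 3 kW solar, 1 kW load, 2 kW exported.
samples = [("2024-06-01", 3000, 1000, -2000)] * 12
day = daily_energy(samples)["2024-06-01"]
print(round(day["solar_kwh"], 3), round(day["export_kwh"], 3))  # → 3.0 2.0
```

Once the candles are rolled up to daily totals like this, the upgrade-sizing questions become straightforward sums and comparisons.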
End result: in the past 365 days, 82% of the power we consumed was homemade power from solar/battery storage, with the other 18% pulled from the grid. That includes charging the EV for 13K home-charged miles (not counting charging away from home during trips). That's in an all-electric home. I don't recommend this for everybody, just for those who can do this kind of detailed analysis of their own energy consumption habits and weather patterns.
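The 82% figure is a self-sufficiency ratio: the share of consumption served by solar plus battery rather than the grid. A one-liner shows the arithmetic (the kWh totals below are made-up round numbers, not my actual meter data):

```python
def self_sufficiency_pct(load_kwh, grid_import_kwh):
    """Percent of consumption served by solar + battery rather than the grid."""
    return 100.0 * (load_kwh - grid_import_kwh) / load_kwh

# e.g. 20,000 kWh consumed in a year, 3,600 kWh of it bought from the grid:
print(round(self_sufficiency_pct(20000.0, 3600.0), 1))  # → 82.0
```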
My point: there's no way I would have spent the money to do it if my decision had been based on only a few data points per year, like the # of hurricanes and typhoons per year over a 10-year or 50-year period. IMHO there just aren't enough data points there to build trends that warrant scaring the public over climate-ageddon. At least with my energy project, since it depends on daily high and low temps and daily sun hours, cloudy hours, and rain hours, just a few years' worth of daily data is enough to build trends that matter for my project.
Thank you for the response and your original post. I could have added that my graphs of 110 or 160 data points would have to account for changes in the ability to measure. In 1850 or 1900 there would have been limited observations. If not careful, a researcher would detect an apparent increase in major storms just because of more and better instruments and more deliberate attempts at observation.