Posted on 05/20/2020 9:56:15 AM PDT by grundle
Imperial College's modelling... could go down in history as the most devastating software mistake of all time, in terms of economic costs and lives lost.
... those of us with a professional and personal interest in software development have studied the code on which policymakers based their fateful decision to mothball our multi-trillion pound economy and plunge millions of people into poverty and hardship. And we were profoundly disturbed at what we discovered. The model appears to be totally unreliable and you wouldn't stake your life on it.
Imperial's model appears to be based on a programming language called Fortran, which was old news 20 years ago and, guess what, was the language blamed for the Mariner 1 failure. This outdated language contains inherent problems with its grammar and the way it assigns values, which can give rise to multiple design flaws and numerical inaccuracies. One file alone in the Imperial model contained 15,000 lines of code.
Try unravelling that tangled, buggy mess, which looks more like a bowl of angel hair pasta than a finely tuned piece of programming. Industry best practice would have 500 separate files instead. In our commercial reality, we would fire anyone for developing code like this and any business that relied on it to produce software for sale would likely go bust.
The approach ignores a widely accepted computer science principle known as separation of concerns, which dates back to the early 70s and is essential to the design and architecture of successful software systems. The principle guards against what developers call CACE: Changing Anything Changes Everything.
Without this separation, it is impossible to carry out rigorous testing of individual parts to ensure full working order of the whole.
(Excerpt) Read more at telegraph.co.uk ...
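To make the separation-of-concerns point concrete, here is a minimal sketch in Python (not the model's actual Fortran, and every function name below is invented for illustration, nothing from the Imperial code). When each piece does one job, each piece can be tested on its own, so changing one thing no longer risks changing everything:

    # Illustrative only: hypothetical names, not code from the Imperial model.
    # Each function has a single responsibility, so each can be unit-tested
    # in isolation instead of only testing the whole tangled program at once.

    def infection_rate(beta, susceptible, infected):
        """New infections per unit time under simple mass-action mixing."""
        return beta * susceptible * infected

    def step_sir(s, i, r, beta, gamma, dt):
        """Advance a basic SIR epidemic model by one time step."""
        new_inf = infection_rate(beta, s, i) * dt
        new_rec = gamma * i * dt
        return s - new_inf, i + new_inf - new_rec, r + new_rec

    # A unit test pins down one behaviour without touching anything else:
    assert infection_rate(beta=0.3, susceptible=1000.0, infected=0.0) == 0.0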
every “model” used to determine public policy needs to be open source.
The danger of models in the modern world is that they are used as justification for major decisions. >>> Yes, and these models need to be open source. Auditable.
Neil Ferguson has now made multiple wildly inflated disease models.
Last month Neil Ferguson, a professor of mathematical biology at Imperial College London, told Guardian Unlimited that up to 200 million people could be killed.
https://www.theguardian.com/world/2005/sep/30/birdflu.jamessturcke
Professor Neil Ferguson, from the Department of Infectious Disease Epidemiology at Imperial College, said ... the future number of deaths from variant Creutzfeldt-Jakob disease (vCJD) due to exposure to BSE in beef was likely to lie between 50 and 50,000.
https://www.theguardian.com/education/2002/jan/09/research.highereducation
And now the Wu-Flu.
You're an idiot.
SO - if not ONE forecast was ever given - you would not know a single thing was ever coming.
And with attitudes like yours - I'm ok with that. If you think you can do better by looking at the clouds and taking a guess - GO FOR IT. Next time a storm is in the Gulf - TURN OFF THE WEATHER CHANNEL and do not pay attention to ANY forecasts. Wing it.
Idiot.
Every single forecast is based on computer models. The TV weather guys take every forecast from the NWS and the NHC - which are based on models.
So - never - EVER - pay attention to another forecast again. Roll the dice and plan that outdoor event without looking at the weather. After all - it's a "guess"...LOL.
And my guess is as good as yours - right? lol
So - never - EVER - watch the channel x weather again. EVER.
I never used APL, though I saw it in Byte magazine; I did use FORTH.
These are very different languages. APL was tight from its symbolic structure, as I recall, while FORTH was tight due to incredibly recursive code that could be interpreted (including inline assembly language) or compiled.
It was technically possible to have FORTH be smaller than any compiled program because no compiler would do microrecursion to eliminate redundancy, while the FORTH programmer did this by nature and need.
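For anyone who never saw FORTH, here is a minimal sketch of the idea in Python (a toy with invented word names, not real FORTH): programs are "words" defined in terms of other words, and factoring a shared sequence into its own word removes the redundancy a compiler of that era would have left in place.

    # Toy FORTH-style interpreter sketch: a data stack plus a dictionary of
    # "words". Word names (SQUARE, SUM-SQ) are invented for illustration.

    stack = []

    def interp(tokens, words):
        for tok in tokens.split():
            if tok in words:
                interp(words[tok], words)   # user-defined word: expand it
            elif tok == "DUP":
                stack.append(stack[-1])
            elif tok == "SWAP":
                stack[-1], stack[-2] = stack[-2], stack[-1]
            elif tok == "*":
                b, a = stack.pop(), stack.pop()
                stack.append(a * b)
            elif tok == "+":
                b, a = stack.pop(), stack.pop()
                stack.append(a + b)
            else:
                stack.append(int(tok))      # literal number

    # Factoring: SQUARE is defined once and reused, instead of repeating
    # "DUP *" at every call site the way a naive compiler would.
    words = {
        "SQUARE": "DUP *",                  # ( n -- n*n )
        "SUM-SQ": "SQUARE SWAP SQUARE +",   # ( a b -- a*a + b*b )
    }

    interp("3 4 SUM-SQ", words)
    print(stack.pop())                      # 25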
Those were some good days. I remember being in high school and having saved up $150 for an amazing FORTH environment in the 1980s.
I remember my Fortran programming class and creating the punch cards from my college days in the late 1970s.
I can still remember a funny incident with punch cards from back in those days. This guy had two giant stacks of punch cards, one under each arm. It was a very windy day on the Palouse, meaning gale-force winds. As he went to step onto the curb, he misjudged his step and tripped. One stack of cards started to slip out, and as he tried to secure it, he lost control of the other, and both hit the ground, breaking the rubber bands he had wrapped them with. The fierce winds immediately caught the cards, and it was like an explosion of punch cards scattered across the Idaho panhandle. The guy stood there for a second in stunned silence and then erupted in some of the most impressive profanity you ever heard.
Ada could be easily handled by programmers with Turbo Pascal and Delphi experience.
FORTRAN got the Apollo astronauts to the moon and back.
LOL, as I recall the cards had to be inserted in their proper order. Probably took him days just to punch the cards. Sounds like a good way to learn some new cuss words.
My first wife was a keypunch operator for an insurance company.
One stack of cards was the program that had to be loaded into the computer that then operated on the second stack of cards: the data.
To cut down on the GIGO factor, new data was first punched onto a card by a keypunch operator. That card was then placed in a verifying machine, where a second operator (the verifier, nearly always a "her" in those days) re-keyed the same data and the machine compared her keystrokes against what was already on the card.
If the two matched, the card was kept.
I guess the idea was that it would be difficult for two people to make the same error entering the data. Any discrepancy would show that one or the other had made a typo.
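That "key and verify" idea is just double entry, and it survives in data-entry systems today. A minimal sketch in Python (the sample record is made up, and the keyboard-locking detail is as I recall it, not gospel):

    # Double-entry ("key and verify") sketch. The sample record is made up.

    def first_mismatch(keyed, rekeyed):
        """Return the first column (1-based) where two entries differ,
        or None if they match. A real verifier machine locked the keyboard
        at the offending column, as I recall."""
        for col, (a, b) in enumerate(zip(keyed, rekeyed), start=1):
            if a != b:
                return col
        if len(keyed) != len(rekeyed):
            return min(len(keyed), len(rekeyed)) + 1
        return None

    card  = "SMITH  1979-05-01  431.20"   # first operator's keystrokes
    check = "SMITH  1979-05-01  431.20"   # verifier's independent keystrokes

    col = first_mismatch(card, check)
    print("card kept" if col is None else f"typo at column {col}: re-punch")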
I've had HP calculators since the HP-25. I still have my HP-48SX (and it still works), although I am more likely to use the HP-48 emulator app on my iPhone these days.
Oh, by the way, it's Reverse Polish Notation, since you enter the operator after the operands.
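That operand-first order is what makes RPN so simple to evaluate: it is just a stack. A minimal sketch in Python (nothing HP-specific; the example expressions are made up):

    # Minimal RPN evaluator sketch: operands push onto a stack; an operator
    # pops its two operands and pushes the result. Nothing HP-specific here.
    import operator

    OPS = {"+": operator.add, "-": operator.sub,
           "*": operator.mul, "/": operator.truediv}

    def rpn(expression):
        stack = []
        for tok in expression.split():
            if tok in OPS:
                b = stack.pop()   # second operand entered
                a = stack.pop()   # first operand entered
                stack.append(OPS[tok](a, b))
            else:
                stack.append(float(tok))
        return stack.pop()

    # "3 ENTER 4 +" on an HP keypad is just "3 4 +":
    print(rpn("3 4 +"))              # 7.0
    print(rpn("5 1 2 + 4 * + 3 -"))  # 14.0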
This was done deliberately by ALL involved; the end justified the means. These bastards don't give a damn about anyone. ALWAYS remember what Schumer said when Trump tweeted about the intel community: that Trump would regret the day, because intel has ways of getting you that you never dreamed possible!! These bastards have a DEEP HATRED of Trump, and if WE get in the way, SO BE IT!!!
This article is a pretty good explanation of what a joke Ferguson is and details some of his other disastrous predictions.
There are some links that get into a bunch of other stuff if you're interested.
As for who we should have been listening to.
No favorite from me.
Probably would have had a more accurate assessment if they'd asked me or you.
He did it in FORTRAN? FORTRAN was old when I was in high school. I graduated in the '90s.
Had to remain conversant in it, as some old PLC and DCS programs have it, but I haven't seen any “new” programs in Fortran since... well, since the one I wrote in Basics of Programming for Chemical Engineers.
That was 1995.
A good software engineer can write Fortran in any language.
That’s why we punched sequence numbers into columns 73-80: the compiler ignored those columns, so a dropped deck could be machine-sorted back into order.
I suspect that it was not a mistake.
I'd tend to agree with you.