There was a wonderful sci-fi short story called, IIRC, “The A & O Book”. The title was supposedly a reference to the British two-tier secondary examination system, Ordinary and Advanced levels. The story was written back at the beginning of the Digital Age.
In the story, set sometime in the future, the young hero is accused of cheating on his A&O exams. The test administrator is described much like a stereotypical Hollywood nerd: overweight, myopic, fingertips turning spatulate from a lifetime spent at the keyboard.
The evidence for our hero’s cheating was that he had used zero computer time and had a zero percent error rate. It is revealed that “A&O” actually meant Apples & Oranges. The math problems dealt with division and percentages and were meant to be solved with the test computer’s calculator. Our young hero had been taught fractions by his reactionary grandfather and needed no computer time to calculate. And because he worked in fractions through all of the intermediate steps of the calculation, he had no rounding errors (e.g., 1/3 stays exact instead of becoming 0.33333333...).
Ergo, he MUST have cheated and used a stolen copy of the answer book.
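The hero's trick has a modern counterpart: exact rational arithmetic. As a minimal sketch (using Python's standard-library `fractions` module, which is my illustration, not anything from the story), binary floating point picks up rounding error that exact fractions never do:

```python
from fractions import Fraction

# Floating-point decimals are approximated in binary, so small errors creep in:
print(0.1 + 0.2)                          # 0.30000000000000004, not 0.3
print(0.1 + 0.2 == 0.3)                   # False

# Carrying exact fractions through every intermediate step, as the hero did,
# leaves no rounding error at all:
print(Fraction(1, 10) + Fraction(2, 10))  # 3/10, exactly
print(Fraction(1, 10) + Fraction(2, 10) == Fraction(3, 10))  # True
```

The same contrast applies to 1/3: as a float it is truncated to a finite binary approximation, while `Fraction(1, 3)` remains exact through any chain of arithmetic.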
I think this was written back in the 1950s or ’60s. Now, 50+ years later, you are asking this question: “Do old-fashioned arithmetic algorithms really need to be taught any more?”
I would love to read this story. If you (or anyone else) can think of more details that could help me locate it either in print or online (Google has been fruitless thus far), that'd be great!
Sounds very similar in theme to “The Machine Stops” by E. M. Forster: what happens when the technology you rely on, even for your survival, starts to break down?