I started my first computing when I was a 1969 ME freshman, feeding punch cards through an IBM 360 that ran FORTRAN IV under WATFOR. I still remember implementing the Newton-Raphson method to find a good approximation for a root of a real-valued function f(x) = 0. It uses the idea that a differentiable function can be approximated near a point by its tangent line. I was astonished by the successive iterations, that they could be programmed, and that an approximation could give such good results. It was a real eye-opener.
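That card deck is long gone, but the method itself is only a few lines. Here's a minimal sketch in Python rather than FORTRAN IV; the example function f(x) = x^2 - 2 is just an illustrative choice (its root is sqrt(2)), not the original homework problem:

```python
def newton_raphson(f, f_prime, x0, tol=1e-10, max_iter=50):
    """Find a root of f(x) = 0 by repeatedly following the tangent line."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        # Next guess: where the tangent at (x, f(x)) crosses the x-axis.
        x = x - fx / f_prime(x)
    return x

# Example: approximate sqrt(2) as the root of f(x) = x^2 - 2.
root = newton_raphson(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0)
```

Starting from x0 = 1.0, the successive iterates (1.5, 1.41666..., 1.414215...) home in on sqrt(2) in a handful of steps, which is exactly the surprise the method delivers.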
Thanks. Interesting...
[[I started my first computing when I was a 1969 ME freshman]]
I was lucky - I didn't get hooked on computers until my mid-30s, then it all went downhill from there lol. I need rehab, I think.
My second one was actually useful to me: In the game of Risk, which I played often with others in the back of our HS Math Lab, there are rules for attrition of armies depending on the values of the dice thrown and whether you're the attacker or the defender. If my opponent and I each had huge armies facing each other in adjacent countries, and I knew there'd eventually be a battle between us, my question was: Is it better to wait for him to attack me, or better if I'm the attacker?
There was no web to search in 1970, and I hadn't yet learned the math tools (combinatorics) to solve it on paper, so I wrote a FORTRAN program that used random numbers to simulate 1000 throws of the dice - attacker rolls 3 dice, defender rolls 2, and all ties go to the defender. (Result: the attacker lost ~85 armies for every 100 lost by the defender.)
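The 1970 program itself isn't shown, but the rules described above (attacker rolls 3 dice, defender rolls 2, highest pairs compared, ties to the defender) are enough to re-sketch the Monte Carlo simulation in Python; the function names and the seed are my own choices, not from the original:

```python
import random

def battle_round(rng):
    """One round: attacker's top two dice vs. defender's two, ties to defender."""
    attack = sorted((rng.randint(1, 6) for _ in range(3)), reverse=True)
    defend = sorted((rng.randint(1, 6) for _ in range(2)), reverse=True)
    att_loss = def_loss = 0
    for a, d in zip(attack, defend):  # compares the two highest pairs
        if a > d:
            def_loss += 1
        else:              # tie or lower: attacker loses the army
            att_loss += 1
    return att_loss, def_loss

def simulate(rounds=1000, seed=1970):
    """Total armies lost by each side over many simulated throws."""
    rng = random.Random(seed)
    att_total = def_total = 0
    for _ in range(rounds):
        a, d = battle_round(rng)
        att_total += a
        def_total += d
    return att_total, def_total
```

Each round costs exactly two armies in total, and over a long run the attacker's losses come out noticeably below the defender's, consistent with the ~85-per-100 figure above (the exact combinatorial expectation for 3-vs-2 dice is about 0.92 attacker losses per round against 1.08 for the defender).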
But I'd already used BASIC for a couple of years before that for more than just fun & games. I wrote programs to number-crunch statistics like chi-squared for Biology labs.
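As an illustration of that kind of number-crunching, here is the Pearson chi-squared statistic in Python; the Mendelian 3:1 example counts are hypothetical, not data from the original labs:

```python
def chi_squared(observed, expected):
    """Pearson's chi-squared statistic: sum of (O - E)^2 / E over categories."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Example: 100 offspring scored against an expected 3:1 Mendelian ratio.
stat = chi_squared([70, 30], [75, 25])  # (70-75)^2/75 + (30-25)^2/25 = 4/3
```

The statistic is then compared against a chi-squared table (1 degree of freedom here) to judge whether the observed counts plausibly fit the expected ratio, which is exactly the lab-report chore a short BASIC program saves you from doing by hand.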