Posted on 03/16/2025 6:24:41 PM PDT by SeekAndFind
I like you and I like your insightful posts, so don’t take this personally. That program is idiotic. It doesn’t get the concept of a prime.
Instead of testing all the numbers up to sqrt(n), test only the primes previously found up to sqrt(n); when you run out of primes < sqrt(n), start on the next integer. As n increases, for any positive integer N you pick there is an M such that for n > M my program will be N times as fast as yours/Gemini’s. And I feel pretty certain my suggestion is also not the fastest.
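For the curious, here is a minimal sketch of that idea in C (the 100,000-prime limit and the names are my own choices for illustration, not anyone’s actual program):

#include <stdio.h>
#include <stdlib.h>

/* Trial division by previously found primes only: for each candidate n,
   divide by stored primes p while p*p <= n; a candidate with no such
   divisor is prime and gets appended to the list. */
int main(void)
{
    const int LIMIT = 100000;          /* arbitrary: how many primes to collect */
    long *primes = malloc(LIMIT * sizeof *primes);
    int count = 0;

    for (long n = 2; count < LIMIT; n++) {
        int is_prime = 1;
        for (int i = 0; i < count && primes[i] * primes[i] <= n; i++) {
            if (n % primes[i] == 0) { is_prime = 0; break; }
        }
        if (is_prime)
            primes[count++] = n;
    }

    printf("prime #%d is %ld\n", LIMIT, primes[LIMIT - 1]);
    free(primes);
    return 0;
}

The inner loop walks only the primes found so far instead of every integer up to sqrt(n), which is where the growing speed advantage comes from.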
Your example supports Stephen Wolfram’s observation that LLMs start out innumerate. They continue sequences of words using Markov-chain probabilities, and if the model is not large enough they sound exactly like Kamala Harris word salad.
As a physicist, programmer, lifetime computer engineer, and EE/Computer PE, life taught me that even if you can write a good program to do anything you want, and it does everything the customer requested, you are only a small part of the way there.
1) Requirements are hard to extract from people. If you ask people what they want and do it, most of the time they say “that’s what I said but that isn’t what I want” and change it. (documented study)
2) If it is fast and good it will change the process, because they start running it many more times in analysis, such as trying different designs or doing tax returns different ways.
3) Users will conflate the UI with the program logic.
4) Every program needs to be tightly integrated with subject matter experts to handle process escapes as times and facts change.
5) You will constantly have to argue what your work does versus what someone else can promise.
There are several competing issues going on. Computer Science has always been an evolving field with “adapt or die” being part of it from the beginning. For example, when I started everyone was expected to know assembly language (which itself changes for each new generation of processors) and how to write a custom memory allocator (a skill I have used precisely 0 times after graduating). Likewise, for a long time a course in “Computer Graphics” included a lot of analytic geometry, such as “How do you know whether a pixel is inside or outside of a shape?” So from that perspective a lot of what is now taught (“Here is how to code a linked list; here is how to create a file on disk; here is how to sort numbers”) may become similar: something that still exists but that only one expert at that one company ever touches.
In contrast, “no code” programming has been a buzzword for decades. 5GLs were supposed to remove the “drudgery” from programming, meaning businesses would not need to hire programmers anymore. Apple’s HyperCard was supposedly going to make traditional programming obsolete. Visual Programming was supposed to take over the world. So the idea that “this next invention will make the entire field of computer science obsolete” is long-lived and has never come true.
One thing that very definitely is happening: AIs are now good enough to solve simple problems. Which means computer science students are now turning in AI-generated code rather than learning how to write their own. This is already a crisis in academic circles, as it means they are not learning the basics that this article describes as still-essential skills. And AI cannot currently (and may never) solve complex problems, so the current crop of students may be in a very bad place in a few years.
The part I most strongly agree with: we don’t yet know how to use these tools effectively.
Thank you for an illuminating article and many insightful comments.
Grace Hopper as I recall ...
It was an analogy that management could understand.
I still marvel at management.
Yea, good times. Coding is good work.
Ok, I learned Fortran, Cobol, and C++. What would be a good introduction to using AI? Some sort of DIY tutorial? (Just enough that I can have fun, not so much that I will fundamentally disrupt modern civilization as we know it.)
;-)
High-level languages have been (and still are) evolving ever since machine language on the UNIVAC, making developers more productive and software better able to utilize hardware.
If we had AI back in the 1950s, we would have never progressed past assembly.
I have some significant experience working with our in-house, sandboxed AI. For example, I have had our AI produce a series of Automated Unit Tests (to be differentiated from standard 'unit tests'). Creating AUTs requires that they follow the Arrange-Act-Assert template (a bare-bones example of that pattern appears below).
What I got back looked pretty good, and at first glance, seemed like it may have saved our development team hundreds of hours of development and testing....
....except....
... it only provided the bare minimum of AUT code. The stuff was not sufficient to test our business use-cases. Sure, it provided my asked-for 80% code coverage... but poorly.
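To be clear about the pattern itself, here is a bare-bones hand-written AUT in Arrange-Act-Assert form, written in plain C with a made-up function under test (the names and the discount logic are purely illustrative, not our actual code):

#include <assert.h>
#include <stdio.h>

/* Hypothetical function under test: applies a percentage discount. */
static double apply_discount(double price, double percent)
{
    return price - price * (percent / 100.0);
}

/* One automated unit test in Arrange-Act-Assert form. */
static void test_apply_discount_ten_percent(void)
{
    /* Arrange: set up the inputs and the expected result. */
    double price = 200.0;
    double percent = 10.0;
    double expected = 180.0;

    /* Act: exercise the code under test. */
    double actual = apply_discount(price, percent);

    /* Assert: verify the observable outcome. */
    assert(actual > expected - 1e-9 && actual < expected + 1e-9);
    printf("test_apply_discount_ten_percent passed\n");
}

int main(void)
{
    test_apply_discount_ten_percent();
    return 0;
}

The structure is the easy part; what was missing from the AI’s output was Arrange data and Assert conditions that actually exercise our business use-cases.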
The amount of prompt-engineering it would take to produce good AUTs would require a book about the thickness of 'War and Peace'. And you cannot verbally prompt-engineer, since invariably you will either leave something out or you will need to revise the prompt. This means it really has to be written.
So, to your point, training it to catch all the quirks is an extremely daunting and time-intensive task, and that may prove impractical. Furthermore, that's even if you are AWARE of every quirk. Your development team may not.
I know you pretty well, you like to be right. I ask, however, that you accord me subject-matter-expert deference in this particular topic, as I would accord you the same deference in, say, operation of a nuclear plant... which I believe you have SME experience in.
At this juncture, AI can only provide a reasonable start-point. See my above experience. You still need qualified developers to proof the code, and those developers need to understand the business use-cases.
Me too. I do the very largest portion of my work in C, with an occasional dip into assembler. I have been working fully loaded since 1968. In the first 15 years of my career assembler was most heavily used, with FORTRAN taking up some slack. C became available in 1980, and that was really nice. I am an Embedded programmer, with the very rare add-on skill of being able to design the circuit boards. Instrumentation and control is the silo in which I live.
Most of the time that is due to lack of planning and process. Few teams work to coding standards or have dedicated test teams. Worse yet, the art of writing specifications is dead. Most of my clients do not have a clue about how their system should work or the nuances of how it should "look and feel" to the end user. Almost none of them will sit down with me to verify that my understanding of the process/procedures/behavior of their intended product matches their expectations. Fortunately for them, I'm pretty good at this and can sleuth my way through the project.
A decent organization will refine its safety and functionality checklists or feed that back into standards. Few do so.
Or God, for that matter. ;-D
The 680xx family was the best there ever was! Too bad that Intel blight was adopted for use in the PC.
As a retired software engineer I can’t see a machine solving some of the software design issues we did. Sometimes you have think outside the box.
AI is like Wikipedia: a nice place to start, but never take it as fact.
I trained my students using 6800 and 8085 based trainers. About 2 years into my employment teaching the class, I wired up a 6809 CPU adapter board for the 6800 trainer and built an updated monitor ROM that had access to all of the 6809 features. It was a "one off", but I donated it to the department. The EPA68000 trainers became available in my last semester teaching. Too late to incorporate it into the course.
I still have my Atari 1040ST serial number 2. 68000 based. My TRS-80 Model 16 has a 68010 CPU board inside to run Xenix.
“The amount of prompt-engineering it would take to produce good AUTs would require a book about the thickness of ‘War and Peace’.”
I am fully aware that most of the processing at those “AI Islands” is for training the models.
My ex also made good money working on old COBOL systems full of quirks.
“Ok, I learned Fortran, Cobol, and C++. What would be a good introduction to using AI? Some sort of DIY tutorial? (Just enough that I can have fun, not so much that I will fundamentally disrupt modern civilization as we know it.)
;-)”
Microsoft and Google have sites that allow you to work with AI.
I played with the Microsoft sandbox but never signed up for development work.