The End of Computer Programming as We Know It
O'Reilly ^ | 03/16/2025 | Tim O’Reilly

Posted on 03/16/2025 6:24:41 PM PDT by SeekAndFind



To: TexasGator

I like you and I like your insightful posts, so don’t take this personally. That program is idiotic. It doesn’t get the concept of a prime.

Instead of testing all the numbers up to sqrt(n), test only the primes previously found up to sqrt(n); when you run out of primes < sqrt(n), move on to the next integer. As n increases, for any positive integer N you pick there is an M such that for n > M my program will be N times as fast as yours/Gemini’s. And I feel pretty certain my suggestion is also not the fastest.
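
In rough Python, that divide-by-stored-primes idea looks something like this (a minimal sketch, not a tuned implementation):

    # Test each candidate only against the primes already found,
    # stopping once the next prime exceeds sqrt(candidate): a
    # composite n always has a prime factor <= sqrt(n).
    def primes_up_to(limit):
        primes = []
        for n in range(2, limit + 1):
            is_prime = True
            for p in primes:
                if p * p > n:       # past sqrt(n); no divisor found
                    break
                if n % p == 0:      # divisible by a smaller prime
                    is_prime = False
                    break
            if is_prime:
                primes.append(n)
        return primes

    print(primes_up_to(30))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]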

Your example supports Stephen Wolfram’s observation that LLMs start out innumerate. They continue sequences of words using Markov-chain probabilities, and if the model is not large enough they sound exactly like Kamala Harris word salad.

As a physicist, programmer, lifetime computer engineer, and EE/Computer PE, life taught me that even if you can write a good program to do anything you want, and it does everything the customer requested, you are only a small part of the way there.

1) Requirements are hard to extract from people. If you ask people what they want and build exactly that, most of the time they say “that’s what I said, but that isn’t what I want” and change it. (documented study)

2) If it is fast and good, it will change the process, because people start running it far more often for analysis, such as trying different designs or doing tax returns different ways.

3) Users will conflate the UI with the program logic.

4) Every program needs to be tightly integrated with subject matter experts to handle process escapes as times and facts change.

5) You will constantly have to argue for what your work does versus what someone else can promise.


41 posted on 03/16/2025 10:45:00 PM PDT by takebackaustin
[ Post Reply | Private Reply | To 13 | View Replies]

To: SeekAndFind

There are several competing issues going on. Computer Science has always been an evolving field, with “adapt or die” part of it from the beginning. For example, when I started, everyone was expected to know assembly language (which itself changes with each new generation of processors) and how to write a custom memory allocator (a skill I have used precisely zero times since graduating). Likewise, for a long time a course in “Computer Graphics” included a lot of analytic geometry, such as “How do you know whether a pixel is inside or outside of a shape?” So from that perspective, a lot of what is now taught (“Here is how to code a linked list; here is how to create a file on disk; here is how to sort numbers”) may become similar: something that still exists but that only one expert at that one company ever touches.
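
For the curious, one classic answer to that pixel question is an even-odd ray-casting test; a minimal Python sketch (function and names mine, purely illustrative):

    # Cast a horizontal ray from the point and count how many polygon
    # edges it crosses; an odd count means the point is inside.
    def point_in_polygon(x, y, vertices):
        inside = False
        n = len(vertices)
        for i in range(n):
            x1, y1 = vertices[i]
            x2, y2 = vertices[(i + 1) % n]
            if (y1 > y) != (y2 > y):  # edge straddles the ray's height
                x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
                if x < x_cross:       # crossing lies to the right
                    inside = not inside
        return inside

    print(point_in_polygon(1, 1, [(0, 0), (4, 0), (4, 4), (0, 4)]))  # True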

In contrast, “no code” programming has been a buzzword for decades. 5GLs were supposed to remove the “drudgery” from programming, meaning businesses would no longer need to hire programmers. Apple’s HyperCard was supposedly going to make traditional programming obsolete. Visual programming was supposed to take over the world. So the idea that “this next invention will make the entire field of computer science obsolete” is long-lived and has never been true.

One thing that very definitely is happening: AIs are now good enough to solve simple problems, which means computer science students are now turning in AI-generated code rather than learning how to write their own. This is already a crisis in academic circles, as it means they are not learning the basics that this article describes as still-essential skills. And AI cannot currently (and may never) solve complex problems, so the current crop of students may be in a very bad place in a few years.

The part I most strongly agree with: we don’t yet know how to use these tools effectively.


42 posted on 03/16/2025 10:57:52 PM PDT by TennesseeProfessor
[ Post Reply | Private Reply | To 1 | View Replies]

To: SeekAndFind

Thank you for an illuminating article and many insightful comments.


43 posted on 03/17/2025 12:41:11 AM PDT by Rockingham
[ Post Reply | Private Reply | To 1 | View Replies]

To: LouAvul

Grace Hopper as I recall ...


44 posted on 03/17/2025 2:44:38 AM PDT by SandwicheGuy ("Man is the only pack animal that will follow an unstable leader." Cesar Chavez)
[ Post Reply | Private Reply | To 5 | View Replies]

To: TexasGator

It was an analogy that management could understand.

I still marvel at management.


45 posted on 03/17/2025 5:11:15 AM PDT by ImJustAnotherOkie
[ Post Reply | Private Reply | To 8 | View Replies]

To: Myrddin

Yea, good times. Coding is good work.


46 posted on 03/17/2025 5:34:06 AM PDT by ImJustAnotherOkie
[ Post Reply | Private Reply | To 27 | View Replies]

To: SandwicheGuy

OK, I learned Fortran, COBOL, and C++. What would be a good introduction to using AI? Some sort of DIY tutorial? (Just enough that I can have fun, not so much that I will fundamentally disrupt modern civilization as we know it.)
;-)


47 posted on 03/17/2025 5:56:11 AM PDT by Don@VB (THE NEW GREEN DEAL IS JUST THE OLD RED DEAL)
[ Post Reply | Private Reply | To 44 | View Replies]

To: SeekAndFind
IMHO using AI for programming is an enormous step backwards, because it hides the pain of using insufficient tooling, and thus the motivation to create better tools.

High-level languages have been evolving (and still are) ever since the machine language of the UNIVAC, making developers more productive and software better able to utilize hardware.

If we had AI back in the 1950s, we would have never progressed past assembly.

48 posted on 03/17/2025 6:06:42 AM PDT by SecondAmendment (The history of the present Federal Government is a history of repeated injuries and usurpations .)
[ Post Reply | Private Reply | To 1 | View Replies]

To: TexasGator; usconservative; Mr. K
It can when trained to include those “quirks”.

I have some significant experience working with our in-house, sandboxed AI. For example, I have had our AI produce a series of Automated Unit Tests (to be differentiated from standard 'unit tests'). Creating AUTs requires that they follow the Arrange-Act-Assert template, as in the sketch below.
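
For anyone unfamiliar with the pattern, here is a minimal pytest-style sketch of Arrange-Act-Assert (the function under test is hypothetical, purely for illustration):

    # The unit under test (hypothetical).
    def apply_discount(price, percent):
        return round(price * (1 - percent / 100), 2)

    def test_apply_discount_basic():
        # Arrange: set up inputs and the expected result.
        price, percent = 100.00, 15
        expected = 85.00
        # Act: exercise the unit under test.
        actual = apply_discount(price, percent)
        # Assert: verify observed behavior matches expectations.
        assert actual == expected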

What I got back looked pretty good and, at first glance, seemed like it might have saved our development team hundreds of hours of development and testing....

....except....

... it only provided the bare minimum of AUT code. It was not sufficient to test our business use cases. Sure, it provided the 80% code coverage I asked for... but poorly.

The amount of prompt-engineering it would take to produce good AUTs would require a book about the thickness of 'War and Peace'. And you cannot prompt-engineer verbally, since invariably you will either leave something out or need to revise the prompt. This means it really has to be written down.

So, to your point, training it to catch all the quirks is an extremely daunting and time-intensive task, and it may prove impractical. Furthermore, that is only if you are AWARE of every quirk. Your development team may not be.

I know you pretty well; you like to be right. I ask, however, that you accord me subject-matter-expert deference on this particular topic, as I would accord you the same deference in, say, the operation of a nuclear plant... which I believe you have SME experience in.

49 posted on 03/17/2025 6:14:48 AM PDT by Lazamataz (I'm so on fire that I feel the need to stop, drop, and roll!)
[ Post Reply | Private Reply | To 15 | View Replies]

To: SecondAmendment
IMHO using AI for programming is an enormous step backwards, because it hides the pain of using insufficient tooling, and thus the motivation to create better tools.

At this juncture, AI can only provide a reasonable starting point. See my experience above. You still need qualified developers to proof the code, and those developers need to understand the business use cases.

50 posted on 03/17/2025 6:16:34 AM PDT by Lazamataz (I'm so on fire that I feel the need to stop, drop, and roll!)
[ Post Reply | Private Reply | To 48 | View Replies]

To: ImJustAnotherOkie
To me, programming is low-level languages. That’s where the work gets done.

Me too. I do the very largest portion of my work in C, with an occasional dip into assembler. I have been working fully loaded since 1968. In the first 15 years of my career, assembly was most heavily used, with FORTRAN taking up some slack. C became available in 1980, and that was really nice. I am an embedded programmer, with the very rare add-on skill of being able to design the circuit boards. Instrumentation and control is the silo in which I live.

51 posted on 03/17/2025 6:27:44 AM PDT by GingisK
[ Post Reply | Private Reply | To 2 | View Replies]

To: linMcHlp
the actual software we use daily doesn’t seem like it’s getting noticeably better

Most of the time that is due to lack of planning and process. Few teams work to coding standards or have dedicated test teams. Worse yet, the art of writing specifications is dead. Most of my clients do not have a clue about how their system should work or the nuances of how they should "look and feel" to the end user. Almost none of them will sit down with me to verify that my understanding of the process/procedures/behavior of their intended product matches their expectations. Fortunately for them, I'm pretty good at this and can sleuth my way through the project.

A decent organization will refine its safety and functionality checklists and feed what it learns back into its standards. Few do so.

52 posted on 03/17/2025 6:36:58 AM PDT by GingisK
[ Post Reply | Private Reply | To 10 | View Replies]

To: dfwgator
a bunch of quirks, that cannot simply be deduced by using AI

Or God, for that matter. ;-D

53 posted on 03/17/2025 6:38:03 AM PDT by GingisK
[ Post Reply | Private Reply | To 12 | View Replies]

To: Myrddin

The 680xx family was the best there ever was! Too bad that Intel blight was adopted for use in the PC.


54 posted on 03/17/2025 6:41:23 AM PDT by GingisK
[ Post Reply | Private Reply | To 27 | View Replies]

To: SeekAndFind

As a retired software engineer, I can’t see a machine solving some of the software design issues we did. Sometimes you have to think outside the box.


55 posted on 03/17/2025 6:45:09 AM PDT by McGruff (Biden will go down in history as the worst president ever.)
[ Post Reply | Private Reply | To 1 | View Replies]

To: Lazamataz

AI is like Wikipedia, a nice place to start but never take it as fact.


56 posted on 03/17/2025 6:48:38 AM PDT by CodeToad ( )
[ Post Reply | Private Reply | To 50 | View Replies]

To: Skywise
It certainly is better than Google for discovering solutions to edge-case problems. The limitation is a static training model that is older than the problem you are trying to resolve. I found that with questions about Keycloak: I was seeing quirks specific to the latest release, but ChatGPT was five releases behind.
57 posted on 03/17/2025 7:38:04 AM PDT by Myrddin
[ Post Reply | Private Reply | To 35 | View Replies]

To: GingisK
The 680xx family was the best there ever was! Too bad that Intel blight was adopted for use in the PC.

I trained my students using 6800- and 8085-based trainers. About two years into my employment teaching the class, I wired up a 6809 CPU adapter board for the 6800 trainer and built an updated monitor ROM that had access to all of the 6809 features. It was a "one-off", but I donated it to the department. The EPA68000 trainers became available in my last semester of teaching, too late to incorporate into the course.

I still have my Atari 1040ST serial number 2. 68000 based. My TRS-80 Model 16 has a 68010 CPU board inside to run Xenix.

58 posted on 03/17/2025 7:44:11 AM PDT by Myrddin
[ Post Reply | Private Reply | To 54 | View Replies]

To: Lazamataz

“The amount of prompt-engineering it would take to produce good AUTs would require a book about the thickness of ‘War and Peace’.”

I am fully aware that most of the processing at those “AI islands” is for training the models.

My ex also made good money working on old COBOL systems full of quirks.


59 posted on 03/17/2025 7:48:28 AM PDT by TexasGator (X1.1111'1'./iI11 .I1.11.'1I1.I'')
[ Post Reply | Private Reply | To 49 | View Replies]

To: Don@VB

“OK, I learned Fortran, COBOL, and C++. What would be a good introduction to using AI? Some sort of DIY tutorial? (Just enough that I can have fun, not so much that I will fundamentally disrupt modern civilization as we know it.)
;-)”

Microsoft and Google have sites that allow you to work with AI.

I played with the Microsoft sandbox but never signed up for development work.
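
If you want something more hands-on than a web sandbox, a first program in Python might look like the sketch below. It uses OpenAI’s published Python client; the model name and the API-key setup are placeholders, and other providers’ quickstarts follow the same shape:

    # A minimal "hello, AI" sketch. Prerequisites: pip install openai,
    # and an API key exported as OPENAI_API_KEY in your environment.
    import os
    from openai import OpenAI

    client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; pick any available model
        messages=[{"role": "user",
                   "content": "Explain recursion to a FORTRAN programmer."}],
    )
    print(response.choices[0].message.content)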


60 posted on 03/17/2025 7:59:23 AM PDT by TexasGator (X1.1111'1'./iI11 .I1.11.'1I1.I'')
[ Post Reply | Private Reply | To 47 | View Replies]

