Free Republic

To: Lazamataz

I also use AI for simple tasks, like coding a short sequence of function calls, but it cannot create large segments of code.

For instance, I can ask it, “Create the calling sequence for Command.ExecuteReaderAsync()”, but it cannot create an entire class from a prompt such as, “Create a class using the SQL command and connection for an asynchronous library calling stored procedures.” What it creates is never the entire class.
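Just to be concrete, the kind of class I mean might look something like the rough sketch below, assuming Microsoft.Data.SqlClient; the class and method names are made up for illustration, not something the AI produced:

    using System;
    using System.Collections.Generic;
    using System.Data;
    using System.Threading.Tasks;
    using Microsoft.Data.SqlClient;

    // Hypothetical wrapper for calling stored procedures asynchronously.
    public sealed class StoredProcedureClient
    {
        private readonly string _connectionString;

        public StoredProcedureClient(string connectionString)
        {
            _connectionString = connectionString;
        }

        // Opens a connection, runs the stored procedure, and maps each row
        // with the supplied delegate.
        public async Task<List<T>> QueryAsync<T>(
            string procedureName,
            Func<SqlDataReader, T> map,
            params SqlParameter[] parameters)
        {
            var results = new List<T>();

            await using var connection = new SqlConnection(_connectionString);
            using var command = new SqlCommand(procedureName, connection)
            {
                CommandType = CommandType.StoredProcedure
            };
            command.Parameters.AddRange(parameters);

            await connection.OpenAsync();
            await using var reader = await command.ExecuteReaderAsync();
            while (await reader.ReadAsync())
                results.Add(map(reader));

            return results;
        }
    }

Nothing fancy, but what I get back is only ever fragments of something like that.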


61 posted on 11/01/2025 11:29:49 AM PDT by CodeToad


To: CodeToad

You could probably do it with Claude Code.


62 posted on 11/01/2025 11:31:53 AM PDT by dfwgator ("I am Charlie Kirk!")

To: CodeToad; Owen; usconservative; Mr. K; piytar; kiryandil; Regulator; dfwgator; central_va
For instance, I can ask it, “Create the calling sequence for Command.ExecuteReaderAsync()”, but it cannot create an entire class from a prompt such as, “Create a class using the SQL command and connection for an asynchronous library calling stored procedures.” What it creates is never the entire class.

Oh, don't get me started. LOL

SonarQube's quality gate (rightfully) won't pass two of my CI pipelines because we're using Task.Wait, which it flags as a blocking call. Fine. So I told our in-house, sandboxed LLM to rewrite that code to use await instead.

Sure, it knocked out the immediate function or method call, but what about all the async parent classes above it? Can you handle static, non-instantiated classes? What about the calling methods?

Brain freeze.
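For anyone who hasn't fought this particular battle: the reason it spreads is that swapping .Wait()/.Result for await changes the method's return type to Task<T>, which forces every caller above it to go async too. A toy illustration, with made-up names that have nothing to do with our actual code:

    using System;
    using System.Threading.Tasks;

    public static class ReportDemo
    {
        // Stand-in for some real async I/O call.
        static Task<int> FetchCountAsync() => Task.FromResult(42);

        // Before: the blocking style the quality gate complains about.
        public static string BuildReportBlocking()
        {
            var task = FetchCountAsync();
            task.Wait();                       // blocks a thread on an async call
            return $"count = {task.Result}";
        }

        // After: the await version. Note the signature change from string to
        // Task<string> -- which is exactly why every caller has to change too.
        public static async Task<string> BuildReportAsync()
        {
            var count = await FetchCountAsync();
            return $"count = {count}";
        }

        // The caller now has to be async, and its caller, and so on up the stack.
        public static async Task Main()
        {
            Console.WriteLine(BuildReportBlocking());
            Console.WriteLine(await BuildReportAsync());
        }
    }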

Not QUITE sure why the previous developers used async anyways; nothing in there needs to run concurrently. Perhaps they were showing off, or maybe it was a demonstration of the truism, 'If all you have is a hammer, everything looks like a nail.'

Anyways, yeah, our in-house LLM, which leverages GPT-4.1, could not do it. Not for the whole calling stack, anyways, and when I tried to break the job up, I got AI slop.

But curiously, this entity did a far better job than Codeium/Windsurf. THAT entity simply said, "Hey, yeah, that's a tough problem. Here's how you would go about solving it."

Wait, what? THAT'S YOUR JOB, you lazy sack of 1000-parameter multidimensional vectors!

I haven't tried Claude or Copilot yet, so I cannot speak to those.

The only benefit Codeium/Windsurf brags about is that it can context-wrap our repos: as opposed to our in-house LLM (which sports a 2,500-token context window), Codeium/Windsurf brings 20,000.

It doesn't use those features very well yet, but reportedly they are there.

PS: If your codegen quits when it runs out of token space, there is a magic command: "Continue", or "Continue from this code line: (insert code line)."

77 posted on 11/01/2025 12:35:36 PM PDT by Lazamataz (I figure if Charlie Kirk can die for free speech, I can be mildly inconvenienced.)

To: CodeToad

What I've noticed on our vendor demos is that they ALWAYS end up creating web pages or modifying various features on them.


98 posted on 11/01/2025 5:12:49 PM PDT by grey_whiskers (The opinions are solely those of the author and are subject to change without notice.)
