Use of IA for PR - my POV
Lance Bachmeier
no at spam.net
Tue Feb 10 19:15:43 UTC 2026
On Tuesday, 10 February 2026 at 17:38:40 UTC, matheus wrote:
> Interesting, since I'm not using AI I'd like to know: in this
> case you have an LLM locally, you point it at the D source
> folder, it learns from that codebase, and does everything from
> there?
You *can* run LLMs locally, but it depends what you want it to do
for you. I use llama with qwen3-coder-next on my desktop CPU. It
works fine for asking what a function does, what some code does,
writing short functions, or documenting short functions. You no
longer need a GPU for anything like that: just a free model and a
computer with enough RAM.
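As a rough sketch of that local setup (the exact runtime and the model
filename below are my assumptions, not something stated above; substitute
whatever runner and quantized build you actually have):

```shell
# Hypothetical invocation of llama.cpp's CLI against a locally
# downloaded quantized GGUF model (the filename is made up for
# illustration). Runs entirely on CPU; no GPU required.
llama-cli -m ./qwen3-coder-next-q4_k_m.gguf \
    -p "Explain what this D function does: \
int sumSquares(int[] a) { import std.algorithm; return a.map!(x => x*x).sum; }"
```

The same command shape works for short documentation or small
function-writing prompts; only the `-p` text changes.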
If you want to do anything that requires reasoning or a lot of
context, AI services like Claude or Gemini are still your best
bet, unless you have an expensive computer sitting around doing
nothing.
> they showed us a port of a module, written over the past 20
> years, to a new language using AI in just a couple of hours; it
> was modernized and everything else was done through AI.
If you want to port C code to D, that's to a large extent a
solved problem. I can convert C header files quickly using
ImportC, plus Gemini for cleaning up the edge cases. I'd be very
cautious about "modernizing" C code with an LLM; you had better
write an extensive test suite before you try that.
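A minimal sketch of the ImportC route, with file names that are
hypothetical (ImportC lets dmd compile C sources directly, so C
declarations can often be used from D with little hand translation),
plus the kind of test you want in place before any LLM touches the
original code:

```d
// mylib.c — the existing C code (hypothetical example)
//
//   int clamp_int(int x, int lo, int hi)
//   { return x < lo ? lo : (x > hi ? hi : x); }

// app.d — D code using the C function directly via ImportC
import mylib;   // resolved because mylib.c is passed to dmd below

unittest
{
    // Lock down current behavior *before* attempting any
    // LLM-driven "modernization" of the C source.
    assert(clamp_int(5, 0, 3) == 3);
    assert(clamp_int(-1, 0, 3) == 0);
}

void main() {}

// Build and run the tests:
//   dmd -unittest app.d mylib.c
//   ./app
```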
More information about the Digitalmars-d mailing list