How people are using LLMs with D

Richard (Rikki) Andrew Cattermole richard at cattermole.co.nz
Wed Feb 11 01:36:05 UTC 2026


I use LLMs in one of three ways:

1. Via the web interface (Gemini 3 Flash)
	Great for working through ideas, but horrible once you get close to the 
context window size: it can fold back into old ideas that had already 
been dismissed.

2. Via Google Antigravity (Gemini 3, usually Flash, but thinking is 
available for free)
	Quotas are quite low; you can easily burn through the entire two-week 
allowance in a single day. Works well for D. Uses VS Code and works with 
code-d.

3. Via kilo.ai in IntelliJ IDEA.
	I have to hunt for free models, and they are not the premium ones. 
There are no quotas, however. For the project I'm working on at the 
moment, I have a memory bank with research papers fed into it.

What do I prefer? If I had the money, I'd go with Gemini for the model.
Kilo is a major upgrade over Google Antigravity or the web interface.

To break down what I've learned about Kilo in the last month:

- It has modes: orchestrator, architect, code, and ask.
	Each mode provides a prompt and gives the LLM different abilities in 
how it interacts with your project. It is a form of specialization. I 
typically start with my own orchestrator mode (I'll provide it at the 
end). It will then delegate to the other modes, creating new sub-tasks 
with their own prompts that it writes itself.
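To make the idea concrete, a project-level custom mode is a small 
configuration entry. The sketch below is purely illustrative: the file 
name and every field name are my assumptions, not taken from this post 
or verified against Kilo's actual schema, so check its documentation 
before copying anything.

```yaml
# Hypothetical sketch of a custom orchestrator mode definition.
# File name and field names are assumptions -- verify against Kilo's docs.
customModes:
  - slug: my-orchestrator        # identifier used to select the mode
    name: My Orchestrator
    roleDefinition: >-
      You coordinate work on this D project. Break each request into
      sub-tasks and delegate them to the architect, code, or ask modes.
    groups:
      - read                     # which tool abilities the mode is granted
    customInstructions: >-
      Never edit files directly; always create a sub-task in code mode.
```

The point is only that a mode bundles a prompt (the role definition) 
with a set of permitted abilities, which is what makes it a form of 
specialization.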

- It has a memory bank. This gets read in and included as part of the 
initial prompt each time you give it a task. The entries are regular 
markdown files that you can edit, and you are encouraged to do so; 
however, the LLM can also update them with new information if you tell 
it to. They live in ``.kilocode/memory-bank``.
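Since the entries are plain markdown, seeding the memory bank is just a 
matter of writing a file. A hypothetical example (the file name and 
contents below are mine, not from Kilo or this project):

```markdown
<!-- .kilocode/memory-bank/architecture.md (hypothetical example) -->
# Architecture notes

- This is a D library built with dub; the modules live under `source/`.
- Allocation goes through the project's own allocator; do not call
  `new` directly.
- Research papers relevant to the design are kept in `docs/papers/`.
```

Because this whole file is prepended to the initial prompt of every 
task, anything you put here effectively becomes standing instructions 
for the LLM.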

- Compression of context: this activates once you have used X% of the 
context window. It's mostly a hack to work around the small context 
windows that are sadly still with us.

The main issue I'm running into is that it will corrupt a D file by 
appending a duplicate chunk of the existing code at the bottom. Easy to 
fix, but frustrating. This doesn't happen with Gemini.

Would I recommend Kilo? Yes. Absolutely yes.
Go try it! It supports a bunch of environments, not just IntelliJ, 
including a CLI, and it gives you access to a lot of different 
providers, including OpenRouter.

Would I trust it to work on dmd with these non-premium models? 
Unlikely. If I had the budget, then it might be worth building up the 
memory bank and committing it.

What made the biggest difference? The memory bank, by far. After that, 
having sub-tasks that the LLM could create and research via 
orchestrator mode.

The main benefit to me currently is being able to adapt my existing 
code to new requirements, such as PhobosV3. After so many years, doing 
yet another adaptation isn't so fun anymore.



More information about the Digitalmars-d mailing list