Why use a DFA instead of DIP1000?
IchorDev
zxinsworld at gmail.com
Sat Sep 13 10:13:05 UTC 2025
On Saturday, 13 September 2025 at 02:39:40 UTC, Richard (Rikki)
Andrew Cattermole wrote:
> This is clearly a false positive; that branch could never run!
It could run if asserts were compiled out. Obviously, asserts are
never meant to fail when the program is in a correct state, though.
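For concreteness, here's a minimal sketch of the kind of pattern being discussed (hypothetical; the original example isn't quoted here). In D, asserts are compiled out under `-release`, so a branch the assert seemingly rules out can still execute:

```d
void f(int* p) @safe
{
    assert(p !is null); // compiled out under -release
    if (p is null)
    {
        // Looks unreachable, but can actually run when asserts
        // are disabled. A DFA that models the assert could treat
        // this branch as dead in checked builds; DIP1000 performs
        // no such branch reasoning.
    }
}
```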
> One of the hallmarks of a real data flow analysis engine is
> called dead code elimination; all optimising backends implement
> it. In fact, it's one of the first analyses ever implemented,
> and DIP1000 can't do it!
This example doesn't seem very realistic: a branch that never
runs shouldn't exist to begin with, right? Can you think of any
more realistic examples where DFA lets you infer safety in cases
where DIP1000 currently falls short?
Also, I'm curious what other problems DFA could be applied to.
For instance, detecting which types a function can throw and
letting you catch only those types, instead of always having to
`catch (Exception)` to satisfy the compiler. Or doing more
complex type inference.
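To illustrate the status quo (a hypothetical sketch, not code from the thread): calling a throwing function from a `nothrow` context forces a catch of the base `Exception` class, because the compiler doesn't track which exception types the callee can actually throw:

```d
import std.conv : to, ConvException;

// In practice this only throws ConvException on bad input.
int parse(string s) { return s.to!int; }

int parseOrZero(string s) nothrow
{
    try
        return parse(s);
    catch (Exception e) // must catch the base class to satisfy `nothrow`
        return 0;
}
// With DFA-inferred throw sets, `catch (ConvException)` alone
// could conceivably be enough to prove `nothrow`.
```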
More information about the Digitalmars-d
mailing list