The D standard library is built on GC, is that a negative or positive?

IGotD- nise at nise.com
Thu Dec 15 12:36:59 UTC 2022


On Thursday, 15 December 2022 at 11:56:31 UTC, matheus wrote:
>
> To be honest I don't think people generally are bothered with 
> this. I mean when I write my batch programs to process 
> something in D or C# I don't even think about the GC stopping 
> or whatever, most of the time it will process some data in a 
> reasonable time so I'm OK.
>

The majority of applications are so small and have such modest 
performance requirements that it doesn't matter. It starts to 
matter when you have web services, games etc. For example, if you 
reduce memory consumption, perhaps you don't need to buy/hire 
that extra infrastructure, which costs money.
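Since the thread is about D's GC specifically: D does give the 
programmer some control over collection timing through 
core.memory.GC. A minimal sketch (the placement of the calls is 
illustrative, not a recommendation):

```d
import core.memory : GC;

void main()
{
    GC.disable();   // suppress automatic collections during a hot path
    // ... latency-sensitive work that may still allocate ...
    GC.enable();

    GC.collect();   // run a collection at a convenient point
    GC.minimize();  // return unused pools to the operating system
}
```

The runtime also accepts GC options on the program's command 
line, e.g. --DRT-gcopt=profile:1 to print GC statistics at exit, 
which helps quantify the memory-consumption argument above.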


> Where I work (Which is a big Health Insurance in my country), 
> our main language is C# and we process millions of data every 
> day, apps, web apps etc. And nobody is complaining too much 
> about the delay in our operations.
>

The default GC in C# is generational, which is itself a 
workaround for the inefficiency of the basic tracing algorithm. 
C# at least has the luxury of offering several different types of 
GC, and there might be more of them in the future.
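For reference, the GC flavor in .NET is selected through runtime 
configuration rather than code. A typical runtimeconfig.json 
fragment using the documented System.GC.* settings:

```json
{
  "runtimeOptions": {
    "configProperties": {
      "System.GC.Server": true,
      "System.GC.Concurrent": true
    }
  }
}
```

System.GC.Server switches to the server GC (per-core heaps, 
higher throughput at the cost of memory), and 
System.GC.Concurrent enables background collection to shorten 
pauses.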

I'm amazed how widely the tracing GC is used in the computer 
industry despite its complexity and drawbacks. When will the time 
come when the tracing GC no longer scales with increasing memory 
consumption?


More information about the Digitalmars-d mailing list