Smart pointers instead of GC?

Adam Wilson flyboynw at gmail.com
Sat Feb 1 12:07:35 PST 2014


On Sat, 01 Feb 2014 02:15:42 -0800, Manu <turkeyman at gmail.com> wrote:

> On 1 February 2014 19:27, Adam Wilson <flyboynw at gmail.com> wrote:
>
>> On Fri, 31 Jan 2014 23:35:44 -0800, Manu <turkeyman at gmail.com> wrote:
>>
>>  On 1 February 2014 16:26, Adam Wilson <flyboynw at gmail.com> wrote:
>>>
>>>  On Fri, 31 Jan 2014 21:29:04 -0800, Manu <turkeyman at gmail.com> wrote:
>>>>
>>>>> On 26 December 2012 00:48, Sven Over <dlang at svenover.de> wrote:
>>>>>
>>>>>>> std.typecons.RefCounted!T
>>>>>>>
>>>>>>> core.memory.GC.disable();
>>>>>>
>>>>>> Wow. That was easy.
>>>>>>
>>>>>> I see, D's claim of being a multi-paradigm language is not false.
>>>>>
>>>>> It's not a realistic suggestion. Everything you want to link uses the
>>>>> GC, and the language itself also uses the GC. Unless you write software
>>>>> in complete isolation and forego many valuable features, it's not a
>>>>> solution.
>>>>>
>>>>>>> Phobos does rely on the GC to some extent. Most algorithms and ranges
>>>>>>> do not though.
>>>>>>
>>>>>> Running (library) code that was written with GC in mind and turning GC
>>>>>> off doesn't sound ideal.
>>>>>>
>>>>>> But maybe this allows me to familiarise myself more with D. Who knows,
>>>>>> maybe I can learn to stop worrying and love garbage collection.
>>>>>>
>>>>>> Thanks for your help!
>>>>>>
>>>>>> I've been trying to learn to love the GC for as long as I've been
>>>>>> around here. I really wanted to break that mental barrier, but it
>>>>>> hasn't happened. In fact, I am more than ever convinced that the GC
>>>>>> won't do. My current #1 wishlist item for D is the ability to use a
>>>>>> reference counted collector in place of the built-in GC.
>>>>>
>>>>> You're not alone :)
>>>>>
>>>>> I write realtime and memory-constrained software (console games), and
>>>>> for me, I think the biggest issue that can never be solved is the
>>>>> non-deterministic nature of the collect cycles, and the unknowable
>>>>> memory footprint of the application. You can't make any guarantees or
>>>>> predictions about the GC, which is fundamentally incompatible with
>>>>> realtime software.
>>>>> Language-level ARC would probably do quite nicely for the miscellaneous
>>>>> allocations. Obviously, bulk allocations are still usually best handled
>>>>> in a context-sensitive manner; ie, regions/pools/freelists/whatever,
>>>>> but the convenience of the GC paradigm does offer some interesting and
>>>>> massively time-saving features to D.
>>>>> Everyone will always refer you to RefCounted, which mangles your types
>>>>> and pollutes your code, but aside from that, for ARC to be useful, it
>>>>> needs to be supported at the language level, such that the
>>>>> language/optimiser is able to optimise out redundant incref/decref
>>>>> calls, and also so that it is compatible with immutable (you can't
>>>>> manage a refcount if the object is immutable).
>>>>>
>>>>>
>>>> The problem isn't GCs per se, but D's horribly naive implementation.
>>>> Games are written in GC languages all the time now (Unity/.NET). And
>>>> let's be honest, games are kind of a speciality; games do things most
>>>> programs will never do.
>>>>
>>>> You might want to read the GC Handbook. GCs aren't bad, but most, like
>>>> the D GC, are just too simplistic for common usage today.
>>>>
>>>
>>>
>>> Maybe a sufficiently advanced GC could address the performance
>>> non-determinism to an acceptable level, but you're still left with the
>>> memory non-determinism, and the conundrum that when your heap  
>>> approaches
>>> full (which is _always_ on a games console), the GC has to work harder  
>>> and
>>> harder, and more often to try and keep the tiny little bit of overhead
>>> available.
>>> A GC heap by nature expects you to have lots of memory, and also lots  
>>> of
>>> FREE memory.
>>>
>>> No serious console game I'm aware of has ever been written in a  
>>> language
>>> with a GC. Casual games, or games that don't attempt to raise the bar  
>>> may
>>> get away with it, but that's not the industry I work in.
>>>
>>
>> That's kind of my point. You're asking for massive changes throughout  
>> the
>> entire compiler to support what is becoming more of an edge case, not  
>> less
>> of one. For the vast majority of use cases, a GC is the right call and D
>> has to cater to the majority if it wants to gain any significant  
>> mindshare
>> at all. You don't grow by increasing specialization...
>
>
> Why is ARC any worse than GC? Why is it even a compromise at the high  
> level?
> Major players have been moving away from GC to ARC in recent years. It's
> still a perfectly valid method of garbage collection, and it has the
> advantage that it's intrinsically real-time compatible.
>

Define Major Players? Because I only know about Apple, but they've been
doing ARC for almost a decade, and IIRC, like GC, it's not universally
loved there either. Microsoft is desperately trying to get people to move
back to C++, but so far the community has spoken with a resounding "You can
pry C#/.NET from our cold, dead hands." Java has shown no particular
interest in moving away from GCs, probably because their GC is best in
class. Even games are starting to bring in GCs (The Witcher 2, for example,
and almost all of the mobile/casual games market, which is actually
monetarily bigger than the triple-A games market).

> I don't think realtime software is becoming an edge case by any means,
> maybe 'extreme' realtime is, but that doesn't change the fact that the GC
> still causes problems for all realtime software.
>

Yes, but my point is that there is very little real-time software written
as a percentage of all software written, which, by definition, makes it an
edge case. Even vehicle control software is no longer completely real-time.
[I just happen to know that because that's the industry I work in. Certain
aspects are, with the rest scheduled out.] And more to the point, D has
made no claim about its suitability for RT software, and I have seen little
demand for it outside a very small, very vocal minority that is convinced
it has found the panacea for dynamic resource management, if only everyone
would do as they say.

> I personally believe latency and stuttering is one of the biggest  
> usability
> hindrances in modern computing, and it will become a specific design  
> focus
> in software of the future. People are becoming less and less tolerant of
> latency in all forms; just consider the success of iPhone compared to
> competition, almost entirely attributable to the silky smooth UI
> experience. It may also be a telling move that Apple switched to ARC  
> around
> the same time, but I don't know the details.
>

I use .NET every day; seriously, not one day goes by when I haven't touched
some C# code. I can happily report that you are *ahem* wrong. Even Visual
Studio 2013 doesn't stutter often, and only when I am pulling in some
massive third-party module that may or may not be well written. Ironically,
it is VisualD, of D-code fame, that slows VS down the most for me. I write
software in C# every day, and I can happily report that I have yet to have
a problem with stuttering in my code that wasn't of my own making. (If you
forget to asynchronously call that web service, it WILL stutter.)
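
For D readers, the same "don't block the interactive thread" idea looks
roughly like this. A minimal sketch using std.parallelism; fetchReport is a
made-up stand-in for whatever slow web-service call you would otherwise
block on:

    import std.parallelism : task, taskPool;
    import std.stdio : writeln;
    import core.thread : Thread;
    import core.time : seconds;

    // Hypothetical stand-in for a blocking web-service call.
    string fetchReport()
    {
        Thread.sleep(2.seconds);   // simulate network latency
        return "report";
    }

    void main()
    {
        // Push the slow call onto the task pool so the "UI" thread
        // isn't blocked while it runs.
        auto job = task!fetchReport();
        taskPool.put(job);

        // ... the main thread stays responsive here ...

        writeln(job.yieldForce());  // collect the result once it's ready
    }

The point is only that the fix is a one-liner: the stutter comes from the
blocking call, not from the collector.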

And that's the point. I can write software in C# that works well without
having to worry about circular references, or whether my data prematurely
falls out of scope, or any of the other details that come with ARC. And for
my non-effort, I pay an effective cost of zero. Win-win. You're demanding
that, to suit your needs, we make a massive philosophical and language
change to D that will incur a HIGHER cognitive load on programmers for
something that will not measurably improve the general use case? Ahem, how
is that good for D?
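
To make the circular-reference point concrete, here is a minimal sketch in
D (plain classes under the default tracing GC; the class name and layout
are invented for illustration):

    import core.memory : GC;
    import std.stdio : writeln;

    class Node
    {
        Node next;
    }

    void makeCycle()
    {
        auto a = new Node;
        auto b = new Node;
        a.next = b;
        b.next = a;
        // Both objects are unreachable once this returns. The tracing GC
        // reclaims the pair on a later collection; naive reference
        // counting would see a count of 1 on each and leak them.
    }

    void main()
    {
        makeCycle();
        GC.collect();   // the cycle is collected here
        writeln("done");
    }

Under a pure reference-counting scheme the same two objects would keep each
other alive until the programmer breaks the cycle, typically with a weak
reference.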

> I also firmly believe that if D - a native compiled language familiar to
> virtually all low-level programmers - doesn't have any ambition to  
> service
> the embedded space in the future, what will? And why not?
> The GC is the only thing inhibiting D from being a successful match in  
> that
> context. ARC is far more appropriate, and will see it enter a lot more
> fields.
> What's the loss?
>

Cognitive load. How many details does the programmer have to worry about
per line of code? Ease of use. A GC is easier to use in practice. You can
say they should learn to use ARC because it's better (for certain
definitions of better), but they won't. They'll just move on. I'd say
that's a pretty big loss.
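
For what it's worth, the difference in per-line cost is visible even in a
toy example. A minimal, hypothetical sketch (Payload and both helpers are
invented for illustration): with the GC the signatures stay plain, while
std.typecons.RefCounted, the usual library answer, leaks into every
signature that touches the value, which is the "mangles your types"
complaint quoted above.

    import std.typecons : RefCounted;
    import std.stdio : writeln;

    struct Payload
    {
        int[] data;
    }

    // GC version: the type in the signature is just Payload.
    Payload makeWithGC()
    {
        return Payload(new int[](1024));
    }

    // Refcounted version: the wrapper appears in every signature that
    // passes the value around, and the programmer now has to think
    // about copies, cycles and lifetimes at each of those points.
    RefCounted!Payload makeWithRC()
    {
        return RefCounted!Payload(new int[](1024));
    }

    void main()
    {
        auto a = makeWithGC();
        auto b = makeWithRC();
        writeln(a.data.length, " ", b.data.length);   // 1024 1024
    }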

And based on my conversations with Walter, I don't think that D was ever
intended to make a play for the embedded space. If D can be made to work
there, great, but Walter, as near as I can tell, has no interest in tying
the language in knots to make it happen. So that's a non-issue. And let's
be honest, the requirements of that space are fairly extreme.

> I think it's also telling that newcomers constantly raise it as a massive
> concern, or even a deal-breaker. Would they feel the same about ARC? I
> seriously doubt it. I wonder if a poll is in order...
> Conversely, would any of the new-comers who are pro-GC feel any less  
> happy
> if it were ARC instead? $100 says they probably wouldn't even know, and
> almost certainly wouldn't care.

I DON'T see a majority of newcomers raising an issue with the GC; I only
see it from newcomers with some pretty extreme latency requirements,
primarily the real-time crowd. The majority of newcomers aren't interested
in RT work. I think you're falling prey to confirmation bias here.

-- 
Adam Wilson
GitHub/IRC: LightBender
Aurora Project Coordinator

