The "no gc" crowd

PauloPinto pjmlp at progtools.org
Wed Oct 9 00:39:32 PDT 2013


On Wednesday, 9 October 2013 at 07:29:30 UTC, Manu wrote:
> On 9 October 2013 15:23, PauloPinto <pjmlp at progtools.org> wrote:
>
>> On Wednesday, 9 October 2013 at 05:15:53 UTC, Manu wrote:
>>
>>> On 9 October 2013 08:58, ponce <contact at gmsfrommars.fr> wrote:
>>>
>>>  On Tuesday, 8 October 2013 at 22:45:51 UTC, Adam D. Ruppe 
>>> wrote:
>>>>
>>>>
>>>>> Eh, not necessarily. If it expands to static 
>>>>> assert(!__traits(hasAnnotationRecursive, uses_gc));, then the
>>>>> only ones that *need* to be marked are 
>>>>> the lowest
>>>>> level ones. Then it figures out the rest only on demand.
>>>>>
>>>>> Then, on the function you care about as a user, you say 
>>>>> nogc and it
>>>>> tells
>>>>> you if you called anything and the static assert stacktrace 
>>>>> tells you
>>>>> where
>>>>> it happened.
>>>>>
>>>>> Of course, to be convenient to use, phobos would need to 
>>>>> offer
>>>>> non-allocating functions, which is indeed a fair amount of 
>>>>> work, but
>>>>> they
>>>>> wouldn't *necessarily* have to have the specific attribute.
>>>>>
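(A rough sketch of the idea above, with heavy caveats: hasAnnotationRecursive 
is hypothetical and not a real __traits key, so a plain user-defined attribute 
plus std.traits.hasUDA stands in for it here, and only direct calls are 
checked rather than the whole call graph.)

    // Sketch only: usesGC and checkNoGC are illustrative names, not a
    // compiler feature; a real @nogc would walk the call graph recursively.
    import std.traits : hasUDA;

    enum usesGC;                    // marker UDA for allocating primitives

    @usesGC void gcAppend(ref int[] a) { a ~= 1; }  // allocates via the GC
    int add(int a, int b) { return a + b; }         // no allocation

    void checkNoGC(alias fn)()
    {
        static assert(!hasUDA!(fn, usesGC),
                      fn.stringof ~ " is marked @usesGC");
    }

    void main()
    {
        checkNoGC!add();          // compiles
        // checkNoGC!gcAppend();  // would fail with a static assert
    }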
>>>>>
>>>> But is it even necessary? There isn't a great deal of 
>>>> evidence that
>>>> someone interested in optimization will be blocked on this 
>>>> particular
>>>> problem, like Peter Alexander said.
>>>>
>>>> GC hassle is quite common but not that big a deal:
>>>> - Manu: "Consequently, I avoid the GC in D too, and never 
>>>> had any major
>>>> problems, only inconvenience."
>>>> http://www.reddit.com/r/programming/comments/1nxs2i/the_state_of_rust_08/ccnefe7
>>>>
>>>> - Dav1d: never had a GC problem with BRala 
>>>> (Minecraft client)
>>>> - Me: I had a small ~100ms GC pause in one of my games every 
>>>> 20 minutes;
>>>> more often than not I don't notice it
>>>>
>>>> So a definitive written rebuttal we can link to would perhaps 
>>>> be helpful.
>>>>
>>>>
>>> I might just add that while my experience has been that I 
>>> haven't had any
>>> significant technical problems when actively avoiding the GC, 
>>> the
>>> inconvenience is considerably more severe than I made out in 
>>> that post (I
>>> don't want to foster public negativity).
>>> But it is actually really, really inconvenient. If that's my 
>>> future with
>>> D,
>>> then I'll pass, just as any unbiased 3rd party would.
>>>
>>> I've been simmering on this issue ever since I took an 
>>> interest in D. At
>>> first I was apprehensive to accept the GC, then cautiously 
>>> optimistic that
>>> the GC might be okay. But I have seen exactly no movement in 
>>> this area as
>>> long as I've been following D, and I have since reverted to a 
>>> position in
>>> absolute agreement with the C++ users. I will never accept 
>>> the GC in its
>>> current form for all of my occupational requirements; it's 
>>> implicitly
>>> non-deterministic, and offers very little control over 
>>> performance
>>> characteristics.
>>> I've said before that until I can time-slice the GC, and it 
>>> does not stop
>>> the world, it doesn't satisfy my requirements. I see 
>>> absolutely no
>>> motion towards that goal.
>>> If I were one of those many C++ users evaluating D for 
>>> long-term adoption
>>> (and I am!), I'm not going to invest the future of my career 
>>> and industry
>>> in a complete question mark which, given years of watching 
>>> already, is
>>> clearly going nowhere.
>>> As far as the GC is concerned, with respect to realtime 
>>> embedded software,
>>> I'm out. I've completely lost faith. And it's going to take 
>>> an awful lot
>>> more to restore my faith than it would have before.
>>>
>>> What I want is an option to replace the GC with ARC, just 
>>> like Apple did.
>>> Clearly they came to the same conclusion, probably for 
>>> exactly the same
>>> reasons.
>>> Apple have a policy of silky smooth responsiveness throughout 
>>> the OS and
>>> the entire user experience. They consider this a sign of 
>>> quality and
>>> professionalism.
>>> As far as I can tell, they concluded that non-deterministic 
>>> GC pauses were
>>> incompatible with their goal. I agree.
>>> I think their experience should be taken very seriously. They 
>>> have a
>>> successful platform on weak embedded hardware, with about a 
>>> million
>>> applications deployed.
>>>
>>> I've had a lot of conversations with a lot of experts, plenty 
>>> of
>>> conversations at dconf, and nobody could even offer me a 
>>> vision for a GC
>>> that is acceptable.
>>> As far as I can tell, nobody I talked to really thinks a GC 
>>> that doesn't
>>> stop the world, which can be carefully scheduled/time-sliced 
>>> (ie, an
>>> incremental, thread-local GC, or whatever), is even possible.
>>>
>>> I'll take ARC instead. It's predictable, easy for all 
>>> programmers who
>>> aren't experts on resource management to understand, and I 
>>> have DIRECT
>>> control over its behaviour and timing.
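(For a concrete taste of the determinism being asked for, here is a minimal 
sketch using Phobos' std.typecons.RefCounted: the resource is released at a 
known point, when the last copy goes out of scope, rather than at some future 
collection.)

    import std.stdio : writeln;
    import std.typecons : RefCounted;

    struct Resource
    {
        int id;
        ~this() { writeln("released ", id); }  // runs deterministically
    }

    void main()
    {
        auto a = RefCounted!Resource(1);
        {
            auto b = a;      // copying bumps the count to 2
        }                    // b leaves scope, count drops back to 1
        writeln("still alive");
    }                        // count hits 0 here; destructor runs immediately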
>>>
>>> But that's not enough; offering convenience while trying to 
>>> avoid using
>>> the
>>> GC altogether is also very important. You should be able to 
>>> write software
>>> that doesn't allocate memory. It's quite hard to do in D 
>>> today. There's
>>> plenty of opportunity for improvement.
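(A small illustration of that inconvenience, assuming nothing beyond Phobos: 
formatting a message without touching the GC means trading std.format.format 
for the buffer-based sformat and taking on the buffer's lifetime yourself.)

    import std.format : sformat;   // formats into a caller-supplied buffer

    void main()
    {
        // The convenient route, string s = format("%s ms", 16.6), allocates
        // a fresh string on the GC heap on every call.

        // The GC-free route: write into a fixed buffer that the caller owns.
        char[64] buf;
        const(char)[] msg = sformat(buf[], "frame took %s ms", 16.6);
        // msg is just a slice of buf, so nothing was allocated, but its
        // lifetime is now entirely the programmer's problem.
    }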
>>>
>>> I'm still keenly awaiting a more-developed presentation of 
>>> Andrei's
>>> allocators system.
>>>
>>
>>
>>
>> Apple dropped the GC and went with ARC instead, because they never 
>> managed to
>> make it work properly.
>>
>> It was full of corner cases, and the application could crash 
>> if those
>> cases were not fully taken care of.
>>
>> Of course the PR message is "We dropped GC because ARC is 
>> better" and not
>> "We dropped GC because we failed".
>>
>> Now having said this, of course D needs a better GC as the 
>> current one
>> doesn't fulfill the needs of potential users of the language.
>>
>
> Well, I never read that article apparently... but that's 
> possibly even more
> of a concern if true.
> Does anyone here REALLY believe that a bunch of volunteer 
> contributors can
> possibly do what Apple failed to do with their squillions of 
> dollars and
> engineers?
> I haven't heard anybody around here propose the path to an 
> acceptable
> solution. It's perpetually in the too-hard basket, hence we 
> still have the
> same GC as ever, and it's going nowhere.

I already provided that information in another discussion thread 
a while ago,

http://forum.dlang.org/post/cntjtnvnrwgdoklvznnw@forum.dlang.org

It is easy for developers outside the Objective-C world to believe 
the ARC PR, without knowing what happened on the battlefield. :)

--
Paulo

