Smart pointers instead of GC?

Manu turkeyman at gmail.com
Mon Feb 3 17:51:10 PST 2014


On 4 February 2014 10:49, Adam Wilson <flyboynw at gmail.com> wrote:

> On Mon, 03 Feb 2014 16:24:52 -0800, NoUseForAName <no at spam.com> wrote:
>
>  On Monday, 3 February 2014 at 23:00:23 UTC, woh wrote:
>>
>>> You're right, I never thought of that. I bet all them game devs never
>>> thought of it either, they're so dumb. I bet they never tried to use a
>>> GC, what fools! Endless graphs of traced objects, oh yes oh yes! It only
>>> runs when I allocate, oh what a fool I've been, please castigate me harder!
>>>
>>
>> Also, people should consider that Apple (unlike C++ game devs) did not
>> have a tradition of contempt for GC. In fact, they tried GC *before* they
>> switched to ARC. The pro-GC camp always likes to pretend that the anti-GC
>> one is just ignorant, rejecting GC based on prejudice rather than
>> experience, but Apple rejected GC based on experience.
>>
>> GCed Objective-C did not allow them to deliver the user experience they
>> wanted (on mobile) because of the related latency issues, so they
>> switched to automatic reference counting. It is not in question that ref
>> counting sacrifices throughput (compared to an advanced GC), but for
>> interactive, user-facing applications latency is much more important.
>>
>>
> That may be the case, but StackOverflow shows that ARC hasn't been a
> panacea in Apple land either. Way too many people don't understand ARC and
> how to use it, and subsequently beg for help understanding heisenleaks and
> weak references. ARC places a higher cognitive load on the programmer than
> a GC does. And Android runs just fine with GC'ed apps, but the ARC guys
> don't want to talk about Google's successes there.
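
To make the failure mode concrete: the classic ARC leak is a retain
cycle, where two objects hold strong references to each other, so neither
refcount can ever reach zero. Here is a minimal sketch in C++, with
shared_ptr/weak_ptr standing in for ARC's strong/weak references (the
mechanics are the same; Parent/Child are illustrative names):

    #include <cstdio>
    #include <memory>

    struct Child;

    struct Parent {
        std::shared_ptr<Child> child;       // strong: Parent keeps Child alive
        ~Parent() { std::puts("Parent freed"); }
    };

    struct Child {
        // std::shared_ptr<Parent> parent;  // strong back-reference: cycle, leak
        std::weak_ptr<Parent> parent;       // weak back-reference breaks the cycle
        ~Child() { std::puts("Child freed"); }
    };

    int main() {
        auto p = std::make_shared<Parent>();
        p->child = std::make_shared<Child>();
        p->child->parent = p;               // weak_ptr: no refcount bump
    }   // both destructors run; with a strong back-reference neither would

A tracing GC collects such cycles automatically; spotting them yourself is
the cognitive load being referred to above.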


I'd have trouble disagreeing more; Android is the essence of why Java
should never be used for user-facing applications.
Android is jerky and jittery, has random pauses and lockups all the time,
and games on Android always jitter and drop frames. Most high-end games on
Android are now written in C++ as a means to mitigate that problem, but
then you're back writing C++. Yay!
iOS is silky smooth by comparison to Android.
I'm sure this isn't entirely attributable to the GC, or to Java in general,
but it can't possibly be used as an example of success. Precisely the
opposite, if anything. Games on Android make the brains of gamedevs who
care about smooth interactivity bleed.

>> You can do soft real-time with GC as long as the GC is incremental (D's
>> is not) and you rely heavily on object reuse. That is what I am doing
>> with LuaJIT right now, and the frame rates are nice and constant indeed.
>> However, you pay a high price for that. Object reuse means writing
>> additional code and makes things more complex and error-prone, which is
>> why your average app developer does not do it... and should not have to.
>>
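
To illustrate the object-reuse pattern being described, a minimal
free-list pool, sketched in C++ for concreteness (Particle/ParticlePool
are illustrative names; in a GC'ed language the shape is the same):

    #include <memory>
    #include <vector>

    struct Particle {
        float x = 0, y = 0, vx = 0, vy = 0;
    };

    // Recycle instances instead of allocating per frame, so the
    // steady-state loop produces no garbage for a collector to trace.
    class ParticlePool {
        std::vector<std::unique_ptr<Particle>> free_;
    public:
        std::unique_ptr<Particle> acquire() {
            if (free_.empty())
                return std::make_unique<Particle>(); // allocates only when cold
            auto p = std::move(free_.back());
            free_.pop_back();
            *p = Particle{};                         // reset to a clean state
            return p;
        }
        void release(std::unique_ptr<Particle> p) {
            free_.push_back(std::move(p));           // recycle, don't free
        }
    };

The acquire()/release() bookkeeping is precisely the additional,
error-prone code the average app developer should not have to write.
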
>> Apple had to come up with a solution which does not assume that
>> developers will be careful about allocations. The performance of the apps
>> in the iOS App Store is ultimately part of the user experience, so ARC is
>> the right solution: it means that your average iOS app written by Joe
>> Coder will not have latency issues, or at least fewer latency issues than
>> any GC-based solution.
>>
>> I think it is an interesting decision for the D development team to make.
>> Do you want a language which can achieve low latency *if used carefully*,
>> or one which sacrifices maximum throughput for fewer latency issues in
>> the common case?
>>
>> I see no obvious answer to that. I have read that D has recently been
>> used for some server system at Facebook; ref counting usually degrades
>> performance in that area. It is no coincidence that Java shines on the
>> server as a high-performance solution, while on the desktop it is a
>> synonym for a dog-slow memory hog, and mighty unpopular there because of
>> that. The whole Java ecosystem, from the VM to the libraries, is
>> optimized for enterprise server use cases, for throughput, scalability,
>> and robustness, not for making responsive GUIs (and low latency in
>> general) or for low memory use.
>>
>>
> Ahem. Wrong. See: WinForms, WPF, Silverlight. All extremely successful GUI
> toolkits that are not known for GC-related problems. I've been working with
> WPF since 2005, and I can say the biggest performance problem with it by far
> is the naive rendering of rounded corners; the GC has NEVER caused a hitch.


On a modern multi-GHz PC with many cores and many GB of RAM (most of which
is unallocated), a hardware virtual memory manager, and a mature RTOS.
Computers come in all shapes and sizes. D is positioned as a systems
language, last time I checked... or else I don't know what I'm doing here.

>> If D wants to be the new Java, GC is the way to go, but no
>> heap-allocation-happy GCed language will ever challenge C/C++ on the
>> desktop.
>>
>>
> So that's why nearly every desktop app (for Windows at least, but that's
> the overwhelming majority) that started development since .NET came out is
> written in C#?


I don't think people write C# because it has a GC. People write C# because
it is productive and awesome: it has amazing dev infrastructure, a great
dev environment, well-integrated GUI toolkits, a debugger that works
brilliantly, excellent docs, etc.
Correlation does not imply causation.
I know lots of people who love C#, and even write games in it, but
criticise the GC as its biggest flaw.

I'm not saying there aren't lots of people who love it, and for C#'s
intended market it makes perfect sense. I don't think D's market is C#'s
market. If it were, I would be a happy C# developer, and I would never have
given D a moment's notice.

