<div dir="ltr"><div class="gmail_extra"><div class="gmail_quote">On 4 February 2014 10:49, Adam Wilson <span dir="ltr"><<a href="mailto:flyboynw@gmail.com" target="_blank">flyboynw@gmail.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<div class="im">On Mon, 03 Feb 2014 16:24:52 -0800, NoUseForAName <<a href="mailto:no@spam.com" target="_blank">no@spam.com</a>> wrote:<br>
<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
On Monday, 3 February 2014 at 23:00:23 UTC, woh wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
ur right I never thought of that, I bet all them game devs never thought of it either, they so dumb. I bet they never tried to use a GC, what fools! Endless graphs of traced objects, oh yes oh yes! It only runs when I allocate, oh what a fool I've been, please castigate me harder!<br>
</blockquote>
<br>
>>
>> Also people should consider that Apple (unlike C++ game devs) did not have a tradition of contempt for GC. In fact they tried GC *before* they switched to ARC. The pro-GC camp always likes to pretend that the anti-GC camp is simply ignorant, rejecting GC out of prejudice rather than experience, but Apple rejected GC based on experience.
>>
>> GCed Objective-C did not allow them to deliver the user experience they wanted (on mobile) because of the associated latency issues, so they switched to automatic reference counting. It is not in question that ref counting sacrifices throughput (compared to an advanced GC), but for interactive, user-facing applications latency is much more important.
>
> That may be the case, but StackOverflow shows that ARC hasn't been a panacea in Apple land either. Way too many people don't understand ARC and how to use it, and subsequently beg for help understanding heisenleaks and weak references. ARC places a higher cognitive load on the programmer than a GC does. And Android runs just fine with GC'ed apps, but the ARC guys don't want to talk about Google's successes there.

I'd have trouble disagreeing more; Android is the essence of why Java should never be used for user-facing applications.
Android is jerky and jittery, has random pauses and lockups all the time, and games on Android always jitter and drop frames. Most high-end games on Android are now written in C++ as a means of mitigating that problem, but then you're back to writing C++. Yay!
iOS is silky smooth by comparison to Android.
I'm sure this isn't entirely attributable to the GC, or to Java in general, but Android can't possibly be held up as an example of success. Precisely the opposite, if anything. Games on Android make the brains of gamedevs who care about smooth interactivity bleed.
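
To put the latency argument from the quoted post in concrete D terms: the real difference is *when* the cost of reclaiming memory gets paid. Here is a rough sketch, purely for illustration (the RcThing and GcThing types are made up), contrasting std.typecons.RefCounted with a GC-owned class:

import std.typecons : RefCounted;
import core.stdc.stdio : puts; // C I/O: does not allocate from the D GC, so it
                               // is safe to call even inside a class finalizer

struct RcThing                 // hypothetical payload, purely for illustration
{
    int id;
    ~this() { puts("RcThing released right here, at a known point"); }
}

class GcThing
{
    ~this() { puts("GcThing finalized whenever a collection happens to run"); }
}

void main()
{
    {
        auto r = RefCounted!RcThing(42);
    } // destructor fires at this brace: a small, predictable cost paid inline

    auto g = new GcThing();
    g = null; // unreachable now, but only reclaimed during some future collection,
              // which is exactly the frame hitch game code worries about
}

Neither side is free, of course: the ref-counted side pays a little on every copy and release (the throughput cost the quoted post concedes), while the GC side batches all of that work into pauses it schedules itself.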
<div><br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div class="im">
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
You can do soft-real time with GC as long as the GC is incremental (D's is not) and you heavily rely on object reuse. That is what I am doing with LuaJIT right now and the frame rates are nice and constant indeed. However, you pay a high price for that. Object reuse means writing additional code, makes things more complex and error-prone, which is why your average app developer does not do it.. and should not have to do it.<br>
<br>
Apple had to come up with a solution which does not assume that the developers will be careful about allocations. The performance of the apps in the iOS app store are ultimately part of the user experience so ARC is the right solution because it means that your average iOS app written by Joe Coder will not have latency issues or at least less latency issues compared to any GC-based solution.<br>
<br>
I think it is an interesting decision for the D development team to make. Do you want a language which can achieve low latency *if used carefully* or one which sacrifices maximal throughput performance for less latency issues in the common case.<br>
<br>
I see no obvious answer to that. I have read D has recently been used for some server system at Facebook, ref counting usually degrades performance in that area. It is no coincidence that Java shines on the server as a high performance solution while Java is a synonym for dog slow memory hog on the desktop and mighty unpopular there because of that. The whole Java ecosystem from the VM to the libraries is optimized for enterprise server use cases, for throughput, scalability, and robustness, not for making responsive GUIs (and low latency in general) or for memory use.<br>
<br>
</blockquote>
<br></div>
Ahem. Wrong. See: WinForms, WPF, Silverlight. All extremely successful GUI toolkits that are not known for GC related problems. I've been working with WPF since 2005, I can say the biggest performance problem with it by far is the naive rendering of rounded corners, the GC has NEVER caused a hitch.</blockquote>

On a modern multi-GHz PC with many cores, many GB of RAM (most of which is unallocated), a hardware virtual memory manager, and a mature desktop OS.
Computers come in all shapes and sizes. D is positioned as a systems language, last time I checked... or else I don't know what I'm doing here.
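
For what it's worth, the "object reuse" described above usually amounts to nothing more exotic than a pool that is filled once and recycled, plus keeping the per-frame code allocation-free so the collector has no reason to run. A minimal sketch of that pattern in D (the Particle type and the pool size are made up; this illustrates the idea rather than being production code):

// Preallocate everything up front and recycle it, so the steady-state frame
// loop performs no GC allocations and therefore never triggers a collection.
class Particle
{
    float x = 0, y = 0, vx = 0, vy = 0;
    bool alive = false;
}

struct ParticlePool
{
    Particle[] items;

    this(size_t capacity)
    {
        items = new Particle[](capacity);     // the only GC allocation
        foreach (ref p; items) p = new Particle();
    }

    // Hand out a dead particle instead of allocating a fresh one.
    Particle acquire()
    {
        foreach (p; items)
            if (!p.alive) { p.alive = true; return p; }
        return null;                          // pool exhausted: cope, don't allocate
    }

    void release(Particle p) { p.alive = false; }
}

void main()
{
    import core.memory : GC;

    auto pool = ParticlePool(4096);           // pay the allocation cost once, at load time

    foreach (frame; 0 .. 1000)                // stand-in for the real game loop
    {
        auto p = pool.acquire();
        if (p !is null)
        {
            p.x += p.vx;                      // ... simulate, render, etc. ...
            pool.release(p);
        }
        // No allocations happened above, so the GC has no reason to run here.
    }

    GC.collect();                             // if a collection is ever needed, force it
                                              // at a moment you choose (a loading screen),
                                              // not in the middle of a frame
}

That extra bookkeeping is exactly the "additional code, more complex and error-prone" cost the quoted post complains about, which is why it is fair to say D can hit soft real-time today only *if used carefully*.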
<div><br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div class="im">
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
If D wants to be the new Java GC is the way to go, but no heap allocation happy GCed language will ever challenge C/C++ on the desktop.<br>
<br>
</blockquote>
<br></div>
>
> So that's why nearly every desktop app (for Windows at least, but that's the overwhelming majority) that started development since .NET came out is written in C#?

I don't think people write C# because it has a GC. People write C# because it is productive and awesome: it has an amazing dev infrastructure and dev environment, well-integrated GUI toolkits, a debugger that works brilliantly, excellent docs, etc.
Correlation does not imply causation.
I know lots of people who love C#, and even write games in it, but who criticise the GC as its biggest flaw.

I'm not saying there aren't lots of people that love it, and for C#'s intended market it makes perfect sense. I don't think D's market is C#'s market. If it were, I would be a happy C# developer, and I never would have given D a moment's notice.