D 2015/2016 Vision?

bitwise via Digitalmars-d digitalmars-d at puremagic.com
Wed Oct 7 05:56:29 PDT 2015


On Wednesday, 7 October 2015 at 07:24:03 UTC, Paulo Pinto wrote:
> On Tuesday, 6 October 2015 at 20:43:42 UTC, bitwise wrote:
>> On Tuesday, 6 October 2015 at 18:43:42 UTC, Jonathan M Davis 
>> wrote:
>>> On Tuesday, 6 October 2015 at 18:10:42 UTC, bitwise wrote:
>>>> On Tuesday, 6 October 2015 at 17:20:39 UTC, Jonathan M Davis 
>>>> wrote:
>>>> I'm not sure what else I can say. The example I posted says 
>>>> it all, and it can't be done properly in D (or C#, but why 
>>>> lower the bar because of their mistakes? ;)
>>>
>>> It's a side effect of having the lifetime of an object 
>>> managed by the GC. There's no way around that except to use 
>>> something else like manual memory management or reference 
>>> counting.
>>
>> You are literally repeating what I just said in different 
>> words.
>>
>>> in D, it's a good reason to use structs to manage resources 
>>> like that, and since most objects really have no need of 
>>> inheritance and have no business being classes, it's usually 
>>> fine.
>>
>> This is an opinion.
>>
>> I want polymorphism AND deterministic destruction, and the 
>> least you could do is admit that it's a downside that D doesn't 
>> have it, instead of trying to tell me that everything I know 
>> is wrong.
>>
>>> But in the cases where you do have to use a class, it can get 
>>> annoying.
>>
>> YES, it does, and it's not just an odd case here and there.
>>
>>> You simply do not rely on the GC or the destruction of the 
>>> object to free system resources. You manually call a function 
>>> on the object to free those resources when you're done with 
>>> it.
>>
>> I'm sorry, but I almost can't believe you're saying this.
>>
>> So, you're saying you want me to just revert to manual 
>> resource management and accept that huge resources like 
>> textures and such may just leak if someone doesn't use them 
>> right, or if an exception is thrown? In a language like D that 
>> is supposed to be safe?
>>
>>> In the case of C#, they have a construct to help with it that 
>>> (IIRC) is something like
>>>
>>> using(myObj)
>>> {
>>> } // myObj.Dispose() is called when exiting this scope
>>
>> For the THIRD time, I'll post my example:
>>
>> class Texture { }
>> class Texture2D : Texture {
>>     this()  { /* load texture... */ }
>>     ~this() { /* free texture */ }   // OOPS: when, if ever,
>>                                      // will this be called?
>> }
>>
>> Now, does this really seem like a realistic use case to you?
>>
>> using(Texture tex = new Texture2D) {
>>     // ...
>> }
>>
>
> That no, but this yes (at least in C#):
>
> using (LevelManager mgr = new LevelManager())
> {
>      //....
>      // Somewhere in the call stack
>      Texture tex = mgr.getTexture();
> }
> --> All level resources that require manual management are gone
> --> Ask the GC to collect the remaining memory right now
>
> If not level-wide, then maybe scene/section-wide.
>
> However, I do get that not all architectures are amenable to 
> being rewritten in a GC-friendly way.
>
> But the approach is similar to RAII in C++: reduce new to a 
> minimum and allocate via factory functions that work together 
> with handle-manager classes.
>
> --
> Paulo
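
If I'm reading that right, the suggestion amounts to roughly this 
in D (just a sketch; LevelManager, getTexture and release are 
invented names, not a real API):

class Texture   { void release() { /* free the GPU handle */ } }
class Texture2D : Texture { }

struct LevelManager
{
    private Texture[] textures;   // handles owned by the manager

    // factory: all texture allocation goes through the manager
    Texture getTexture()
    {
        auto tex = new Texture2D;
        textures ~= tex;
        return tex;
    }

    // struct destructor runs deterministically at end of scope,
    // so GPU resources are freed without waiting on the GC
    ~this()
    {
        foreach (tex; textures)
            tex.release();
        textures = null;
    }
}

void level()
{
    auto mgr = LevelManager();
    Texture tex = mgr.getTexture();
    // ...
}   // mgr goes out of scope; every texture it handed out is freed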

Still no ;)

It's a Texture. It's meant to be seen on the screen for a while, 
not destroyed in the same scope in which it was created.

In games though, we have a scene graph. When things happen, we 
often chip off a large part of it while the game is running, 
discard it, and load something new. When we're heavily constrained 
by memory, we need to know that what we just discarded has been 
destroyed completely before we start loading new stuff. And even 
in cases where we aren't that constrained by memory, we need 
to know things have been destroyed, period, for non-memory 
resources. Also, when using graphics APIs like OpenGL, we need 
control over which thread an object is destroyed in, because you 
can't access OpenGL resources from just any thread. Now, you 
could set up some queue where you send textures and so on off to 
be destroyed later on the right thread, but that's exactly the 
problem: it's complicated. Picture a Hello OpenGL app in D and 
the hoops some noob would have to jump through. It's bad news.
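
For the sake of argument, the kind of queue I'm talking about 
looks roughly like this (only a sketch, with made-up names):

import core.sync.mutex : Mutex;

// Dead objects enqueue their GL handles from whatever thread they
// happened to die on; the render thread drains the queue, because
// the actual GL delete calls are only legal there.
class GLDeleteQueue
{
    private uint[] pending;   // GL object names waiting for deletion
    private Mutex  mtx;

    this() { mtx = new Mutex; }

    // callable from any thread
    void enqueue(uint glName)
    {
        synchronized (mtx) pending ~= glName;
    }

    // called once per frame on the render thread
    void flush()
    {
        uint[] names;
        synchronized (mtx) { names = pending; pending = null; }
        foreach (name; names)
        {
            // glDeleteTextures(1, &name);  // real GL call goes here
        }
    }
}

And then every GL wrapper type has to route its cleanup through 
something like that instead of just freeing in its destructor.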

Also, I should add that a better example of the Texture thing 
would be a regular Texture and a RenderTexture. You can only draw 
to the RenderTexture, but you should be able to apply either one 
to a primitive for drawing. You need polymorphism for this. A 
struct will not do.
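
Something along these lines, roughly (again just a sketch, with 
the GL calls elided):

class Texture
{
    // either kind can be applied to a primitive for drawing
    void bind(uint unit) { /* glActiveTexture + glBindTexture */ }
}

class Texture2D : Texture
{
    this() { /* load pixels, upload to the GPU */ }
}

class RenderTexture : Texture
{
    // only this one can be used as a render target
    void bindAsTarget() { /* glBindFramebuffer */ }
}

// drawing code takes the base class, so both kinds work here
void drawPrimitive(Texture tex) { tex.bind(0); /* issue the draw */ }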

     Bit





