auto storage class - infer or RAII?

Walter Bright newshound at digitalmars.com
Sun Nov 12 13:38:32 PST 2006


Jim Hewes wrote:
> I hope you don't mind a comment from a lurker. :)

Not at all.

> I had assumed that there was some more important purpose for type inference, 
> such as that it was useful in generics or something. But if it's just to 
> avoid having to declare types, I wonder if it's really that important to 
> have.

I think I must have presented the case for it very poorly. The goal is 
to minimize bugs; avoiding having to declare types is a means to that 
goal, not the end itself.

> "Walter Bright" <newshound at digitalmars.com> wrote in message 
> news:ej6vb3$2s5m$1 at digitaldaemon.com...
>> What is the type of foo.length? Is it an int? uint? size_t? class Abc? 
>> Without type inference, I'd have to go look it up. With type inference, I 
>> know I've got the *correct* type for i automatically.
>>
> 
> However, you don't know what that type is. If you want to know, you need to 
> go look it up. This is especially true if you're working on someone else's 
> code. I think that's fine if you're working within an IDE that can tell you 
> the type. I'm not a fan of Hungarian notation. When working within Visual 
> Studio for example, there are multiple ways to find the type of a variable 
> easily. You can hover the mouse over it. You can right-click on it and 
> select "Go to Declaration". I think this obviates things like Hungarian 
> notation, and such tools would also help with discovering the types of 
> inferred variables.

If you're always viewing code by using such a tool, then discovering the 
type is trivial as you suggest. But even with such a tool available, 
lots of times you won't be using it to view the code.

And just to be clear, I am not advocating Hungarian notation in the 
sense the term usually carries (I do think, however, that such notation 
used in the manner Joel Spolsky describes is useful):
http://www.joelonsoftware.com/articles/Wrong.html


> But if you're using some editor like vi, then you may find it tedious and 
> time-consuming to always search for the types of variables. I would be more 
> likely to avoid type inference altogether and simply write the type just for 
> the sake of documenting it.

I'd posit that if type inference is possible, then the use site is 
probably the wrong place to document the type.


>> Now, suppose I change the type of foo.length. I've got to go through and 
>> check *every* use of it to get the types of the i's updated. I'll 
>> inevitably miss one and thereby have a bug.
> 
> I assume if you've made an assignment to the wrong variable type, the 
> compiler will catch this (unless there's an implicit cast).

That's just the problem: there is a lot of implicit casting going on. 
Consider .length - depending on the type it belongs to, it may be an 
int, a uint, a size_t, etc., and there are implicit casts between them. 
So if you 'document' it as int, there's a bug if it really should be 
size_t.

> At least the 
> compiler errors should help you find all occurrences easily.

Unfortunately, they don't, because those implicit casts compile 
silently. One could counter with "make all implicit casting illegal." 
Pascal tried that, and in my not-so-humble opinion that was a big reason 
for Pascal's failure, and it's the primary reason I dropped it as soon 
as I discovered C. Even so, who wants to go through their source, 
compiler error by compiler error, fixing it? If type inference can cut 
that work in half, why not?

> Although, you'd 
> still need to go through and manually change the type in each occurrence. I 
> just feel that making these kinds of changes to code is more the job of 
> development tools. Perhaps the line between language and tools can start to 
> get a little blurry; that's another discussion. Anyway, I just wonder if 
> there is a strong case for having type inference at all.

It is optional. The ability to explicitly type declarations is not going 
to go away.


