Automatic typing

Steven Schveighoffer schveiguy at yahoo.com
Fri Jun 28 07:02:07 PDT 2013


On Fri, 28 Jun 2013 02:51:39 -0400, JS <js.mdnq at gmail.com> wrote:

> On Friday, 28 June 2013 at 00:48:23 UTC, Steven Schveighoffer wrote:
>> On Thu, 27 Jun 2013 20:34:53 -0400, JS <js.mdnq at gmail.com> wrote:
>>
>>> Would it be possible for a language(specifically d) to have the  
>>> ability to automatically type a variable by looking at its use cases  
>>> without adding too much complexity? It seems to me that most compilers  
>>> already can infer type mismatches, which would allow them to handle  
>>> stuff like:
>>>
>>> void main()
>>> {
>>>    auto x;
>>>    auto y;
>>>    x = 3;   // x is an int, same as auto x = 3;
>>>    y = f(); // y is the same type as what f() returns
>>>    x = 3.9; // x is really a float, no mismatch with previous type(int)
>>> }
>>>
>>> in this case x and y's type is inferred from future use. The compiler  
>>> essentially just lazily infers the variable type. Obviously ambiguity  
>>> will generate an error.
>>>
>>
>> There are very good reasons not to do this, even if possible.   
>> Especially if the type can change.
>>
>> Consider this case:
>>
>> void foo(int);
>> void foo(double);
>>
>> void main()
>> {
>> auto x;
>> x = 5;
>> foo(x);
>>
>> ....
>> // way later down in main
>> x = 6.0;
>> }
>>
>> What version of foo should be called?  By your logic, it should be the  
>> double version, but looking at the code, I can't reason about it.  I  
>> have to read the whole function, and look at every usage of x.  auto  
>> then becomes a liability, and not a benefit.
>>
>
> says who? No one is forcing you to use it with an immediate inference.  
> If you get easily confused then simply declare x as a double in the  
> first place!

This is already defined:

auto x = 5;

To change the meaning of that would be unnecessarily confusing.

> Most of the time a variable's type is well known to the programmer. That  
> is, the programmer has some idea of the type a variable is to take on.  
> Having the compiler infer the type is tantamount to figuring out what  
> the programmer had in mind, in most cases this is rather easy to do...  
> in any ambiguous case an error can be thrown.

It's not the programmer I'm worried about.  It's the maintainer/reviewer.

>> Coupling the type of a variable with sparse usages is going to be  
>> extremely confusing and problematic.  You are better off declaring the  
>> variable as a variant.
>>
>
> If you are confused by the usage then don't use it. Just because for  
> some programmers in some cases it is bad does not mean that it can't be  
> useful to some programmers in some cases.

But I use auto all the time, and I don't want its meaning to change.

> Some programmers want to dumb down the compiler because they themselves  
> want to limit all potential risk... What's amazing is that many times  
> the features they are against does not have to be used in the first  
> place.

As you have the compiler infer more and more, you lose its ability to  
statically detect errors.  This is the point of having a statically typed  
language.

If you want loosey-goosey semantics, you can use PHP.

> If you devise an extremely convoluted example then simply use a unit  
> test or define the type explicitly. I don't think limiting the compiler  
> feature set for the lowest common denominator is a way to develop a  
> powerful language.

"Simply" is an inaccurate description.  Maintain any large project in PHP  
for some time and you will know what I mean.

> You say using a variant type is better off, how? What is the difference  
> besides performance? An auto type without immediate type inference  
> offers all the benefits of static typing with some of those from a  
> variant type...

It declares up front: "this can change type mid-function".  I don't have to  
read the whole function to guess its type; it's a variant.

> Since it seems you are not against variant then why would you be against  
> a static version, since it actually offers more safety?

Variant is reasonable.  It allows you to specify that you don't care about  
the type.  And anything that takes variant can do the same.

But in your scheme, the type is NOT "I don't care"; it is basically defined  
by the compiler.  Good luck discovering what it is.  I foresee a lot of  
pragma(msg, typeof(x)) being put into that code.

> In fact, my suggestion could simply be seen as an optimization of a  
> variant type.
>
> e.g.,
>
> variant x;
> x = 3;
>
>
> the compiler realizes that x can be reduced to an int type and sees the  
> code as
>
> int x;
> x = 3;
>
> Hence, unless you are against variants and think they are evil(which  
> contradicts your suggestion to use it), your argument fails.

My argument is that auto should be left the way it is.  I don't want it to  
change.  And variant already does what you want, with less confusing  
semantics; there is no reason to add another feature.

-Steve


More information about the Digitalmars-d mailing list