How about "auto" parameters?

Jonathan M Davis jmdavisProg at gmx.com
Tue Jun 7 09:48:01 PDT 2011


On 2011-06-07 09:01, foobar wrote:
> Andrei Alexandrescu Wrote:
> > On 6/7/11 7:11 AM, foobar wrote:
> > > I agree with Ary above and would also like to add that in the ML family
> > > of languages all the variables are also default auto typed: E.g.:
> > > fun add a b = a + b
> > > 
> > > 'add' would have the type 'a -> 'a -> 'a, and the type inference engine
> > > will also infer that 'a must provide the + operator. I feel that this
> > > is more natural than having a dedicated function template syntax.
> > 
> > I agree it would be nice to further simplify generic function syntax.
> > One problem with the example above is that the type deduction didn't go
> > all that well - it forces both parameter types to be the same so it
> > won't work with adding values of different types (different widths,
> > mixed floating point and integrals, user-defined +). In a language
> > without overloading, like ML, things are a fair amount easier.
> 
> ML is strictly typed, unlike C-like languages. This is a *good* thing and a
> feature, whereas C's implicit casts are a horrible hole in the language.
> Also, ML has only two numeric types: integers and floating point, so there
> is no short vs. long problem. Yes, both arguments will have the same type,
> but that is the correct default. When adding a floating-point value and an
> integral one, the user should be required to specify what kind of operation
> is intended: either the floating-point value is converted to an integral one
> (how? floor, round, etc.?) or the integral value is converted to floating
> point, which can cause a loss of precision. Although overloading complicates
> things, it doesn't make this impossible.
> 
> ML is explicit, but only in the correct places. C-like languages have
> shortcuts, but they're in the wrong places, where they hurt, and they're
> verbose elsewhere. I prefer to let the compiler infer types for me but to
> require me to be explicit about coercion, which is type-safe, rather than
> the reverse, which is both more verbose and less safe.
> 
> > > Better yet, instead of auto parameters, just make parameter types
> > > optional (as in ML) and let the compiler generate the template.
> > > 
> > > foo(a, b) { return a + b; } // will be lowered to:
> > > T foo(T) (T a, T b) { return a + b; }
> > > 
> > > Types are already inferred for delegate literals so why not extend this
> > > to regular functions too?
> > 
> > There are multiple issues. One is that we don't have Hindley-Milner
> > polymorphism; the D compiler doesn't so much "infer" types as "propagate"
> > them. Another is that such inference would make separate compilation
> > difficult.
> > 
> > 
> > Andrei
> 
> We don't have Hindley-Milner _yet_. I can hope, can't I?
> Again, difficult != impossible. AFAIK it is working in Nemerle, isn't it?

I don't think that it generally makes sense to _add_ Hindley-Milner type 
inference to a language. That's the sort of design decision you make when you 
initially create the language. It has _huge_ repercussions on how the language 
works. And D didn't go that route. That sort of choice is more typical of a 
functional language.
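
For what it's worth, the closest thing in current D to the quoted foo(a, b) 
suggestion is implicit function template instantiation, where the type 
parameters are deduced at the call site. A rough, untested sketch:

auto add(T)(T a, T b) { return a + b; }
// add(1, 2);    // fine: T is deduced as int
// add(1, 2.5);  // error: conflicting deductions for T (int vs. double)

// As Andrei points out, a single type parameter forces both arguments to the
// same type; mixed-type addition needs a second parameter:
auto add2(T, U)(T a, U b) { return a + b; }  // add2(1, 2.5) yields a double

What that doesn't give you is the up-front inference that the type must 
support +, which the ML example relies on; in D the constraint only surfaces 
when the template body fails to compile for a particular argument type (or 
when you spell it out in a template constraint).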

And while D might arguably allow too many implicit conversions, it allows 
fewer than C or C++. I actually would expect bugs due to implicit conversions 
to be fairly rare in D. And requiring more conversions to be explicit might 
actually make things worse, because casts would then be needed more often, 
and casts tend to hide all kinds of bugs. So, while it might be better to 
require casting in a few more places than D currently does, on the whole it 
works quite well.
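
To give a rough idea of the kind of conversions involved (a quick, untested 
sketch of my own; the error messages are paraphrased):

void main()
{
    int  i = 1000;
    long l = 10_000_000_000;     // too big for int, so the literal is a long

    // byte b1 = i;              // error: cannot implicitly convert int to byte
    // int  j  = l;              // error: cannot implicitly convert long to int
    // int  k  = 2.7;            // error: cannot implicitly convert double to int
    byte   b2 = cast(byte) i;    // compiles, but silently truncates 1000 to -24
    int    k2 = cast(int) 2.7;   // compiles, truncates toward zero to 2
    byte   b3 = 100;             // fine: the constant provably fits in a byte
    double d  = i;               // fine: widening int -> double stays implicit
}

C happily accepts the first three commented-out lines without a cast. The 
flip side is that the casts are exactly where the compiler stops checking, 
which is how casts end up hiding bugs.
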
And certainly, expecting a major paradigm shift at this point is unrealistic. 
Minor improvements may be added to the language, and perhaps even major 
backwards-compatible features, but for the most part, D is currently in the 
mode of stabilizing and completing the implementation of its existing feature 
set. It's _far_ too late in the game to introduce something like 
Hindley-Milner type inference, regardless of whether it would have been a 
good idea in the beginning (and honestly, given D's C and C++ roots, I very 
much doubt that it ever would have been a good idea to have Hindley-Milner 
type inference in it - that would have made for a _very_ different sort of 
language, which definitely wouldn't be D).

- Jonathan M Davis

