Issues with constants, and inout (was Real World usage of D, Today)
Frits van Bommel
fvbommel at REMwOVExCAPSs.nl
Sat Jan 27 04:59:34 PST 2007
kris wrote:
> Andrei Alexandrescu (See Website For Email) wrote:
>> kris wrote:
>> [about implicit conversion rules]
>>
>>> extern (C) int printf (char*, ...);
>>>
>>> class Foo
>>> {
>>>     void write (int x) {printf("int\n");}
>>>     void write (uint x) {printf("uint\n");}
>>>     void write (long x) {printf("long\n");}
>>>     void write (char x) {printf("char\n");}
>>>     void write (wchar x) {printf("wchar\n");}
>>>     void write (double x) {printf("double\n");}
>>>
>>>     void write (char[] x) {printf("char[]\n");}
>>>     void write (wchar[] x) {printf("wchar[]\n");}
>>> }
>>>
>>> void main()
>>> {
>>>     auto foo = new Foo;
>>>
>>>     foo.write ('c');
>>>     foo.write (1);
>>>     foo.write (1u);
>>>     foo.write (3.14);
>>>     //foo.write ("asa");
>>> }
>>>
>>> prints:
>>>
>>> char
>>> int
>>> uint
>>> double
[snip]
>>> Now for the broken part. When you uncomment the string constant, the
>>> compiler gets all confused about whether it's a char[] or wchar[].
>>> There is no defaulting to one type, as there is for other constants
>>> (such as char). It /is/ possible to decorate the string constant in a
>>> similar manner to decorating integer constants:
>>>
>>> foo.write ("qwe"c);
>>>
>>> And this, of course, compiles. It's a PITA though, and differs from
>>> the rules for other constants.
>>
>> I talked to Walter about this and it's not a bug, it's a feature :o).
>> Basically it's hard to decide what to do with an unadorned string when
>> both wchar[] and char[] would want to "attract" it. I understand
>> you're leaning towards defaulting to char[]? Then probably others will
>> be unhappy.
>
> You'll have noticed that the constant 'c' defaults to /char/, and that
> there's no compile-time conflict between the write(char) & write(wchar)?
> Are people unhappy about that too? Perhaps defaulting of char constants
> and int constants should be abolished also?
It's a bit more complicated with character literals than just defaulting
to 'char':
-----
import std.stdio;

void main() {
    writefln(typeid(typeof('a')));          // c <= \u007f
    writefln(typeid(typeof('\uabcd')));     // c <= \uffff
    writefln(typeid(typeof('\U000abcde'))); // c <= \U0010ffff
}
-----
outputs:
"""
char
wchar
dchar
"""
So it defaults to the *smallest* character type that can hold it in one
element.
Pretty cool, actually.
This also applies to other types, by the way. If you type an integer
literal that won't fit into an 'int', it'll be a 'long' constant
(assuming it fits), not an 'int' constant.
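A quick check along the same lines as the snippet above (untested, but it
follows the documented literal-typing rules; the two values are exactly
int.max and one past it):
-----
import std.stdio;

void main() {
    writefln(typeid(typeof(2147483647))); // int  -- exactly int.max
    writefln(typeid(typeof(2147483648))); // long -- one past int.max
}
-----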
Perhaps we should do something similar with string literals, defaulting
to an array of the smallest character type that can hold all of the
characters in the string (i.e. the maximum "character type" of the
component characters)?
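To make that concrete, here's a rough sketch of which element type the
rule would pick; the helper and its name are purely illustrative, not
anything the compiler does today:
-----
import std.stdio;

// Hypothetical helper: returns the name of the smallest character
// type whose elements can each hold one character of the string.
char[] proposedType(dchar[] s) {
    char[] t = "char[]";
    foreach (dchar c; s) {
        if (c > 0xFFFF) return "dchar[]"; // beyond the BMP: needs dchar
        if (c > 0x7F)   t = "wchar[]";    // beyond ASCII: needs wchar
    }
    return t;
}

void main() {
    writefln(proposedType("hello"d));       // char[]
    writefln(proposedType("h\uabcd"d));     // wchar[]
    writefln(proposedType("h\U000abcde"d)); // dchar[]
}
-----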
That seems like a reasonable rule. And it has the added benefit that
"some string constant".length will always be the number of characters in
the string.
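For instance, with today's explicit suffixes (U+ABCDE takes four UTF-8
code units and a UTF-16 surrogate pair, so only the dchar[] version has
one element per character):
-----
import std.stdio;

void main() {
    // The same one-character string in each encoding:
    writefln("\U000abcde"c.length); // 4 (UTF-8 code units)
    writefln("\U000abcde"w.length); // 2 (UTF-16 surrogate pair)
    writefln("\U000abcde"d.length); // 1 (one dchar per character)
}
-----
Under the proposed rule the unadorned literal would get the last typing
here, so .length would indeed count characters.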