[Bug 51] String cast overrides the char type of decorated string literals.

d-bugmail at puremagic.com d-bugmail at puremagic.com
Sun Mar 19 02:30:21 PST 2006


http://d.puremagic.com/bugzilla/show_bug.cgi?id=51





------- Comment #2 from daiphoenix at lycos.com  2006-03-19 04:30 -------
(In reply to comment #1)
> This might seem confusing, but is the correct behaviour.
> http://www.digitalmars.com/d/arrays.html
> # The type of a string is determined by the semantic phase of
> # compilation. The type is one of: char[], wchar[], dchar[], and is
> # determined by implicit conversion rules. If there are two equally
> # applicable implicit conversions, the result is an error. To
> # disambiguate these cases, a cast is appropriate:
> #
> # cast(wchar [])"abc"   // this is an array of wchar characters
> Thomas

That behaviour is correct for undecorated string literals, but not for decorated
string literals (i.e. those with postfix 'c', 'w', or 'd').
http://www.digitalmars.com/d/lex.html#stringliteral says: "The optional
Postfix character gives a specific type to the string, rather than it being
inferred from the context. ..."
Thus postfix-decorated string literals should not have their type overridden
by casts or other contextual influences, no?
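To make the point concrete, a minimal sketch (period D syntax; the identifier names are illustrative only). The first line is the documented disambiguation case from arrays.html; the second is the questionable one, where the 'c' postfix already fixes the literal's type as char[] yet the cast still changes it:

```d
void main()
{
    // Undecorated literal: type is inferred, and the cast legitimately
    // disambiguates it to wchar[] (the behaviour arrays.html documents).
    wchar[] a = cast(wchar[])"abc";

    // Decorated literal: the 'c' postfix gives it the specific type char[],
    // yet the cast overrides that postfix instead of being rejected --
    // this is the behaviour this report questions.
    wchar[] b = cast(wchar[])"abc"c;
}
```

If the postfix is meant to pin the type regardless of context, the second cast would be expected to behave as a conversion from an already-typed char[] (or be an error), not to retype the literal.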

