Why typedefs shouldn't have been removed :(

Jonathan M Davis jmdavisProg at gmx.com
Thu May 17 02:38:13 PDT 2012


On Thursday, May 17, 2012 11:16:08 Mehrdad wrote:
> Ugh, ran into a problem again...
> 
> I was hoping I could do type deduction for a write() function I
> had, based on whether the input is 'size_t' (in which case it'd
> be hexadecimal) or 'uint' (in which case it'd be decimal), but
> nope, that doesn't work. :(
> 
> Any chance we'll be able to distinguish aliases like these
> sometime?

The compiler _might_ be made to output aliases in error messages along with 
what they're aliased to, but the whole point of aliases is that they are 
_not_ their own type. So, it wouldn't make any sense to distinguish them 
with type deduction and the like. As it stands, as far as the compiler is 
concerned, there is _zero_ difference between size_t and ulong on 64-bit, or 
between size_t and uint on 32-bit. It's effectively the same as a search and 
replace. You obviously want something _other_ than an alias, but since size_t 
uses an alias, that's the way it is, and that's the way it's bound to stay.
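
As a rough illustration (the Hex wrapper type below is made up for the 
example, not anything in Phobos), this is what the alias behavior means in 
practice, and what a genuinely distinct type would look like:

import std.stdio : writefln;

// On 32-bit, where size_t is an alias for uint, these two overloads have
// the *same* signature, so the module fails to compile; on 64-bit they do
// compile, but the size_t overload is really just a ulong overload:
//
//     void write(size_t value) { writefln("%x", value); }
//     void write(uint value)   { writefln("%d", value); }

// Illustration only: wrapping the value in a struct creates a distinct
// type the compiler can tell apart, which is roughly what the removed
// typedef used to give you.
struct Hex { size_t value; }

void write(Hex h)      { writefln("%x", h.value); } // hexadecimal
void write(uint value) { writefln("%d", value); }   // decimal

void main()
{
    write(Hex(255)); // prints ff
    write(255u);     // prints 255
}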

- Jonathan M Davis
