Why typedefs shouldn't have been removed :(

Jonathan M Davis jmdavisProg at gmx.com
Thu May 17 02:53:19 PDT 2012


On Thursday, May 17, 2012 02:38:13 Jonathan M Davis wrote:
> On Thursday, May 17, 2012 11:16:08 Mehrdad wrote:
> > Ugh, ran into a problem again...
> > 
> > I was hoping I could do a type deduction for a write() function I
> > had, based on whether the input is 'size_t' (in which case it'd
> > be hexadecimal) or 'uint' (in which case it'd be decimal), but
> > nope! that doesn't work. :(
> > 
> > Any chance we'll be able to distinguish aliases like these
> > sometime?
> 
> The compiler _might_ be made to output aliases in error messages along with
> what they're aliased to, but it's kind of the whole point of aliases that
> they _not_ be their own type. So, it wouldn't make any sense to distinguish
> them with type deduction and the like. As it stands, as far as the compiler
> is concerned, there is _zero_ difference between size_t and ulong on 64-bit
> and size_t and uint on 32-bit. It's effectively the same as a search and
> replace. You obviously want something _other_ than an alias, but since
> size_t uses an alias, that's the way it is, and that's the way that it's
> bound to stay.
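To make the "zero difference" point concrete, here is a minimal sketch of the situation Mehrdad describes (this `write` is a hypothetical stand-in, not std.stdio's): because size_t is a plain alias, the two overloads are only distinct on architectures where the alias happens to resolve to something other than uint.

```d
import std.stdio : writefln;

// Hypothetical write(): hex for size_t, decimal for uint. On 64-bit
// this compiles only because size_t is an alias for ulong there; on
// 32-bit, size_t *is* uint, so these are duplicate definitions and
// the module fails to compile.
void write(size_t value) { writefln("%x", value); }
void write(uint value)   { writefln("%d", value); }

void main()
{
    // size_t is nothing but an alias for the machine word type:
    static assert(is(size_t == ulong) || is(size_t == uint));

    write(cast(size_t) 255); // hex -- but only because ulong != uint here
    write(255u);             // decimal
}
```

There is no overload the compiler could dispatch on that says "came from the size_t alias"; the alias is gone by the time overload resolution happens.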

I'd point out though that if your problem is type deduction and compile-time 
reflection, the code that you're generating will be generated on the 
architecture that it's run on (unless you're generating .d files that you keep 
around for future builds), so the fact that size_t is a different size on a 
different machine is more or less irrelevant. You'll end up using whatever the 
size is for _that_ machine, which is all that you need. That it's not the same 
size as it might be on another machine shouldn't matter. Maybe you've found a 
use case where it does, but certainly in general, I don't see why it would.

It's actually _less_ of a problem with code generation, because the generator 
can emit whatever is correct for that architecture, whereas a programmer 
writing the code by hand can easily forget to use size_t and end up with code 
that doesn't compile properly on another architecture due to narrowing 
conversions and the like. Code generation just doesn't normally need size_t 
(if ever). It's the programmers writing code by hand who do.
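A small sketch of that point (genCounter is a made-up helper): a generator never has to know whether size_t is 32 or 64 bits. It just emits the name "size_t", and the alias resolves to the right width on whatever architecture the generated code is compiled on.

```d
// Hypothetical generator: emits a declaration as a string, to be
// mixed in. It names size_t without caring what size_t actually is.
string genCounter(string name)
{
    return "size_t " ~ name ~ " = 0;";
}

mixin(genCounter("hits")); // declares: size_t hits = 0;

void main()
{
    ++hits;
    // hits is exactly as wide as size_t on *this* machine:
    static assert(hits.sizeof == size_t.sizeof);
    assert(hits == 1);
}
```

Hand-written code has to remember to say size_t everywhere a size lives; generated code gets it for free, since the string is expanded and compiled on the target architecture.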

- Jonathan M Davis


More information about the Digitalmars-d mailing list