Caching in druntime TypeInfo classes

H. S. Teoh via Digitalmars-d digitalmars-d at puremagic.com
Tue Jun 30 14:14:01 PDT 2015


While investigating:

	 https://issues.dlang.org/show_bug.cgi?id=4244

I found that the druntime function for computing the hash of static
arrays (this also applies to dynamic arrays, btw) is horrendously slow:
about 8-9 times slower than the equivalent operation on a POD struct of
the same size.
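
For illustration, a micro-benchmark along these lines should reproduce
the difference (hypothetical test code, not the code I actually used;
it assumes std.datetime.stopwatch.benchmark, which lives in
std.datetime in older releases):

    import std.datetime.stopwatch : benchmark;
    import std.stdio : writeln;

    struct Pod { int[16] data; }  // POD struct, same size as the array below

    int[16] arr;                  // 64-byte static array
    Pod pod;
    size_t sink;                  // keeps the hash calls from being optimized away

    void hashArray() { sink += typeid(arr).getHash(&arr); }
    void hashPod()   { sink += typeid(pod).getHash(&pod); }

    void main()
    {
        auto results = benchmark!(hashArray, hashPod)(1_000_000);
        writeln("static array: ", results[0]);
        writeln("POD struct:   ", results[1]);
    }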

The problem is caused by the call to hasCustomToHash() inside
getArrayHash() in object.d, which in turn calls getElement(), which
walks the TypeInfo chain until it reaches the first TypeInfo that is
not an array or typedef wrapper, in order to determine whether the
array elements have a custom toHash method. This walk is done *every
single time* the array is hashed, even though the result never changes
for a given array type.
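
Roughly, the shape of that walk looks like this (simplified, not the
exact druntime source):

    const(TypeInfo) getElement(const TypeInfo value) nothrow
    {
        auto element = cast() value;
        for (;;)
        {
            // peel off qualifier/enum/static-array wrappers, one per iteration
            if (auto qualified = cast(TypeInfo_Const) element)
                element = qualified.base;
            else if (auto redefined = cast(TypeInfo_Enum) element)
                element = redefined.base;
            else if (auto staticArray = cast(TypeInfo_StaticArray) element)
                element = staticArray.value;
            else
                return element;
        }
    }

    bool hasCustomToHash(const TypeInfo value) nothrow
    {
        const element = getElement(value);
        // a struct element has a custom hash iff it defines its own toHash
        if (auto s = cast(const TypeInfo_Struct) element)
            return s.xtoHash !is null;
        // arrays, AAs and classes always use their own hashing logic
        return cast(const TypeInfo_Array) element !is null ||
               cast(const TypeInfo_AssociativeArray) element !is null ||
               cast(const TypeInfo_Class) element !is null;
    }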

So I tried to modify getArrayHash() to cache this information in the
TypeInfo, but ran into some roadblocks: since TypeInfo instances are
supposed to be const, this operation is illegal. I could cast away
const, but that's a rather dirty hack. The other problem is that the
compiler hardcodes the size of each TypeInfo instance, so it will
refuse to compile object.d anyway if TypeInfo is expanded with an extra
field for caching the result of hasCustomToHash(). But since we would
have to modify the compiler anyway, my reaction was: why not have the
compiler compute this value itself? The compiler already has all the
information needed, so there is no need to wait until runtime. The only
drawback is adding more complexity to the compiler, making it harder
for other efforts like SDC to implement D.
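
To make the idea concrete, here is a self-contained toy sketch of the
caching approach (hypothetical names, not proposed druntime code); the
real thing would need the extra field on TypeInfo itself, which is
exactly what the hardcoded sizes currently prevent:

    import std.stdio : writeln;

    class FakeTypeInfo
    {
        byte cachedHasCustomToHash = -1;  // -1 = not computed yet

        // stand-in for the expensive getElement() walk
        bool computeHasCustomToHash() const nothrow { return false; }
    }

    bool hasCustomToHashCached(const FakeTypeInfo ti) nothrow
    {
        // object.d only has const references, so writing the cache
        // requires casting const away (the dirty hack mentioned above)
        auto mutable = cast(FakeTypeInfo) ti;
        if (mutable.cachedHasCustomToHash < 0)
            mutable.cachedHasCustomToHash = ti.computeHasCustomToHash() ? 1 : 0;
        return mutable.cachedHasCustomToHash == 1;
    }

    void main()
    {
        auto ti = new FakeTypeInfo;
        writeln(hasCustomToHashCached(ti));  // does the walk once, caches
        writeln(hasCustomToHashCached(ti));  // served from the cache
    }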

What do you guys think? Should hasCustomToHash() be cached somehow in
object.d? Or is caching a poor solution, and should we do something
else instead?


T

-- 
Why is it that all of the instruments seeking intelligent life in the
universe are pointed away from Earth? -- Michael Beibl

