Arbitrary Size Integer Arrays

Don nospam at nospam.com
Tue Sep 22 00:17:01 PDT 2009


dsimcha wrote:
> I'm thinking it might be useful to have a library type that allows
> arbitrary-size integers <= 64 bits (128 if ucent is ever implemented) to be
> packed into an array.  This might be nice when you want a huge array of them,
> the size you actually need is somewhere in between two native sizes, and you
> care more about space/cache efficiency than raw CPU cycles.  For example,
> let's say you wanted an array of N 19-bit integers.  You'd make an array of
> uints (32-bit) with ceil(N * 19 / 32) elements, and when it was indexed, you'd
> do a bunch of modulus and bit shifting to get the right 19 bits out of that
> block of memory, and return a uint.  It would be slow as molasses, but really
> space efficient, which would in certain cases be a reasonable tradeoff.
> 
> Has anyone implemented, or considered implementing this before?  If not, if I
> implemented it well and submitted it as an enhancement, would it be a
> reasonable thing to include in std.array?
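
For concreteness, indexing in such a scheme might look roughly like
this. A minimal sketch only: PackedIntArray and its members are
illustrative names, not a proposed std.array API, and there's no
bounds checking.

struct PackedIntArray
{
    uint[] blocks;   // packed backing store
    uint bits;       // bits per element, assumed 1 <= bits <= 32
    size_t length;

    this(size_t n, uint bitsPerElement)
    {
        bits = bitsPerElement;
        length = n;
        // ceil(n * bits / 32) blocks, plus one spare so an element
        // straddling the last block can be read via a 64-bit window.
        blocks = new uint[(n * bits + 31) / 32 + 1];
    }

    uint opIndex(size_t i)
    {
        immutable size_t bitPos = i * bits;
        immutable size_t word = bitPos / 32;
        immutable uint shift = bitPos % 32;
        // Read a 64-bit window so an element crossing a 32-bit
        // boundary needs no special case.
        immutable ulong window = blocks[word]
            | (cast(ulong) blocks[word + 1] << 32);
        return cast(uint)((window >> shift) & ((1UL << bits) - 1));
    }

    void opIndexAssign(uint value, size_t i)
    {
        immutable size_t bitPos = i * bits;
        immutable size_t word = bitPos / 32;
        immutable uint shift = bitPos % 32;
        immutable ulong mask = ((1UL << bits) - 1) << shift;
        ulong window = blocks[word]
            | (cast(ulong) blocks[word + 1] << 32);
        window = (window & ~mask)
            | ((cast(ulong) value << shift) & mask);
        blocks[word] = cast(uint) window;
        blocks[word + 1] = cast(uint)(window >> 32);
    }
}

Usage would be the obvious:

    auto a = PackedIntArray(1_000_000, 19);
    a[42] = 123_456;
    assert(a[42] == 123_456);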

A couple of comments:
(1) Many of the uses I can imagine for such a structure would have
locality of reference. If you provide slice access (e.g., give me
elements [a..b] as an array of ints), then you can get reasonable
performance. Unpacking consecutive elements can be done quite
efficiently (it's an interesting optimisation problem, though!); a
rough sketch follows.
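
Something along these lines, say (a sketch, reusing the illustrative
PackedIntArray above; a tuned version would carry the 64-bit window
across iterations instead of reloading it, and would specialise on
the bit width):

// Unpack elements [a .. b) of p into dst.
void extract(ref PackedIntArray p, size_t a, size_t b, uint[] dst)
{
    assert(dst.length >= b - a);
    immutable ulong mask = (1UL << p.bits) - 1;
    size_t bitPos = a * p.bits;   // running bit cursor
    foreach (k; 0 .. b - a)
    {
        immutable size_t word = bitPos / 32;
        immutable uint shift = bitPos % 32;
        immutable ulong window = p.blocks[word]
            | (cast(ulong) p.blocks[word + 1] << 32);
        dst[k] = cast(uint)((window >> shift) & mask);
        bitPos += p.bits;
    }
}

Per element that's two adjacent 32-bit loads, a shift and a mask,
which is why slice access can be reasonably fast when the data is
touched sequentially.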

(2) A floating-point version of this was discussed at some point. The 
number of exponent and significand bits would be specified. Again, you'd 
extract an array of such beasts into an array of floats or doubles.
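
Expanding one such value might look like this (a sketch, assuming an
IEEE-style layout with 1 sign bit, eBits exponent bits with the
usual bias, and sBits significand bits; eBits <= 8 and sBits <= 23;
denormals, infinities and NaNs are ignored for brevity):

float expand(uint raw, uint eBits, uint sBits)
{
    immutable uint sign = (raw >> (eBits + sBits)) & 1;
    immutable uint exp  = (raw >> sBits) & ((1u << eBits) - 1);
    immutable uint frac = raw & ((1u << sBits) - 1);
    // Rebias to float's 8-bit exponent and widen the significand
    // to float's 23 bits.
    immutable uint bias = (1u << (eBits - 1)) - 1;
    uint fbits = (sign << 31)
        | ((exp - bias + 127) << 23)
        | (frac << (23 - sBits));
    return *cast(float*) &fbits;
}

E.g. with 5 exponent and 10 significand bits (a half-precision
layout), expand(0x3C00, 5, 10) gives 1.0f. Packing a float back down
is the same dance in reverse, plus a rounding decision.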


