Question regarding Base64 decoding

Johannes Loher via Digitalmars-d-learn digitalmars-d-learn at puremagic.com
Sun Jul 31 03:07:00 PDT 2016


Currently, the function Base64.decode for decoding char[] to ubyte[] has
an in contract that should ensure that the supplied buffer is large
enough. However, it uses the function decodeLength, which does not give
the exact decoded length but only an upper bound (decodeLength can be up
to 2 bytes larger than realDecodeLength).
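
To make the mismatch concrete, here is a minimal sketch (assuming the
current std.base64 API; the numbers match my case of a 32 byte payload):

import std.base64 : Base64;
import std.stdio : writeln;

void main()
{
    ubyte[32] payload;                        // 32 bytes of raw data
    auto encoded = Base64.encode(payload[]);  // 44 chars incl. one '=' padding

    writeln(encoded.length);                       // 44
    writeln(Base64.decodeLength(encoded.length));  // 33 -- upper bound only
    writeln(Base64.decode(encoded).length);        // 32 -- the real length
}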

I understand that this is useful when decoding an input range, because
then it is (in general) not possible to use realDecodeLength, which
needs random access and opDollar. But it would be possible to use
realDecodeLength when decoding an array (in fact, it is used in the out
contract in that case).
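
For an array, the exact length can be computed from the trailing
padding, something along these lines (just a sketch of the idea to show
why random access and opDollar are needed, not necessarily how
realDecodeLength is actually implemented in Phobos):

// exact decoded length for valid, padded base64 input (hypothetical helper)
size_t exactDecodeLength(in char[] source)
{
    if (source.length == 0)
        return 0;
    size_t padding = 0;
    if (source[$ - 1] == '=')                      // needs opDollar...
        padding = (source[$ - 2] == '=') ? 2 : 1;  // ...and random access
    return source.length / 4 * 3 - padding;
}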

My problem with this is that I read base64 encoded data from a file,
and I know that the decoded data must be exactly 32 bytes, otherwise
the data is faulty and an exception should be raised. So naively, I
would just supply a 32 byte buffer to the decode function. But
decodeLength returns 33 for the encoded data, so when calling decode
with a 32 byte buffer, the in contract fails. The funny thing is that
with a release build (which removes all contracts), everything works
as expected.
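
In code, what I am doing looks roughly like this (simplified; the
workaround at the end is just what I do now to get around the contract,
not necessarily the intended usage):

import std.base64 : Base64;
import std.exception : enforce;

void main()
{
    // stand-in for the 44 chars I actually read from the file
    ubyte[32] payload;
    auto encoded = Base64.encode(payload[]);

    // what I would like to write -- the in contract rejects the
    // 32 byte buffer in a debug build, but it works with -release:
    //ubyte[32] buf;
    //auto decoded = Base64.decode(encoded, buf[]);

    // workaround: oversize the buffer to decodeLength (33) and
    // check the length of the returned slice instead
    auto tmp = new ubyte[](Base64.decodeLength(encoded.length));
    auto decoded = Base64.decode(encoded, tmp);
    enforce(decoded.length == 32, "faulty data");
}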

So my question is the following: Is there any specific reason for this
behaviour (which I can't see), or is this simply a "bug" and
realDecodeLength should be used in the in contract?
