String created from buffer has wrong length and strip() result is incorrect

spir via Digitalmars-d-learn digitalmars-d-learn at puremagic.com
Fri Oct 17 01:30:53 PDT 2014


On 17/10/14 09:29, thedeemon via Digitalmars-d-learn wrote:
> On Friday, 17 October 2014 at 06:29:24 UTC, Lucas Burson wrote:
>
>>    // This is where things break
>>    {
>>       ubyte[] buff = new ubyte[16];
>>       buff[0..ATA_STR.length] = cast(ubyte[])(ATA_STR);
>>
>>       // read the string back from the buffer, stripping whitespace
>>       string stringFromBuffer = strip(cast(string)(buff[0..16]));
>>       // this shows strip() doesn't remove all whitespace
>>       writefln("StrFromBuff is '%s'; length %d", stringFromBuffer,
>> stringFromBuffer.length);
>>
>>       // !! FAILS. stringFromBuffer is length 15, not 3.
>>       assert(stringFromBuffer.length == strip(ATA_STR).length);
>
> Unlike C, strings in D are not zero-terminated by default; they are just arrays,
> i.e. a pair of pointer and length. You create an array of 16 bytes and cast it to
> string, so now you have a 16-char string. You fill the first few chars with data
> from ATA_STR, but the remaining 10 bytes of the array are still part of the
> string; they were never initialized with data, so they hold zeroes. Since this
> tail of zeroes is not whitespace (tabs, spaces, etc.), 'strip' doesn't remove it.

Side-note: since your string has those zeroes at the end, strip only removes the 
space at the start (hence the final length of 15) instead of trimming both ends.
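
For what it's worth, here is a minimal sketch of one way around it (the thread 
doesn't show ATA_STR's actual value, so " foo  " below is just a stand-in): cut 
the buffer off at the first NUL before casting it to string, so strip only sees 
the real text.

    import std.algorithm.searching : countUntil;
    import std.stdio : writefln;
    import std.string : strip;

    void main()
    {
        enum ATA_STR = " foo  ";          // stand-in value, not from the thread

        ubyte[] buff = new ubyte[16];     // ubyte.init == 0, so the tail is zeroes
        buff[0 .. ATA_STR.length] = cast(ubyte[]) ATA_STR;

        // Cut the buffer at the first NUL before treating it as a string,
        // then strip() removes the surrounding whitespace as expected.
        auto nul = buff.countUntil(0);
        size_t len = (nul < 0) ? buff.length : cast(size_t) nul;
        string stringFromBuffer = strip(cast(string) buff[0 .. len]);

        writefln("StrFromBuff is '%s'; length %d", stringFromBuffer,
                 stringFromBuffer.length);    // StrFromBuff is 'foo'; length 3
        assert(stringFromBuffer.length == strip(ATA_STR).length);
    }

If the buffer can legitimately contain embedded zeroes, slicing with the known 
data length (buff[0 .. ATA_STR.length]) instead of searching for the first NUL 
is the safer choice.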

d


