Why can't std.algorithm.sort be applied to char[]?

John Colvin via Digitalmars-d-learn digitalmars-d-learn at puremagic.com
Wed May 14 02:01:22 PDT 2014


On Wednesday, 14 May 2014 at 08:27:46 UTC, monarch_dodra wrote:
> On Monday, 12 May 2014 at 18:44:22 UTC, Jonathan M Davis via 
> Digitalmars-d-learn wrote:
>> Sure, you can cast char[] to ubyte[] and sort that if you know
>> that the array only holds pure ASCII. In fact, you can use
>> std.string.representation to do it - e.g.
>>
>> auto ascii = str.representation;
>>
>> and if str were mutable, then you could sort it. But that will
>> only work if the string contains nothing but ASCII characters.
>> Regardless, he wanted to know why he couldn't sort char[], and
>> I explained why - all strings are treated as ranges of dchar,
>> so if their element type is char or wchar, they're not random
>> access and thus can't be sorted.
>
> Arguably, a smart enough implementation should know how to sort
> a "char[]" while still preserving code point integrity.
>
> As a matter of fact, the built-in "sort" property does it:
>
> import std.stdio;
>
> void main()
> {
>     char[] s = "éöeèûà".dup;
>     s.sort;
>     writeln(s);
> }
> //prints:
> eàèéöû
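
As an aside, the str.representation suggestion quoted above can be
spelled out as a runnable sketch (illustrative names; it assumes the
content is pure ASCII, since sorting the raw bytes would tear any
multi-byte sequence apart):

import std.algorithm : sort;
import std.stdio : writeln;
import std.string : representation;

void main()
{
    char[] str = "hello".dup;        // must be mutable to sort in place
    auto bytes = str.representation; // ubyte[] view of the same memory
    bytes.sort();                    // only safe for pure ASCII content
    writeln(str);                    // prints "ehllo"
}

Because representation returns a view of the same memory, sorting the
bytes rearranges str itself.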

Why would anyone ever want to sort code-points?

They might want to sort graphemes, but that's difficult to do 
in place (it needs O(n) extra memory, I think...). If 
out-of-place is good enough:

someStr.byGrapheme.array.sort();
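
Spelled out as a runnable sketch (untested; the explicit predicate
using std.algorithm.cmp is my assumption for ordering graphemes by
their code point sequences):

import std.algorithm : cmp, map, sort;
import std.array : array, join;
import std.conv : to;
import std.stdio : writeln;
import std.uni : byGrapheme;

void main()
{
    auto someStr = "éöeèûà";
    // Decode into graphemes, sort a copy of them (hence the O(n)
    // extra memory), then glue the pieces back into a string.
    auto sorted = someStr.byGrapheme
                         .array
                         .sort!((a, b) => cmp(a[], b[]) < 0)
                         .map!(g => g[].to!string)
                         .join;
    // Each grapheme in this input is a single code point, so this
    // prints "eàèéöû", the same as the code point sort above.
    writeln(sorted);
}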

