Why can't std.algorithm.sort be applied to char[]?

monarch_dodra via Digitalmars-d-learn digitalmars-d-learn at puremagic.com
Wed May 14 01:27:45 PDT 2014


On Monday, 12 May 2014 at 18:44:22 UTC, Jonathan M Davis via 
Digitalmars-d-learn wrote:
> Sure, you can cast char[] to ubyte[] and sort that if you know
> that the array only holds pure ASCII. In fact, you can use
> std.string.representation to do it - e.g.
>
> auto ascii = str.representation;
>
> and if str were mutable, then you could sort it. But that will
> only work if the string contains only ASCII characters.
> Regardless, he wanted to know why he couldn't sort char[], and
> I explained why - all strings are treated as ranges of dchar,
> so if their element type is char or wchar, they're not random
> access and thus can't be sorted.
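
For reference, a minimal runnable sketch of that representation-based
approach (assuming the string is mutable and holds only ASCII):

import std.algorithm : sort;
import std.stdio : writeln;
import std.string : representation;

void main()
{
     char[] s = "hello".dup;     // mutable and pure ASCII
     s.representation.sort();    // reinterpret as ubyte[] and sort in place
     writeln(s);                 // prints: ehllo
}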

Arguably, a smart enough implementation should know how to sort a 
"char[]", while still preserving codepoint integrity.

As a matter of fact, the built-in "sort" property does it.

void main()
{
     import std.stdio : writeln;

     char[] s = "éöeèûà".dup;
     s.sort;          // built-in array sort property, not std.algorithm.sort
     writeln(s);
}
//prints:
eàèéöû
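
To get the same codepoint-preserving result out of std.algorithm.sort,
one option is to decode into a dchar[] first (one code point per
element), sort that, and convert back if needed. A minimal sketch,
using std.conv.to for the transcoding:

import std.algorithm : sort;
import std.conv : to;
import std.stdio : writeln;

void main()
{
     char[] s = "éöeèûà".dup;
     dchar[] d = s.to!(dchar[]); // decode UTF-8: one code point per element
     d.sort();                   // dchar[] is random access, so this compiles
     writeln(d);                 // prints: eàèéöû
}

Re-encoding back to char[] (via d.to!(char[])) is omitted for brevity.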

