Reading dchar from UTF-8 stdin
Ali Çehreli
acehreli at yahoo.com
Wed Mar 16 13:22:48 PDT 2011
On 03/16/2011 02:52 AM, spir wrote:
> On 03/15/2011 11:33 PM, Ali Çehreli wrote:
>> Given that the input stream is UTF-8
[...]
>> Would you expect all of the bytes to be consumed when a dchar was used
>> instead?
>>
>> import std.stdio;
>>
>> void main()
>> {
>>     dchar code; // <-- now a dchar
>>     readf(" %s", &code);
>>     writeln(code); // <-- BUG: uses a code unit as a code point!
>> }
>
> Well, when I try to run that bit of code, I get an error in
> std.format.formattedRead (the line near the end, marked with "***" below).
I use dmd 2.052 on an Ubuntu 10.10 console, and the code compiles fine
for me. I know that there have been changes in formatted input and
output lately. Perhaps you are using an earlier version?
> *args[0] = unformatValue!(A)(r, spec); // ***
>> When the input is ö, the output now becomes Ã.
>>
>> What would you expect to happen?
>
> I would expect a whole code representing 'ö'.
I agree; just opened a bug report:
http://d.puremagic.com/issues/show_bug.cgi?id=5743
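For the curious, here is a minimal sketch (an illustration I am adding,
not code from the thread) of why Ã shows up: 'ö' is encoded in UTF-8 as
the two code units 0xC3 0xB6, so consuming only the first code unit and
widening it to dchar treats 0xC3 as the code point U+00C3, which is 'Ã'.

```d
import std.stdio;

void main()
{
    // "ö" is stored in UTF-8 as the two code units 0xC3 0xB6.
    string s = "ö";
    foreach (ubyte b; cast(const(ubyte)[]) s)
        writef("%02X ", b);    // prints: C3 B6
    writeln();

    // Taking only the first code unit and widening it to dchar
    // interprets 0xC3 as the code point U+00C3, i.e. 'Ã' --
    // the same mistake the buggy readf makes.
    dchar wrong = s[0];
    writeln(wrong);            // prints: Ã
}
```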
Ali