Converting int to dchar?

Johannes Loher via Digitalmars-d-learn digitalmars-d-learn at puremagic.com
Sun Jul 31 14:58:21 PDT 2016


Am 31.07.2016 um 23:46 schrieb Seb:
> On Sunday, 31 July 2016 at 21:31:52 UTC, Darren wrote:
>> Hey, all.
>>
>> I'm pretty much a programming novice, so I hope you can bear with me. 
>> Does anyone know how I can change an int into a char equivalent?
>>
>> e.g.
>> int i = 5;
>> dchar value;
>> ?????
>> assert(value == '5');
>>
>> If I try and cast it to dchar, I get messed up output, and I'm not
>> sure how to use toChars (if that can accomplish this).
>>
>> I can copy+paste the little exercise I'm working on if that helps?
>>
>> Thanks in advance!
> 
> Ehm how do you want to represent 1_000 in one dchar?
> You need to format it, like here.
> 
>     import std.format : format;
>     assert("%d".format(1_000) == "1000");
> 
> Note that you get an array of dchars (=string), not a single one.

An immutable array of dchars is a dstring, not a string (which is an
immutable array of chars). It is true, however, that you should not
convert to dchar but to string (or dstring, if you want UTF-32, though I
see no real reason for that when you are only dealing with numbers),
for the reason mentioned above. Another solution for this would
be to use "to":

import std.conv : to;

void main()
{
    int i = 5;
    string value = i.to!string; // to!string converts the int to its decimal string representation
    assert(value == "5");
}
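
If you do want UTF-32 text, the same conversion also works with dstring; a minimal sketch:

import std.conv : to;

void main()
{
    int i = 1_000;
    dstring value = i.to!dstring; // same as above, but yields immutable(dchar)[]
    assert(value == "1000"d);
}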

If you know that your int only has one digit and you really want it as a
char, you can always use value[0], as in the sketch below.
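
A minimal sketch of that single-digit case (assuming a non-negative, single-digit value, so the first code unit is the digit itself):

import std.conv : to;

void main()
{
    int i = 5;
    string value = i.to!string;
    dchar c = value[0]; // a single ASCII digit, so the first code unit is the whole character
    assert(c == '5');
}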


