string vs. w/char*
Steven Schveighoffer
schveiguy at yahoo.com
Mon Feb 28 04:58:43 PST 2011
On Mon, 28 Feb 2011 07:34:39 -0500, Tyro[a.c.edwards] <nospam at home.com>
wrote:
> The below code attempts to use LoadStringA() to initialize _buf.
> However, regardless of what form _buf takes, the body of the if
> statement is always executed. I've attempted to use every type of string
> available in D to include char* _buf[MAX_RESSTRING+1] and setting
> _buf[MAX_RESSTRING] = '\0'; What am I doing incorrectly?
> Any assistance is greatly appreciated.
>
> class ResString
> {
>     enum { MAX_RESSTRING = 255 }
>
>     alias getBuffer this;
>     @property string getBuffer() { return _buf; }
>
>     this(HINSTANCE hInst, int resId)
>     {
>         _buf.length = MAX_RESSTRING;
>
>         SetLastError(0);
>
>         if(!LoadStringA(hInst, resId, cast(char*)toStringz(_buf),
>                         _buf.length + 1))
>         {
>             throw new WinException("Load String failed");
>         }
>     }
>
> private:
>     string _buf;
> }
You should not be overwriting _buf; it is immutable (a string's contents
cannot legally be written through, even via a cast). You need to allocate a
new mutable buffer each time and convert it afterwards:
this(HINSTANCE hInst, int resId)
{
    auto mybuf = new char[MAX_RESSTRING];
    SetLastError(0); // clear first, so a 0 return can be diagnosed
    auto nchars = LoadStringA(hInst, resId, mybuf.ptr, cast(int) mybuf.length);
    if(!nchars)
    {
        throw new WinException("Load String failed");
    }
    _buf = assumeUnique(mybuf[0 .. nchars]); // from std.exception
}
If this isn't working, consider that the string you are trying to load may
not actually exist in the module's string table (that is a valid condition).
What is the error from GetLastError?
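To see the actual Windows error code, you can fold GetLastError into the
exception message. A minimal sketch, assuming core.sys.windows.windows
declares LoadStringA/SetLastError/GetLastError, and with loadResString as a
purely illustrative free-function name:

```d
import core.sys.windows.windows;
import std.exception : assumeUnique;
import std.string : format;

string loadResString(HINSTANCE hInst, int resId)
{
    auto mybuf = new char[256];
    SetLastError(0); // so a 0 return from LoadStringA is distinguishable
    auto nchars = LoadStringA(hInst, resId, mybuf.ptr, cast(int) mybuf.length);
    if(!nchars)
    {
        // e.g. ERROR_RESOURCE_NAME_NOT_FOUND means the string id is
        // simply not present in the executable's string table.
        throw new Exception(format("LoadStringA failed, GetLastError = %s",
                                   GetLastError()));
    }
    // The buffer has no other references, so casting to immutable is safe.
    return assumeUnique(mybuf[0 .. nchars]);
}
```

That error number, looked up against the Win32 system error codes, usually
tells you immediately whether the resource id is wrong or the call itself
is failing.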
-Steve
More information about the Digitalmars-d-learn mailing list