Shouldn't invalid references like this fail at compile time?
Steven Schveighoffer
schveiguy at yahoo.com
Thu Jan 25 16:50:19 UTC 2018
On 1/24/18 9:46 PM, Walter Bright wrote:
> On 1/23/2018 7:22 PM, Jonathan M Davis wrote:
>> We need to do that anyway for the overly large
>> objects (and unfortunately don't, last I heard).
>
> I put a limit in at one time for struct/class sizes to prevent this
> issue, but got a lot of pushback on it and it was reverted.
>
> Perhaps we can revisit that, and allow large structs/classes only in
> non-@safe code.
>
> In general, though, if you don't have struct/class object sizes larger
> than the protected memory at null, you're safe with null dereferences.
You don't need to ban them from @safe code; what you need to do is
determine whether the field being accessed lies beyond the zero page
(accesses inside the zero page are what trigger the segfault). If it
does, either read from the first byte of the struct (to force the
segfault if the base address is in there), or verify the struct's
address is not within the zero page.
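To illustrate, here is a minimal sketch (assuming a 4096-byte protected
page at address 0; the struct layout and the guards shown are
illustrative, not what the compiler actually emits):

struct Huge
{
    ubyte[8000] pad; // pushes the next field past the protected page
    int field;       // field.offsetof is about 8000, beyond the page
}

int readField(Huge* p) @safe
{
    // p.field loads from the address in p plus Huge.field.offsetof.
    // If p is null, that address is about 8000, outside the protected
    // page, so the load does NOT segfault. The compiler could guard it
    // by first touching the struct's first byte, which segfaults when
    // p is null:
    //     cast(void) *cast(ubyte*) p;
    // or by checking the address directly:
    //     assert(cast(size_t) p >= 4096);
    return p.field;
}

void main()
{
    auto h = new Huge;
    h.field = 42;
    assert(readField(h) == 42);
}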
We recently removed the assert for a null this from all member
functions. Perhaps, for structs that are large, we could add that check
back in @safe code.
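The removed check was essentially assert(this !is null) at the top of
member functions. A sketch of reinstating it only for oversized types
(shown for a class; the explicit assert and the size threshold are
illustrative, since the real check would be compiler-inserted, and for
a struct the analogous guard would test the struct's address instead):

class Big
{
    ubyte[8000] pad; // fields extend well past the protected page
    int field;

    final int get() @safe
    {
        // compiler-inserted guard (sketch): fail cleanly on a null
        // reference instead of silently reading unprotected memory
        assert(this !is null, "null this");
        return field;
    }
}

void main()
{
    Big b;      // class references default to null
    // b.get(); // would trip the sketched assert
}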
-Steve