Kernel buffer overflow exposes iPhone 11 Pro to radio based attacks

Bruce Carneal bcarneal at gmail.com
Wed Dec 9 08:29:58 UTC 2020


On Wednesday, 9 December 2020 at 01:23:37 UTC, Paul Backus wrote:
>
> The problem with this is that there is existing *correct* D 
> code that relies on "no attributes" meaning @system, and which 
> would silently become incorrectly-annotated if the default were 
> changed. For example, there are many external @system functions 
> in the D runtime that do not have an explicit @system 
> annotation.
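
(For reference, the quoted hazard looks something like the 
following sketch; the declaration and names are hypothetical:)

```d
// An extern declaration has no body for the compiler to verify.
// Today "no attributes" means @system, so @safe callers are rejected.
// Under @safe-by-default, an unannotated declaration like this would
// be presumed safe with nothing actually checked.
extern (C) void fill_buffer(void* dst, size_t n);

void user() @safe
{
    ubyte[16] buf;
    // Error today: @safe function `user` cannot call @system `fill_buffer`.
    // fill_buffer(buf.ptr, 64); // would be accepted if the unannotated
    //                           // declaration defaulted to @safe
}
```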

IIUC, such functions in existing .o files and libs would not be 
designated (mangled) @safe, so I'd expect linker errors, not 
silence.  New compilations will have the source body and will, of 
course, reject non-@safe code, so, again, not silent.  What have 
I misunderstood?  What is the "silent" problem?  Is there some 
transitive issue?

Note: @safe designation should be part of the external mangle of 
any future defaulted-and-verified-@safe function.  I don't see 
how it works otherwise.
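
A note on the mangling point: D's ABI already encodes safety 
attributes into symbol names (@safe as "Nf", @trusted as "Ne"; 
@system adds no code), which is what would make link-time 
detection possible.  A minimal sketch:

```d
// D already mangles safety attributes into symbol names
// (@safe -> "Nf", @trusted -> "Ne"; @system adds nothing),
// the distinction link-time detection would rely on.
@safe    void checkedFn() {}
@trusted void vouchedFn() {}
@system  void uncheckedFn() {}

pragma(msg, checkedFn.mangleof);   // contains "FNf" (safe)
pragma(msg, vouchedFn.mangleof);   // contains "FNe" (trusted)
pragma(msg, uncheckedFn.mangleof); // plain "F...Z" signature
```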

>
> You flip the switch, your tests pass, and then months or years 
> later, you discover that a memory-corruption bug has snuck its 
> way into your @safe code, because no one ever got around to 
> putting an explicit @system annotation on some external 
> function deep in one of your dependencies. How would you react? 
> Personally, I'd jump ship to Rust and never look back.

How do your tests pass?  How does the code even compile?  If the 
default moves from lax (@system) to strict (@safe), I can see how 
a lot of code that formerly compiled would stop compiling or 
linking, an ongoing concern if the DIP were edited and 
re-introduced, but I don't see how you get bugs "sneaking" in or 
lying dormant.  Absent explicit greenwashing by the programmer, 
how do the bugs sneak in?
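
To be concrete about what I mean by greenwashing: the sanctioned 
way to smuggle unverified operations into @safe code is an 
explicit @trusted label, e.g. (hypothetical code):

```d
// "Greenwashing": an explicit @trusted promise the compiler
// takes on faith.  This is the escape hatch a dormant bug
// would need someone to have written deliberately.
@trusted void zeroAt(int[] a, size_t i)
{
    // Dropping to raw pointers bypasses bounds checking; if i is
    // out of range this corrupts memory, yet the function stays
    // callable from @safe code because of the @trusted label.
    *(a.ptr + i) = 0;
}

void caller() @safe
{
    auto a = new int[](4);
    zeroAt(a, 2);  // fine
    zeroAt(a, 40); // compiles; memory corruption at run time
}
```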

>
> Of course, you can try to argue that it's the fault of the 
> library maintainer for not realizing that they need to 
> re-review all of their external function declarations--but why 
> should they have to, when the compiler can just as easily flag 
> those functions automatically? Isn't the whole reason we have 
> automatic memory-safety checks in the first place to *avoid* 
> relying on programmer discipline for this kind of thing?

Well, @safe by default is about as 
automatic/not-relying-on-discipline as it gets.  Unless annotated 
otherwise, every function with a source body is checked at 
compile time and flagged if it cannot be verified @safe.  Extern 
declarations against old object files and libs should flag errors 
at link time.
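
For completeness, the compile-time side of that checking looks 
like this today; under a flipped default only the annotation 
would become implicit:

```d
// With @safe (explicit today, implicit under the proposed
// default), unverifiable operations are compile-time errors,
// not latent bugs.
void checked() @safe
{
    auto p = new int;
    *p = 1;  // fine: dereference of a valid pointer is checked-safe
    // ++p;                        // Error: pointer arithmetic
    //                             // not allowed in @safe code
    // auto q = cast(int*) 0xDEAD; // Error: unsafe cast in @safe code
}
```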

Again I feel that I must be missing something.  What "programmer 
discipline" are you referring to?



More information about the Digitalmars-d mailing list