WhatsApp BO critical security vulnerability
H. S. Teoh
hsteoh at quickfur.ath.cx
Thu May 16 16:41:29 UTC 2019
On Thu, May 16, 2019 at 09:17:45AM -0700, Walter Bright via Digitalmars-d wrote:
> On 5/15/2019 6:19 PM, Exil wrote:
> > Wouldn't be surprised if it had something to do with data received
> > over the network. I always see people write code with assumptions
> > that the data will be valid. A good assumption would be that it
> > can't be trusted.
>
> Using asserts and relying on array bounds checking to check the
> validity of incoming data is incorrect.
>
> Asserts and bounds checking are for detecting bugs in the program's
> logic.
>
> Scrubbing input data for correctness is normal program behavior, and
> should never be disabled.
Yes, that's what Exil meant by "A good assumption would be that it can't
be trusted".
Far too often I've seen code that declares a static buffer of 1024 or so
bytes, then proceeds to read lines from stdin without checking the line
length -- the (totally unwarranted) assumption being that the user
wouldn't type a line that long. That has led to countless buffer
overflow bugs. It's a security exploit waiting to happen.
But people keep doing it because in C, bounds checking is troublesome,
and many stdlib functions have APIs that don't even accept a bounds
argument (gets, strcpy, sprintf). These two plus natural human laziness
equal buffer overflows and security exploits galore (too lazy to check
bounds, too much work to write utility functions with safer APIs instead
of just using the unsafe stdlib ones).
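To make the pattern concrete, here's a minimal C sketch of the mistake described above and the bounded alternative (read_line is an illustrative name, not a stdlib function):

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

/* The UNSAFE pattern: gets() copies the entire input line into buf with
 * no bounds check, so any line longer than 1023 bytes overflows it.
 * gets() was removed from the language in C11 for exactly this reason.
 *
 *     char buf[1024];
 *     gets(buf);          // buffer overflow waiting to happen
 *
 * The bounded alternative: fgets() never writes more than size-1 bytes
 * plus a terminating NUL, truncating over-long lines instead of
 * overflowing. */
static char *read_line(char *buf, size_t size, FILE *in) {
    return fgets(buf, (int)size, in);
}
```

The difference is purely in the API contract: fgets is told how big the buffer is, gets is not.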
What we need is a language with the right incentives: bounds checking is
the default and it takes effort to turn it off (should that ever become
necessary), and the standard library should always have APIs that take
bounds. These two then put human laziness on the right side of the
fence -- you're safe by default, and disincentivised from writing unsafe
APIs in place of the standard, safe ones.
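Even within C, a bounds-taking API can make the safe call the easy call. A hedged sketch (bounded_copy is an illustrative name; the contract mirrors BSD's strlcpy, which is not in ISO C):

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

/* Copy at most dstsize-1 bytes of src into dst and always
 * NUL-terminate. Return the full source length, so the caller can
 * detect truncation by comparing the return value against dstsize. */
static size_t bounded_copy(char *dst, size_t dstsize, const char *src) {
    size_t srclen = strlen(src);
    if (dstsize > 0) {
        size_t n = srclen < dstsize - 1 ? srclen : dstsize - 1;
        memcpy(dst, src, n);
        dst[n] = '\0';      /* overflow is impossible by construction */
    }
    return srclen;
}
```

Because the capacity is a required parameter, forgetting the bounds check is no longer the path of least resistance.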
Vetting input is a harder problem, though. It takes work to verify
input, no matter how you cut it. The only incentive I can think of
to prod people in the right direction is some kind of tainting scheme
like in Perl, where it's an error to operate on data that hasn't been
vetted. But even then, the incentives are still not quite right -- the
tainting mechanism becomes an annoyance to be rid of, which encourages
doing the absolute minimum to get it to pass rather than putting in the
effort to do it right. I don't have a good answer for this one.
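For illustration, the taint idea can be approximated even in C by convention (all names here are hypothetical; C can't enforce this the way Perl's runtime does, since nothing stops a caller reaching into the struct):

```c
#include <assert.h>
#include <ctype.h>
#include <stddef.h>

/* Wrapper marking data as unvetted. Code agrees never to pass
 * t.data directly to ordinary string-taking functions. */
typedef struct { const char *data; } tainted_str;

static tainted_str taint(const char *raw) {
    tainted_str t = { raw };
    return t;
}

/* The only sanctioned way back to a plain char*: every byte must pass
 * the validator, otherwise the data stays tainted and NULL is returned. */
static const char *untaint_if(tainted_str t, int (*ok)(int)) {
    for (const char *p = t.data; *p; ++p)
        if (!ok((unsigned char)*p))
            return NULL;   /* vetting failed */
    return t.data;
}
```

This mirrors the incentive problem described above: the discipline works only as long as programmers don't route around the validator to silence it.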
T
--
EMACS = Extremely Massive And Cumbersome System