Type safety could prevent nuclear war
Ola Fosheim Grøstad via Digitalmars-d
digitalmars-d at puremagic.com
Thu Feb 4 16:41:52 PST 2016
On Friday, 5 February 2016 at 00:14:11 UTC, tsbockman wrote:
> But it's 2016 and my PC has 32GiB of RAM. Why should a C
> compiler running on such a system skip safety checks just
> because they would be too expensive to run on some *other*
> computer?
C has to be backwards compatible, but I don't know why people do
larger projects in C in 2016.
Libraries are written in C for portability and because C provides
an FFI defined by the ABI that hardware and OS vendors specify.
BeOS tried to define a specific C++ compiler as their ABI, but it
was problematic.
C++ does not have a standard ABI; you cannot reliably link object
files from different C++ compilers. Java/C# are not system-level
languages.
So, basically, there is no suitable industry standard other than
C.
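As a concrete illustration, a minimal D sketch of what the C ABI
buys you in practice: restating the C signature with C linkage is
all it takes to call into a library built by any C compiler on the
platform.

    // puts() lives in the platform C library. extern (C) tells the
    // D compiler to use the C calling convention and the unmangled
    // symbol name, so the linker resolves it against any C-built
    // object file or shared library.
    extern (C) int puts(const(char)* s);

    void main()
    {
        puts("hello from D, through the platform's C ABI");
    }

No header translation or special runtime support is involved; the
calling convention and the symbol name are the whole contract.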
> This is already a solved problem in most other programming
> languages; there is no fundamental reason that the solutions
> used in D, C++, or Java could not be applied to C - without
> even changing any of the language semantics.
D and C++ change. C uses the ABI defined by the hardware/OS
vendor. It is set in stone: frozen, beyond discussion.
As mentioned, BeOS adopted C++. Apple has adopted Objective-C and
Swift. But how can you make _all_ the other vendors (Microsoft,
Google, IBM etc.) standardize on something that isn't C?
> Aliasing types like that can be useful sometimes, but only
> within certain limits. In particular, the size (with alignment
> padding) of the types in question must match, otherwise you
> will corrupt the stack.
I see where you are coming from, but I meant what I said
literally. Machine language only deals with bit patterns. When we
interface with machine language, we just add lots of constraints
on what we hand over to it. Adding _more_ constraints than the
creator of the machine language code intended is never wrong. Not
adding enough constraints is not ideal, but often difficult to
avoid if we care about performance.
So if I write a piece of machine language code and give you the
object file, you only have my word for what the input is supposed
to be. You then have to formulate the constraints in a way that
fits your use case and is expressible in your language. Different
languages have different levels of expressiveness for describing
and enforcing type constraints.
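To make that last point concrete, here is a minimal D sketch of
tightening the constraints on existing machine code: the C
runtime's exit() takes a plain int, but nothing stops a caller
from binding the same symbol with a narrower type of identical
size and alignment.

    import core.stdc.stdio : puts;

    // The machine code behind exit() only ever sees an int-sized
    // argument. Binding the symbol with an int-based enum adds a
    // constraint the machine code never expressed, without changing
    // the bit pattern it receives.
    enum ExitCode : int { success = 0, failure = 1 }

    extern (C) void exit(ExitCode code);  // same ABI as "void exit(int)"

    void main()
    {
        puts("exiting through a constrained binding of C's exit()");
        exit(ExitCode.success);
        // exit(42);  // rejected at compile time: 42 is not an ExitCode
    }

The object file does not change; only the caller's type system
gets stricter, which is the sense in which adding more constraints
than the author intended is never wrong.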