[OT] Move semantics in a nutshell
Timon Gehr
timon.gehr at gmx.ch
Mon Nov 10 12:42:10 UTC 2025
On 11/10/25 01:49, monkyyy wrote:
> On Sunday, 9 November 2025 at 23:41:28 UTC, Timon Gehr wrote:
>> Invariants are also not saying "the program crashes sometimes", they
>> are saying "the program only uses these specific combinations of bit
>> patterns".
>
> You claimed my ubyte example was "an invariant",
It's an example of relying on a lot of invariants to establish a few
more invariants.
> using all 2^8 of ubytes
> space isn't a *specific* combination. Your using math language and logic
> here, you can say "the empty set is a set" but nah that aint real
> english 0 be a special case, if I handle all possible bit patterns my
> "invariants" drop to be `assert(1==1,"math stopped working")`type
> things; if your disk runs out of memory, or cosmic rays corrupt the os,
> or or or or or or; these airnt my problems anymore.
That's the point of invariants... Bit patterns that are not used are not
your problem anymore. Your program and so much else is just bits in
memory too.
> I write
> `opIndex(int)` I think "hmmm what are the possible `int`s" and then I
> handle those cases,
In your example you also had to think about "hmmm, what are the possible
`T[]`s?" Actually, you had to think about "hmmm, what are the possible
combinations of `int`s and `T[]`s?" And somehow you disregarded the
`T[]`s where the length and ptr are out of sync or the ptr points to
inaccessible memory.
> I dont "solve" problems with rants in docs,
But you no longer considered invalid slices your problem, because
someone wrote rants in docs about what `T[]` is supposed to mean and did
their best to implement that behavior in the compiler and runtime.
This is why people sometimes like to use user-defined or built-in types
with invariants (e.g., slices): you think "hmmm, what are the possible
`T`s?" and then handle those cases, without having to deal with all of
the additional nonsensical cases by doing random nonsense.
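For what it's worth, D lets you write this down directly. Here is a minimal sketch (the `Checked` type is hypothetical, not anything from druntime) of a hand-rolled slice that makes the informal `T[]` invariant explicit via D's `invariant` contract; with the invariant established, `opIndex` only has to consider indices below `length`:

```d
// Hypothetical sketch: a hand-rolled slice whose struct invariant
// rules out the nonsensical bit patterns (non-empty slice with a
// null pointer) before opIndex ever runs. Invariants are checked
// on public member-function entry/exit in non-release builds.
struct Checked(T)
{
    private T* ptr;
    private size_t length;

    // Never a non-empty slice with a null pointer.
    invariant (length == 0 || ptr !is null);

    this(T[] backing)
    {
        ptr = backing.ptr;
        length = backing.length;
    }

    ref T opIndex(size_t i)
    {
        assert(i < length, "index out of bounds");
        return ptr[i];
    }
}

unittest
{
    auto s = Checked!int([1, 2, 3]);
    assert(s[1] == 2);
}
```

Of course this only pushes the problem down a level: the invariant itself relies on the `T[]` passed to the constructor being valid, which is exactly the "relying on a lot of invariants to establish a few more" point above.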
> or add function coloring or ask for a big math proofer.
> ...
You don't have to do either of these things. Just accept that someone
else might like engaging in these activities. Don't tread on them.
> Insiting the way I write code fits your mental model while I continuous
> insit I'm not, is insane.
It's not a specific property of your code; it's just a fact of life that
some people talk about these things using certain terms. I don't need
you to accept these terms. I am just explaining why others think the
underlying concepts referred to by these terms are important, and why
you are wasting your time by trying to push your way of thinking onto
them.
> Uncle bob may walk into a code base and go
> "hmmm, yes this is a visitor pattern", but if the author generally says
> "uncle bob is a con artist and should be shoot",
How hard is it not to suggest that people should be shot for talking? I
think this is the bare minimum.
> It may "fit" but it doesn't "use" uncle bobs ideas.
> ...
Sure, people reinvent equivalent concepts all the time. Translating
between different viewpoints can still be illuminating, especially when
someone keeps telling other people they are stupid or should have
violence done to them just for talking about the same things in terms
more commonly understood in their specific communities.
Anyway, you are entirely correct that you don't have to "use"
abstractions for something to be _an example_ of what is being talked
about in the abstract. And even if something is an example of an
abstract concept, that does not at all preclude there being essential
new insights in the example that are not captured by the abstraction.
Not all abstractions are useful, but the higher the level of
abstraction, the more likely it is that something important is also
captured by the abstraction. x)
> I consider everything I do to just be dumb strong types. If its a
> panacea, its been there since the beginning and it aint that hard to
> call the so called experts with their "important exceptions" stupid.
> `1/0==int.max` god damn hardware bugs making me write work'onds.
> ...
Maybe just build your own language, run without an OS, build workarounds
for any remaining hardware traps into the compiler, run on (so-called)
bare metal, and discover your own terms for describing the wonderful
worlds of lack of memory protection that will open up before you. As you
correctly point out, it's not that hard. You could also get an FPGA or
something and fix the crashes at the hardware level, though integrating
this with other components might be a bit harder.
>> invariants are a thing, programming languages have to contend with them
>
> The compiler is unable to act on "slices cant point at invalid memory"
> so it doesnt and the os's whines; it can and does have opinions on int
> sizes and its 1 to 1 with reality.
> ...
The CPU architecture is basically a virtual machine integrated with the
OS at this point. 1 to 1 with reality? Certainly not.
> Types are more fundamental ground truth for what compilers work with,
> ints are far more real as something in a header then a rant in the spec.
>
>> it could also just start overwriting your hard disk with random garbage
>
> I aint never seen an undefined behavior do this even once `int.max+1`
> maybe its an int.max maybe its int.min, maybe its 0, maybe is some
> pathological platform its a random int, but I aint never seen no delete
> harddrive unless you let code from 4chan run, and let me tell ya it
> wasnt by mistake.
Look no further than encryption-based ransomware (e.g., the EternalBlue
exploit used by WannaCry and NotPetya/ExPetr). It's not a theoretical
scenario at all, though you might call it something else. I don't know.
I guess it has either not happened to you yet or you found your own
canonical language to speak about it and therefore everyone else who
suffered from these experiences is stupid or something.
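As an aside, the specific arithmetic from the quote is actually pinned down in D: signed integral overflow is defined as two's complement wraparound, so `int.max + 1` is not platform-pathological. A small sketch (my own illustration, assuming contracts are enabled):

```d
// In D, integer overflow wraps in two's complement on every
// supported platform; it is defined behavior, unlike signed
// overflow in C.
void main()
{
    int x = int.max;
    x += 1;
    assert(x == int.min); // wraps, deterministically

    // Division by zero, by contrast, is a runtime error (typically a
    // hardware trap), not `int.max`; the constant form `1 / 0` is
    // already rejected at compile time.
}
```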
Anyway, I was not even talking about undefined behavior. I was talking
about exploitable gadgets or other accidental and undesirable behaviors
that I suspect will basically inevitably form at an even higher rate in
"100% fail-safe" computations.
Note that I don't think D is the most fruitful ground for these
"fail-safe" experiments; Walter is adamant about having programs crash
at the first sign of trouble. He often shares that this opinion was
shaped by experiences with programming computers before memory
protection was a common thing, and he sometimes laments that younger
generations have not had to go through such experiences. It has been the
most opinionated area of D's design, and I am very happy that we are
moving away a bit from the most extreme position, where the language
fights you at every step of the way if you go as far as trying to make a
custom crash dump.
The opposite extreme is not suitable for most people either, though. And
this unfortunately implies that there is less support for these
alternative ways of doing computing. I don't know any way around that,
finite resources are an inevitable fact of life.
What we can hopefully agree on though is that commodity computing
hardware must not be locked down in order to enforce a certain style of
computing or programming methodology.