Moral of a story: ~ is the right choice for concat operator
Jonathan M Davis
newsgroup.d at jmdavisprog.com
Fri May 25 23:05:51 UTC 2018
On Friday, May 25, 2018 22:23:07 IntegratedDimensions via Digitalmars-d
wrote:
> On Friday, 25 May 2018 at 22:07:22 UTC, Dukc wrote:
> > On Friday, 25 May 2018 at 21:06:17 UTC, Walter Bright wrote:
> >> This ambiguity bug with + has been causing well-known problems
> >> since Algol. A *really* long time. Yet it gets constantly
> >> welded into new languages.
> >
> > Yeah. I could understand that choice for a language that tries
> > to be simple for beginners above everything else. But for a
> > large-scale application language like C#, I guess this just did
> > not occur to them.
>
> I used to program in C# quite regularly and never had this issue.
> It is not a problem of the language but a problem of the
> programmer.
>
> A programmer should always know the types he is working with and
> the functional semantics used. While it obviously has the
> potential to cause more problems, it is not a huge deal in
> general. I might have been caught by that "bug" once or twice,
> but it's usually an obvious fix. If you are moving from one
> language to another or haven't programmed in one much, you will
> have these types of problems, but they go away with experience.
> To degrade the language based on that is wrong. Languages should
> not be designed around noobs, because then the programmers of
> that language stay noobs. Think BASIC. If all you did was program
> in BASIC, then you would be considered a novice programmer by
> today's standards. Even if you were an expert BASIC programmer,
> when you moved to a modern language you would be confused. To say
> that those languages are inferior because they don't do things
> like BASIC would be wrong; it is your unfamiliarity with the
> language and newer programming concepts that is the problem.
>
> A language will never solve all your problems as a programmer,
> else it would write the programs for us.
Personally, I don't think that I've ever made the mistake of screwing up +
and concatenating instead of adding or vice versa. And at the end of the
day, the programmer does need to know the tools that they're using and use
them correctly. That being said, the language (and other tools used for
programming) can often be designed in a way that reduces mistakes - and all
programmers make mistakes. For example, in D, implicit fallthrough in
case statements is now illegal if the case statement is non-empty:
switch(i)
{
    case 0: // legal fallthrough, because case 0 is empty
    case 1:
    {
        foo(bar());
        break;
    }
    case 2:
    {
        doStuff(something());
        // error: implicit fallthrough from a non-empty case is illegal
    }
    default: return 17;
}
Instead, the programmer must end the case with a control flow
statement such as break or goto case:
switch(i)
{
    case 0: // legal fallthrough
    case 1:
    {
        foo(bar());
        break;
    }
    case 2:
    {
        doStuff(something());
        goto case; // now explicitly goes to the next case statement
    }
    default: return 17;
}
Sure, it can be argued that this should be unnecessary and that the
programmer should just get it right, but it's not an altogether
uncommon bug to screw up case statements and inadvertently fall
through to the next one when you meant to put a break or some other
control statement there.
Originally, implicit fallthrough was perfectly legal in D just like it is in
C or C++. However, when it was made illegal, it caught quite a few bugs in
existing programs - including at companies using D. This change to the
language fixed bugs and almost certainly saved people time and money.
Designing a good programming language is a bit of an art. It's not always
easy to decide when the language should be picky about something and when it
should let the programmer shoot themselves in the foot, but there are plenty
of cases where having the language be picky catches bugs that programmers
would otherwise make all the time, because we're not perfect.
That's part of why we have @safe in D. It disallows all kinds of
perfectly legitimate code, because it's stuff that's easy for the
programmer to screw up and often hard to get right, and restricting
what large sections of the program are allowed to do prevents all
kinds of bugs. In the cases where the programmer actually needs to do
the unsafe stuff, they write @system code, manually verify that it's
correct, and mark it as @trusted so that it can be called from @safe
code. Then, when they run into a memory corruption issue later, they
have a relatively small portion of the program that they need to
inspect.
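e.g. here's a minimal sketch of that workflow (poke and pokeSafely
are made-up names for illustration, not anything from Phobos):

@system void poke(int* p)
{
    *p = 42; // raw pointer work that the compiler can't verify
}

// Manually verified to be memory safe for this use, so it's marked
// @trusted, which makes it callable from @safe code.
@trusted void pokeSafely(ref int x)
{
    poke(&x);
}

@safe void useIt()
{
    int i;
    pokeSafely(i); // fine: @trusted is callable from @safe
    // poke(&i);   // error: cannot call @system function from @safe code
}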
A well-designed language enables the programmer to do their job correctly
and efficiently while protecting them from stupid mistakes where reasonably
possible. Using ~ instead of + costs us almost nothing while preventing
potential bugs. It's quickly learned when you first start using D, and then
the code is clear about whether something is intended to be addition or
concatenation without the programmer having to study it closely, and there
are cases like what the OP described where it actually allows the compiler
to catch bugs. It's a simple design decision with almost no cost that
prevents bugs. That's the kind of thing that we generally consider to be a
win around here.
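e.g. as a quick sketch of what the separate operator buys you:

void main()
{
    int a = 1, b = 2;
    assert(a + b == 3);     // + is always arithmetic addition

    string s = "1", t = "2";
    assert(s ~ t == "12");  // ~ is always concatenation

    // auto oops = s + t;   // error: + is not defined for strings
    // auto nope = a ~ b;   // error: ~ is not defined for integers
    // Mixing them up is a compile-time error rather than a silent bug.
}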
- Jonathan M Davis