Thoughts about "Compile-time types" talk
Ola Fosheim Grøstad
ola.fosheim.grostad at gmail.com
Fri May 17 15:02:51 UTC 2019
On Friday, 17 May 2019 at 14:33:59 UTC, Alex wrote:
> All I will say about this is that all the different programming
> languages are just different expressions of the same. No matter
> how different they seem, they all attempt to accomplish the
> same. In mathematics, it has been found that all the different
> branches are identical and just look different because the
> "inventors" approached it from different angles with different
> intents and experiences.
Not exactly sure what you are talking about here. If you are
thinking about Turing machines, then that is sort of a misnomer,
as it only deals with the possibility of writing batch programs
that compute the same result. It says nothing about whether it is
feasible for a real-world programmer to actually do it. In terms
of applied programming, languages are very different once we move
outside of imperative languages.
> Everything you describe is simply mathematical logic
> implemented using different syntactical and semantical
> constructs that all reduce to the same underlying boolean logic.
Not really. If you limit the input and output to fixed sizes, then
all programs can be implemented as boolean expressions. However,
that is not what we are talking about here. We are talking about
modelling and type systems.
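To make the fixed-size point concrete, here is a small sketch of my
own (not from the thread): a 1-bit full adder, an arbitrary function
on bounded input, written using only boolean operators and checked
against ordinary integer addition over all eight inputs.

```python
# A 1-bit full adder expressed purely as boolean expressions.
# Any function with fixed-size input and output can, in principle,
# be flattened into such expressions (illustrative sketch only).

def full_adder(a: bool, b: bool, carry_in: bool) -> tuple[bool, bool]:
    """Return (sum, carry_out) using only boolean operators."""
    s = (a != b) != carry_in                          # XOR chain for the sum bit
    carry_out = (a and b) or (carry_in and (a != b))  # majority via AND/OR/XOR
    return s, carry_out

# Exhaustively compare against integer addition: feasible precisely
# because the input space is fixed at 2^3 cases.
for a in (False, True):
    for b in (False, True):
        for c in (False, True):
            s, cout = full_adder(a, b, c)
            total = int(a) + int(b) + int(c)
            assert (int(s), int(cout)) == (total & 1, total >> 1)
```

The exhaustive loop is the whole point: with unbounded input sizes no
such finite boolean flattening exists, which is why the claim only
holds for fixed sizes.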
> We already have general theorem solving languages and any
> compiler is a theorem solver because all programs are theorems.
Usually not. Almost all compiled real-world programs defer
resolution to runtime, so their solver is too weak. Compilers do
as much as they can, then emit runtime checks (or programs are
simply left incorrect and crash occasionally at runtime).
> Functional programs do this well because they are directly
> based in abstraction(category theory).
Actually, I think it has more to do with limiting the size of the
available state. Since you seldom write FP programs with high
speed in mind, you can also relax more and pick more readable (but
less efficient) data structures. However, there are many types of
algorithms that are far easier to implement in imperative
languages. I'd say most real-world performance-oriented
algorithms fall into that category.
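One hypothetical illustration (my example, not from the post) of the
kind of algorithm that is natural imperatively: in-place partitioning,
which depends on mutating a single shared buffer with no allocation,
something that maps directly onto imperative languages but needs
extra machinery in a pure functional setting.

```python
# Illustration: in-place partitioning mutates one shared buffer.
# The mutation-heavy style is the natural fit for imperative languages.

def partition_in_place(a: list[int], pivot: int) -> int:
    """Reorder `a` so values < pivot precede the rest; return the split index."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        if a[lo] < pivot:
            lo += 1
        else:
            a[lo], a[hi] = a[hi], a[lo]   # swap in place, no allocation
            hi -= 1
    return lo

data = [7, 2, 9, 4, 1, 8]
split = partition_in_place(data, 5)
assert all(x < 5 for x in data[:split])
assert all(x >= 5 for x in data[split:])
```

A persistent-data-structure version would either copy the list or
thread state explicitly; whether that overhead matters is exactly the
performance question raised above.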
> As the program grows in complexity so does the code because
> there are no higher levels of abstraction to deal with it.
I don't think this is correct. Abstraction is a property of
modelling, not really a property of the language. The language
may provide more or less useful modelling mechanisms, but the
real source of abstraction failures is entirely human for any
sensible real-world language.
> covered. This is why no one programs in assembly... not because
> it's a bad language necessarily but because it doesn't allow
> for abstraction.
That's not quite true either. You can do abstraction just fine in
assembly with a readable assembly language like Motorola 68K and a
good macro assembler. It is more work and easier to make mistakes,
but whether you manage to do abstraction well is mostly a
property of the programmer (if the basic tooling is suitable).
> it just has a lot of other issues that makes it not a great
> language for practical usage.
It was not designed for writing large programs; it is more of a
PL-exploration platform than a software engineering solution.
AFAIK, Haskell lacks abstraction mechanisms for programming in
the large.
> No one here will actually be alive to find out if I'm right or
> wrong so ultimately I can say what I want ;)
It is a safe bet to say that you are both right and wrong (and so
are we all, at the end of the day).
More information about the Digitalmars-d mailing list