List of Phobos functions that allocate memory?
Sean Kelly
sean at invisibleduck.org
Thu Feb 6 20:56:13 PST 2014
On Friday, 7 February 2014 at 01:31:17 UTC, Ola Fosheim Grøstad
wrote:
> On Friday, 7 February 2014 at 01:23:44 UTC, Walter Bright wrote:
>> Right. If you're:
>>
>> 1. using throws as control flow logic
> [...]
>> you're doing it wrong.
>
> I disagree.
>
> REST based web services tend to use throws all the time. It is
> an effective and clean way to break all transactions that are
> in progress throughout the call chain when you cannot carry
> through a request, or if the request returns nothing.
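Roughly, the pattern being described looks something like this
(the names here are made up for illustration; in a real framework
the top-level catch would map the exception onto an HTTP status):

class RequestAborted : Exception
{
    int status;
    this(int status, string msg) { super(msg); this.status = status; }
}

string loadUser(string id)
{
    if (id.length == 0)
        throw new RequestAborted(404, "no such user"); // abandons the whole request
    return "user:" ~ id;
}

string handleRequest(string id)
{
    // Any transaction in progress below this point is unwound by the throw.
    return loadUser(id);
}

void main()
{
    import std.stdio : writeln;
    foreach (id; ["42", ""])
    {
        try
            writeln(handleRequest(id));
        catch (RequestAborted e)
            writeln("HTTP ", e.status, ": ", e.msg);
    }
}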
But let this be up to the programmer working on the service, not
imposed on them by the API. Then if they run into something like
this DoS issue, they can fix it. My experience with these
services is that performance is critical and bad input is common,
because people are always trying to hack your shit.
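Offering a non-throwing variant next to the throwing one would be
enough to leave the choice with the programmer. A rough sketch of
what I mean -- tryParsePort is made up for illustration, while
std.conv.to is the real Phobos routine and does throw ConvException
on bad input:

import std.conv : to, ConvException;
import std.stdio : writeln;

// Hypothetical non-throwing alternative: failure is reported through the
// return value, so hostile input never triggers a throw in the hot path.
bool tryParsePort(string s, out ushort port) nothrow
{
    uint value = 0;
    if (s.length == 0)
        return false;
    foreach (c; s)
    {
        if (c < '0' || c > '9')
            return false;
        value = value * 10 + (c - '0');
        if (value > ushort.max)
            return false;
    }
    port = cast(ushort) value;
    return true;
}

void main()
{
    ushort port;
    foreach (input; ["8080", "not-a-port"])
    {
        if (tryParsePort(input, port))
            writeln("listening on ", port);
        else
            writeln("rejected: ", input); // no exception, no stack unwinding

        // The throwing route is still there for code that wants it:
        // auto p = input.to!ushort;  // throws ConvException on "not-a-port"
    }
}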
Where I work, people are serious about performance, our daily
volume is ridiculous, and our goal is five nines of uptime
across the board. At the same time, really good asynchronous
programmers are about as rare as water on the moon. So something
like vibe.d, where mid-level programmers could write correct code
that still performs well thanks to the underlying event model,
would be a godsend. But only if I really can get what I pay for.
The thing I think a lot of people don't realize these days is
that performance per watt is just about the most important thing
there is. Data centers are expensive, slow to build, and rack
space is limited. If you can find a way to increase the
concurrent load per box by, say, an order of magnitude by
choosing a different language or programming model or whatever,
there's a real economic motivation to do so.
Java gets by thanks to a really good GC and a low barrier to
entry, but its scalability is really pretty poor all things
considered. On the other hand, C/C++ scales tremendously but
then you're stuck with the burden those languages impose in terms
of semantic complexity, bug frequency, and so on. D seems really
promising here but can't rely on having a fantastic incremental
GC like Java, and so I think it's a mistake to use Java as a
model for how to manage memory. And maybe Java just got it wrong
anyway. I know some people who had to go to ridiculous lengths
to avoid GC collection cycles in Java because a collection in the
app took _20_seconds_ to complete. Now maybe the application was
poorly designed or they should have been using an aftermarket GC,
but even so.
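For what it's worth, the usual trick those folks end up with
translates directly to D: allocate once up front and reuse, so the
collector has nothing to do in the hot path. A minimal sketch (the
names are mine, and it assumes a compiler recent enough for @nogc):

import std.stdio : writeln;

struct Scratch
{
    private char[] buf;

    this(size_t capacity)
    {
        buf = new char[capacity]; // the only GC allocation, made once up front
    }

    // Fills and returns a slice of the preallocated buffer; @nogc makes
    // the compiler reject any hidden allocation in here.
    @nogc char[] fill(char c, size_t n)
    {
        auto slice = buf[0 .. n];
        slice[] = c;
        return slice;
    }
}

void main()
{
    auto scratch = Scratch(1024);
    foreach (i; 0 .. 3)
        writeln(scratch.fill('x', 4)); // no per-iteration allocation, nothing to collect
}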
Finally, library programming is the one place where premature
optimization really is a good idea, because you can never be sure
how people will be using your code. That allocation may not be a
big deal to you or 98% of your users, but for the one big client
who calls that routine in a tight inner loop or operates at
volumes you never conceived of, it's a deal breaker. I really
don't want Phobos to be the deal breaker :-)
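To make it concrete with something already in Phobos: std.array.split
eagerly builds an array of slices and so allocates, while
std.algorithm.splitter hands back a lazy range and allocates nothing
unless the caller asks for an array. The allocating convenience can
sit on top of the non-allocating primitive, not the other way around:

import std.algorithm : splitter;
import std.array : array, split;
import std.stdio : writeln;

void main()
{
    auto line = "a,b,c";

    // Eager: builds and returns a new string[] -- a GC allocation on every call.
    string[] eager = line.split(",");
    writeln(eager);

    // Lazy: hands out slices of the original string one at a time, no allocation.
    foreach (part; line.splitter(","))
        writeln(part);

    // A caller who really wants an array can still opt in to the allocation:
    string[] opted = line.splitter(",").array;
    writeln(opted == eager); // true
}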