Null references redux
Don
nospam at nospam.com
Wed Sep 30 08:39:35 PDT 2009
Jeremie Pelletier wrote:
> Yigal Chripun wrote:
>> On 29/09/2009 16:41, Jeremie Pelletier wrote:
>>
>>> What I argued about was your view that today's software is too big and
>>> complex to bother optimizing.
>>>
>>
>> that is not what I said.
>> I was saying that hand-optimized code needs to be kept to a minimum and
>> used only for visible bottlenecks, because the risk of introducing
>> low-level unsafe code is greater in bigger, more complex software.
>
> What's wrong with taking a risk? If you know what you're doing, where is
> the risk, and if not, how will you learn? If you write your software
> correctly, you could add countless assembly optimizations and never
> compromise the security of the entire thing, because these optimizations
> are isolated; if one crashes, you have only a narrow area to
> debug within.
>
> There are some parts where hand optimizing is almost useless, like
> network I/O: latency is already so high that faster code won't
> make a difference.
>
> And sometimes the optimization doesn't even need assembly; it just
> requires a different high-level construct or a different
> algorithm. The first optimization is to get the most efficient data
> structures and the most efficient algorithms for a given task, and THEN,
> if you can't optimize it any further, you dig into assembly.
>
> People seem to think assembly is something magical and incredibly hard;
> it's not.
>
> Jeremie
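Agreed that picking the right data structure and algorithm is where most
of the win is. A rough D sketch of that first step (the task and the names
are made up, nothing from a real codebase): the same job done with a
brute-force scan and with a built-in associative array, plus an assert to
check the two agree.

import std.stdio;

// Count how many elements are duplicates of an earlier one.
// Brute-force scan: O(n^2), but trivially correct.
size_t countDupsNaive(int[] xs)
{
    size_t dups = 0;
    foreach (i, x; xs)
        foreach (y; xs[0 .. i])
            if (x == y) { ++dups; break; }
    return dups;
}

// Same result using a built-in associative array: O(n) on average.
// No asm anywhere; the algorithmic change is the big win.
size_t countDupsAA(int[] xs)
{
    size_t dups = 0;
    bool[int] seen;
    foreach (x; xs)
    {
        if (x in seen) ++dups;
        else seen[x] = true;
    }
    return dups;
}

void main()
{
    auto data = [1, 5, 3, 5, 1, 7, 3];
    assert(countDupsNaive(data) == countDupsAA(data));
    writeln(countDupsAA(data)); // prints 3
}

Only once that kind of change has run out of steam is asm worth a look.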
Also, if you're using asm on something other than a small, simple loop,
you're probably doing something badly wrong. Therefore, it should always
be localised, and easy to test thoroughly. I don't think local extreme
optimisation is a big risk.
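Something like this is the scale I have in mind (just a sketch; it
assumes 32-bit x86 and DMD-style inline asm, and the names are made up).
The asm sits in one tiny function and is checked against the obvious D
version:

// Sum an int array: once in plain D, once with x86 inline asm.
int sumPlain(int[] a)
{
    int s = 0;
    foreach (x; a)
        s += x;
    return s;
}

int sumAsm(int[] a)
{
    int result;
    int* p = a.ptr;
    size_t n = a.length;
    asm
    {
        mov ECX, n;     // element count
        mov EDX, p;     // pointer to first element
        xor EAX, EAX;   // running sum
        cmp ECX, 0;
        je done;
    next:
        add EAX, [EDX];
        add EDX, 4;     // advance by one int
        dec ECX;
        jnz next;
    done:
        mov result, EAX;
    }
    return result;
}

unittest
{
    int[] a = [1, 2, 3, 4, 5];
    assert(sumAsm(a) == sumPlain(a));
    assert(sumAsm(a[0 .. 0]) == 0);
}

If that unittest ever breaks, the suspect code is about a dozen lines
long.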
Greater risks come from using more complicated algorithms. Brute-force
algorithms are always the easiest ones to get right <g>.