Self Optimizing Code
Mark J Twain via Digitalmars-d
digitalmars-d at puremagic.com
Tue Aug 2 15:06:38 PDT 2016
Another new D feature here!
Self-optimizing code is a fantastic research area that can lead
to real improvements in code, make programs more responsive to
their individual workloads, and requires very little effort.
To demonstrate:
In many programs, memory behavior is fixed to a large degree, if
not completely. Take the typical dynamic data structure that
over-allocates the memory required to store its data by some
factor, usually 2.
We might have code like:

if (length >= capacity)
    expand(2*length);
These types of expressions are used throughout a program. They
are hard-coded values that are generally somewhat arbitrary, and
they are definitely not optimal in all cases.
Instead, a better solution would be to use variables:

if (n*length >= m*capacity)
    expand(l*length);

Here n, m, and l are now variables and can change.
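A minimal sketch of such a parameterized growth check in C++ (the
names GrowthParams and maybe_expand are illustrative, not part of
any proposal; the defaults reproduce the classic hard-coded
policy):

```cpp
#include <cassert>
#include <cstddef>

// Tunable growth parameters; the defaults reproduce the usual
// "double when full" policy (n = 1, m = 1, l = 2).
struct GrowthParams {
    double n = 1.0;  // weight on length
    double m = 1.0;  // weight on capacity
    double l = 2.0;  // growth factor
};

// Returns the new capacity, or the old one if no growth is needed.
std::size_t maybe_expand(std::size_t length, std::size_t capacity,
                         const GrowthParams& p) {
    if (p.n * length >= p.m * capacity)
        return static_cast<std::size_t>(p.l * length);
    return capacity;
}
```

With the defaults this behaves exactly like the hard-coded check;
the point is that an optimizer can later nudge l toward, say, 1.5
without recompiling.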
How is this useful you say? Because of mathematics!
In a typical program we will have N tuples, where each tuple t_i
is a parameter set, such as t_k = (n, m, l).
These form a vector space and, due to mathematics, we can find
optimal values for the different parameter sets by observing the
outcome of small perturbations over long periods.
e.g., we initially start with our hard-coded values as defaults:
t_i_0 = (1, 1, 2) for our example above.
A background process has these variables, their valid ranges, and
the ability to measure the app's performance.
It then, in the background, perturbs t_i, say to t_i_1 = (1.01,
1, 2), monitors the performance impact, and stores the results.
This can be done efficiently since the algorithm is quite simple,
can be done without threading concerns (since the parameters are
effectively immutable to the app), and can run in the background
just monitoring the app's performance.
Using statistics, we can find out whether the change had any
significant effect (remember, we are doing this over the long
haul, so random deviations will cancel out).
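As a toy illustration of this perturb-measure-keep loop (the
quadratic measured_cost below is a synthetic stand-in for real
timing measurements, and the simple hill climb is just one
possible technique):

```cpp
#include <cassert>
#include <cmath>

// Stand-in for "measure app performance": lower is better, with
// an optimum at l = 1.5 in this synthetic example.
double measured_cost(double l) {
    return (l - 1.5) * (l - 1.5);
}

// Repeatedly perturb the parameter in each direction and keep the
// change only if the measurement improves -- a simple hill climb.
double optimize(double l, double step, int iterations) {
    double best = measured_cost(l);
    for (int i = 0; i < iterations; ++i) {
        double up   = measured_cost(l + step);
        double down = measured_cost(l - step);
        if (up < best)        { best = up;   l += step; }
        else if (down < best) { best = down; l -= step; }
    }
    return l;
}
```

Starting from the hard-coded default l = 2 with a step of 0.01,
this walks toward the (here, synthetic) optimum at 1.5; a real
optimizer would average many noisy measurements per candidate.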
Eventually the program will optimize itself to run faster by
choosing the values that best suit how it is used during the
optimization process (which can be done continually, and using
better techniques than I have described).
To accomplish this task easily, all one needs to do is mark the
special variables:
@Perf("Memory_Capacity")
if ($n*length >= $m*capacity)
    expand($l*length);
The compiler then stores the values (n, m, l) in a list, which in
this case we named, along with the file position, etc. This is
the only extra work done to a program. The special variable
symbol $ is arbitrary and not necessarily desirable.
A background thread is created that then does the optimization
part. We might need to specify ranges for the values (the sample
space for the algorithm), though that could be done externally.
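One way the compiler's bookkeeping might look, sketched in C++
(everything here, including PerfParam, the registry, and the
perturb step, is hypothetical; no such facility exists in D):

```cpp
#include <algorithm>
#include <atomic>
#include <cassert>
#include <cmath>
#include <map>
#include <string>

// One named tunable parameter plus the valid range (the sample
// space) the optimizer may search. Atomic so the app can read it
// lock-free while a background thread writes it.
struct PerfParam {
    std::atomic<double> value;
    double lo, hi;
    PerfParam(double v, double lo_, double hi_)
        : value(v), lo(lo_), hi(hi_) {}
};

// Registry keyed by the name given in @Perf("..."); the compiler
// would populate this from the annotated sites.
std::map<std::string, PerfParam> registry;

// One optimizer step: perturb a parameter, clamped to its range.
// A real optimizer would then measure performance and decide
// whether to keep or revert the change.
void perturb(PerfParam& p, double delta) {
    double next = std::clamp(p.value.load() + delta, p.lo, p.hi);
    p.value.store(next);
}
```

The background thread would call perturb, measure, and keep or
revert; the app only ever does relaxed-cost atomic reads of the
current values, so there are no locking concerns on the hot path.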