opPow, opDollar
Robert Jacques
sandford at jhu.edu
Sat Nov 7 20:05:20 PST 2009
On Sat, 07 Nov 2009 16:53:01 -0500, Andrei Alexandrescu
<SeeWebsiteForEmail at erdani.org> wrote:
> Robert Jacques wrote:
>> On Sat, 07 Nov 2009 12:56:35 -0500, Andrei Alexandrescu
>> <SeeWebsiteForEmail at erdani.org> wrote:
>>
>>> Robert Jacques wrote:
>>>> I'd recommend rolling that into a basic statistics struct containing
>>>> common single pass metrics: i.e. sum, mean, variance, min, max, etc.
>>>
>>> Well the problem is that if you want to compute several one-pass
>>> statistics in one pass, you'd have to invent means to combine these
>>> functions. That ability is already present in reduce, e.g. reduce(min,
>>> max)(range) yields a pair containing the min and the max element after
>>> exactly one pass through range.
>>>
>>> Andrei
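[For illustration, the reduce(min, max) behavior described above can be sketched as a small self-contained program; the variable names are made up, but reduce with multiple functions returning a tuple is exactly the std.algorithm behavior being discussed:]

```d
// Sketch: min and max computed in exactly one pass over the range,
// using multi-function reduce, which yields a tuple of both results.
import std.algorithm : reduce, min, max;
import std.stdio : writeln;

void main()
{
    int[] a = [3, 1, 4, 1, 5];
    auto mm = reduce!(min, max)(a); // one pass; result is a 2-element tuple
    writeln(mm[0], " ", mm[1]);     // smallest and largest element
}
```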
>> Yes, but reduce(mean, std)(range) doesn't work.
>
> From std.algorithm's doc:
>
> // Compute sum and sum of squares in one pass
> r = reduce!("a + b", "a + b * b")(tuple(0.0, 0.0), a);
> // Compute average and standard deviation from the above
> auto avg = r.field[0] / a.length;
> auto stdev = sqrt(r.field[1] / a.length - avg * avg);
>
> I'm not saying there's no need for a more specialized library, just that
> I purposely designed reduce to be no slouch either.
>
>> Even reduce(count) would require the range to be mapped.
>
> (This I don't get.)
>
>> Besides, in my use case I need lazy evaluation, and I'd much rather add
>> elements to a statistics struct, than write a range wrapper.
>
> Well if you go for surgery on an existing struct then the opportunity
> for reuse is diminished.
>
>
> Andrei
Thanks. BTW, the behavior of passing a tuple as the seed to a multi-function
reduce, although shown in the example, doesn't seem to be covered in the doc text.