Improving dot product for standard multidimensional D arrays

p.shkadzko p.shkadzko at gmail.com
Mon Mar 2 18:17:05 UTC 2020


On Monday, 2 March 2020 at 15:00:56 UTC, jmh530 wrote:
> On Monday, 2 March 2020 at 13:35:15 UTC, p.shkadzko wrote:
>> [snip]
>
> Thanks. I don't have time right now to review this thoroughly. 
> My recollection is that the dot product of two matrices is 
> actually matrix multiplication, correct? It generally makes 
> sense to defer to other people's implementation of this. I 
> recommend trying lubeck's version against numpy. It uses a 
> blas/lapack implementation. mir-glas, I believe, also has a 
> version.
>
> Also, I'm not sure if the fastmath attribute would do anything 
> here, but something worth looking into.
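
If I read the lubeck suggestion right, that route would look roughly 
like the sketch below (my assumptions: lubeck's mtimes on mir 
ndslices, with the lubeck and mir-algorithm dub packages and a system 
BLAS/LAPACK linked in):

import mir.ndslice : fuse;
import lubeck : mtimes;

void main()
{
    // fuse turns a jagged double[][] into a contiguous 2-D Slice
    auto a = [[1.0, 2.0], [3.0, 4.0]].fuse;
    auto b = [[5.0, 6.0], [7.0, 8.0]].fuse;

    // mtimes dispatches to BLAS gemm for the matrix product
    auto c = a.mtimes(b);
    assert(c == [[19.0, 22.0], [43.0, 50.0]].fuse);
}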

Yes, it is a sum of multiplications between the elements of the two 
matrices (a scalar product in the case of vectors), i.e. proper 
matrix multiplication. It is not the simple element-wise 
multiplication that I did in earlier benchmarks.
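
Spelled out for built-in jagged double[][] arrays, that computation 
is just the classic triple loop; a minimal sketch for illustration, 
not the exact code I benchmarked:

double[][] matMul(const double[][] a, const double[][] b)
{
    import std.exception : enforce;

    const n = a.length;        // rows of a
    const k = a[0].length;     // columns of a, must equal rows of b
    const m = b[0].length;     // columns of b
    enforce(b.length == k, "inner dimensions must match");

    auto c = new double[][](n, m);
    foreach (i; 0 .. n)
    {
        c[i][] = 0.0;          // result rows start as NaN, zero them
        foreach (p; 0 .. k)
        {
            const aip = a[i][p];
            foreach (j; 0 .. m)
                c[i][j] += aip * b[p][j];
        }
    }
    return c;
}

unittest
{
    auto a = [[1.0, 2.0], [3.0, 4.0]];
    auto b = [[5.0, 6.0], [7.0, 8.0]];
    assert(matMul(a, b) == [[19.0, 22.0], [43.0, 50.0]]);
}

This triple loop is the part that lubeck/BLAS replaces with a tuned 
gemm kernel, which is why deferring to an existing implementation 
tends to win.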

I tested @fastmath and @optmath on the toIdx function and that 
didn't change anything.
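
For reference, a minimal sketch of that attempt, assuming a toIdx 
that is just a plain row-major index flattener:

import ldc.attributes : fastmath;  // LDC-only; mir.math.common's @optmath is the portable counterpart

@fastmath
size_t toIdx(size_t row, size_t col, size_t ncols) @safe pure nothrow @nogc
{
    // flat offset into the row-major 1-D storage buffer
    return row * ncols + col;
}

Since that is pure integer arithmetic, the relaxed floating-point 
semantics these attributes enable have nothing to act on, which 
would explain why they make no difference here.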

