Old problem with performance
Joel C. Salomon
joelcsalomon at gmail.com
Thu Feb 26 08:49:48 PST 2009
dsimcha wrote:
> Similarly, I find linear algebra impossible to grok largely because some genius
> mathematician decided to overload a bunch of operators to mean completely
> different things when dealing with matrices than when dealing with scalars. Heck,
> the multiplication operator isn't even commutative.
You’ve got W.R. Hamilton to thank for that last; he found it impossible
to extend multiplication past 1- and 2-tuples (the real and complex
numbers) without dropping commutativity, which is why quaternion
multiplication doesn’t commute. Give mathematicians a new toy…
Another part of the trouble is that there’s a perfectly good operator
for functional composition, U+2218 RING OPERATOR, i.e.:
[f∘g](x) ≡ f(g(x)).
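(In code, that definition is one line. A quick sketch in Python, my own
illustration rather than anything from the thread:)

```python
def compose(f, g):
    """Return f∘g, i.e. the function x ↦ f(g(x))."""
    return lambda x: f(g(x))

double = lambda x: 2 * x
inc = lambda x: x + 1

h = compose(double, inc)   # [double∘inc](x) = 2*(x+1)
assert h(3) == 8
```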
Treating matrices as linear operators on vectors, you *could* write
[A∘B](v),
but that ignores the fact that matrix “composition” and matrix-vector
“application” have the same form, so we prefer to use the same operator
for both:
ABv ≡ A*B*v.
(The “”, if you can see them, are U+2062 INVISIBLE TIMES.)
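NumPy makes the same choice: its @ operator plays the role of that
invisible-times juxtaposition, serving as both matrix–matrix
“composition” and matrix–vector “application.” A small illustration
(my example, not from the post):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [1.0, 0.0]])
v = np.array([5.0, 6.0])

# [A∘B](v): compose the linear maps first, then apply.
composed_then_applied = (A @ B) @ v
# A(B(v)): apply B, then apply A.
applied_twice = A @ (B @ v)

# Associativity is what lets us write ABv without parentheses.
assert np.allclose(composed_then_applied, applied_twice)
```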
I’m playing with Geometric Algebra now, and the problem is *much* worse.
At various times, there are at least 7 different “products” defined:
• AB: the “geometric product”;
• A⋅B: the inner product, defined in two different ways;
• A^B: the outer product, related to the vector cross product;
• A⨼B & A⨽B: the left- and right-contractions, two more “inner products”
(sometimes denoted like A⌋B & A⌊B); and just for extra fun,
• A×B: *not* the familiar cross product, but the “commutator product”
A×B = ½(AB−BA).
(Fortunately, this last is pretty rare in the literature I’ve seen.)
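That last one at least has a one-line definition you can check directly.
Here it is for plain matrices (again NumPy, my own sketch, not any GA
library):

```python
import numpy as np

def commutator_product(A, B):
    """The “commutator product” A×B = ½(AB − BA), here for matrices."""
    return 0.5 * (A @ B - B @ A)

A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0, 0.0], [1.0, 0.0]])

# Non-commuting A and B give a nonzero commutator product…
assert not np.allclose(commutator_product(A, B), 0.0)
# …while anything commutes with the identity, so the product vanishes.
I = np.eye(2)
assert np.allclose(commutator_product(A, I), 0.0)
```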
There may also be a defined meaning for A*B, but I can’t recall it.
Lurking on these lists, I’ve seen requests for user-definable infix
operators. This example forms some sort of argument in that debate, but
I’m not sure whether it’s for or against.
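(For what it’s worth, languages without that feature sometimes fake it
by overloading an existing operator on a wrapper object. A well-known
Python trick, sketched here with a made-up `wedge` operator purely for
illustration:)

```python
class Infix:
    """Wrap a two-argument function so it can be spelled |op| infix-style."""
    def __init__(self, fn):
        self.fn = fn
    def __ror__(self, left):
        # left |op → an Infix with the left operand captured
        return Infix(lambda right: self.fn(left, right))
    def __or__(self, right):
        # …| right → apply the captured function
        return self.fn(right)

# A hypothetical “outer product” spelling, just to show the mechanics.
wedge = Infix(lambda a, b: f"({a}^{b})")
assert ("x" |wedge| "y") == "(x^y)"
```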
—Joel Salomon
More information about the Digitalmars-d mailing list