It is known that Strassen's algorithm for matrix multiplication is numerically less stable than the naïve algorithm, which has cubic time complexity compared to Strassen's O(n^2.807); for that reason, the naïve algorithm is still used in quite a few real-world applications. Roughly speaking, the concern is the compounding of errors that results from using limited-precision representations of real numbers.

Others have pointed out that even for multiplying small matrices, speed is not all that matters; you also need numerical stability.
posted by kaibutsu at 2:09 PM on October 12

Commodity mathematical optimization would presumably let you notice where most of your time is being spent and then try to find better techniques automatically, instead of needing to hire a wizard to work out a solution for hardware that you will migrate off of in a year anyway. It's typical these days to run tuning tests to pick the best algorithm for specific matrix sizes running on particular hardware.

The other nice part of the result is the practical application. This is exactly the situation where reinforcement learning helps: it can look for promising solutions tirelessly. No one found these methods in the previous fifty years, when they theoretically could have, presumably because the search space is too large. It is extremely common in mathematics for new results to be subsequently simplified and extended once people know where to look.

"The Kauers and Moosbauer paper throws a bit of cold water on that by publishing this very quick response, which replicates the 4x4 result with elementary methods and improves on the 5x5 result in a similar way."
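For reference, the O(n^2.807) exponent quoted above comes from Strassen's recursion; a quick sketch in Python (my illustration, not from the thread):

```python
import math

# Strassen replaces 8 multiplications with 7 at each level of a
# divide-and-conquer recursion, T(n) = 7*T(n/2) + O(n^2),
# giving an exponent of log2(7) instead of 3.
exponent = math.log2(7)
print(round(exponent, 3))  # 2.807

# Multiplication counts for n = 2^k: naive 8^k vs Strassen 7^k.
for k in range(1, 6):
    print(2 ** k, 8 ** k, 7 ** k)
```

The extra additions and subtractions Strassen trades for that missing multiplication are what drive the accumulated rounding error the comment refers to.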
Their decision to call the DeepMind team's result "The FBHHRBNRSSSHK-Algorithm" is a purposeful reminder that a human team discovered it (and also a bit of a joke: the paper had a very large number of co-authors compared to papers in the field of mathematical computing, where 2-5 is more typical).
posted by 3j0hn (14 comments total)

The Kauers and Moosbauer paper throws a bit of cold water on that by publishing this very quick response, which replicates the 4x4 result with elementary methods and improves on the 5x5 result in a similar way. But the result they highlight most is the improvement of 4x4 and 5x5 multiplication over characteristic 2 / boolean arithmetic. Alexandre Sedoglavic at the University of Lille maintains a catalog of the best known methods for multiplying small matrices, and the new DeepMind results improve a number of the small rectangular records (notably 4x5 times 5x5) and the 10x10 and 11x11 square records (buried in the appendices).

Hyperbolic statements about the work aside, this is the problem the DeepMind paper aims to solve. The current best known complexity is O(n^2.3728596), due to Virginia Vassilevska Williams and Josh Alman, but that algorithm is mostly of theoretical interest and doesn't actually answer the question, "What is the fastest way to multiply two small matrices together?" In 1969, Volker Strassen kicked off an entire field of research by showing that you can multiply two 2x2 matrices with 7 scalar multiplications instead of the 8 required by the classical method, which, applied recursively, means that arbitrary nxn matrices can be multiplied with asymptotically fewer than O(n^3) operations. Matrix multiplication is one of the most basic and important computations in mathematics.
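Strassen's seven-multiplication scheme for 2x2 matrices is compact enough to write out in full; here is a sketch in plain Python (the m1..m7 products follow the standard published formulas, compared against the classical eight-multiplication method):

```python
def strassen_2x2(A, B):
    # Strassen (1969): seven scalar multiplications instead of eight.
    (a, b), (c, d) = A
    (e, f), (g, h) = B
    m1 = (a + d) * (e + h)
    m2 = (c + d) * e
    m3 = a * (f - h)
    m4 = d * (g - e)
    m5 = (a + b) * h
    m6 = (c - a) * (e + f)
    m7 = (b - d) * (g + h)
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4, m1 - m2 + m3 + m6]]

def classical_2x2(A, B):
    # Eight multiplications: one per (row, column, index) triple.
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A, B = [[1, 2], [3, 4]], [[5, 6], [7, 8]]
print(strassen_2x2(A, B))   # [[19, 22], [43, 50]]
print(classical_2x2(A, B))  # [[19, 22], [43, 50]]
```

Because the entries a..h can themselves be matrix blocks, applying the same scheme recursively is what yields the n^(log2 7) operation count for large n.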