<h1>2.373</h1>
<p><em>Posted November 28, 2011, at <a href="https://scottaaronson.blog/?p=839">scottaaronson.blog</a></em></p>
<p>For twenty years, the fastest known algorithm to multiply two n-by-n matrices, due to Coppersmith and Winograd, took a leisurely O(n<sup>2.376</sup>) steps. Last year, though, in his <a href="http://www.maths.ed.ac.uk/pg/thesis/stothers.pdf">PhD thesis</a>, Andrew Stothers gave an improvement to O(n<sup>2.374</sup>) steps. And today, <a href="http://www.cs.berkeley.edu/~virgi/">Virginia Vassilevska Williams</a> of Berkeley and Stanford released a <a href="http://www.cs.berkeley.edu/~virgi/matrixmult.pdf">paper</a> that gives a general methodology for analyzing Coppersmith-Winograd-type algorithms, and that improves the matrix-multiplication time to a lightning-fast O(n<sup>2.37<strong>3</strong></sup>) steps. (Virgi's work was independent of Stothers', though she credits him and applies an idea of his to simplify her proof.) Full disclosure: I actually knew a month ago that this was coming—I had a hell of a time keeping the secret. I'd recommend that you get started memorizing "ω&lt;2.373," but as Russell Impagliazzo points out in the comments, the exponent <em>might</em> get lowered again in short order. Huge congratulations to Virgi and to Andrew for this breakthrough!</p>
<hr />
<p><strong>Update (Nov. 30):</strong> Last night I received an extremely gracious email from Andrew Stothers, which he's given me permission to summarize here.
In the email, Andrew expressed how excited he was about Virgi's new result, apologized for the confusion he caused by not mentioning his improvement to ω until page 71 of his thesis (he says he doesn't know why he did it), and said that he meant to publish a paper, but was prevented from doing so by health and job issues. He also said that he didn't take issue with anything I wrote here, <em>except</em> that I mistakenly referred to him as Andy rather than Andrew. In response, I congratulated Andrew on his achievement; expressed how happy I was that—ironically—his work is now <em>finally</em> getting some of the attention that it deserves; and promised to buy him a beer when and if I'm ever in Edinburgh, a city I've always wanted to visit. (On the other hand, I warned Andrew that his LinkedIn profile, which unselfconsciously mentions improvements to his Word and Excel skills as one of the benefits of his PhD research breaching the Coppersmith-Winograd barrier, might have earned him a place in scientific folklore forever!)</p>
<p>In summary, I now see Andrew as an extraordinarily nice fellow who had some bad luck and—most conspicuously—a <em>lack of good advice</em> from people around him.
I do stand by the points that I was originally <em>trying</em> to make:</p>
<p>(a) that this tangled situation shouldn't <em>in any way</em> detract from Virgi's fantastic achievement, which (except for a simplification, as she discusses) must be considered completely independent of Andrew's, and</p>
<p>(b) that there's indeed an important cautionary lesson for students here, about adequately publicizing your work (yes, there's a happy medium between hiring a PR firm to wage a viral marketing campaign and burying your solution to a longstanding open problem so far in the body of your PhD thesis that even world experts in the subject who read your thesis will miss it).</p>
<p>On the other hand, I hereby apologize for anything I said that could even be <em>perceived</em> as slighting Andrew, his important work, or his motives.</p>
<hr />
<p><strong>Another Update:</strong> On the third hand, if you're one of the commenters whose beef is not about attribution, but about the entire concept of using a CS theory blog to "promote" major milestones in CS theory (like the breaking of the Coppersmith-Winograd barrier), then I apologize for absolutely nothing. Go read an economics or physics blog; I understand that those are entirely hype-free. Better yet, go to hell.</p>
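<p>For readers wondering what a sub-cubic matrix-multiplication algorithm even looks like, here is a minimal sketch (mine, not anything from Virgi's paper or Andrew's thesis) of Strassen's classic recursion: by computing seven half-size products instead of eight, it runs in O(n<sup>log<sub>2</sub>7</sup>) ≈ O(n<sup>2.807</sup>) time, the first break below the naive O(n<sup>3</sup>) and the starting gun for the race toward ω. The Coppersmith-Winograd-style techniques discussed above are far more elaborate, and none of them is implemented here.</p>

```python
import numpy as np

def strassen(A, B):
    """Multiply two n-by-n matrices (n a power of 2) by Strassen's
    7-multiplication recursion, giving O(n^log2(7)) ~ O(n^2.807) --
    an illustration of sub-cubic multiplication, NOT the
    Coppersmith-Winograd-type algorithms discussed in the post."""
    n = A.shape[0]
    if n <= 64:               # cut over to the ordinary product for small blocks
        return A @ B
    k = n // 2
    A11, A12, A21, A22 = A[:k, :k], A[:k, k:], A[k:, :k], A[k:, k:]
    B11, B12, B21, B22 = B[:k, :k], B[:k, k:], B[k:, :k], B[k:, k:]
    # Seven recursive products (versus eight for the naive block method)
    M1 = strassen(A11 + A22, B11 + B22)
    M2 = strassen(A21 + A22, B11)
    M3 = strassen(A11, B12 - B22)
    M4 = strassen(A22, B21 - B11)
    M5 = strassen(A11 + A12, B22)
    M6 = strassen(A21 - A11, B11 + B12)
    M7 = strassen(A12 - A22, B21 + B22)
    # Recombine the products into the four blocks of C = A @ B
    C11 = M1 + M4 - M5 + M7
    C12 = M3 + M5
    C21 = M2 + M4
    C22 = M1 - M2 + M3 + M6
    return np.block([[C11, C12], [C21, C22]])
```

<p>Saving one multiplication per level is what drives the exponent from 3 down to log<sub>2</sub>7; lowering ω further means finding ever more economical ways to trade multiplications for additions, which is exactly the game the Coppersmith-Winograd framework plays at a much higher level of sophistication.</p>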