{"id":1558,"date":"2013-10-15T09:44:09","date_gmt":"2013-10-15T14:44:09","guid":{"rendered":"https:\/\/scottaaronson.blog\/?p=1558"},"modified":"2017-01-12T16:35:31","modified_gmt":"2017-01-12T21:35:31","slug":"three-things-that-i-shouldve-gotten-around-to-years-ago","status":"publish","type":"post","link":"https:\/\/scottaaronson.blog\/?p=1558","title":{"rendered":"Three things that I should&#8217;ve gotten around to years ago"},"content":{"rendered":"<p><span style=\"color: #ff0000;\"><strong>Updates (11\/8):<\/strong><\/span> Alas, video of Eliezer&#8217;s talk will not be available after all.  The nincompoops whom we paid to record the talk wrote down November instead of October for the date, didn&#8217;t show up, then stalled for a month before finally admitting what had happened.  So my <a href=\"https:\/\/scottaaronson.blog\/?p=1558#comment-89317\">written summary<\/a> will have to suffice (and maybe Eliezer can put his slides up as well).<\/p>\n<p>In other news, Shachar Lovett has asked me to announce a <a href=\"http:\/\/cseweb.ucsd.edu\/~slovett\/workshops\/complexity-coding-2014\/\">workshop on complexity and coding theory<\/a>, which will be held at UC San Diego, January 8-10, 2014.<\/p>\n<hr \/>\n<p><span style=\"color: #ff0000;\"><strong>Update (10\/21):<\/strong><\/span> Some readers might be interested in my <a href=\"https:\/\/scottaaronson.blog\/?p=1558#comment-89449\">defense<\/a> of LessWrongism against a surprisingly common type of ad-hominem attack (i.e., &#8220;the LW ideas must be wrong because so many of their advocates are economically privileged but socially awkward white male nerds, the same sorts of people who might also be drawn to Ayn Rand or other stuff I dislike&#8221;). 
By all means debate the ideas&#8212;I&#8217;ve been doing it for years&#8212;but <i>please<\/i> give beyond-kindergarten arguments when you do so!<\/p>\n<hr \/>\n<p><span style=\"color: #ff0000;\"><strong>Update (10\/18):<\/strong><\/span> I just posted a <a href=\"https:\/\/scottaaronson.blog\/?p=1558#comment-89317\">long summary and review<\/a> of Eliezer Yudkowsky&#8217;s talk at MIT yesterday.<\/p>\n<hr \/>\n<p><span style=\"color: #ff0000;\"><strong>Update (10\/15):<\/strong><\/span> Leonard Schulman sent me the news that, according to <a href=\"http:\/\/www.dailymail.co.uk\/sciencetech\/article-2461133\/Google-D-Wave-quantum-computing-solve-global-warming.html?ito=feeds-videoxml\">an article by Victoria Woollaston in the <em>Daily Mail<\/em><\/a>, Google hopes to use its D-Wave quantum computer to &#8220;solve global warming,&#8221; &#8220;develop sophisticated artificial life,&#8221; and &#8220;find aliens.&#8221;\u00a0 (No, I&#8217;m not making any of this up: just quoting stuff <em>other people<\/em> made up.)\u00a0 The article also repeats the debunked canard that the D-Wave machine is &#8220;3600 times faster,&#8221; and soberly explains that <strong>D-Wave&#8217;s 512 qubits compare favorably to the mere 32 or 64 bits found in home PCs<\/strong> (exercise for those of you who aren&#8217;t already rolling on the floor: think about that until you are).\u00a0 It contains not a shadow of a hint of skepticism anywhere, not one token sentence.\u00a0 I would say that, even in an extremely crowded field, Woollaston&#8217;s piece takes the cake as the single most irresponsible article about D-Wave I&#8217;ve seen.\u00a0 And I&#8217;d feel terrible for my many friends at Google, whose company comes out of this looking like a laughingstock.\u00a0 But that&#8217;s assuming that this isn&#8217;t some sort of elaborate, Sokal-style prank, designed simply to prove that media outlets will publish anything whatsoever, no matter how forehead-bangingly absurd, as long as 
it contains the words &#8220;D-Wave,&#8221; &#8220;Google,&#8221; &#8220;NASA,&#8221; and &#8220;quantum&#8221;&#8212;and thereby, to prove the truth of what I&#8217;ve been saying on this blog since 2007.<\/p>\n<hr \/>\n<p>1. I&#8217;ve added MathJax support to the comments section!\u00a0 If you want to insert an inline LaTeX equation, surround it with \\( \\backslash(\u00a0 \\backslash) \\), while if you want to insert a displayed equation, surround it with \\(\\text{\\$\\$ \\$\\$}\\).\u00a0 Thanks very much to Michael Dixon for prodding me to do this and telling me how.<\/p>\n<p>2. <del>I&#8217;ve also added upvoting and downvoting to the comments section!<\/del>\u00a0 OK, in the first significant use of comment voting, the readers have voted overwhelmingly, by 41 &#8211; 13, that they want the comment voting to disappear.\u00a0 So disappear it has!<\/p>\n<p>3. Most importantly, I&#8217;ve invited <a href=\"http:\/\/en.wikipedia.org\/wiki\/Eliezer_Yudkowsky\">Eliezer Yudkowsky<\/a> to MIT to give a talk!\u00a0 He&#8217;s here all week, and will be speaking on &#8220;Recursion in Rational Agents: Foundations for Self-Modifying AI&#8221; this Thursday at 4PM in 32-123 in the MIT Stata Center.\u00a0 Refreshments at 3:45.\u00a0 <a href=\"http:\/\/lesswrong.com\/lw\/itp\/meetup_talk_by_eliezer_yudkowsky_recursion_in\/\">See here for the abstract.<\/a>\u00a0 Anyone in the area who&#8217;s interested in AI, rationalism, or other such nerdy things is strongly encouraged to attend; it should be interesting.\u00a0 Just don&#8217;t call Eliezer a &#8220;Singularitarian&#8221;: I&#8217;m woefully out of the loop, but I learned yesterday that they&#8217;ve dropped that term entirely, and now prefer to <del>be known as machine intelligence researchers<\/del> talk about the intelligence explosion.<\/p>\n<p>(In addition, Paul Christiano&#8212;former MIT undergrad, and my collaborator on quantum money&#8212;will be speaking today at 4:30 at the Harvard Science Center, on 
&#8220;Probabilistic metamathematics and the definability of truth.&#8221;\u00a0 His talk will be related to Eliezer&#8217;s but somewhat more technical.\u00a0 <a href=\"http:\/\/intelligence.org\/2013\/10\/01\/upcoming-talks-at-harvard-and-mit\/\">See here<\/a> for details.)<\/p>\n<hr \/>\n<p><span style=\"color: #ff0000;\"><strong>Update (10\/15):<\/strong><\/span> Alistair Sinclair asked me to post the following announcement.<\/p>\n<p>The Simons Institute for the Theory of Computing at UC Berkeley invites applications for Research Fellowships for academic year 2014-15.<\/p>\n<p>Simons-Berkeley Research Fellowships are an opportunity for outstanding junior scientists (up to 6 years from PhD by Fall 2014) to spend one or two semesters at the Institute in connection with one or more of its programs. The programs for 2014-15 are as follows:<\/p>\n<p>* Algorithmic Spectral Graph Theory (Fall 2014)<br \/>\n* Algorithms and Complexity in Algebraic Geometry (Fall 2014)<br \/>\n* Information Theory (Spring 2015)<\/p>\n<p>Applicants who already hold junior faculty or postdoctoral positions are welcome to apply. In particular, applicants who hold, or expect to hold, postdoctoral appointments at other institutions are encouraged to apply to spend one semester as a Simons-Berkeley Fellow subject to the approval of the postdoctoral institution.<\/p>\n<p>Further details and application instructions can be found at <a href=\"http:\/\/simons.berkeley.edu\/fellows2014\">http:\/\/simons.berkeley.edu\/fellows2014<\/a>. Information about the Institute and the above programs can be found at <a href=\"http:\/\/simons.berkeley.edu\">http:\/\/simons.berkeley.edu<\/a>.<\/p>\n<p>Deadline for applications: 15 December, 2013.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Updates (11\/8): Alas, video of Eliezer&#8217;s talk will not be available after all. 
The nincompoops whom we paid to record the talk wrote down November instead of October for the date, didn&#8217;t show up, then stalled for a month before finally admitting what had happened. So my written summary will have to suffice (and maybe [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"advanced_seo_description":"","jetpack_seo_html_title":"","jetpack_seo_noindex":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":false,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2},"_wpas_customize_per_network":false},"categories":[31,7],"tags":[],"class_list":["post-1558","post","type-post","status-publish","format-standard","hentry","category-announcements","category-self-referential"],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/scottaaronson.blog\/index.php?rest_route=\/wp\/v2\/posts\/1558","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/scottaaronson.blog\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/scottaaronson.blog\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/scottaaronson.blog\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/scottaaronson.blog\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=1558"}],"version-history":[{"count":17,"href":"https:\/\/scottaaronson.blog\/index.php?rest_route=\/wp\/v2\/posts\/1558\/revisions"}],"predecessor-version":[{"id":1585,"href":"https:\/\/scottaaronson.blog\/index.php?rest_route=\/wp\/v2\/posts\/1558\/revisions\/1585"}],"wp:attachment":[{"href":"https:\/\/scottaaronson.blo
g\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=1558"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/scottaaronson.blog\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=1558"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/scottaaronson.blog\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=1558"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}