Great Ideas in Theoretical Computer Science Lectures 8-11

A few more lectures, hot off the Turing machine tape:

Yeah, it’s all standard material, but if you don’t know it … well, this is one more place you can learn.

Apropos of computer science education, some of you may have heard the sad news that the College Board is killing the AP Computer Science AB test [Slashdot] [Pontiff]. The reasons they gave were that (1) not enough students were taking it and (2) of those who were, not enough were underrepresented minorities.

I took the AB test in ’96, despite not having taken the course (my high school didn’t offer it) and not knowing Pascal (the language they still used, before they switched to C++ and then Java). I did read a book to prepare, and that’s where I got my first exposure to O-notation, linked lists, and other crumbs from the feast table of theory. I devoured it all like a hungry dog licking the floor.

It goes without saying that high school students (including underrepresented minorities) won’t take an AP course if it isn’t offered, or if their schools or the colleges they’re applying to don’t take it seriously. Computer science education is a threat to many people’s pet ideas about the world: that CS is not a real subject and is devoid of intellectual content. That everything having to do with computers is mundane and technical, and should (for that reason) be outsourced to China and India. That if we must teach CS, we should at least focus on applications, and get rid of any math or logical thought. That CS is passé, a 20th-century relic that was long ago superseded by the life sciences (haven’t computer scientists gotten the message yet?). Against this backdrop, it’s not surprising that the AB test is being killed: what’s surprising is that it lasted so long.

Blogging will continue to be sporadic, as I have a cold.

Update (4/10): Lenore Blum just gave a talk here at MIT about the future of CS education, in which she echoed several themes of this post, but also argued (among other things) that the cancellation of APCSAB is good news given the course’s obsessive focus on the details of the programming language du jour, and given high schools’ refusal to treat it as a real academic subject, instead grouping it with “electives” like woodshop and metalworking. She held out hope that the US will someday develop a reasonable K-12 computer science curriculum, pointing to successful models from Israel and New Zealand. Needless to say, I have mixed feelings about canceling the mediocre (AB), keeping the truly bad (A), and hoping for the distant good.

81 Responses to “Great Ideas in Theoretical Computer Science Lectures 8-11”

  1. Chris Says:

    It really is a shame that they’re doing away with the AP CS AB exam. Remove that class and the only high school level AP CS class is now basically a primer on object oriented programming. Hooray.

    While I don’t think AP CS AB was really all that much more of a step up from AP CS A, I suppose it was something rather than nothing.

  2. Scott Says:

    Right, given that the AB was already severely watered down (though better than nothing), the A seems like a joke.

  3. Chris Granade Says:

    On the CS education topic, I’d like to point out that oftentimes it is our fellow students who help perpetuate the “CS is not theoretical” myth. Students who, though quite intelligent, have been trained to believe that the theory is dry, uninteresting, etc. Right now, I’m in a “senior project” class that is completely free of theory, and some of my fellow students just look at me funny for complaining about that. It’s for reasons like these that my institution has started branching out into “Computer Engineering,” which focuses more on hardware in exchange for theory.

  4. Patrick Says:

    All I can say is:

  5. Faustus Says:

    > Computer science education is a threat to many people’s pet ideas about the world: that CS is not a real subject and is devoid of intellectual content.

    Straw man?

  6. Xerxes Says:

    I likewise took this AP test without having a class at my school and without previous exposure to Pascal. This was actually the last year the test required Pascal. It was very easy. People who complain that their schools don’t offer courses should kick themselves in the butt and do some studying of their own initiative. I took about a dozen AP exams, 4 of which were not offered as classes by my school. There is no requirement that you be enrolled in an AP course when you take an AP exam!

  7. Sean Pendragon Says:

    Is the Computer Science field saturated? Are there that many jobs available for Computer Scientists? Is there a purpose for investing several years of hard work in obtaining a CS degree? The economy changes. Job opportunities change. Is it cruel to step on a youngster’s dream and steer him/her into a more mundane field?

  8. Chris Says:

    “Xerxes”, what does taking AP exams for which there were no courses available have to do with the College Board canceling AP CS AB, besides serving the convenient purpose of allowing you to boast about your intellectual prowess back in high school? C’mon, you might as well start bragging about your SAT score, too…

  9. relaxing Says:

    Faustus: Agree. Who says this, Scott? Name names.

  10. Scott Says:

    Faustus and relaxing: Read the past comment sections of this blog!

  11. Jelani Says:

    I don’t know much about the AP system, but from what I do understand schools teach the AP courses and not teachers from some central organization. So, what does it actually mean that some central authority has dropped the course? Can’t schools just continue to teach the material anyway?

  12. Job Says:

    Ah, getting to the good stuff.

  13. Domenic Says:

    I think my new favorite phrase is “underrepresented minorities.” I.e., “the Asians were doing too well despite being a minority, so we’re going to create a new classification just to exclude them.”

    I would argue that APCSAB might not even have been that great of a computer science course. I think maybe using APCSA as “computer engineering” (perhaps with more content, since it is indeed really skinny), then creating a new Scheme-based course (or something similar) in the vein of Caltech/MIT’s intro CS courses, would be a good way to go. Because there are students who like both: granted, maybe not too many, but even just having the difference clearly enumerated would probably do wonders for high-school CS/CE engineering. After all, “computer science has as much to do with computers as astronomy has to do with telescopes.”

  14. Scott Says:

    Jelani: As a practical matter, dropping the test means that (1) if students took the course they couldn’t receive college credit for it (or have it count as an “official AP” for college admission purposes), ergo (2) far fewer students will take the course, ergo (3) of the small number of schools that now offer the course, almost all will drop it.

  15. Scott Says:

    Domenic: I agree, it wasn’t a great course. The point is that for many high-school students, it was the only exposure they got to CS, and could therefore do wonders for much the same reason a crappy math textbook did wonders for Ramanujan. Despite my intense fear of falsifiable predictions, I predict that the pipeline of prospective CS majors will now get a little drier.

  16. Job Says:

    You know, Computer science has more to do with computers than Astronomy has to do with telescopes. A telescope is not an implementation of the theory of Astronomy.

  17. Job Says:

    Neither are a telescope’s abilities limited by the theory of Astronomy.

  18. Domenic Says:

    Job: I would argue that both of your statements are in fact incorrect, but to get into the details would take us too far off-topic, so let’s agree to disagree for now.

  19. KaoriBlue Says:

    I took the AB test back in high school… Honestly, I really wasn’t impressed with the test or what one needed to learn in order to do well on it (please don’t think me arrogant, I’m not trying to make a statement about the test’s difficulty). However, what really troubles me is that the primary motivation for almost everyone I knew who took this and the other offered AP exams was to rack up high scores to get an edge during the college admissions process. The general attitude being: if it’s not tested, whatever.

    If kids care about math or theoretical computer science, they’ll learn the material (in a much more thorough and meaningful manner) regardless of whether the test is being offered. What’s really needed is a better way to expose kids to these fields (in a non-lame way). Scott, thank you for contributing to that effort by making your lectures available online.

    Finally, about the board’s comments that not enough minorities were signing up for the test… the kids that care about this material ARE a minority!

  20. Job Says:

    My point is that “Telescopes and Astronomy” is not the greatest analogy for “Computers and Computer Science”.

    I won’t say that you can’t argue for it with some degree of success, but there’s no way that a Telescope embodies as much knowledge of Astronomy as a Computer does of Computer Science, even accounting for my ignorance.

  21. Scott Says:

    Are there that many jobs available for Computer Scientists? Is there a purpose for investing several years of hard work in obtaining a CS degree?

    Sean: In terms of our graduates getting jobs, I think (Phil Greenspun notwithstanding) that we still compare pretty well to sociology, classics, film studies, and dozens of other popular majors. Indeed, if you plotted all fields on intellectual depth vs. practicality, I think we’re pretty far out on the Pareto curve — but I would think that, wouldn’t I? 🙂

  22. harrison Says:

    (1) not enough students were taking it and (2) of those who were, not enough were underrepresented minorities.

    Bah. Just…bah.

    First of all, I’d like to point out that the College Board is a non-profit organization. They aren’t driven (as much) by the bottom line as a Fortune 500 company; even if they’re losing money on the AB test, they’re probably getting a net plus elsewhere which they can use to cover the difference… (OK. As a high school senior who’s spent most of the past nine months applying to colleges and scholarships, I can tell you that they probably make like $50 mil a year just sending out SAT score reports. Cash-strapped is one thing they ain’t.) So I’m not at all sure what point (1) is even supposed to mean.

    As for (2): Yeah, and dropping the CSAB course will get more underrepresented minorities interested in the subject…

  23. Job Says:

    The biggest risk to CS technical jobs is the innate automatability of any programming task. We’re guaranteed to reach a point where mostly everyone will be able to program and where mostly no one will have to program.

  24. DrRainicorn Says:

    Job: Computers don’t have to embody any substantial amount of knowledge about CS. All you need is something that acts like a Turing Machine. Of course you can then start optimizing the way it computes certain functions and so on, but I’m sure you can also optimize telescopes for the things you want to observe.

  25. Job Says:

    If you were to substitute Physics for Astronomy then i would agree. But in my interpretation telescopes are outside of Astronomy. Astronomy doesn’t study telescopes, but Computer Science studies the theoretical equivalent of a computer.

  26. Scott Says:

    The biggest risk to CS technical jobs is the innate automatability of any programming task.

    Turing had a more positive outlook: “There need be no real danger of [programming] ever becoming a drudge, for any processes that are quite mechanical may be turned over to the machine itself.”

  27. Job Says:

    But the programming of the mechanical can itself be turned over to the machine, and so on successively, until even the most mundane tasks are taken care of. Anything we understand we can automate, especially if we decide to move in that direction.

  28. Job Says:

    From now on i will refer to Astronomy as “Telescope Science”.

  29. Job Says:

    Scott, i misunderstood your point, and i agree.

  30. DrRainicorn Says:

    I guess you could define Astronomy as the study of things that you can see through a telescope, but that would neglect the fact that those things could be observed through other means (such as visiting them in spacecraft), much like how the name “Computer Science” may underemphasize the fact that things that are not designed to be computers do computation.

  31. Job Says:

    Anyway, i hate to post 4 consecutive comments but originally upon reading this blog entry and lecture 9 i was reminded of a question i once had that my TA wasn’t able to answer: “how do you prove that a problem A in NP isn’t NP-Complete”? Well, you prove that a reduction from an NP-Complete problem to A is impossible. Isn’t that right?

  32. Scott Says:

    Job, it’s true that the AP CS content is “mundane” — in the sense that this material was completely understood by the 60’s, and now occupies a lower level than what programmers or theorists normally have to worry about. Spelling, geography, and arithmetic are all mundane in exactly the same sense. But in each of these cases, the key point is that human understanding isn’t modular: if you’re not comfortable with the layers of abstraction just beneath you, then you almost certainly won’t be comfortable with your own layer either.

  33. Scott Says:

    Scott, i misunderstood your point, and i agree.

    Sorry Job, my last comment was posted before I saw this one.

    Regarding how you prove that a problem A in NP isn’t NP-complete: as a first step, by proving P≠NP! (Note that if P=NP, then all NP problems are NP-complete.)
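
    The parenthetical claim deserves its one-line proof, sketched here for many-one reductions (with the two trivial languages excluded; under Turing reductions even those are complete):

```latex
% Sketch: suppose P = NP, and let A \in NP be nontrivial, with some
% yes-instance a_+ and some no-instance a_-. Any B \in NP can then be
% decided in polynomial time, so the map
x \;\longmapsto\;
\begin{cases}
  a_+ & \text{if } x \in B,\\
  a_- & \text{if } x \notin B
\end{cases}
% is a polynomial-time many-one reduction from B to A, making A NP-complete.
```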

  34. Job Says:

    AP CS content wasn’t what i had in mind when i mentioned “mundane”. I was referring to tasks, rather than knowledge – as in programming x to do y.

  35. Job Says:

    Well, in my TA’s defense, i think my original question was for any A (not just A in NP). The NP portion was all me.

  36. Job Says:

    Although, for any problem:
    1. prove that A is not in NP, or
    2. if A is in NP, prove that P ≠ NP

  37. Will Says:

    I think the point that’s not being addressed here is that they’re _keeping_ the CS A test. What’s the point? Why not drop that one and make everyone take the AB one?

  38. Jack in Danville Says:

    A few years ago I had a site, which was (and still would be) the only place you could fully search and browse the entire Statistical Abstract of the U.S. (and some additional statistical data as well). One abstract table on A.P. tests was footnoted as copyrighted by the C.B., so I contacted them to acquire permission to republish…boy! First of all, the person in charge had never heard of the Abstract. After a lengthy exchange of emails, I finally did get permission, but only if I didn’t make accessible all the other data the C.B. had buried deep in its website.

    Back in my day there were only a handful of A.P. tests. I took two, one of which my high school had no course for (and that’s the one I got the highest score on).

  39. Jay Gischer Says:

    When I was a kid, there wasn’t even any such thing as AP, or personal computers, even. I learned programming in college with punch cards, in the snow. (Uphill both ways).

    When I applied to the Ph.D. program at Stanford (in 1979, ancient history), they said in their literature that they weren’t all that interested in CS majors, they thought your undergrad time was better spent on something else, like math or physics or EE or something. So the whole “CS doesn’t have intellectual content” thread of thought has been around a long time.

  40. asdf Says:

    I’m not sure why it’s interesting or important that there are no encryption algorithms known to be based on NP-complete problems. It isn’t necessary that decryption be in NP. Imagine “encrypting” your message by writing it on the facelets of some abstract, generalized Rubik’s cube, then scrambling it up with a sequence of twists (the “key”) known to the receiver but not to the attacker. This is an encryption algorithm based on the word problem for finitely presented groups, which is much worse than NP-hard: it’s not even Turing-decidable. An algorithm like AES-128 or RSA-1024 is in NP because the attacker can try all possible keys up to length 128 or 1024 or whatever, and either find a decryption or prove that none exists. But with the undecidable algorithm, the attacker might try all keys up to length 1 billion and give up without finding a decryption, because you chose a key of length 2 billion.

    I believe algorithms like the above have been published, though apparently they’re not terribly practical. But then, neither is the Blum-Blum-Shub RNG compared with conventional cryptographic RNG’s.
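
    asdf’s scrambling scheme can be made concrete in miniature (a sketch only: the two “twists” and the key below are invented for illustration, and this toy is of course not a serious cipher):

```python
# Toy version of the "scramble the facelets" idea above: the message sits on
# positions 0..7, the secret key is a sequence of fixed "twists" (permutations),
# and the receiver decrypts by applying the inverse twists in reverse order.
# The twists and key are invented for illustration; this is not a real cipher.

def apply_perm(perm, state):
    # Send the symbol at position i to position perm[i].
    out = [None] * len(state)
    for i, p in enumerate(perm):
        out[p] = state[i]
    return out

def invert(perm):
    inv = [0] * len(perm)
    for i, p in enumerate(perm):
        inv[p] = i
    return inv

# Two basic "twists" on 8 positions (think of two faces of a tiny cube).
TWISTS = {
    "a": [1, 2, 3, 0, 4, 5, 6, 7],  # rotate the first four positions
    "b": [0, 1, 2, 3, 5, 6, 7, 4],  # rotate the last four positions
}

def encrypt(message, key):
    state = list(message)
    for t in key:
        state = apply_perm(TWISTS[t], state)
    return "".join(state)

def decrypt(cipher, key):
    state = list(cipher)
    for t in reversed(key):
        state = apply_perm(invert(TWISTS[t]), state)
    return "".join(state)

key = "abbaab"  # the secret sequence of twists
assert decrypt(encrypt("SCRAMBLE", key), key) == "SCRAMBLE"
```

    An attacker who doesn’t know how long the twist sequence is can never be sure that exhausting keys up to some length settles anything, which is asdf’s point about undecidability.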

  41. Job Says:

    If instead of a Rubik’s cube we used the Clique problem, storing the message in the nodes of a clique of n/2 nodes in a pseudo-random graph of n nodes, then, to avoid ambiguity, we’d have to keep the number of cliques of size n/2 to 1, which sounds NP-Complete if we wanted to be able to use any graph of n nodes with a single n/2 clique.

    If we allow multiple solutions, then we’d have to package in extra information to identify the correct clique which could be used to break the encryption faster.

    What’s the complexity of a modified clique problem where the decorations of the nodes of one of its cliques follow a (known) polynomially computable function?

  42. Job Says:

    No, actually the more ambiguity the better, send the location of the clique to the client as the decryption key, place the message in the clique.

  43. Job Says:

    But using 128 bits, we’d have 128 bits for each 8 bits of data i think (128 = 16 nodes = 8 nodes of clique). So a message of X bits would take 128*(X/8) = 16X which is probably a little heavy.

    But we can store the message in the edges of the clique which would take it to 128*(X/28) = 4.5X which isn’t that bad. To break it you’d have to check approx 3,938,220 subsets of 8 nodes which is totally doable.

    With 1024 we’d have 45 nodes, 23 clique nodes, 253 clique edges. So 1024*(X/253) = 4X (huh, it goes down, i did something wrong). And you’d have 178,987,624,513 subsets to check.


  44. Daniel Says:

    Scott, Lecture 11 does not give a proper description of parameterized complexity. Its goal is to restrict the exponential increase of the running time to a parameter k, instead of the input size n. So it is not about being able to solve a problem in polynomial time for every fixed k (as it is implied by your description), but being able to solve the problem in *uniformly* polynomial time, i.e., the degree of the polynomial is the same for *every* value of k. For example, Minimum Vertex Cover can be solved in (c^k )n time, where c is roughly 1.3. So this can be an efficient algorithm for say, k=50, and n=100000, whereas an n^k algorithm would be hopeless in this case.
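
    Daniel’s c^k · n point is easiest to see in the classic branching algorithm for Minimum Vertex Cover, sketched below (each recursion picks an uncovered edge and branches on which endpoint enters the cover, so the search tree has at most 2^k leaves):

```python
# Bounded-search-tree algorithm for Vertex Cover, the textbook example of
# fixed-parameter tractability: any edge must have an endpoint in the cover,
# so branch on the two choices. The tree has at most 2^k leaves and each node
# does linear work, giving O(2^k * (n + m)) overall; the exponential part
# depends only on the parameter k, as Daniel describes.

def has_vertex_cover(edges, k):
    """Decide whether the graph given by `edges` has a vertex cover of size <= k."""
    if not edges:
        return True   # every edge is covered
    if k == 0:
        return False  # edges remain but the budget is spent
    u, v = edges[0]   # an uncovered edge: u or v must join the cover
    remaining_u = [e for e in edges if u not in e]
    remaining_v = [e for e in edges if v not in e]
    return has_vertex_cover(remaining_u, k - 1) or has_vertex_cover(remaining_v, k - 1)

# A triangle needs two vertices; a path of two edges needs only its middle vertex.
assert not has_vertex_cover([(1, 2), (2, 3), (1, 3)], 1)
assert has_vertex_cover([(1, 2), (2, 3), (1, 3)], 2)
assert has_vertex_cover([(1, 2), (2, 3)], 1)
```

    By contrast, an n^k algorithm’s exponent grows with the parameter itself, which is exactly the distinction Daniel is drawing.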

  45. Scott Says:

    I’m not sure why it’s interesting or important that there are no encryption algorithms known to be based on NP-complete problems. It isn’t necessary that decryption be in NP.

    That’s true if the keys can be arbitrarily long and are never reused — as in your Rubik’s cube example or for that matter a one-time pad. But as soon as people start reusing keys, the decryption problem quickly becomes an NP problem: first, find some messages for which you know the plaintexts (as they were able to do at Bletchley Park, etc.); next, find a key that when applied to those plaintexts yields the ciphertexts (that’s the NP problem); finally, use the key to decrypt future messages. And in that case, it becomes interesting to ask if we can at least get security assuming P≠NP.

    The other point is that in modern cryptography, one considers a huge number of tasks besides secure message transmission (authentication, zero-knowledge proofs, pseudorandom number generation, …), and for many of those, the connection to complexity is even more direct.
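
    Scott’s known-plaintext outline can be seen in a toy example (a tiny XOR cipher with a reused two-byte key; all names and parameters are invented for illustration): recovering a consistent key is a search problem whose candidate solutions can be verified in polynomial time, which is what makes it an NP problem.

```python
from itertools import product

# Miniature of the known-plaintext attack outlined above: a toy cipher that
# XORs each byte with a short reused key. The point is only that checking a
# candidate key against known (plaintext, ciphertext) pairs takes polynomial
# time, so key recovery is an NP-type search problem.

KEY_LEN = 2  # tiny on purpose, so exhaustive search finishes instantly

def encrypt(key, msg):
    return bytes(b ^ key[i % KEY_LEN] for i, b in enumerate(msg))

def consistent(key, pairs):
    # The polynomial-time verifier: does this key explain all known pairs?
    return all(encrypt(key, p) == c for p, c in pairs)

def recover_key(pairs):
    # The NP-style search: find *some* key consistent with the known pairs.
    for candidate in product(range(256), repeat=KEY_LEN):
        if consistent(bytes(candidate), pairs):
            return bytes(candidate)
    return None

secret = bytes([17, 202])
known = [(b"attack at dawn", encrypt(secret, b"attack at dawn"))]
key = recover_key(known)
# The recovered key now decrypts future traffic (XOR is its own inverse).
assert encrypt(key, encrypt(secret, b"retreat")) == b"retreat"
```

    With a real cipher the verifier stays polynomial-time but no feasible search is known; the question is whether P ≠ NP at least rules out a fast one.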

  46. asdf Says:

    But I think “find a key that when applied to those plaintexts yields the ciphertexts” is not necessarily an NP problem. For the sender and receiver who have the key, encryption and decryption are in P since the length of the key (to them) is constant. But for the attacker, the key has unknown size (maybe exponentially larger than the messages) and any new message that they intercept might not be a ciphertext at all. There is not necessarily a proof polynomial in the size of a plaintext-ciphertext pair (P,C) that C is the encryption of P.

  47. asdf Says:

    Btw I had thought there was a way to edit comments here (as you can see, my last few could use some fixups). Am I missing something?

  48. David Moles Says:

    I miss Pascal.

  49. Falk Hüffner Says:

    Lecture 11 seems to imply that parameterized complexity is about “polynomial-time for each fixed k”. That is wrong; fixed-parameter tractability is a stronger notion. An n^k algorithm (as in the example for Clique) is clearly useless. Something like 2^k * n (with fixed degree of the polynomial of n) would be interesting (and show fixed-parameter tractability), although incidentally for the clique problem it is probably not possible to get it (since Clique is W[1]-hard).

  50. Scott Says:

    asdf: Sorry, you’ll have to petition me if you want to change/delete your comments. 😉

  51. Scott Says:

    Daniel and Falk: Thanks! This being an undergrad class, I wanted to give the simplest possible example of taking advantage of a parameter besides n being small. But I shouldn’t have used the term “parameterized complexity,” which (as you rightly point out) has a much more specific meaning. It’s fixed now.

  52. Scott Says:

    asdf: In cryptography, you can’t get away with using a tiny key, and then claiming that your system is secure since “for all the attacker knows, the key might be exponentially long”! The problem here is obvious. The burden is on the cryptosystem designer, to show that no polynomial-time attacker can even guess the message with any non-negligible advantage. It’s not on the attacker to mathematically prove that he decrypted the message correctly.

  53. Not Even Right Says:

    Though I wasn’t able to attend the computer science courses at university, I am glad that I can read some of the relevant lecture notes from this website. I hope I can understand the notes with my limited maths.

  54. albert Says:

    My father learned computing, worked with punch cards, BASIC maybe; now he writes simple programs for metal parts on a 286 with the same old monitor (at least that was the case several years ago). He has forgotten almost everything about integrals and differential equations, yet he still solves 12th-grade math and physics problems perfectly.

  55. Jonathan Vos Post Says:

    “Education is what remains after we have forgotten all that we have been taught.”
    — George Halifax

  56. Peter Morgan Says:

    Kids who are going to be any good at CS can surely come to it through pure Math. People outside of university may think that CS is about practical programming skills, but I hope that no research Mathematician thinks so. The US system of not specializing until the Junior year means that surely any student who might do well in CS would notice the existence of CS as a separate, interesting, and challenging branch of Mathematics (can I say that without getting flamed?) that is distinct enough as a discipline to have its own department? Until a Sophomore realizes that what they really want to do and are good at is CS, any time spent on doing AP Math or learning about the proof process in Freshman Math classes is probably not wasted. At the Freshman level, CS is a subspecialty of Math. Of course this is a 70s educated speaker, back when there was no such thing as undergraduate CS: my recollection is that we math undergraduates knew well enough that there was such a thing as postgraduate CS, and that it was not an easy or theoretically lightweight option.

  57. Scott Says:

    Peter: Point taken. The more time I spend interacting with students, the more I understand how math is the prerequisite for CS; I suspect that’s as true today as it ever was. (If you can’t prove anything, then in particular you can’t prove the correctness of algorithms, nor can you make simple inferences like the one I just made.)

    What worries me much more than prerequisites is the initial spark of curiosity. For some nerds that spark comes from math, but for others it comes from programming — or from physics, philosophy, robotics, etc. And the fewer K-12 classes there are that provide any sort of content (even in diluted third-hand form), the smaller the chance that a given student’s spark will ever get lighted.

  58. Job Says:

    Doesn’t NP-Complete encryption really amount to:
    1. sender establishes a message size of n bits.
    2. sender picks L ≈ n/2 bit locations that contain actual data (and their order) and sends that to the client.
    3. when sending data, the sender breaks the data into chunks of L bits, distributes the L bits into the picked bit locations, and fills the unused bit locations in each message with a random value.
    4. the client looks only at the agreed bit locations, reorders the valid bits and ignores the rest.

    A hacker would have to determine which bit locations were being used (which is really subgraph isomorphism).
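
    Job’s four steps can be sketched directly (toy parameters, all invented for illustration):

```python
import random

# Sketch of steps 1-4 above, with invented toy parameters: the shared secret is
# *which* bit positions carry data, and in what order; the rest of each message
# is random filler.

N = 16                    # step 1: agreed message size in bits
rng = random.Random(42)   # seeded only so the sketch is reproducible

def make_secret():
    # Step 2: pick L = n/2 positions; the order is part of the secret.
    return rng.sample(range(N), N // 2)

def send_chunk(secret, bits):
    # Step 3: put the L data bits at the secret positions, fill the rest randomly.
    message = [rng.randint(0, 1) for _ in range(N)]
    for position, b in zip(secret, bits):
        message[position] = b
    return message

def receive_chunk(secret, message):
    # Step 4: read only the agreed positions, in the agreed order.
    return [message[position] for position in secret]

secret = make_secret()
data = [1, 0, 1, 1, 0, 0, 1, 0]
assert receive_chunk(secret, send_chunk(secret, data)) == data
```

    As asdf points out in a later comment, the scheme as stated falls to a chosen-plaintext attack, so this is only an illustration of the protocol, not a secure design.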

  59. Yatima Says:

    Job: “We’re guaranteed to reach a point where mostly everyone will be able to program and where mostly no one will have to program.”

    As a member of “the industry” and the programming fraternity, let me just say this: Red. Herring.

    The fallacy of the above is that programming is considered akin to writing linear text. Not so. Programming is about design and about planning in an N-dimensional space of possible histories. After doing requirements elicitation at the customer’s premises.

    “Mostly everyone” is unable to do anything of the sort properly. This goes even for the people in the cubicle arena (*cough*.) As for mostly no-one having to program — well, we always have been in that situation, really, haven’t we?

    I daresay that the same confusion is responsible for regular cries of “why can’t we construct software the way we construct cars”, made by academic or industry mavens, no less. Ridiculous.

  60. Peter Morgan Says:

    Scott, Let’s take it that a Math undergraduate who likes the pure stuff would notice that CS exists; chances are reasonable they would at least try CS in a Freshman or Sophomore course, then they’re in the hands of whoever teaches them. I think the same is true for someone who comes to CS through the logic side of Philosophy. Once someone takes a course, CS has to hope for either a spark or at least kindling from the teacher, from an assistant, from a fellow student, from a book, wherever. Where the spark or kindling comes from doesn’t matter much, only let there be as many matches on display as possible.

    That leaves you worrying about the Nerds, but I think a Nerd who started programming or building Bots when they were 8 or 10 who hasn’t accidentally clicked through to your blog, or some other source of the good stuff, by the time they’re 18, really hasn’t wasted enough time on their hobby to be worthy. A smart enough kid should figure out from the general science teaching at school, from SciAm, from Arthur C. Clarke, etc., etc., something like an idea that science of X is a systematic, often mathematical approach to X, which gives them a rough first guess at what Computer Science might be.

    To be good as a CS researcher, a Nerd has to have enough intellectual curiosity about ideas in the abstract when they first come across some CS fact or another when they’re 16, say, to wonder just what CS is. A Nerd is obsessive; if they find a good target for their obsession — CS surely is a good candidate (OK, we’ll call it fascination) — they will obsess. Some Nerds obsess only about concrete things, the high abstractions that are at the heart of CS are just not their cup of tea; for both CS and the Nerd, the Nerd’s choice to do something else is probably a good thing.

    In the 70s at least, the proof process was not part of the school Math curriculum. My first exposure to how to prove something, when I discovered that I was always going to be an applied Mathematician, was my first year of an English undergraduate Maths degree. Computer programming is essentially constructive, almost sui generis with building with LEGO. Insofar as CS is essentially a pure Math approach to computer programming, I can’t believe that being good at programming is a good predictor for success at CS. Nothing at school prepares you for the whole new world of proof. The proof process is the Angel at the Gate for CS.

    I’m not sure that a “spark” is what happens. For me there was a years-long period of dawning realization in my early teens that I could do Math better than almost anyone at my school. Positive reinforcement was frequent. Negative reinforcement in other spheres was also frequent. When I got to university, I wasn’t anything like a standout, but I was already committed. I went within Math where it seemed I could do stuff, which meant no pure Math for me, all applied options, Mathematical Physics, definitely no CS, but a bit of programming (on cards). Later, in my thirties, I read a few dozen books about Quantum Theory that made me cross, and I resolved to show those dumb Physicists why they’re so wrong, but the process of becoming committed to that project, now almost two decades long, was also at least a few years in the making.

    Also perhaps relevant, I had a nascent fascination with Chemistry killed stone dead by a year with a timeserving Chemistry teacher at age 13. It might be better not to have CS taught in schools at all unless it’s taught by teachers who at least don’t suffocate it. Losing Chemistry brought me to Math, so I’m not complaining, but perhaps Chemistry might.

    I’m curious, Scott. What was your spark? Since you talk of a spark, was there a single moment, or was there a gradual development, to the point where you suddenly realized that you had become, inadvertently, a Computer Scientist?

  61. Anonymous Says:

    Anyone (in your class or otherwise) notice the error on pg4 of lec10.pdf? The 3-colour version of the AND gate needs a NOT between node 2 and node “x AND y”.

  62. asdf Says:

    Job, encryption is more complicated than that: what you describe falls immediately to a chosen-plaintext attack (say the plaintext is all zeros). R. Impagliazzo has a good article on P/NP and cryptography (actually average-case complexity):

  63. Scott Says:

    I’m curious, Scott. What was your spark? Since you talk of a spark, was there a single moment, or was there a gradual development, to the point where you suddenly realized that you had become, inadvertently, a Computer Scientist?

    I guess there were a few sparks. I wrote about the first one here. Another came when I went to Mathcamp in Seattle the summer after I dropped out of high school, and (among dozens of other incredible lectures) heard Karp talk about NP-completeness. (Incidentally, I’ll be lecturing there this summer. When they asked me, I didn’t have to think about it.)

    Then, after I went to the Clarkson School, I did a Westinghouse project on hypertext design (which lost, though I managed to publish it later), supervised by Chris Lynch in the CS department. Besides a thousand other things for which I’m still indebted to him, Chris showed me how to analyze the complexity of my hypertext problem and directed me to the Garey-Johnson book. From then on I knew I wanted to be some kind of computer scientist, though from there to quantum complexity theory would require a couple more epiphanies…

  64. Job Says:

    asdf, that’s a problem, but I was mostly trying to understand what a generic implementation would look like. One solution would be to have the sender populate the unused bit locations with bits chosen from the data that’s actually being sent.

  65. roland Says:

    > if P=NP, then all NP problems are NP-complete.

    except {} and Sigma*

  66. Scott Says:

    Roland: Even those are NP-complete under Turing reductions (which I’m pretty sure is what I was talking about).

  67. roland Says:

    I had many-one reductions in mind.

  68. Paul Steckler Says:

    Life sciences, you say? What about the CS content of

    I recently had occasion to investigate phylogenetic reconstruction, and it’s all about creating appropriate data structures and devising feasible algorithms. Parallelization is used in some strategies. As far as I can tell, bioinformatics is just CS with NIH funding. 🙂

    — Paul

  69. Scott Says:

    The 3-colour version of the AND gate needs a NOT between node 2 and node “x AND y”.

    Thanks, fixed!

  70. Jonathan Vos Post Says:

    Two more reasons why I so admire Scott and his blog!

    I’m fascinated that Scott did a Westinghouse project on hypertext design (which he lost, though managed to publish later). Somewhat parallel to that, I won an Honorable Mention in the Westinghouse Science Talent Search of circa 1966, on Physical Chemistry. That was not much better than a loss at my high school (Stuyvesant), which boasts many Westinghouse winners. I’ve cited the losing work in various papers subsequently, as I now and then publish peer-reviewed Mathematical Biochemistry.

    The Hypertext parallelism is more unusual, as I met Ted Nelson in 1973 or 1974, having first heard of him from my mother for his filming of John Lilly’s dolphin lab experiments. His Academy-award-winning mother cooked me a lovely dinner, said that she didn’t mind her son writing a book about computers, because books were part of Show Biz, but hoped that he’d stop wasting time with computers as such. His dad also won an Academy Award.

    Mark Miller and I then co-implemented for Ted Nelson (in TRAC), working in my fiancee’s apartment in New Jersey (while she worked at Bell Labs), the first working demo of Hypertext and Hypermedia, with pull-down menus, hotlinks, wysiwyg text editing (Ted Nelson’s JOT = Juggler Of Text), and stick-figure animation; on a heterogeneous network of Processor Technology Sol-20, Imsai, and Cromemco machines, running through a high-level network operating system kernel we hacked together over the S-100 bus; and demo’d it at the World’s First Personal Computer Convention, Philadelphia, 1976, long before Apple, Tandy, or IBM made personal computers. Let’s say that, working in collaboration with Ted Nelson, I co-implemented a working Alpha Test of the World Wide Web, and showed that it could be learned in under 5 minutes by an 8-10 year old, who could then teach it to a younger sibling.

    Whoah — was that 32 years ago already?

    Ted Nelson, when he accepted the top award at, I think, the 5th WWW Conference, said (I paraphrase from memory): “You all thought I was crazy when I explained Hypertext and Hypermedia. Now you all use it. But you still think I’m crazy, and don’t listen to all the other things I say.”

  71. Aoeuidhtns Says:

    Alright, forget the board. Some CS professors could annually design a test. It could be as simple as a paper-and-pencil test, proctored by the high-school teachers.
    Granted, few if any schools would award credit for it, but 1) a high mark on the test could still be listed by the student as an accomplishment; 2) the students would be exposed to CS content — and in this case the content would be designed by people who are actually doing research; and finally 3) high-achieving students could be offered scholarships or otherwise enticed to study computer science.

    The fatal flaw is obviously that no professors that I know have time to do something like this. Unless, of course, it was done as part of a grant proposal 🙂

  72. Raoul Ohio Says:

    Consider three well-known goofy statements that are often quoted to prove a dubious point:

    1. “Humans only use 10% of their brain”. Freud?

    2. “Doing the same thing and expecting a different outcome is the definition of insanity” Einstein?

    3. “Computer science has as much to do with computers as astronomy has to do with telescopes.” Dijkstra.

    The only interesting aspect of statement 1 is why anyone believed it in the first place, and that it is still in print after a century or so. I don’t know if it was Freud who came up with this, but the same applies to much of what he wrote. I have wondered if Freud was a bright guy with some major issues, or was just making up nonsense off the top of his head to scam the rich people who hired him.

    Statements 2 and 3 are more interesting in that they are largely nonsense but have a glimmer of truth.

    Is it known whether Einstein actually said that, and was he joking? Given that he didn’t believe in quantum phenomena, the statement makes sense in strictly controlled situations. But I doubt his batting average was very good in softball games at the company picnic. Or picture him in the annual IAS vs. Princeton Physics dept. basketball game, switching shooting hands whenever he missed a foul shot.

    As for Dijkstra, my guess would be that he thought it was a funny way to make the point that CS goes far beyond computers. As I recall, back in the 1970’s, for some reason Springer-Verlag included a free copy of a book of Dijkstra’s essays with every order. Very interesting book. It contains a rant about the kind of country that does not get the day off on the Queen’s birthday. My main recollection is that he enjoyed writing provocative stuff that would obviously tick everyone off.

  73. Mauro Says:

    Great material. Thank you!

    Perhaps, in the lecture that mentions Dijkstra’s Algorithm, you could also mention that the algorithm may have helped the lecture notes find their way through the internet to the reader’s computer (with the OSPF protocol).
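    (For concreteness, a minimal sketch of Dijkstra’s algorithm, the same shortest-path computation that OSPF link-state routers run to build their routing tables. The graph encoding and the names below are my own illustration, not taken from the lecture notes.)

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path distances from `source` in a graph given as
    {node: [(neighbor, weight), ...]} with non-negative edge weights."""
    dist = {source: 0}
    heap = [(0, source)]  # priority queue of (distance, node)
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry; a shorter path was already found
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# A tiny example network: the direct A->C link (cost 4) loses to A->B->C (cost 3).
g = {"A": [("B", 1), ("C", 4)], "B": [("C", 2)], "C": []}
assert dijkstra(g, "A") == {"A": 0, "B": 1, "C": 3}
```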

  74. the reader from Istanbul Says:

    Lecture 11 says: “coNP is the complement of NP”. Wrong, right?

  75. Michael Says:

    A question on lecture three: The Puzzle.
    Basically I do not see how you change any input values: using an ‘and’ or ‘or’ to combine variables does not change the values of the variables, it only gives you an output. In fact, even applying a ‘not’ to a variable does not change the value of the variable, but gives an output that is its opposite value.

    I do not see how the output of any of these operations affects the values of the variables.

    BTW I find these lectures very interesting, much more interesting than the lectures I took at Columbia 30 years ago.

  76. Scott Says:

    Michael, in a computer science circuit you don’t change the variables. The goal is just to compute an output from them.
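    (A toy illustration of that point, entirely my own sketch: the gates are pure functions, so the circuit reads its input variables and computes an output without ever modifying them.)

```python
# Hypothetical example circuit (not from the lectures): gates as pure
# functions over Booleans, computing (x AND y) OR (NOT z).

def AND(a, b): return a and b
def OR(a, b):  return a or b
def NOT(a):    return not a

def circuit(x, y, z):
    # Composes gates into an output; x, y, z themselves are never written.
    return OR(AND(x, y), NOT(z))

x, y, z = True, False, True
out = circuit(x, y, z)
assert out is False
assert (x, y, z) == (True, False, True)  # the inputs are untouched
```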

  77. Scott Says:

    Reader from Istanbul: Thanks, will fix!

  78. Jonathan Vos Post Says:

    Which is the better anagram, and why, for scott aaronson:

    Sonata Consort

    A Contrast Soon

    Satan Con Roots

    Can Assort Onto

    As Onto Cantors

    Not As Cartoons

    something else (if so, be specific).

  79. Kea Says:

    Toronto ass can

    …from Mirna Deepsharp

  80. Bruno Says:

    On lecture 9, you say that the “Duh” problem (given a Turing machine M, is there any y such that M(y) accepts in fewer than n^k steps) is computable. How is that possible? Isn’t this problem r.e. (i.e., if there is such a y, we will find it), but not co-r.e.? I can’t see any way to decide that such a y does not exist, even in exponential time, because we would have to test all y’s before rejecting.


  81. Bruno Says:

    Oh, forget about it. This is a deterministic machine; you need at least n^k steps even to read an n^k-sized input, so only finitely many y’s need to be tested.