Bill Steiger taught the first weeks (finite automata), but unfortunately he has had to return to New Jersey unexpectedly, and I was asked to take over.
So why did I accept? On the one hand, I can cite Feynman on why teaching is important: if your research fails for a while, you can at least find comfort in your constant teaching. But since said comfort is so easily found in wine, this does not quite explain it.
The main reason I accepted was that I remember taking this course with Mike Sipser in sophomore year: it was all so much "fun." No student needs to go through this! This course should be significantly redesigned, and by teaching it, I can take a stab at the new version.
The main reasons I dislike the course are: (1) the mental masturbation: teaching kids unimportant topics simply because they're clean and elegant; and (2) the badly outdated perspective, which might have seemed elegant at the time, but has since proven misleading. Here are some examples:
- the pumping lemma. In an era in which limited state matters (think streaming algorithms), and in which we are seeing many exciting developments in understanding limited state (via lower bounds), we have students memorize a mechanical way to show that {w w^R} is not regular. (The mechanical argument is written out after this list.)
- push-down automata. In case you don't remember: push-down automata are not a nice limited class of machines that can parse grammars --- they are nondeterministic! Grammars that need the whole nondeterministic shebang are parsed via dynamic programs (most famously, Earley's algorithm; see the sketch after this list). More restricted classes are parsed by algorithms with cryptic names, like the LALR parser.
- the lack of exciting stuff in the undecidability section. People in programming languages and databases go on and on about decidable and undecidable logics. Surely there has to be a more exciting example than the Post correspondence problem (and, no, all rights to use "PCP" for this problem have been revoked as of 1990).
- Turing machines. All computer-looking models of computation are equivalent within logarithmic (often constant) factors. Telling students that we cannot fix a model of computation with less than polynomial imprecision in the running times can only breed a new generation of seriously misguided theorists.
- the P vs NP fetish. I have too much to say about this, so let's keep it for another post. Suffice it to say that the P vs NP fetish has led to people ignoring major topics like fixed-parameter tractability or SAT sparsification.
- a dry approach to space complexity. While things like NL = coNL are reasonably interesting, they seem to me far less interesting than space/time trade-offs. The issue of space/time trade-offs (which has some connection to what computation means in the real world) is completely buried by looking at space complexity alone, leading to some unexciting language theory.
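For the record, here is the mechanical pumping-lemma argument referenced in the first bullet, written out in full. It is the standard textbook proof; the witness string 0^p 1 1 0^p is one common choice.

```latex
% Standard pumping-lemma proof that L = { w w^R } is not regular.
Suppose $L = \{\, w w^R \,\}$ were regular, with pumping length $p$.
Pick the witness $s = 0^p 1 1 0^p \in L$, so $|s| \ge p$. The lemma
writes $s = xyz$ with $|xy| \le p$ and $|y| \ge 1$; hence $y = 0^k$
for some $k \ge 1$, lying entirely inside the first block of zeros.
Pumping once gives $x y^2 z = 0^{p+k} 1 1 0^p$, which is not an
even-length palindrome and therefore not in $L$. Contradiction;
$L$ is not regular.
```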
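And since the second bullet is about parsing: Earley's algorithm is the famous general one, but the dynamic-programming idea is easiest to see in the simpler CYK parser. Below is a minimal sketch, assuming a grammar already in Chomsky normal form; the dictionaries `unary` and `binary` and the toy grammar are hypothetical stand-ins, not any particular library's API.

```python
# A minimal CYK sketch: dynamic programming over substrings of the input.
# Assumes a context-free grammar in Chomsky normal form, given as:
#   unary:  terminal -> set of nonterminals A with a rule A -> terminal
#   binary: (B, C)   -> set of nonterminals A with a rule A -> B C

def cyk(word, unary, binary, start="S"):
    n = len(word)
    if n == 0:
        return False  # the empty word needs special handling in CNF
    # table[i][j] = set of nonterminals that derive word[i..j] (inclusive)
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, ch in enumerate(word):
        table[i][i] = set(unary.get(ch, ()))
    for length in range(2, n + 1):            # length of the substring
        for i in range(n - length + 1):       # left endpoint
            j = i + length - 1                # right endpoint
            for k in range(i, j):             # split point
                for B in table[i][k]:
                    for C in table[k + 1][j]:
                        table[i][j] |= binary.get((B, C), set())
    return start in table[0][n - 1]

# Toy grammar (hypothetical): S -> A B, A -> a, B -> b; accepts only "ab".
unary = {"a": {"A"}, "b": {"B"}}
binary = {("A", "B"): {"S"}}
print(cyk("ab", unary, binary))   # True
print(cyk("ba", unary, binary))   # False
```

The table-filling runs in O(n^3 * |grammar|) time, with no nondeterministic machine in sight.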
My attempts to "fix" the course will be documented on this blog. In the meantime, please suggest your favorite topic that you think should be covered in introductory complexity courses.
Sounds exciting! Looking forward to more updates.
It seems that Berkeley students will have a lot of fun.
We "fixed" it at my university by essentially nixing the course. Little bits survived here and there. Grammars in compilers, P vs NP in algorithms, undecidability (using virus detection and debugging) in software engineering.
ReplyDeleteTuring Machines are past their due date, so they didn't survive.
Well, I wouldn't be too happy about the course going away altogether. Students should hear about basic complexity.
Would you do PSPACE in an algorithms course? Probably not. More likely, it would fit in AI next to alpha-beta. But just distributing the course around loses a lot of content and makes for an unclear picture.
I'm sure you'll do a great job teaching the course. Sure, get rid of the pumping lemma; it only worsens students' understanding of what "regular" means.
I do not understand what you are saying about pushdown automata. Do you want to get rid of them? Can you give more details?
For undecidability, I think the point is to show just how little is needed to get there. My suggestion is to use cellular automata or Minsky machines.
What is your suggestion for replacing Turing machines?
I am curious to hear your opinion on P vs NP; I can't wait for the other post that you promised.
As for space complexity, I think it's important to define all the "standard" classes, and then explain why we can barely prove anything about separations. Sure, time/space trade-offs would be nice as well, if you have the space and the time to do them :D
Good luck with the class.
I don't know what you find "exciting", but a basic example of an undecidable problem is (the first-order theory of) Peano arithmetic, which can be contrasted with decidable Presburger arithmetic.
I'm really interested in this... yes... more on this topic
Anonymous #4, if the pumping lemma worsens understanding of regularity, either it's not being taught properly, or the students aren't spending enough time understanding it. It isn't a litmus test for regularity. When I learned this subject (also from Sipser at MIT, many years ago), he made a clear distinction between using the pumping lemma to prove non-regularity vs. exhibiting regular expressions (or grammars) to prove regularity.
I am not a theorist, but I occasionally make comments on my experience of the subject as an undergrad and grad student. Look at the "problem solving" and "learning" memories in my journal for more info.