Tim Bray recently mused that "...the World Wide Web would serve well as a framework for structuring much of the academic Computer Science curriculum. A study of the theory and practice of the Web's technologies would traverse many key areas of our discipline." He goes on to outline a curriculum that emphasizes theoretical computer science concepts relevant to modern systems... where modern == 2005. These days, forward-looking programmers care more about mobile devices, post-Moore's Law robots, and NUIs (natural user interfaces) than they do about the web.

Shouldn't we look ahead to what students will want to know by the time they graduate, rather than back to what was hot when they were in high school? In most cases, the answer is "no". It's hard to teach well unless you know how the pieces fit together, but it's hard to figure that out while the pieces are still half-formed and moving around all the time.

For me, the fact that it actually does make sense to think about reorganizing the undergrad curriculum around the concepts needed to understand the web is a sign that the web has become "normal" in Thomas Kuhn's sense: we're filling in the blanks and extending it in predictable ways, rather than fundamentally changing the paradigm.