Once upon a time, about a quarter of a century ago, I went into a prof's office in Edinburgh and told him that neural networks were the future, because, look, that's how our brains work, right? Neurons and connections and firing potentials—those are the building blocks of human intelligence, so all we have to do to make AI work is, you know, do that in software. And if it isn't working yet, well, that just means we need a bigger neural network, or maybe different firing rules. I was in my mid-twenties at the time—about the age that many of today's ed-tech entrepreneurs are now—and when he tried to explain to me why things might not be that simple, I quickly filed him under "old fogey". I didn't bother to read any of the papers he pointed me at: what could those patient, painstaking, footnoted summaries of old news possibly have to say about the brave new world my friends and I were about to create?
Well, you all know how that turned out...
But now it's 2012, and Audrey Watters' post "What Should Every Techie Know About Education?" (a.k.a. "the Audrey Test") is generating some interesting comments. Some are thought-provoking, like Ben Huffman's observation that, "Every school superintendent, college president and graduate student in education would be able to easily pass the Audrey Test, and yet, the system is still broken." Others, like SM's, do a great job of showing why techies need to do their homework before trying to "fix" education. (See BF's reply for a dissection.)
The Audrey Test has also led to some interesting questions. The two I was asked recently are:
How much does one need to know before one solves a problem?
How much of the status quo is deeply embedded within the theories of learning and academia?
My answer to the first is, "A lot more than I did two years ago (or even three months ago)." For example, watch this interview with John Sweller. I don't agree with everything he says, but my disagreement is based on gut instinct, i.e., on a poorly-examined mental model based on (probably inaccurate) recollections of personal experience. He knows more about how learning actually works than I do; that doesn't mean he's right, but if I don't take some time to learn what he and the constructivists he's criticizing are arguing about, the odds are that I'll repeat mistakes they made and moved past twenty years ago. Yes, there's a tradeoff between learning the past and building the future, since there are only so many hours in a life, but if I could send email back in time to May 2010, I'd tell myself to spend an hour a day, every day, learning from others.
The second question is harder for me to answer. On the one hand, many of the people I've found since I started doing my homework are much more deeply opposed to the status quo than I am. (In fact, I didn't realize how different from the status quo it was possible to be until Audrey's article pointed me at Paulo Freire, which kind of reinforces my answer to question #1.) On the other hand, every big institution's prime directive is to perpetuate itself, and one strategy that academia uses well is to keep people so busy talkin' radical to one another in safe little sandboxes called "departments" and "journals" that they never actually have any effect on the real world. Which brings us back to the Ben Huffman comment quoted earlier...
I don't know what the answer to #2 is. But here's what I currently think:
Technology does change things, sometimes profoundly, but that doesn't mean it changes everything, every time.
So if someone dismisses previous work in education without knowing what it was, they're probably doing so to give themselves an excuse not to do their homework.
Which means they're probably going to frame questions naively, reinvent a lot of wheels, and waste a lot of time. That will undoubtedly keep them busy, but that's definitely not the same thing as being productive.
To boil it down: if someone came to you and said, "I've never looked at the source code for Django, Rails, Struts, Zend, or any other web application framework, but I don't need to, because this new language I'm using is so different from everything that's come before that they're all obsolete," you'd admit that there was a chance they were right, but would you bet on it? Particularly if they'd never talked to anyone with, you know, actual real-world web application programming experience? I don't think it would matter how smart or hard-working or charismatic or well-funded they were—I think you'd put your money elsewhere. So why is education different?
Later: today's post by Mark Guzdial is another example of why people like me need to know more about stuff like this. I respect Mark tremendously, and Hal Abelson's record is as good as anyone's; if they disagree, I think I have an obligation to do a little learnin' of my own before venturing an opinion. See also "Scottish Verdict".