TL;DR: I propose a two-day workshop to teach programmers what they need to know about politics, economics, and justice from a progressive point of view in order to succeed in tech. The approach draws on the success of previous workshops and needs $80K-$100K to implement.
I grew up on 70s detective shows: The Rockford Files, Columbo, and Hawaii Five-O were the highlights of my week. One thing I remember from them is that every mystery starts with a victim, and that every case revolves around method, motive, and opportunity. Forty years later, I find myself using those same principles to think about teaching. Every lesson starts with a learner, and every teacher has to answer three questions:
- Method: how will we teach them?
- Motive: why are we teaching them (and why are they learning)?
- Opportunity: what can we do to make their learning possible?
Lesson content isn’t on this list because it’s a consequence, not a cause: once you know who, how, why, and what, the rest is straightforward. For example, if I were starting Software Carpentry today, my “case” would be:
The learner is a graduate student who may or may not have done a programming class in high school or as an undergraduate, but isn’t really comfortable with loops or functions. They do most calculations interactively in Excel or SAS and record their results by copying files to folders with names like
Their motive for learning basic programming skills is that there’s a big pile o’ data between them and their thesis. We want to teach them because it’s painful to see talented young people spend weeks doing things that could be done in hours, and because the more they know, the more likely they are to get the right answer.
Our method is going to be intensive two-day workshops, in person and on site, using live coding on the learners’ own machines.
We will create the opportunity to teach through:
- an instructor training program to turn people who already have these skills and program alumni into volunteer instructors;
- sponsorship from professors, universities, government labs, and professional societies that understand the need for this training; and
- word-of-mouth advertising combined with articles in open access journals making the case for this approach and these skills.
I think I got the first three right at the time, but I didn’t pay nearly enough attention to the fourth. Researchers’ need for remedial computing classes was so clear that I fell into the trap of thinking, “If you build it, they will come.” The world is getting noisier by the day; how will people find out that you’re doing something that will help them? How are you going to make your effort self-sustaining so that they’ll still have an opportunity to learn next year when you realize you’ve been away from home too often and for too long?
I am thinking about all of this again because teaching programmers how the world works now seems more important than teaching researchers how to program.
Earlier this year I outlined a four-day workshop that would teach 60 people how to engineer structural change. I now believe that would be like starting a programming class with object-oriented programming and performance optimization when people are still struggling with for loops. I think we should start with something like this:
The learner is a white or Asian male programmer in his mid-20s who has been working in industry for four years. He would protest indignantly if you called him racist or sexist, but grumbles occasionally to his friends about his company’s “diversity hires”. He gets his news from Twitter and thinks that politicians are stupid, corrupt, or social justice warriors.
We want to teach them because we need their support if we’re going to fix what’s broken in tech. Changing the minds of a few well-meaning CEOs won’t do it: if we don’t reach the hearts and minds of the rank and file, they will say what they know they’re supposed to say in public, then go back to their desks and keep thinking and doing exactly what they’re thinking and doing right now.
But what’s their motive? Why would the guy described above show up of his own accord and actually listen? My current answer is that this is now something he needs to know in order to survive and thrive in tech. Society isn’t going to let Zuckerberg and Bezos play robber baron forever, or keep overlooking Lutke’s support for white nationalist hatemongers. What do you need to know to be ready as tech starts to take its social impact and obligations seriously? What do you need to know in order to understand where GDPR comes from, or how monopolies have been broken up and regulated in the past? What is a benefit corporation, and is the Hippocratic License actually workable? If the courts come after your startup for algorithmic bias, what precedents will they be drawing on, and how can you make sure that you’re on the right side of the line both legally and morally?
Our method is going to be intensive two-day in-person workshops for two dozen people offered at tech conferences, on-site at companies for their staff, and in community venues for those who can’t get to either of the previous kinds of offerings. Each day will comprise four 90-minute sessions; each session will have 30-45 minutes of instruction with 2-3 formative assessments followed by a longer small-group exercise and 10-15 minutes of whole-group wrap-up.
We will create the opportunity to teach through:
- an instructor training program to help activists in tech and program alumni learn how to teach this themselves;
- sponsorship from companies whose leaders understand the need for this training (probably through a “buy one, pay for two” model); and
- social media endorsements from those same leaders coupled with advertising by conferences hosting sessions.
What would we teach? I would borrow the method of Freakonomics: instead of preaching outright that the only way to analyze social relationships is through market exchange of economic value, the authors nudge their readers, saying, “Hey, do you want to feel smart? Do you want to see what’s going on behind the curtain?” and then tell stories with unexpected twists. It’s a great way to hook people whose sense of self-worth comes from being the smartest person in the room, and it has made Freakonomics the most influential piece of propaganda for neoliberal capitalism in the last twenty-five years. My starting point would be:
- How did we decide that someone can own something intangible?
- Why does discrimination persist when it’s economically inefficient?
- Why do bullshit jobs proliferate as companies get larger?
- Why does the US put so many more people in prison than other countries?
- How do we decide what is and isn’t a human right?
- Why are drugs and sex work illegal even though they’re ubiquitous?
- If central planning is inefficient, why do all successful companies do it?
- How are dictators and oligarchs able to stay in power?
These topics are chosen because they let us sidle up to the larger points we want to make without scaring our intended audience away. We can’t start with Audre Lorde or Ijeoma Oluo or Zeynep Tufekci: again, that would be like starting an introductory coding bootcamp with callbacks and recursion. I think we can get learners to the point where they could tackle Tressie McMillan Cottom’s Lower Ed or James Scott’s Seeing Like a State, but I’m not the right person to make that decision: as a straight middle-aged white guy who got into tech in the 1980s, I have a lot of blind spots.
I would therefore want to recruit half a dozen people who understand the industry and the people in it—who have worked with our prototypical learner and know what he’ll actually listen to—and have them decide what stories are most likely to be effective. I would want to pay them for their time, so at $10K per lesson, getting this workshop into a runnable state will cost something like $80K-$100K. That may seem like a lot, but it’s actually a bit less than it cost to get Software Carpentry up and running back in 2010, and if a dozen sponsors each get a workshop in return for their backing, it’s right in line with what other professional training costs.
Will this work? I don’t know, but I want to try. I watched late-night horror movies back in the 70s as well as cop shows, and I never understood why some people hid in the basement and waited for the monster to find them instead of doing something. I’ve been frightened and angry for the last three years watching tech moguls amass obscene fortunes by amplifying hate while democracy rots and the planet burns. Giving young programmers the tools they need to see that for what it is won’t fix it right away, but what we’re doing right now isn’t going to fix it at all.
You can’t spend your whole life criticizing something and then, when you have the chance to do it better, refuse to go near it.
— Václav Havel