
Staying Safe Online

I am pleased to announce that “Ten quick tips for staying safe online” by Danielle Smalls and myself has been published by PLoS. Researchers studying everything from COVID-19 to sexual health and gun violence are increasingly targeted online; we hope these tips will help keep you a little bit safer.

@article{smalls2021,
  doi = {10.1371/journal.pcbi.1008563},
  url = {https://doi.org/10.1371/journal.pcbi.1008563},
  year = {2021},
  month = {3},
  publisher = {Public Library of Science (PLoS)},
  volume = {17},
  number = {3},
  pages = {e1008563},
  author = {Danielle Smalls and Greg Wilson},
  editor = {Scott Markel},
  title = {Ten quick tips for staying safe online},
  journal = {PLoS Computational Biology}
}

Strategies for Change

I’ve written before about Henderson et al.’s work on theories of change in education, and a recent conversation made me wonder if anyone has built a similar theoretical basis for the adoption of new software development practices. This diagram from Borrego & Henderson 2014 sums up the kind of model I’m looking for:

Borrego & Henderson change strategies

I think the categories translate pretty directly to tech: in the upper left we can teach people new coding practices directly, in the lower left we can change policies to require evidence of the use of continuous integration and unit testing, and so on. What would be equally interesting would be a theory of resistance to change, or more generously, studies of why developers don’t adopt better practices even when research has proven their value. For that, I would look to work such as Barker et al 2015, which explored several related issues:

How do faculty hear about practices?
They are motivated to solve a problem; they become aware through funded and institutional initiatives; they bump into new ideas at conferences; they learn from colleagues; or (in many cases) they simply don’t find out.
Why do faculty try out practices?
Sometimes their institutions encourage or require it (though there is often a tension between this and getting research done); they do an ad hoc cost-benefit analysis; or they are influenced by role models or other trusted sources or colleagues. There is often a tension between pedagogical innovation and covering the required material, and physical classroom layout can either help or hinder efforts; unfortunately, “Despite being researchers themselves, the CS faculty we spoke to for the most part did not believe that results from educational studies were credible reasons to try out teaching practices.” This echoes what Herckis has found, and part of the explanation is that people are afraid of looking stupid.
Why do faculty keep using practices?
“Perhaps the single strongest kind of evidence that supports a decision to routinize practices was student feedback.” Requirements from funders and faculty buy-in are also important, but “do the students like it?” (or “will they put up with it?”) is clearly the most important factor.

Once again, I think many of these ideas translate pretty directly to adoption of better software development practices. But that doesn’t mean such a translation would be correct: plausible analogies often turn out to be wrong, so I hope one day to see studies like these replicated in software engineering. If you’re doing this or know someone who is, I’d be grateful if you’d reach out.

Asteroids and Dinosaurs

Many years ago, I was working for one of the largest tech companies in the world (at that time, anyway). One Monday morning our team was called together and told that our project was being shut down. It came as a shock: we’d been making steady progress and were close to releasing something that other divisions in the company said they needed.

My boss had been with the company his entire career; he knew people who knew people. After a bit of digging he found out that a couple of vice presidents three levels up from us and three timezones away had been arm-wrestling for a year. The one we indirectly reported to had lost, and the one who had won was consolidating his position by shutting down everything his adversary had started.

I was outraged, but my boss just shrugged. “Asteroids happen,” he said. There you are, happily grazing on palm fronds and thinking about making little dinosaurs when wham! a cubic kilometer of rock hits the planet and that’s all she wrote. In a small company you can usually tell when something’s coming: you may not know why the senior management team is spending all their time in the board room with the curtains down so that no one else can see what’s on the whiteboard, but you know it’s happening. But in a large company, or a distributed one, too much is going on and too much is out of your peripheral vision. And if you’re on the other side of the planet from the point of impact, you might not even realize what’s happened. All you know is that the sky has gone dark and summer is long overdue.

I’m less outraged now than I was then. I grew up thinking that the dinosaurs had been failures because they went extinct, but they lasted about 165 million years and their descendants are still flying around all over the world. That long-ago project didn’t ship, but bits and pieces of what we built cropped up later in other things that members of our team were reassigned to. And without that great big wham! 65 million years ago, my distant ancestors wouldn’t have had a bunch of empty ecological niches to expand into. Having a cubic kilometer of rock make a crater in your world is always going to be a bit of a shock, but after the dust settles, it’s better to think about how much there was and what came after than about how unexpectedly it all ended.

Blinkered Visions

I was looking for some books in storage yesterday and came across this:

2020 Vision

Published in 1974, it contains eight stories by some of the leading science fiction writers of its time: all of them white men. I didn’t notice that when I first read it at the age of 12 or 13, any more than I noticed how much the authors wanted the future to be louder and brighter but not really different—at least, not in any way that would make them feel uncomfortable. Angry or afraid, yes: dystopias are meant to make you feel that way. But none of these stories would make their authors or readers wonder, “Am I the bad guy?”

Coincidentally, Malcolm Bradbury’s novel The History Man came out at almost the same time. I didn’t read it until some time in the late 1980s, and recognized enough of myself in the (awful) character of Howard Kirk that I had to set it aside two or three times before I could finish it. It didn’t have any spaceships or aliens, but it turns out it had a lot more to say about the future than the stories that did.