Lorena Barba has done it again. Having created a wonderful 12-step introduction to the Navier-Stokes equations using the IPython Notebook, she has now published AeroPython, which teaches the use of potential flow for aerodynamic analysis via the panel method. You don't have to know what that is to appreciate the beauty of what she has built—I certainly don't, at least not yet—but I'd like to explore something that isn't in those notebooks. Before doing that, though, I need to introduce an acronym that never caught on.

Most programmers are familiar with the term "API", which stands for "Application Programming Interface". It's the functions, classes, and what-not that a library provides, i.e., the things a programmer can call or use after loading that library. The complementary term that never caught on is "XPI", or "External Programming Interface", which is the set of things a library needs in order to run. For example, the XPI of the Python glob library includes:

  • sys.getdefaultencoding()
  • sys.getfilesystemencoding()
  • os.curdir
  • os.error
  • os.listdir(dirname)
  • os.path.join(dirname, filename)
  • os.path.lexists(pathname)
  • os.path.split(pathname)

and a few things from the re and fnmatch libraries.
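An XPI like this can be approximated mechanically. The sketch below (all names invented here for illustration) parses a module's source with Python's `ast` module and collects every dotted name rooted at something the module imports, which is a rough first cut at "the things this library needs in order to run":

```python
import ast

def xpi(source):
    """Rough sketch of a module's 'XPI': every dotted name its source
    references that is rooted at a module it imports."""
    tree = ast.parse(source)

    # Names bound by top-level `import` statements.
    imported = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            for alias in node.names:
                imported.add(alias.asname or alias.name.split('.')[0])

    # Attribute chains (e.g. os.path.join) whose root is an imported name.
    used = set()
    for node in ast.walk(tree):
        parts = []
        while isinstance(node, ast.Attribute):
            parts.append(node.attr)
            node = node.value
        if isinstance(node, ast.Name) and node.id in imported and parts:
            used.add('.'.join([node.id] + parts[::-1]))

    return sorted(used)

src = (
    "import os\n"
    "import sys\n"
    "def up(d):\n"
    "    return os.path.join(d, os.curdir)\n"
)
print(xpi(src))  # includes 'os.path.join' and 'os.curdir'
```

This only catches static, fully-dotted uses (it would miss `from os.path import join`, for instance), but even that much would make a library's external interface as visible as its API.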

I sometimes think of lessons in terms of APIs and XPIs. A lesson's API is what it defines and explains; its XPI is the terms and concepts it relies on. As with code, APIs are usually explicit: "this library provides these functions" and "this lesson introduces these topics". In contrast, XPIs are usually implicit: programs don't list the things they use, they simply use them, and the only way to discover exactly what concepts a lesson depends on is to read it through carefully. That takes so much time that it's often faster for people to write their own short lesson on a topic than to integrate with one someone else has written.

And then there's the maintenance problem. Software Carpentry's lessons are constantly evolving; how can someone who depends on them know whether everything they require is still there a year or two down the road? With software, they can recompile their program or re-run its unit tests and see whether things still work. There's no equivalent for lessons—no easy way to find out whether dependencies that used to resolve are still there.
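To make the analogy concrete, here is a hypothetical sketch of what a lesson-dependency check might look like if lessons declared what they teach ("provides") and what they assume ("requires") in a manifest; the lesson names and concept labels below are made up for illustration:

```python
# Hypothetical lesson manifests: each lesson lists the concepts it
# teaches and the concepts it assumes the learner already knows.
lessons = {
    "shell-basics": {"provides": {"pipes", "loops"}, "requires": set()},
    "git-intro":    {"provides": {"commits"},        "requires": {"loops"}},
    "make-intro":   {"provides": {"builds"},         "requires": {"pipes", "rules"}},
}

def unresolved(lessons):
    """Return {lesson: missing concepts} for requirements no lesson provides."""
    taught = set().union(*(info["provides"] for info in lessons.values()))
    return {name: info["requires"] - taught
            for name, info in lessons.items()
            if info["requires"] - taught}

print(unresolved(lessons))  # {'make-intro': {'rules'}}
```

Running a check like this after every change would be the lesson-world equivalent of re-running a program's unit tests: a broken dependency shows up immediately instead of a year later.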

Sooner or later, any large, multi-author project has to find a way to track and manage dependencies. Conversely, I believe that if a project can't do this, it won't be able to scale up. It isn't the only obstacle to collaborative lesson development, or the biggest, but it is an obstacle, even within Software Carpentry itself. If we can figure out how to solve it, we'll be one step closer to helping all the potential Lorena Barbas out there create a network of wonderful lessons.

This post originally appeared in the Software Carpentry blog.