Is It Right vs. Does It Work?

I was catching up on my journal reading over the weekend, and it struck me that one difference between academia and industry is caring whether something is right versus caring whether it works. Neither implies the other... (I used to think that openness was another difference, but you can't actually read any of the papers listed below for free.)
  • Soares, Borba, and Laureano: "Distribution and persistence as aspects." In Software: Practice & Experience, vol 36, 2006. This is an interesting in-depth case study of how to disentangle the concepts in a fairly hefty Java application, and recast some of them using aspect-oriented programming. I'm actually pretty sceptical of AOP, but the authors make their case well.
  • Binder: "Portable and accurate sampling profiling for Java". In Software: Practice & Experience, vol 36, 2006. Binder's profiler counts bytecodes, kicking off a separate profiling agent every once in a while to snarfle the call stack. What's interesting is that the overhead of this scheme isn't much different from that of timing-based profilers, but the resulting profiles are actually more accurate.
  • Viégas, Golder, and Donath: "Visualizing Email Content: Portraying Relationships from Conversational Histories". In Proceedings of CHI 2006. I currently have over 50,000 email messages in roughly 500 archives, and that's just on this machine. Finding things in that mess is difficult, which is why the idea of showing a sliding-window tag cloud of the conversation appeals.
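The core of Binder's scheme — periodically snapshotting the running call stack and tallying what you find, instead of instrumenting every call — is easy to sketch. Here's a toy wall-clock version in Python (Binder's actual trigger is a bytecode counter, which needs JVM-level instrumentation, so treat this as the general sampling idea rather than his method; all names here are made up for illustration):

```python
import sys
import threading
import time
from collections import Counter

class SamplingProfiler:
    """Periodically snapshot another thread's call stack and tally the
    function names found there. Hot functions appear in proportionally
    more samples, which is the whole trick of sampling profilers."""

    def __init__(self, interval=0.005):
        self.interval = interval
        self.counts = Counter()
        self._stop = threading.Event()
        self._thread = None

    def _sample(self, ident):
        # sys._current_frames() maps thread ids to their topmost frame;
        # walk f_back links to record the whole stack.
        frame = sys._current_frames().get(ident)
        while frame is not None:
            self.counts[frame.f_code.co_name] += 1
            frame = frame.f_back

    def _run(self, ident):
        while not self._stop.is_set():
            self._sample(ident)
            time.sleep(self.interval)

    def start(self, ident=None):
        # By default, profile the thread that calls start().
        ident = ident if ident is not None else threading.get_ident()
        self._thread = threading.Thread(
            target=self._run, args=(ident,), daemon=True)
        self._thread.start()

    def stop(self):
        self._stop.set()
        self._thread.join()

def busy(n):
    """A deliberately hot function the sampler should catch."""
    total = 0
    for i in range(n):
        total += i * i
    return total

profiler = SamplingProfiler()
profiler.start()      # samples the current (main) thread
busy(3_000_000)
profiler.stop()
# profiler.counts now shows 'busy' dominating the samples
```

The low overhead Binder reports comes from exactly this structure: nothing happens on the hot path between samples, so the cost is bounded by the sampling rate rather than the number of calls.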
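The sliding-window tag cloud idea is simple enough to sketch: slide a window over the messages in a conversation and show the most frequent non-stopword terms in each window. A minimal version, with a made-up stopword list and toy messages (none of this is from the paper):

```python
from collections import Counter

# Hypothetical stopword list, just enough for the example.
STOPWORDS = {"the", "a", "to", "and", "of", "in", "is", "on", "for"}

def window_tags(messages, start, size, top_n=5):
    """Return the most frequent non-stopword terms in
    messages[start:start + size] -- the 'tag cloud' for that
    slice of the conversation."""
    words = []
    for msg in messages[start:start + size]:
        words.extend(w for w in msg.lower().split() if w not in STOPWORDS)
    return Counter(words).most_common(top_n)

messages = [
    "budget review for the grant proposal",
    "grant deadline is friday",
    "proposal draft attached",
    "lunch on friday?",
]

# As the window slides forward, the cloud shifts from work terms
# toward whatever the conversation drifted to.
early = window_tags(messages, 0, 3, top_n=2)
late = window_tags(messages, 1, 3, top_n=2)
```

Scaling that up to 50,000 messages mostly means real tokenization and stemming, but the windowed-frequency core stays the same.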
Still to go:
  • Syed-Abdullah, Holcombe, and Gheorghe: "The Impact of an Agile Methodology on the Well Being of Development Teams". Empirical Software Engineering, vol 11, 2006.
  • Do, Rothermel, and Kinneer: "Prioritizing JUnit Test Cases: An Empirical Assessment and Cost-Benefits Analysis". Empirical Software Engineering, vol 11, 2006.
  • Ellims, Bridges, and Ince: "The Economics of Unit Testing". Empirical Software Engineering, vol 11, 2006.