
How can productivity differences in software projects be reliably measured?

In this case study, we compare six different metrics in terms of their suitability for productivity benchmarking at the level of development results. To do this, we use detailed data from four itestra software projects with a total development time of 264 months and 1.1 million lines of source code.

Code changes, absolute growth, number of commits and invested effort are measured in consecutive 3-month periods. This allows us to observe changes in productivity over the course of a project and compare different projects. We find correlations between effort and the selected output metrics as well as significant and explainable differences in productivity between projects and project phases.
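The bucketing into consecutive 3-month periods can be sketched as follows. This is a minimal illustration with hypothetical data, not the study's actual tooling: commits are assumed to be simple (date, change size) records, and the period index is computed from the month distance to the first commit.

```python
from datetime import date

# Hypothetical commit records: (commit date, tokens added or modified)
commits = [
    (date(2023, 1, 15), 420),
    (date(2023, 2, 3), 130),
    (date(2023, 4, 20), 900),
    (date(2023, 8, 1), 75),
]

def quarter_index(d, start):
    """Number of whole 3-month periods between start and d."""
    return ((d.year - start.year) * 12 + (d.month - start.month)) // 3

start = min(d for d, _ in commits)
periods = {}
for d, tokens in commits:
    idx = quarter_index(d, start)
    stats = periods.setdefault(idx, {"commits": 0, "tokens": 0})
    stats["commits"] += 1
    stats["tokens"] += tokens

for idx in sorted(periods):
    print(idx, periods[idx])
```

Aggregating per period like this is what makes productivity comparable both over the lifetime of one project and across projects.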

We also analyze whether a clone detection algorithm can improve the measurement by accounting for additions made through copy & paste as well as renamed or moved code. We find that this provides a small benefit. The redundancy-adjusted amount of added or modified code tokens proves to be the best of the selected metrics, especially in ongoing development, where an existing codebase is modified. The number of commits and absolute growth can usefully complement the overall picture.
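The idea of a redundancy adjustment can be illustrated with a deliberately crude sketch. This is not itestra's actual clone detection algorithm; it only shows the principle, under the assumption that a line whose normalized token sequence already exists in the old version (e.g. a reindented, moved, or copied line) should not count toward new output:

```python
import re

def tokens(line):
    """Very simple tokenizer: words and individual punctuation marks."""
    return re.findall(r"\w+|[^\w\s]", line)

def adjusted_added_tokens(old_lines, new_lines):
    """Count tokens in added lines, but treat lines whose token
    sequence already occurs in the old version as redundant
    (a crude stand-in for clone/move/copy detection)."""
    old_token_seqs = {tuple(tokens(l)) for l in old_lines}
    added = [l for l in new_lines if l not in old_lines]  # naive diff
    count = 0
    for line in added:
        toks = tuple(tokens(line))
        if toks and toks not in old_token_seqs:  # skip copied/moved lines
            count += len(toks)
    return count

old = ["int total = 0;", "total += x;"]
new = ["int total = 0;", "total += x;", "total += y;", "    int total = 0;"]
print(adjusted_added_tokens(old, new))
```

Here the reindented copy of `int total = 0;` is recognized as redundant and only the genuinely new line contributes to the count. A real clone detector works on longer token sequences and across files, but the effect on the metric is the same.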

  • Measuring productivity in software development is crucial, as the output of individual developers and teams can empirically differ by a factor of 10. Benchmarking makes it possible to place actual productivity on an absolute scale and realistically assess the potential for improvement – a central component of our HealthChecks. In contrast to story points, whose definition is team-dependent, objective metrics provide reliable information.
  • The empirical results presented here support the benchmarking approaches that itestra has been using successfully in consulting projects for over 20 years. The real data from large industrial projects is particularly valuable for research.
  • Two metrics developed by itestra use an automated clone analysis that also recognizes renaming, moving and copy & paste. This is crucial, as large software systems are rarely created from scratch but are continuously evolved.

Read more:

Benchmarking ongoing development output
in real-life software projects

Jonathan Streit and Lukas Feye

info@itestra.de
+49 89 381570-110

Shaping the future together.