Friday 27 June 2014

When off-shoring software to India, include code quality metrics as a part of the contract

I understand the appeal of off-shoring software development to India: low rates, scalable team size, and a process that has really matured over the years.  India is a serious and credible competitor for software development services.

I have personally been asked to maintain software written by large Indian off-shore companies.  While the software usually meets the functional requirements and passes manual QA testing, in my experience the quality of the code written overseas is often poor.  Specifically, the resulting code is not extensible and is expensive to maintain.  I am not exaggerating when I say I have seen 2,000-line methods inside 6,000-line classes that were copy/pasted multiple times.

Setting aside for a moment the implicit conflict of interest in being paid to write code that is expensive to maintain, there is some fairness to the Indian offshore developers' position.  When customers complain that it is expensive to change features or add new ones to the delivered system, the developers innocently respond, "Well, you never told us you were going to need those changes..."

There is a simple answer to this.  Ask for it up front.  And I don't mean ask for the system to be extensible and maintainable.  That's vague.  I mean require the developer to run a Continuous Integration server (such as Jenkins) with a code quality platform such as SonarQube wired into the build, and measure the specific code quality metrics that matter.
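
As a rough illustration of what that looks like in practice, the analysis side can be as small as a sonar-project.properties file that the SonarQube runner picks up from the Jenkins build.  This is a minimal sketch, not a complete setup: the project key and name below are placeholders, and the actual pass/fail thresholds are configured on the SonarQube server, not in this file.

    # sonar-project.properties -- read by the SonarQube runner during the CI build
    # (project key and name are placeholders)
    sonar.projectKey=com.example:offshore-project
    sonar.projectName=Offshore Project
    sonar.projectVersion=1.0
    sonar.sources=src/main/java
    sonar.sourceEncoding=UTF-8
    # The thresholds for the four metrics discussed below live on the
    # SonarQube server itself, not in this file.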

In my experience, measuring the following 4 metrics goes a long way towards ensuring the code you get back is extensible and maintainable.

  1. Package Tangle Index = 0 cycles.  This ensures the software is properly layered, which is essential for extensibility (the first sketch after this list shows what a package cycle looks like and one way to break it).
  2. Code Coverage between 60% and 80%.  This is essential for low maintenance costs.  This metric is about automated testing.  The automated unit tests quickly discover side-effects of future feature changes, allowing you to change how the system behaves and get those changes into production with a minimum of manual regression testing (the second sketch after this list shows the kind of test that provides this safety net).
  3. Duplication < 2%.  Any competent developer will maintain low code duplication as a basic pride of craft.  But I have been astonished at the amount of copy/paste code I've seen come back from India.  If you don't measure it, unscrupulous coders will take this shortcut and produce a system whose maintenance costs quickly spiral out of control (the last sketch after this list shows a typical duplicated block and its fix).
  4. Complexity < 2.0 / method and < 6.0 / class.  This metric plays a huge role in extensibility.  Giant classes with giant methods make a system brittle and resistant to change.  Imagine a building made out of a few giant Lego blocks versus the same building made out of 10 times as many smaller Lego blocks.  The latter building will be far more flexible to reshape as business needs change.  The same sketch that illustrates duplication also shows how extracting logic into small methods keeps per-method complexity low.
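
To make the first metric concrete, here is a minimal sketch of the kind of package cycle SonarQube flags and one common way to break it with an interface.  The package and class names (com.example.billing, com.example.reporting, Invoice, ReportWriter) are hypothetical, not taken from any real project.

    // Before (a tangle): com.example.billing.Invoice calls
    // com.example.reporting.ReportWriter directly, while ReportWriter's
    // methods take com.example.billing.Invoice as a parameter.  Each
    // package depends on the other, so SonarQube reports a cycle.

    // After: billing owns a small interface; reporting implements it.
    // Every dependency now points one way (reporting -> billing).

    // File: com/example/billing/InvoiceListener.java
    package com.example.billing;

    public interface InvoiceListener {
        void invoiceClosed(String invoiceId);
    }

    // File: com/example/billing/Invoice.java
    package com.example.billing;

    public class Invoice {
        private final InvoiceListener listener;

        public Invoice(InvoiceListener listener) {
            this.listener = listener;
        }

        public void close(String invoiceId) {
            // ... billing logic ...
            listener.invoiceClosed(invoiceId);  // no mention of the reporting package
        }
    }

    // File: com/example/reporting/ReportWriter.java
    package com.example.reporting;

    import com.example.billing.InvoiceListener;

    public class ReportWriter implements InvoiceListener {
        @Override
        public void invoiceClosed(String invoiceId) {
            // write the report entry; reporting depends on billing,
            // never the other way around
        }
    }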
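
For the second metric, the point of the coverage target is not the number itself but the safety net the tests provide.  Here is a minimal sketch using JUnit 4, with a hypothetical DiscountCalculator class: if a later change quietly alters the discount rule, the build fails immediately instead of the defect surfacing during manual regression testing.

    // File: DiscountCalculator.java (hypothetical production class)
    public class DiscountCalculator {
        // Orders of 100.00 or more get a 10% discount.
        public double priceAfterDiscount(double price) {
            return price >= 100.00 ? price * 0.90 : price;
        }
    }

    // File: DiscountCalculatorTest.java
    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class DiscountCalculatorTest {

        @Test
        public void discountsOrdersOfOneHundredOrMore() {
            assertEquals(90.00, new DiscountCalculator().priceAfterDiscount(100.00), 0.001);
        }

        // If a future feature change accidentally moves the threshold or
        // the rate, this test fails in the next CI build.
        @Test
        public void leavesSmallOrdersAlone() {
            assertEquals(99.99, new DiscountCalculator().priceAfterDiscount(99.99), 0.001);
        }
    }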
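
Finally, the duplication and complexity metrics tend to improve together.  Here is a minimal, hypothetical sketch (the Customer and CustomerValidator classes are purely illustrative): the copy/pasted check is what inflates both numbers, and pulling it into one small method removes the duplication while keeping each method's complexity low.

    // Before: the same check copy/pasted into many long methods,
    // inflating both the Duplication and the Complexity numbers.
    //
    //     if (customer == null || customer.getEmail() == null
    //             || !customer.getEmail().contains("@")) {
    //         throw new IllegalArgumentException("invalid customer");
    //     }

    // File: Customer.java (hypothetical domain class)
    public class Customer {
        private final String email;

        public Customer(String email) {
            this.email = email;
        }

        public String getEmail() {
            return email;
        }
    }

    // File: CustomerValidator.java
    // After: one small, single-purpose method, called wherever it is needed.
    public final class CustomerValidator {

        private CustomerValidator() {
        }

        public static void requireValid(Customer customer) {
            if (customer == null) {
                throw new IllegalArgumentException("customer is required");
            }
            String email = customer.getEmail();
            if (email == null || !email.contains("@")) {
                throw new IllegalArgumentException("customer email is invalid");
            }
        }
    }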

A word of caution about using SonarQube.  Some developers, particularly those with a perfectionist bent, can get lost in a rabbit hole of trying to improve their code's "score" on the many other metrics the tool offers.  Violations, Rules Compliance, Technical Debt Score and LCOM4 are particularly tempting to undisciplined developers.  But in my experience, these metrics provide limited return on investment.  If you do decide to measure your code quality, I urge you to ignore them.  While it can be a lot of fun spending weeks making your code "squeaky clean," the business value of these other metrics pales in comparison to what you get out of the 4 metrics I recommended.

So the next time you outsource a development project to India, protect yourself from getting back junk by requiring code quality metrics right in the contract.  It might add an extra 10% to the initial cost of the system, but that cost will be more than offset by the resulting extensibility and maintainability of the code you get back.
