According to Wikipedia (https://en.wikipedia.org/wiki/Return_on_investment), ROI is defined as “the benefit to the investor resulting from an investment of some resource. A high ROI means the investment gains compare favorably to investment cost. As a performance measure, ROI is used to evaluate the efficiency of an investment or to compare the efficiency of a number of different investments. In purely economic terms, it is one way of considering profits in relation to capital invested.”
I was recently asked by a potential client to justify the ROI not only of the training for the tool but of the tool itself. While I have always known the implied value of using a modern development tool, I decided it was time to actually quantify it. The results are impressive, particularly as the number of developers increases. But even for a shop with a single developer (and there are many!), it’s hard to dispute the result.
Facts and assumptions for determining ROI
Price of RDi – On the IBM website, the price of RDi is listed at $1,030; this is the number I will use in my calculations. If you work in a different currency, convert accordingly.
Hours per week – Needing a starting point, I chose a typical 40-hour work week, with 30 of those hours allocated to application development, including source code maintenance and debugging.
Productivity gain using RDi vs. not using RDi – This is always an interesting discussion because I have heard values ranging from 20 percent to as high as 60 percent, and once even 70 percent. Clearly it is a personal number; for this example I will use the lowest estimate, 20 percent. I believe a more accurate number is closer to 40 percent (when coupled with education).
Time to realize the gain in productivity – Without any training at all, it’s been said a developer will become more productive within two months (8 weeks), though that gain will likely be on the lower end of the spectrum.
Remaining time in the first year to continue using RDi – With the first 8 weeks removed, 44 weeks of development remain; accounting for holidays and personal time, I round down to 40 weeks.
The numbers at work
Current: 30 hours of application development time.
Increase of 20 percent in productivity: 30 * 1.20 = 36 hours (6 “extra” hours each week).
Salary of $52,000 per year = $52,000 / 52 = $1,000 per week
$1,000 / 40 hours per week = $25 per hour
Effective $ increase of output = $150 per week (6 hours * $25)
Increase of productivity for first year for ONE developer = $6,000 (40 weeks * $150)
Using the same formula for a salary of $75,000 (hourly rate rounded to $36), the increase becomes $8,640.
Raise the salary to $100,000 (hourly rate rounded to $48) and the increase becomes $11,520.
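For anyone who wants to check or adapt the arithmetic, here is a minimal Python sketch of the calculation above. The constants simply mirror the assumptions stated earlier (30 development hours, a 20 percent gain, 40 productive weeks, and the hourly rate rounded to whole dollars); comparing the gain to the $1,030 license price is just one way to express the result as a ratio.

# A minimal sketch of the arithmetic above, under the article's assumptions.
RDI_PRICE = 1030          # one-time license cost (USD), from the IBM website
DEV_HOURS_PER_WEEK = 30   # hours per week spent on application development
WORK_HOURS_PER_WEEK = 40  # typical work week
PRODUCTIVITY_GAIN = 0.20  # most conservative estimate
PRODUCTIVE_WEEKS = 40     # remainder of the first year after ramp-up and time off

def first_year_gain(annual_salary):
    """Dollar value of the extra output for ONE developer in the first year."""
    hourly_rate = round(annual_salary / 52 / WORK_HOURS_PER_WEEK)   # rounded to whole dollars
    extra_hours_per_week = DEV_HOURS_PER_WEEK * PRODUCTIVITY_GAIN   # 6 extra hours at 20 percent
    return extra_hours_per_week * hourly_rate * PRODUCTIVE_WEEKS

for salary in (52_000, 75_000, 100_000):
    gain = first_year_gain(salary)
    print(f"${salary:,} salary -> ${gain:,.0f} gain ({gain / RDI_PRICE:.1f}x the license price)")

Running it reproduces the $6,000, $8,640, and $11,520 figures, each of which is several multiples of the one-time license price.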
Points to reiterate:
This example is for ONE developer.
The most conservative estimate of 20 percent was used and was assumed to hold flat for the entire year, which understates the gain. You should expect this number to increase throughout the year before it eventually plateaus.
With proper training, the learning curve is much shorter, so application development productivity increases much more quickly. Additionally, using a trainer in a hands-on environment builds the developer’s confidence, increasing the odds that RDi will become the permanent development tool of choice.
The investment in RDi is a one-time purchase; software maintenance is required in the following years but at a much lower cost.
Final thought:
To be sure, these calculations are my own, but I believe they are representative of the true ROI.