In its early years, Agile was closely associated with small software development teams working in the same room. In 2006 Agile was adopted mainly by companies of up to 20 people, and five years later half of all Agile adopters globally were already mid-size to large organizations.
According to the 2011 State of Agile Development Survey by VersionOne, 60% of companies already use Agile on half of their corporate projects, and 27% use it on 76–100% of their projects. With the growing popularity of Agile practices and a rapidly changing Agile environment, having the entire team in the same room, or even in the same building, is a luxury few companies can afford. As a result, many Agile projects are now distributed among two or more teams, both onshore and across borders (see Fig. 1).
While only 5% of organizations plan to adopt Agile specifically to improve the management of their distributed teams, 41% of those who actually implement it say Agile helps them manage their teams better.
Agile is becoming more popular among outsourcing companies, too. The number of outsourcers using and/or planning to continue using Agile on their projects more than doubled in 2011 compared to 2010 (see Fig. 2).
With three times the success rate of the traditional Waterfall method and a much lower percentage of cost and time overruns, Agile processes are more than likely to become “the universal remedy for software development project failure” in the years to come.
However, implementing Agile does not necessarily mean being Agile. With the growing number of organizations implementing or planning to implement Agile, the question arises: how good are they at doing Agile? And since there are de facto no industry standards for measuring Agile effectiveness, it doesn’t matter how good you think you are at Agile unless you compare yourself against your peers and competitors.
Ciklum has developed a unique way of comparing Agile teams against 80+ distributed (onshore + nearshore) teams, called the Comparative Agility Measurement System℠ (CAMS℠).
Currently, Ciklum services 165+ client-owned nearshore software development teams. Most of these teams practice some form of Scrum and Agile development and are in fact distributed setups, with the nearshore teams being “organic” extensions of the onshore ones. Back in 2010, Ciklum realized how difficult it was for its clients to be Agile in multiple locations and decided to help them to:
To develop a working solution and create real value for clients, Ciklum had to:
To do the above, Ciklum had to answer the following questions:
1. What is the ideal Agile adoption pattern to compare the clients’ teams to?
2. What should be measured: teams’ adherence to plans and processes, or the productivity of techniques, tools and people?
As Ciklum looked deeper at those questions, it realized that:
1. The ideal Agile adoption pattern is fictional rather than real-life, so comparing a team’s performance to it is pointless. It is much more rational to compare the team against its industry peers and/or competitors to determine whether it is performing better or worse.
2. It is important to measure each client team’s adherence to Agile processes / best practices AND the velocity of project execution, in order to identify the productivity gaps of each distributed setup, collect the right data and feed it into the development of further guidelines for efficiency improvement (see Fig. 3).
While investigating existing Agility metrics, Ciklum came across the inspiring Comparative Agility™ assessment system developed by Mike Cohn of Mountain Goat Software and Kenny Rubin of Innolution. This assessment is built around the question “how good are you compared to your competitors?” and consists of more than 100 questions divided into 7 dimensions: teamwork, requirements, planning, technical practices, quality, culture and knowledge creation. These 7 dimensions represent broad classifications of the changes to be expected of a team or organization as it becomes more Agile.
However, after Ciklum ran a pilot assessment of 10 to 15 clients’ distributed teams, identified the Ciklum average score and compared the results against the Comparative Agility™ database of around 4,000 teams, it realized that Comparative Agility™ was not a solution it could use effectively across numerous client teams.
Namely, Ciklum had two major concerns:
1. With the Comparative Agility™ system you can compare your team, project or organization against the total set of collected responses, or against responses filtered from organizations in the same industry, with similar types of projects or similar lengths of Agile experience. However, Ciklum realized that each person within a team may have his or her own perception of problems, which is quite natural, as people are different and have different mindsets. Thus, Ciklum had to figure out how to collect a wide array of different perceptions from all team members, piece them together and arrive at a common team vision built from various perspectives.
2. Comparative Agility™ is based on an online survey that anyone can access and fill out. As it is not known who exactly completed the survey – a Project Manager, a Scrum Master, an IT Manager or a junior team member – the quality of responses is questionable and not fully reliable. Therefore, Ciklum realized it would need a “quality assurance” consultant to check the quality of the survey responses and verify them with leading questions, exercises and other tools.
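One way to picture the first concern: if every team member answers the same questionnaire, the individual responses can be averaged into a team score per dimension, while a wide spread of answers signals that perceptions diverge and warrant a facilitated discussion. The sketch below illustrates the idea only; the 1–5 answer scale, the function name and the spread threshold are illustrative assumptions, not part of any system described here.

```python
from statistics import mean, stdev

# Sketch: merge individual team members' questionnaire answers for one
# dimension (assumed 1-5 Likert scale) into a single team score, and flag
# dimensions where answers spread widely, i.e. perceptions diverge.

def summarize_dimension(answers, spread_threshold=1.0):
    """answers: list of per-member scores for one dimension (needs >= 2)."""
    return {
        "team_score": round(mean(answers), 2),
        # sample standard deviation as a simple disagreement signal
        "diverging": stdev(answers) > spread_threshold,
    }

# One dimension, five team members with very different perceptions:
planning = [5, 1, 4, 2, 5]
print(summarize_dimension(planning))
# → {'team_score': 3.4, 'diverging': True}
```

A flagged dimension would then be a candidate for the kind of follow-up questions and exercises a consultant runs with the whole team, rather than something to read off the averaged number alone.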
After a series of direct discussions with Mike Cohn and Kenny Rubin, Ciklum’s Agile consultants modified and adjusted the Comparative Agility™ framework to address the above concerns and to focus specifically on distributed software development teams. That is how Ciklum’s Comparative Agility Measurement System℠ (CAMS℠) was created.
Ciklum Services & Consulting started using CAMS℠ back in 2010 to foster performance competition among Ciklum client-owned development teams. At first, however, they did not involve entire teams in the analysis and only asked Project Managers, Scrum Masters or Team Leads to fill out the CAMS℠ questionnaire. It eventually became clear this had side effects. For instance, when certain gaps were identified and the CAMS℠ analysis results were shown to all team members, most would not share the same view of the issues or would disagree that the results reflected the actual state of affairs within their team. This discrepancy between Ciklum’s analysis and team members’ perceptions drove the adoption of a different, more consultative approach to collecting various perceptions of issues and converting them into workable action plans under the CAMS℠ delivery models (see “CAMS℠ Delivery Models” section). Since then, Ciklum has involved entire teams in the Agility measurement, with Agile Process Consultants on its side leading the effort and performing the quality checks.
As the basis for comparison, Ciklum uses Agile best practices collected from 80+ client-owned development teams. These practices are grouped into 7 key dimensions covering all aspects of distributed software development:
1. Teamwork
2. Requirements
3. Planning
4. Technical practices
5. Quality
6. Culture
7. Knowledge creation
A detailed analysis of 80+ Ciklum clients’ teams yields the Ciklum average score. Ciklum Process Consultants then analyze a client’s distributed setup along each of the 7 dimensions, depending on the chosen delivery model (see “CAMS℠ Delivery Models” section).
The detailed analysis of each dimension and the Agility patterns of the client’s distributed teams are then compared to the Ciklum average score (see Fig. 4).
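The kind of comparison Fig. 4 depicts can be sketched as follows: a team's score in each of the 7 dimensions is set against a benchmark average, and dimensions where the team trails the benchmark by more than some margin surface as productivity gaps. The dimension names come from the text above; the scores, the 0–10 scale and the gap threshold are invented for illustration and do not reproduce Ciklum's actual data or method.

```python
# Sketch: flag productivity gaps by comparing a team's per-dimension
# Agility scores against a benchmark average. All numbers, the 0-10
# scale and the 1.0 threshold are illustrative assumptions.

DIMENSIONS = ["teamwork", "requirements", "planning", "technical practices",
              "quality", "culture", "knowledge creation"]

def find_gaps(team_scores, benchmark_scores, threshold=1.0):
    """Return dimensions where the team trails the benchmark by more than `threshold`."""
    return {dim: round(benchmark_scores[dim] - team_scores[dim], 2)
            for dim in DIMENSIONS
            if benchmark_scores[dim] - team_scores[dim] > threshold}

benchmark = {"teamwork": 7.4, "requirements": 6.9, "planning": 7.1,
             "technical practices": 6.5, "quality": 7.0, "culture": 7.2,
             "knowledge creation": 6.3}
team = {"teamwork": 7.8, "requirements": 5.2, "planning": 6.8,
        "technical practices": 4.9, "quality": 6.9, "culture": 7.5,
        "knowledge creation": 6.0}

print(find_gaps(team, benchmark))
# → {'requirements': 1.7, 'technical practices': 1.6}
```

In this toy example the team beats the benchmark on teamwork and culture, so only the two lagging dimensions come back as gaps to act on.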
Depending on clients’ business needs, Ciklum currently offers three models of CAMS℠ delivery:
After the productivity gaps have been identified, Ciklum provides its clients with a number of consulting packages to facilitate improvements in the respective areas. These packages include, but are not limited to:
After the suggested changes have been implemented, Ciklum repeats the Agility adherence measurement to see what has and has not worked well. Ciklum continues to improve its client-owned teams’ Agility until the client is fully satisfied with the team’s productivity, velocity, quality of delivery, cross-functional collaboration and other Agile outcomes.
To wrap up, CAMS℠ is a proprietary tool developed by the Ciklum Services & Consulting Office (CSC) that aims to:
CAMS℠ identifies productivity gaps on three levels:
1. Individual – each nearshore team member fills out a questionnaire with multiple-choice questions and statements falling under the 7 core dimensions of Agile software development
2. Team – Ciklum Process Consultants interview the client’s entire Agile team in an open discussion style
3. Company – Ciklum Process Consultants interview all of the client’s stakeholders, including top management and Agile teams distributed across multiple locations
Once the productivity gaps have been identified, Ciklum Consultants help clients:
Agile practices are an important pillar of any company’s overall business success. Assessing how well your distributed teams adhere to Agile practices and identifying areas for improvement will bring better business returns, as well as the ability to outpace competitors and accumulate valuable knowledge and Agile experience.