Implementing Agile: SCRUM METRICS

There are lots of metrics that could be collected to assess a software development team’s competency, success, inventiveness, quality, and quantity of work. To get an idea of what metrics one could collect, take a look at the 185 practices in CMMI for Development. Agile, unlike CMMI, doesn’t require “evidence” that engineering practices are being followed and therefore has few metrics that a Scrum Team may collect to measure the success of each sprint. Below are 9 metrics that a Scrum Team might consider using.

  1. Actual Stories Completed vs. Committed Stories
  2. Technical Debt Management
  3. Team Velocity
  4. Quality Delivered to Customer
  5. Team Enthusiasm
  6. Retrospective Process Improvement
  7. Communication
  8. Scrum Team’s Adherence to Scrum Rules & Engineering Practices
  9. Development Team’s Understanding of Sprint Scope and Goal

To answer the question of who should be collecting metrics and measuring the Scrum Team’s success, consider who in Scrum is responsible for the team’s success. In describing the role of ScrumMaster, the Scrum Guide states, “The ScrumMaster teaches the Scrum Team by coaching and by leading it to be more productive and produce higher quality products.” Clearly it’s the responsibility of the ScrumMaster to measure the success of the team if only to increase the team’s productivity and product quality. The ScrumMaster could use a spider chart as shown below to track the Scrum Team.

Using a spider chart is an easy way for the ScrumMaster to track and compare results from sprint to sprint.
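
For teams that would rather generate the chart than draw it by hand, below is a minimal sketch, assuming Python with numpy and matplotlib; the sprint labels and 1-10 scores are illustrative placeholders, not data from any real team.

```python
# Spider (radar) chart sketch: one axis per metric, one polygon per sprint,
# each metric scored 1-10 by the ScrumMaster. All values are hypothetical.
import numpy as np
import matplotlib.pyplot as plt

metrics = [
    "Stories Completed vs. Committed", "Technical Debt Management",
    "Team Velocity", "Quality Delivered to Customer", "Team Enthusiasm",
    "Retrospective Process Improvement", "Communication",
    "Adherence to Scrum Rules & Practices", "Understanding of Scope and Goal",
]
sprints = {
    "Sprint 11": [7, 6, 8, 7, 8, 5, 6, 7, 8],
    "Sprint 12": [8, 7, 8, 8, 7, 6, 7, 8, 8],
}

# Angles for the nine axes; repeat the first point to close each polygon.
angles = np.linspace(0, 2 * np.pi, len(metrics), endpoint=False).tolist()
angles += angles[:1]

fig, ax = plt.subplots(subplot_kw={"polar": True})
for label, scores in sprints.items():
    values = scores + scores[:1]
    ax.plot(angles, values, label=label)
    ax.fill(angles, values, alpha=0.1)

ax.set_xticks(angles[:-1])
ax.set_xticklabels(metrics, fontsize=7)
ax.set_ylim(0, 10)
ax.legend(loc="lower right", fontsize=8)
plt.show()
```

Overlaying consecutive sprints as polygons makes a regression on any one metric stand out at a glance.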

Below are short descriptions of each metric, how it can be measured, and some of the issues that could lead to a low score. Any additional metrics or comments on these 9 would be most welcome.

1.  Actual Stories Completed vs. Committed Stories

This metric measures the development team’s ability to understand its own capacity and to commit accordingly. The measure is taken by comparing the number of stories committed to in sprint planning with the number of stories identified as completed in the sprint review (a small calculation sketch follows the lists below). A low score may indicate that any of the following need special attention:

  • Team does not have a reference story to make relative estimates (see Team Velocity),
  • Not every Team member understands the reference story (see Team Velocity),
  • Product Owner isn’t providing enough information to the development team (see Communication, Development Team’s Understanding of Sprint Scope and Goal),
  • Requirements scope creep (see Development Team’s Understanding of Sprint Scope and Goal),
  • Team is being disrupted (see Scrum Team’s Adherence to Scrum Rules & Engineering Practices, Team Enthusiasm).

Even if the Team completes the stories they committed to, there are a few things the ScrumMaster should be looking for:

  • Team under-commits and works at a slower than ‘normal’ pace (see Team Velocity),
  • Team has one or more ‘heroes’ (see Team Enthusiasm),
  • Story commitment is met but the product is ‘buggy’ (see Technical Debt Management, Quality Delivered to Customer).
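
To make the measure concrete, here is a minimal sketch, assuming Python; the sprint names and story counts are hypothetical.

```python
# Completed-vs-committed ratio: stories accepted at the sprint review divided
# by stories committed at sprint planning, per sprint. Data is hypothetical.
def commitment_ratio(committed: int, completed: int) -> float:
    """Return completed/committed as a percentage (0 if nothing was committed)."""
    return 100.0 * completed / committed if committed else 0.0

sprints = [("Sprint 10", 8, 8), ("Sprint 11", 10, 7), ("Sprint 12", 9, 9)]
for name, committed, completed in sprints:
    ratio = commitment_ratio(committed, completed)
    print(f"{name}: {completed} of {committed} stories completed ({ratio:.0f}%)")
```

A ratio that swings widely from sprint to sprint can be as telling as a consistently low one.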

2.  Technical Debt Management

This metric measures the team’s overall technical indebtedness: the known problems and issues delivered at the end of the sprint. It is usually counted in bugs but can also include deliverables such as training material, user documentation, and delivery media (a small tracking sketch follows the lists below). A low score may indicate that any of the following need special attention:
  • Customer is not considered during sprint (see Quality Delivered to Customer),
  • Weak or no ‘Definition of Done’ which includes zero introduced bugs (see Scrum Team’s Adherence to Scrum Rules & Engineering Practices),
  • Management is interfering with the Team, forcing delivery before the Team is ready (see Scrum Team’s Adherence to Scrum Rules & Engineering Practices),
  • Team is working multiple stories at once and compromises quality to complete them as the end of the sprint nears (see Scrum Team’s Adherence to Scrum Rules & Engineering Practices),
  • Team is creating bugs that reflect their opinion on how things should work rather than listening to the Customer (see Scrum Team’s Adherence to Scrum Rules & Engineering Practices, Quality Delivered to Customer).
Even if technical debt didn’t increase, here are a few things the ScrumMaster should be looking for:
  • Team is not documenting problems found or fixed (see Communication, Scrum Team’s Adherence to Scrum Rules & Engineering Practices),
  • Team is not engaging with Customer/Product Owner during user acceptance testing (see Communication, Scrum Team’s Adherence to Scrum Rules & Engineering Practices).
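
Here is a minimal tracking sketch, assuming Python; the counts are hypothetical, and “other open items” stands in for non-code deliverables such as missing user documentation.

```python
# Track known, unresolved items carried out of each sprint and flag increases.
from dataclasses import dataclass

@dataclass
class SprintDebt:
    name: str
    open_bugs: int
    other_open_items: int  # e.g. missing user docs or training material

    @property
    def total(self) -> int:
        return self.open_bugs + self.other_open_items

history = [SprintDebt("Sprint 11", 4, 1), SprintDebt("Sprint 12", 7, 2)]
for prev, curr in zip(history, history[1:]):
    delta = curr.total - prev.total
    trend = "debt increased" if delta > 0 else "debt held steady or decreased"
    print(f"{curr.name}: {curr.total} open items ({delta:+d} vs. {prev.name}); {trend}")
```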

3.  Team Velocity

The velocity metric measures the consistency of the team’s estimates from sprint to sprint. Feature story estimates are made relative to estimates of other feature stories, usually using story points. The measure is made by comparing the story points completed in this sprint with the points completed in the previous sprint; a difference within +/- 10% is considered consistent (the threshold the Nokia Test uses). A small consistency check is sketched after the lists below. A low score may indicate that any of the following need special attention:
  • Product Owner isn’t providing enough information to the development team (see Communication, Development Team’s Understanding of Sprint Scope and Goal),
  • Team size is changing between sprints (generally, the core team must be consistent; allowances should be made for absences e.g. vacation, sick),
  • Team doesn’t understand the scope of work at the start of the sprint (see Development Team’s Understanding of Sprint Scope and Goal, Communication, Actual Stories Completed vs. Committed Stories),
  • Team is being disrupted (see Scrum Team’s Adherence to Scrum Rules & Engineering Practices),
  • Team’s reference feature stories are not applicable to the current release  (see Scrum Team’s Adherence to Scrum Rules & Engineering Practices),
  • Team is doing very short release cycles (< 3 sprints) or doing maintenance work (the Team might consider Kanban or XP over Scrum under these circumstances).
Even if the Team seems to have a consistent velocity, there are a few things the ScrumMaster should be looking for:
  • Team under-commits and works at a slower than ‘normal’ pace (see Actual Stories Completed vs. Committed Stories),
  • Team has one or more ‘heroes’ (see Team Enthusiasm),
  • Velocity is consistent but the product is ‘buggy’ (see Technical Debt Management, Quality Delivered to Customer).
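
Here is a minimal sketch of that consistency check, assuming Python; the story point figures are hypothetical.

```python
# Velocity consistency: compare points completed this sprint with the previous
# sprint and treat a change within +/- 10% as consistent (Nokia Test threshold).
def velocity_consistent(previous: int, current: int, tolerance: float = 0.10) -> bool:
    """True if the current velocity is within +/- tolerance of the previous one."""
    if previous == 0:
        return False
    return abs(current - previous) / previous <= tolerance

completed_points = [32, 35, 30, 41]  # points completed in consecutive sprints
for prev, curr in zip(completed_points, completed_points[1:]):
    verdict = "consistent" if velocity_consistent(prev, curr) else "inconsistent"
    print(f"{prev} -> {curr} points: {verdict}")
```
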
4.  Quality Delivered to Customer

In most companies, delivering a quality product to the customer and keeping the customer happy is the only reason for being. Scrum attempts to have the outcome of every sprint provide value to the customer, i.e. a ‘potentially releasable piece of the product’. This is not necessarily a product that is released, but one that can be shown to customers as a work in progress to solicit their comments, opinions, and suggestions: are we building the product the customer needs? This is best measured by surveying the customers and stakeholders. Below is a spider chart of a customer survey taken after the Sprint Review.

The ScrumMaster can document all customer or stakeholder opinions using an easy-to-read spider chart.
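
A minimal sketch of turning the survey answers into the per-question averages plotted on such a chart follows, assuming Python and a 1-10 scale; the questions and responses are illustrative only.

```python
# Average each survey question across customer/stakeholder respondents.
from statistics import mean

questions = [
    "Does the increment address your needs?",
    "Is the quality acceptable so far?",
    "Were your comments from the last review addressed?",
]
# One list of 1-10 answers per respondent, in question order.
responses = [
    [8, 7, 9],
    [6, 7, 8],
    [7, 5, 9],
]

for i, question in enumerate(questions):
    avg = mean(r[i] for r in responses)
    print(f"{question}  average: {avg:.1f}/10")
```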

A low score may indicate that any of the following need special attention:

  • Product Owner doesn’t really understand what the customer wants and needs (see Communication),
  • The Product Owner is not adequately communicating the customer needs to the development team (see Communication, Development Team’s Understanding of Sprint Scope and Goal),
  • Customers are not involved in the development of stories (see Communication),
  • Customer is not involved with defining story acceptance criteria (see Communication)
  • Bugs are delivered with the product (see Technical Debt Management).

Even if the customer is satisfied with the sprint product, there are a couple of things the ScrumMaster should be looking for:

  • The product is ‘buggy’ (see Technical Debt Management),
  • Not all customers and stakeholders are invited, show up, or participate in the sprint review (see Scrum Team’s Adherence to Scrum Rules & Engineering Practices, Communication).

5.  Team Enthusiasm

Enthusiasm is a major component of a successful Scrum Team. If the Team doesn’t have it, no process or methodology is going to help. There are many early warning signs that, left unchecked, can lead to a loss of enthusiasm. It’s up to the ScrumMaster to recognize these symptoms and take appropriate action for each particular circumstance if the Team is to stay productive and happy. Measuring team enthusiasm can be very subjective and is best accomplished through observation during the various sprint meetings: planning, review, and the daily stand-up. However, the simplest approach is to ask the Scrum Team directly, “Do you feel happy?” and “How motivated do you feel?” A low score may indicate that any of the following need special attention:

  • ScrumMaster is taking too long to remove impediments,
  • The number of impediments during the sprint was high,
  • The team is being interrupted by managers or the Product Owner (see Actual Stories Completed vs. Committed Stories),
  • Some team members can’t contribute in certain product areas i.e. lack of cross-functional training,
  • Team members are working long hours i.e. not working at a sustainable pace (see Actual Stories Completed vs. Committed Stories),
  • Internal conflicts between Product Owner and Team,
  • Internal conflicts between Team members,
  • Team is repeating same mistakes (see Retrospective Process Improvement),
  • Team is continually grumbling, complaining, or bickering,
  • Team member(s) don’t have passion for their work,
  • The Team is not being creative or innovative.

6.  Retrospective Process Improvement

The Retrospective Process Improvement metric measures the Scrum Team’s ability to revise its development process, within the Scrum process framework and practices, to make it more effective and enjoyable for the next sprint. From the Scrum Guide: “The purpose of the [Sprint] Retrospective is to inspect how the last Sprint went in regards to people, relationships, process and tools. The inspection should identify and prioritize the major items that went well and those items that, if done differently, could make things even better. These include Scrum Team composition, meeting arrangements, tools, definition of ‘done,’ methods of communication, and processes for turning Product Backlog items into something ‘done.’” This can be measured using the count of retrospective items identified, the number of items the Team committed to addressing in the sprint, and the number of items worked or resolved by the end of the sprint (a small scoring sketch follows the list below). A low score may indicate any of the following:

  • Team doesn’t identify any items for improvement, feeling there’s nothing they can do or that it’s out of their control, i.e. the Team is not self-organizing and self-managing,
  • During sprint, Product Owner, ScrumMaster, or management discourages work on self-improvement at the expense of feature stories,
  • During the sprint, the Team discourages work on self-improvement at the expense of feature stories,
  • Team doesn’t look inward at their own performance and environment during the retrospective,
  • Team is not acknowledging or addressing repeated mistakes (see Team Enthusiasm).
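
Here is a minimal scoring sketch, assuming Python; the identified, committed, and resolved counts are hypothetical.

```python
# Retrospective follow-through: how many improvement items were identified,
# how many the Team committed to for the sprint, and how many were resolved.
sprints = [("Sprint 11", 6, 3, 1), ("Sprint 12", 5, 2, 2)]
for name, identified, committed, resolved in sprints:
    addressed = resolved / committed if committed else 0.0
    print(f"{name}: {identified} identified, {committed} committed, "
          f"{resolved} resolved -> {addressed:.0%} of commitments addressed")
```
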
7.  Communication

The Communication metric is a subjective measure of how well the Team, Product Owner, ScrumMaster, Customers, and stakeholders are conducting open and honest communications. The ScrumMaster, while observing and listening to the Team, Product Owner, Customers, and other stakeholders throughout the sprint, will get indications and clues as to how well everyone is communicating. A low score may indicate any of the following:

  • The Customer is not actively involved with feature story development (see Quality Delivered to Customer),
  • The Customer is not providing acceptance criteria for stories (see Quality Delivered to Customer, Technical Debt Management),
  • The Team is not providing the acceptance tests to the Customer for review and comment (see Quality Delivered to Customer, Technical Debt Management),
  • The Team and Customer are not running acceptance tests together (see Quality Delivered to Customer, Technical Debt Management),
  • The Customer(s) and other stakeholders are not invited or present at the Sprint Review (see Quality Delivered to Customer),
  • The ScrumMaster isn’t ensuring Customers are surveyed after each Sprint Review (see Quality Delivered to Customer),
  • The Product Owner isn’t available to the Scrum Team for some portion of each day for collaboration (see Scrum Team’s Adherence to Scrum Rules & Engineering Practices),
  • The Product Owner’s stories do not address features from the Customers’ perspective, e.g. stories are implementation-specific (see Scrum Team’s Adherence to Scrum Rules & Engineering Practices, Quality Delivered to Customer),
  • Product Owner isn’t providing information on the Customer ‘pain’ or needs to the Team (see Actual Stories Completed vs. Committed Stories),
  • Team is not documenting problems found or fixed (see Technical Debt Management),
  • Product Owner doesn’t really understand what the customer wants and needs (see Quality Delivered to Customer),
  • The Product Owner is not adequately communicating the customer needs to the development team (see Quality Delivered to Customer, Actual Stories Completed vs. Committed Stories),
  • Team is not conducting the daily meeting (see Scrum Team’s Adherence to Scrum Rules & Engineering Practices),
  • Scrum Team is not conducting release planning (see Scrum Team’s Adherence to Scrum Rules & Engineering Practices),
  • Scrum Team is not conducting sprint planning (see Scrum Team’s Adherence to Scrum Rules & Engineering Practices),
  • Scrum Team is not conducting a sprint review (see Scrum Team’s Adherence to Scrum Rules & Engineering Practices),
  • Scrum Team is not conducting a retrospective (see Scrum Team’s Adherence to Scrum Rules & Engineering Practices),
  • Scrum Team is not visibly displaying the release burndown chart (see Scrum Team’s Adherence to Scrum Rules & Engineering Practices),
  • Scrum Team is not visibly displaying the sprint burndown chart (see Scrum Team’s Adherence to Scrum Rules & Engineering Practices),
  • Scrum Team is not visibly displaying the stories and acceptance criteria (see Scrum Team’s Adherence to Scrum Rules & Engineering Practices),
  • Scrum Team is not visibly displaying non-functional requirements that apply to the entire sprint or release (see Scrum Team’s Adherence to Scrum Rules & Engineering Practices).

8.  Scrum Team’s Adherence to Scrum Rules & Engineering Practices

The rules for Scrum are defined in the Scrum Guide by Ken Schwaber and Jeff Sutherland. Although Scrum doesn’t prescribe engineering practices the way XP does, most companies will have several of these defined for software engineering projects. The ScrumMaster is responsible for holding the Scrum Team accountable to these rules and any engineering practices defined. This metric can be measured by counting the infractions that occur during each sprint (a small tally sketch follows the list below). A low score may indicate any of the following:
  • Management or Product Owner is interfering with the Team, forcing delivery before the Team is ready (see Technical Debt Management),
  • Weak or no ‘Definition of Done’ which includes zero introduced bugs (see Technical Debt Management),
  • Customer is not considered during sprint (see Technical Debt Management),
  • Team is creating bugs that reflect their opinion on how things should work rather than listening to the Customer (see Technical Debt Management),
  • Team is not documenting problems found or fixed (see Technical Debt Management)
  • Team is not engaging with Customer/Product Owner during user acceptance testing (see Technical Debt Management),
  • Team is being disrupted (see Actual Stories Completed vs. Committed Stories, Team Velocity),
  • ScrumMaster is not protecting Team from disruptions,
  • Team’s reference feature stories are not applicable to the current release (see Team Velocity)
  • Not all customers and stakeholders are invited, show up, or participate in the sprint review (see Quality Delivered to Customer),
  • ScrumMaster is not ensuring that the Scrum Team adheres to Scrum values, practices, and rules,
  • ScrumMaster is not ensuring that the Scrum Team adheres to company, departmental, and Scrum Team engineering rules and practices,
  • ScrumMaster is not teaching the Scrum Team by coaching and by leading it to be more productive and produce higher quality products,
  • ScrumMaster is not helping the Scrum Team understand and use self-organization and cross-functionality,
  • ScrumMaster is not ensuring the Scrum Team has a workable ‘Definition of Done’ for stories,
  • ScrumMaster is not ensuring the Team has a daily meeting,
  • ScrumMaster is not ensuring the release burndown chart is prominently displayed,
  • ScrumMaster is not ensuring the sprint burndown chart is prominently displayed,
  • ScrumMaster is not ensuring the sprint stories and acceptance criteria are prominently displayed,
  • ScrumMaster is not ensuring sprint planning meeting is held,
  • ScrumMaster is not ensuring the sprint review meeting is held,
  • ScrumMaster is not ensuring the sprint retrospective meeting is held.
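
Here is a minimal tally sketch, assuming Python; the infraction labels are hypothetical notes a ScrumMaster might record during a sprint.

```python
# Count rule/practice infractions observed during the sprint, with a per-rule
# breakdown to show where coaching is most needed.
from collections import Counter

observed_infractions = [
    "daily scrum skipped", "definition of done ignored",
    "daily scrum skipped", "sprint burndown not updated",
]
by_rule = Counter(observed_infractions)
print(f"Total infractions this sprint: {sum(by_rule.values())}")
for rule, count in by_rule.most_common():
    print(f"  {rule}: {count}")
```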

9.  Development Team’s Understanding of Sprint Scope and Goal

The Team Understanding of Sprint Scope and Goal metric is a subjective measure of how well the Customer, Product Owner, and Team interact, understand, and focus on the sprint stories and goal. The sprint goal is broad and states the general intention of the sprint. The goal is usually an abstraction and is not testable as such, but it is aligned with the intended value to be delivered to the Customer at the end of the sprint. The sprint scope or objective is defined in the acceptance criteria of the stories. The stories and acceptance criteria are ‘placeholders for a conversation’ and will not be incredibly detailed. The Product Owner and Customer define the acceptance criteria, but the Product Owner alone defines the sprint goal. The ScrumMaster can use the scoring from “Actual Stories Completed vs. Committed Stories” and “Team Velocity” as indications of problems understanding the stories and goal, but this is best determined through day-to-day contact and interaction with the Scrum Team. A low score may indicate any of the following:

  • Product Owner isn’t providing enough information to the development team i.e. needs of the Customer (see Actual Stories Completed vs. Committed Stories, Team Velocity, Communication),
  • Product Owner is not writing a sprint goal,
  • Product Owner doesn’t understand what an incremental development approach means,
  • Requirements scope creep: Product Owner is adding to the scope of the stories or adding additional acceptance criteria that were not agreed to during planning (see Actual Stories Completed vs. Committed Stories, Team Velocity, Communication),
  • Team doesn’t understand the scope of work at the start of the sprint (see Team Velocity, Communication).

SOURCE: http://implementingagile.blogspot.ch/2011/06/scrum-metrics.html
