So, what do I mean by "Agile?" (Part 2)

As I hinted in my previous post, planning is broken down into three levels:

  • Product Planning
  • Release Planning
  • Iteration Planning

As I am not a product planner, I will not try to discuss how that process happens. Rather, I’ll focus on the active development process through Release and Iteration Planning. First off, I like to see some symmetry between releases and iterations. The only difference is the scale at which the two types of planning occur. Specifically, I like both releases and iterations to include:

  • Planning,
  • Review, and
  • Retrospective

Here’s what I mean by each…

Planning

Each release and iteration should begin with a planning session designed to select some number of user stories from the parent backlog for inclusion in the current cycle of the process. For example, during Release Planning, user stories are selected from the Product Backlog for implementation in the next Release.

Review

Each release and iteration should end with a review meeting wherein the Product Owner (and other stakeholders) is shown the running acceptance tests and asked to approve or reject them. If an item is approved, it becomes part of the next release. If it is rejected, it is placed into the backlog for the next iteration (or, potentially, the next release). These meetings are also often the source of additional user stories for the backlog. For example, the Product Owner may accept a user story as it is, but desire that something about it change in the next iteration before accepting the feature in the Release Review.

Retrospective

At the end of each iteration and release, after the review meeting, the whole team should get together to discuss process improvement. Some teams do this without the Product Owner. I recommend including them, as their perspectives are valuable and it serves to reinforce the concept of the “Whole Team.”

Next up: Release and Iteration Duration and Structure

So, what do I mean by "Agile?" (Part 1)

Since you’re reading my blog, I’ll assume that you’re either a relative, or you’re already somewhat knowledgeable about agile software development. So, let’s jump right in... (Sorry, Aunt Gina.)

Given my druthers, I prefer to mix and match terminology and practices from Scrum and Extreme Programming. Here’s what I like to do:

Roles & Responsibilities

Product Owner is responsible for: 

  • Defining and/or collecting the set of features to be implemented in the product
  • Maintaining a Product Backlog of User Stories ordered by Business Value
  • Presenting user stories to the Team during Release Planning
  • Selecting user stories to place on the Release Backlog
  • Presenting user stories to the team in more detail during Iteration Planning
  • Selecting user stories to place on the Iteration Backlog
  • Accepting or rejecting user story implementations during Iteration Review
  • Presenting implemented user stories to other stakeholders during Release Review
  • Determining whether or not to deploy software during Release Review
  • Answering questions from the Team throughout the process

The cross-functional Development Team (or Team) is responsible for:

  • Estimating stories with T-shirt Sizes during Release Planning
  • Estimating stories with Story Points during Iteration Planning
  • Splitting large (epic) stories into smaller, more manageable stories 
  • Calculating their team velocity using Yesterday’s Weather 
  • Incremental Design and Test-First Programming of user stories
  • Maintaining a backlog of Technical Debt
  • Maintaining code and test quality with Refactoring
  • Presenting implemented user stories to Product Owner during Iteration Review
  • Assisting Product Owner with presentation of implemented user stories at Release Review
  • Asking questions of the Product Owner throughout the process
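The Yesterday’s Weather heuristic from the list above is simple enough to sketch in a few lines of Python. This is a minimal illustration with made-up numbers; the function names are mine, not from any framework.

```python
# "Yesterday's Weather": plan the next iteration using the number of
# story points the team actually completed in the previous iteration.

def yesterdays_weather(completed_points):
    """Velocity for the next iteration = points finished last iteration."""
    if not completed_points:
        raise ValueError("need at least one completed iteration")
    return completed_points[-1]

def average_velocity(completed_points, window=3):
    """A common smoothing variant: average the last few iterations."""
    recent = completed_points[-window:]
    return sum(recent) / len(recent)

history = [18, 21, 17]               # points completed per past iteration
print(yesterdays_weather(history))   # 17
print(average_velocity(history))     # ~18.7
```

The appeal of Yesterday’s Weather is that it deliberately ignores optimism: the team commits only to what it demonstrably finished last time.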

Coach (or Scrum Master) is responsible for: 

  • Identifying and resolving roadblocks hindering the Team, including process, technical, and managerial issues, and more.

The Whole Team, which includes the Product Owner, Development Team and Coach, is responsible for:

  • Conducting Release and Iteration Planning meetings
  • Conducting Iteration and Release Review meetings
  • Conducting Iteration and Release Retrospective meetings

Next up: Release & Iteration Duration and Structure

More pictures from p&p

While I was with patterns & practices, Microsoft invested a considerable sum of money to build the team a custom space that lent itself to agile software development. My friend Darrell Snow was largely responsible for securing the project and shepherding it past the 40-some-odd cooks who wanted to put their fingers in the stew. Overall, it turned out amazingly well - though the team spaces could be a bit bigger, IMHO. Even so, I really miss working in that space.

Most of the photographs below were taken shortly after the team moved into the space. For context, the first picture shows what our team room looked like before the new space, and the second picture is a floor plan of the new space.

My approach to agile testing...

I’ve talked about agile testing before, here, here and here. But, a recent thread on the Alt.Net Seattle Google Group got me thinking about it again. Here’s the response I sent to the thread:

Testing is a huge domain. If you’re familiar with Marick’s testing quadrant, you know that there are four basic areas that testing covers:

  • Business Facing tests in Support of Programming (Business Requirements testing – Does the code do what it should?)
  • Business Facing tests to Critique the Product (Business Defect testing – Does the code do something it shouldn’t? Are there missing requirements?)
  • Technology Facing tests in Support of Programming (Technical Requirement testing – Does this method do what the developer intended?)
  • Technology Facing tests to Critique the Product (Technical defect testing – Are there leaks? Can it handle a load? Is it fast enough?)

Typically, testers focus on the business facing tests. And, people with specialized technical skills focus on the technology facing tests. (Developers on the support programming side; Performance testers on the critique product side.)

None of these tests can be run before the software is written. But, the tests in support of programming can be written before the code. And, metrics for perf/load/stress can be defined before the code is written. I recommend doing all of that (unless perf/load/stress isn’t important to you). Obviously, exploratory testing is something that has to wait until the code is written.

If I were designing an agile team from scratch, I would propose the following approach:

During planning:

  • Track requirements as user stories.
  • Document acceptance criteria with each story, including perf/load/stress criteria (on the back of the 3x5 card, in Rally or TFS, etc.)

During an iteration:

  • One pair works on one story at a time.
  • Acceptance tests are automated first, based on acceptance criteria.
  • Code is written using TDD
  • Story is not functionally complete until all acceptance tests are passing (for the right reasons – no hard coded answers left)
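The iteration workflow above - acceptance tests automated first, then code driven out with TDD - can be illustrated with a toy story. Everything here (the story, the function name, the criteria) is invented for the sketch.

```python
# Hypothetical story: "As a shopper, I can see my cart total."
# The acceptance tests are written first, from the acceptance criteria,
# before any production code exists.

def test_empty_cart_totals_zero():
    assert cart_total([]) == 0

def test_total_sums_item_prices():
    assert cart_total([3.50, 2.25]) == 5.75

# Production code, driven out test-first until the tests above pass
# for the right reasons (no hard-coded answers left).
def cart_total(prices):
    return sum(prices)

test_empty_cart_totals_zero()
test_total_sums_item_prices()
print("all acceptance tests passing")
```

The story is functionally complete only when every one of these tests passes against the real implementation.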

After story is functionally complete:

  • Original pair leverages existing acceptance tests in perf/load/stress tests to determine if those criteria are met.
  • Tweak code as necessary to meet perf/load/stress acceptance criteria.
  • Story is not perf/load/stress complete until all perf/load/stress acceptance tests are passing
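Reusing the acceptance-level operation inside a perf check might look like the sketch below. The 0.5-second budget and the workload size are made-up criteria for illustration, not from the post.

```python
import time

def cart_total(prices):
    # Stand-in for the operation already exercised by the acceptance tests.
    return sum(prices)

def test_total_meets_latency_budget():
    """Hypothetical perf criterion: total 100k items in under 0.5s."""
    start = time.perf_counter()
    cart_total(list(range(100_000)))
    elapsed = time.perf_counter() - start
    assert elapsed < 0.5, f"too slow: {elapsed:.3f}s"

test_total_meets_latency_budget()
print("latency budget met")
```

The point is that the perf test wraps the same code path the acceptance tests proved correct, so tuning for speed can’t silently break behavior.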

Exploratory testing should happen outside the constraints of a single story:

  • Limiting it to a single story would put blinders on that could negatively impact the effort. But, it is important that it happen.
  • Perhaps the team sets aside time during the day or iteration for banging on the software.

Once all acceptance tests are passing:

  • Ship it!

Variations:

  1. Have the entire team bang out the acceptance tests at the beginning of the iteration.  I’ve seen this done. It works. But, quite often, tests get written for stories that end up getting cut from the iteration due to time constraints. That is excess inventory sitting on the production floor until those stories make it into another iteration. In other words, doing this encourages the accumulation of waste.
  2. If you’re concerned about a single pair working a story from beginning to end, mix it up. Give pairs one day to work on something, or 4 hours, or two, whatever works for you. Then switch things up – preferably by keeping one person on the story and bringing in a new pair partner. Then, the next time you switch, the person who has been on the story longer rotates off.
  3. Even though exploratory testing should not be constrained by a single story, it really is important to do it before shipping the software. Microsoft calls this a bug bash. They give away prizes for the most bugs, and the hardest to find bugs. But, they don’t do it until very late in their process. It would be most agile to do it continuously.

How do you do agile testing?

Deadlines cause poor decisions to be made slowly

In an article published in The Washington Post on July 22, Daniel Carpenter discussed why our government always seems to be working down to the wire. He blames deadlines (both real and conjured), stating that the "eleventh-hour effect" tends to delay decision making until the last possible moment:

When deadlines are imposed, decisions and bargains that could happen more quickly — because of momentum or normal work flow — often end up getting put off until the last minute. Social scientists have referred to this as the "eleventh-hour effect," and we see it both in experiments and in real life.

Furthermore, while researching the effect of deadlines on the FDA drug approval process, Mr. Carpenter and his Harvard colleagues discovered that drugs approved close to the Congressionally imposed deadline tend to be recalled more often than others:

A few years ago I joined some Harvard Medical School colleagues in examining the deadlines that, since 1992, Congress has placed upon the Food and Drug Administration's drug reviews. Our research found that medications approved right before these deadlines were considerably more likely to be pulled from the market or have significant warning labels attached later on.

In other words, deadlines cause poor decisions to be made slowly.

I must admit, as a proponent of agile software development practices such as continuous, iterative planning, I find more than a grain of truth in this statement — and not just within the bounds of our government. In fact, it has been my experience that deadlines rarely encourage people to complete their work early, as the task at hand tends to fill the time available. And, besides, experience has also taught me that deadlines often seem arbitrary, feel capricious, and demotivate teams. (Details in footnote.)

Granted, businesses need to be able to plan around software development projects. There are end-users to train, products to market, and systems to integrate. And, no matter how arbitrary it may seem to software developers, a deadline does provide a business with a stake in the ground around which to plan these other activities. In this sense, deadlines perform a valuable service to the business. However, due to the "eleventh-hour effect" and the rush to finish work on time, deadlines can also do quite a bit of harm to an organization's systems.

Therefore, my preference is to manage projects using a stack-ranked, estimated backlog of user-stories and team velocity. The backlog shows me what's complete, how much work remains, and in what order it will be completed. The team velocity tells me when I can expect a specific story to be completed.
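That projection is simple arithmetic; here is a sketch, with made-up estimates and names, of how velocity turns a stack-ranked backlog into a forecast.

```python
import math

# Given a stack-ranked backlog of story-point estimates and a team
# velocity, project how many iterations until a given story completes.
# (All names and numbers are illustrative.)

def iterations_until_done(backlog_points, story_index, velocity):
    """Iterations to finish everything up to and including story_index."""
    points_needed = sum(backlog_points[: story_index + 1])
    return math.ceil(points_needed / velocity)

backlog = [5, 3, 8, 2, 13]                   # points, in stack-rank order
print(iterations_until_done(backlog, 2, 8))  # 16 points at 8/iteration -> 2
print(iterations_until_done(backlog, 4, 8))  # 31 points -> 4
```

If the forecast for a story lands too late for the business, the remedy is to move it up the stack rank, not to impose a deadline on the team.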

So, if I have a specific business need to see story X completed by date Y, then I can make the appropriate adjustments to the stack-ranking of the backlog without putting undue pressure on the development team. Furthermore, because user-stories are discrete, end-to-end slices of functionality, I can deploy them as soon as there’s sufficient business value to offset the costs of deployment.

The only question I have is this: could this management strategy be effective in Congress?


Footnotes:

  1. Early in my career, I worked for a man who threatened to fire his entire development team if we did not ship all the projects we were working on by the end of the month. This is what I mean by an arbitrary, capricious and demotivating deadline. Oh, we managed to meet the deadline. It took some serious corner cutting. But, we shipped the software. In the end, though, it didn’t matter. We all left the company within three months, anyway.
  2. Hat tip to Ezra Klein, also of The Washington Post, for [the reference to the Carpenter article](http://www.washingtonpost.com/blogs/ezra-klein/post/the-downside-of-deadlines...).