How Can We Improve Collaboration on User Stories?

“Everything is vague to a degree you do not realise till you have tried to make it precise” – Bertrand Russell

When it comes to understanding and documenting requirements, there always seems to be discussion and room for improvement. In general we capture a user story in an automated tool with a summary / title and a description that contains both the user story (“as a”, “I want”, “so that”) and a section describing the “Acceptance Criteria” (aka “Conditions of Satisfaction”). Some people wonder “how much detail is too much?” Others wonder whether they really understand the requirement well enough to do something useful. Still others think we should write a complete specification.

One team I worked with recently ran an experiment with a slightly more formalized approach to capturing Acceptance Criteria (or Conditions of Satisfaction), one which helped us understand the requirement without specifying how the solution was to be derived. At the time we were trying to solve a problem with the (acceptance) testing of the workflows, but we found that the solution we worked out also helped us improve the discussion and collaboration around the requirements.

By way of background: when we tested during the first phase of the project, we had good requirements but had not taken the time to develop a good test plan, relying instead on an ad-hoc approach. The result was that we did not feel we had covered all the basics, and we felt we had duplicated effort in some areas. To improve, we decided to capture the acceptance tests during the development of the requirements.

For the second phase of the project we used a format for the Acceptance Criteria (or Conditions of Satisfaction) related to acceptance-test-driven and behavior-driven development. The basic format of the acceptance criteria is:

Given <some initial context>
When <an event occurs>
Then <ensure some outcomes>

The work we were doing involved the flow of data between Siebel and JIRA. Let’s say we have a user story “As a support engineer I would like to see what is happening as people discuss and work a Siebel issue so I can ensure that it’s heading in the right direction”. We would then write the Acceptance Criteria (or Conditions of Satisfaction) as “Given a person is working in JIRA Agile, when the person adds a comment to the JIRA issue that is linked to a Siebel CR, then the related Siebel record notes will be updated with the comment”.
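To make the Given / When / Then structure concrete, here is a minimal sketch of that acceptance criterion written as an executable test. The `SiebelRecord` and `JiraIssue` classes and the in-memory "sync" are hypothetical stand-ins for the real Siebel-JIRA integration, which is not shown in this article; only the shape of the test follows the criterion above.

```python
class SiebelRecord:
    """Hypothetical stand-in for a Siebel CR record."""
    def __init__(self, cr_id):
        self.cr_id = cr_id
        self.notes = []


class JiraIssue:
    """Hypothetical stand-in for a JIRA issue linked to a Siebel CR."""
    def __init__(self, key, linked_cr):
        self.key = key
        self.linked_cr = linked_cr
        self.comments = []

    def add_comment(self, text):
        self.comments.append(text)
        # Simulated sync: in the real system the integration layer,
        # not the issue object, would push the comment to Siebel.
        self.linked_cr.notes.append(text)


def test_comment_syncs_to_siebel_notes():
    # Given a JIRA issue that is linked to a Siebel CR
    cr = SiebelRecord(cr_id="CR-1001")
    issue = JiraIssue(key="PROJ-42", linked_cr=cr)

    # When a person adds a comment to the JIRA issue
    issue.add_comment("Looks like a configuration problem, not a code bug.")

    # Then the related Siebel record notes are updated with the comment
    assert "Looks like a configuration problem, not a code bug." in cr.notes
```

Notice how the three phases of the test body map one-to-one onto the Given / When / Then clauses; that mapping is what later made it straightforward to turn the criteria into a test plan.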

Just as the user story format helps us capture and understand not only the feature but also who needs it and why, this format for the acceptance test helped us capture and understand the acceptance criteria from an end user’s perspective. It also helped us understand what it would mean to test the capability when that time came around, forming the basis of our test plan. It really increased the amount of discussion we had about the requirements, and made things clearer for everyone.

Some of our requirements did not fit easily into this format. For example, we had a number of requirements where, given a condition, certain mappings had to be applied between the two systems. For these we simply captured a table of conditions, inputs and outputs. I will say it sometimes took a while to reach a definition we all agreed on, but I think that is the point: this helped with collaboration and clarity as well.
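Such a table of conditions, inputs and outputs can be captured directly as data, which keeps the mapping reviewable by the whole team and usable by a test. The field names and status values below are invented for illustration; they are not from the actual project.

```python
# Hypothetical condition/input/output table: Siebel status -> JIRA status.
# In the team's practice this table was agreed on in the requirements
# discussion; here it doubles as the implementation and the test oracle.
STATUS_MAPPING = {
    "Open":        "To Do",
    "In Progress": "In Progress",
    "Fixed":       "Done",
}


def map_status(siebel_status):
    """Return the JIRA status for a Siebel status, or None if unmapped."""
    return STATUS_MAPPING.get(siebel_status)
```

A reviewer can check each row of the table against the agreed requirement, and an unmapped value falling out as `None` makes gaps in the agreement visible rather than silently guessed at.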

It should be noted that when you capture acceptance criteria in this format, they can also form the basis of an automated test, using tools like FitNesse or Cucumber to actually run the acceptance test. But even if you do not do this, it is worth experimenting with the approach, as it really helps with collaborative communication and gets everyone onto the same page.

Note: A good starting book if you want to learn more is "Bridging the Communication Gap: Specification by Example and Agile Acceptance Testing" by Gojko Adzic.

/home/hpsamios/ · Last modified: 2021/04/28 11:37 by hans