Recently I was involved in a discussion about how to introduce test automation in agile teams. These were the main discussion questions and below is my attempt to answer them.

How can an agile project manager or scrum master directly influence test automation in agile teams?

I believe that directly influencing a team to implement an improvement doesn’t go very far unless the change comes from within the team. If the team doesn’t think it has a problem, why change what is already working?

The scrum master is there to facilitate discussions; these collaborative discussions should highlight problems, and the team figures out how it will solve them. The scrum master should focus their energy on removing any impediments the team faces while implementing its chosen solutions.

How do you introduce and improve the level of test automation in agile teams?

Identify the core problem, and work out how solving it will help your business, your clients, and ultimately the team itself.

One can help team members think in a structured manner, so they understand their current state, their desired state, and the steps between them. Here are some of the stages to think about:

  • Problem identification
  • Building understanding
  • Seeding ideas
  • Iterative implementation
  • Adaptation
  • Continuation
  • Improvement

Try running an A3 Process workshop with your team. I have found it a very effective tool for solving complex issues in a structured manner.

Once the team understands the core problem and knows how to solve it, then tools, languages, processes, frameworks, artifacts, ceremonies and metrics become the primary factors in the bigger scheme of things.

I hope this helps!

In a previous post I explained just one scenario where a lack of user feedback during development can land your project in the wilderness with your clients.

Traditionally in software teams, roles like business analysts, product owners and usability specialists are responsible for getting client feedback by a variety of means. It’s time to empower our teams with new feedback and interface testing tools, so they can grab every opportunity to collect smart feedback data, build great products, and save cost while achieving higher client satisfaction.

Let me propose a few ideas for software development teams:

  1. Get closer to your clients by measuring and reducing feedback loop travel time.
  2. To me feedback is a “Value Channel”, so if you create new channels you’ll increase the value of your product.
  3. Use new tools available in the market to collect your feedback.
  4. Make data-based decisions rather than emotional ones by validating your ideas.
  5. Your clients’ time is important; providing feedback shouldn’t be more than two clicks away.
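Idea 1 is easy to start measuring. Here is a minimal sketch; the idea of logging a ship timestamp and a first-feedback timestamp per change, and the timestamps themselves, are hypothetical assumptions for illustration:

```python
from datetime import datetime

def feedback_loop_travel_time(shipped_at, feedback_at):
    """Time between shipping a change and receiving the first
    piece of client feedback on it."""
    return feedback_at - shipped_at

# Hypothetical timestamps: when the change went live and when
# the first client comment about it arrived.
shipped = datetime(2012, 3, 1, 9, 0)
first_feedback = datetime(2012, 3, 8, 14, 30)

gap = feedback_loop_travel_time(shipped, first_feedback)
print(f"Feedback loop travel time: {gap.days} days, {gap.seconds // 3600} hours")
```

Tracking this one number over a few releases tells you whether your new feedback channels are actually shortening the loop.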

Here is a list of a few feedback and interface testing tools:

  • Verify
  • Notable
  • Bugherd
  • Usabilla

What’s your feedback loop travel time, and what tools are you using to collect data?

So why not validate your strategy before or during your project, before you spend too much time and money?

Ever been a part of a team where

  • You were handed a UI design on a new project.
  • Your team starts the project and builds what it was asked for.
  • This design came from some other team or group.
  • You built the new application and tested it (all ready to go).
  • Then the aha moment comes when you present this newly built application to your clients, waiting for a WOW reaction.
  • Oops… your clients say they needed a Zebra, and this looks more like a Donkey.

Did someone miss the point somewhere in the whole process?

You’ve got it right! While you were busy building the perfect Zebra, no one asked users for timely feedback during development. I am not saying that’s the case with every team, but the reality of the software development business is that teams often try to build the perfect application on the first attempt to wow their clients.

Research shows that the more client involvement there is in product development, the better the outcome. The quicker you receive feedback, the better positioned you are to make changes at less cost, and you might end up close to building the perfect Zebra your client wanted :-)

Let me ask: how is your team collecting feedback from clients, and how are you making smart data-based decisions?

In order to build your perfect Zebra, you can leverage new tools to collect user feedback; these tools are smart, cheap and scalable, and they will guide your team toward data-based decisions.

In the next blog post I’ll be explaining more about these tools.

In late 2010 I started a project with my new team; everyone was relatively new to agile development. Although we were defining acceptance criteria for user stories, the Product Owner and the Quality Assurance team had different expectations about what was acceptable in a story. Below is my attempt to highlight the difference between these expectations.

Acceptance Criteria: These usually define the scope of a user story and the product owner’s expectations. They clarify the product owner’s intent and what they see as acceptable for clients.

Definition of Done: This is how we define the quality of a story, i.e. the steps we must take to ensure that the highest-quality work is delivered within a user story. It can serve as a quality agreement between developers and QAs.

Example Story: As a guest user, I want to be able to view a shared dashboard so I can view the latest trends in my industry.

Acceptance Criteria:

  1. A guest user must provide valid email address to access a shared dashboard.
  2. Dashboard access and privileges are correctly matched for a given email.
  3. A guest user must be able to view a dashboard on all supported browsers.
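Since this series is about test automation, the criteria above are also a natural seed for automated acceptance checks. The sketch below is hypothetical: `grant_dashboard_access`, the dashboard data and the email rule are all invented for illustration, and criterion 3 (cross-browser checks) would need a UI testing tool rather than plain assertions:

```python
import re

# Hypothetical data: which guest emails may see which shared dashboards.
SHARED_DASHBOARDS = {"industry-trends": {"guest@example.com": "read-only"}}

def grant_dashboard_access(dashboard, email):
    # Criterion 1: a guest must provide a valid email address.
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email or ""):
        return None
    # Criterion 2: access and privileges must match the given email.
    return SHARED_DASHBOARDS.get(dashboard, {}).get(email)

# Criterion 1: an invalid email is rejected.
assert grant_dashboard_access("industry-trends", "not-an-email") is None
# Criterion 2: a known guest gets the privileges mapped to their email.
assert grant_dashboard_access("industry-trends", "guest@example.com") == "read-only"
# An unknown guest gets no access at all.
assert grant_dashboard_access("industry-trends", "other@example.com") is None
```

When the acceptance criteria read like this, the product owner and QA are literally looking at the same checklist.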

Definition of Done:

  1. No fatal errors in code.
  2. No new notices.
  3. Must have unit tests.
  4. Completed code docs.
  5. Code review finished.
  6. Code checked into the build branch.
  7. Interfaces work on all supported browsers.
  8. Performance checks are successful.
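A Definition of Done like this can also be wired into the build as an automated gate. The sketch below is only illustrative: each check is stubbed as a boolean, and in practice the values would come from your linters, test runners and CI hooks.

```python
# Stubbed results for each Definition of Done item; a real pipeline
# would fill these in from build and test tooling.
definition_of_done = {
    "no fatal errors in code": True,
    "no new notices": True,
    "unit tests present": True,
    "code docs completed": True,
    "code review finished": True,
    "code checked into build branch": True,
    "interfaces work on supported browsers": True,
    "performance checks successful": False,  # stub: pretend this one failed
}

failed = [name for name, ok in definition_of_done.items() if not ok]
if failed:
    print("Story is NOT done. Outstanding items:")
    for name in failed:
        print(f"  - {name}")
else:
    print("Story meets the Definition of Done.")
```

The point is not the code itself but that every item on the list becomes a yes/no question the team can answer mechanically instead of arguing about.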

Let me ask how your team writes story acceptance criteria and creates quality parameters?

The above comparison helped my team gain a greater understanding of delivering high-quality features; it also helped us create a fun development environment where team members were not getting frustrated over compilation errors.