Testing Heaven: When should we test agile apps?

#110.  My current team is stuck in Limbo. We've finished development of a Dynamics 365 application for a local government department and we're tapping our fingers waiting for acceptance testing to start.

What should agile testing look like on a Dynamics 365 or Power Platform project? I describe my agile testing Heaven, and the next best thing, Elysium. And what it's like to be stuck in Limbo.

What does testing look like on your business applications projects today?

Resources

  • Agile Testing | https://amzn.to/3rCzDEf
  • More Agile Testing | https://amzn.to/3gCBLFO
  • Continuous Delivery | https://amzn.to/3JbHKO0

Links to Amazon books are affiliate links for which I receive a small commission.

Support the show

CONNECT
🌏 Amazing Applications
🟦 Customery on LinkedIn
🟦 Neil Benson on LinkedIn

MY ONLINE COURSES
🚀 Agile Foundations for Microsoft Business Apps
🏉 Scrum for Microsoft Business Apps
📐 Estimating Business Apps

Keep sprinting 🏃‍♂️
-Neil

Transcript

[00:00:00] I find myself now in a situation that I bet a lot of you have found yourself in too: when you're building complex business apps and trying to adopt an agile approach, where does testing fit in?

 Hi, this is Neil Benson from Customery. Thanks for grabbing this episode of Amazing Applications. It's great to be with you. Listen, it's been a couple of weeks since my last podcast episode was published. I've been busy creating new content for the Customery Academy YouTube channel, where I've published a video about my amazing spider desk here at my home studio. It's been a labor of love for the past five years, and I finally got it into a perfect place. 

I've created another video recently on managing dependencies using a cross-team refinement board, which is a tool from the Nexus framework. If you're scaling Scrum and you've got multiple Scrum teams building a complex product from a single product backlog, and someone always seems to be held up waiting for another team to finish building something, then you'll want to take a look at the Nexus framework and the cross-team refinement board.

I've also been getting ready for the private preview of a new masterclass I'm launching called Winning Agile Projects. It's for the sales, pre-sales and delivery teams of Microsoft Dynamics 365 and Power Platform partners. It covers when to pitch an agile project, how to describe the benefits, how to create a compelling proposal, how to estimate agile business applications projects, and how to write agile statements of work.

One of the most exciting things about the new masterclass is the live workshops. Each cohort of 10 students will meet weekly for five weeks to share and reflect on what they've learned, not just from the course, but [00:02:00] from each other. And I can't wait for that. 

Once the feedback is in from one or two private preview cohorts, we'll be launching it at full price later in the year. If you're interested in applying for the preview, or for the general release, visit the link in the show notes and submit an application. Especially if you're in a European or Asia Pacific time zone. 

So that's what I've been up to from a Customery Academy perspective, but I've also been busy with delivering another Dynamics 365 application for a local client. They are a Queensland state government department. We reached our development complete milestone in January after five months of development. And now we're in the testing phase and scheduled for our major release to production at the end of March. 

I find myself now in a situation that I bet a lot of you have found yourself in too: when you're building complex business apps and trying to adopt an agile approach, where does testing fit in?

If my team is such an amazing Scrum team, and they are, why is development separate from testing?

Here's my heavenly approach to testing. And I'm not there yet, by the way. 

Before the sprint starts, the stories at the top of the backlog are well elaborated. The product owner has prioritized them, validated the business value with our stakeholders, and has helped the developers understand what's needed. The developers have had an opportunity to clarify the requirement, consider a candidate design, and estimate the risk-adjusted effort involved.

Developers in my scrum teams include business analysts, architects, app makers, professional developers, testers, and DevOps engineers. Everyone who's going to build the feature is a developer in Scrum. 

Together, the developers and the product owner have agreed the acceptance criteria, so that everyone's clear on when this story will be done: it will meet our definition of done, which applies to all of our work, and it will meet the acceptance criteria for this story.

We have a single estimate for the story that reflects all of the effort needed to get it done. In my [00:04:00] heavenly approach to testing, done means tested and in production. Releasing into production isn't always realistic, but at the very least a feature should be released into a pre-production or staging environment and be ready to be included in a production release.

While I'm prepared to relax a little bit on that released-to-production criterion for my definition of done, I hate relaxing on the tested criterion. But there's usually some negotiation here.

The Scrum Guide says that increments should be verified and usable. For my teams, that means tested. All sorts of testing: unit tests, functional tests, integration tests, acceptance tests, security tests, and performance tests. Let's work our way through those one by one. 

Unit tests 

Our professional developers write unit tests for our custom components as they start development, so they know when a feature is working and complete, and they can easily retest it after making any changes to their code or refactoring it later. Having unit tests for all of our custom code, like our plugins and functions, is just part of our definition of done. And I'd suggest it should be one of yours too. 
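To make that concrete, here's a minimal sketch of the kind of unit test I mean. Plugin code would normally be tested in C#, but the idea is the same for any custom logic; this sketch uses TypeScript and Jest against a made-up validation helper that a form script might call, so the file names, function, and business rule are illustrative assumptions, not anything from a real application.

```typescript
// creditLimit.ts — a hypothetical, pure validation helper (illustrative only).
export function isCreditLimitValid(requested: number, approvedCeiling: number): boolean {
  // Reject invalid or negative amounts, then check against the approved ceiling.
  if (Number.isNaN(requested) || requested < 0) {
    return false;
  }
  return requested <= approvedCeiling;
}

// creditLimit.test.ts — Jest unit tests the developer writes as they start the feature.
import { isCreditLimitValid } from './creditLimit';

describe('isCreditLimitValid', () => {
  it('accepts requests at or below the approved ceiling', () => {
    expect(isCreditLimitValid(5_000, 10_000)).toBe(true);
    expect(isCreditLimitValid(10_000, 10_000)).toBe(true);
  });

  it('rejects requests above the ceiling or with invalid amounts', () => {
    expect(isCreditLimitValid(10_001, 10_000)).toBe(false);
    expect(isCreditLimitValid(-1, 10_000)).toBe(false);
    expect(isCreditLimitValid(Number.NaN, 10_000)).toBe(false);
  });
});
```

Because tests like these run in seconds, they can sit in the pipeline and rerun on every commit, which is what makes a "unit tests for all custom code" line in a definition of done achievable.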

Functional tests

Our testers write functional test cases based on the story's acceptance criteria agreed with the product owner, and they manually test the feature once a developer deploys it into the test environment. In my teams, I've found that the quality assurance professionals write the best test cases, and anyone else in the team can help execute those test cases and record the results.

This is especially true when we find that our teams are completing most of their stories in the last day or two of the sprint, instead of steadily completing stories throughout the sprint. That last-minute rush crushes the testers, so everyone needs to pile in and help with test execution until the team learns to smooth out its delivery over the course of the sprint.

Integration tests

Integration testing for my teams is usually an expanded form of functional testing, but the scope is broader. [00:06:00] Integration tests validate an end-to-end business process that a user would carry out to meet a customer's goal. This business process usually spans more than one feature and often more than one system, not just our Power Apps or Dynamics 365 application. Integration tests are usually conducted by the professional testers in our team, sometimes working as a virtual team with the testing specialists from other teams when we're building an enterprise application that involves multiple teams.
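If you were to automate one of those end-to-end checks at the API level, a rough sketch might look like the following. I'm assuming Jest, Node's built-in fetch, a pre-acquired access token, and a hypothetical downstream billing API; the org URL and the billing endpoint are placeholders, not real systems from this project.

```typescript
// Hypothetical API-level integration check (Jest, Node 18+ global fetch).
// DATAVERSE_URL, DATAVERSE_TOKEN and the billing API are assumptions for illustration.
const orgUrl = process.env.DATAVERSE_URL ?? 'https://yourorg.crm.dynamics.com';
const token = process.env.DATAVERSE_TOKEN ?? ''; // acquired out of band, e.g. via MSAL

test('a new case created in Dataverse is visible to the downstream billing API', async () => {
  // Step 1: create a case through the Dataverse Web API.
  const created = await fetch(`${orgUrl}/api/data/v9.2/incidents`, {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${token}`,
      'Content-Type': 'application/json',
      'OData-MaxVersion': '4.0',
      'OData-Version': '4.0',
    },
    body: JSON.stringify({ title: 'Integration test case' }),
  });
  expect(created.status).toBe(204); // Dataverse returns 204 No Content on create
  const caseUri = created.headers.get('OData-EntityId') ?? '';

  // Step 2: confirm the (hypothetical) downstream billing system has picked the case up.
  const billing = await fetch(
    `https://billing.example.com/api/cases?source=${encodeURIComponent(caseUri)}`
  );
  expect(billing.status).toBe(200);
});
```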

Automated regression testing 

It's worth mentioning here that I'm a big fan of automated regression tests that span functional tests and integration tests.

Automating regression tests isn't easy or free, and the automations can be fragile and require frequent refactoring when the features being tested get enhanced or changed. But the investment is worth it in the long run. I recommend starting with a small portfolio of automated tests that cover your critical business features, or features that have exhibited a higher-than-average number of quality issues in previous rounds of testing, and then expanding from there.

Having a reliable, automated battery of regression tests running every day, or every sprint, or before and after every release, or after every Microsoft update gives everyone a lot more confidence in our product's quality. 
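As a sketch of what one of those automated regression tests might look like, here's a hypothetical browser test written with Playwright in TypeScript. The app URL, stored authentication state, and field labels are assumptions; selectors in a real model-driven app need more care than this, and tools like EasyRepro take a similar approach from C#.

```typescript
// Hypothetical regression test using @playwright/test. The URL, auth state file
// and field labels are placeholders; real model-driven app selectors are fussier.
import { test, expect } from '@playwright/test';

test.use({ storageState: 'auth/user.json' }); // pre-authenticated session, captured separately

test('critical path: a new case can be created and saved', async ({ page }) => {
  await page.goto(process.env.APP_URL ?? 'https://yourorg.crm.dynamics.com/');

  // Open a new case form and fill in the minimum required fields.
  await page.getByRole('button', { name: 'New Case' }).click();
  await page.getByLabel('Case Title').fill('Automated regression case');
  await page.getByRole('button', { name: 'Save' }).click();

  // The regression assertion: the record saved and the default priority applied.
  await expect(page.getByLabel('Priority')).toHaveValue('Normal');
});
```

Start with one or two journeys like this over your riskiest features, wire them into a pipeline that runs every sprint or before every release, and grow the suite as it earns its keep.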

Acceptance testing 

Unit tests, functional tests, integration tests, and regression tests. Nothing too controversial in there. If you've been developing software for any length of time, or even just building low-code business apps, you're probably familiar with these types of testing, even if you're not an ISTQB-certified quality assurance professional. Acceptance testing, too, feels familiar, comfortable even.

You might've started your business apps career as a subject matter expert involved in testing your application. I know lots of app makers who got started that way. 

The biggest issue agile teams have with acceptance testing is the timing.

In fact, [00:08:00] getting the timing right for all types of testing is important for agile teams, but acceptance testing is the hardest. In our perfect world, a product owner is available to validate and accept every increment the developers build in the sprint in which the increment was built. Not in the next sprint or at the end of all of the development, but in the sprint in which it was actually developed.

In a two-week sprint, if the product backlog item is turned into an increment on day six and tested by our testers on day seven or eight, then it's validated and accepted by the product owner by day nine or ten. Done!

If the product owner doesn't have capacity to verify and accept every increment, then she can delegate responsibility to a subject matter expert or other stakeholder within the organization. 

Having the product owner verify and accept the increment within the sprint is Heaven. Having a subject matter expert do it instead is Elysium.

What about Limbo though? Limbo is the state we enter when acceptance testing is deferred until later, when it's not done in the sprint in which the increment was built. Limbo is the first circle of hell.

I don't mind deferring performance testing or security testing until close to a major release into production. And once you're in production, you should be conducting periodic performance and security testing for your critical business applications.

But I hate acceptance testing Limbo. Developers think that all the features are done, but those features haven't been verified or accepted yet. 

In my current project, we're about to start a period of acceptance testing. It's just over four weeks since we completed development of all the features for this application, and as feedback starts to come back from our intrepid acceptance testers, it'll have been four or five weeks at a minimum, possibly up to four months, since we built each feature. We prefer immediate feedback, while a feature is fresh in our minds and still occupies our headspace.

When it's been months since we built it, there's a huge switching cost as we try [00:10:00] to remember how all those components worked together to make up the feature. And that's assuming that it's the original developers who worked on the feature and they're still available. The longer we're in Limbo, the greater the risk that they're not available.

My acceptance testing Limbo is not all that unusual. You might've found yourself in that state as well. In fact, it's something I've seen in lots of our Microsoft customers whose business applications teams have adopted an agile approach, but perhaps the rest of the business has not. They haven't got subject matter experts available continually throughout the course of our project.

Sometimes it's the other way around though. Sometimes it's the business application being developed by the business unit, along with a Microsoft partner, but it's the IT department's antiquated governance processes that demand a dedicated period of acceptance testing after development. 

Leaving Limbo is hard. I can't sit here and pretend that it's easy to get into Heaven, or even Elysium, by testing everything within the sprint. There is no simple ceremony or practice I can give you that will allow you to enter Heaven. It's a journey from where you are now. It's a pilgrimage, in fact. I think it's worth it. And I wish you well as you take the first steps by experimenting with acceptance testing closer and closer to the sprint in which your increment was built, until eventually you ascend to great heights. Keep sprinting.

[00:12:00]