Embracing BDD
April 2021
In my previous blogs, I provided insights into the benefits, challenges and myths of the microservice architecture pattern. Those insights came straight from my experience of being part of a monolith-to-microservices migration project. A few of the reasons we were moving to microservices:
Build a global platform to achieve multi-country rollouts with a single code base
Faster time-to-market
Scalable and highly configurable services
Truly agile - Continuous Delivery and Short-term release cycles
There were plenty of learnings and un-learnings, not only from the design and development perspective but also from the execution and testing methodology perspective. When the decision was made to migrate to microservices, we found a couple of bottlenecks in the testing methodology:
How to automate functional/component tests for each microservice?
How to perform end-to-end integration tests when the microservices are integrated?
After multiple discussions and iterations, we decided to explore the following areas:
Behavior Driven Development (BDD) to automate functional tests for each microservice
PACT (consumer-driven contract testing) to perform integration testing of the various microservices in action
Through this article, I am going to provide insights into BDD and the process changes we had to bring in to meet our objectives.
When we started a deep dive into our current processes and approach, we came up with the following list of challenges:
Inverted test pyramid
Traditional Agile Testing
Usage of proprietary test automation tools
Absence of efficient collaboration between stakeholders - Product team, Dev team and QA team
These required immediate attention and changes, without which it was not possible to align with the delivery cycles of microservices.
Fixing the Inverted Test Pyramid
Before we could embrace BDD, our primary objective was to fix the test pyramid. We had few unit tests (coverage of around 55 to 60%) but a larger number of integration and acceptance tests at the top. In an ideal test pyramid, the base has the largest number of unit tests, with fewer functional, integration and acceptance tests towards the top. So we formed a core team that was handed the responsibility of fixing the unit tests, adopting TDD (Test Driven Development) and increasing test coverage to 80% (and above 90% wherever possible). Our baseline remained 80% across all modules/components. This initiative took 3 to 4 months of effort to move from 60% coverage to 80%, and that is how we transitioned from an inverted test pyramid to an ideal one.
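To give a flavour of the test-first habit the core team was driving, here is a minimal sketch of a TDD-style unit test in JUnit 5. The DiscountCalculator class, its priceFor method and the CustomerType enum are hypothetical names used only for illustration; the point is that the test is written before the production code and pins down the expected behaviour.

    import static org.junit.jupiter.api.Assertions.assertEquals;

    import org.junit.jupiter.api.Test;

    // Hypothetical example: this test is written first and fails until
    // DiscountCalculator is implemented to satisfy it.
    class DiscountCalculatorTest {

        @Test
        void premiumCustomersGetTenPercentDiscount() {
            DiscountCalculator calculator = new DiscountCalculator();
            assertEquals(180.00, calculator.priceFor(200.00, CustomerType.PREMIUM), 0.001);
        }

        @Test
        void regularCustomersPayFullPrice() {
            DiscountCalculator calculator = new DiscountCalculator();
            assertEquals(200.00, calculator.priceFor(200.00, CustomerType.REGULAR), 0.001);
        }
    }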
Traditional Agile Testing
Once the test pyramid was fixed, it was time to look back at our agile practices and address a few challenges there. One of them was in-sprint testing. The traditional practice is to work independently on the user stories within the sprint. A nightly build then checks whether any functionality was broken by the day's code commits, and immediate fixes are provided where needed. Once the user stories are delivered (after in-sprint testing), they have to go through regression, and any issues have to be fixed in the coming sprints. This is quite inefficient, and there were instances of story spillovers eventually leading to quarterly delivery misses. We wanted to overcome this traditional testing practice, and BDD turned out to be the answer.
Traditional agile process that we had...
Illustration: Traditional agile testing model that we wanted to replace
Usage of proprietary test automation tools
The test automation tool in use was proprietary, leading to vendor lock-in issues, and it involved heavy and unnecessary customizations. The tool was good for long release cycles but did not help us achieve short release cycles. Moreover, frequent patches and minor releases led to additional overhead in managing multiple compatible versions of the test scripts. We were desperate to sunset the tool and embrace open-source, industry-standard, lightweight tools that would help us achieve our core objectives. BDD was the answer again.
Absence of efficient collaboration between stakeholders
The approach typically practiced in agile projects is:
The product owner or business analyst documents the user stories with the necessary acceptance criteria
The dev team starts implementing and provides the necessary unit test coverage
Meanwhile, the QA team starts writing test automation scripts as per the defined acceptance criteria
This approach is perfectly fine when everything goes to plan, but we had issues of story spillover due to misinterpretation of the acceptance criteria and incorrect implementations, which eventually led to delivery issues. We wanted to fix this by bringing all the stakeholders onto the same page, speaking the same domain language. Once again, BDD was the answer.
Illustration: Traditional Sprint execution model that we wanted to replace
Why BDD?
As most of you already know, BDD is not a technology or a framework but a process that helps simplify testing overall. It is designed to focus on the usage of a “Domain Specific Language” by the different stakeholders, thereby enabling better collaboration between them in adding business value. It works on 3 core principles,
Business/Product and Technology teams should see the system in the same way
Any system should have identifiable and verifiable value to the business
Upfront analysis, design and planning always provide the best outcome
Some of the advantages that BDD provides over traditional agile testing processes and tools are:
Enforces a “Test First” approach, which means that in order to embrace BDD you must be TDD-ready. As we were already TDD-ready, we were closer to achieving this objective
Emphasizes business functionality, not technical details or test strategy
Specification by Example – Human readable format
Written in domain specific language (DSL)
Improves Collaboration - All stakeholders are involved in the definition of acceptance criteria thereby speaking the same language
Self-explanatory
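To make "Specification by Example" concrete, here is a minimal, hypothetical sketch of a JBehave story for a pricing rule. The scenario and the discount rule are made up for illustration; the point is that product owners, developers and test developers can all read and agree on the same text, which later runs as an automated test.

    Scenario: Premium customer receives a 10% discount

    Given a premium customer with an order of 200.00
    When the order is priced
    Then the final price should be 180.00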
How did we embrace BDD?
The learnings and the approach were as much cultural as technological. The team had to embrace a shift in team composition (with a varied skillset) and in processes. Culturally, the QA team was henceforth called "Quality Engineering" and each member a "Test Developer". They also had to put in the effort to learn Java, Selenium and JBehave - the tools we used to implement BDD (a sketch of how a story step binds to Java code follows the list below). Overall, the initiative involved multiple steps,
Council Formation - The council team was responsible for exploring, doing a PoC and defining the strategy. They had the following responsibilities.
Analyse the frameworks' capabilities and make trade-offs
Prototyping to check the readiness
Build a framework to provide an abstraction to the underlying frameworks to reduce the learning curve
Continuous Improvement
Transition - This step involved the strategy and planning for the transition to the core members.
Train the core members
Extending the prototype to accommodate modules
Feedback to Council
Share the best practices
Implementation - The step where the reference implementation was documented
QAs were trained by core members
Less complex user stories were picked
Clubbed with release cycles
Migration - Migration of existing tests to BDD
Incremental approach
A SPIKE was done to assess the impact and effort
Technical Stories were created and aligned to ongoing release related topics
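As an illustration of what the test developers were learning, here is a minimal sketch of how the hypothetical story steps above could bind to Java code with JBehave. The DiscountCalculator and CustomerType names carry over from the earlier unit-test sketch and are assumptions, not our actual framework; in practice, step classes like this sat behind the abstraction layer built by the council, and UI-facing steps drove Selenium.

    import static org.junit.jupiter.api.Assertions.assertEquals;

    import org.jbehave.core.annotations.Given;
    import org.jbehave.core.annotations.Then;
    import org.jbehave.core.annotations.When;

    // Hypothetical JBehave step definitions for the pricing scenario above.
    public class PricingSteps {

        private DiscountCalculator calculator;
        private CustomerType customerType;
        private double orderAmount;
        private double finalPrice;

        @Given("a $customerType customer with an order of $amount")
        public void aCustomerWithAnOrder(String customerType, double amount) {
            this.calculator = new DiscountCalculator();
            this.customerType = CustomerType.valueOf(customerType.toUpperCase());
            this.orderAmount = amount;
        }

        @When("the order is priced")
        public void theOrderIsPriced() {
            this.finalPrice = calculator.priceFor(orderAmount, customerType);
        }

        @Then("the final price should be $expected")
        public void theFinalPriceShouldBe(double expected) {
            assertEquals(expected, finalPrice, 0.001);
        }
    }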
Conclusion
By embracing BDD, we could achieve most of our objectives, but the journey wasn't easy. There was a steep learning curve due to the new processes and tools involved. It took us almost a year to migrate from the traditional testing tools to the BDD approach, and to define a strategy for performing independent functional testing of the microservices.
Overall, BDD helped us define test scripts using a domain-specific language, thereby creating uniformity in our testing methodology. All the stakeholders were able to speak the same language, had better clarity on the acceptance criteria, and could add business value. The internal framework we developed also paved the way for simplifying the automation of functional tests for microservices. I shall write another article on PACT, the initiative we took up to achieve integration/contract testing of the microservices.