An amazing story of collaboration to accomplish high-quality end-to-end testing for a Life Insurance major.

The COVID-19 outbreak has forced many, if not all, organizations to rethink their business strategy. Foremost among the changes is the foray into remote working to enable business continuity, which can prove challenging when it comes to collaborating across teams to accomplish a common goal. It becomes especially challenging when organizations and their employees are expected to meet hard deadlines during times of distress, such as the COVID-19 pandemic. These situations test the mettle of teams and can make or break an organization. This article is about one such engagement: a major Life Insurer on the East Coast partnered with us to perform independent verification and validation to successfully roll out a major feature in the Quote to Application process.

The Challenge

The customer approached Aspire earlier in the year with a unique problem. They were about to add a new integration, a complete application submission process, to their already live Quote website. The new application had already been tested by their internal teams, and the basic functionality was found to be working as expected. However, a few challenges remained:

  • Intermittent errors were being thrown and could not be reproduced
  • There was no visibility into how the application would behave across browsers and devices
  • No application performance baselining had been done
  • Testing needed to happen from multiple geographies to replicate real-life scenarios
  • There was no requirements documentation – no FRD, BRD, architecture diagram, etc.

These were challenges Aspire sees day in and day out, and our strong testing experience could be relied upon to address them. But the delivery-level challenges, compounded by the COVID-19 situation, were entirely new and needed much more thought to make the project a success.

India, where the majority of the test team was located, went into lockdown with just four hours' notice, unlike anything we had seen before. Aspire immediately invoked its business continuity plan, enabling all 12 test team members to continue working from their homes from Day 1 without a glitch. Though critical, this was only the first step towards successful delivery. We understood that in order to be effective and productive during these challenging times, two key items had to be prioritized:

  • Effective communication
  • Collaborative test environment

The test approach was drafted after careful thought on the delivery challenges listed above. A timeline of just 15 business days didn't make the process any easier.

The Solution

Though the customer expected nothing more than for us to address the above-mentioned challenges, we knew the best way to go about it was not to simply point out issues in the system but to work as a partner: to do as much research as possible and provide as much information as possible to help the development teams get to the bottom of the issues. With this in mind, we drafted a solution approach that treated quality as a single objective (not functional quality or non-functional quality, just quality) and included functional manual testing, test automation, and performance tests. To achieve all this within a mere 15 business days, we needed a very strong global team, ably supported by a domain expert, working collaboratively with a seamless communication protocol.

Global Team

We onboarded a team of testers across our US and India offices, all working from home, to enable faster turnaround and to replicate production scenarios as closely as possible. While our testers in India were based out of Chennai, our US testers were spread across Texas, North Carolina, Arizona, New Jersey, and Connecticut. To hit the ground running from day 1, we took a 'follow the sun' approach: testers in India derived the test cases each day and handed them over to the US-based testers, who would execute them.

We also had our automation testers and performance testers in Chennai to script and execute automated tests and performance tests respectively.

When it came to communication, nothing extra was needed, since we already had a robust communication plan that had been key to the success of our onsite/offshore/nearshore model of working. The team had two touchpoint calls every day to discuss the goals and expectations for the day and the progress made. The teams also interacted frequently through collaboration tools throughout the day.

Test Approach

To offset the lack of time, our first and foremost plan was to have a subject matter expert (SME) work closely with the team to identify scenarios that covered all key functionalities of the application. Of course, this task becomes doubly hard when there are no requirements documents or other artifacts to explain the system in detail. Our SME, drawing on past experience, derived over 250 scenarios covering the end-to-end Quote to Application process. These scenarios were the basis for the functional test team to derive the test cases. In a matter of 5 business days, all test cases were written, and functional manual testing achieved its target of 30% coverage for the first week.

All execution was done in the Sauce Labs cloud to enable a collaborative testing experience while working from remote locations. Sauce Labs provided on-demand provisioning of multiple browser versions and mobile devices, keeping the remote testers highly productive. The results started showing, and we began reporting issues in the application. Since most of these issues were not reproducible, we documented a wide range of supporting information. Sauce Labs also captured an execution video for every test. This information came in handy to identify patterns, pinpoint the issue areas, and provide hard proof of the issues.

In addition to these details, we also captured the HAR files for each failed test.
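HAR files are plain JSON, so triaging a captured network log can be scripted. Below is a minimal sketch of how failed requests might be pulled out of a HAR file for defect reports; the function names are ours for illustration, not part of the project's actual tooling:

```python
import json

def failed_requests(har: dict):
    """Return (status, url, time_ms) for every failed entry in a HAR log.

    A request counts as failed if its status is 0 (aborted, no response
    received) or is a 4xx/5xx error.
    """
    failures = []
    for entry in har.get("log", {}).get("entries", []):
        status = entry["response"]["status"]
        if status == 0 or status >= 400:
            failures.append((status, entry["request"]["url"], entry.get("time", 0)))
    return failures

def load_har(path: str) -> dict:
    """Load a HAR file from disk; the format is ordinary JSON."""
    with open(path, encoding="utf-8") as f:
        return json.load(f)
```

Attaching this kind of digest alongside the execution video made intermittent failures far easier for the development team to correlate with backend logs.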

Test Automation & Performance Tests

The task for the automation team was to identify around 15 end-to-end smoke test scenarios and automate them. These scripts would be used for a daily health check and for running cross-browser/device tests during the third and final week of testing, when the fixes for defects identified during the first two weeks would go in.

It was an aggressive timeline to automate 15 test cases, dry run them, and make them compatible across desktop and mobile browsers. True to the adage 'all is well that ends well', the team completed automating 17 smoke test scenarios in 10 days flat. This was made possible by Aspire's homegrown test automation framework, AFTA 3.0. The scripting was not patch-up work but good-quality code that was efficient and scalable. The code was integrated with Jenkins to schedule daily health checks across browsers and mobile devices, which came in handy during the crucial last-mile testing. Sauce Labs greatly helped here by running multiple instances of parallel tests across browsers and mobile devices, enabling the execution of close to 400 end-to-end tests in a span of about 3 days.
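Running roughly 400 end-to-end tests in 3 days comes down to fanning the 17 smoke scenarios out across a browser/device matrix and batching the resulting jobs to the cloud plan's concurrency limit. A minimal sketch of that fan-out, with an illustrative four-browser matrix (the project's actual browser and device list is not documented here):

```python
import itertools

# Illustrative matrix only; the real runs also covered mobile devices.
BROWSERS = ["chrome", "firefox", "safari", "edge"]
SMOKE_TESTS = [f"smoke_{i:02d}" for i in range(1, 18)]  # the 17 automated scenarios

def build_matrix(tests, browsers):
    """Cross every smoke test with every browser: one cloud session each."""
    return list(itertools.product(tests, browsers))

def batches(jobs, parallel_sessions):
    """Chunk the job list to the concurrency limit of the cloud plan."""
    return [jobs[i:i + parallel_sessions] for i in range(0, len(jobs), parallel_sessions)]
```

With a limit of 10 parallel sessions, the 68 jobs above would run in 7 waves; scheduling each wave from Jenkins gives the daily health check its cross-browser coverage.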

The same was true of the performance tests: after 10 days of scripting, a couple of performance benchmark tests were run at increasing concurrent user loads of 20, 50, 100, 200, and 300 to arrive at a breakpoint.
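Stepping the load up in stages and stopping at the first level that breaches a latency target is a standard way to locate a breakpoint. A sketch of that analysis, with made-up latency numbers purely for illustration (the project's actual figures were not published):

```python
def find_breakpoint(results, sla_ms):
    """results: (concurrent_users, p95_latency_ms) pairs in ascending load order.

    Return the first load level whose p95 latency breaches the SLA,
    or None if the system held up at every tested load.
    """
    for users, p95_ms in results:
        if p95_ms > sla_ms:
            return users
    return None

# Hypothetical benchmark output for the five load steps used on the project.
runs = [(20, 800), (50, 900), (100, 1200), (200, 2100), (300, 5400)]
```

With a 2-second p95 target, these sample numbers would put the breakpoint at 200 concurrent users.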

Value for Customer

The 15 business days of independent third-party testing were executed without a glitch. The results, replete with all the critical information required to fix the issues, were highly appreciated. Yet the results were not exactly satisfactory as far as the stakeholders were concerned, primarily due to a blocker on the Safari browser that would impact anyone using Safari on Mac/iOS (nothing short of 50% of the consumer profile), and also because the defect percentage stood at over 14% of the total test cases executed. This meant that, in its current state, if the application were to go live, over 64% of consumers wouldn't be able to complete their application process.

The report garnered attention across all levels of the organization, and a second round of testing, a repeat of the first set of tests, was planned after the development team released new code with fixes. This time, the defect percentage came down to under 6%, though that was still deemed too high to go live. The code fixes and repeat testing went on for two more cycles, after which the defect percentage came down to 3.6%.

With each cycle of testing, the defects were reduced. The table below gives a clearer picture of the executed tests, defects identified, and number of days used for testing.

  Cycle    Manual Cases   Automation Cases   Defects      Defect %   Days Used
           Executed       Executed           Identified              for Testing
  Test 1   1291           394                239          14.18%     15
  Test 2   1055           525                 85           5.38%     15
  Test 3    798           340                 68           5.98%     10
  Test 4    417             0                 15           3.60%      5
  Total    3561           1259               407                     45
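The defect percentages in the table are simply defects identified divided by the total test cases executed (manual plus automated) in that cycle. A one-line sketch reproduces each figure:

```python
def defect_pct(defects, manual_executed, automated_executed):
    """Defect percentage = defects / total test cases executed, to 2 decimals."""
    return round(100 * defects / (manual_executed + automated_executed), 2)
```

For example, cycle 1 works out to 239 / (1291 + 394) = 14.18%, matching the table.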

Despite a tightly packed schedule and working fully remote, the QA team learned to work efficiently and get things done with great collaboration. By following best practices to achieve the best results, the QA team helped the client maintain the quality of their product and deliver a better product to their end-users.

Get in touch with us to know more about how we achieved all this for a valued customer.

Srinivasan Sankar

Project Manager

Author

Janaki Jayachandran

Vice President

Practice Head