Performance testing in an agile world: Time to shift left

Traditionally, the performance of an application is assessed only after it has been built. However, a cultural shift is needed to meet growing customer expectations and build scalable applications. Performance testing should be included at the beginning of a project, not at the end.

With industries adopting shorter delivery cycles for faster time to market and competitive advantage, it becomes prudent to ensure the performance of every deliverable, no matter how small. Integrating performance testing in the continuous testing process will ensure that every deliverable is tested thoroughly for functionality as well as performance.

The shift left approach involves bringing in all test requirements early in the software development lifecycle (SDLC), including both functional and non-functional requirements. Any improvements made to an application before it is pushed to production benefit both the business and the end user.

Here are three essential steps for implementing shift left testing:

1. Define clear roles and responsibilities

A clear role segmentation between architects, developers, testers and operations managers is necessary to help each understand their responsibilities within the application lifecycle.


Identifying appropriate tools based on architecture, executing performance and unit tests, designing workload models and conducting actual performance testing are some of the most essential responsibilities of the team. Defining clear communication protocols will reduce the time needed to test, debug and retest.

2. Understand the requirements in detail

A clear and complete understanding of the requirements is mandatory for successful performance testing. Here are a few areas to focus on when shifting your performance testing to the left:

  • Type of application and its architecture: Understand the compatibility of the application across web, desktop and mobile platforms, along with the database and architecture.
  • System capacity: Determine the threshold point where the system stops responding and fails to scale and handle more concurrent users. The key considerations are response time, throughput, network and memory usage, and requests per second. Scalability must be validated incrementally.
  • Application bottlenecks: Conduct due diligence on the application’s existing performance standards, user experience and responses across different integrated components.
  • KPIs at the module level: Define KPIs for modules and sub-modules to improve the efficiency and performance of these smaller units. Investing the time to identify these KPIs will help establish a baseline during the integration process.
  • Test data and reusability: Identify test data that will be used and reused across all modules. Test data can be generated manually or automatically using tools such as NeoLoad or SoapUI. This data is then stored in external CSV files or database tables for use when tests are run.
  • Load/stress: During requirements gathering, you should define the maximum load the application can sustain during production. This includes defining the number of users to be handled (single and concurrent), response time, number of requests, and CPU/memory utilization.
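The non-functional requirements gathered above can be captured as a machine-readable baseline that every test run is checked against. Here is a minimal sketch; the metric names and threshold values are illustrative assumptions, not values from any specific project:

```python
# Hypothetical SLA baseline capturing non-functional requirements;
# all names and thresholds are illustrative only.
SLA_BASELINE = {
    "max_concurrent_users": 500,     # peak concurrent users to sustain
    "max_response_time_ms": 2000,    # response-time ceiling under load
    "min_throughput_rps": 100,       # minimum requests per second
    "max_cpu_utilization_pct": 80,   # server CPU ceiling under load
}

def check_sla(metrics: dict) -> list:
    """Return a list of human-readable SLA breaches for one test run."""
    breaches = []
    if metrics.get("response_time_ms", 0) > SLA_BASELINE["max_response_time_ms"]:
        breaches.append("response time exceeds SLA")
    if metrics.get("throughput_rps", 0) < SLA_BASELINE["min_throughput_rps"]:
        breaches.append("throughput below SLA")
    if metrics.get("cpu_utilization_pct", 0) > SLA_BASELINE["max_cpu_utilization_pct"]:
        breaches.append("CPU utilization exceeds SLA")
    return breaches
```

Keeping the baseline in one place makes it easy to flag breaches consistently across sprints and modules.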


3. Manage test runs effectively

As part of the shift left approach for performance testing, test runs are performed in an iterative fashion for every sprint (in case of agile) or deployment cycle (in case of waterfall). This includes running tests, identifying anomalies against defined SLAs, optimizing scripts based on requirements and improving modules at each stage.

Introduce performance testing at build check-in level

Performing smoke tests on the builds at the time of check-in with moderate loads in the testing environment can serve as an indicator of performance issues. As with unit testing, scripts can be built to validate the performance requirements for critical business scenarios. The test data sets will enable quick and effective validation of non-functional requirements at an early stage.
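A check-in smoke test can be as simple as timing a critical operation a few times and failing fast when any run blows its budget. The harness below is a sketch under that assumption; the wrapped action would be a real HTTP request or business call in practice:

```python
import time

def smoke_test(action, max_seconds: float = 2.0, runs: int = 5) -> bool:
    """Run `action` several times; fail if any run exceeds the time budget.

    `action` is any zero-argument callable, e.g. a wrapped HTTP request
    to a critical business endpoint. Budget and run count are illustrative.
    """
    for _ in range(runs):
        start = time.perf_counter()
        action()
        elapsed = time.perf_counter() - start
        if elapsed > max_seconds:
            return False  # performance smoke test failed for this build
    return True
```

Hooking such a check into build verification turns gross performance regressions into immediate, visible failures.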

Automate test runs

Shifting performance testing left means the same code is tested repeatedly, which reduces the risk of human error slipping through. Reusing automated functional scripts and converting them into load test scripts improves efficiency. Performance test scripts should be integrated with the pipeline so that performance unit tests run on every build according to predefined trigger conditions: whenever a new build is checked into the CI/CD tool, the automated tests are triggered.
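Converting a reusable functional step into a simple load test can be sketched as running the same action from many concurrent workers. This is a minimal illustration, not a replacement for a dedicated load-testing tool; the user count is an assumption:

```python
import concurrent.futures
import time

def run_load(action, users: int = 20) -> float:
    """Reuse an automated functional step as a basic concurrent load test.

    Runs `action` once per simulated user on a thread pool and returns
    the total wall-clock time for the whole batch.
    """
    with concurrent.futures.ThreadPoolExecutor(max_workers=users) as pool:
        start = time.perf_counter()
        # Submit one invocation per simulated user and wait for all of them.
        list(pool.map(lambda _: action(), range(users)))
        return time.perf_counter() - start
```

In a pipeline, the returned duration (or per-request timings) would be compared against the SLA baseline on every build.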

Prepare the test environment properly

Setting up the environment correctly is integral to getting accurate results. Key factors to consider include:

  • Setting up a load-balancing server for uniform distribution of tasks
  • Isolating the test environment from unrelated applications while tests are running
  • Ensuring stable network bandwidth and sufficient space for the application database
  • Testing with realistic customer data while masking sensitive fields

Ensure that all tests are monitored for outages and system utilization.
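Masking sensitive data before reusing production-like records in a test environment can be sketched as a small transformation step. The field names below are illustrative assumptions; a real pipeline would be driven by the application's actual schema:

```python
import re

def mask_sensitive(record: dict) -> dict:
    """Mask sensitive fields before production-like data is used in tests.

    Field names ("email", "card_number") are illustrative; adapt the
    masking rules to your own data schema.
    """
    masked = dict(record)
    if "email" in masked:
        # Keep the first character and the domain; hide the rest.
        user, _, domain = masked["email"].partition("@")
        masked["email"] = user[0] + "***@" + domain
    if "card_number" in masked:
        # Replace every digit except the last four with an asterisk.
        masked["card_number"] = re.sub(r"\d(?=\d{4})", "*", masked["card_number"])
    return masked
```

Running such a step when test data is loaded keeps the data realistic while keeping personal details out of the test environment.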

Perform iterative end-to-end tests

Performance tests run during every sprint or deployment to assess the quality of the code against the non-functional requirements. Upon completion of each release, code is merged iteratively until all the modules are integrated into a single application.

By executing incremental performance tests on a full-size test environment, we reduce the risk of unexpected issues arising from code merges or module integrations. Once the production code is developed, an end-to-end test run is performed against the KPIs outlined during requirements gathering, as defined in the performance test strategy.

Centralize test results

Sharing test results across the board is the quickest way to solve any problem. It saves both time and cumulative effort, which in turn reduces costs.

A performance testing report provides a detailed analysis of latency, response time and errors encountered during peak load. In addition to server-side load testing, reports generated for each build help compare SLA breaches, identify poor code pushes, and baseline KPIs for every build or improvement made to the production code.
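Comparing each build's report against the agreed baseline is what turns centralized results into action. A minimal sketch, assuming higher metric values are worse (latency, error counts) and using illustrative metric names:

```python
def find_regressions(baseline: dict, build_report: dict,
                     tolerance: float = 0.10) -> dict:
    """Flag metrics that regressed beyond `tolerance` versus the baseline.

    Assumes higher values are worse (e.g. latency, error rate). Returns
    a mapping of metric name -> (baseline value, current value).
    """
    regressions = {}
    for metric, base_value in baseline.items():
        current = build_report.get(metric)
        # A metric regresses when it exceeds baseline by more than tolerance.
        if current is not None and current > base_value * (1 + tolerance):
            regressions[metric] = (base_value, current)
    return regressions
```

Publishing the output of such a comparison for every build makes SLA breaches and poor code pushes visible to the whole team at a glance.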

The early bird gets the worm

Every application or product in production will be accessed by a growing number of users, generating more traffic; request volumes may reach into the thousands or even millions. To ensure that the end-user experience is not compromised under high load, applications must be both scalable and reliable.

Any incremental changes to your application can affect performance, so the sooner you can find performance-related bugs, the easier and cheaper it will be to fix them. Shift left testing will help you do exactly that.


By Ajinth Gabriel Jelestin, Senior Consultant – Digital Assurance

By Gowtham Raja Suresh, Principal Consultant

Posted on: January 3, 2023



About Ajinth Gabriel Jelestin
Senior Consultant – Digital Assurance
Software quality assurance and services professional with 13+ years of experience in the software industry, spanning solutioning and consulting, delivery, and program management. He has worked predominantly in the manufacturing and logistics domain and has contributed to solutioning for the FSI domain for global customers. He is experienced in consulting on digital transformation journeys, supporting operational, people, and delivery transformation.



About Gowtham Raja Suresh
Principal Consultant
Seasoned leader with 18+ years of experience in delivery leadership, handling complex engagements with full accountability for delivering multiple projects across customer engagements. Through my experience and success in global team leadership, I have gained a deep understanding of domains across industries such as automotive, manufacturing and logistics, and retail. I am skilled in training, coaching, mentoring and developing teams across domains, and in managing the full cycle of large-scale, complex program delivery: contract management, SLA constructs, stakeholder management, talent transformation, business development and change management, with programs consistently delivered on time. I have maintained effective performance in meeting customer expectations through both personal and team contributions, and have built an excellent reputation as a respected test strategist, change agent, communicator, facilitator and relationship builder.
