The Importance of Performance Testing

Mark Franks | April 28, 2017

Over the past nine months, we have spent a good amount of time discussing the Acumatica Cloud xRP platform, drilling into a number of topics and demonstrating its power and utility. We discussed customization via White Labeling and Creating & Validating Fields, covered Web Services and Debugging your applications, and looked at the various opportunities for ISV Partners along our Integration Spectrum, among a number of other topics.

One of the often neglected arts in software development is Performance Testing. This is a topic that is dear to my heart: earlier in my career I brought many enterprise ISV partners into our performance labs at Microsoft. At our recent Summit, I had a conversation with Rahul Gedupudi, CEO of Kensium, and found that he too was very interested in the topic. So, of course, I asked him to write about it for us in the context of our platform. He was very gracious, as he always is, and agreed to deliver the following guest post for our developer & partner community.

ENHANCE PRODUCTS THROUGH PERFORMANCE TESTING

The Acumatica Framework delivers a set of core services and tools that are important for building and deploying large business applications. All of these tools and services are generic and transparent to the application developer, who need not worry about implementing them during the design or programming stages. Together, Acumatica's development framework and the Acumatica Extensibility Framework (AEF) provide endless possibilities for independent software vendors (ISVs) to build solutions on the Acumatica platform. In addition to creating add-on solutions, the AEF can also be leveraged to customize a customer's installation. Some of the core features of AEF include:

  • Customization of the data access layer through extensions of the database schema and existing tables (see the sketch after this list).
  • Customization of the business logic layer through extension classes built into a separate assembly.
  • Support for multiple interdependent extensions of the data access layer and business logic layer on a single instance.
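To make the first two features concrete, here is a minimal sketch of a data access layer extension and a business logic extension. It assumes the standard PXCacheExtension and PXGraphExtension base classes; SOOrder, SOOrderEntry, and the UsrPriority field are purely illustrative stand-ins for whatever DACs and graphs your own solution extends.

    using PX.Data;
    using PX.Objects.SO;

    // Data access layer extension: adds a field to an existing DAC.
    public class SOOrderExt : PXCacheExtension<SOOrder>
    {
        [PXDBInt]
        [PXUIField(DisplayName = "Priority")]
        public virtual int? UsrPriority { get; set; }
        public abstract class usrPriority : IBqlField { }
    }

    // Business logic layer extension: adds behavior to an existing graph
    // from a separate assembly, without touching the base graph's source.
    public class SOOrderEntryExt : PXGraphExtension<SOOrderEntry>
    {
        protected virtual void SOOrder_RowSelected(PXCache cache, PXRowSelectedEventArgs e)
        {
            // Additional UI or business logic runs alongside the base handlers.
        }
    }

Because the extension classes never reference the base implementation's source directly, they can be compiled into their own assembly and deployed side by side with other customizations.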

These AEF features, along with others that provide powerful extensibility support, allow developers to build scalable Acumatica-based applications without worrying about interactions with third-party customization. Along with solid structural flexibility, the AEF gives ISVs a number of advantages, including:

  • The ability to deploy multiple projects extending a single DAC or BLC.
  • A measure of protection for the source code and intellectual property via pre-compilation.
  • An auto-discovery mechanism that makes deployment and upgrade processes straightforward.
  • An advanced level of control over business logic and multilevel extension models.
  • Extensions with no hard dependencies on base classes, meaning they are fairly resilient across upgrades.

Compatible with C#.NET-compliant code, the AEF provides endless solution capabilities for ISVs building their third-party products and extensions on Acumatica. The framework delivers faster development times and access to built-in tools that help reduce errors in design, coding, and more.

Because complex business logic varies between applications, performance and scalability can be unpredictable. Products should be tested in every case to ensure an optimized final product.

One of the most common challenges that ISVs face in optimizing their products is the handling and processing of large amounts of data. These pieces of data can include stock items, customers, orders, shipments, and other essential data that need to be exchanged between Acumatica and third-party applications. Issues related to these factors should be properly addressed to guarantee consistent levels of performance at all reasonable capacities.

DISCOVERING DEGRADATION WITH LOAD TESTING

Until products are put through a comprehensive QA process, there is no guarantee that they will run flawlessly under all conditions. While many common tests check that the product runs seamlessly in all necessary areas for a single user, load testing verifies whether the product continues to function just as well when several users place the system under specific, expected conditions.

In short, load testing exposes bottlenecks and potential design flaws that lead to degradation. These types of results can save you from costly reworks that may be necessary after an inadequately tested product goes to market.

Load testing results give product developers useful insights regarding interactions inside the product environment, performance analytics, and product optimization tasks. These insights can be used to make adjustments that improve the scalability of the application and ensure that you’re doing everything in your control to release a stable, functional product.

There are several other types of testing that are similar to load testing, with the most popular of these probably being stress testing. While load testing is only meant to simulate expected loads and gather performance results, stress testing pushes systems to the point at which they fail to function properly and begin to break.

Stress testing can yield results that help increase product capacity, but these results often appear at levels far beyond realistic peak loads. If a traditional load test yields the results you would expect from a stress test and a breaking point is revealed, the system’s load capacity must increase dramatically to avoid degradation and breakdown in regular use.
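As a rough illustration of this distinction, the sketch below holds concurrency at an expected level for a load test and keeps doubling it for a stress test until a response-time target is missed. The runScenarioAsync delegate is a placeholder for whatever actually drives your product (web service calls, Test SDK scenarios, import scenarios); nothing here is an Acumatica-specific API.

    using System;
    using System.Diagnostics;
    using System.Linq;
    using System.Threading.Tasks;

    public static class LoadVersusStress
    {
        // Load test: hold concurrency at the expected peak and measure timings.
        public static async Task<double> MeasureAverageMsAsync(Func<Task> runScenarioAsync, int concurrentUsers)
        {
            double[] timings = await Task.WhenAll(
                Enumerable.Range(0, concurrentUsers).Select(async _ =>
                {
                    var sw = Stopwatch.StartNew();
                    await runScenarioAsync();
                    return sw.Elapsed.TotalMilliseconds;
                }));
            return timings.Average();   // compare against your response-time target
        }

        // Stress test: keep doubling concurrency until the response-time target
        // is missed, revealing the approximate breaking point.
        public static async Task<int> FindBreakingPointAsync(Func<Task> runScenarioAsync, double targetAvgMs, int maxUsers = 4096)
        {
            for (int users = 16; users <= maxUsers; users *= 2)
            {
                if (await MeasureAverageMsAsync(runScenarioAsync, users) > targetAvgMs)
                    return users;       // first concurrency level that misses the target
            }
            return -1;                  // no breaking point found within maxUsers
        }
    }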

REQUEST PROFILER: ACUMATICA’S BUILT-IN TESTING TOOL

Acumatica’s built-in Request Profiler tool provides a way for development teams to test their product’s capacity. It measures the performance of the logic behind different functions and screens. Some of the components the Request Profiler can measure to help pinpoint degradation and bottlenecks include:

  • Screen being accessed
  • Time to render
  • Executed activities
  • Interactions with the database
  • Execution times
  • Code being used

TYPICAL INPUTS, OUTPUTS, AND VARIABLES

While every load test should be considered its own entity and parameters should be based on requirements specific to the company and product, here are some of the more common inputs, outputs, and variables.

Input: The information that is assembled for the test
Output: The information that is yielded from the test
Variables: The specific information sought in outputs

 

  • Inputs: number of users, critical scenarios, parameters, workload models, feedback methods
  • Outputs: behavior of the product, performance data, threshold data, bottlenecks, product finality
  • Variables: database touch-points, execution times, server communication, traffic variation, peak capacity
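As one hypothetical way of representing these items in code, the classes below sketch a test plan and a result summary; the names are illustrative only and are not part of any Acumatica or Kensium API.

    using System;

    // Hypothetical container for load test inputs (see the lists above).
    public class LoadTestPlan
    {
        public int NumberOfUsers { get; set; }              // input
        public string[] CriticalScenarios { get; set; }     // input, e.g. "Create Sales Order"
        public TimeSpan RampUpPeriod { get; set; }          // input: part of the workload model
    }

    // Hypothetical summary of outputs and variables gathered from a run.
    public class LoadTestResult
    {
        public TimeSpan AverageExecutionTime { get; set; }  // output: execution times
        public int DatabaseTouchPoints { get; set; }        // variable: SQL calls observed
        public int PeakCapacity { get; set; }               // variable: users handled before degradation
    }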

 

IMPROVING PERFORMANCE THROUGH INSIGHTS

Through countless hours of testing our Acumatica products, we have uncovered several insights, which we share below. While we cannot say exactly how Acumatica products should be coded to your specific requirements, here are some things we have learned and often apply to our products today; they should help you in your development as well.

  • Use Operation.Status to check the operation status, especially when overriding the RowPersisted event.
  • Use joins in BQL rather than querying against the BQL output, to avoid iterative operations when reading data.
  • Use PXFormula for calculations rather than performing them in .NET code inside the business logic (see the sketch after this list).
  • Use PXTransactionScope to automatically roll back unwanted data persistence when working with multiple views.
  • Use the endpoints and extensibility made available in Acumatica 6 to avoid issues related to web service calls.
  • Create custom attributes supported by the Acumatica framework when there is complex business logic involved in processing.
  • Optimize business logic with enriched BQL queries to avoid loading large amounts of data on third-party integrations.
  • Handle dependent grids where applicable to avoid loading large data grids, and limit the use of pop-ups that make the UX complex.
  • Unless essential, avoid field defaulting events.
  • Unless essential, avoid instantiating graph objects inside loops.
  • Avoid instantiating new PXGraph() and new PXSelect(); instead, refactor your code to reuse the current graph and its views (for example, through the Base reference), as applicable.
  • Avoid heavy logic in events that fire on every invocation, such as RowSelected; they can cause a huge difference in performance while rendering data.
  • Avoid executing BQL views in looping operations unless essential; use a caching mechanism to avoid hitting the database multiple times.
  • Avoid loading large amounts of data on grids, selectors, custom inquiry screens, and other processing pages.
  • Avoid BQL queries in events such as RowInserted and RowUpdated, unless the BQL returns at most a single record.
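As a brief illustration of the PXFormula and PXTransactionScope tips above, here is a minimal sketch. It assumes the standard extension base classes; the sales order DACs and the UsrLineTotal field are used purely as examples, not as a prescription for your product.

    using PX.Data;
    using PX.Objects.SO;

    public class SOLinePerfExt : PXCacheExtension<SOLine>
    {
        // PXFormula lets the framework keep the value calculated,
        // instead of recalculating it in event-handler code.
        [PXDBDecimal(4)]
        [PXFormula(typeof(Mult<SOLine.orderQty, SOLine.curyUnitPrice>))]
        [PXUIField(DisplayName = "Line Total", Enabled = false)]
        public virtual decimal? UsrLineTotal { get; set; }
        public abstract class usrLineTotal : IBqlField { }
    }

    public class SOOrderEntryPerfExt : PXGraphExtension<SOOrderEntry>
    {
        // PXTransactionScope rolls everything back unless Complete() is called,
        // which avoids persisting partial data when several views are involved.
        public void PersistAtomically()
        {
            using (PXTransactionScope ts = new PXTransactionScope())
            {
                Base.Persist();
                ts.Complete();
            }
        }
    }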

To find insights regarding your own products, we recommend following Acumatica’s testing procedures. These should be followed to maintain compliance with Acumatica’s coding standards and to ensure optimal functionality on the platform. Test your products using built-in Acumatica features such as web services, Integration Services, the Test SDK, and the Request Profiler. Outlined below are some of these tools and their usage.

  • Use built-in APIs to generate your desired number of requests and measure critical metrics (response time, CPU usage, etc.). These can be executed in batches to simulate numerous live users inside the application (a rough sketch follows this list).
  • Generate the desired amount of data entry operations using Acumatica Integration Services. You can create several import scenarios and run them simultaneously with the help of Acumatica’s scheduled operations mechanism.
  • Compose several GUI-based data entry scenarios using Acumatica Test SDK. You can run them in parallel against the server to simulate the desired number of concurrent users.
  • Use the Request Profiler to obtain your business logic’s performance metrics, such as the time taken to perform a certain action and the number of SQL calls made to the database server. Learn how to optimize these procedures to achieve optimal performance on Acumatica.
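Here is a rough, generic sketch of the first approach: batching concurrent requests and recording their response times. The URL, authentication, and request mix are placeholders; substitute the screen-based or contract-based web service calls your own product exposes.

    using System;
    using System.Diagnostics;
    using System.Linq;
    using System.Net.Http;
    using System.Threading.Tasks;

    public static class ApiLoadDriver
    {
        // Fires batches of concurrent GET requests against a given URL and
        // reports the average and worst response times per batch.
        public static async Task RunBatchesAsync(HttpClient client, string url, int batchSize, int batches)
        {
            for (int b = 1; b <= batches; b++)
            {
                double[] timings = await Task.WhenAll(
                    Enumerable.Range(0, batchSize).Select(async _ =>
                    {
                        var sw = Stopwatch.StartNew();
                        using (HttpResponseMessage response = await client.GetAsync(url))
                        {
                            response.EnsureSuccessStatusCode();
                        }
                        return sw.Elapsed.TotalMilliseconds;
                    }));

                Console.WriteLine($"Batch {b}: avg {timings.Average():F0} ms, max {timings.Max():F0} ms");
            }
        }
    }

Running several such batches while the Request Profiler is enabled lets you correlate client-side response times with the server-side execution times and SQL call counts it records.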

Following Acumatica’s coding standards and testing protocols has allowed us to reduce performance issues in our applications. By incorporating the solutions mentioned above into our development and review processes, we have improved the overall quality of our products and pass that value on to our clients.

For information on how we at Kensium Solutions can help you bring your Acumatica products to customers, visit our Acumatica Partners page.

Also, please watch Rahul’s video discussing development on our platform here.

Mark Franks

As a Platform Evangelist, Mark is responsible for showing people the specifics of what makes Acumatica’s Cloud Development Platform wonderfully attractive to ISVs & Partners. He's also passionate about Running, Latin, and his family. | E-mail: mfranks@acumatica.com | Skype: mfranks |
