
Performance metrics

Overview

This section presents the results of performance testing conducted on Bold Reports® On-Premise Edition, focusing primarily on how the system performs when rendering large reports.

The tests also evaluate how the system behaves under concurrent user load, providing a comprehensive view of both report rendering performance and server-side stability. These results are intended to help you assess Bold Reports® stability, responsiveness, and scalability in real-world usage scenarios.

Report Rendering Performance in On-Demand Mode

The performance evaluation began with testing how Bold Reports® handles report rendering in On-Demand Mode, which is designed to optimize responsiveness when working with large datasets.

Enabling On-Demand Mode significantly improves performance by rendering only the visible portion of the data initially, allowing faster access to the first page.
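The difference can be illustrated with a generic paging sketch in Python (illustrative only, not the Bold Reports® API): standard mode materializes the entire dataset before the first page can be shown, while on-demand mode fetches only the rows needed for the visible page.

```python
PAGE_SIZE = 50  # rows visible on the first page (illustrative value)

def first_page_standard(fetch_row, total_rows):
    # Standard mode: materialize the entire dataset, then slice page 1.
    all_rows = [fetch_row(i) for i in range(total_rows)]  # O(total_rows) work up front
    return all_rows[:PAGE_SIZE]

def first_page_on_demand(fetch_row, total_rows):
    # On-demand mode: fetch only the rows visible on page 1.
    visible = min(PAGE_SIZE, total_rows)
    return [fetch_row(i) for i in range(visible)]         # O(PAGE_SIZE) work

# Toy data source standing in for a report query; counts rows actually fetched.
rows_fetched = {"standard": 0, "on_demand": 0}

def make_fetcher(counter_key):
    def fetch(i):
        rows_fetched[counter_key] += 1
        return {"row": i}
    return fetch

first_page_standard(make_fetcher("standard"), 100_000)
first_page_on_demand(make_fetcher("on_demand"), 100_000)
print(rows_fetched)  # standard touches all 100,000 rows; on-demand only 50
```

This is why the first-page time in on-demand mode grows so slowly with record count: the work done before the first page appears is bounded by the page size, not the dataset size.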

Test Environment Configuration

  • RAM: 16 GB

  • Processor: 4 vCPUs

  • Deployment: Azure Virtual Machine

  • Report: Flat table report with 10 columns

  • Property settings: On-Demand (PageCreation → Background)

Performance Results

Note: All times are displayed in the hh:mm:ss.ms format.

| Records | First Page Time (Standard Mode) | First Page Time (On-Demand Mode) |
| ------- | ------------------------------- | -------------------------------- |
| 100K    | 00:00:25.92                     | 00:00:03.12                      |
| 500K    | 00:01:36.97                     | 00:00:05.83                      |
| 1M      | Exception thrown at 42 sec      | 00:00:08.22                      |
| 1.5M    | -                               | 00:00:20.63                      |
| 1.7M    | -                               | 00:00:21.06                      |
| 2M      | -                               | 00:00:24.81                      |

These results confirm that On-Demand Mode significantly reduces the initial load time, even for reports containing over 2 million records.
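As a quick sanity check on the table above, the speedup for the two runs that completed in both modes can be computed directly from the recorded times:

```python
def to_seconds(hms_ms):
    """Parse an 'hh:mm:ss.ms' string into seconds."""
    h, m, s = hms_ms.split(":")
    return int(h) * 3600 + int(m) * 60 + float(s)

# First-page times copied from the table (standard vs. on-demand).
runs = {
    "100K": ("00:00:25.92", "00:00:03.12"),
    "500K": ("00:01:36.97", "00:00:05.83"),
}
for records, (standard, on_demand) in runs.items():
    speedup = to_seconds(standard) / to_seconds(on_demand)
    print(f"{records}: {speedup:.1f}x faster with On-Demand Mode")
    # 100K: 8.3x, 500K: 16.6x
```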

General Performance Metrics

The following table outlines the performance metrics of Bold Reports® during report rendering with various row and column counts. The time taken indicates how long it takes to load the specified number of records. All times are in the mm:ss.ms format (1 Lakh = 100,000 rows).

Representative Performance for Various Data Sizes:

| Column Count | Row Count | First Page Time |
| ------------ | --------- | --------------- |
| 10           | 1000      | 00:05.02        |
| 10           | 50000     | 00:30.36        |
| 10           | 1 Lakh    | 00:55.33        |
| 10           | 10 Lakh   | 07:09.40        |
| 20           | 10000     | 00:13.81        |
| 20           | 2 Lakh    | 01:59.37        |
| 20           | 10 Lakh   | 09:28.77        |

Note: Load time increases with row count, particularly for larger datasets (1 Lakh rows or more), so plan capacity accordingly for reports of that size.
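The growth is easy to quantify. The short sketch below (values copied from the table above) converts the mm:ss.ms times to seconds and reports the effective throughput in rows per second:

```python
def to_seconds(mm_ss):
    """Parse an 'mm:ss.ms' string into seconds."""
    m, s = mm_ss.split(":")
    return int(m) * 60 + float(s)

# (column count, row count, first-page time) taken from the table above.
samples = [
    (10, 100_000, "00:55.33"),
    (10, 1_000_000, "07:09.40"),
    (20, 200_000, "01:59.37"),
    (20, 1_000_000, "09:28.77"),
]
for cols, rows, t in samples:
    secs = to_seconds(t)
    print(f"{cols} cols, {rows:>9,} rows: {secs:7.2f} s ({rows / secs:,.0f} rows/s)")
```

Doubling the column count roughly matches the pattern in the table: at 10 Lakh rows, the 20-column report takes about a third longer than the 10-column report.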

Complete Performance Data Table

If you would like more granular details, the full dataset is shown below.

| Column Count | Row Count | Time Taken |
| ------------ | --------- | ---------- |
| 10           | 1000      | 00:05.02   |
| 10           | 10000     | 00:08.22   |
| 10           | 20000     | 00:13.99   |
| 10           | 50000     | 00:30.36   |
| 10           | 1 Lakh    | 00:55.33   |
| 10           | 2 Lakh    | 01:04.10   |
| 10           | 3 Lakh    | 02:09.75   |
| 10           | 4 Lakh    | 02:56.11   |
| 10           | 5 Lakh    | 03:48.43   |
| 10           | 10 Lakh   | 07:09.40   |
| 20           | 1000      | 00:07.10   |
| 20           | 10000     | 00:13.81   |
| 20           | 20000     | 00:26.03   |
| 20           | 50000     | 00:55.64   |
| 20           | 1 Lakh    | 01:06.99   |
| 20           | 2 Lakh    | 01:59.37   |
| 20           | 3 Lakh    | 02:43.84   |
| 20           | 4 Lakh    | 03:31.67   |
| 20           | 5 Lakh    | 04:58.09   |
| 20           | 10 Lakh   | 09:28.77   |

Concurrent User Load Testing

We simulated 100 concurrent users to test how the system behaves under load. This type of test is essential for understanding how the system handles real-world scenarios where many users are running reports at the same time.

The test was conducted in a Kubernetes environment with the following system configuration:

  • CPU: 4 virtual CPUs (vCPUs)

  • Memory: 16 GiB RAM

The goal of this test is to help you understand how Bold Reports® On-Premise Edition performs under stress, allowing you to assess its stability and speed when multiple users are interacting with reports at once.

Test Setup and Key Parameters

The following table outlines the key parameters used during the load test, including the number of concurrent users, test duration, and the settings used to simulate the load.

| Property | Value | Explanation |
| -------- | ----- | ----------- |
| Concurrent Users | 100 | Total users accessing reports simultaneously. |
| Ramp-Up Period (seconds) | 50 | Gradual login of users; with 100 users, 2 users log in every second. |
| Loop Count | 1 | Number of times the test repeats its process. |
| Thread Lifetime (seconds) | 700 | Maximum duration of the test; it ends after the set time, even if incomplete. |
| Startup Delay (seconds) | 2 | Initial delay before the test starts, to allow for setup time. |
| Runtime Controller Runtime (seconds) | 480 | Limits how long the test runs before stopping; applied to child elements. |
| Flow Control Action Pause Duration (seconds) | 3 | Regulates the pace of requests sent to the server during the test. |
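The ramp-up arithmetic in the table (100 users over 50 seconds, i.e. 2 logins per second) can be checked with a small sketch, assuming a linear ramp-up as in Apache JMeter's thread-group model, which the property names above follow:

```python
def ramp_up_schedule(users, ramp_up_seconds):
    """Start times (in seconds) for each simulated user under a linear ramp-up."""
    interval = ramp_up_seconds / users  # time between consecutive user logins
    return [round(i * interval, 3) for i in range(users)]

starts = ramp_up_schedule(users=100, ramp_up_seconds=50)
print(starts[:4])                          # [0.0, 0.5, 1.0, 1.5]
print(sum(1 for t in starts if t < 1))     # 2 users log in within the first second
```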

Performance Results

The test results highlight the system’s stability under load:

  • 1st Run: 100%

  • 2nd Run: 99.55%

  • 3rd Run: 99.91%

These results indicate that the system consistently performs well, maintaining stability across multiple test iterations.