Load Testing Crystal Reports Caching Features
by Eric Landes

Introduction

For those of us integrating Business Intelligence tools into our applications, it can be useful to have information on how those tools perform. The idea behind this article is to run some load tests against Crystal Reports in a .NET environment. We want to see how effective the standard Crystal caching is, and to help developers discover where their reports may need further optimization.

Since this is a test case with a very limited scope, the numbers and results in this article are meant for comparison. They are not intended as absolute figures for how your reports should be deployed. Rather, they can help as you decide whether to use shared data sources and how record volume interacts with Crystal Reports caching.

System Requirements

· Visual Studio 2008 Team Suite (Developer and Test Editions used)

· Crystal Reports 10

· SQL Server 2005

Crystal Reports Caching

To understand what types of caching are available in Crystal's bag of tricks, let us take a peek under the Crystal caching hood. According to MSDN, a standard Crystal report displayed in the Crystal Report Viewer uses a default caching strategy that is good for most scenarios.

For higher-end scenarios, where the out-of-the-box caching is not performing as well as you would like, you can take advantage of programmable caching options. The load test scenario here uses the out-of-the-box caching in the Crystal Report Viewer.
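For reference, the programmable option generally means implementing the ICachedReport interface from the CrystalDecisions.ReportSource namespace instead of handing the viewer a plain ReportDocument. The sketch below is a minimal, hedged example of that approach; the report path and cache timeout are assumptions, not values taken from the tests in this article.

```csharp
using System;
using System.Web;
using CrystalDecisions.CrystalReports.Engine;
using CrystalDecisions.ReportSource;
using CrystalDecisions.Shared;

// Minimal sketch of a cacheable report wrapper. Implementing ICachedReport
// lets the Crystal report source cache the processed report and share it
// across requests that resolve to the same cache key.
public class CachedCustomerReport : ICachedReport
{
    // Tell the report source this report may be cached at all.
    public virtual bool IsCacheable
    {
        get { return true; }
        set { }
    }

    // Whether requests with different database logons may share the cached copy.
    public virtual bool ShareDBLogonInfo
    {
        get { return false; }
        set { }
    }

    // How long the processed report stays in the cache (assumed value).
    public virtual TimeSpan CacheTimeOut
    {
        get { return TimeSpan.FromMinutes(10); }
        set { }
    }

    // Called on a cache miss to build the actual report.
    public virtual ReportDocument CreateReport()
    {
        ReportDocument report = new ReportDocument();
        // Hypothetical path; point this at the report used in your own tests.
        report.Load(HttpContext.Current.Server.MapPath("~/Reports/Customer.rpt"));
        return report;
    }

    // The cache key decides which requests share a cached instance.
    // Returning null falls back to the default key-generation behavior.
    public virtual string GetCustomizedCacheKey(RequestContext request)
    {
        return null;
    }
}
```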

Test Strategy

The test strategy is to simulate a web environment and access several reports from it. Using Visual Studio 2008 Test Edition, a load test called "LoadTest1" was created. As the illustration below shows, this test includes two web tests, connects with a browser mix of IE and Firefox, and uses a network mix of LAN and cable/DSL speeds.

Figure 1: Load Test Screen

Here are the Visual Studio automated web tests included in this load test. The first test is called "Customer Report." It displays one report, Customer.rpt, which joins four tables together. The test displays the first page of the report and then pages through it. The report is displayed on a web page using the Crystal web viewer control. Customer Report also contains a validation rule. This first test is simple and is not intended to put a lot of stress on the application or server.

Figure 2: Customer Report Test
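For context, the page under test binds the report to the viewer roughly as shown below. This is a minimal sketch, assuming an ASP.NET page with a CrystalReportViewer control named CrystalReportViewer1 and the report file living in a ~/Reports folder; neither name comes from the article.

```csharp
using System;
using System.Web.UI;
using CrystalDecisions.CrystalReports.Engine;

// Code-behind sketch for the report page exercised by the web tests.
public partial class CustomerReportPage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Load the report definition and hand it to the web viewer control.
        // With the default (out-of-the-box) caching, the viewer's report
        // source decides when the processed report can be reused.
        ReportDocument customerReport = new ReportDocument();
        customerReport.Load(Server.MapPath("~/Reports/Customer.rpt"));
        CrystalReportViewer1.ReportSource = customerReport;
        CrystalReportViewer1.DataBind();
    }
}
```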

The second test is called AllReports. This test uses a web page to display three different reports: first the Customer report, then SalesOrders.rpt, and then SalesPersons.rpt. Each report is displayed and paged through. SalesOrders.rpt includes three tables, and SalesPersons.rpt joins more than six tables. All of this is recorded as a web test in Visual Studio 2008 Test Edition.
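Recorded web tests can also be expressed as coded web tests, which makes the validation rule easier to see. Below is a minimal sketch of what the Customer Report test might look like in coded form; the URL, the page name, and the validation text are assumptions rather than values taken from the recorded tests.

```csharp
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;
using Microsoft.VisualStudio.TestTools.WebTesting.Rules;

// Coded version of a simple report web test: request the report page and
// validate that the rendered output contains some expected text.
public class CustomerReportWebTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        // Hypothetical URL for the page hosting the Crystal web viewer.
        WebTestRequest reportRequest =
            new WebTestRequest("http://localhost/Reports/CustomerReport.aspx");

        // Validation rule: fail the test if the expected text is missing.
        ValidationRuleFindText findText = new ValidationRuleFindText();
        findText.FindText = "Customer";          // assumed marker text
        findText.IgnoreCase = true;
        findText.UseRegularExpression = false;
        findText.PassIfTextFound = true;
        reportRequest.ValidateResponse += findText.Validate;

        yield return reportRequest;
    }
}
```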

In the load test, the browser mix is set to 84% IE 7.0 and 16% Firefox. Other browser types could be added to the mix if necessary. The test's network mix includes LAN and two cable/DSL speeds. The aim is to simulate something close to the real-world mix of connections and browsers hitting the site for a global enterprise intranet application.

This first load test is fairly simple. It runs for 10 minutes and distributes the user load with a maximum of 25 users. This test serves as our benchmark. The next set of tests will include new records added to the tables that the reports are based on; that process is explained in the next section. Below are the results of the first load test run, which took a total of 15 minutes to complete.

The times were gathered on a Virtual PC image running Windows Server 2003, with 1 GB of RAM available to it.

Figure 3

For this run, the AllReports test took an average of 152 seconds, while the CustomerReport test took 88.8 seconds. Because AllReports displays three reports, you would expect it to take considerably longer. Surprisingly, at roughly 152 / 88.8 ≈ 1.7, it does not even take twice as long as CustomerReport.

The first page takes 7.44 seconds to load; that page displays the Customer report. All other page times are below 1 second. Now that a baseline is established, the next step is to see how the reports perform after data is added dynamically.

Second Tests

For the second run of the tests, we added records to different tables used in the reports. In a normal scenario, data would either be added to a live database in real time, or your data warehouse would be updated once a night with the delta data.

For this test, the assumption is that the reports run off the live database, not a data warehouse. This simple test added 142 records to the SalesOrderHeader table and 341 records to the SalesOrderDetail table. After adding those records, we reran the load test to see what effect they had.
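The article does not show how the rows were added, but one quick way to seed comparable test data is to copy existing rows back into the table. The sketch below is only an illustration, assuming the standard AdventureWorks Sales.SalesOrderDetail schema and a local SQL Server 2005 instance; adjust the connection string, table, and column list to match your own database, and seed SalesOrderHeader the same way.

```csharp
using System;
using System.Data.SqlClient;

// Sketch: push extra rows into the AdventureWorks sample tables so the next
// load-test run hits data that the report cache has not seen yet.
class TestDataSeeder
{
    static void Main()
    {
        // Assumed connection string for a local AdventureWorks database.
        const string connectionString =
            "Data Source=.;Initial Catalog=AdventureWorks;Integrated Security=True";

        // Re-insert a copy of the newest detail rows. Identity, computed, and
        // defaulted columns (SalesOrderDetailID, LineTotal, rowguid,
        // ModifiedDate) are left for SQL Server to populate.
        const string seedSql = @"
            INSERT INTO Sales.SalesOrderDetail
                (SalesOrderID, CarrierTrackingNumber, OrderQty, ProductID,
                 SpecialOfferID, UnitPrice, UnitPriceDiscount)
            SELECT TOP 341
                 SalesOrderID, CarrierTrackingNumber, OrderQty, ProductID,
                 SpecialOfferID, UnitPrice, UnitPriceDiscount
            FROM Sales.SalesOrderDetail
            ORDER BY SalesOrderDetailID DESC;";

        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand(seedSql, connection))
        {
            connection.Open();
            int rows = command.ExecuteNonQuery();
            Console.WriteLine("Inserted {0} SalesOrderDetail rows.", rows);
        }
    }
}
```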

The second run took around 10 times longer than the first. See Figure 4, which shows the results of the test. These results show that the new records affected performance dramatically, and in the 10-minute test the off-the-shelf caching did not appear to have much effect.

After running the test a second time, though, performance improved dramatically. See Figure 5 for the improved reporting times. This time the tests ran roughly 10 times faster than the run shown in Figure 4.

The test in Figure 4 showed four errors: one HTTP request timeout, one error accessing the performance counters, and two errors accessing the performance category "Network Interface."

The test in Figure 5 showed one error: the performance category "Memory" could not be accessed.

The percent of processor time in the Figure 4 test was 56.3%; in the Figure 5 test it was 50.5%. This suggests that the reports do not use a significantly larger percentage of processor time while new data is being added to the cache.

Figure 4

Digging into the details of the test scenarios should reveal any speed patterns among the different reports. Drilling down into the Figure 4 test, we see that most of the time is spent in the first two scenarios. The details of the Figure 5 test are similar: the results are faster, but the first two scenarios are still the slowest in the test group.

Figure 5

These results do not show the exact cause of the slower responses. It looks like the caching takes a little while to show up in the tests. Keep in mind that these tests do not conclusively show that the quicker results in the Figure 5 test are due to Crystal caching; it could be caching done at the database level as well. We will explore this subject in future articles.

There are several ways to try to speed up the response times. For instance, we could improve the SQL and tune the database, or we could work programmatically with the Crystal caching model. We will see whether some of these techniques improve the results in a future article.
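One programmatic option that does not touch the Crystal caching interfaces at all is to keep the loaded ReportDocument alive between requests in ASP.NET state. This is a common workaround rather than anything prescribed here; the sketch below assumes the same hypothetical Customer.rpt page as earlier and stores the document in Session so that paging through the viewer does not reload the report and re-query the database on every postback.

```csharp
using System;
using System.Web.UI;
using CrystalDecisions.CrystalReports.Engine;

// Sketch: reuse one ReportDocument per user session so that paging through
// the viewer does not reload the .rpt file and rerun the query each time.
// This complements, and is independent of, Crystal's own caching.
public partial class CustomerReportPage : Page
{
    private const string ReportSessionKey = "CustomerReportDocument";

    protected void Page_Init(object sender, EventArgs e)
    {
        ReportDocument customerReport = Session[ReportSessionKey] as ReportDocument;
        if (customerReport == null)
        {
            customerReport = new ReportDocument();
            // Hypothetical path, as in the earlier sketch.
            customerReport.Load(Server.MapPath("~/Reports/Customer.rpt"));
            Session[ReportSessionKey] = customerReport;
        }

        // Rebind the viewer on every request, including viewer postbacks
        // (paging, zooming), using the already-loaded document.
        CrystalReportViewer1.ReportSource = customerReport;
    }
}
```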

Summary

The tests we viewed show that caching has some effect when running Crystal reports. When a report is run after data has been added to the database, there is an initial hit to speed. This also supports the accepted practice of speeding up reports by running them against a data warehouse that is updated only periodically rather than in real time. In later articles we will explore ways to speed up your reporting with Crystal caching. Happy coding!

