09 Jan 2008
This article provides some tips to improve the performance of your ASP.NET applications by using multiple recordsets, paged data access, and the Cache API. The author further examines per-request, page output, and kernel caching mechanisms, along with a description of connection pooling and GZip compression.
I am going to present some of the best approaches to improve the
performance of ASP.NET applications.
You should think about the separation of your application
into logical tiers. You might have heard of the term 3-tier (or n-tier)
physical architecture. These are usually prescribed architecture patterns that
physically divide functionality across processes and/or hardware. As the system
needs to scale, more hardware can easily be added. There is, however, a
performance hit associated with process and machine hopping, and it should be
avoided. So whenever possible, run the ASP.NET pages and their associated
components together in the same application.
Because of the separation of code and the boundaries between tiers, using Web services or remoting will decrease performance by 20 percent or more.
|Return Multiple Recordsets|
Review your database code to see whether you have request paths that go to the database more than once. (A typical example is loading master data from several tables to populate list boxes.) Each of those round-trips
decreases the number of requests per second your application can serve. By
returning multiple resultsets in a single database request, you can cut the
total time spent communicating with the database. You will also be making your system more scalable, as you cut down on the work the database server does managing requests.
While you can return multiple resultsets using dynamic SQL, it is preferable to use stored procedures. (It is arguable whether business logic should reside in a stored procedure.) Using a SqlCommand instance and its ExecuteReader method to populate strongly typed business classes, you can move the resultset pointer forward by calling NextResult on the SqlDataReader.
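A minimal sketch of the pattern, with illustrative table and column names:
// requires using System.Data.SqlClient;
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand(
    "SELECT CountryId, Name FROM Countries; SELECT StateId, Name FROM States", conn))
{
    conn.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            // populate the countries list box
        }
        reader.NextResult(); // advance to the second resultset
        while (reader.Read())
        {
            // populate the states list box
        }
    }
}
Both lists are filled in a single round-trip instead of two.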
|Paged Data Access|
The ASP.NET DataGrid exposes a wonderful capability: data paging support. When paging is enabled in the DataGrid, a fixed number of records is shown at a time, and paging UI is rendered at the bottom of the DataGrid for navigating backwards and forwards through the records.
But there is a slight wrinkle in this: paging with the DataGrid requires that all of the data be bound to the grid. If there are 100,000 records, 99,975 of them would be discarded on each request (assuming a page size of 25).
A better approach is to use a stored procedure that includes logic for returning only the required rows for each request.
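As a sketch of that idea written as a parameterized query (a stored procedure would contain the same SELECT), assuming SQL Server 2005 or later and an illustrative Employees table:
// requires using System.Data.SqlClient;
string sql =
    @"SELECT EmpId, Name FROM
        (SELECT EmpId, Name, ROW_NUMBER() OVER (ORDER BY EmpId) AS RowNum
         FROM Employees) AS Paged
      WHERE RowNum BETWEEN @start AND @end";
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand(sql, conn))
{
    int pageSize = 25, pageIndex = 0; // first page of 25 rows
    cmd.Parameters.AddWithValue("@start", pageIndex * pageSize + 1);
    cmd.Parameters.AddWithValue("@end", (pageIndex + 1) * pageSize);
    conn.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            // bind only this page's rows to the grid
        }
    }
}
Only the requested page crosses the wire; the other 99,975 rows never leave the database server.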
|Using the Cache API (Programmatic Caching)|
One of the very first things you should do before writing a
line of application code is architect the application tier to maximize and
exploit the ASP.NET Cache feature.
You simply need to include a reference to System.Web.dll in your application project. When you need access to the Cache, use the HttpRuntime.Cache property (the same object is also accessible through Page.Cache and HttpContext.Cache).
There are several rules for caching data. First, if data can
be used more than once, it is a good candidate for caching. Second, if data is
general rather than specific to a given request or user, it is a great
candidate for the cache.
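A minimal sketch of the check-then-fetch pattern (the key name and loader method are illustrative):
// requires using System; using System.Web; using System.Web.Caching; using System.Data;
DataSet products = (DataSet)HttpRuntime.Cache["ProductList"];
if (products == null)
{
    products = LoadProductsFromDatabase(); // hypothetical data-access method
    // cache for 10 minutes with no dependency and no sliding expiration
    HttpRuntime.Cache.Insert("ProductList", products, null,
        DateTime.Now.AddMinutes(10), Cache.NoSlidingExpiration);
}
Every request within the 10-minute window is served from memory instead of the database.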
|Per-Request Caching|
Whereas the Cache API is designed to cache data for a long period or until some condition is met, per-request caching simply means caching the data for the duration of the request. A particular code path is accessed frequently on each request, but the data only needs to be fetched, applied, modified, or updated once. This sounds fairly theoretical, so let us consider a concrete example.
In the Forums application of Community Server, each server control used on a
page requires personalization data to determine which skin to use, the style
sheet to use, as well as other personalization data. Some of this data can be
cached for a long period of time, but some data, such as the skin to use for
the controls, is fetched once on each request and reused multiple times during
the execution of the request.
To accomplish per-request caching, use the ASP.NET
HttpContext. An instance of HttpContext is created with every request and is
accessible anywhere during that request from the HttpContext.Current property.
The HttpContext class has a special Items collection property; objects and data
added to this Items collection are cached only for the duration of the request.
Just as you can use the Cache to store frequently accessed data, you can use
HttpContext.Items to store data that you will use only on a per-request basis.
The logic behind this is simple: data is added to the HttpContext.Items
collection when it does not exist, and on subsequent lookups the data found in
HttpContext.Items is simply returned.
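A minimal sketch of that lookup pattern (the key and the fetch method are illustrative):
// requires using System.Web;
public static object GetSkin()
{
    object skin = HttpContext.Current.Items["CurrentSkin"];
    if (skin == null)
    {
        skin = FetchSkinFromPersonalization(); // hypothetical call, runs once per request
        HttpContext.Current.Items["CurrentSkin"] = skin;
    }
    return skin;
}
However many controls call GetSkin during the request, the personalization lookup happens only once.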
|Page output caching|
If you have an ASP.NET page that generates output, whether HTML, XML, images, or any other data, and you run this code on each request and it generates the same output, you have a great candidate for page output caching.
Simply add this directive to the top of your page:
<%@ OutputCache VaryByParam="none" Duration="60" %>
Now the above effectively generates the output for this page
once and reuses it multiple times for up to 60 seconds.
This reduces the number of database hits and improves the
performance of page load.
There are several configurable settings for output caching, such as the VaryByParam attribute just shown. VaryByParam is required, and it allows you to specify the HTTP GET or HTTP POST parameters that vary the cache entries. For example, default.aspx?Report=1 and default.aspx?Report=2 could be output-cached separately by simply setting VaryByParam="Report". Additional parameters can be named by specifying a semicolon-separated list.
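For instance, to cache a separate copy per combination of two query parameters (the names here are illustrative):
<%@ OutputCache Duration="60" VaryByParam="Report;Year" %>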
|Kernel Caching (Only with IIS 6.0)|
If you are using IIS 6.0, there is a nice little feature
called kernel caching that does not require any code changes to ASP.NET. When a
request is output-cached by ASP.NET, the IIS kernel cache receives a copy of
the cached data. When a request comes from the network driver, a kernel-level
driver (no context switch to user mode) receives the request, and if cached,
flushes the cached data to the response, and completes execution.
The following default setting in the Machine.config file
ensures that dynamically generated ASP.NET pages can use kernel mode caching,
subject to the requirements listed below.
<httpRuntime enableKernelOutputCache="true" ... />
Dynamically generated ASP.NET pages are automatically cached
subject to the following restrictions:
Pages must be retrieved by using HTTP GET requests. Responses to
HTTP POST requests are not cached in the kernel.
Query strings are ignored when responses are cached. If you want
a request for http://abc.com/myapp.aspx?id=1234 to be cached in the kernel, all
requests for http://abc.com/myapp.aspx are served from the cache, regardless of
the query string.
Pages must have an expiration policy. In other words, the pages
must have an Expires header.
Pages must not have VaryByParams.
Pages must not have VaryByHeaders.
The page must not have security restrictions. In other words, the
request must be anonymous and not require authentication. The HTTP.sys driver
only caches anonymous responses.
There must be no ISAPI filters configured for the W3wp.exe instance that are unaware of the kernel cache.
|Connection Pooling|
Rather than setting up a new TCP connection on each request, a new connection is set up only when one is not available in the connection pool. When the connection is closed, it is returned to the pool, where it remains connected to the database, as opposed to completely tearing down the TCP connection.
No matter what anyone says about garbage collection within
the Microsoft .NET Framework, always call Close or Dispose explicitly on your
connection when you are finished with it. Do not wait for the CLR to close the connection for you; you have no guarantee of when garbage collection will happen!
To use connection pooling optimally, there are a couple of
rules to live by. First, open the connection, do the work, and then close the
connection. It is okay to open and close the connection multiple times on each
request if you have to (optimally you apply Tip 1) rather than keeping the
connection open and passing it around through different methods. Second, use
the same connection string (and the same thread identity if you are using
integrated authentication). If you do not use the same connection string, for
example customizing the connection string based on the logged-in user, you will
not get the same optimization value provided by connection pooling. And if you
use integrated authentication while impersonating a large set of users, your
pooling will also be much less effective. The .NET CLR data performance
counters can be very useful when attempting to track down any performance
issues that are related to connection pooling.
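The "open late, close early" pattern is easiest to get right with using blocks, which dispose the connection (returning it to the pool) even if an exception is thrown; the query here is illustrative:
// requires using System.Data.SqlClient;
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand("SELECT COUNT(*) FROM EmpTable", conn))
{
    conn.Open(); // served from the pool rather than a new TCP connection
    int count = (int)cmd.ExecuteScalar();
} // Dispose closes the connection, returning it to the pool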
|Using GZip compression|
Using gzip compression can decrease the number of bytes sent
by the server. This gives the perception of faster pages and also cuts down on
bandwidth usage. Depending on the data sent, how well it can be compressed, and
whether the client browsers support it (IIS will only send gzip compressed
content to clients that support gzip compression, such as Internet Explorer 6.0
and Firefox), your server can serve more requests per second. In fact, just
about any time you can decrease the amount of data returned, you will increase
requests per second.
To make this change, you are going to need to edit the Metabase. Edit
\Windows\System32\inetsrv\MetaBase.xml with your favorite text editor. Search
for "IIsCompressionScheme." There will be two XML elements, one for
deflate and one for gzip. Both elements have properties called HcFileExtensions
and HcScriptFileExtensions. These contain a space-delimited list of file
extensions for compressible content. At a bare minimum, you will need to add
aspx to the HcScriptFileExtensions list. Note that if the properties are left
blank, then all content, regardless of file extension, will be compressed.
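As an illustrative sketch only (the attribute values and remaining properties in your metabase will differ), the gzip element might look roughly like this after adding aspx:
<IIsCompressionScheme Location="/LM/W3SVC/Filters/Compression/gzip"
    HcFileExtensions="htm html txt"
    HcScriptFileExtensions="asp dll exe aspx"
    ... />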
|Server Control View State|
View state serializes the state of a page's controls into a hidden form field; when the page is posted back to the server, the server can parse, validate, and apply this view state data back to the page's tree of
controls. View state is a very powerful capability since it allows state to be
persisted with the client and it requires no cookies or server memory to save
this state. Many ASP.NET server controls use view state to persist settings
made during interactions with elements on the page, for example, saving the
current page that is being displayed when paging through data.
There are a number of drawbacks to the use of view state,
however. First of all, it increases the total payload of the page both when
served and when requested. There is also an additional overhead incurred when
serializing or deserializing view state data that is posted back to the server.
Lastly, view state increases the memory allocations on the server.
Several server controls, the most well known of which is the DataGrid, tend to
make excessive use of view state, even in cases where it is not needed. View state is enabled by default, but if you do not need it, you can turn it off at the control or page level. Within a control, you simply
set the EnableViewState property to false, or you can set it globally within
the page using this setting.
<%@ Page EnableViewState="false" %>
It is also possible to disable view state for a particular control:
<asp:TextBox id="text1" runat="server" EnableViewState="false"></asp:TextBox>
If you are not doing postbacks in a page or are always
regenerating the controls on a page on each request, you should disable view
state at the page level.
|Trim the page size|
Use script includes for any static scripts in your page to
enable the client to cache these scripts for subsequent requests. The following
script element shows how to do this:
<script type="text/javascript" src="scripts/myscript.js"></script>
Remove characters such as tabs and spaces that create white space before you send a response to the client. Removing white space can dramatically reduce the size of your pages. The following sample table (an illustrative fragment) contains white space:
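<!-- with white space -->
<table>
    <tr>
        <td>Name</td>
        <td>Age</td>
    </tr>
</table>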
The same table does not contain white space:
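<!-- without white space -->
<table><tr><td>Name</td><td>Age</td></tr></table>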
In an Internet scenario that involves slow clients, removing white space can improve response times dramatically.
Limit the use of graphics, and consider using compressed graphics.
Consider using cascading style sheets to avoid sending the same
formatting directives to the client repeatedly.
Avoid long control names, especially ones that are repeated in a
DataGrid or Repeater control. Control names are used to generate unique HTML ID
names. A 10-character control name can easily turn into 30 to 40 characters
when it is used inside nested controls that are repeated.
|Using Page.IsPostBack to Minimize Redundant Processing|
Use the Page.IsPostBack property to ensure that you only perform page initialization logic when a page is first loaded, and not in response to client postbacks. The following code fragment shows how to use the IsPostBack property:
if (!Page.IsPostBack) // false on the first request, true on postbacks
{
    // load the dataset for the first time here
}
// on postbacks, reuse the dataset loaded on the first request
|Using Server.Transfer instead of Response.Redirect|
Response.Redirect sends a redirect response (HTTP status 302) to the client, which makes the client send a new request to the server using the new URL. Server.Transfer avoids this extra round-trip by making a server-side call.
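For example (the page name is illustrative):
// round-trip: the client receives a 302 response and requests the new URL itself
Response.Redirect("NewPage.aspx");
// server-side: execution transfers to NewPage.aspx within the same request,
// so the URL shown in the browser does not change
Server.Transfer("NewPage.aspx");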
|Avoid Using Page.DataBind; Instead Call DataBind on Specific Controls|
Calling Page.DataBind invokes the page-level method. The
page-level method in turn calls the DataBind method of every control on the
page that supports data binding. Instead of calling the page-level DataBind,
call DataBind on specific controls. Both approaches are shown in the following examples.
The following line calls the page-level DataBind, which in turn recursively calls DataBind on each control that supports data binding:
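Page.DataBind();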
The following line calls DataBind on a specific control instead:
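dataGrid1.DataBind(); // dataGrid1 is an illustrative control ID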
|Minimize Calls to DataBinder.Eval|
The DataBinder.Eval method uses reflection to evaluate the
arguments that are passed in and to return the results. If you have a table
that has 100 rows and 10 columns, you call DataBinder.Eval 1,000 times if you
use DataBinder.Eval on each column. The cost of that reflection is paid 1,000 times in this scenario. Limiting the use of DataBinder.Eval during data binding operations significantly improves page performance. Consider the following ItemTemplate element within a Repeater control using DataBinder.Eval:
<td><%# DataBinder.Eval(Container.DataItem,"field1") %></td>
<td><%# DataBinder.Eval(Container.DataItem,"field2") %></td>
There are alternatives to using DataBinder.Eval in this
scenario. The alternatives include the following:
Use explicit casting. Explicit casting offers better performance by avoiding the cost of reflection. Cast Container.DataItem as a DataRowView:
<td><%# ((DataRowView)Container.DataItem)["field1"] %></td>
<td><%# ((DataRowView)Container.DataItem)["field2"] %></td>
You can gain even better performance with explicit casting
if you use a DataReader to bind your control and use the specialized methods to
retrieve your data. Cast the Container.DataItem as a DbDataRecord.
<td><%# ((DbDataRecord)Container.DataItem).GetString(0) %></td>
<td><%# ((DbDataRecord)Container.DataItem).GetInt32(1) %></td>
|Disable unnecessary session state|
Session state is used to store information specific to a user, such as data tied to the user's authorization. But if some pages are independent of the user's session, you should disable session state for those pages:
<%@ Page EnableSessionState="false" %>
|Use SqlDataReader to Access Read-Only Data Instead of a DataSet|
A DataReader is a lean, mean access method that returns
results as soon as they are available, rather than waiting for the whole of the
query to be populated into a DataSet. This can boost your application performance
quite dramatically and, once you get used to the methodology, can be quite
elegant in and of itself.
To get the first employee's information with a DataSet, we wait until the whole resultset has been loaded:
// requires using System.Data; using System.Data.SqlClient;
SqlConnection conn = new SqlConnection(connectionString);
SqlDataAdapter dta = new SqlDataAdapter("select * from EmpTable", conn);
DataSet ds = new DataSet();
dta.Fill(ds); // the adapter opens and closes the connection itself
string strFirstEmpInfo = ds.Tables[0].Rows[0][0].ToString();
With a DataReader, once the required record has been fetched, there is no need to pull back the remaining rows, which can boost your application's performance:
SqlConnection conn = new SqlConnection(connectionString);
SqlCommand comm = new SqlCommand("select * from EmpTable", conn);
conn.Open(); // the connection must be open before ExecuteReader
SqlDataReader dr = comm.ExecuteReader(CommandBehavior.CloseConnection);
string strFirstEmpInfo = null;
if (dr.Read()) // advance to the first record
    strFirstEmpInfo = dr.GetString(0);
dr.Close(); // CommandBehavior.CloseConnection closes the connection too
Not only can we inspect as we go, but the DataReader only
stores one result at a time on the client. This results in a significant
reduction in memory usage and system resources when compared to the DataSet,
where the whole query is stored.
|Use of server controls|
There are two kinds of controls in ASP.NET: server controls and HTML controls. HTML controls only raise client-side events, whereas a server control (marked with the runat="server" attribute) is created as an object on the server. Server controls are powerful, but they are also more expensive, so choose between the two depending on the situation.
These are some of the tips that I have found useful for writing ASP.NET applications.