.NET Data Access Performance Comparison
Published: 16 Feb 2005
Unedited - Community Contributed
Abstract
This article compares several different data access techniques through the use of a stress testing tool. DataTables and DataReaders are compared across a number of different variables, and recommendations for best practices are provided.
by Steven Smith

Introduction and Background

Download Sample Files

In .NET, there are several ways to extract data from a data source. The two most common techniques in ADO.NET are reading with a DataReader and filling a DataSet or DataTable with a DataAdapter. In this article, an easy-to-reproduce set of tests is analyzed to determine which technique performs fastest. Additional variables, such as N-Tier architecture and the effects of caching, are also considered. Finally, I recommend some best practices based on the results.
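
For reference, here is a minimal sketch of the two techniques under test (the connection string, table, and column names are illustrative placeholders, not necessarily those used in the actual tests):

// Technique 1: forward-only access with a DataReader
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand("SELECT * FROM Authors", conn))
{
    conn.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            string lastName = (string)reader["au_lname"]; // read each row as it streams in
        }
    }
}

// Technique 2: filling a disconnected DataTable with a DataAdapter
DataTable table = new DataTable();
using (SqlDataAdapter adapter = new SqlDataAdapter("SELECT * FROM Authors", connectionString))
{
    adapter.Fill(table); // Fill opens and closes the connection itself
}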

Background

I’ve been interested in the debate between DataReaders and DataSets ever since .NET’s first preview was made available. The conventional wisdom has always held that DataReaders are the way to go, offering the smallest memory footprint and the fastest access to the data. I don’t dispute these points, but DataReaders have always been a dangerous tool to use, especially in an N-Tier application in which an open DataReader is passed up from the data access layer to a business or user interface layer, delegating the responsibility for closing the DataReader to those layers. I have personally been burned by the effects of unclosed DataReaders on a busy site, so for a long time I was rather religiously against the use of DataReaders unless they were opened and closed within the same method.
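
To make the danger concrete, here is a sketch of the risky pattern (names are illustrative; _connectionString is assumed to be a field on the data access class):

// The data layer opens a connection and hands back a live reader.
public static SqlDataReader GetAuthors()
{
    SqlConnection conn = new SqlConnection(_connectionString);
    SqlCommand cmd = new SqlCommand("SELECT * FROM Authors", conn);
    conn.Open();
    // Nothing here guarantees cleanup: if the caller forgets to Close()
    // the reader and its connection, the connection is never returned
    // to the pool, and under load the pool is eventually exhausted.
    return cmd.ExecuteReader();
}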

More recently, I’ve come across a few techniques that make using DataReaders across tiers safe.  One such technique is detailed in Teemu Keiski’s article.  Another is found in the opening pages of Steven Metsker’s Design Patterns in C#.  Both of these techniques take advantage of delegates to enable a DataReader to be accessed from a higher tier in the application while still forcing control of that DataReader to pass through the data access layer method prior to its destruction (allowing for proper cleanup).  In this way, it is possible to pass an open DataReader from one tier to another without the risks that would otherwise be involved.
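
In outline, the delegate-based pattern looks like the following sketch (names are illustrative; the "Borrowed reader delegate" comment below shows a complete version of the same idea):

public delegate void ReaderAction(IDataReader reader);

public static void WithAuthorsReader(ReaderAction action)
{
    using (SqlConnection conn = new SqlConnection(_connectionString))
    using (SqlCommand cmd = new SqlCommand("SELECT * FROM Authors", conn))
    {
        conn.Open();
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            action(reader); // the higher tier consumes the reader here
        } // the reader and connection are disposed even if action throws
    }
}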

Just how great is the risk, and how large is the problem, when a DataReader is accidentally left open?  That was another question I sought to answer when I began this testing.  I had seen the empirical effects on one of my sites, but I didn’t know the exact effects in a controlled environment.  I was quite surprised to see just how devastating an effect such a simple error could have on a busy site.



User Comments

Title: finding open datareaders   
Name: Jeff
Date: 2007-12-03 10:54:30 AM
Comment:
What is the best way to find open DataReaders throughout a larger web application (those DataReaders not closed or disposed)? Thanks.
Title: Borrowed reader delegate need not return anything   
Name: borrower
Date: 2007-06-17 10:44:34 PM
Comment:
public delegate void BorrowReader(IDataReader reader);

public static void LendAuthorsReader(BorrowReader borrower)
{
    using (SqlConnection conn = new SqlConnection(_connectionString))
    {
        using (SqlCommand cmd = new SqlCommand("SELECT * FROM Authors", conn))
        {
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                // Invoke the borrower delegate directly inside the using()
                // blocks and let IDisposable do its thing on the way out.
                borrower(reader);
            }
        }
    }
}
Title: Prefer disconnected access   
Name: JNSSoft
Date: 2007-05-24 3:41:50 AM
Comment:
I always use DataSets and DataTables because of their disconnected behavior. I don't use DataReaders.
Title: Delegate for DataReaders   
Name: Varangian
Date: 2007-03-08 5:58:04 AM
Comment:
Perhaps I didn't express myself properly... what about using CommandBehavior.CloseConnection? It's an enum value that closes the underlying connection once the DataReader is closed... it's basically the same as, and simpler than, using delegates... I would like to have your view on this!

Thanks!
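
For reference, the approach Varangian describes looks roughly like this sketch (names illustrative). Note that the caller must still remember to close the reader; CommandBehavior.CloseConnection only ties the connection's lifetime to the reader's, which is the weakness Steven addresses in his reply below.

public static SqlDataReader GetAuthorsReader()
{
    SqlConnection conn = new SqlConnection(_connectionString);
    SqlCommand cmd = new SqlCommand("SELECT * FROM Authors", conn);
    conn.Open();
    // Closing the reader now closes the underlying connection too --
    // but nothing forces the caller to close the reader.
    return cmd.ExecuteReader(CommandBehavior.CloseConnection);
}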
Title: Why a delegate   
Name: Steven Smith
Date: 2007-02-28 11:56:19 AM
Comment:
Read Teemu's article about datareaders and delegates. You *can* just pass back an open datareader and hope/pray that the calling function is written such that it closes it properly, even in the event of an error. But that's just asking for problems. It's far safer to ensure that it is closed in the function that opens it, and the only way to achieve this with a datareader is by using a delegate.
Title: Delegate for DataReaders   
Name: Varangian
Date: 2007-02-28 4:19:42 AM
Comment:
I didn't quite understand why you made use of the delegate to properly dispose of the DataReader.

Wouldn't it be enough if the method returned the DataReader and the caller then closed it?

Can you explain why you need to make use of the delegate?
Title: on caching DataTables...   
Name: Willem
Date: 2006-05-04 1:14:50 PM
Comment:
I just found an interesting problem with caching DataTables: we only cache static data; however, we do create DataViews on the DataTables. Apparently, when you create a DataView, .NET rebuilds the internal index. When you use cached DataTables (and lots of users), .NET can get confused about the internal index and you get the following error: "DataTable internal index is corrupted: '5'." The only workaround I have found so far is using DataTable.Copy() as suggested above...
Title: Comment   
Name: JK
Date: 2005-11-03 12:14:51 AM
Comment:
Good one
Title: Fair comparison   
Name: Steven Smith
Date: 2005-05-24 3:12:38 PM
Comment:
Brian,
Do you know of another way to use the DataReader than to loop through its contents? It's 'fair' in that both techniques are doing the same work (the user sees the same result in each case). If you know of a more efficient way to give the user the same results using a DataReader, then by all means share it. I realize that readers and tables have different implementations -- that's largely the point of the article.
Title: Good Article but   
Name: Brian O'Connell
Date: 2005-05-24 3:07:57 PM
Comment:
Is it a fair comparison to loop through all records in a DataReader compared to accessing a property of the DataTable? Just wondering.
Title: Great Thing to Know
Name: Ashish Patel
Date: 2005-04-13 7:18:30 AM
Comment:
I really found this article interesting. I have been working with .NET for the last 6 months.
Title: Re: Datatable caching   
Name: Ian Cox
Date: 2005-04-08 5:17:38 AM
Comment:
Interesting comments. I think you are both correct that another method should be used to update data and that the cached data should always be read-only.
In the system I work on, historically everything was done with typed DataSets, so when we came to implement caching, the natural thing to do was to cache the static DataSets. Then we implemented caching on dynamic data as well, using a SQL Server custom extended stored proc and a trigger to drop a file into a directory, which in turn caused ASP.NET to clear the item from cache.
Without time to re-architect the middle tier, we ended up having to copy dynamic items out of cache to prevent concurrency problems.
Anyway, this is drifting off the point of your excellent article, Steven. Thanks for your good work!
Title: Copying?   
Name: Steven Smith
Date: 2005-04-07 10:52:49 AM
Comment:
Ian/David,
Normally what I do is what David suggested -- use the DataTable in Cache for read-only purposes and send updates via another channel, typically through direct SQL statements. You will find that for a busy application, having a cache duration of 1 second yields significant perf gains while ensuring that any users acting on 'old' data are acting on data that is, at most, 1 second old. If I were building a system for an environment where it was critical that users be notified ASAP when other users changed data they were dealing with, I would either build a smarter singleton business object and have all reads and writes go through it, or, if possible, I would build it in ASP.NET v2 and use SQL Cache Invalidation.
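
In code, the pattern described above might look like this sketch (the cache key and loader method are illustrative):

public static DataTable GetProducts()
{
    DataTable products = (DataTable)HttpRuntime.Cache["Products"];
    if (products == null) // expired or never populated
    {
        products = LoadProductsFromDatabase(); // hypothetical data access call
        HttpRuntime.Cache.Insert("Products", products, null,
            DateTime.Now.AddSeconds(1), // users see data at most 1 second old
            System.Web.Caching.Cache.NoSlidingExpiration);
    }
    return products;
}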
Title: And of course....   
Name: David V. Corbin
Date: 2005-04-07 8:47:21 AM
Comment:
1) As mentioned earlier: measure caching the custom objects that are created at the business layer. $(Insert large amount of money here) says that will be the true winner.

2) The point (in the comments) about needing to copy the data [if it is being modified] to provide transaction isolation is only one way to accomplish the goal. You can simply NOT modify the data at all and post changes back through a different path, but you DO need to do SOMETHING to prevent users from seeing others' (possibly temporary) changes.

3) Implementing IDisposable (again from the comments) does NOTHING to help the unclosed reader. You can NOT ENFORCE that the user will call Dispose.
Title: Re: DataTable Caching   
Name: Ian Cox
Date: 2005-04-07 6:07:09 AM
Comment:
My query about DataTable.Copy() was with regard to getting the data out of the cache, not expiring that data. Let me try to give an example:
Product data can be updated by a small number of different users.
It doesn't change much, so it is cached and expired when an update is made.
User 1 gets the product DataTable from cache (without DataTable.Copy()).
User 1 modifies items in the DataTable but has not yet saved them to the database.
User 2 needs to get the product data for some other purpose. This also comes out of cache.

The problem is that User 2 can see User 1's modifications because they are both looking at the same in-memory copy of the DataTable. To get around this issue, a DataTable.Copy() call would create a separate in-memory copy for each user.

I was just interested to know how this performed in relation to the other methods.

Cheers!
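
A minimal sketch of the copy-on-read approach Ian describes (the cache key is illustrative):

DataTable shared = (DataTable)HttpRuntime.Cache["Products"];
DataTable isolated = shared.Copy(); // each user modifies a private copy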
Title: IDisposable   
Name: Wesley
Date: 2005-04-07 4:01:05 AM
Comment:
Why not let the Data class implement IDisposable and, on Dispose, close the open reader and connection?

That's the way I do it, and as far as I can see this does a perfect job... am I overlooking something?

Cheers,
Wes
Title: Custom object comparison   
Name: Sharbel
Date: 2005-04-06 8:54:16 PM
Comment:
Nice article. It would have been interesting if you had also compared cached/uncached custom objects. We develop all our non-trivial applications with custom objects, so instead of databinding a grid to a DataReader or a DataSet, we bind to our custom objects. The overhead of a custom object that inherits from CollectionBase should be less than that of a DataSet/DataTable, so I would have liked to have seen some comparisons on that.

Again, good article.
Title: DataTable Caching   
Name: Steven Smith
Date: 2005-04-06 4:43:48 PM
Comment:
There's no need for DataTable.Copy() that I know of. Whenever the cache expires, a brand new DataTable is added to the cache. I don't normally overwrite live DataTables - I normally check to see if the cache entry is null (expired), and only then do I repopulate it.
Title: Datasets over DataTables   
Name: Sean Crouch
Date: 2005-04-06 4:41:29 PM
Comment:
Hi,

Great article.

I have been struggling with which to use for a while and have now settled on using DataSets, after nasty connection pool problems caused by using DataReaders (badly!).

Do you have any view on what the extra overhead of a dataset is over a datatable, if any?

Thanks
Sean.
Title: Programmer Analyst   
Name: Prodip K. Saha
Date: 2005-04-06 4:01:19 PM
Comment:
Steven,
Indeed, it is a very informative article. I take your point on the unclosed DataReader; the difference is dramatic even within the same environment. You are absolutely right that there are so many variations between applications and architectures, and those can significantly alter the performance.

I hope to see a similar analysis between a DataTable and a DataReader (a closed DataReader serialized into a class) with thousands of records.

Keep up the good work for the .NET community.

Thanks,
Prodip
http://www.aspnet4you.com
Title: Interesting...also...   
Name: Ian Cox
Date: 2005-04-06 1:04:54 PM
Comment:
Good article.
Related to DataTable caching: if you are caching non-static data (and using some mechanism for flushing the cache when the data does change), then you will have to do a DataTable.Copy() to get your DataTable from cache (otherwise users will be looking at the same in-memory copy). The DataTable.Copy() function is fairly slow. It would be interesting to see how cached DataTables compare to other methods when the retrieval requires you to copy the DataTable.
Title: Comment   
Name: Parth
Date: 2005-02-19 2:37:28 AM
Comment:
Best
Title: Thanks, I've been wondering   
Name: Steve Sharrock
Date: 2005-02-16 9:36:17 PM
Comment:
I've been working with "gut feel" for the past few years, and it's nice to see some stats on this topic. I've only done the most rudimentary tests, and you've gone beyond that. I agree that each architecture/implementation might need to be tested, but this is a good starting point.