.NET Data Access Performance Comparison
by Steven Smith

The Results

The overall winner was the cached DataTable, with an average of 132 requests per second.  The DataReader method, whether used via a delegate or closed properly by the calling function, averaged about 112 requests per second.  The uncached DataTable averaged 99 requests per second.  The real surprise to me was the unclosed DataReader.  I knew it would have a negative impact on performance, but even I (who had long considered DataReaders ‘evil’ for this very reason, before learning of the delegate approach) didn’t expect it to be this bad.  The unclosed DataReader averaged just 7 requests per second.

Another important consideration for web performance is the per-request time, measured here as TTLB, or Time To Last Byte: how long it took from when the request was made until the last byte of the response was sent to the client.  The TTLB figures tracked the requests per second directly in this case, with the cached DataTable taking about 8ms, the DataReaders about 30ms, the uncached DataTable about 36ms, and the unclosed DataReader averaging nearly 662ms.

A summary of the results is listed below.  What the summary doesn’t show, but which is also worth noting, is that while all the other tests sustained more or less constant requests per second for the duration of the test, the unclosed DataReader behaved erratically.  It would run for several seconds at 20 or more requests per second, then simply hang, processing no requests at all for 15 or 20 seconds at a time.  (Most likely this behavior is due to the connection pool being exhausted and not releasing connections until they time out; however, I have not proven this to be the case.)  It was also the only test that resulted in HTTP errors, recording 90 of them during the 5-minute test run, which included 2,141 requests.
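For illustration, the "left open" scenario corresponds to a data access method that hands back a live reader and relies on the caller to clean up.  This is only a sketch, not the actual test harness code; the class, method, query, and connection string are all illustrative:

using System.Data.SqlClient;

public class LeakyAuthorsPage
{
    // Placeholder connection string -- not the article's test configuration.
    private const string ConnString = "...";

    // The helper hands back a live reader and never closes anything itself.
    private SqlDataReader GetAuthorsReader()
    {
        SqlConnection conn = new SqlConnection(ConnString);
        SqlCommand cmd = new SqlCommand("SELECT * FROM Authors", conn);
        conn.Open();
        return cmd.ExecuteReader(); // the connection is still open when this returns
    }

    // The "left open" caller: no Close() or Dispose() on the reader or connection.
    public void BindAuthors()
    {
        SqlDataReader reader = GetAuthorsReader();
        while (reader.Read())
        {
            // bind or copy values here
        }
        // The pooled connection is never returned to the pool.  Under load the
        // pool (100 connections by default) empties out, new requests block until
        // a connection times out, and throughput collapses in bursts -- consistent
        // with the erratic behavior described above.
    }
}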

Scenario                                Req/s    TTLB (ms)
Cached DataTable                          132        8.34
Unclosed DataReader Using Delegate        113       30.13
Closed DataReader Using Delegate          112       30.26
DataReader Closed by Client               112       30.76
DataTable                                  99       35.97
DataReader Left Open by Client              7      661.88
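
For context, the winning cached DataTable scenario amounts to keeping the filled table in the ASP.NET cache and only hitting the database when the entry has expired.  The following is a minimal sketch of that pattern, not the article's test code; the cache key, query, expiration, and connection string are assumptions:

using System;
using System.Data;
using System.Data.SqlClient;
using System.Web;
using System.Web.Caching;

public static class AuthorsCache
{
    // Both the cache key and the connection string are illustrative.
    private const string CacheKey = "AuthorsTable";
    private const string ConnString = "...";

    public static DataTable GetAuthors()
    {
        // Use the cached copy if it exists; otherwise query and repopulate.
        DataTable authors = HttpRuntime.Cache[CacheKey] as DataTable;
        if (authors == null)
        {
            authors = new DataTable();
            using (SqlConnection conn = new SqlConnection(ConnString))
            using (SqlDataAdapter adapter =
                new SqlDataAdapter("SELECT * FROM Authors", conn))
            {
                adapter.Fill(authors); // Fill opens and closes the connection itself
            }
            // A short absolute expiration is enough to eliminate most database
            // round trips under load.
            HttpRuntime.Cache.Insert(CacheKey, authors, null,
                DateTime.Now.AddSeconds(30), Cache.NoSlidingExpiration);
        }
        return authors;
    }
}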



User Comments

Title: finding open datareaders   
Name: Jeff
Date: 2007-12-03 10:54:30 AM
Comment:
What is the best way to find open DataReaders throughout a larger web application (those DataReaders not closed or disposed)? Thanks.
Title: Borrowed reader delegate need not return anything   
Name: borrower
Date: 2007-06-17 10:44:34 PM
Comment:
public delegate void BorrowReader(IDataReader reader);

public static void LendAuthorsReader(BorrowReader borrower) {
    using (SqlConnection conn = new SqlConnection(_connectionString)) {
        using (SqlCommand cmd = new SqlCommand("SELECT * FROM Authors", conn)) {
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader()) {
                borrower(reader); // invoke borrower delegate directly inside another using() clause and let IDisposable do its thing
            }
        }
    }
}
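
A caller might then look something like this (a sketch that assumes the LendAuthorsReader method above; the column name and Response.Write are illustrative):

// Borrow the reader for the duration of the anonymous delegate only.
LendAuthorsReader(delegate(IDataReader reader)
{
    while (reader.Read())
    {
        Response.Write(reader["au_lname"] + "<br/>");
    }
});
// By the time execution reaches this line, the reader, command, and
// connection have all been disposed by the using blocks above.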
Title: Prefer disconnected access   
Name: JNSSoft
Date: 2007-05-24 3:41:50 AM
Comment:
I always use DataSets and DataTables because of their disconnected behavior. I don't use DataReaders.
Title: Delegate for DataReaders   
Name: Varangian
Date: 2007-03-08 5:58:04 AM
Comment:
Perhaps I didn't describe myself properly... what about using CommandBehavior.CloseConnection? It's an enum value that closes the underlying connection once the DataReader is closed... it's basically the same as using delegates, but simpler... I would like to have your view on what I said!

Thanks!
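
For reference, the CommandBehavior.CloseConnection approach described above ties the connection's lifetime to the reader, roughly as in this minimal sketch (class, method, and query names are illustrative):

using System.Data;
using System.Data.SqlClient;

public static class AuthorsData
{
    // Closing or disposing a reader created with CommandBehavior.CloseConnection
    // also closes the underlying connection.
    public static IDataReader GetAuthorsReader(string connectionString)
    {
        SqlConnection conn = new SqlConnection(connectionString);
        SqlCommand cmd = new SqlCommand("SELECT * FROM Authors", conn);
        conn.Open();
        return cmd.ExecuteReader(CommandBehavior.CloseConnection);
    }

    public static void Caller(string connectionString)
    {
        // The caller must still remember the using block (or try/finally);
        // forget it and you are back in the "left open" scenario measured above.
        using (IDataReader reader = GetAuthorsReader(connectionString))
        {
            while (reader.Read()) { /* consume rows */ }
        }
    }
}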
Title: Why a delegate   
Name: Steven Smith
Date: 2007-02-28 11:56:19 AM
Comment:
Read Teemu's article about datareaders and delegates. You *can* just pass back an open datareader and hope/pray that the calling function is written such that it closes it properly, even in the event of an error. But that's just asking for problems. It's far safer to ensure that it is closed in the function that opens it, and the only way to achieve this with a datareader is by using a delegate.
Title: Delegate for DataReaders   
Name: Varangian
Date: 2007-02-28 4:19:42 AM
Comment:
I didn't quite understand why you made use of the delegate to properly dispose of the DataReader.

If the method returns the DataReader and the caller then closes it, wouldn't that be enough?

Can you explain why you need to make use of the Delegate?
Title: on caching DataTables...   
Name: Willem
Date: 2006-05-04 1:14:50 PM
Comment:
Just found an interesting problem with caching DataTables: we only cache static data, however, we do create DataViews on the DataTables. Apparently when you create a DataView, .Net rebuilds the internal index. When you use cached DataTables (and lots of users), .Net can get confused about the internal index and you get the following error: "DataTable internal index is corrupted: '5'." The only workaround I found so far is using the DataTable.Copy() as suggested above...
Title: Comment   
Name: JK
Date: 2005-11-03 12:14:51 AM
Comment:
Good one
Title: Fair comparison   
Name: Steven Smith
Date: 2005-05-24 3:12:38 PM
Comment:
Brian,
Do you know of another way to use the DataReader than to loop through its contents? It's 'fair' in that both techniques are doing the same work (the user sees the same result in each case). If you know of a more efficient way to give the user the same results using a DataReader, then by all means share it. I realize that readers and tables have different implementations -- that's largely the point of the article.
Title: Good Article but   
Name: Brian O'Connell
Date: 2005-05-24 3:07:57 PM
Comment:
Is it a fair comparison to loop through all records in a DataReader compared to accessing a property of the DataTable? Just wondering.
Title: Great thing to know
Name: Ashish Patel
Date: 2005-04-13 7:18:30 AM
Comment:
I really found this article interesting. I have been working with .NET for the last 6 months.
Title: Re: Datatable caching   
Name: Ian Cox
Date: 2005-04-08 5:17:38 AM
Comment:
Interesting comments. I think you are both correct that another method should be used to update data and that the cached data should always be read-only.
In the system I work on, historically everything was done with typed DataSets, so when we came to implement caching the natural thing to do was cache the static DataSets. Then we implemented caching on dynamic data as well, using a SQL Server custom extended stored proc and a trigger to drop a file into a directory, which in turn caused ASP.NET to clear the item from cache.
Without time to re-architect the middle tier we ended up having to copy dynamic items out of cache to prevent concurrency problems.
Anyway, this is drifting off the point of your excellent article Steven. Thanks for your good work!
Title: Copying?   
Name: Steven Smith
Date: 2005-04-07 10:52:49 AM
Comment:
Ian/David,
Normally what I do is what David suggested -- use the DataTable in Cache for read-only purposes and send updates via another channel, typically through direct SQL statements. You will find that for a busy application, having a cache duration of 1 second yields significant perf gains while ensuring that any users acting on 'old' data are acting on data that is, at most, 1 second old. If I were building a system for an environment where it was critical that users be notified ASAP when other users changed data they were working with, I would either build a smarter singleton business object and have all reads and writes go through it, or, if possible, build it in ASP.NET v2 and use SQL Cache Invalidation.
Title: And of course....   
Name: David V. Corbin
Date: 2005-04-07 8:47:21 AM
Comment:
1) As mentioned earlier... measure caching the custom objects that are created at the business layer... $(insert large amount of money here) says that will be the true winner.

2) The point (in the comments) about needing to copy the data [if it is being modified] to provide transaction isolation is only one way to accomplish the goal. You can simply NOT modify the data at all and post changes back through a different path, but you DO need to do SOMETHING to prevent users from seeing others' (possibly temporary) changes.

3) Implementing IDisposable (again from the comments) does NOTHING to help the unclosed reader. You can NOT ENFORCE that the user will call Dispose.
Title: Re: DataTable Caching   
Name: Ian Cox
Date: 2005-04-07 6:07:09 AM
Comment:
My query about DataTable.Copy() was with regard to getting the data out of the cache, not expiring that data. Let me try to give an example:
Product data can be updated by a small number of different users.
It doesn't change much so is cached and expired when an update is made.
User 1 gets the product datatable from cache (without DataTable.Copy())
User 1 modifies items in the datatable but has not yet saved them to the database
User 2 needs to get the product data for some other purpose. This also comes out of cache.

The problem is that User 2 can see User 1's modifications because they are both looking at the same in-memory copy of the datatable. To get around this issue a DataTable.Copy() would create a separate in-memory copy for each user.

I was just interested to know how this performed in relation to the other methods.

Cheers!
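
The copy-on-read approach described above amounts to something like this sketch, where GetAuthors is a hypothetical helper standing in for whatever method returns the shared cached table:

// Hand each user a private copy so edits never touch the shared cached table.
public static DataTable GetAuthorsForEditing()
{
    DataTable shared = GetAuthors();   // shared, cached instance (hypothetical helper)
    return shared.Copy();              // copies structure and data; relatively slow
}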
Title: IDisposable   
Name: Wesley
Date: 2005-04-07 4:01:05 AM
Comment:
Why not let the Data class implement IDisposable and, on Dispose, close the open reader and connection?

That's the way I do it and as far as I can see this does a perfect job... am I overlooking something?

Cheers,
Wes
Title: Custom object comparison   
Name: Sharbel
Date: 2005-04-06 8:54:16 PM
Comment:
Nice article. It would have been interesting if you had also compared cached and uncached custom objects. We develop all our non-trivial applications with custom objects, so instead of databinding a grid to a DataReader or a DataSet, we bind to our custom objects. The overhead of a custom object that inherits from CollectionBase should be less than a DataSet/DataTable, so I would have liked to have seen some comparisons on that.

Again, good article.
Title: DataTable Caching   
Name: Steven Smith
Date: 2005-04-06 4:43:48 PM
Comment:
There's no need for DataTable.Copy() that I know of. Whenever the cache expires, a brand new DataTable is added to the cache. I don't normally overwrite live DataTables - I normally check to see if the cache entry is null (expired), and only then do I repopulate it.
Title: Datasets over DataTables   
Name: Sean Crouch
Date: 2005-04-06 4:41:29 PM
Comment:
Hi,

Great article.

I have been struggling with which to use for a while and have now settled on using datasets after nasty connection pool problems using (badly!) datareaders.

Do you have any view on what the extra overhead of a dataset is over a datatable, if any?

Thanks
Sean.
Title: Programmer Analyst   
Name: Prodip K. Saha
Date: 2005-04-06 4:01:19 PM
Comment:
Steven,
Indeed, it is a very informative article. I take your point on the unclosed DataReader; the difference is way off even within the same environment. You are absolutely right that there are so many variations between applications and architectures, and those can significantly alter the performance.

I hope to see similar analysis between DataTable and DataReader (a closed DataReader serialized into a class) with thousands of records.

Keep up the good work for the .NET community.

Thanks,
Prodip
http://www.aspnet4you.com
Title: Interesting...also...   
Name: Ian Cox
Date: 2005-04-06 1:04:54 PM
Comment:
Good article.
Related to datatable caching, if you are caching non-static data (and using some mechanism for flushing the cache when the data does change) then you will have to be doing a DataTable.Copy() in order to get your datatable from cache (otherwise users will be looking at the same in-memory copy). The DataTable.Copy() function is fairly slow. It would be interesting to see how cached datatables compare to other methods when the retrieval requires you to copy the datatable.
Title: Comment   
Name: Parth
Date: 2005-02-19 2:37:28 AM
Comment:
Best
Title: Thanks, I've been wondering   
Name: Steve Sharrock
Date: 2005-02-16 9:36:17 PM
Comment:
I've been working with "gut feel" for the past few years, and it's nice to see some stats on this topic. I've only done the most rudimentary tests, and you've gone beyond that. I agree that each architecture/implementation might need to be tested, but this is a good starting point.





