For the final test, both reads and writes were cached to optimize performance. Naturally, this configuration should have the best performance of the three scenarios we’ve examined (no caching, read caching, and read-write caching). For this test, I enabled batch updates (write caching) in addition to the read caching from the previous test scenario, set both caching durations to 10 seconds, and ran the test on the same two system configurations as before (remote database and local database). Figure 11 shows a graph of the results; the blue line (1) indicates the setup with the remote database.
Figure 11 – Test Results With Read and Write Caching Enabled
As you can see, the difference between the two configurations is almost zero, and the total requests per second are significantly higher for both configurations (though the improvement is much larger for the remote database option, which had the most round-trip latency to eliminate).
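The read-plus-write caching described above amounts to a read cache with a time-based expiration combined with write-behind (batched) updates. The following is a minimal sketch of that pattern in Python, not the article’s actual ASP.NET implementation: the class, method names, and the dict-like backing store standing in for the database are all hypothetical, and the 10-second periods mirror the test settings.

```python
import time


class ReadWriteCache:
    """Toy read cache with batched (write-behind) updates.

    Reads are served from memory for `ttl` seconds before going back to
    the backing store; writes are buffered and pushed to the store in a
    single batch once `flush_interval` seconds have elapsed.
    """

    def __init__(self, backing_store, ttl=10.0, flush_interval=10.0,
                 clock=time.monotonic):
        self.store = backing_store      # dict-like stand-in for the database
        self.ttl = ttl
        self.flush_interval = flush_interval
        self.clock = clock              # injectable for testing
        self._cache = {}                # key -> (value, cached_at)
        self._pending = {}              # buffered writes, last write wins
        self._last_flush = clock()

    def get(self, key):
        now = self.clock()
        hit = self._cache.get(key)
        if hit is not None and now - hit[1] < self.ttl:
            return hit[0]               # fresh hit: no database round trip
        # Stale or missing: prefer a buffered write, else hit the store.
        value = self._pending.get(key, self.store.get(key))
        self._cache[key] = (value, now)
        return value

    def set(self, key, value):
        now = self.clock()
        self._pending[key] = value      # buffer instead of writing through
        self._cache[key] = (value, now)  # keep local reads consistent
        if now - self._last_flush >= self.flush_interval:
            self.flush()

    def flush(self):
        self.store.update(self._pending)  # one batched round trip
        self._pending.clear()
        self._last_flush = self.clock()
```

This is why the Batch Requests/sec numbers drop so sharply in this scenario: most reads never reach the database, and many writes collapse into a single periodic batch. The trade-off, of course, is that the backing store can be up to one flush interval behind the application’s view of the data.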
Note: The remote configuration tended to show regular drops in requests per second in every scenario tested (most noticeable in this last graph). I’m not sure what caused them, but since the behavior was consistent across all of my tests and each drop lasted only a few seconds, I concluded that the cause was unrelated to my caching and would not significantly affect my statistics. Based on the performance counters I collected, I couldn’t tie the drops to garbage collection or to the connection pool running low.
Test 1 (blue, remote database) averaged 113.1 ASP.NET requests/second; Test 2 (local database) averaged 118.52. In Test 1, the web server’s CPU averaged 69.62%. In Test 2, where the web server also hosted the database, CPU averaged 72.9%. Average Time to Last Byte (TTLB) for each whole page was 3.95ms in Test 1 and just 3.52ms in Test 2.
Looking at the database, the remote database in Test 1 averaged 1.24 Batch Requests/sec, while the local database in Test 2 averaged 0.6. I’m not sure why there is such a discrepancy between the two, but both figures are far lower than in the previous tests.