Published: Sunday, February 03, 2002

ConvertRsToXml, A Case Study of Timing and Optimizing ASP/ADO Code, Part 2

By Michael Balloni

  • Read Part 1

  • In Part 1 we looked at how to use a more sophisticated timing mechanism to test the performance of code in an ASP page, and examined the results of the various ConvertRSToXML() methods on a server with no load. In this part we'll examine the results of those tests on a server under load, and look at why some methods took longer than others.


    Testing the ConvertRSToXML() Methods Under Load
    Under load (with two other copies of the same test running on the same server), these are the results I got:

    Scope              Total Time (ms)   Total Hits   Time Per Hit (ms/hit)
    -----------------  ---------------   ----------   ---------------------
    firehose_orig               621.77            1                  621.77
    firehose_july               835.39            1                  835.39
    firehose_v2                  95.07            1                   95.07
    firehose_v3                  67.08            1                   67.08
    firehose_v3_vb               85.07            1                   85.07
    firehose_save               148.31            1                  148.31
    firehose_save_xml            92.43            1                   92.43
    static_orig                 622.85            1                  622.85
    static_july               2,064.59            1                2,064.59
    static_v2                 1,352.82            1                1,352.82
    static_v3                   571.12            1                  571.12
    static_v3_vb              1,753.13            1                1,753.13
    static_save               1,003.98            1                1,003.98
    static_save_xml           2,169.92            1                2,169.92
    client_orig                 896.53            1                  896.53
    client_july                 605.36            1                  605.36
    client_v2                    74.19            1                   74.19
    client_v3                    67.79            1                   67.79
    client_v3_vb                 38.19            1                   38.19
    client_save               1,592.06            1                1,592.06
    client_save_xml              85.27            1                   85.27

    Each of the tests results in a variable (strXml) being filled with XML content generated from, and textually equivalent to, the recordset variable.  This isn't quite fair to the later versions of ConvertRsToXml, since they hand back an XMLDOM object rather than a string, and it's not fair to the Recordset's Save methods, since they just save the XML and don't hand it back at all.  But it does force each method to compute the XML hierarchy and then "render" it in some way, so in that sense it makes for a level playing field.

    In the under-load numbers you can see that the static cursor versions don't fare so well with three hungry clients pawing at the server.  Remember that a server-side static cursor says to the server, "make copies of all the rows in the resultset so that I can run around them without seeing other people's changes," and this requires a lot more work than either of the other two cursor methods.  You can also see file-system woes in some of the _save numbers.
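    The cursor-type difference comes down to how the recordset is opened.  A minimal sketch of the two server-side variants, assuming an existing connection objConn and query string strSql (both placeholder names), and the usual ADO constants from adovbs.inc:

    ```vbscript
    ' Firehose: forward-only, read-only, server-side -- the ADO default.
    ' The server feeds rows one at a time as the client consumes them.
    Set objRsFirehose = Server.CreateObject("ADODB.Recordset")
    objRsFirehose.Open strSql, objConn, adOpenForwardOnly, adLockReadOnly

    ' Static: the server copies every row in the resultset up front so the
    ' client can scroll over a stable snapshot -- much more work under load.
    Set objRsStatic = Server.CreateObject("ADODB.Recordset")
    objRsStatic.CursorLocation = adUseServer
    objRsStatic.Open strSql, objConn, adOpenStatic, adLockReadOnly
    ```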

    I didn't have the Northwind database installed, so I made use of a development version of the Streamload database, all of which is encapsulated within a GetRs method.  (Streamload is the company I work for; I ran my tests against our database.)  The recordset consisted of 100 rows of a SELECT TOP 100 * FROM SomeWideTable WITH (NOLOCK) query.  (The performance would likely be the same with or without the NOLOCK present, although I didn't test it without the hint; I imagine that under load it would do a bit worse due to SQL's read locks.  I did test it with other cursor types and lock options, but the lock hint in the query overrides whatever ADO would ask for.  For more information on NOLOCK be sure to read an earlier article of mine, SQL Server Lock Contention Tamed: The Joys Of NOLOCK and ROWLOCK.)  I found that keyset and dynamic cursors generated numbers very similar to those of the firehose and client cursors, probably because no rows needed to be locked.  This just goes to show that if you insist on optimizing your scripts to all use "<insert your favorite cursor type here>", even the worst cursor type may do no worse than what you'd get otherwise.  So, again: don't optimize it if it doesn't need it.
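    The GetRs wrapper itself isn't shown in this article; a minimal stand-in might look like the following, assuming a placeholder connection string strConn (the function and variable names here are illustrative, not the actual Streamload code):

    ```vbscript
    Function GetRs()
        Dim objConn, objRs
        Set objConn = Server.CreateObject("ADODB.Connection")
        objConn.Open strConn  ' strConn is a placeholder connection string

        Set objRs = Server.CreateObject("ADODB.Recordset")
        ' The NOLOCK table hint overrides whatever lock type ADO would request.
        objRs.Open "SELECT TOP 100 * FROM SomeWideTable WITH (NOLOCK)", _
                   objConn, adOpenForwardOnly, adLockReadOnly
        Set GetRs = objRs
    End Function
    ```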

    The VB incarnation of the rogue v3 ASP version made use of typed variables, but was otherwise equivalent to the ASP method (view the source code for my version of ConvertRStoXML).  Its performance was darn near the best under no load, the best under load, and substantially better than the equivalent ASP version.

    The time taken by the Recordset's Save-to-disk method is mostly attributable to disk I/O, and is an example of why you should avoid using the file system for this type of work unless you're extremely tight on memory.

    The Recordset's Save-to-XMLDOM in conjunction with client-side cursors is the winner by a hair under no load, and off by only a little under load, and its output was the smallest: 40K, vs. 60K for v2/v3 and 80K for the original (50-100% larger).  However, its output is not directly usable for XSL transformations and the like.  Still, if you just need to cache recordsets and you don't want to have to use a VB component to do the job, it may be the way to go.
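    For reference, persisting a recordset straight into a DOM object takes only a couple of lines.  A sketch, assuming objRs is an open recordset (adPersistXML is the standard ADO persist-format constant, value 1):

    ```vbscript
    ' Save the recordset into an XMLDOM document rather than to disk.
    Set objXmlDom = Server.CreateObject("MSXML2.DOMDocument")
    objRs.Save objXmlDom, adPersistXML
    strXml = objXmlDom.xml  ' ADO's rowset schema, not arbitrary XML
    ```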

    The slight benefit of client-side cursors was a surprise to me.  I'd heard they were good, but I always figured that the firehose was the way to go, since it seems so much simpler.  The benefit of the client-side cursor is that the rows are quickly fetched from the database into the client's memory (in this case ASP/IIS's) by ADO, making the data exchange with the server quick and painless, whereas with the firehose the server has to spoonfeed the client rows at whatever rate the client can consume them.  Consider a Do Until objRS.EOF loop in an unbuffered ASP script, where each row may be held for a second or more because the whole operation is bound by the end-(modem-)user's connection.  The downside of client-side cursors is the memory usage, but hey, memory's cheap, right?
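    Switching to a client-side cursor is a one-property change.  A sketch, again assuming placeholder objConn and strSql variables:

    ```vbscript
    ' Client-side cursor: ADO pulls all the rows into IIS's memory in one
    ' gulp, so the database is done with the request while the script still
    ' walks the rows at its own pace.
    Set objRs = Server.CreateObject("ADODB.Recordset")
    objRs.CursorLocation = adUseClient
    objRs.Open strSql, objConn, adOpenStatic, adLockReadOnly
    Set objRs.ActiveConnection = Nothing  ' optional: disconnect the recordset
    ```

    (With adUseClient the cursor type is effectively static regardless of what you request; the memory cost is the trade-off discussed above.)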

    The Rogue ASP and VB Versions
    David's optimizations to the original versions of the method were substantial.  He made use of one XML "row" object which he clones for each output row and simply sets the text into, and he made use of rs.GetRows to get all the recordset data in one fell swoop.  I took this version and moved all the recordset code to before the XML code, so that the recordset could be cleaned up entirely before any XML resources were invested in.  This was an optimization intended to improve under-load performance by reducing the amount of resources in use at any one time.  I also moved the cloning of the template XML row into a loop which stores the blank rows in an array, basically flattening and simplifying the original for-loops.  These changes made for a ~10% speedup.  Moving the code into VB made for a 100% speedup.  This is exactly what COM and ASP are good for: hopping development environments to achieve fundamentally better performance, debugging, what have you.  Once in VB, I can use a VB profiler like Aivosto's VB Watch (see www.aivosto.com) to figure out exactly what could still use optimizing.
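    The shape of those optimizations can be sketched in outline as follows (element and variable names here are illustrative, not David's actual code):

    ```vbscript
    ' 1. Drain the recordset in one call, then release it entirely before
    '    any XML resources are invested in.
    arrRows = objRs.GetRows()  ' 2-D array: (field, row)
    objRs.Close
    Set objRs = Nothing

    ' 2. Build one template "row" element and clone it for each output row,
    '    rather than creating every element from scratch.
    Set objDom = Server.CreateObject("MSXML2.DOMDocument")
    objDom.appendChild objDom.createElement("rows")
    Set objTemplate = objDom.createElement("row")
    For iRow = 0 To UBound(arrRows, 2)
        Set objRow = objTemplate.cloneNode(True)
        ' ...set field text on objRow from arrRows(iField, iRow)...
        objDom.documentElement.appendChild objRow
    Next
    ```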

    Michael's COM/.NET Rant
    And this is why this technology platform (IIS/ASP/COM) is superior (IMHO) to Java or Perl or, for that matter, what I've seen from .NET: you can use a limited-functionality, almost rinky-dink scripting language (VBScript) for application-level, often-changing, not-terribly-performance-critical page logic, and then hop down into VB or C++ or whatever (via the monstrosity that must die named COM) to do real nuts-and-bolts crunching.  It seems that in .NET we'll have to use full-fledged languages to do the high-level stuff, but in exchange we'll get a more stable and powerful execution environment and lots of other perks.  Despite a few really annoying aspects of it, I really enjoy working in the current environment, and wish that its ills could have been addressed without sacrificing its simplicity.  That said, I look forward to what the new .NET platform will have to offer.

    Happy Programming!


  • Download David O'Neill's testing code (in ZIP format)
  • Download my VBScript testing class (in text format)
  • Download my tester code (in text format)
  • Download my version of ConvertRStoXML (in text format)
