Report Performance (continued)

In my last blog I discussed the performance of the Top 20 Reports, and my plan at the time was to evaluate the performance of all reports. After further consideration, I decided to take a different approach before embarking on such a huge project. With all the focus on the Top 20, I wondered what a study of the bottom 20 would reveal. I also thought this would be a fun and interesting analysis that could provide valuable input to the Testing and Development teams. With that goal in mind, I crafted a SQL query to list the worst-performing reports based on the 10-second completion standard. I wanted to see how many times each report was run, and what percentage of those runs completed in over 10 seconds. I also focused on the most popular of those reports, on the assumption that a report run only a couple hundred times a year, while it could benefit from optimization, may not warrant prioritized development resources.
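For the curious, here is a sketch of the general shape that query took. The report_runs table and its columns are illustrative stand-ins for the actual logging schema, and the study window and popularity cutoff are placeholders rather than the real thresholds:

```sql
-- Rank reports by the percentage of runs exceeding the 10-second standard.
-- report_runs(report_name, duration_seconds, run_date) is an illustrative
-- schema, not the actual logging table.
SELECT report_name,
       COUNT(*) AS total_runs,
       SUM(CASE WHEN duration_seconds > 10 THEN 1 ELSE 0 END) AS slow_runs,
       ROUND(100.0 * SUM(CASE WHEN duration_seconds > 10 THEN 1 ELSE 0 END)
             / COUNT(*), 1) AS pct_over_10_seconds
FROM report_runs
WHERE run_date >= CURRENT_DATE - INTERVAL '1' YEAR  -- one-year study window (placeholder)
GROUP BY report_name
HAVING COUNT(*) >= 500   -- skip rarely run reports; the cutoff is illustrative
ORDER BY pct_over_10_seconds DESC;
```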

Using the above criteria, here are the Bottom 9 reports and the percentage of the time each completed in over 10 seconds.

[Chart: Bottom 9 reports and the percentage of runs completing in over 10 seconds]

After parsing and analyzing the data for the Bottom 9, I was pleasantly surprised to find that most of the reports were completing in around 20 seconds.  The following charts illustrate this performance behavior for the Pay201, Pay01, and Fiscal04 reports.

[Charts: Pay201, Pay01, and Fiscal04 completion-time distributions]

Even for the slowest performing report, Pay201, where only 23% of runs finished in 10 seconds or less, most runs still completed in about 20 seconds!
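If you want to reproduce the shape of those charts, a simple bucketing query against the same illustrative report_runs table does the trick:

```sql
-- Histogram of completion times in 10-second buckets for the three charted
-- reports. Again, report_runs(report_name, duration_seconds) is illustrative.
SELECT report_name,
       FLOOR(duration_seconds / 10) * 10 AS bucket_start_seconds,
       COUNT(*) AS runs_in_bucket
FROM report_runs
WHERE report_name IN ('Pay201', 'Pay01', 'Fiscal04')
GROUP BY report_name, FLOOR(duration_seconds / 10) * 10
ORDER BY report_name, bucket_start_seconds;
```

For these reports, the bulk of the runs landed in the buckets around 20 seconds, which is the behavior the charts illustrate.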

Is a report that takes 20 seconds to complete an indication of “bad” performance? That depends on several factors, including the complexity of the report and the user’s willingness to wait. Back in the early days of the internet, during my tenure as a Software Performance Engineer, we performed studies on human tolerance for waiting on a computer and followed some of the Human Factors Engineering (Usability Engineering) principles outlined by Jakob Nielsen, one of the pioneers of HFE. Many years ago, Nielsen contributed to the design of the Google web site, much of which is still present to this day.

We learned that, in general, users would wait up to 45 seconds for a response from the computer when waiting on a “Transaction Complete” type of event. If nothing happened within that time, users would start hammering on the Enter Key. We would have to tell them that it wasn’t a Data Pump Key, and that hitting it repeatedly would not speed things up. (In fact, it probably made things worse, since all those carriage returns were sent back to be processed by the host, only to be discarded.)

On the other hand, there was clearly a much lower tolerance for slow field-to-field and screen-to-screen response times: these needed to be 1 second or less. Users would tolerate an occasional spike to 2-3 seconds, but not often.

While a sub-10-second response time for reports is preferred, a 20-second response time is still within what users may tolerate. However, if most reports complete in under 10 seconds, the expectation may be that all reports exhibit similar performance, and users may be less tolerant of any report that takes longer. Achieving “one size fits all” report performance would be difficult given the numerous report parameters and select criteria, as well as significantly varying database sizes. Only the end users know for sure how report performance impacts the process of doing business. For that, I invite your comments on this topic and look forward to your feedback!