3 Website Performance Testing Mistakes To Avoid

4th March 2013

As long as there has been software, there has been testing. Good testing has always been centred on business outcomes. It’s not just about finding bugs; it’s about finding Important Bugs: bugs that impact the end user or customer.

Mike Brown of uTest jogged my memory recently when his blog referenced the timeless ‘Classic Testing Mistakes’ by Brian Marick, from back in 1999. One of Brian’s points still stands out as a very powerful approach that directly aids eCommerce Managers in today’s world; more, in fact, than it aided any previous generation of technology management.

Failing To Correctly Identify Risky Areas: Load Testing The Live Site!

In a resource-constrained world, not everything can be tested before a project is launched, so there is considerable scope for mistakes in judging where to focus test effort and time.

In the arena of website load testing, there are common blind spots that crop up repeatedly as our team works with clients on planning load testing projects.

  • Load testing an environment that is not the live environment

Tech teams often prefer to load test a pre-production environment rather than the real live one, usually on the basis that it is less hassle: there is no need to worry about upsetting real customers if problems occur, and no need to tidy up production databases afterwards.

However, unless you test the live site, any metrics you get from the pre-production site are more likely to be misleading than helpful. Performance on the live site will not simply be X% greater than on a smaller pre-production one; there are too many places where small differences in configuration, the impact of resource sharing and so on can cause big differences in performance.

Load testing your live site should be at least an annual process. If nothing better can be arranged, testing on pre-production can be used between times, benchmarked back to the full live test.
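As a rough illustration of what that benchmarking might look like, here is a minimal sketch in Python (using the requests library, not thinkTRIBE’s own tooling) that runs the same small set of pages against a live and a pre-production base URL and compares median response times. The URLs and pages are hypothetical placeholders.

```python
# Minimal sketch: calibrate pre-production response times against the live site.
# All URLs and pages below are hypothetical placeholders.
import statistics
import time

import requests

PAGES = ["/", "/category/shoes", "/product/123"]  # a tiny, hypothetical journey
ENVIRONMENTS = {
    "live": "https://www.example.com",
    "pre-production": "https://preprod.example.com",
}

def time_pages(base_url):
    """Fetch each page once and return the response times in seconds."""
    timings = []
    for page in PAGES:
        start = time.perf_counter()
        requests.get(base_url + page, timeout=30)
        timings.append(time.perf_counter() - start)
    return timings

medians = {}
for name, base_url in ENVIRONMENTS.items():
    medians[name] = statistics.median(time_pages(base_url))
    print(f"{name}: median response time {medians[name]:.2f}s")

# A rough calibration factor for reading later pre-production results;
# it is a stop-gap between full live tests, not a replacement for them.
print(f"pre-production / live ratio: {medians['pre-production'] / medians['live']:.2f}")
```

A real benchmarking run would of course apply load rather than single requests, but the principle of tying pre-production numbers back to a live baseline is the same.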

Paying More Attention To Running Tests Than To Designing Them

Recently, we began working with a major client that had a load-testing regime they followed for each site iteration.

At planning meetings, where we looked at how our tools would be used to tackle their website performance in general, the details of the client’s load testing specification came under scrutiny.

When it was put on the table and examined, it became clear that the User Journeys used were ones that no real customer could follow. The test script was a list of URLs, in an order that was impossible to follow on the live site using the links on the actual pages! Yes, all the URLs worked in isolation, but they could not be followed with a browser by clicking what was available on each page.

What had happened was that the live site design had changed over the last couple of years, but the load test script had not. The change would have been spotted had they not been using the poor approach of treating a list of static URLs as a User Journey, but that’s another story.
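To make the distinction concrete, here is a minimal sketch in Python (with a hypothetical site and paths, and not the client’s actual script) of a journey that only moves to the next page if that page is genuinely linked from the one the simulated customer is currently on, rather than blindly working through a static URL list.

```python
# Minimal sketch: only follow a journey step if the current page actually links to it.
# The base URL and paths are hypothetical placeholders.
from html.parser import HTMLParser
from urllib.parse import urljoin

import requests

BASE_URL = "https://www.example.com"
JOURNEY = ["/", "/category/shoes", "/product/123", "/basket"]  # hypothetical steps

class LinkCollector(HTMLParser):
    """Gather the raw href values of every anchor tag on a page."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.hrefs.extend(value for name, value in attrs if name == "href" and value)

current_url = BASE_URL + JOURNEY[0]
current_html = requests.get(current_url, timeout=30).text

for step in JOURNEY[1:]:
    collector = LinkCollector()
    collector.feed(current_html)
    links_on_page = {urljoin(current_url, href) for href in collector.hrefs}
    next_url = BASE_URL + step
    if next_url not in links_on_page:
        # A static URL list would carry on regardless; checking the real page
        # flags immediately that the site has changed and the script no longer fits.
        raise SystemExit(f"{step} is not linked from {current_url}")
    current_url = next_url
    current_html = requests.get(current_url, timeout=30).text

print("Journey can still be followed by clicking links, page by page.")
```

A real User Journey tool would go much further (forms, baskets, session handling), but even this simple link check would have caught the out-of-date script described above.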

There often seems to be a psychological discomfort among tech teams about giving enough attention to the design of load tests.

They are predisposed to wanting the spec to be as concrete as the lines of code in the program, and working with the business team to understand the end user, the complex data from web analytics and so on can feel too fuzzy. It can be less cognitive effort to spec up simplistic lists of page URLs than to embrace the complexity of real usage.

This is where eCommerce management has a vital role to play: counter-balancing this tendency and ensuring that the end-user experience is the starting point for designing load testing projects.

Not Reporting Usability Problems – All eCommerce Managers Should Read This!

This is a real gem that Brian drops in among the excellent advice in his document.

He relates how, in the course of testing, the tester becomes familiar with the software as a user and may spot a usability issue: something that could be much smoother or more intuitive for the end user.

In many organisations there is no mechanism for the testing team to report back such an issue; somehow, UI improvement is not what the testers are assumed to be doing. They are assumed to be just testing whether the release worked exactly as designed, and to report back the important bugs in that.

This is even more relevant today, when our websites are more complex than ever, with HTML5 functionality, offline web applications and so on.

The lesson to be learnt is that usability is at the heart of the customer experience, and all the testing we do, including website load testing and performance testing, must be based on the customer experience.

We have a phrase at thinkTRIBE we use when planning web performance projects:

  • User Journeys should “Do what the Customer Does”

Make that the heart of your load testing and you’ll end up with load test performance metrics that make sense to the business teams and to senior management. It also means, going back to lesson one (failing to correctly identify risky areas), that you won’t be left vulnerable to unexpected performance problems later on because the load testing was not accurately doing what customers do.