This video describes an approach to simple performance testing that has proven very valuable when building public, content-rich web sites. Automation and data gathering during development, testing and maintenance have provided a way to ensure robust and predictable performance on web sites that now deliver more than one million pages per day.
By running a web crawler and carefully processing its output, we have found it possible to investigate and gain control over important server-side performance aspects, such as:
* What is the baseline performance of our setup?
* What happens with the overall performance as we add feature X to our site?
* How is the performance influenced as we change code, server software, server hardware, network or architecture?
* How efficient is the caching in our system?
* How has the performance changed over the last three months?
* What are the main resource hogs in our system?
…without resorting to time-consuming testing.
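As an illustration of the kind of post-crawl processing involved, the sketch below summarizes crawler output to surface resource hogs and per-section averages. The log format (`STATUS TIME_MS URL`), the grouping by first path segment, and the function name are all assumptions for the example, not the actual tooling from the talk:

```python
# Hypothetical sketch: summarizing a crawler's output to spot slow pages.
# Assumes each log line looks like "STATUS TIME_MS URL"; adapt the parsing
# to whatever format your crawler actually emits.
from collections import defaultdict

def summarize(log_lines, top_n=3):
    """Return average response time per site section and the slowest pages."""
    times_by_section = defaultdict(list)
    pages = []
    for line in log_lines:
        status, ms, url = line.split(maxsplit=2)
        if status != "200":
            continue  # exclude errors/redirects from the timing baseline
        ms = float(ms)
        # Group by first path segment, e.g. "/news/2024/x" -> "/news"
        section = "/" + url.split("/", 2)[1] if url.count("/") > 1 else url
        times_by_section[section].append(ms)
        pages.append((ms, url))
    averages = {s: sum(v) / len(v) for s, v in times_by_section.items()}
    slowest = sorted(pages, reverse=True)[:top_n]
    return averages, slowest

# Example crawl log (made-up data):
log = [
    "200 12.5 /news/2024/a",
    "200 47.0 /news/2024/b",
    "200 8.0 /about",
    "404 3.0 /missing",
]
averages, slowest = summarize(log, top_n=2)
# averages["/news"] is 29.75; slowest[0] is (47.0, "/news/2024/b")
```

Running the same summary on crawls taken before and after a change (new feature, new hardware, new cache configuration) gives a low-cost way to answer the comparison questions listed above.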
The approach detailed in this talk emphasizes low-cost but effective testing and can be used both before and after putting a site into production.