Not too long ago we deployed a web-based application backed by SQL Server to one of our clients. The database was SQL Server 2000 because that was the corporate standard at the time of deployment. All testing in the development environment, the production test environment, and Virtual PC test environments had shown acceptable performance, yet the users were immediately disappointed by what seemed like terrible performance.
Appropriate indexing strategies were used on the database tables. Queries were evaluated and tuned. Other architecturally similar applications we had created were already deployed to their servers with no performance-related concerns. We had no direct access to their servers for more sophisticated performance testing and profiling, which would have made the troubleshooting effort more efficient.
After we made a slight change to a few similar queries in an attempt to improve the dismal performance, the first real indication that something was wrong surfaced: a query returned different results on the customer's server than on all of our test instances. Why? Was it stale data cached in the web browser? Was there a version-control issue with the tables, views, or stored procedures? Nope.
I asked the sys admin to check the SQL Server edition, and it turned out to be the RTM release. Four service packs (through SP4) had been released since SQL Server 2000 RTM, and none of them had been applied. I loaded the RTM version on a Virtual PC and, sure enough, got the same incorrect results. The sys admin updated the server to SP4, and the performance immediately improved by orders of magnitude.
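For anyone who runs into something similar, a quick way to check the edition and patch level yourself from a query window (this works on SQL Server 2000 and later) is to ask the server directly via SERVERPROPERTY:

    -- Returns the edition, build number, and patch level ('RTM', 'SP4', etc.)
    SELECT  SERVERPROPERTY('Edition')        AS Edition,
            SERVERPROPERTY('ProductVersion') AS ProductVersion,
            SERVERPROPERTY('ProductLevel')   AS ProductLevel;

SELECT @@VERSION gives the same information as a single descriptive string, which is handy when you just need something to paste into an email to the sys admin.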
The moral of the story: some of the things we take for granted, like service pack updates, should be included on the pre-deployment or pre-development checklist.