Profilers and The Perils of Micro-Optimization

Ian Griffiths has just written a nice piece on profiling .NET code and why the obvious things people do are often very wrong: “Unfortunately, lots of developers just love to go off on micro-benchmarking exercises, where they write about 10 lines of code in a variety of different ways, and discover that one runs faster than the rest. (Or worse, one looks like it runs faster than the rest when running in the profiler.)”

He advocates measuring performance outside of the profiler as well as within it, so that you can be sure you’re improving actual real-world performance and not just playing a pointless game with your tools. This is one of the reasons I always like to include appropriate perfmon performance counters in my servers. They let me see real-world figures, such as transactions per second or bytes per second, and they let me see those figures all the time, not just when I’m running under a profiler or on a test machine. We can look at them for live, production systems. It’s only by being able to see and measure these kinds of things that you can really know what effect your changes have on the system.
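To make the idea concrete, here’s a minimal sketch of how a .NET server might publish that kind of counter using the System.Diagnostics.PerformanceCounter API. The category and counter names (“MyServer”, “Transactions/sec”, “Bytes received/sec”) are purely illustrative, not taken from Ian’s piece or from any real server, and the category-creation step is Windows-specific and needs admin rights, so it normally lives in an installer rather than the server itself.

    using System.Diagnostics;

    class ServerCounters
    {
        const string Category = "MyServer";   // hypothetical category name for this sketch

        // One-time setup, typically run by an installer (requires admin rights).
        public static void EnsureCategoryExists()
        {
            if (PerformanceCounterCategory.Exists(Category))
                return;

            var counters = new CounterCreationDataCollection
            {
                new CounterCreationData("Transactions/sec",
                    "Transactions completed per second",
                    PerformanceCounterType.RateOfCountsPerSecond32),
                new CounterCreationData("Bytes received/sec",
                    "Bytes read from clients per second",
                    PerformanceCounterType.RateOfCountsPerSecond64)
            };

            PerformanceCounterCategory.Create(Category,
                "Counters published by the example server",
                PerformanceCounterCategoryType.SingleInstance,
                counters);
        }

        readonly PerformanceCounter transactions =
            new PerformanceCounter(Category, "Transactions/sec", readOnly: false);
        readonly PerformanceCounter bytesReceived =
            new PerformanceCounter(Category, "Bytes received/sec", readOnly: false);

        // Called from the server's completion path; perfmon turns the raw
        // increments into a per-second rate that can be charted on a live box.
        public void OnTransactionCompleted(int bytesRead)
        {
            transactions.Increment();
            bytesReceived.IncrementBy(bytesRead);
        }
    }

Once the counters exist, perfmon can chart them on the production machine right alongside the built-in system counters, which is exactly the kind of always-on, real-world measurement I’m arguing for above.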