07-15-2008
It is a widespread misconception that "performance" is some value you can simply measure, like "length" or "weight". "Performance" means something like "fitness for a given purpose", and without defining that purpose it means nothing at all.
If you want to "test performance", first agree with your customer on what the goal of the system is. Something like "the system has to respond within 2 seconds to every message" or "the application has to finish every user action within 0.5 seconds" or "the system has to do 5000 transactions per hour" or whatever. But these parameters have to be defined first; they are your "given purpose".
Only then can you start measuring (by counting transactions, seconds, ...) and find out how far off your requirement you are, if at all.
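To make that concrete, here is a minimal sketch in Python of what "measuring against a defined purpose" could look like. The requirement value and the `action()` function are hypothetical placeholders, not part of any real system: you would substitute the user action or transaction you actually agreed on with your customer.

```python
import time

# The "given purpose", agreed with the customer beforehand
# (hypothetical example: every user action must finish within 0.5 seconds).
REQUIREMENT_SECONDS = 0.5

def action():
    # Placeholder for the real user action / transaction being tested.
    time.sleep(0.01)

# Measure the latency of each of 100 runs of the action.
latencies = []
for _ in range(100):
    start = time.perf_counter()
    action()
    latencies.append(time.perf_counter() - start)

worst = max(latencies)
print(f"worst case: {worst:.3f}s, requirement: {REQUIREMENT_SECONDS}s")
if worst <= REQUIREMENT_SECONDS:
    print("requirement met")
else:
    print(f"requirement missed by {worst - REQUIREMENT_SECONDS:.3f}s")
```

The point is that the pass/fail criterion exists before the measurement starts; the numbers only mean something relative to it.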
Everything else is aggregating meaningless data*) to come up with meaningless curves (or other graphs) and displaying them to meaningless people. Refuse such time-killing nonsense and concentrate on your work.
I hope this helps.
bakunin
PS: if, after defining your requirements, you need help collecting or interpreting data for this defined purpose, please write again. I will gladly help you further in that case.
_________________
*) To further expand on that: you can measure the number of transactions of an OLTP system and measure its weight, then come up with some "weight-to-transactions quotient". You can now cut away some parts of the system's case and "measure" again, and you will find that you have "optimized" the system, because afterwards it weighs, say, 3 pounds less. Would you think you have made the system any better by doing this? Would a nice curve generated from the data before and after change your opinion?