For most performance testers, the first few years in the industry are focused on learning the basics of automated testing tools and how to develop and conduct load tests. But what I used to hear often from testers is that even though they developed these skills, they had no say in which testing tool they would learn first -- the performance tooling solution was usually pre-determined. This may have been due to a corporate licensing decision or to the market dominance of one particular tool.
That meant that, as a tester, someone above you in your organization was traditionally tasked (and probably completely overwhelmed) with selecting the right tools for the job you would be performing. The reason usually given is that the majority of performance testing solutions were almost exclusively commercial products, some of which required formal evaluation and validation before your company would be willing to invest in the vendor’s solution.
Ah, how times have changed.
Today we have a much larger array of performance testing solutions with a range of costs -- from completely free open-source tools, to cloud-based tools with incremental operational licensing, to sophisticated enterprise-class tools. With more choices (in different pricing classes), the perception is that you have more freedom to choose and may now use many different tools to get the job done right.
There has also been an increased focus in recent years on automation. Although automation is not a new idea for performance testing, it is becoming a dominant topic of conversation, so much so that the expectation now is that performance testers have a comprehensive understanding of all tools on the market, can conduct a full feature set comparison, and really know the benefit of one tool over another.
And today performance testers can engage much more directly with vendors than in years past; this can be challenging for an engineer with no practical experience in vendor relationship management. There are often complex policies and processes involved in working with your management and procurement departments. Even if you are selecting free or open-source tools, navigating all this can become a full-time job!
So nowadays you are expected, as a performance tester, to know everything about all the tools, to make an objective choice of the best one to use, to decipher all of the licensing constructs, to compute and justify the cost/benefit calculations, and then to manage the implementation for your entire team. With little time or education on the subject, you typically end up “shooting from the hip” when selecting performance tools. And, just as typically, you shoot yourself in the foot.
And this is why Scott Barber and I have put together an STP Online Summit virtual conference series, to help you learn how to do all this. The “Survey of Performance Testing Tools” (August 21st-23rd) is a great opportunity to learn how to conduct tool evaluation and vendor management in less than three days. During the summit, Scott and I will guide you through interviews with actual testing tool vendors; you will engage them with your questions and learn how to bring all the information together at the end to drive your decision. It’s a crash course on tool surveying and vendor navigation skills that you will not find anywhere else.
In closing, what does this have to do with Hyperspace? Well, there is a great scene in Star Wars Episode IV where Han Solo turns to the inexperienced Luke Skywalker and states:
“Traveling through hyperspace ain’t like dusting crops, boy! Without precise calculations we could fly right through a star or bounce too close to a supernova, and that’d end your trip real quick, wouldn’t it?”
In an alternate universe (and alternate galaxy…far, far away) where Luke is a fledgling performance tester, Han Solo might have instead given the strict warning:
“Selecting the proper performance testing tool ain’t like dusting crops, boy! Without precise calculations from the tools we could fly right past a false positive or come too close to overloading the wrong system component, and that’d end your testing career real quick, wouldn’t it?”