As the industry changes with modern trends, performance testing must change, too. A stereotypical, last-minute performance validation in a test lab using a record-and-playback load testing tool is no longer enough.
Cloud: Cloud practically eliminated the lack of appropriate hardware as a reason for not doing load testing, while also significantly decreasing the cost of large-scale tests. Cloud and cloud services significantly increased the number of options for configuring the system under test and the load generators.
Each option has advantages and disadvantages. Depending on the specific goals and the systems to test, one deployment model may be preferred over another. For example, to see the effect of a performance improvement (performance optimization), an isolated lab environment may be the better option for detecting even small variations introduced by a change. For load testing the whole production environment end-to-end, to ensure the system will handle the load without any major issue, testing from the cloud or a service may be more appropriate. To create a production-like test environment without going bankrupt, moving everything to the cloud for periodic performance testing may be your best solution.
When conducting comprehensive performance testing, you’ll probably need to combine several approaches. For example, you might use lab testing for performance optimization to get reproducible results and distributed, realistic outside testing to check real-life issues you can’t simulate in the lab.
Agile: Agile development eliminates the primary problem with traditional development: needing a fully built system before you can test it. With agile development, we've had a major "shift left," allowing us to start testing early.
Theoretically, it should be rather straightforward: every iteration delivers a working system, so you know exactly where you stand with its performance. From the agile development side, the problem is that, unfortunately, it doesn't always work this way in practice, which is why notions such as "hardening iterations" and "technical debt" get introduced. From the performance testing side, the problem is that if we need to test the product every iteration or build, the volume of work skyrockets.
Recommended remedies usually involve automation and making performance everyone's job. Automation here means not only using tools (in performance testing, we almost always use tools), but also automating the whole process, including setting up the environment, running tests, and reporting/analyzing results. Historically, performance test automation was almost non-existent because it is much more difficult than, say, functional testing automation. Setups are more complicated, results are complex (not just pass/fail) and not easily comparable, and changing interfaces is a major challenge — especially when recording is used to create scripts.
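To make the "results are not just pass/fail" point concrete, here is a minimal sketch of one piece of such automation: a regression gate that compares latency percentiles from a new run against a stored baseline, with a tolerance band instead of a binary assertion. All names here (`check_regression`, the baseline dictionary, the sample numbers) are illustrative assumptions, not part of any particular tool.

```python
"""Hypothetical sketch of an automated performance-gate step:
compare a new run's latency percentiles against a stored baseline,
allowing a configurable degradation tolerance."""


def percentile(samples, p):
    """Nearest-rank percentile of a list of response times (ms)."""
    ordered = sorted(samples)
    k = max(0, round(p / 100 * len(ordered)) - 1)
    return ordered[k]


def check_regression(samples, baseline, tolerance=0.10):
    """Return metric -> (value, passed), comparing this run's p50/p95
    against the baseline, allowing `tolerance` relative degradation."""
    results = {}
    for name, p in (("p50", 50), ("p95", 95)):
        value = percentile(samples, p)
        passed = value <= baseline[name] * (1 + tolerance)
        results[name] = (value, passed)
    return results


if __name__ == "__main__":
    # Response times (ms) from a hypothetical test run; one outlier.
    run = [120, 122, 125, 127, 128, 129, 130, 131, 140, 510]
    baseline = {"p50": 125, "p95": 400}
    for metric, (value, ok) in check_regression(run, baseline).items():
        print(f"{metric}: {value} ms -> {'PASS' if ok else 'FAIL'}")
```

In a pipeline, a step like this would run after the load generator finishes and fail the build only on a meaningful degradation, which is one way to keep per-iteration testing from drowning engineers in manual result comparison.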
While automation will take a significant role in the future, it addresses only one side of the challenge. The other side of the agile challenge is usually left unmentioned: the blessing of agile development, early testing, requires another mindset and another set of skills and tools. Performance testing of new systems is agile and exploratory in itself. Automation, together with further involvement of development, offloads performance engineers from routine tasks. But testing early, whose biggest benefit is identifying problems while the cost of fixing them is still low, does require research and analysis; it is not a routine activity and cannot be easily formalized.
https://dzone.com/articles/reinventing-performance-testing