How to Create the Most Realistic Load Tests | @DevOpsSummit #DevOps

The whole point of performance testing is to know that your app can handle whatever is thrown at it

You may have heard the phrase, "You've got to fake it till you make it."

Not with load testing.

Performance testing is one of the most important things you can do when building a web or mobile app, and it's only becoming more vital as user expectations keep rising. People demand access to anything, anywhere, anytime, and they'll switch to a competitor's solution if the app they're trying to use is slow or clunky.

Performance is critical to the success of your web and mobile apps - and will be for a long time in the future. It's not a matter of if you have to do it. It's about how to do it best.

So, how do you do it best? You make your test scenarios as real as possible. If you have ever listened to the performance testing horror stories from Brad Stoner, you've heard about the importance of realistic testing scenarios.

Why Is Realism So Important?
The whole point of performance testing is to know that your app can handle whatever is thrown at it. But application architecture, and in particular application delivery, is very complex. There are a lot of variables that impact the user's ultimate experience. Some are more obvious, like the specific device in use, or the task the user is trying to accomplish. Others are less apparent, hidden under abstraction layers or deep in the network layer.

You'll also want to consider the impact of other software running on the platform, the local environment, the ISP, and more. Not to mention the effect of 3rd-party services integrated with the app.

The list goes on. If your performance tests are overly simple, it means you aren't testing places that could impact the experience. It's like only testing a car on an empty highway and not taking the realities of street driving into account: potholes, traffic, other drivers, or suddenly-appearing pedestrians. Without realistic tests, you are not preparing for scenarios that are likely to happen and will be detrimental to your users (and your business). Plus you will be wasting a ton of time, effort, and money on useless tests.

So we've put together a few ways to make your performance testing more realistic.

Don't Fake It - Realistic Load Test Scenarios Should Include...
Geography

Geographically speaking, where are your users? What are the predominant regions, and how are people distributed among them? A user's geographic location plays a very important role in their experience and involves many factors you should simulate under load. The number of hops and the backbone speeds in the path between the website and the client system are just a couple of the attributes that determine how fast packets travel and how many packets are dropped.

Geographic diversity also shows you a range of user experience patterns, since people around the world may behave differently, especially when it comes to how people use desktops versus phones. Another important benefit of geographically dispersed load testing is the ability to distribute load across lots of places so your servers are handling traffic from a broader range of endpoints. Finally, using 3rd party locations forces you to execute your load tests outside of your data center, so they exercise the full data pathway from device to server.
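As a rough sketch of what a geographic traffic mix can look like, here is a small Python example that assigns virtual users to regions by weight. The region names, percentages, and latency figures are illustrative assumptions, not data from this article; in a real test you would launch load generators in those locations rather than hard-coding latency numbers.

```python
import random

# Hypothetical regional mix and added round-trip latency (illustrative values only).
REGIONS = {
    "us-east":  {"weight": 0.40, "extra_latency_s": 0.03},
    "eu-west":  {"weight": 0.35, "extra_latency_s": 0.09},
    "ap-south": {"weight": 0.25, "extra_latency_s": 0.18},
}

def pick_region():
    """Choose a region for a virtual user according to the traffic mix."""
    names = list(REGIONS)
    weights = [REGIONS[n]["weight"] for n in names]
    return random.choices(names, weights=weights, k=1)[0]

if __name__ == "__main__":
    # Assign 10 virtual users to regions; a real test would run agents
    # in those regions so the full network path is exercised.
    for vu in range(10):
        region = pick_region()
        extra_ms = REGIONS[region]["extra_latency_s"] * 1000
        print(f"virtual user {vu} -> {region} (+{extra_ms:.0f} ms)")
```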

Devices and Browsers
The web browser is the key blind spot for gaining true end-to-end visibility into application performance. Increased usage of client-side scripting means you must monitor the processing that takes place within the browser. It's the only way to get full visibility into performance. Browser test cases measure events that happen in the rendered page and are visible to the user. They can even measure things like: "After the user does A, how long does it take for button B to become enabled?"
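A browser-level measurement like the one above can be scripted with a real browser driver. The following is a minimal sketch using Selenium in Python; the page URL, coupon workflow, and element IDs are hypothetical stand-ins for your own app, not something taken from this article.

```python
import time
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

# Hypothetical page and element IDs; swap in your own app's selectors.
driver = webdriver.Chrome()
driver.get("https://example.com/checkout")

# Action A: the user enters a coupon code and applies it.
driver.find_element(By.ID, "coupon").send_keys("SAVE10")
driver.find_element(By.ID, "apply-coupon").click()

# Measure how long until button B ("place-order") becomes clickable.
start = time.perf_counter()
WebDriverWait(driver, timeout=15).until(
    EC.element_to_be_clickable((By.ID, "place-order"))
)
elapsed = time.perf_counter() - start
print(f"place-order became enabled after {elapsed:.2f} s")

driver.quit()
```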

Furthermore, you should monitor devices. Why? Because the variability in devices is growing rapidly. You need to look for software changes in each device and track their evolution and updates, which also impact performance. Here's the bottom line: with a little effort and maintenance, you can accurately find out whether any clients are reacting poorly to your code and work with your product team to fix browser- or device-specific issues.

User Behavior
Accurately recreating the way users interact with your site is a key part of building realistic load tests. This usually starts with how you record scenarios in the first place. As users traverse the app, you capture their clicks and form submissions and use this stream as the basis for future load tests. When you do this, make sure the recording is parameterized so that variables are randomized and represent what happens most often for real people. Scenarios must be designed to be representative, replaying accurately with all the elements of the user experience, like popups or interruptions. For example, if the user you are recording waits 10 seconds to click a button, you could turn this into a parameter that randomly waits between 5 and 20 seconds, as in the sketch below.
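Here is a minimal Python sketch of that think-time parameterization, using the requests library; the URLs and form fields are hypothetical stand-ins for whatever your recorded scenario actually hits.

```python
import random
import time
import requests

# The recorded 10-second pause becomes a random wait between 5 and 20 seconds.
# URLs and form fields below are hypothetical examples.
session = requests.Session()

session.get("https://example.com/search")
time.sleep(random.uniform(5, 20))          # randomized think time
session.post(
    "https://example.com/search",
    data={"q": random.choice(["shoes", "jackets", "hats"])},  # varied input
)
```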

For even more realism, use Google Analytics to get a sense of how parameters vary across actual users. While recording a scenario, you may need to specify some parameters that will be reused in later test runs to help with repeatability. It is not good practice to play back a test with the exact same recorded data for each virtual user, because that does not simulate real-life conditions. Instead, extract the desired data into variables so your requests can draw on varied values during test runs.
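One common way to do this is to feed each virtual user a different row of test data instead of replaying the single recorded account. The sketch below assumes a hypothetical users.csv file and login form; adapt the file name, columns, and endpoint to your own application.

```python
import csv
import itertools
import requests

# Hypothetical data file with columns: username,password
with open("users.csv", newline="") as f:
    rows = list(csv.DictReader(f))

data_cycle = itertools.cycle(rows)  # hand out a different row per virtual user

def run_virtual_user():
    creds = next(data_cycle)
    s = requests.Session()
    # Hypothetical login endpoint and field names.
    s.post("https://example.com/login",
           data={"user": creds["username"], "pass": creds["password"]})

for _ in range(5):
    run_virtual_user()
```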

User Paths
Sometimes, testers only exercise a limited set of paths through the application. This is often due to a limitation of the load testing software they are using. Take, for example, a chat window. This is a small component on a typical web-based application, and as such, it's a part of the app that rarely gets tested. Other examples include infrastructure packages like Java Message Service or 3rd-party services like ad networks. If your load testing software doesn't help you incorporate these elements, you may have no idea how long those ads are making your users wait.

From the developer perspective, these experiences are somewhat separated from the application. But, this is not the case from the user's perspective. Think comprehensively about the way a user navigates through the app. Also talk to users. Ask about frustrations. This may lead to insights about how to build your performance tests with more realism.
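One simple way to surface these "forgotten" pieces is to time them separately in the scenario, so a slow chat widget or ad network shows up in your results instead of hiding inside an aggregate page time. This is a sketch only; the endpoints below are hypothetical examples, not real services.

```python
import time
import requests

def timed_get(session, url):
    """Fetch a URL and report how long it took (sketch-level measurement)."""
    start = time.perf_counter()
    resp = session.get(url, timeout=30)
    print(f"{url} -> {resp.status_code} in {time.perf_counter() - start:.2f} s")

s = requests.Session()
timed_get(s, "https://example.com/products")        # main user path
timed_get(s, "https://example.com/chat/poll")       # embedded chat component
timed_get(s, "https://ads.example.net/slot?id=42")  # third-party ad network
```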

Network Behavior
Knowledge about network latency and bandwidth is needed for any application that isn't local, because bumps and burps in the network can have a real adverse impact on your users. Monitoring network bandwidth and web application performance from multiple locations helps isolate problems in the network tier.

Bandwidth bottlenecks cause network queues to develop and data to be lost, impacting the performance of applications. This is especially true for mobile and cloud apps. High jitter, increased latency, and packet loss all work to degrade application performance. Use emulators and other monitoring functions to get a picture of the range of network characteristics. Then build that behavior directly into your load tests. Use your load testing platform to actually introduce network problems that users typically encounter, so you know what happens when they do.
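If your tooling doesn't provide network emulation out of the box, you can approximate degraded conditions at the load-generator level. The sketch below adds latency, jitter, and simulated packet loss around each request; a dedicated network emulator is more faithful, and all of the numbers here are illustrative assumptions.

```python
import random
import time
import requests

LATENCY_S = 0.120   # assumed base delay to add per request
JITTER_S = 0.040    # assumed random variation around that delay
LOSS_RATE = 0.02    # assume 2% of requests are simply lost

def degraded_get(url):
    """Fetch a URL under roughly emulated network degradation."""
    time.sleep(max(0.0, random.gauss(LATENCY_S, JITTER_S)))  # latency + jitter
    if random.random() < LOSS_RATE:
        raise ConnectionError("simulated packet loss")        # dropped request
    return requests.get(url, timeout=30)

try:
    resp = degraded_get("https://example.com/")
    print(resp.status_code)
except ConnectionError as err:
    print(err)
```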

Connection Parameters
Modern web browsers send requests to the server over several simultaneous connections. These parallel requests download the images, scripts, CSS files, and other resources located on the page. Different devices and browsers have different policies about how many concurrent requests are allowed; phones, for example, generally restrict more than desktops or laptops do. You'll need to simulate an appropriate number of parallel requests during your tests. If you are running an emulated mobile test from a server-based load simulator, make sure to set up matching connection limitations, and try to send requests exactly the way your browser did when you recorded the scenario. This makes the simulation that much closer to real-world conditions.
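As a sketch, the Python snippet below caps the number of parallel resource downloads to mimic a browser's connection limit. The resource URLs are hypothetical, and the limit of 6 for desktop (or a lower value for phones) is an assumption; use whatever you observed when recording the scenario.

```python
from concurrent.futures import ThreadPoolExecutor
import requests

# Hypothetical page resources to fetch in parallel.
RESOURCES = [
    "https://example.com/styles.css",
    "https://example.com/app.js",
    "https://example.com/logo.png",
    "https://example.com/hero.jpg",
]

def fetch(url):
    return url, requests.get(url, timeout=30).status_code

max_parallel = 6  # assumed desktop-like limit; try a lower value for mobile
with ThreadPoolExecutor(max_workers=max_parallel) as pool:
    for url, status in pool.map(fetch, RESOURCES):
        print(status, url)
```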

Welcome to the Real World
It all boils down to one thing: keeping it real is paramount to load testing. We've given you a list of attributes to consider so that you can represent what actually happens for most users. If you can simulate what a user is most likely to experience, you'll be a step ahead of the game. With the rising demand for perfection in web application performance, there's no replacement for realistic load testing.

Image Credit: Jean-Etienne Minh-Duy Poirrier

More Stories By Tim Hinds

Tim Hinds is the Product Marketing Manager for NeoLoad at Neotys. He has a background in Agile software development, Scrum, Kanban, Continuous Integration, Continuous Delivery, and Continuous Testing practices.

Previously, Tim was Product Marketing Manager at AccuRev, a company acquired by Micro Focus, where he worked with software configuration management, issue tracking, Agile project management, continuous integration, workflow automation, and distributed version control systems.
