

Microservices Expo: Article

Building Good Tests: Why, What, How?

Good testers and good tests always retain and use an awareness of what the intended audience wants and expects

Tests are an investment in the quality of any given system. There's always a cost to build, run, and maintain each test in terms of time and resources. There's also a great deal of value to be extracted from running the right test at the right time. It's important to remember that for everything you do test, there's something else you're not testing as a result.

Understanding that some tests are more important than others is vital to creating a useful and fluid test plan capable of catering to modern software development techniques. Traditional waterfall development - where everything is delivered for a specific release date in a finished package - has been succeeded by continuous feature rollouts into live systems. This necessitates a different approach from QA departments.

How Do You Build Good Tests?
You can't design or execute the right tests without understanding the intended purpose of the system. Testers need to have an insight into the end user's expectations. Communication between the product people at the business end, the engineers working on the code, and the test department enables you to score tests in terms of their importance and work out where each test cycle should be focusing.

We can break it down into three simple queries: why, what, and how.

"Why" is a higher level overview that really ties into the business side. It's the big-picture thinking that reveals why you're building the software in the first place. What audience need is your product fulfilling? For example, we need to build an e-commerce website to sell our product to the general public.

"What" is really focused on individual features or functions of a system. Using a shopping cart analogy for an e-commerce website, you might say that users must be able to add and remove items from their shopping cart, or that they shouldn't be able to add something that's out of stock.
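Feature rules like these can double as the specification for a test. Here's a minimal sketch, assuming a hypothetical in-memory `Cart` with a simple per-item stock count; a real cart would involve sessions, persistence, and pricing:

```python
class OutOfStockError(Exception):
    """Raised when a user tries to add an item with no stock left."""
    pass

class Cart:
    """Minimal cart enforcing the two rules above (illustrative only)."""
    def __init__(self, stock):
        self.stock = stock      # item name -> units available
        self.items = []

    def add(self, item):
        # Rule: users shouldn't be able to add something that's out of stock.
        if self.stock.get(item, 0) <= 0:
            raise OutOfStockError(item)
        self.stock[item] -= 1
        self.items.append(item)

    def remove(self, item):
        # Rule: users must be able to remove items from their cart.
        self.items.remove(item)
        self.stock[item] += 1
```

Each rule maps directly to an assertion a tester can make, independent of how the user interface presents the cart.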

"How" relates to the practical application of your testing. How exactly is the software going to be tested? How is the success and failure measured?

Good tests are always going to cover our trio, but it can be a useful exercise to break things down.

The Why
If you get too caught up in the "what" and the "how," it's possible to miss the "why" completely, yet it's the most important element because it dictates that some tests are more important than others. The business case for developing your software in the first place has to remain front and center throughout the project. If you begin to lose sight of what the end user needs, then you could be focusing your efforts in the wrong places. Delivering value to your customers is critical. Good testers and good tests always retain and use an awareness of what the intended audience wants and expects.

One technique we can employ is risk-based analysis of tasks. With risk-based analysis, we can arrive at a numerical value for each test, which gives you a sense of its importance: a score between 1 and 9, where a 9 marks a critical test and, at the other end of the spectrum, a 1 indicates a test that only needs to be run sparingly. The value is determined by multiplying two factors, each rated from 1 to 3:

  • Impact to the user: What are they trying to accomplish and what would the impact be if they couldn't? How critical is this action?
  • Probability of failure: How likely is it that this code will fail? This is heavily influenced by how new it is and how much testing it has already undergone.

If we return to our e-commerce website analogy, consider the action of buying goods. Clearly that's essential, so its impact would be assigned a 3. However, the functionality for adding goods to the basket and checking out has been there since day one and has already been tested quite extensively; some new features have recently been added that could affect the code, so the probability of failure might score a 2. Multiply the two together and you've got a 6. This figure will change over time: probability of failure goes up if this part of the system is significantly altered, and drifts down over time if it isn't. There's also a discretionary factor that might lead you to bump that 6 up to a 7 if you feel it's merited.
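The scoring described above can be sketched in a few lines. The function name and the discretionary `adjustment` parameter are illustrative, not a standard formula:

```python
def risk_score(impact, probability, adjustment=0):
    """Risk-based test priority: impact (1-3) times probability of
    failure (1-3), plus an optional discretionary adjustment.
    Higher scores mean the test should run more often."""
    if not (1 <= impact <= 3 and 1 <= probability <= 3):
        raise ValueError("impact and probability must be rated 1-3")
    # Clamp so the discretionary bump can't leave the 1-9 scale.
    return min(9, max(1, impact * probability + adjustment))

# The checkout example from the text: a critical action (impact 3) on
# recently touched code (probability 2), bumped by one at the tester's
# discretion.
print(risk_score(3, 2, adjustment=1))  # -> 7
```

Recomputing these scores each cycle, as code churns and tests accumulate runs, is what keeps the test plan aligned with the "why."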

The What
Testers come up with specific scenarios of how an end user might interact with the software and what their expected outcome would be. A typical test might consist of many steps detailed in a script, but this approach can cause problems. What if a new tester comes in to run the test? Is the intent of the test clear to them? What if the implementation of the feature changes? Perhaps the steps no longer result in the expected outcome and the test fails, but that doesn't necessarily mean that the software is not working as intended.

The steps and scripts are delving into the "how," but if the "what" is distinct from the "how" then there's less chance of erroneous failure. Allow the tester some room for an exploratory approach and you're likely to get better results. If something can be tightly scripted and you expect it to be a part of your regression testing, then there's an argument for looking at automating it.

Adopting a technique like Specification by Example or Behavior-Driven Development (BDD), you're going to lay each test out in this format:

  • Given certain preconditions
  • When one or more actions happen
  • Then you should get this outcome

Regardless of the specifics of the user interface, or the stops along the way between A and Z, the "Given, When, Then" format covers the essential core of the scenario and ensures that the feature does what it's intended to do, without necessarily spelling out exactly how it should do it. It can be used to generate tables of scenarios that describe the actions, variables, and outcomes to test.
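Without committing to any particular BDD framework, a scenario in this format can be sketched as a plain test whose structure mirrors Given, When, Then. The stock-checking logic here is a stand-in for the real system:

```python
def test_cannot_add_out_of_stock_item():
    # Given an item that is out of stock
    stock = {"widget": 0}
    cart = []

    # When the user tries to add it to their cart
    def add(item):
        if stock.get(item, 0) <= 0:
            return False          # action refused
        cart.append(item)
        return True
    added = add("widget")

    # Then the action is refused and the cart is unchanged
    assert added is False
    assert cart == []

test_cannot_add_out_of_stock_item()
```

Because the test states only preconditions, action, and outcome, it survives changes to the steps a user takes along the way.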

The How
Getting down to the nuts and bolts of how testers will create, document, and run tests, we come to the "how." Since projects are more fluid now and requirements or priorities can change on a daily basis, there needs to be some flexibility in the "how" and a willingness to continually reassess the worth of individual tests. Finding the right balance between automated tests, manually scripted tests, and exploratory testing is an ongoing challenge that's driven by the "what" and the "why."

Traditionally, manual tests have been fully documented as a sequence of steps with an expected result for each step, but this is time-consuming and difficult to maintain. It's possible to borrow from automation by plugging in higher-level actions, or keywords, that refer to a detailed script or a common business action that's understood by the tester. There's also been a move toward exploratory testing, where the intent of the test is defined but the steps are dynamic. Finally, there's a place for disposable testing where you might use a recording tool to quickly document each step in a manual test as you work through it. These tests will need to be redone if anything changes, but as it's a relatively quick process and you're actually testing while you create the test, that's not necessarily a problem.
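The keyword idea above can be sketched as a small dispatcher that maps business-level keywords to detailed actions; all of the names here are hypothetical:

```python
# Each keyword names a detailed action the tester understands at a
# business level; the implementations stand in for real scripts.
def log_in(user):
    return f"logged in as {user}"

def add_to_cart(item):
    return f"added {item} to cart"

def check_out():
    return "checked out"

KEYWORDS = {"LogIn": log_in, "AddToCart": add_to_cart, "CheckOut": check_out}

def run(script):
    """Execute a test written as (keyword, *args) steps and return
    the log of what each step did."""
    return [KEYWORDS[keyword](*args) for keyword, *args in script]

log = run([("LogIn", "alice"), ("AddToCart", "book"), ("CheckOut",)])
```

When a flow changes, only the keyword's implementation is updated; every test script that uses the keyword stays valid.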

Continuous Assessment
Each individual test should be viewed as an investment. You have to decide whether to run it, whether it needs to be maintained, or if it's time to drop it. You need to continually assess the value and the cost of each test so that you get maximum value from each test cycle. Never lose sight of the business requirements. When a test fails, ask yourself if it's a problem with the system or a problem with the test, and make sure that you always listen to the business people, the software engineers, and most importantly the customer.

More Stories By Sellers Smith

Sellers Smith is Director of Quality Assurance & Agile Evangelist for Silverpop of Atlanta, GA, a digital marketing technology provider that unifies marketing automation, email, mobile, and social for more than 5,000 global brands.
