By Sellers Smith
November 23, 2013 05:00 PM EST
Tests are an investment in the quality of any given system. There's always a cost to build, run, and maintain each test in terms of time and resources. There's also a great deal of value to be extracted from running the right test at the right time. It's important to remember that for everything you do test, there's something else you're not testing as a result.
Understanding that some tests are more important than others is vital to creating a useful and fluid test plan capable of catering to modern software development techniques. Traditional waterfall development - where everything is delivered for a specific release date in a finished package - has been succeeded by continuous feature rollouts into live systems. This necessitates a different approach from QA departments.
How Do You Build Good Tests?
You can't design or execute the right tests without understanding the intended purpose of the system. Testers need to have an insight into the end user's expectations. Communication between the product people at the business end, the engineers working on the code, and the test department enables you to score tests in terms of their importance and work out where each test cycle should be focusing.
We can break it down into three simple queries: why, what, and how.
"Why" is a higher level overview that really ties into the business side. It's the big-picture thinking that reveals why you're building the software in the first place. What audience need is your product fulfilling? For example, we need to build an e-commerce website to sell our product to the general public.
"What" is really focused on individual features or functions of a system. Using a shopping cart analogy for an e-commerce website, you might say that users must be able to add and remove items from their shopping cart, or that they shouldn't be able to add something that's out of stock.
"How" relates to the practical application of your testing. How exactly is the software going to be tested? How is the success and failure measured?
Good tests are always going to cover our trio, but it can be a useful exercise to break things down.
If you get too caught up in the "what" and the "how," it's possible to miss the "why" completely. The "why" is the most important element because it dictates that some tests are more important than others. The business case for developing your software in the first place has to remain front and center throughout the project. If you begin to lose sight of what the end user needs, then you could be focusing your efforts in the wrong places. Delivering value to your customers is critical. Good testers and good tests always retain and use an awareness of what the intended audience wants and expects.
One technique we can employ is risk-based analysis. With risk-based analysis, we arrive at a numerical value for each test, which gives you a sense of its importance. We assign each test a score between 1 and 9. At the top end, a score of 9 indicates a critical test; at the other end of the spectrum, a score of 1 might indicate a test that only needs to be run sparingly. The value is determined by multiplying two factors, each rated from 1 to 3:
- Impact to the user: What are they trying to accomplish and what would the impact be if they couldn't? How critical is this action?
- Probability of failure: How likely is it that this code will fail? This is heavily influenced by how new it is and how much testing it has already undergone.
If we return to our e-commerce website analogy, consider the action of buying goods. Clearly that's essential, so impact to the user would be assigned a 3. The functionality for adding goods to the basket and checking out has been there since day one and has already been tested quite extensively, but some new features have been added that could affect the code, so probability of failure might score a 2. Multiply the two together and you've got a 6. This figure will change over time: probability of failure goes up if this part of the system is significantly altered, and drifts down if it isn't. There's also a discretionary factor that might lead you to bump that 6 up to a 7 if you feel it's merited.
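The scoring described above can be sketched as a small helper. This is an illustrative sketch, not part of the original article; the function name and the clamping behavior are assumptions:

```python
# Risk-based test scoring: impact (1-3) x probability of failure (1-3),
# optionally bumped by a discretionary adjustment. All names and values
# here are illustrative.

def risk_score(impact, probability, adjustment=0):
    """Return a priority score between 1 and 9 (clamped after adjustment)."""
    if not (1 <= impact <= 3 and 1 <= probability <= 3):
        raise ValueError("impact and probability must each be 1, 2, or 3")
    return max(1, min(9, impact * probability + adjustment))

# The checkout example from the text: critical impact (3), stable code
# with some recent changes (2), then bumped from 6 to 7 at the tester's
# discretion.
print(risk_score(impact=3, probability=2, adjustment=1))  # 7
```

Re-running the scoring as the codebase changes keeps the plan honest: the same test naturally rises or falls in priority as its probability-of-failure factor moves.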
Testers come up with specific scenarios of how an end user might interact with the software and what their expected outcome would be. A typical test might consist of many steps detailed in a script, but this approach can cause problems. What if a new tester comes in to run the test? Is the intent of the test clear to them? What if the implementation of the feature changes? Perhaps the steps no longer result in the expected outcome and the test fails, but that doesn't necessarily mean that the software is not working as intended.
The steps and scripts are delving into the "how," but if the "what" is distinct from the "how" then there's less chance of erroneous failure. Allow the tester some room for an exploratory approach and you're likely to get better results. If something can be tightly scripted and you expect it to be a part of your regression testing, then there's an argument for looking at automating it.
Adopting a technique like Specification by Example or Behavior-Driven Development (BDD), you're going to lay each test out in this format:
- Given certain preconditions
- When one or more actions happen
- Then you should get this outcome
Regardless of the specifics of the user interface, or the stops along the way between A and Z, the "Given, When, Then" format covers the essential core of the scenario and ensures that the feature does what it's intended to do, without necessarily spelling out exactly how it should do it. It can be used to generate tables of scenarios that describe the actions, variables, and outcomes to test.
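As a rough sketch, a "Given, When, Then" scenario for the shopping-cart rule from earlier could be written in plain Python. The `ShoppingCart` class here is a hypothetical stand-in for the system under test, invented for illustration:

```python
# Hypothetical system under test, for illustration only.
class ShoppingCart:
    def __init__(self, stock):
        self.stock = stock          # item -> units available
        self.items = []

    def add(self, item):
        if self.stock.get(item, 0) <= 0:
            raise ValueError(f"{item} is out of stock")
        self.items.append(item)

    def remove(self, item):
        self.items.remove(item)


def test_cannot_add_out_of_stock_item():
    # Given a cart and an item with zero units in stock
    cart = ShoppingCart(stock={"widget": 0})
    # When the user tries to add that item
    try:
        cart.add("widget")
        rejected = False
    except ValueError:
        rejected = True
    # Then the add is rejected and the cart stays empty
    assert rejected and cart.items == []


test_cannot_add_out_of_stock_item()
```

Note that the test states the precondition, action, and outcome without prescribing any UI steps, so it survives changes to how the feature is implemented.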
Getting down to the nuts and bolts of how testers will create, document, and run tests, we come to the "how." Since projects are more fluid now and requirements or priorities can change on a daily basis, there needs to be some flexibility in the "how" and a willingness to continually reassess the worth of individual tests. Finding the right balance between automated tests, manually scripted tests, and exploratory testing is an ongoing challenge that's driven by the "what" and the "why."
Traditionally, manual tests have been fully documented as a sequence of steps with an expected result for each step, but this is time-consuming and difficult to maintain. It's possible to borrow from automation by plugging in higher-level actions, or keywords, that refer to a detailed script or a common business action that's understood by the tester. There's also been a move toward exploratory testing, where the intent of the test is defined but the steps are dynamic. Finally, there's a place for disposable testing where you might use a recording tool to quickly document each step in a manual test as you work through it. These tests will need to be redone if anything changes, but as it's a relatively quick process and you're actually testing while you create the test, that's not necessarily a problem.
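One way to picture the keyword idea above: a test becomes a short list of business-level action names, each resolved to a detailed script the tester doesn't have to restate. The keywords and functions below are invented for illustration:

```python
# Keyword-driven sketch: a test is a sequence of high-level action names,
# each mapped to a detailed implementation. All names are illustrative.

def login(state):
    state["user"] = "test-user"

def add_item_to_cart(state):
    state.setdefault("cart", []).append("widget")

def check_out(state):
    assert state.get("cart"), "cannot check out with an empty cart"
    state["order_placed"] = True

KEYWORDS = {
    "Login": login,
    "Add item to cart": add_item_to_cart,
    "Check out": check_out,
}

def run_test(steps):
    state = {}
    for step in steps:
        KEYWORDS[step](state)   # each keyword hides its detailed script
    return state

result = run_test(["Login", "Add item to cart", "Check out"])
print(result["order_placed"])  # True
```

When the implementation of an action changes, only the keyword's script is updated; every manual test that references the keyword stays valid.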
Each individual test should be viewed as an investment. You have to decide whether to run it, whether it needs to be maintained, or if it's time to drop it. You need to continually assess the value and the cost of each test so that you get maximum value from each test cycle. Never lose sight of the business requirements. When a test fails, ask yourself if it's a problem with the system or a problem with the test, and make sure that you always listen to the business people, the software engineers, and most importantly the customer.