By Vu Lam
December 12, 2013 08:09 AM EST
It made headlines for all the wrong reasons when it launched on October 1, but things could have been so different for the HealthCare.gov website if only it had been tested properly before release. Users trying to enroll encountered all sorts of glitches, including very slow page updates, "page not found" errors and frequent crashes.
Early server outages were blamed on an unexpectedly high volume of traffic as nearly 5 million Americans tried to access the website on day one, but it soon emerged that serious flaws existed in the software, and the security was not properly assessed or signed off.
According to CBS, the security testing was never completed. Fox uncovered a testing bulletin from the day before the launch revealing that the site could only handle 1,100 users "before response time gets too high." The Washington Examiner revealed, via an anonymous source, that full testing was delayed until just days before the launch: instead of the 4 to 6 months of testing that should have been conducted, the site was tested for only 4 to 6 days.
Amid the apologies, the resignations, and the frantic efforts to fix it up by the end of November, there are serious and important lessons to be learned. A proper testing plan with a realistic schedule would have prevented this catastrophe.
Start with an Estimate
It's incredibly rare for any software to be released with zero defects, but major functional bugs and inadequate security are certainly avoidable if you plan correctly. That starts with a realistic estimate of the scope of the testing required. The QA department must be consulted and asked to use its experience to provide a picture of how much testing is needed.
That plan will be based on documentation outlining the requirements of the software and discussion with the developers, as well as the wealth of experience that testing professionals possess. If requirements change significantly, or new requests are introduced, then the plan must be altered to cater for that. This is one major area where things obviously went awry. According to the Washington Examiner's source there were "ever-changing, conflicting and exceedingly late project directions. The actual system requirements for Oct. 1 were changing up until the week before."
This is a clear recipe for disaster.
Modern software development is typically based on Agile methodology where requirements are built into the system quickly and feedback informs the project going forward. This approach does not mesh with traditional testing where testers would work out a comprehensive test plan based on detailed documentation, and then carry out that testing in a predefined block at the end of the project.
To adapt testing for modern software development it pays to get testers involved earlier in the process. They need to understand the system and really identify with the end user. It's much more cost effective to fix flaws and bugs sooner rather than later.
There's a logistical consideration as well. Each new build means a full regression test, bug fix verification, and a healthy dose of exploratory testing to make sure the new features are working as intended. It's important for the test team to scale up as the amount of work grows, and as much of the regression testing as possible should be automated to reduce the workload.
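Automating the regression suite means recording known-good behavior from the last release and re-checking it against every new build. A minimal sketch of that pattern, with an entirely hypothetical validator and sample cases standing in for real enrollment logic:

```python
# Minimal automated regression sketch. The function under test and the
# recorded cases here are hypothetical illustrations, not HealthCare.gov code.

def validate_zip(zip_code: str) -> bool:
    """Toy enrollment-form validator standing in for the system under test."""
    return len(zip_code) == 5 and zip_code.isdigit()

# Expectations recorded from the previous known-good build ("golden" results).
REGRESSION_CASES = [
    ("90210", True),
    ("1234", False),
    ("abcde", False),
]

def run_regression_suite():
    """Return every case where the new build disagrees with the recorded result."""
    return [(inp, expected, validate_zip(inp))
            for inp, expected in REGRESSION_CASES
            if validate_zip(inp) != expected]

if __name__ == "__main__":
    failures = run_regression_suite()
    print("PASS" if not failures else f"FAIL: {failures}")
```

Because the suite is just data plus a runner, it can be re-executed on every build at near-zero cost, which is exactly what makes per-build full regression feasible.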
In a fast-paced development, it is absolutely vital to bring in experienced testers and have them perform some level of exploratory testing. This combines their knowledge of how the system should work with educated guesses about where it might fail. It's also very useful when documentation is lacking, because testers can effectively design and execute tests at the same time.
Targeted exploratory testing is the perfect complement to scripted testing. It requires some creative thinking and some freedom for the tester, but it can be a great way of emulating an end user and ensuring that specific features and functions actually deliver what they're supposed to. Properly recorded by good cloud-based testing tools, the data can be used to provide clarity for developers trying to fix problems, and it can serve as the basis of scripted testing or even automated tests in the future.
A project such as this, where disparate teams have to work together toward a common goal, can be an integration nightmare. Test management can be a real challenge, so the right tool is invaluable. The full lifecycle of every defect or requirement should be recorded to produce a clear chain from the original feature request, through the test case, to the defect, and on to repeated test cycles. It has to be clear who is responsible for each action every step of the way, so the blame game can be avoided entirely.
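That requirement-to-test-to-defect chain is straightforward to model as a record. A minimal sketch, with hypothetical field names and IDs (any real test management tool will have its own schema):

```python
# Hypothetical defect-lifecycle record illustrating the traceability chain
# described above: feature request -> test case -> defect -> test cycles.
from dataclasses import dataclass, field

@dataclass
class Defect:
    defect_id: str
    requirement_id: str   # links back to the original feature request
    test_case_id: str     # the test that uncovered the defect
    owner: str            # who is responsible for the next action
    status: str = "open"  # e.g. open -> fixed -> verified
    cycles: list = field(default_factory=list)  # test cycles it appeared in

def trace(defect: Defect) -> str:
    """Render the requirement -> test case -> defect chain for a report."""
    return (f"{defect.requirement_id} -> {defect.test_case_id} -> "
            f"{defect.defect_id} ({defect.owner}, {defect.status})")

print(trace(Defect("DEF-12", "REQ-4", "TC-9", "alice")))
```

With every defect carrying its requirement and test case IDs plus an owner, root cause analysis becomes a query rather than an argument.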
The ultimate aim is traceability, usability, and transparency.
If this data is gathered then it becomes easier to apply root cause analysis at a later date and discover where things went wrong. Remember that the earlier you can catch and fix the defect, the cheaper and easier it is to do. Identifying the root causes of the problems with the HealthCare.gov website requires an objective analysis of the original requirements, the documentation, the code implementation and integration, the test planning, and the test cycles. Understanding what went wrong through this process could ensure that the same mistakes are not made again in the future.
Knowing When to Pull the Trigger
Kathleen Sebelius, the Health and Human Services secretary, apologized for her part in the botched website launch, but the real problem, and her cardinal sin, was telling President Obama that the website was ready to launch in the first place.
QA departments are not the gatekeepers for projects; business decisions will always trump everything else, and the pressure to deliver ensures that every project launches with defects in it. But you ignore those defects at your peril. If the testers had been consulted about the state of the website and the back end before launch, you can bet they would have pointed out that it wasn't ready for prime time. A one- or two-month delay would undoubtedly have been greeted with some alarm and criticism, but it would have caused far less damaging PR than releasing an unfinished and potentially insecure product.
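One way to put test results in front of the launch decision is an objective go/no-go gate. A sketch of that idea, where the metric names and thresholds are illustrative assumptions (the 1,100-user figure comes from the pre-launch bulletin cited above; the rest are invented for the example):

```python
# Hypothetical release-readiness gate. Metric names and thresholds are
# illustrative assumptions, not real HealthCare.gov launch criteria.

def ready_to_launch(metrics: dict) -> bool:
    """Return True only if every launch criterion is met."""
    return (
        metrics["regression_pass_rate"] >= 0.98      # near-clean regression run
        and metrics["open_critical_defects"] == 0    # no showstoppers
        and metrics["peak_users_supported"] >= metrics["expected_peak_users"]
    )

day_one = {
    "regression_pass_rate": 0.91,   # illustrative
    "open_critical_defects": 7,     # illustrative
    "peak_users_supported": 1_100,  # from the pre-launch testing bulletin
    "expected_peak_users": 50_000,  # illustrative estimate
}
print("GO" if ready_to_launch(day_one) else "NO-GO")  # prints "NO-GO"
```

A gate like this doesn't take the decision away from the business; it just makes the state of the software impossible to misreport up the chain.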