By Vu Lam
December 12, 2013 08:09 AM EST
It made headlines for all the wrong reasons when it launched on October 1, but things could have been so different for the HealthCare.gov website if only it had been tested properly before release. Users trying to enroll encountered all sorts of glitches, including very slow page updates, "page not found" errors and frequent crashes.
Early server outages were blamed on an unexpectedly high volume of traffic as nearly 5 million Americans tried to access the website on day one, but it soon emerged that serious flaws existed in the software, and the security was not properly assessed or signed off.
According to CBS, the security testing was never completed. Fox uncovered a testing bulletin from the day before the launch that revealed the site could only handle 1,100 users "before response time gets too high." The Washington Examiner revealed, via an anonymous source, that the full testing was delayed until just a few days before the launch: instead of the four to six months of testing that should have been conducted, the site was tested for only four to six days.
Amid the apologies, the resignations, and the frantic efforts to fix it up by the end of November, there are serious and important lessons to be learned. A proper testing plan with a realistic schedule would have prevented this catastrophe.
Start with an Estimate
It's incredibly rare for any software to be released with zero defects, but major functional bugs and inadequate security are certainly avoidable if you plan correctly. That starts with a realistic estimate of the scope of the testing that's required. The QA department must be consulted and asked to use their experience to provide a picture of how much testing is needed.
That plan will be based on documentation outlining the requirements of the software and discussion with the developers, as well as the wealth of experience that testing professionals possess. If requirements change significantly, or new requests are introduced, then the plan must be altered to cater for that. This is one major area where things obviously went awry. According to the Washington Examiner's source there were "ever-changing, conflicting and exceedingly late project directions. The actual system requirements for Oct. 1 were changing up until the week before."
This is a clear recipe for disaster.
Modern software development is typically based on Agile methodology where requirements are built into the system quickly and feedback informs the project going forward. This approach does not mesh with traditional testing where testers would work out a comprehensive test plan based on detailed documentation, and then carry out that testing in a predefined block at the end of the project.
To adapt testing for modern software development it pays to get testers involved earlier in the process. They need to understand the system and really identify with the end user. It's much more cost effective to fix flaws and bugs sooner rather than later.
There's a logistical consideration as well. Each new build means a full regression test, bug fix verification, and a healthy dose of exploratory testing to make sure the new features are working as intended. It's important for the test team to scale up as the amount of work grows, and as much of the regression testing as possible should be automated to reduce the workload.
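To make that concrete, here is a minimal sketch of what one automated regression check might look like, written in Python with pytest and the requests library. The staging URL, endpoints, and response-time budget are illustrative assumptions, not details of the actual HealthCare.gov system:

# A minimal, hypothetical regression suite using pytest and requests.
# The base URL, endpoints, and thresholds are illustrative assumptions,
# not taken from the real HealthCare.gov system.
import requests
import pytest

BASE_URL = "https://staging.example-exchange.gov"  # hypothetical staging host
MAX_RESPONSE_SECONDS = 3.0  # assumed service-level target

@pytest.mark.parametrize("path", ["/", "/login", "/plans/search"])
def test_page_loads_within_budget(path):
    """Pages that crashed or stalled at launch should return 200, quickly."""
    response = requests.get(BASE_URL + path, timeout=10)
    assert response.status_code == 200
    assert response.elapsed.total_seconds() < MAX_RESPONSE_SECONDS

def test_unknown_page_returns_404_not_crash():
    """A bad URL should yield a clean 404, not a server error."""
    response = requests.get(BASE_URL + "/no-such-page", timeout=10)
    assert response.status_code == 404

Run on every new build, for example from a continuous integration job, a suite like this turns the repetitive core of regression testing into a few minutes of machine time, freeing the testers for the exploratory work described next.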
With fast-paced development it is absolutely vital to get experienced testers on board and have them perform some level of exploratory testing. This combines their knowledge about how the system should work with educated guesses about where it might fail. It's also very useful when documentation is lacking, because testers can effectively design and execute tests at the same time.
Targeted exploratory testing is the perfect complement to scripted testing. It requires some creative thinking and some freedom for the tester, but it can be a great way of emulating an end user and ensuring that specific features and functions actually deliver what they're supposed to. Properly recorded by good cloud-based testing tools, the data can be used to provide clarity for developers trying to fix problems, and it can serve as the basis of scripted testing or even automated tests in the future.
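As a hedged illustration of that last point, the sketch below replays steps that a recording tool might have captured during an exploratory session as a repeatable script. The step format, the paths, and the host are invented for this example rather than drawn from any particular tool:

# Hypothetical sketch: replaying steps captured during an exploratory
# session as a repeatable scripted test. The step format and the
# recorded session itself are invented for illustration.
import requests

# Steps as a recording tool might have captured them during exploration.
RECORDED_SESSION = [
    {"action": "GET", "path": "/plans/search?zip=99999", "expect_status": 200},
    {"action": "GET", "path": "/plans/search?zip=abcde", "expect_status": 400},
]

def replay(base_url, steps):
    """Re-execute each recorded step and verify the observed outcome."""
    for step in steps:
        response = requests.request(step["action"], base_url + step["path"], timeout=10)
        assert response.status_code == step["expect_status"], (
            f"{step['path']}: expected {step['expect_status']}, "
            f"got {response.status_code}"
        )

if __name__ == "__main__":
    replay("https://staging.example-exchange.gov", RECORDED_SESSION)  # hypothetical host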
A project such as this, where disparate teams have to work together toward a common goal, can be an integration nightmare. Test management can be a real challenge, so the right tool is invaluable. The full lifecycle of every defect or requirement should be recorded to produce a clear chain from the original feature request, through the test case, to the defect, and on to repeated test cycles. It has to be clear who is responsible for each action every step of the way, so the blame game can be avoided entirely.
The ultimate aim is traceability, usability, and transparency.
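One way to picture that chain is as a linked data model. The Python sketch below is an assumption about how such records could be structured, not the schema of any particular test management tool; the field names and sample records are invented:

# Illustrative data model for the requirement -> test case -> defect chain.
# Fields and ownership rules are assumptions, not a specific tool's schema.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Requirement:
    req_id: str
    description: str
    owner: str  # who is responsible for this step

@dataclass
class TestCase:
    case_id: str
    requirement: Requirement  # traces back to the original feature request
    owner: str

@dataclass
class Defect:
    defect_id: str
    found_by_case: TestCase   # traces back through the test case
    owner: str
    retest_cycles: List[str] = field(default_factory=list)  # repeated test cycles

# Walking a defect back to its requirement gives the "clear chain":
req = Requirement("REQ-101", "Applicant can browse plans by ZIP code", "product")
case = TestCase("TC-0042", req, "qa")
bug = Defect("BUG-0907", case, "dev", retest_cycles=["cycle-3", "cycle-4"])
print(bug.found_by_case.requirement.req_id)  # -> REQ-101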
If this data is gathered then it becomes easier to apply root cause analysis at a later date and discover where things went wrong. Remember that the earlier you can catch and fix the defect, the cheaper and easier it is to do. Identifying the root causes of the problems with the HealthCare.gov website requires an objective analysis of the original requirements, the documentation, the code implementation and integration, the test planning, and the test cycles. Understanding what went wrong through this process could ensure that the same mistakes are not made again in the future.
Knowing When to Pull the Trigger
Kathleen Sebelius, the Health and Human Services secretary, apologized for her part in the botched website launch, but the real problem, and her cardinal sin, was telling President Obama that the website was ready to launch in the first place.
QA departments are not the gatekeepers for projects; business decisions are always going to trump everything else, and the pressure to deliver ensures that every project launches with some defects in it. But you ignore those defects at your peril. If the testers had been consulted about the state of the website and the back end before launch, you can bet they would have pointed out that it wasn't ready for prime time. A one- or two-month delay would undoubtedly have been greeted with some alarm and criticism, but it would have caused far less damaging PR than releasing an unfinished and potentially insecure product.
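That go/no-go consultation can even be made explicit as a simple release gate. In the hypothetical Python sketch below, the metric names and thresholds are invented; the 1,100-user figure comes from the reported testing bulletin, while the defect count and expected peak are assumptions for illustration:

# Hypothetical release-readiness gate: metric names and thresholds are
# invented for illustration, not an actual launch checklist.
def ready_to_launch(metrics: dict) -> bool:
    """Return True only if every minimum quality bar is met."""
    return (
        metrics["open_critical_defects"] == 0
        and metrics["security_signoff_complete"]
        and metrics["load_tested_concurrent_users"] >= metrics["expected_peak_users"]
    )

# With figures resembling those reported before launch, the gate fails:
launch_day = {
    "open_critical_defects": 12,            # assumed figure
    "security_signoff_complete": False,     # reported: never completed
    "load_tested_concurrent_users": 1_100,  # figure from the testing bulletin
    "expected_peak_users": 50_000,          # assumed day-one concurrency
}
print(ready_to_launch(launch_day))  # -> False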