



Security vs. Compliance in the Cloud

To codify data security and privacy protection, the industry turns to auditable standards


Security is always top of mind for CIOs and CSOs when considering a cloud deployment. An earlier post described the main security challenges companies face in moving applications to the cloud and how CloudSwitch technology simplifies the process. In this post, I’d like to dig a little deeper into cloud security and the standards used to determine compliance.

To codify data security and privacy protection, the industry turns to auditable standards, most notably SAS 70, as well as PCI, HIPAA, and ISO 27002. Each one comes with controls in a variety of categories that govern the operation of a cloud provider’s data center as well as the applications you want to put there. But what does compliance really mean? For example, is SAS 70 Type II good enough for your requirements, or do you need PCI? How can your company evaluate the different security claims and make a sound decision?

SAS 70 (Types I and II)
SAS 70 is a well-known auditing standard that features prominently in many compliance discussions. It encompasses a variety of controls in different categories (physical security, application security, security policies and processes, etc.). SAS 70 is not a specific set of standards; instead, service organizations such as cloud providers choose their own controls and the objectives those controls are intended to achieve. With SAS 70 Type I, an independent auditor evaluates the design of the controls at a point in time and issues an opinion, while the more coveted Type II also tests whether those controls operated effectively over a period of at least six months. Accordingly, many providers will state that they comply with Type I and that a Type II evaluation is underway.

SAS 70 has some wiggle room, and you have to dig a little deeper to determine what the certification really involves. The savvy cloud customer will want to know not just whether a cloud is SAS 70 Type II compliant, but which controls the provider selected in order to get there. This is a question people rarely ask, and under SAS 70 guidelines, service providers have no obligation to answer it. As a result, the level of transparency varies. Some providers may be quite willing to share the audit report describing their controls, objectives, and methods. Others will explain that the information is confidential and that releasing it would expose company secrets. Or some types of control information may be freely available and others off-limits.

PCI (and HIPAA)
A second major security standard in cloud computing is PCI. As the data security standard required by Visa, MasterCard, and the other major card brands, PCI comes with a known set of required controls, making it inherently more prescriptive than SAS 70, where controls are determined by the service provider. The inference is that PCI implies stronger security than SAS 70 (and can command higher pricing). However, this is not cast in stone; it depends on the SAS 70 controls the service provider has chosen. Because of its more rigid compliance requirements, PCI certification is usually harder to achieve than SAS 70. HIPAA is often mentioned in the same breath, but it is a separate regulation governing healthcare data, not a subset of PCI; the two share many controls, so a PCI-compliant cloud will satisfy many, though not necessarily all, HIPAA requirements.

Compliance Building Blocks
Regardless of which standard is used, achieving compliance to run an application in a cloud involves building blocks, with the cloud provider’s physical infrastructure providing the foundation. Infrastructure controls include obvious things like protecting the facility from natural disasters, ensuring reliable electrical power (such as backup distribution systems) in the event of outages, and backing up data in the event of a hardware failure. They also include controls governing the cloud provider’s processes and policies, such as who is authorized to access the data center and how internal security reviews are performed and reported.

Sitting on top of the infrastructure controls is a separate set of application controls. Multiple levels of security are required: for example, the transport layer must be secured, and data must be encrypted once it leaves the data center, with the encryption keys under enterprise control. An application might meet SAS 70 or other standards within a company’s data center but not when it’s moved to a cloud, because of exposures that may exist there or along the way. Likewise, a SAS 70 Type II application in the cloud may not meet its controls if moved back to the enterprise data center, and could require a re-audit.
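To make the application-level requirement concrete, here is a minimal sketch in Python of the general pattern (not CloudSwitch’s actual mechanism): data is encrypted before it leaves the enterprise, so the cloud provider only ever stores ciphertext, and the key never travels to the cloud. The key handling shown, with a locally generated key standing in for an enterprise key store, is illustrative only.

```python
# Minimal sketch: encrypt client-side before data leaves the data center,
# keeping the key under enterprise control. Paths/names are illustrative.
from cryptography.fernet import Fernet

# Generated once inside the enterprise; in practice, loaded from your
# own key store rather than created inline. Never shipped to the cloud.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"customer_id=1234,balance=..."
ciphertext = cipher.encrypt(record)   # safe to upload to cloud storage

# Later, back inside the enterprise perimeter:
plaintext = cipher.decrypt(ciphertext)
assert plaintext == record
```

Because the provider holds only ciphertext, its staff and systems fall outside the set of parties who can read the data, which is exactly the separation of controls discussed below.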

Deploying to the Cloud
There is a difference between compliance standards and what a company needs in order to feel secure. For data and applications with regulatory requirements, compliance standards and audits are mandatory. For these types of applications, we’re still in the very early days of cloud computing; let’s face it, no company is going to put critical regulated applications into the cloud without the ability to conduct complete end-to-end audits. However, even for applications that do not require compliance, enterprises want to know that their data and applications are protected. Achieving security in these environments is where CloudSwitch is focused.

Cloud computing creates a division of responsibility between the cloud provider and the cloud customer. While the cloud provider needs to address infrastructure operation and protection, the customer is responsible for ensuring compliance for their application, and ultimately for the overall solution. The central idea here is to keep the controls separated between the cloud provider’s infrastructure and the customer’s application. If the controls mix, for example when the cloud provider has access to stored data, things get very complicated. When this occurs, you have to worry about who in the cloud provider’s organization has access to your data, how and when they can access it, and how this access is audited and controlled. If the provider is opaque, you can’t know. Even if the cloud provider is more transparent about its access policies, you have to evaluate those controls against your own standards and potentially adjust your own controls in response. Further, you have to adjust to all changes in the cloud provider’s processes over time.

By keeping your systems isolated from the cloud provider’s infrastructure, you can minimize this mixing of controls. Placing protection mechanisms into your resources in the cloud can ensure that data moving across the cloud provider’s networks, and all data stored in its systems, is encrypted. Combined with external key storage and management, your applications can be separated from the cloud provider’s infrastructure. This still requires that the cloud provider run its data center with proper physical security, power management, and so on, but it can greatly enhance the application-level security that the enterprise needs. Finally, this separation can simplify the process of achieving compliance at the application level when running in the cloud. The isolation layer can address a number of the data protection controls by providing a uniform and repeatable process for encrypting data.
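One common way to implement such an isolation layer, sketched here as the generic envelope-encryption pattern rather than CloudSwitch’s own product behavior, is to give each stored object its own data key and wrap that key under a master key held in an external, enterprise-controlled key manager. All names and the in-memory master key below are illustrative.

```python
# Envelope-encryption sketch: per-object data keys, wrapped under a
# master key that lives outside the cloud. Names are illustrative.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

master_key = AESGCM.generate_key(bit_length=256)  # held by the enterprise

def encrypt_for_cloud(plaintext: bytes) -> dict:
    data_key = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)
    ciphertext = AESGCM(data_key).encrypt(nonce, plaintext, None)
    # Wrap the data key under the master key so it can be stored next to
    # the ciphertext; the provider never sees either key in the clear.
    wrap_nonce = os.urandom(12)
    wrapped_key = AESGCM(master_key).encrypt(wrap_nonce, data_key, None)
    return {"nonce": nonce, "ciphertext": ciphertext,
            "wrap_nonce": wrap_nonce, "wrapped_key": wrapped_key}

def decrypt_from_cloud(blob: dict) -> bytes:
    data_key = AESGCM(master_key).decrypt(
        blob["wrap_nonce"], blob["wrapped_key"], None)
    return AESGCM(data_key).decrypt(blob["nonce"], blob["ciphertext"], None)

blob = encrypt_for_cloud(b"stored in the provider's systems")
assert decrypt_from_cloud(blob) == b"stored in the provider's systems"
```

The design choice that matters for compliance is that the unwrap step, and therefore every plaintext access, can only happen where the master key lives, giving auditors a single, repeatable control point.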

The days of cloud computing are just beginning, but with the right combination of cloud providers and additional technologies, it’s not too early to start doing real work in the cloud and to reap the benefits of this new computing paradigm. Our early customers are doing it, and so can you.


More Stories By Ellen Rubin

Ellen Rubin is the Founder & VP Products at CloudSwitch. She's an experienced entrepreneur with a proven track record in founding innovative technology companies and leading strategy, market positioning, and go-to-market. Prior to founding CloudSwitch, Ellen was a member of the early management team at Netezza (NYSE: NZ), the pioneer and market leader in data warehouse appliances, where she helped grow the company to over $125M in revenues and a successful IPO in 2007. Before Netezza, she founded Manna, an Israeli- and Boston-based developer of real-time personalization software. Rubin began her career as a marketing strategy consultant at Booz, Allen & Hamilton, and holds an MBA from Harvard Business School and an undergraduate degree from Harvard College.
