
Containers Expo Blog: Blog Feed Post

Security vs. Compliance in the Cloud

To codify data security and privacy protection, the industry turns to auditable standards


Security is always top of mind for CIOs and CSOs when considering a cloud deployment. An earlier post described the main security challenges companies face in moving applications to the cloud and how CloudSwitch technology simplifies the process. In this post, I’d like to dig a little deeper into cloud security and the standards used to determine compliance.

To codify data security and privacy protection, the industry turns to auditable standards, most notably SAS 70 as well as PCI, HIPAA and ISO 27002. Each one comes with controls in a variety of categories that govern operation of a cloud provider’s data center as well as the applications you want to put there. But what does compliance really mean? For example, is SAS 70 type II good enough for your requirements, or do you need PCI? How can your company evaluate the different security claims and make a sound decision?

SAS 70 (Types I and II)
SAS 70 is a well-known auditing standard that features prominently in many compliance discussions. It encompasses a variety of controls in different categories (physical security, application security, security policies and processes, etc.). SAS 70 is not a specific set of standards; instead, service organizations such as cloud providers are responsible for choosing their own controls and the goals those controls are intended to achieve. With SAS 70 Type I, an independent auditor evaluates the controls and issues an opinion, while the more coveted Type II is based on at least six months of active operating data. Accordingly, many providers will state that they are compliant with Type I and that a Type II evaluation is underway.

SAS 70 has some wiggle room, and you have to dig a little deeper to determine what the certification really involves. The savvy cloud customer will want to know not just whether a cloud is SAS 70 Type II compliant, but which controls the provider selected in order to get there. This is a question people don't normally ask, and under SAS 70 guidelines, service providers have no obligation to answer it. Thus, the level of transparency varies. Some providers may be quite willing to share the audit report describing their controls, objectives and methods. Others will explain that the information is confidential and that releasing it would expose company secrets. Or some types of control information may be freely available and others off-limits.

PCI (and HIPAA)
A second major security standard in cloud computing is PCI, the Payment Card Industry Data Security Standard mandated by the major card brands, including Visa and Mastercard. PCI has a known set of required controls, making it inherently more prescriptive than SAS 70, where controls are determined by the service provider. The inference is that PCI means stronger security than SAS 70 (and can command higher pricing); however, this is not cast in stone, since it depends on the SAS 70 controls the service provider has chosen. Due to its more rigid compliance requirements, PCI certification is usually harder to achieve than SAS 70. HIPAA, which governs protected health information, is sometimes lumped in with PCI, but it is a separate regulation, not a subset of PCI: the two overlap in many technical controls, yet PCI compliance does not automatically confer HIPAA compliance.

Compliance Building Blocks
Regardless of which standard is used, achieving compliance to run an application in a cloud involves building blocks, with the cloud provider’s physical infrastructure providing the foundation. Infrastructure controls include obvious things like protecting the facility from natural disasters, assuring reliable electrical power (such as backup distribution systems) in the event of outages, and backing up data in the event of a hardware failure. They also include controls governing the cloud provider’s processes and policies such as employee authorization to access the data center and how internal security reviews are performed and reported.

Sitting on top of the infrastructure controls is a separate set of application controls. Multiple levels of security are required: for example, the transport channels must be secured, and data must be encrypted whenever it leaves the data center, with the encryption keys under enterprise control. An application might meet SAS 70 or other standards within a company's data center but not when it's moved to a cloud, because of exposures that may exist there or along the way. Likewise, a SAS 70 Type II application in the cloud may not meet the controls if moved back to the enterprise data center, and could require a re-audit.
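One way to picture this is as set arithmetic: a standard requires a set of controls, and each environment implements some subset, so the same application can pass in one environment and show gaps in another. The sketch below is a minimal illustration; the control names are hypothetical and are not taken from the actual SAS 70 or PCI control catalogs.

```python
# Toy model: controls live in the environment, so an application that is
# compliant in the enterprise data center can have gaps in a public cloud.
# Control names here are hypothetical, not real SAS 70 / PCI controls.

REQUIRED = {"physical_access", "transport_encryption",
            "at_rest_encryption", "enterprise_key_management"}

environments = {
    "enterprise_dc": {"physical_access", "transport_encryption",
                      "at_rest_encryption", "enterprise_key_management"},
    "public_cloud":  {"physical_access", "transport_encryption"},
}

for name, implemented in environments.items():
    gaps = REQUIRED - implemented          # controls an audit would flag
    status = "compliant" if not gaps else f"gaps: {sorted(gaps)}"
    print(f"{name}: {status}")
```

Moving the application (in either direction) changes the `implemented` set, which is exactly why a re-audit can be required even though the application itself is unchanged.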

Deploying to the Cloud
There is a difference between compliance standards and what a company needs to feel secure. For data and applications that have regulatory requirements, compliance standards and audits are mandatory. For these types of applications, we’re still in the very early days for cloud computing—let’s face it, no company is going to put critical regulated applications into the cloud without the ability to conduct complete end-to-end audits. However, even for applications that do not require compliance, enterprises want to know that their data and applications are protected. Achieving security in these environments is where CloudSwitch is focused.

Cloud computing creates a division of responsibility between the cloud provider and the cloud customer. While the cloud provider needs to address infrastructure operation and protection, the customer is responsible for ensuring compliance for their application, and ultimately for the overall solution. The central idea here is to keep the controls separated between the cloud provider's infrastructure and the customer's application. If the controls mix, for example when the cloud provider has access to stored data, things get very complicated. When this occurs, you have to worry about who in the cloud provider's organization has access to your data, how and when they can access it, and how this access is audited and controlled. If the provider is opaque, you can't know. Even if the cloud provider is more transparent about its access policies, you have to evaluate those controls against your own standards and potentially adjust your own controls in response. Further, you have to track every change in the cloud provider's processes over time.
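One way to reason about this division is to tag each control with an owner and flag any control claimed by both parties: those "mixed" controls are the ones that force you to audit the provider's internal processes. A minimal sketch, again with made-up control names:

```python
# In the clean case, every control is owned by exactly one party.
# A control appearing in both sets is "mixed" and complicates the audit,
# e.g. stored data that both provider staff and the customer can reach.

provider_controls = {"facility_security", "power_backup",
                     "hypervisor_patching", "storage_access"}
customer_controls = {"data_encryption", "key_management",
                     "app_authentication", "storage_access"}

mixed = provider_controls & customer_controls
if mixed:
    print("mixed controls needing joint audit:", sorted(mixed))
else:
    print("clean separation of responsibilities")
```

The design goal the post describes is to drive `mixed` to the empty set, so that the provider's controls and the customer's controls can be audited independently.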

By keeping your systems isolated from the cloud provider's infrastructure, you can minimize this mixing of controls. Placing protection mechanisms into your resources in the cloud can ensure that data moving across the cloud provider's networks, and all data stored in its systems, is encrypted. Combined with external key storage and management, your applications can be separated from the cloud provider's infrastructure. This still requires that the cloud provider run its data center with proper physical security, power management, and so on, but it can greatly enhance the application-level security that the enterprise needs. Finally, this separation can simplify the process of achieving compliance at the application level when running in the cloud, since the isolation layer can address a number of the data-protection controls by providing a uniform and repeatable process for encrypting data.
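As a sketch of what "ciphertext in the cloud, keys in the enterprise" can look like, the toy below uses a one-time-pad XOR purely for illustration; a real deployment would use AES through a vetted library and a hardened key-management system rather than an in-memory dict. The names (`protect`, `recover`, the key store) are illustrative, not an actual CloudSwitch API. The point is the separation: the cloud side only ever holds ciphertext, and the key never leaves the enterprise store.

```python
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    """Toy cipher: XOR with a random key of equal length (one-time pad)."""
    return bytes(d ^ k for d, k in zip(data, key))

# --- enterprise side: keys never leave this store ---
enterprise_key_store = {}

def protect(record_id: str, plaintext: bytes) -> bytes:
    key = secrets.token_bytes(len(plaintext))   # per-record data key
    enterprise_key_store[record_id] = key       # stays on premises
    return xor(plaintext, key)                  # only this goes to the cloud

def recover(record_id: str, ciphertext: bytes) -> bytes:
    return xor(ciphertext, enterprise_key_store[record_id])

# --- cloud side: sees ciphertext only ---
cloud_storage = {}
cloud_storage["cust-42"] = protect("cust-42", b"card ending 4242")

assert recover("cust-42", cloud_storage["cust-42"]) == b"card ending 4242"
```

The design choice worth noting is that a compromise of `cloud_storage` alone yields nothing readable, so the audit surface for the data-at-rest controls shrinks to the enterprise key store.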

The days of cloud computing are just beginning, but with the right combination of cloud providers and additional technologies, it’s not too early to start doing real work in the cloud and to reap the benefits of this new computing paradigm. Our early customers are doing it, and so can you.


More Stories By Ellen Rubin

Ellen Rubin is the Founder & VP Products at CloudSwitch. She's an experienced entrepreneur with a proven track record in founding innovative technology companies and leading strategy, market positioning and go-to-market. Prior to founding CloudSwitch, Ellen was a member of the early management team at Netezza (NYSE: NZ), the pioneer and market leader in data warehouse appliances, where she helped grow the company to over $125M in revenues and a successful IPO in 2007. Prior to Netezza, she founded Manna, an Israeli and Boston-based developer of real-time personalization software. Rubin began her career as a marketing strategy consultant at Booz, Allen & Hamilton, and holds an MBA from Harvard Business School and an undergraduate degree from Harvard College.
