@CloudExpo: Article

Best Practices to Ensure Security in the Private Cloud

A private cloud environment significantly reduces risk by providing secure, multi-layer segmentation of client access and data

As regulatory oversight across the financial landscape continues to drive greater transparency and stricter penalties, outsourcing to the private cloud has become an integral resource for hedge fund and private equity managers. Cloud infrastructure services are now synonymous with increased efficiency, decreased costs and added security. However, security in particular remains a key concern for many financial services firms. The costs a cloud services provider can incur in dealing with a security breach, both financially and to its reputation, can be devastating.

Infrastructure providers, particularly those catering to financial services firms such as hedge funds, must have strict policies in place and employ best practices to ensure that their clients receive the same level of security as they would achieve with an on-site network. While most participants in the financial services industry are familiar with the benefits that cloud computing offers in terms of efficiency, scalability and cost savings, two benefits that are often overlooked are increased security protection and risk mitigation.

The key differentiator between launching an in-house network and outsourcing to a hosted services provider is that service providers enjoy economies of scale that let them deploy institutional-strength security services to keep the client's environment protected and secure. A large portion of a cloud provider's spending goes directly into measures that ensure the highest levels of security and data protection, typically including advanced intrusion detection, traffic monitoring, forensic analysis and incident history/investigation. These systems and processes can cost hundreds of thousands or even millions of dollars, so they are rarely deployed by a hedge fund or private equity firm's in-house IT staff.

One of the major advantages of a private cloud environment is that it can significantly reduce risk by providing secure, multi-layer segmentation of client access and data. When evaluating cloud providers, financial services firms should keep a few key factors in mind. The first is the location of the data. Clients will always have questions about where their data is stored, who can gain access to it and how that access is secured. This may be the most important consideration, yet it is commonly overlooked when potential clients review data security. Most data breaches do not take place via cyber-attack; instead, they occur when hard disks or backup tapes are misplaced or stolen. A common best practice for an on-site server, for example, is to rotate backup tapes off-site, and every rotation is a chance for media to go astray.

Consideration must also be given to the concept of physical servers versus a shared environment. In a service provider's data center, multiple companies will share services on the same infrastructure, which in some cases may raise a red flag in the mind of a CFO or CTO. When resources in a data center are shared, security and segregation must be guaranteed at every layer, from the server to the network to the storage.
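The segregation guarantee described above can be thought of as an invariant: no infrastructure resource (a VLAN, a storage LUN, a hypervisor datastore) is ever mapped to more than one tenant. A minimal sketch of such an audit check in Python, with hypothetical tenant and resource names:

```python
# Sketch of a multi-tenant segregation audit. The tenant names and
# resource IDs are illustrative, not any particular provider's inventory.
from collections import defaultdict

def find_shared_resources(allocations):
    """allocations maps tenant name -> set of resource IDs it is assigned.

    Returns a dict of resource ID -> set of tenants, for every resource
    that is (incorrectly) assigned to more than one tenant.
    """
    owners = defaultdict(set)
    for tenant, resources in allocations.items():
        for resource in resources:
            owners[resource].add(tenant)
    return {r: tenants for r, tenants in owners.items() if len(tenants) > 1}
```

Run against a resource inventory, an empty result means segregation holds at that layer; any entry it returns is a red flag worth raising with the provider.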

The network is the next factor that must be considered. Methods such as data encryption - where files are encrypted prior to transmission - can prevent data from being used should it be compromised at any point in transit. The hosted service provider is responsible for supplying the firm with a storage solution that provides secure data segmentation and enables rapid resource allocation. It should also provide high data availability and disaster recovery, particularly after what Wall Street firms experienced during Hurricane Sandy in October 2012. Service providers must be able to offer data replication for off-site backup and archiving in the case of an emergency. Protecting the firm against natural disasters and intrusions is now a major deciding factor for financial decision-makers.
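To illustrate the encrypt-before-transmission idea, here is a minimal Python sketch using the third-party `cryptography` package's Fernet recipe (an assumption on our part - any vetted authenticated cipher serves the same purpose; the function names are illustrative):

```python
# Sketch: encrypt data client-side before it ever leaves the firm, so a
# compromise in transit or at rest yields only ciphertext. Assumes the
# third-party `cryptography` package; key management is out of scope here.
from cryptography.fernet import Fernet

def new_key() -> bytes:
    # In practice the key would live in an HSM or key-management service.
    return Fernet.generate_key()

def encrypt_for_upload(plaintext: bytes, key: bytes) -> bytes:
    # Fernet tokens are authenticated: tampering is detected on decrypt.
    return Fernet(key).encrypt(plaintext)

def decrypt_after_download(token: bytes, key: bytes) -> bytes:
    return Fernet(key).decrypt(token)
```

Because the provider only ever handles the ciphertext, a misplaced disk or intercepted transfer exposes nothing usable without the key.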

Another factor now emerging as standard business practice, given the number of executives who are constantly on the go, is the management of mobile devices. In today's fast-paced business environment, mobile devices essentially serve as an extension of a firm's offices, so they should be incorporated into all security measures. A service provider should take the necessary steps to actively manage these resources, including implementing and enforcing a password policy and being able to remotely wipe a device of all information if it is lost or stolen.
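A mobile device policy like the one described can be sketched as a simple compliance check. The device attributes and action names below are hypothetical, not any particular MDM product's API:

```python
# Sketch of mobile device management decision logic: wipe lost devices,
# quarantine non-compliant ones, allow the rest. Attribute names are
# illustrative assumptions, not a real MDM vendor's schema.
from dataclasses import dataclass

@dataclass
class Device:
    passcode_set: bool
    encryption_enabled: bool
    os_patched: bool
    reported_lost: bool

def action_for(device: Device) -> str:
    if device.reported_lost:
        # Lost/stolen devices are wiped remotely before checking anything else.
        return "remote-wipe"
    if not (device.passcode_set and device.encryption_enabled and device.os_patched):
        # Non-compliant devices lose access to firm resources until remediated.
        return "quarantine"
    return "allow"
```

The point is that the provider, not the individual executive, enforces the policy: the check runs on every device before it touches firm data.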

The bottom line is that companies considering a move to the private cloud need assurance that service providers offer security standards and best practices better than what they could achieve with on-site or internal technology services. By weighing the components discussed throughout this piece, firms can ensure up front that a service provider has taken the necessary steps to deliver a robust and secure platform for their business technology.

More Stories By Viktor Tadijanovic, CTO, Abacus Group LLC

Viktor Tadijanovic is a Founding Member & CTO of the Abacus Group. He is the principal architect for Abacus's Hosted IT Platform. Previously, he was a Senior Systems Architect at the Gerson Lehrman Group (GLG). Prior to GLG, he was a Technical Director at Eze Castle Integration where he was responsible for managing technology delivery to all hedge fund clients in New York City and Connecticut. Viktor possesses accreditations from NetApp, Cisco, VMware, Citrix and Microsoft. He received a degree in Network Engineering and Data Communications from the Chubb Institute in New York.
