Cloud Optimized Storage Solutions: Tiering & Expectations

Part 3 of our ongoing series

Dave Graham's Blog

In Part One of this Cloud Optimized Storage Solutions (COSS) series, we looked at the content being stored on a COSS; in Part Two, we looked at how it is stored.

Storage within the cloud is meaningless without a measurable level of performance against which it can be compared. Since there are no established benchmarks for storage performance within a cloud infrastructure, it is reasonable to apply tiering metrics to storage based on content valuation and service level agreements (SLAs), and to use this as an overarching methodology for judging COSS storage capabilities by application set.
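
As a rough illustration of that methodology, here is a minimal sketch of deriving a tier from a content valuation and an SLA response-time target. The function name, the 0.0-1.0 valuation scale, and the thresholds are all hypothetical assumptions for illustration, not an established benchmark:

```python
# Hypothetical sketch: derive a storage tier from content value and an SLA.
# The thresholds below are illustrative assumptions, not industry standards.

def assign_tier(content_value: float, sla_response_ms: float) -> int:
    """Map a 0.0-1.0 content valuation and an SLA response-time
    target (in milliseconds) onto Tiers 0-3."""
    if sla_response_ms < 1.0:           # sub-millisecond: solid-state territory
        return 0
    if sla_response_ms <= 20.0 or content_value >= 0.8:
        return 1                        # latency-sensitive or high-value data
    if content_value >= 0.4:
        return 2                        # online, but latency-tolerant
    return 3                            # stale or low-value data: archive

print(assign_tier(content_value=0.9, sla_response_ms=15.0))  # -> 1
```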

Within the concept of Information Lifecycle Management (ILM), there exists the idea that storage can be “tiered” based on criticality of need, application storage performance, or other derived metrics that couple relative responsiveness and bandwidth to serviceability. Layered on top of these metrics is the concept of Service Level Agreements (SLAs) that cover compliance, data protection, and data access, to name a few. To bring things into sharper focus, four “Tiers” are commonly recognized (or promoted) within the storage community: Tier 0, Tier 1, Tier 2, and Tier 3. While these Tiers are arbitrary in nature, they do provide a baseline framework upon which to build a more robust data service model.
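
To make the framework concrete, the four tiers can be modeled as a small catalog of service profiles. This is a minimal sketch: the latency figures for Tiers 0 and 1 come from the discussion below, while the Tier 2 and Tier 3 figures and the media types are assumptions for illustration only:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TierProfile:
    name: str
    guarantee: str            # what the tier primarily promises
    target_latency_ms: float  # illustrative service target, not a benchmark
    typical_media: str        # assumed example media, not prescriptive

TIERS = {
    0: TierProfile("Tier 0", "performance", 1.0, "solid-state (EFD/flash)"),
    1: TierProfile("Tier 1", "performance & availability", 20.0, "FC/SAS disk"),
    2: TierProfile("Tier 2", "availability", 100.0, "SATA disk"),        # assumed
    3: TierProfile("Tier 3", "accessibility", 1_000.0, "archive/tape"),  # assumed
}
```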

The Data Tier Models

Tier 0 & 1: Performance & Availability Guaranteed Storage

Tier 0 is a relatively recent addition to the ILM tiering schema, driven by the emergence of solid-state devices (Fusion-io, EMC/STEC EFDs, et al.). These devices feature sub-millisecond access times coupled with extremely high bandwidth to the storage subsystems, supporting more demanding access and bandwidth requirements for applications such as Online Transaction Processing (OLTP) and data warehousing. For these applications, response time is an even higher priority than it is in Tier 1.

Tier 1 was originally established to service high-availability, high-bandwidth, low-response-time application and storage needs tied directly to workloads such as OLTP/OLAP, decision support (DSS), and data warehousing. Applications in this Tier typically have a sub-20ms response-time requirement (or best practice) and are more sensitive to latency than, say, Tier 2 workloads.
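
The split between these two tiers can be expressed directly from the response-time figures above. A minimal sketch, with a hypothetical function name and error handling:

```python
def performance_tier(required_response_ms: float) -> int:
    """Pick Tier 0 or Tier 1 for latency-critical workloads."""
    if required_response_ms < 1.0:
        return 0  # sub-millisecond: solid-state (Tier 0) territory
    if required_response_ms <= 20.0:
        return 1  # the classic sub-20ms Tier 1 best practice
    raise ValueError("not latency-critical; consider Tier 2 or Tier 3")
```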

Tier 2: Availability Guaranteed Storage

Tier 2 is commonly referred to as a transitional data tier, largely due to the nature of the data that lives within it. Data placed in Tier 2 typically centers on file systems or data that sees only occasional access within a fixed window of time (e.g., 30, 60, or 90 days). The tier has a decided focus and tuning toward making sure that data is “online” and accessible, without stringent latency or performance criteria.
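
One way to express that fixed access window is a simple placement check. A minimal sketch, assuming a 90-day window (30 or 60 days would work the same way) and timezone-aware timestamps; the helper name is hypothetical:

```python
from datetime import datetime, timedelta, timezone

TIER2_ACCESS_WINDOW = timedelta(days=90)  # could equally be 30 or 60 days

def belongs_in_tier2(last_access: datetime) -> bool:
    """Data accessed within the window stays 'online' in Tier 2;
    anything older becomes a candidate for demotion."""
    now = datetime.now(timezone.utc)
    return (now - last_access) <= TIER2_ACCESS_WINDOW
```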

Tier 3: Accessibility Guaranteed Storage

The last noted tiering level for data is Tier 3. Tier 3 can best be described as an archive level for stagnant or “stale” data that is accessed infrequently. Historically, Tier 3 has been policy driven; that is, it acts as the recipient of data migrations rather than as a primary storage Tier for end-user access. Within the COSS environment, however, Tier 3 becomes as crucial a storage target as Tier 1 or 2, simply due to the preponderance of unstructured data within the cloud space.
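
Because Tier 3 is policy driven rather than user driven, demotion typically happens as a scheduled sweep instead of an end-user action. A minimal sketch, assuming each object is a record carrying a timezone-aware last-access timestamp; the 180-day threshold and all names are hypothetical:

```python
from datetime import datetime, timedelta, timezone

STALE_AFTER = timedelta(days=180)  # assumed archive threshold

def archive_sweep(objects: list[dict]) -> list[dict]:
    """Policy-driven migration: retarget objects whose last access
    is older than the threshold to the Tier 3 archive."""
    now = datetime.now(timezone.utc)
    demoted = []
    for obj in objects:
        if now - obj["last_access"] > STALE_AFTER:
            obj["tier"] = 3  # mark for movement to the archive tier
            demoted.append(obj)
    return demoted
```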

Expectations of Data Tiering

The somewhat open-ended definitions of these data tiering levels are purposeful. In defining the principles of a COSS, there is an inherent need to keep them somewhat fluid, especially as content continues to change and grow more complex. Additionally, while unstructured data currently holds the majority stake in the cloud, there is no reason why structured data (and its associated programmatic hooks and layers) cannot regain ground. As stated previously, these data tiers, while arbitrary, still provide an essential top-down view of how data can be categorized when planning a COSS implementation. As an extension of data tiering, it’s also important to understand how global and particular Service Level Agreements (SLAs) can and will affect the data stored on a COSS.

Author's Notes:

  • ILM, to the best of my knowledge, is not an EMC-designed concept. Whether storage tiers existed before EMC popularized the term is, in any case, inconsequential to this discussion.
  • Disclaimer: The opinions expressed here are my personal opinions. Content published here is not read or approved in advance by EMC and does not necessarily reflect the views and opinions of EMC.

More Stories By Dave Graham

Dave Graham is a Technical Consultant with EMC Corporation, where he focuses on designing and architecting private cloud solutions for commercial customers.
