By Greg Schulz
January 31, 2014 01:16 AM EST
Part III Until the focus expands to data protection - Taking action
This is the third of a three-part series (read part II here) about how vendors are keeping backup alive, and what they can and should do to shift and expand the conversation to data protection and related themes.
Modernizing is more than simply swapping one technology for another
As I have said for a couple of years now, modernizing data protection (or data protection modernization, if you prefer) is more than simply deduping or swapping out media, tape, disk, clouds, software or services like a recurring flat tire on an automobile. If you keep getting flat tires, instead of treating the symptom, find and fix the problem. For backup, that means taking a step back and realizing that what is really being done is protecting data (e.g. data protection).
Granted, the security people may not like sharing the term data protection, as some of them prefer to keep it unique, just as some of the compliance people want to keep archiving exclusive to their focus areas; however, let's move on.
On the other hand, data protection also means to protect, preserve and enable data and information to be accessed and served when and where needed, in a cost-effective way, with consistency and coherency.
Sure, there is still the act of making a copy or a backup at time intervals (frequency), with various coverage (how much gets copied), to multiple locations (copies), with versions kept for different amounts of time (retention) to support RTO and RPO, not to mention SLA and SLO for ITSM (how's that for some buzzword bingo? ;).
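Those parameters (frequency, coverage, copies, retention, RTO, RPO) can be thought of as one policy. Here is a minimal sketch in Python of what such a policy might look like, with a simple sanity check that the copy interval can actually satisfy the RPO. The class and field names are hypothetical, for illustration only:

```python
from dataclasses import dataclass
from datetime import timedelta

@dataclass
class ProtectionPolicy:
    """Illustrative policy object; names and fields are hypothetical."""
    frequency: timedelta   # how often a copy is made
    coverage: str          # how much gets copied (e.g. "full", "incremental")
    copies: int            # number of copies / locations
    retention: timedelta   # how long versions are kept
    rpo: timedelta         # recovery point objective (acceptable data loss)
    rto: timedelta         # recovery time objective (acceptable downtime)

    def meets_rpo(self) -> bool:
        # The copy interval must be no longer than the acceptable loss window.
        return self.frequency <= self.rpo

daily = ProtectionPolicy(
    frequency=timedelta(hours=24), coverage="incremental",
    copies=3, retention=timedelta(days=90),
    rpo=timedelta(hours=4), rto=timedelta(hours=8),
)
# A 24-hour copy interval cannot satisfy a 4-hour RPO.
print(daily.meets_rpo())
```

The point of the sketch is simply that these terms are measurable knobs, not buzzwords, and a mismatch between them (like the one above) is something you can catch before an incident rather than after.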
This means using copies, sync (or rsync), snapshots, replication and CDP, along with discrete copies such as backups and all the other buzzword-bingo enabling tools, technologies and techniques (e.g. agent or agentless, archive, availability zones, not to mention bare metal, virtual bare metal, block-based, CDP, compression, consolidation, deletion, data management, dedupe, eDiscovery, durability, erasure coding/parity, file-level, metadata and policy management, replication, snapshots, RAID, plugins, object storage, NAS, VTL, disk, tape, cloud and virtual, among others). In addition to taking a step back, this also means rethinking why, how, when and where data (and information) gets protected to meet various threat risks as well as diverse business requirements.
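To make one of those buzzwords concrete, dedupe at its core means storing each unique piece of data once and keeping references to it. Here is a toy Python sketch of content-hash deduplication; real dedupe engines add variable-length chunking, compression and persistence, so treat this only as an illustration of the idea:

```python
import hashlib

def dedupe_chunks(chunks):
    """Toy dedupe: store each unique chunk once, keyed by its content hash."""
    store = {}   # digest -> chunk data, stored exactly once
    recipe = []  # ordered digests needed to rebuild the original stream
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)
        recipe.append(digest)
    return store, recipe

data = [b"hello", b"world", b"hello", b"hello"]
store, recipe = dedupe_chunks(data)
# 2 unique chunks are stored, referenced 4 times in the recipe.
print(len(store), len(recipe))
```

Note how the original stream is fully recoverable from the recipe, which is why dedupe is a space optimization and not, by itself, protection: lose the store and every reference to it is worthless, which is exactly why copies and locations still matter.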
New tools in the toolbox (physical, virtual or cloud)
Part of the rethinking is expanding the focus from what the tools are, who makes what, how they work, and their features and functions, to how to use the tool or technology for different things.
Various tools (hardware, software, services) for different physical, virtual and cloud tasks
This is like going into a store like Lowe's or Home Depot and talking to the sales people there (ok, associates or team members) who can tell you everything there is to know about the tool or technology, yet can't tell you how to use it.
Sometimes you get lucky and there will be somebody working at the tool (hardware or software) store who will ask what you are trying to do and, based on their experience, suggest a different approach with another tool or tools, along with some supporting material, parts and supplies.
Does this sound familiar to data infrastructure or IT in general, not to mention server, storage, backup and data protection among other areas of interest?
If all you have, or know how to use, is a hammer, then everything or every situation starts to look like a nail. Expand your toolbox with more tools AND learn how to use or apply them in new and different ways. Align the right tool, technology and technique to the task at hand!
Expand from talking new technology to using new (and old) things in new ways
In addition to focusing on new tools and technology along with their associated terminologies across physical, virtual and cloud environments, it is also time to expand the discussion and awareness to using new (and old) things in new ways. This also means expanding the terminology from backup/restore to more comprehensive data protection as part of modernizing your environment.
For example, some people (and vendors) use the phrase "modernizing data protection" to mean swapping tape for disk, disk for cloud, one cloud for another, upgrading from one software version to another, or simply swapping one vendor's software or tool for another, yet continue to use it, for all practical purposes, in the same way. Sure, there is value in moving from hourly or daily copies to tape over to direct-to-disk, and then redeploying tape where it is better suited (streaming large amounts of data, powering off to save energy, e.g. deep cold archive) while leveraging disk's fast random access for small files that need to be recovered (usually within the first hours or days of being protected).
Aligning tools, technologies, techniques to various threat risk scenarios
Modernizing data protection (also known as transformation) also means recognizing that not everything is the same in the data center or information factory regardless of size, and that there are also different and evolving data access patterns. Another reason and trend to consider is that there is no such thing as an information recession and that people plus data are living longer as well as getting larger.
Expand your awareness and focus beyond simply knowing what the tools are and who makes them, to how, when, where and why to use them, along with the pros and cons in different situations. This means having multiple tools in your data protection toolbox, as well as knowing how to use different tools for various tasks instead of always reaching for a hammer. - GS @StorageIO
The data protection continuum, more than tools and technologies
Call to action, stop talking about it, start walking the talk
If you or somebody else is tired of hearing about backup, then stop complaining about it and take some action. Following are some things to expand your thinking, awareness, discussions and activities around modernizing data protection (and moving beyond traditional backup).
- Take a step back and check the basics or fundamentals of data protection, which, when enabled, allow your organization to move forward after a small or big incident (or disaster).
- Start thinking beyond backup tools and technologies (hardware, software, services), in particular how it's been done, to why it needs to be done and how it can be done differently.
- Revisit why you are protecting different things, realize that not everything is the same, so does that mean you have to protect everything the same way?
- Learn about how to use different tools and technologies which is different from learning about the tools, features and functions.
- Also keep in mind that the barrier is often people and process (along with organizational politics), which also results in new (and old) technologies being used in old ways.
- Think about using different tools and technologies in different e.g. hybrid ways.
- This means starting to use new (and old) tools, technologies and techniques in new ways, applying your return on innovation by using things to address issues, vs. simply using them for the sake of using them.
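The idea running through the items above, that different threat risks call for different tools rather than one hammer, can be sketched as a simple lookup. The threat names and technique lists below are illustrative assumptions, not an authoritative taxonomy:

```python
# Hypothetical mapping of threat risks to protection techniques,
# purely to illustrate "align the tool to the task".
TOOLBOX = {
    "accidental deletion":  ["snapshots", "versioned backups"],
    "site outage":          ["replication", "off-site copies"],
    "ransomware":           ["immutable copies", "air-gapped tape"],
    "long-term compliance": ["archive", "deep cold storage"],
}

def techniques_for(threat: str) -> list:
    # Fall back to a baseline backup when no specific technique is catalogued.
    return TOOLBOX.get(threat, ["baseline backup"])

print(techniques_for("site outage"))
```

The useful exercise is not the code but filling in your own table: for each threat your organization actually faces, can you name the technique that covers it, or does every row come back "baseline backup"?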
In addition to the above items, here are some added links on various topics and themes mentioned here:
Via StorageIOblog - Only You Can Prevent Cloud Data Loss, Cloud conversations: confidence, certainty and confidentiality, Modernizing data protection with certainty, More Data Footprint Reduction (DFR) Material, More modernizing data protection, virtualization and clouds with certainty, EMC Evolves Enterprise Data Protection with Enhancements and Data protection modernization, more than swapping out media.
Via Internet evolution - People, Not Tech, Prevent IT Convergence.
Closing comments (for now)
Now having said all of that, it would be unrealistic to think that we can simply drop the term backup overnight and switch to data protection; after all, we need backwards compatibility. However, until the industry, meaning vendors, their pundits (analysts, bloggers, consultants, evangelists), press/media, VARs, investors and customers, starts thinking and speaking in the broader context of data protection and life beyond backup, guess what, we will still be talking about backup. Start calling it (e.g. backup) data protection, and perhaps within a generation (or sooner) the term backup will have been ILM'd, compressed, deduped, tiered, spun down and put into deep cold archive storage to take a long REST on object storage with a NAS interface in a software-defined hybrid virtualized cloud ;).
Watch for more data protection conversations about related trends, themes, technologies, techniques and perspectives in my ongoing data protection diaries discussions (e.g. www.dataprotectiondiaries.com).
Ok, nuff said