Effective Storage for Growing Data Volumes

By Ian Kilpatrick, chairman, Wick Hill Group

Dealing with vast amounts of data used to be a problem faced purely by large enterprises. However, in today's world of rapidly increasing data, it's now an issue for companies of all sizes.

Data is growing for all sorts of reasons, including the use of social media and the proliferation of mobile devices such as smartphones and tablets, which have dramatically increased the amount of data being backed up on the network.

Symantec's recent State of Information Survey, conducted across 38 countries worldwide, found that SMEs expect their storage needs to grow by 178% over the next year.

The large volumes of data companies now produce, which can run into terabytes or even petabytes, need to be backed up and stored so they can be accessed easily and quickly. Data needs to be archived in case it is needed in the future, and for compliance reasons; and it needs to be replicated, so it's available in the event of a disaster. All this needs to be done cost-effectively and securely.

Storage is a notoriously boring subject for many people, and it tends to get pushed down the list of priorities. While enterprises have the resources to manage their back-up and storage issues, many organisations consider it a necessary evil and don't review its financial effectiveness or its fitness for future purpose.

This is a shame because, as with all things IT, times have moved on from old-style tape systems. Modern solutions are bigger, better, easier and lower cost, often with features previously only available on enterprise systems.

Increases in data volume now make storage a key business issue for companies of all sizes. Performance issues, caused by companies being overwhelmed by data or by back-up traffic, could impact profitability. The inability to access data quickly in the event of a disaster could put a company out of business.

Some recent statistics indicate that 43% of businesses that close after a natural disaster never re-open, and a further 29% close within two years. One of the key reasons for this is the failure of their disaster-recovery planning. Back-ups and back-up plans need to be tested regularly to ensure that they are still relevant and actually working. Finding that your back-ups are corrupted, when you absolutely need them, is too late.

Which route forward?
The question is how to choose a storage solution that will cope with current storage needs while also taking you forward into a future of unpredictable, mushrooming data growth.

Storage solutions for companies outside the Top 500 range from traditional tape to the cloud, with other options and permutations in between. For some organisations, cloud may be the answer. For others, particularly if they have been using tape in the past, the leap to the cloud is just too great and they want something more tangible, such as disc-based storage.

For some, the solution may be a hybrid one which combines local back-up with background cloud back-up. This option provides speed of access with the security of offsite cloud storage.
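A minimal sketch of that hybrid pattern (the directory name and the `upload_to_cloud` stub are hypothetical, standing in for a real cloud storage API): back up locally first for fast restores, then replicate offsite in the background.

```python
import shutil
import threading
from pathlib import Path

LOCAL_BACKUP = Path("backups/local")  # hypothetical local back-up directory

def upload_to_cloud(path: Path) -> None:
    """Stub for the offsite copy; a real system would call a cloud storage API."""
    print(f"uploading {path.name} to cloud...")

def hybrid_backup(source: Path) -> Path:
    """Copy `source` locally (fast restore), then replicate to cloud in the background."""
    LOCAL_BACKUP.mkdir(parents=True, exist_ok=True)
    local_copy = LOCAL_BACKUP / source.name
    shutil.copy2(source, local_copy)  # fast, local copy for day-to-day restores
    threading.Thread(target=upload_to_cloud,
                     args=(local_copy,), daemon=True).start()  # offsite copy in background
    return local_copy
```

The local copy completes synchronously, so restores never wait on the cloud; the offsite replication happens asynchronously, which is why the slow link matters less for routine operation.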

Companies still using traditional tape for back-up and archiving will be finding it increasingly inadequate for their needs, if indeed they've checked recently. Tape also has inherent disadvantages.

It's cumbersome, expensive, has a finite life and is easily damaged. Back-ups take longer and longer as data volumes grow, and it can be very difficult to find things quickly on tape when you need them.

The cloud, while it may seem like a great option to many, isn't for everyone. One disadvantage of the cloud, which many aren't aware of, is that the data is probably going to be stored on traditional tape.

Another issue to be aware of is how long it takes to seed and download data over limited-capacity internet connections. Seeding a terabyte of data to the cloud over a 10Mbps connection, with nothing else using the link, takes roughly 222 hours (over nine days) in theory, and closer to 300 hours (12.5 days) once protocol overhead is factored in.
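The arithmetic above can be sketched as a back-of-envelope estimate. (The 35% overhead factor is an assumption for illustration, not a vendor figure.)

```python
def transfer_hours(data_terabytes: float, link_mbps: float, overhead: float = 1.0) -> float:
    """Hours to move `data_terabytes` over a `link_mbps` connection.

    `overhead` > 1.0 models protocol/encryption overhead and retries.
    Assumes 1 TB = 10**12 bytes and an otherwise idle link.
    """
    bits = data_terabytes * 10**12 * 8          # total bits to transfer
    seconds = bits * overhead / (link_mbps * 10**6)
    return seconds / 3600

# 1 TB over an idle 10Mbps link:
print(round(transfer_hours(1, 10)))        # → 222 (theoretical minimum)
print(round(transfer_hours(1, 10, 1.35)))  # → 300 (with ~35% assumed overhead)
```

The same function shows why the download side of a disaster recovery can take days: the link speed, not the storage medium, is the bottleneck.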

This means that accessing it and downloading it could take days. If you have a serious disaster recovery situation, this may possibly be acceptable, but it's a serious hindrance if you need to access stored data quickly during the normal day-to-day running of a business.

Solutions
One solution for more conservative SMEs is to use RDX removable hard disc cartridges for storage. They combine the best of hard disc and tape storage: they scale with a business but, unlike tape, are very rugged and reliable. For those who have been using tape, it's a step forward that isn't too different from what went before.

RDX cartridges back up and restore data much faster than tape and are very secure. They are typically available in sizes from 160 gigabytes to 1.5 terabytes, and you simply add extra cartridges as your storage needs expand.

Imation, for example, which has a specialist division for SME storage, provides an RDX solution, the A8, which can accommodate up to 12 terabytes of data.

The A8 helps SMEs conduct high-performance back-up, data protection, archiving, restoration and cloud-enabling. It can mix RDX cartridges of different sizes, so users can start with lower capacity cartridges and add higher capacity ones as needs dictate.

A solution like this allows organisations to quickly back up and instantly access their crucial data. It gives them more operational flexibility and the ability to cost-effectively and quickly recover their data in the event of a disaster. Users get the benefits of RDX cartridge storage, but also keep cloud options open.

Imation also provides another product which gives companies a comprehensive set of storage and back-up options.

DataGuard is a network attached storage (NAS) backup appliance which uses hard disk drives, removable RDX® disk cartridges, replication, and cloud storage to provide up to four layers of data protection. It shortens back-up windows and allows for fast recovery.

DataGuard is capable of making multiple copies of content as local online copies, replicated copies, optional offline RDX copies and remote online (cloud) copies.

It means companies can have all bases covered. They don't have to go with the cloud straight away, but the facility is there to do it when and if they are ready.

Barracuda, with its Barracuda Backup Service, is another company offering the best of both worlds: local and cloud storage.

This provides full local data back-up, combined with a storage subscription that replicates data to the cloud at two offsite locations. So organisations get onsite back-ups for fast restore times, plus secure, offsite storage for disaster recovery.

The Barracuda system uses a technology called deduplication, which reduces traditional back-up storage requirements by 20 to 50 times, while also reducing back-up windows and bandwidth requirements.

Deduplication works by eliminating redundant data. Only one unique instance of the data is actually retained. The redundant data is replaced with a pointer to the original copy.
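A toy illustration of that idea (chunking and hashing are heavily simplified assumptions; commercial products use far more sophisticated variable-length chunking): each unique chunk is stored once, and repeats become pointers to it.

```python
import hashlib

def deduplicate(chunks: list[bytes]) -> tuple[dict[str, bytes], list[str]]:
    """Store each unique chunk once; the stream becomes a list of pointers (hashes)."""
    store: dict[str, bytes] = {}
    pointers: list[str] = []
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:      # retain only one instance of the data
            store[digest] = chunk
        pointers.append(digest)      # redundant data replaced with a pointer
    return store, pointers

data = [b"blockA", b"blockB", b"blockA", b"blockA"]  # 4 chunks, only 2 unique
store, pointers = deduplicate(data)
print(len(store), len(pointers))  # → 2 4

# Reassembly: follow the pointers back to the stored chunks.
assert [store[p] for p in pointers] == data
```

Here four chunks shrink to two stored copies; on real back-up streams, where successive back-ups repeat most of their content, this is where the 20- to 50-fold reductions come from.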

Conclusion
One constant is guaranteed: storage and data access requirements will, as they always have done, continue to grow, and the pace appears to be accelerating. Alongside the change in data volumes, new options have become available which provide enterprise-level solutions at affordable prices.

A range of such options is available, from RDX cartridges to cloud services, plus a variety of combinations in between. Such solutions allow organisations to back up and store data effectively, so it doesn't cause serious performance problems on the network; they make data quickly available both for the running of the business and for compliance purposes; and they offer a disaster recovery option.

Bio of author
Ian Kilpatrick is chairman of international value added distributor Wick Hill Group plc, specialists in market development for secure IP infrastructure solutions and convergence. Kilpatrick has been involved with the Group for more than 35 years. Wick Hill supplies organisations from enterprises to SMEs, through an extensive value-added network of accredited VARs.

Kilpatrick has in-depth experience of IT and unified communications (UC), with a strong vision of the future. He looks at these areas from a business point of view, and his approach reflects his philosophy that business benefits, ease of use and cost of ownership are the key factors, rather than just technology. He has authored numerous articles and publications, as well as being a regular speaker at conferences, exhibitions and seminars. For more information about Wick Hill, please visit http://www.wickhill.com or www.twitter.com/wickhill.

