
PHD Virtual Provides Easier-to-Manage, More Efficient Backup Solutions for Enterprise Environments

PHD Virtual Technologies, a pioneer in virtual machine backup and recovery and an innovator in virtualization monitoring solutions, announced today that its solutions are well suited for large enterprise environments as well as for small businesses with growing data storage, backup, and recovery needs.

Get PHDVB v6.0 for a 15-day trial

Tweet This: @PHDVirtual Provides Enterprises with Virtual Machine Backup and Recovery Solutions Fit to Scale http://bit.ly/HyPclv

PHD Virtual Backup 6.0 gives customers of all sizes the scalability and flexibility they need. For large enterprises in particular, it makes backup processes easier to manage and more efficient while keeping data fully recoverable at a moment’s notice, consumes less overhead and capital than competing products on the market, and provides the encrypted data protection that larger enterprises require.

"Customers continue to embrace server virtualization and are increasingly deploying multiple hypervisors," said Robert Amatruda, Research Director for Data Protection and Recovery at IDC. "PHD Virtual's Backup 6.0 solution provides customers a cost-effective and easy-to-deploy solution that supports multiple hypervisors and will scale with their virtual environment.”

“In addition to the growth we have experienced within the SMB market, we’ve seen a major uptick in our penetration in the larger enterprise environments as well,” commented Jim Legg, CEO, PHD Virtual. “The simplicity and cost-savings are definitely not exclusive to the smaller organizations – the ease of use and data movement offsite provides a powerful combination for any size corporation.”

PHD Virtual benefits for large enterprise environments include the following:

  • Complete or partial restorations - gives administrators the ability to restore a complete server from scratch by simply selecting a restore point and target. No agents, no operating system install - just restore a complete duplicate of the VM.
  • TrueDedupe technology - true source-side deduplication: the source data is deduplicated and compressed before it is sent across the WAN/LAN and before it is written to disk. This efficiency is critical for enterprises that need disk space to house backups of massive amounts of data, and it also scales better and pares down the overall backup window. PHD performs deduplication by comparing incoming blocks against the real data already on the backup target and eliminating duplicate copies across all VMs stored on that target, making the process more robust and avoiding extra job management to achieve storage efficiency. (A minimal deduplication sketch follows this list.)
  • Parallel processing model – multiple data streams can be used for backing up, restoring, and replicating, so multiple jobs run concurrently. In addition, parallel processing lets the user throttle the resources used for processing up or down to balance the workload and the timing of the backup window in the data center. (A throttled-worker sketch follows this list.)
  • Fault-tolerant scaling – a 100% virtualized footprint for your backups, running as a Linux-based appliance that scales up and out simply by deploying more Virtual Backup Appliances (VBAs). This provides fault tolerance, load balancing, and the required performance without the extra cost of additional physical infrastructure or licensing.
  • Replication – by backing up VMs once and storing them to disk, PHD Virtual Backup eliminates the need for unnecessary snapshots on production VMs while adding the extra layer of protection of replicated VMs stored offsite.
  • Disaster recovery planning – PHD Virtual provides:
    • TrueRestore: a verification and self-healing process in which the blocks being backed up are inspected during both the backup and the restore operations. This ensures that the data being backed up is the same data that will be restored. (A checksum-verification sketch follows this list.)
    • Test-Mode: the ability to run replicated VMs in test mode in a standby environment, giving peace of mind that the standby VM has been verified, is completely operational, and can be properly failed over.
  • Data recovery – PHD Virtual’s Instant Restore lets administrators power on backup VMs immediately and begin a restore process at the same time, giving immediate access to servers and applications. It also leverages concurrent data streams through “mass restore,” which creates and configures a single restore job that processes multiple VMs at the same time, again reducing complexity and the company’s RTO. Because granular restore is more common than a complete data center restore, PHD also supports restoring a single file, a virtual disk within a VM, or a single application object such as an email, mailbox, datastore, database, or table. This restores only what you need, when you need it, without setting up virtual labs or sandboxes, which speeds recovery and prevents unnecessary data loss.
  • Backing up a constantly evolving data center – PHD Virtual allows administrators to plug a backup appliance (VBA) into virtually any environment, including the cloud, a software-defined data center (SDDC), or a remote or branch office, ensuring data is safe and recoverable.
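
The deduplication bullet above describes a general source-side pattern. The following is only an illustrative sketch of that pattern in Python, under assumed details (fixed 1 MiB blocks, SHA-256 hashing, zlib compression, a block index shared across all VMs on the target); it is not PHD Virtual's actual implementation.

    # Illustrative sketch of source-side deduplication (assumptions noted above),
    # not PHD Virtual's implementation.
    import hashlib
    import zlib

    BLOCK_SIZE = 1024 * 1024  # assumed 1 MiB block size

    def backup_stream(vm_disk, target_index, send):
        """Ship only blocks the backup target has never seen, for any VM."""
        while True:
            block = vm_disk.read(BLOCK_SIZE)
            if not block:
                break
            digest = hashlib.sha256(block).hexdigest()
            if digest in target_index:
                send(digest, None)                  # duplicate: send a reference only
            else:
                target_index.add(digest)            # dedupe before the WAN/LAN hop
                send(digest, zlib.compress(block))  # compress, then transmit/write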
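The parallel processing model can likewise be pictured as a worker pool whose size is the throttle. The sketch below uses Python's standard thread pool and hypothetical per-VM job callables; it is not PHD's code, only an illustration of running several backup or restore streams concurrently and dialing the stream count up or down, which is also the shape of a "mass restore" job.

    # Illustrative sketch of throttled parallel job processing, not PHD's code.
    from concurrent.futures import ThreadPoolExecutor

    def run_jobs(jobs, max_streams=4):
        """Run backup/restore callables concurrently; max_streams is the throttle."""
        with ThreadPoolExecutor(max_workers=max_streams) as pool:
            futures = [pool.submit(job) for job in jobs]
            return [f.result() for f in futures]    # re-raises any job failure

    # Usage idea (names assumed): one callable per VM in a mass-restore style job.
    # run_jobs([lambda vm=vm: restore_vm(vm) for vm in ("vm01", "vm02")], max_streams=2)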
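Finally, the TrueRestore bullet describes verifying blocks during both backup and restore. A minimal illustration of that idea, assuming simple SHA-256 checksums and an in-memory store (again, not PHD's implementation), might look like this:

    # Illustrative sketch of block verification at backup and restore time.
    import hashlib

    def checksum(block):
        return hashlib.sha256(block).hexdigest()

    def backup_block(block, store):
        digest = checksum(block)
        store[digest] = block                       # record block under its checksum
        return digest

    def restore_block(digest, store):
        block = store[digest]
        if checksum(block) != digest:               # detect corruption before restoring
            raise IOError("block %s failed verification" % digest)
        return block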

“PHD Virtual does VMware backup better than anyone else, especially for enterprises like us,” said Barry Quiel, SunGard Public Sector, California. “For our large environment, we needed someone that specialized in moving lots of virtual machine data, while storing as little as possible of it. We also need plenty of options to handle how we move data off-site and to tape. PHD Virtual gives us all of the options we needed to make sure our VMware environment meets our enterprise data protection requirements.”

Supporting Resources

PHD Virtual Technologies: http://phdvirtual.com/

More PHD Virtual News: http://phdvirtual.com/newsandevents

Twitter: https://twitter.com/PHDVirtual

Facebook: http://www.facebook.com/PHDVirtualTechnologies

LinkedIn: http://www.linkedin.com/groups?gid=1992663&mostPopular=&trk=tyah

RSS Feeds: PHD Virtual news releases: http://www.phdvirtual.com/rss/news-and-events.xml

About PHD Virtual Technologies

PHD Virtual provides the absolute best value in virtual backup and monitoring for VMware and Citrix platforms. More than 4,500 customers worldwide rely on our products because they are effective, easier to use, and far more affordable than competitive alternatives. A pioneer of Virtual Backup Appliances (VBAs), PHD Virtual Technologies delivers high-performance, scalable, cross-platform backup and monitoring solutions and has been transforming data protection for virtual IT environments since 2006. Its PHD Virtual Monitor provides a complete, end-to-end solution for monitoring virtual, physical and application infrastructures in VMware and Citrix environments. For more information, please visit: http://www.phdvirtual.com/

