
PHD Virtual Provides Easier to Manage and More Efficient Backup Solutions for Enterprise Environments

PHD Virtual Technologies, a pioneer in virtual machine backup and recovery, and innovator of virtualization monitoring solutions, announced today their solutions are well-suited for large enterprise environments as well as small businesses with growing data storage, backup and recovery needs.


Tweet This: @PHDVirtual Provides Enterprises with Virtual Machine Backup and Recovery Solutions Fit to Scale

PHD Virtual Backup 6.0 gives customers of all sizes the scalability and flexibility they need. For large enterprises in particular, it makes backup processes easier to manage and more efficient while keeping data fully recoverable at a moment’s notice. It also consumes less overhead and capital than comparable products on the market and provides the encrypted data protection that larger enterprises require.

“Customers continue to embrace server virtualization and are increasingly deploying multiple hypervisors,” said Robert Amatruda, Research Director for Data Protection and Recovery at IDC. “PHD Virtual’s Backup 6.0 solution provides customers a cost-effective and easy-to-deploy solution that supports multiple hypervisors and will scale with their virtual environment.”

“In addition to the growth we have experienced within the SMB market, we’ve seen a major uptick in our penetration in the larger enterprise environments as well,” commented Jim Legg, CEO, PHD Virtual. “The simplicity and cost-savings are definitely not exclusive to the smaller organizations – the ease of use and data movement offsite provides a powerful combination for any size corporation.”

PHD Virtual benefits for large enterprise environments include the following:

  • Complete or partial restorations - gives administrators the ability to restore a complete server from scratch by simply selecting a restore point and target. No agents, no operating system install - just restore a complete duplicate of the VM.
  • TrueDedupe technology - true source-side deduplication of data: the source data is deduplicated and compressed before it is sent across the WAN/LAN and before it is written to disk. This kind of efficient deduplication is critical for enterprises that need disk space to house backups of massive amounts of data; it also scales better and pares down the overall backup window. PHD performs deduplication by comparing against the real data already on the backup target and eliminates duplicate copies across all VMs stored on that target, making the process more robust and avoiding the extra job management otherwise needed to achieve storage efficiencies.
  • Parallel processing model – provides the ability to use multiple data streams for backing up, restoring, and replicating, so multiple jobs can run concurrently. In addition, parallel processing lets the user throttle processing, increasing or decreasing the resources used, to balance the workload and the timing of the backup window in the data center.
  • Fault tolerant scaling – provides a 100% virtualized footprint for your backups. The Linux-based appliance scales up and out simply by deploying more Virtual Backup Appliances (VBAs), adding fault tolerance and load balancing and delivering the required performance without the extra cost of additional physical infrastructure or licensing.
  • Replication – by backing up VMs once and storing them to disk, PHD Virtual Backup eliminates the need for unnecessary snapshots on your production VMs while maintaining the extra layer of protection of having the replicated VMs located offsite.
  • Disaster recovery planning – PHD Virtual provides:
    • TrueRestore: a verification and self-healing process in which the blocks being backed up are inspected both during the backup and restore functions. By doing this, PHD ensures that the data being backed up is indeed the same data that is going to be restored.
    • Test-Mode: provides the ability to run replicated VMs in a test mode located in a standby environment. This gives peace of mind that the standby VM has been verified, is completely operational and can be properly failed over.
  • Data recovery – PHD Virtual’s Instant Restore offers savvier recovery by letting administrators immediately power on backup VMs while a restore process runs simultaneously, giving instant access to servers and applications. It also leverages concurrent data streams through a technology called “mass restore,” which creates and configures a single restore job that processes multiple VMs at the same time, again reducing complexity and the company’s recovery time objective (RTO). Because granular restore is more common than a complete data center restore, PHD also supports restoring a single file, a virtual disk within a VM, or a single application object, such as an email, mailbox, datastore, database or table. This restores only what you need, when you need it, without setting up any virtual labs or sandboxes, speeding recovery time and preventing unnecessary data loss.
  • Backing up a constantly evolving data center – PHD Virtual allows administrators to plug a backup appliance or VBA into virtually any environment, including the cloud, a software-defined data center (SDDC), or a remote or branch office, ensuring data is safe and recoverable.
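To make the source-side deduplication idea above concrete, here is a minimal, purely illustrative sketch (not PHD Virtual's actual implementation): data is split into fixed-size blocks, each block is hashed, and only blocks whose hashes have not been seen before are sent to the backup target. The block size, hash choice, and in-memory hash set are all assumptions for illustration.

```python
import hashlib

BLOCK_SIZE = 4096  # assumed block size, for illustration only

def dedupe_blocks(data: bytes, seen: set) -> list:
    """Return only the blocks not already stored on the backup target."""
    new_blocks = []
    for offset in range(0, len(data), BLOCK_SIZE):
        block = data[offset:offset + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in seen:      # duplicate (even across VMs)? skip it
            seen.add(digest)
            new_blocks.append(block)
    return new_blocks

# Two "VMs" sharing mostly identical disk content share one hash set,
# so duplicates are eliminated across all VMs on the target:
seen_hashes = set()
vm1 = b"A" * 8192 + b"unique-to-vm1"
vm2 = b"A" * 8192 + b"unique-to-vm2"
sent1 = dedupe_blocks(vm1, seen_hashes)  # identical A-blocks sent only once
sent2 = dedupe_blocks(vm2, seen_hashes)  # only vm2's unique tail is sent
```

Because both VMs contribute to the same hash set, the repeated blocks travel across the WAN/LAN and hit disk only once, which is the storage-efficiency effect the bullet describes.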
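The parallel processing model can likewise be sketched with a worker pool, where the pool size acts as the throttle on how many data streams run concurrently. This is a hypothetical illustration; the job function, VM names, and pool size are invented for the example.

```python
from concurrent.futures import ThreadPoolExecutor
import time

def backup_vm(name: str) -> str:
    """Stand-in for one backup job moving a VM's data."""
    time.sleep(0.01)  # simulated data transfer
    return f"{name}: backed up"

vms = [f"vm-{i}" for i in range(8)]

# Throttle: raise max_workers to shorten the backup window,
# lower it to reduce resource use in the data center.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(backup_vm, vms))
```

Raising or lowering `max_workers` is the trade-off the bullet describes: more concurrent streams shrink the backup window at the cost of more resources consumed.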
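The TrueRestore verification idea can be illustrated with a simple checksum cycle (again, an assumption-laden sketch, not the product's actual mechanism): a checksum is recorded per block at backup time and re-checked at restore time, so corruption is detected before data is handed back.

```python
import hashlib

def backup_with_checksums(blocks):
    """Store each block alongside a checksum computed at backup time."""
    return [(b, hashlib.sha256(b).hexdigest()) for b in blocks]

def restore_and_verify(stored):
    """Re-check every block against its recorded checksum before restoring."""
    restored = []
    for block, digest in stored:
        if hashlib.sha256(block).hexdigest() != digest:
            raise ValueError("block failed verification")
        restored.append(block)
    return restored

stored = backup_with_checksums([b"config", b"database", b"logs"])
restored = restore_and_verify(stored)
```

Checking at both ends is what gives the assurance the bullet describes: the data restored is demonstrably the same data that was backed up.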

“PHD Virtual does VMware backup better than anyone else, especially for enterprises like us,” said Barry Quiel, SunGard Public Sector, California. “For our large environment, we needed someone that specialized in moving lots of virtual machine data, while storing as little as possible of it. We also needed plenty of options to handle how we move data off-site and to tape. PHD Virtual gives us all of the options we needed to make sure our VMware environment meets our enterprise data protection requirements.”


About PHD Virtual Technologies

PHD Virtual provides the absolute best value in virtual backup and monitoring for VMware and Citrix platforms. More than 4,500 customers worldwide rely on our products because they are effective, easier to use and far more affordable than competitive alternatives. A pioneer of Virtual Backup Appliances (VBAs), PHD Virtual Technologies delivers the highest-performance, most scalable cross-platform backup and monitoring solutions on the market and has been transforming data protection for virtual IT environments since 2006. Its PHD Virtual Monitor provides a complete, end-to-end solution for monitoring virtual, physical and application infrastructures in VMware and Citrix environments. For more information, please visit:


Copyright © 2009 Business Wire. All rights reserved. Republication or redistribution of Business Wire content is expressly prohibited without the prior written consent of Business Wire. Business Wire shall not be liable for any errors or delays in the content, or for any actions taken in reliance thereon.
