By Jeremy Geelan
January 23, 2009 06:00 AM EST
"There is a shift in focus and it is from technologies that enable virtualization to technologies that manage virtualization," says Bala Murugan, Chief Architect at eG Innovations, in this Exclusive Q&A with SYS-CON's Virtualization Journal. Overall, Murugan maintains, virtualization is "a promising and justifiable investment, particularly in the current economic downturn."
Virtualization Journal: Do you agree with the view that Virtualization is one of the most promising technology investments in the current economic downturn?
Bala Murugan: Virtualization, when done right, has been proven to provide significant reductions in direct cost. It also helps your indirect cost by improving your IT’s performance, reliability and capacity management. So yes, I would say that it is a promising and justifiable investment, particularly in the current economic downturn.
Virtualization Journal: How about your concept of “Virtualization 2.0” – doesn’t it implicitly suggest that Virtualization 1.0 has been deficient?
Murugan: On the contrary, it is more a reference to the evolution of the virtualization industry. Virtualization 1.0 was a revelation; it introduced virtualization to the world, proved its power and showed everyone how much they could benefit from it. Virtualization 2.0 – which is already here – is about accepting virtualization as a reality and moving on to how to do it right, and how to get the most out of it. Essentially, there is a shift in focus, from technologies that enable virtualization to technologies that manage virtualization.
To be successful in Virtualization 2.0, organizations have to focus around technology that helps them manage their virtualization deployments better. Being a monitoring technology provider, we understand the complexities of monitoring in Virtualization 2.0 and are well positioned to help these companies realize the full potential of their virtualized infrastructures.
Virtualization Journal: Are you concerned at all that the “2.0” label might detract from the overall value proposition, given that it seems to be going down with the USS Economy? ;-)
Murugan: We view Virtualization 2.0 as an evolution (next phase) – not as a radical revamp of current virtualization deployments. In Virtualization 2.0, the focus is on how to make virtualization deployments more cost-effective and how to gain maximum benefits. So this will actually make virtualization a mandatory technology for most organizations that are dealing with tight budgets in the economic slow-down.
Virtualization Journal: How about interoperability, how important is that for the industry do you think? What barriers persist?
Murugan: We live in an age of diverse infrastructures. Even before virtualization, the success of n-tier architectures and open systems made it impossible to have a homogeneous environment. Data centers today comprise diverse technologies that have to co-exist to deliver IT services. Virtualization has taken this another step along the evolutionary road; now we are talking about adding a couple more tiers to n-tier applications by separating the hardware from the OS. At this juncture, we believe that interoperability is not a “nice to have.” It is a “must have.”
In terms of barriers, the ones that still exist are mostly technological, and people are working to overcome them. In principle, I believe everyone agrees interoperability is a must have. Not only do organizations have to deal with a mix of virtual and non-virtual infrastructures, but also with different types of virtualization from different vendors. The key, we have found, is to provide a unified, consistent view across this diverse landscape, which makes management that much easier for the end user.
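One way to picture the “unified, consistent view” across heterogeneous platforms is a normalization layer that maps each vendor’s metric format onto a common schema. This is only an illustrative sketch; the payload field names (`cpu.usage.average`, `cpu_util`, etc.) are hypothetical stand-ins, not any vendor’s actual API:

```python
from dataclasses import dataclass

@dataclass
class HostMetrics:
    platform: str      # e.g. "vmware" or "xen"
    host: str
    cpu_pct: float     # host CPU utilization, normalized to 0-100
    guest_count: int

def from_vmware(raw: dict) -> HostMetrics:
    # Hypothetical VMware-style payload: usage reported in hundredths of a percent
    return HostMetrics("vmware", raw["name"],
                       raw["cpu.usage.average"] / 100.0, raw["numGuests"])

def from_xen(raw: dict) -> HostMetrics:
    # Hypothetical Xen-style payload: utilization reported as a 0-1 fraction
    return HostMetrics("xen", raw["hostname"],
                       raw["cpu_util"] * 100.0, len(raw["domains"]))

fleet = [
    from_vmware({"name": "esx-01", "cpu.usage.average": 7250, "numGuests": 18}),
    from_xen({"hostname": "xen-01", "cpu_util": 0.41, "domains": ["web1", "web2", "db1"]}),
]
for m in fleet:
    print(f"{m.platform:7s} {m.host:8s} cpu={m.cpu_pct:5.1f}% guests={m.guest_count}")
```

Once every platform is reduced to the same `HostMetrics` shape, dashboards and alerting rules can be written once and applied across the whole mixed estate.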
Virtualization Journal: Do you think VMware needs to fear Microsoft’s belated entry into the virtualization marketplace?
Murugan: History has shown that Microsoft can be a significant threat in any endeavor it puts its mind to. They will have good technology and will resort to their favorite ploy, their licensing model, making virtualization more of a commodity than it already is.
VMware itself has recognized that the hypervisor is no longer going to be the differentiator and that technologies that enable the effective use of virtualization (e.g., manageability), new application deployment models (like virtual desktops), etc. will be a key to retaining their leadership position.
Competition in this space can only be good – innovation will be faster and certainly there is room for multiple vendors in this fast growing market.
Virtualization Journal: How about eG Innovations, what’s the background story to the company’s formation and growth to date?
Murugan: eG Innovations was founded by Srinivas Ramanathan, who is also our president and CEO. Prior to eG, he was a research scientist at HP and the chief architect of Firehunter, an ISP performance monitoring solution. His years at HP gave him a ringside seat to the real pain points customers have with monitoring their environments, and with the monitoring tools themselves. In 2000, he left HP to build the proverbial “better mousetrap,” and assembled a strong team, including myself, to take the concept from the ground up. That was the genesis of eG Innovations.
Our focus was on monitoring n-tier architectures by looking at them as business services as opposed to a collection of servers, networks and applications. Our key benefit to the customer was our ability to proactively identify the right problem, the true root cause of poor performance, in their IT infrastructures. As a result, customers spent less time firefighting and finger pointing, and more time improving their overall service levels. It took a couple of years to roll out the finished product, and we got VC funding from Singapore. Then we opened up the US market in 2002 and found a receptive audience for the technology. We quickly became the premier Citrix monitoring solution, which had all the classic n-tier architecture issues. We won many awards and saw the company grow across the globe.
We saw the opportunity in the virtualization space quite early and started working with early virtualization adopters to better understand their needs and to strengthen our technology. Our mastery in thin-client computing and shared access technologies (Citrix, Microsoft Terminal Services, etc.) helped because a Virtualization ecosystem (one box – multiple OSs) is similar to a Citrix ecosystem (one OS – multiple users). More awards later, we are now recognized as one of the industry leaders in the Virtualization monitoring space, with support for different virtualization platforms including VMware, Citrix Xen, Solaris Containers/LDOMs and more.
Virtualization Journal: What are the main pain points that bring customers to you in search of a monitoring solution?
Murugan: The biggest single pain point is probably problem isolation. When there is a problem in your n-tier IT infrastructure, it is usually pretty hard to distinguish between the true root cause and the effects. With systems being interdependent, a single problem generally causes a ripple effect that flows through the entire environment, leading you to chase effects as opposed to pinpointing the root cause. In simple terms, this means you are wasting valuable IT resources in fire-fighting mode fixing effects, which leads to finger pointing inside the organization. Meanwhile, your customers are still facing the problem. Virtualization only increases the complexity of your n-tier IT delivery, which makes problem isolation even more difficult.
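The ripple effect described above can be reasoned about with a component dependency graph: when many components alert at once, a plausible root cause is an alerting component none of whose own dependencies are alerting. This is a minimal sketch of that heuristic, with a made-up four-tier topology; it is not eG’s actual correlation algorithm:

```python
# deps maps each component to the components it depends on (hypothetical topology)
deps = {
    "web": ["app"],
    "app": ["db", "storage"],
    "db": ["storage"],
    "storage": [],
}
alerting = {"web", "app", "db", "storage"}  # everything fires at once in a ripple

def root_causes(deps, alerting):
    # Keep only alerting components whose dependencies are all healthy:
    # their alarms cannot be explained by anything further downstream.
    return {c for c in alerting if not any(d in alerting for d in deps.get(c, []))}

print(root_causes(deps, alerting))  # → {'storage'}
```

Here all four tiers alarm simultaneously, but only `storage` has no alerting dependency, so it is flagged as the likely origin while the other three alarms are treated as effects.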
Another key pain point that we see customers face is lack of visibility into their IT infrastructures. Even though it sounds simple enough, more often than not customers today don’t have total visibility into what is going on within their virtualized infrastructures. When you are managing a virtualized environment you definitely need answers to questions like; “How many guests are they running?” “How many guests are just consuming resources without being used?” “Where are the bottlenecks in the environment?” “Where do you stand on capacity?” “How do applications running inside VMs compare to ones running on physical servers?” “Is VMotion happening? If yes, why?” and so on. When it comes to virtual environments, what you don’t know can hurt you badly.
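The “guests consuming resources without being used” question lends itself to a simple screen over sampled metrics: flag any VM whose average CPU and network activity both stay under a threshold across the window. A toy sketch, with invented VM names, sample data and thresholds:

```python
# Hypothetical per-VM samples collected over a monitoring window
samples = {
    "vm-app01":  {"cpu_pct": [22, 35, 28], "net_kbps": [480, 512, 300]},
    "vm-old-qa": {"cpu_pct": [1, 0, 2],    "net_kbps": [3, 1, 2]},
}

def idle_guests(samples, cpu_thresh=5.0, net_thresh=10.0):
    """Return VMs whose average CPU and network use are both below thresholds."""
    avg = lambda xs: sum(xs) / len(xs)
    return [vm for vm, s in samples.items()
            if avg(s["cpu_pct"]) < cpu_thresh and avg(s["net_kbps"]) < net_thresh]

print(idle_guests(samples))  # → ['vm-old-qa']
```

A real implementation would look at longer windows and more signals (disk I/O, logged-in sessions), but even this crude filter surfaces candidates for reclaiming capacity.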
Another common problem is the classic disconnect between business services and the IT infrastructure. For example, business users say they can’t process orders or things are too slow. The IT side says servers are running fine on CPU. Both of them are right in their own perspective, but they are not on the same page, not even on the same book. This comes from the traditional IT view of looking at boxes and servers as opposed to the actual quality of services being delivered.
Virtualization Journal: What are two of your favorite customer success stories?
Murugan: There are many, but a classic one was when we got called in by a customer who was deploying a new project with Citrix technologies in a heterogeneous infrastructure with physical and virtual servers. Their new service was not taking off. Users were complaining about severe slowdowns, and the customer had already spent weeks on the problem with no results. Before they came to us, they had changed the server hardware, the application software, and the client terminals and software, all to no effect. Within a couple of days of getting involved, we were able to pinpoint the source of the problem – network packet retransmissions between servers, due to issues with the way network teaming had been set up. We had been working with the application and server teams, and these teams had no visibility into the network. All they had to go by was what the network team was telling them. Hence, when a problem happened they assumed it was a server or application issue, and spent weeks chasing this. Without any kind of instrumentation on the network, our eG Enterprise solution was able to determine that the root cause of the problem was in the network, not in the VMs, Citrix or other applications. This was a classic case of having to work with limited visibility into some domains, working with different silos of the infrastructure, and yet being able to effectively troubleshoot problems. In the end, it took us just minutes to review the collected metrics and identify the root cause. Even after hundreds of customer installations, this remains a great example of a customer success.
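Retransmission problems like the one in this story can be spotted from host-side counters alone. On Linux, for instance, `/proc/net/snmp` exposes cumulative TCP counters including `OutSegs` and `RetransSegs`, and their ratio is a rough retransmission rate. A sketch (the sample text is trimmed to two counters; real files carry many more columns, and this is not how eG Enterprise itself works):

```python
def tcp_retrans_ratio(snmp_text: str) -> float:
    """Fraction of TCP segments retransmitted, parsed from /proc/net/snmp text."""
    header, values = None, None
    for line in snmp_text.splitlines():
        if line.startswith("Tcp:"):
            if header is None:
                header = line.split()[1:]                    # first Tcp: line names the counters
            else:
                values = [int(v) for v in line.split()[1:]]  # second Tcp: line holds the values
    stats = dict(zip(header, values))
    return stats["RetransSegs"] / max(stats["OutSegs"], 1)

# Trimmed sample in the /proc/net/snmp layout
sample = "Tcp: ActiveOpens OutSegs RetransSegs\nTcp: 1500 200000 9000\n"
print(f"retransmission ratio: {tcp_retrans_ratio(sample):.3f}")  # → 0.045
```

A sustained ratio in the percent range between two servers is exactly the kind of signal that would have pointed this customer at the misconfigured network teaming weeks earlier.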
Another very good example was a large financial institution where our technologies have delivered immense value. Before we got involved, they were very silo-based in their day-to-day firefighting and operations. We helped them streamline their operations, providing the helpdesk with end-to-end visibility into key business services. As a result, when a problem occurs, the helpdesk knows exactly which expert to call to resolve it. This produced a significant improvement in service uptime and more effective use of their operations staff.
Virtualization Journal: What does the future hold do you think for VDI?
Murugan: VDI and its various technology cousins are definitely here to stay. The idea of a centralized desktop with the power of a localized desktop is extremely attractive. Some of the largest implementations have been VDI related. Currently we are seeing Fortune 100 companies leading the way on this, and I believe it will soon be commonplace even in mid-size companies. As a technology, it has not yet fully matured, but once it does we see it becoming a much bigger market than server-based virtualization initiatives. It may become the de facto desktop platform in the near future.
Virtualization Journal: Do you agree that we are entering a new age of infrastructure – one in which it is back on the agenda of C-level execs (and not only the CTO)?
Murugan: I believe infrastructure has always been on the agenda of C-level execs, but with the success of virtualization there are definitely more conversations at the C-level about how to do this right.
Virtualization Journal: You were responsible for the design and development of one of the earliest J2EE portals in the late 90s; what role does Java play today in the enterprise technology landscape?
Murugan: The platform independence provided by Java was one of the key drivers that enabled a slew of web-facing, service-oriented applications in the last decade. Java and its sister technologies remain among the backbone technologies of web-based applications.
rcjay2 01/23/09 01:38:00 PM EST
This is a great article and gives you insight into one of the leaders in enterprise monitoring solutions. I am a user who has had the pleasure of working with Bala and the folks at eG for some time now. I can honestly say that the product is amazing. It works in all environments across all OSes, and the monitoring/reporting capabilities are extensive and endless. Out of the box it monitors everything you can throw at it, and if you need to implement a custom monitoring solution for something not covered, it is easy to include custom scripts eG can run and report on. Currently, I have the eG suite monitoring two complete virtual environments with XenServer 5 and ESX Infrastructure 3. Within each virtual environment I have multiple hosts with a range of operating systems; everything from Solaris and Fedora Core to all versions of Windows (2003/2008) is running and fully monitored. Not to mention all the network devices (Cisco, Dell and Linksys) and printers can be monitored via SNMP.
Furthermore, one of the key points is that with the newest version eG is now able to monitor the Solaris Sun Ray environment. Everything surrounding DTU connectivity is readily available. I have found that it is easy to install and configure, and in the case of a disaster it is easy to get a backup up and going. One final note: support from the people at eG is second to none. I have spoken with them on numerous occasions and have never encountered anything but a genuine offer of help and a willingness to understand and pinpoint the issue until a resolution is discovered.