
Balancing the Virtualization Equation

Get the most from your virtualized environment

Enterprises committed to a virtualization strategy need to ensure that management and automation of mission-critical IT systems and applications are included in their planning. Enterprises also need to establish procedures that allow them to maximize the benefits of consolidating to a virtualized platform and mitigate potential business risk across a landscape that has become abstract. Failure to do so will impact the success of projects and dilute the value of a virtualization strategy.

Spiraling energy costs, the need to squeeze extra IT power out of fixed data center footprints, and environmental concerns have shifted virtualization from a commodity tool to a center-stage role in the IT strategy of many organizations.

The history of virtualization can be traced back to the 1970s, when mainframe computers could be virtually partitioned to host multiple guest machines. It proved an ideal environment in which to install and configure new operating platforms, upgrade existing systems, and give software developers a sandbox for isolation testing. In its 21st-century incarnation, history has repeated itself, with virtualization usually starting life deep within the data center of most enterprises. IT operations and application development teams rapidly recognized the extra flexibility they could gain from not needing to procure extra hardware to service ad hoc processing demands or software testing.

With the shift from commodity to a center-stage role for virtualization, there is a corresponding shift in planning required to ensure that all IT layers in an enterprise are fully aligned to perform in a new virtualized landscape. In addition to ensuring that the underlying IT infrastructure components are in place each time a new virtual machine is provisioned, it's imperative that the business applications as well as the operational processes and procedures are fully established to provide the comprehensive set of services that end users rely on to do their jobs.

Factor
From an end-user or functional user perspective, whether an environment is virtualized or not is largely irrelevant. Such users simply expect their applications and programs to work - virtualization for them is a back-office, and therefore mostly unseen, technology. Planning for virtualization should strive to minimize apparent adverse impact on users' day-to-day activities.

Virtualization transforms a data center into a dynamic IT environment that can provide the flexibility and scalability capable of responding to the varying demands driven by a dynamic 24x7 global marketplace. However, while the ability to add and subtract processing capacity without needing to power up extra hardware offers enterprises greater agility, there are accompanying challenges that require addressing.

Factor
An organization's current system monitoring tools are probably very good at monitoring server statistics (like CPU utilization, I/O, etc.) and raising alarms if certain thresholds are exceeded. In a virtualized environment, such alarms should be expected to initiate action that can start, stop, or move virtual machines within the environment to help alleviate the detected resource exception. Planning should consider how system monitors can take actions that modify the virtual environment.
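As a minimal sketch of this idea (the monitoring feed and the Hypervisor class below are hypothetical placeholders, not any specific vendor's API), an alarm handler that acts on the virtual environment might look like this:

```python
# Sketch: react to a monitoring alarm by adjusting the virtual environment.
# get_cpu_utilization() and the Hypervisor class are hypothetical stand-ins
# for whatever monitoring feed and virtualization API an enterprise uses.

CPU_THRESHOLD = 85.0  # percent; the point at which an alarm is raised


class Hypervisor:
    """Placeholder for a real hypervisor management API."""

    def start_vm(self, template: str) -> str:
        print(f"Starting a new VM from template '{template}'")
        return "vm-042"

    def stop_vm(self, vm_id: str) -> None:
        print(f"Stopping VM {vm_id}")


def get_cpu_utilization(host: str) -> float:
    """Stand-in for the organization's existing system monitor."""
    return 91.3  # pretend the monitor reported this value


def on_alarm(host: str, hypervisor: Hypervisor) -> None:
    """Turn a passive alarm into an action that modifies the environment."""
    usage = get_cpu_utilization(host)
    if usage > CPU_THRESHOLD:
        new_vm = hypervisor.start_vm(template="worker-node")
        print(f"Alarm on {host}: {usage:.1f}% CPU, provisioned {new_vm}")


if __name__ == "__main__":
    on_alarm("prod-host-01", Hypervisor())
```

The point is not the specific calls but the pattern: the monitor's alarm becomes the trigger for an action against the virtual environment rather than just an entry in a console.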

As each new virtual machine is spawned, the IT Operations team is left with the challenge of recognizing that there is an extra machine available that requires managing and monitoring. This same team also assumes responsibility for manually routing workload to this additional resource, continually checking systems performance and being ready to respond to messages and resolve problems as and when they occur.

Factor
A long-running, complex business process is known to contain a large processing "spike" at a certain point. In a virtualized environment, additional virtual machines can be started just prior to the spike (and stopped just after) to provide additional processing horsepower. The orchestrator (personnel or product) of the business process should be expected to be sufficiently aware of the virtualized environment to note the additional virtual machine(s) and take advantage of them. Without that awareness, even with the flexibility to dynamically add horsepower, an important potential benefit of the virtualized environment is lost. Planning should look at how business process orchestrators can take actions that affect the virtual environment.
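A minimal sketch of such a virtualization-aware orchestration step, assuming hypothetical scale_out/scale_in calls in place of a real provisioning API and an invented month-end process, might look like this:

```python
# Sketch: a business-process orchestrator that is "virtualization-aware".
# The process steps and the scale_out/scale_in calls are illustrative only.

from contextlib import contextmanager


def scale_out(count: int) -> list[str]:
    """Hypothetical call to start extra VMs ahead of a processing spike."""
    vms = [f"vm-spike-{i}" for i in range(count)]
    print(f"Started {count} extra VMs: {vms}")
    return vms


def scale_in(vms: list[str]) -> None:
    """Hypothetical call to stop the extra VMs once the spike has passed."""
    print(f"Stopped VMs: {vms}")


@contextmanager
def extra_capacity(count: int):
    """Provide extra horsepower only for the step that needs it."""
    vms = scale_out(count)
    try:
        yield vms
    finally:
        scale_in(vms)


def run_month_end_process() -> None:
    print("Step 1: collect sales data")           # normal capacity
    with extra_capacity(count=3):                 # spike is known to occur here
        print("Step 2: consolidate and revalue")  # heavy step uses the extra VMs
    print("Step 3: publish reports")              # back to normal capacity


if __name__ == "__main__":
    run_month_end_process()
```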

This increase in workload combined with the perennial lack of qualified, skilled personnel puts tremendous pressure on IT operations. Instead of continually trying to find, train, and retain staff, organizations need to incorporate the tribal operations management knowledge that has accumulated over many years into the fabric of their virtualized environments. Adopting an automated approach would not only reduce operational pressures; it would also mitigate business risk by reducing the exposure of critical systems and applications to unaccountable manual intervention.

Factor
Drilling down into the previous example - if personnel are responsible for orchestrating the business process, one can envision a very detailed and carefully written manual process document for them to follow to manage the spike, taking advantage of the established virtualized environment. The burden (what higher-value activity could a person be doing?) and risk (what if a person makes a mistake?) of such a manual procedure could be eliminated by using an automated orchestrator - but only insofar as the orchestrator is aware of, and can interact with and control, the virtualized environment. Again, without that awareness, an important potential benefit of the virtualized environment is lost. Planning should work to convert or translate manual processes (to the greatest extent possible) into automated processes.

Ensuring that extra virtual machines are brought online to cater for peak processing demands, optimizing the distribution of batch jobs so that they complete ahead of critical deadlines, and automatically responding to errors with corrective action are just a few examples of workload management challenges arising in a virtualized world that can be simplified using automation. Beyond the infrastructure layer there is an equivalent set of tasks and procedures required to drive application processing, tasks that have traditionally relied on manual interaction by data center or end-user personnel. The virtualization of applications generates a similar set of challenges and requires equal attention if enterprises are to realize benefits throughout their IT landscape.

In virtualized environments, the fixed relationships between hardware, systems, and applications no longer exist. Hardwired, prescribed associations, ranging from a command sequence in an operations handbook to fixed parameters embedded in a piece of application code, can be interpreted differently when presented in a virtualized world. Virtualization introduces an extra layer of abstraction between physical hardware devices and the software systems that an enterprise runs to support its business.

Factor
It's easy for a developer to write a program that runs well on a single server. However, without due consideration of the virtualized environment, it's all too likely that that same program won't run successfully across a landscape of virtual machines or hypervisors. Support for virtualized environments must be built into custom-developed code.

At the IT infrastructure management layer, there are IT housekeeping and administrative tasks that need to be executed: backups, snapshots, database clean-ups, file-transfer handling, and starting and stopping VMs. At the business application layer, there are functional processes and procedures that need to be undertaken: sales data uploads, order processing, invoicing, logistics, production, analytics and forecasting, finance and accounting, HR and customer care. Bringing together the execution of these activities ensures that everything around business and IT processes is properly managed and maintained. The scope of activities required will usually go well beyond the capability of an individual business application or systems management solution. Enterprises need to manage the suite of all interfaces around their virtual environments. They also need to be able to integrate the real and virtual environments in such a way that they can fully leverage the breadth and depth of functionality that can be derived from their core applications and operating platforms.

Factor
IT housekeeping and administrative applications certainly must be "virtualization-aware" - indeed, some of the IT housekeeping tasks listed above are included in various hypervisors (e.g., snapshots). Business applications such as ERP, CRM, BI and DW must also be aware - it would make no sense to bring another virtual machine online for a particular application if the application itself had no awareness of its virtualized environment. There's some opportunity for application consolidation in terms of the applications used for managing IT housekeeping, administration, and business applications. The distinctions have blurred between certain classes of applications (e.g., job schedulers, system managers, business process managers) to such a degree that one new application may be able to replace the functionality of two or more older applications (see the references to an "orchestrator" in other parts of this article). Planning must include the business applications and each one's unique requirements.

Forming logical associations and utilizing logical views when managing virtualized systems and applications will allow IT departments to achieve greater flexibility and agility. When seeking to automate everything from IT housekeeping procedures through to business processes, such as a financial period-end close, creating a single, centralized set of policy definitions with embedded parameter variables not only ensures consistency and transparency across all virtual machines and hypervisors but also reduces maintenance and administration overheads.
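As one illustrative sketch of that idea (the policy text and target list are invented for the example), a single definition with embedded parameter variables could be held centrally and resolved per target at run time, instead of maintaining a hand-edited copy per machine:

```python
# Sketch: a single, centrally held policy definition with embedded variables,
# resolved per target at run time. The policy content is purely illustrative.

from string import Template

# One definition, stored once, instead of a hand-edited copy per machine.
BACKUP_POLICY = Template(
    "backup --source /data/${app} --target ${backup_host} --window ${window}"
)

TARGETS = [
    {"app": "erp",     "backup_host": "bkp-eu-01", "window": "01:00-03:00"},
    {"app": "billing", "backup_host": "bkp-us-02", "window": "02:00-04:00"},
]


def render_policies() -> list[str]:
    """Resolve the shared definition against each target's variables."""
    return [BACKUP_POLICY.substitute(target) for target in TARGETS]


if __name__ == "__main__":
    for command in render_policies():
        print(command)
```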

Factor
Establishing a single metadata repository for such items as policy definitions, processing rules, and business processes is a positive step in any virtualized environment. If such a repository also holds data about the current state of play (which policies are in force, which rules are in control, and the processing status), then that data can be used predictively to determine what virtual resources might be needed in the near term and to take action to make those resources available. Effort should be spent planning how metadata can be used to allow proactive management of the virtual environment.
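A minimal sketch of that predictive use of repository state, with the state fields, the capacity assumption, and the provision_vm() call all assumed purely for illustration, might look like this:

```python
# Sketch: using repository state data proactively. The repository contents,
# capacity assumption, and provision_vm() call are assumptions for illustration.

from dataclasses import dataclass


@dataclass
class RepositoryState:
    queued_jobs: int       # work already accepted but not yet running
    running_vms: int       # current capacity
    jobs_per_vm: int = 10  # rough capacity assumption per VM


def vms_needed_soon(state: RepositoryState) -> int:
    """Predict how many extra VMs the queued work will require."""
    required = -(-state.queued_jobs // state.jobs_per_vm)  # ceiling division
    return max(0, required - state.running_vms)


def provision_vm() -> None:
    """Hypothetical stand-in for a real provisioning call."""
    print("Provisioning one additional VM ahead of demand")


if __name__ == "__main__":
    state = RepositoryState(queued_jobs=47, running_vms=3)
    for _ in range(vms_needed_soon(state)):
        provision_vm()
```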

Establishing the availability of virtual resources, determining current system performance, and analyzing other metrics can all be used at run time to optimize the routing and dispatching of workloads. Process definitions can be dynamically configured using parameter overrides to run on the hypervisor server best suited to ensuring end-user SLAs are satisfied.
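As a rough sketch of metric-driven dispatching (the host names, metrics, and parameter override are invented for the example), the selection logic could be as simple as:

```python
# Sketch: picking the hypervisor host best suited to meet an SLA, based on
# metrics gathered at run time. Host names and metrics are invented examples.

HOSTS = {
    "hv-01": {"cpu_free": 22.0, "mem_free_gb": 12},
    "hv-02": {"cpu_free": 63.0, "mem_free_gb": 48},
    "hv-03": {"cpu_free": 41.0, "mem_free_gb": 8},
}


def pick_host(min_mem_gb: int) -> str:
    """Choose the host with the most free CPU that still satisfies memory needs."""
    candidates = {h: m for h, m in HOSTS.items() if m["mem_free_gb"] >= min_mem_gb}
    if not candidates:
        raise RuntimeError("No host can satisfy the workload's requirements")
    return max(candidates, key=lambda h: candidates[h]["cpu_free"])


def dispatch(job: str, min_mem_gb: int) -> None:
    host = pick_host(min_mem_gb)
    # A real orchestrator would apply a parameter override here, e.g. the target host.
    print(f"Dispatching '{job}' to {host} with override target_host={host}")


if __name__ == "__main__":
    dispatch("nightly-invoice-run", min_mem_gb=16)
```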

Factor
In the absence of an orchestrator to automate processing, system monitors can detect system events and raise alarms in a reactive fashion. Proactive and reactive attempts to modify the virtual environment are certainly valid. However, doing neither wastes some of the potential advantages of virtualization. Both proactive and reactive adjustments of the virtual environment should be planned for.

Securing and administering all process definitions in a centralized repository will support change control management. There's no need to manually check that script updates, necessary because a new version of a backup utility is being rolled out, have been propagated to all virtual machines. Critical activities that need to be run on virtual machines are protected against unauthorized updates and illegal use. Being able to maintain a record and report on all changes made to process definitions, as well as details of who executed what, where, when, and the outcome, supports enterprises in ensuring that their use of virtualization doesn't introduce additional operational risk and is compliant with IT governance strategy.

Factor
As highlighted earlier, automation provides a highly effective alternative to manual processes. If changes to the virtualized environment are automated (e.g., through predictive use of state data, automated response to alarms, and planned changes in a business process), then one expectation should be the existence of a solid audit trail of the actions taken by the automation orchestrator. Planning for compliance is a must.
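A minimal sketch of such an audit trail, with the record fields and log format assumed for illustration, might look like this:

```python
# Sketch: an automation orchestrator that records every action it takes,
# producing the audit trail compliance requires. Fields are illustrative.

import json
from datetime import datetime, timezone

AUDIT_LOG = "orchestrator_audit.jsonl"


def audited_action(actor: str, action: str, target: str, outcome: str) -> None:
    """Append one audit record: who did what, where, when, and the outcome."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "actor": actor,
        "action": action,
        "target": target,
        "outcome": outcome,
    }
    with open(AUDIT_LOG, "a", encoding="utf-8") as log:
        log.write(json.dumps(record) + "\n")


if __name__ == "__main__":
    audited_action(
        actor="orchestrator",
        action="start_vm",
        target="hv-02/vm-spike-1",
        outcome="success",
    )
    print(f"Audit record written to {AUDIT_LOG}")
```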

Conclusion
Instead of dusting off an old IT operations run book and updating it to support a virtualization strategy, enterprises need to realize that embedding knowledge and experience into automated procedures not only simplifies management and control of a virtualized world; it also ensures that smart decisions are taken at the right time in the right context. An automated approach translates into improved throughput, greater accuracy, fewer errors, and less risk. Putting technology to work to analyze resource utilization, respond instantaneously, and provision extra resources in a virtualized environment enhances productivity and throughput.

More Stories By Alex Givens

Alex Givens is a Senior Solutions Architect for UC4 Software, Inc., makers of UC4 Workload Automation Suite. For 13 years, Alex has helped organizations improve the efficiency and effectiveness of their business processing. Alex has spoken on business process automation at many international, national and regional conferences.
