
The Coming Network Evolution: Cisco Gets It, Do You?

As Microsoft, Google and Amazon build up steam in the cloud, they're creating demand for even more powerful & intelligent networks

Greg Ness's Blog

I think it is only a matter of time before ALL of the leading networking players start talking about the strategic importance of the network as a way to succeed in an uncertain economic climate. Last week, in "Cloud Computing, Virtualization and IT Diseconomies," I talked about the increasingly intense pressures already building on static network infrastructure, and the underlying need for more intelligence and automation.

I think the new survival mantra for the coming economic weakness will be "He (or she) who automates wins." As the industrial age emerged from the agricultural age, and now as it blends into the computer age, innovation has been driven by the ability of visionaries to boost productivity through automation and connectivity.

I just watched the newly released interview with Cisco's John Chambers, "Can IT Strengthen the Economy?", from the recent Gartner conference. Chambers clearly sees innovation as the way out. The network is strategic to business productivity. Flexibility, speed and scale are becoming even more important. That means dynamic connectivity and intelligence will become even more strategic to the network.

I think Chambers gets it and is reminding his customers that strategic innovation will trump mere cost-cutting in a period of economic uncertainty. Those who emerge will emerge even more powerful, because they will have avoided the temptation to treat the network as tactical, with the long-term vision of shifting it to the cloud à la Nicholas Carr's vision of utility computing.

These intense pressures are setting the stage for the next technology boom by creating gaps between what networks can do today and what they'll need to do tomorrow. I was amazed at how quickly the concept of Infrastructure 2.0 spread, including an interesting discussion at F5 Networks' pace-setting DevCentral blog.

These pressures are coming from increasing rates of change, especially in larger networks supporting more devices and branches and processes, as well as with the introduction of consolidation, virtualization and cloud computing initiatives. These new initiatives are introducing even higher rates of change and making it clear that a static network will no longer be a strategic network.

As Nicholas Carr debates with Tim O'Reilly about the form the cloud will take, a few nuggets emerge:

"But the cloud platform, like the software platform before it, has new rules for competitive advantage. And chief among those advantages are those that we've identified as "Web 2.0", the design of systems that harness network effects to get better the more people use them."

- Tim O'Reilly, "Web 2.0 and Cloud Computing," October 2008

While Nicholas correctly challenges the role of "network effects," he then engages in a fallacy that I think is at the core of his misperception of the role of network infrastructure within IT. His electric-utility metaphor for IT leads him down a path that is well-trodden from a hype perspective, but not yet enterprise-grade. He lists economies of scale in IT that can determine which cloud players win or lose:

1. Capital intensity. Building a large utility computing system requires lots of capital, which itself presents a big barrier to entry.

2. Scale advantages. As O'Reilly himself notes, big players reap important scale economies in equipment, labor, real estate, electricity, and other inputs.

3. Diversity factor. One of the big advantages that accrue to utilities is their ability to make demand flatter and more predictable (by serving a diverse group of customers with varying demand patterns), which in turn allows them to use their capital more efficiently. As your customer base expands, so does your diversity factor and hence your efficiency advantage and your ability to undercut your less-efficient competitors' prices.

- Nicholas Carr, "What Tim O'Reilly gets wrong about the cloud," October 2008

In Cloud Computing, Virtualization and IT Diseconomies I talked about the prevalence of manual labor in critical IT processes, from IP address management to server administration, that leads to substantial scale and complexity challenges. Exactly where are the advantages if the cost of simple tasks goes up on a per-IP-address basis as networks get larger? Here's what I wrote:

"As much as cloud computing has rallied behind the prospect of electricity and real estate savings, the business case still feels like a dotcom hangover in some cases. Virtualization is still a bit hamstrung in the enterprise by the disconnect between static infrastructure and moving, state-changing VMs; and labor is the largest cost component of server TCO (IDC findings) and a significant component of network TCO (as suggested by the Computerworld findings). So just how much will real estate and electricity savings offset other diseconomies and barriers in the cloud game? I think cloud computing will also have to innovate in areas like automation and connectivity intelligence."

I think that rising complexity and scale challenges driven by various initiatives (including cloud computing) will force static networks to evolve into dynamic networks. That is the only way that scale and complexity can be addressed, and I think that is the core of Carr's challenge to enterprise IT. Dynamic networks would create a new level of automation potential and reduce the sheer amount of resources dedicated to connectivity and change, which will only go up as endpoints and systems become more mobile and more dynamic.

[Image courtesy of Rick Kagan and Stu Bailey at Infoblox]

Across several recent articles at Archimedius I've talked about the increasingly costly demands of manual labor on IT, including IP address management, DNS, DHCP and a host of other core network services. I've talked about the importance of reachability and connectivity intelligence within the network so that solutions can learn and adapt to these new fluid systems and more powerful endpoints.
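By way of contrast, here is a minimal sketch of the automated alternative. The IPAM service, its URL, its endpoints and its JSON fields are hypothetical stand-ins rather than any particular vendor's API; the pattern, asking a service for the next free address and registering the DNS record in the same step instead of editing a spreadsheet, is the kind of automation I mean.

```python
# Sketch of programmatic address allocation plus DNS registration.
# The IPAM service, its URL, and its JSON shapes are hypothetical
# stand-ins, not any specific vendor's API.
import json
import urllib.request

IPAM_URL = "https://ipam.example.com/api/v1"  # hypothetical service

def allocate_and_register(hostname: str, subnet: str, token: str) -> str:
    """Ask the IPAM service for the next free address in `subnet`,
    then create a matching A record for `hostname`."""
    def post(path: str, payload: dict) -> dict:
        req = urllib.request.Request(
            f"{IPAM_URL}/{path}",
            data=json.dumps(payload).encode(),
            headers={"Content-Type": "application/json",
                     "Authorization": f"Bearer {token}"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    ip = post("subnets/next-free", {"subnet": subnet})["address"]
    post("dns/records", {"name": hostname, "type": "A", "value": ip})
    return ip

# Usage (against a real service):
# ip = allocate_and_register("web01.example.com", "10.0.0.0/24", token="...")
```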

Recent Computerworld and IDC research was also cited in Cloud Computing, Virtualization and IT Diseconomies, my lengthy tome predicting the shrinking role of manual labor in IT. I noted larger enterprises paying more for mundane, boring tasks like managing IP addresses by spreadsheet, even on a cost-per-IP-address basis.

I'll also go so far as to suggest who the leaders are in each required category, from endpoint intelligence (Microsoft) to network intelligence (Cisco) to application intelligence (F5 Networks). I inserted Infoblox as the leader in connectivity intelligence, which I see as the emerging dynamic feedback loop between systems, endpoints and networks that today are overly dependent upon manual labor to address rising flexibility and scale demands. (Disclaimer: I work for Infoblox.)

That's one of the reasons I was so encouraged by the recent discussion at F5's DevCentral community. Here is the post if you're interested in more.


Managing a heterogeneous infrastructure is difficult enough, but managing a dynamic, ever changing heterogeneous infrastructure that must be stable enough to deliver dynamic applications makes the former look like a walk in the park. Part of the problem is certainly the inability to manage heterogeneous network infrastructure devices from a single management system.

- Lori MacVittie, F5 DevCentral

Who knows whether standards could ever emerge among the likes of Cisco, Juniper, Brocade, Riverbed and F5 Networks? Lori is quick to point out that standards have worked in the past, as with WS-I (which included Microsoft and Oracle, among others). A very interesting standard I mentioned previously is IF-MAP from the Trusted Computing Group, whose backers include ArcSight, Aruba, Infoblox and Juniper, among others.
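To give a feel for the IF-MAP idea, here is a toy, in-process imitation of its publish/search/subscribe pattern: clients publish metadata about an identifier (an IP address, say) to a shared map, and other clients subscribe to changes on that identifier. The actual specification defines an XML protocol spoken to a standalone MAP server; none of the names below come from the spec.

```python
from collections import defaultdict
from typing import Callable

# Toy, in-process imitation of the IF-MAP publish/search/subscribe
# pattern. The real IF-MAP spec defines an XML protocol to a
# standalone MAP server; these class and method names are mine.
class MetadataMap:
    def __init__(self):
        self._metadata = defaultdict(dict)      # identifier -> {key: value}
        self._subscribers = defaultdict(list)   # identifier -> callbacks

    def publish(self, identifier: str, **metadata) -> None:
        """Merge new metadata for an identifier and notify subscribers."""
        self._metadata[identifier].update(metadata)
        for callback in self._subscribers[identifier]:
            callback(identifier, dict(self._metadata[identifier]))

    def search(self, identifier: str) -> dict:
        return dict(self._metadata[identifier])

    def subscribe(self, identifier: str, callback: Callable) -> None:
        self._subscribers[identifier].append(callback)

# A firewall watches what the DHCP server and NAC publish about a host:
imap = MetadataMap()
imap.subscribe("10.0.0.42", lambda ident, md: print(f"{ident} -> {md}"))
imap.publish("10.0.0.42", mac="00:11:22:33:44:55", lease="active")
imap.publish("10.0.0.42", posture="compliant")
```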


As the Mind requires a Nervous System, Network Intelligence requires Connectivity Intelligence

Yet I think standards will only be part of the solution, even if they are adopted. I think the critical requirement for Infrastructure 2.0 will be connectivity intelligence. TCP/IP has now outgrown its static shell and is about to be tasked with connecting even more powerful and dynamic systems. Whether it's the rise of RFID in the supply chain, mobility à la Google's Android, or even the adoption of parking meters with their own IP addresses, it is clear that TCP/IP is spreading with or without a strong economy, and the most productive enterprises will be the most likely to survive.

The manual labor that has driven IP address management costs higher as networks grow is similarly impacting other core network services (like DNS and DHCP) that were not created to support such complex arrays of devices, branches and systems. This is the broader opportunity for Juniper, Brocade and others as well: not only to reduce network infrastructure TCO, but to deliver the new level of flexibility demanded by virtualization and the other initiatives driving new scale requirements.

Enterprises are now on the battlefield between two converging forces: the rapid proliferation of TCP/IP, and the increasingly dynamic and powerful systems and endpoints attaching to the network to boost productivity. Those who succeed will have invested in automation based on dynamic feedback between devices and systems, and on the rise of network intelligence.

Gone will be manual spreadsheets tracking IP addresses across large and ever-changing extended enterprise networks. Gone will be endless hours of overtime tied up in mundane, resource-consuming tasks. Gone will be manual pings to determine whether a network is available or secure.
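A minimal sketch of that shift: the one-off manual ping becomes a concurrent sweep that can run on a schedule. The host list is illustrative, and the ping flags assume a Linux-style ping binary (other platforms use different flags).

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

# Sketch of replacing one-off manual pings with a scheduled sweep.
# Host addresses are illustrative; -c/-W are Linux-style ping flags.
HOSTS = ["10.0.0.1", "10.0.0.2", "10.0.0.3"]

def reachable(host: str) -> bool:
    """One ICMP echo request with a one-second timeout."""
    result = subprocess.run(
        ["ping", "-c", "1", "-W", "1", host],
        stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

with ThreadPoolExecutor(max_workers=16) as pool:
    for host, up in zip(HOSTS, pool.map(reachable, HOSTS)):
        print(f"{host}: {'up' if up else 'DOWN'}")
```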

This is the next technology boom, the era of Infrastructure 2.0. Cisco is already on message. F5 is getting there, and I think it is only a matter of time before the marketers at the world's leading technology companies realize that the war is on, and that all of the old alliances that enabled exclusivity, lock-in and layers of manual labor are off the table.

Out of this coming weakness will emerge new strength, possibilities and profits. As Microsoft, Google and Amazon build up steam in the cloud, they are creating demand for even more powerful and intelligent networks. Enterprises that see the network as tactical will take the brunt of the pain from a weak economy; those that embrace automation will be the fastest to return to normal and ultimately establish or maintain operational leadership.

More Stories By Greg Ness

Gregory Ness is the VP of Marketing of Vidder and has over 30 years of experience in marketing technology, B2B and consumer products and services. Prior to Vidder, he was VP of Marketing at cloud migration pioneer CloudVelox. Before CloudVelox he held marketing leadership positions at Vantage Data Centers, Infoblox (BLOX), BlueLane Technologies (VMW), Redline Networks (JNPR), IntruVert (INTC) and ShoreTel (SHOR). He has a BA from Reed College and an MA from The University of Texas at Austin. He has spoken on virtualization, networking, security and cloud computing topics at numerous conferences including CiscoLive, Interop and Future in Review.
