How Can Metcalfe's Law Be Updated for Web 2.0?

Metcalfe's Law is More Misunderstood Than Wrong

"Metcalfe's Law is Wrong," contended Bob Briscoe, Andrew Odlyzko, and Benjamin Tilly recently in a much-discussed IEEE Spectrum article, in which they wrote: "Of all the popular ideas of the Internet boom, one of the most dangerously influential was Metcalfe's Law." Sim Simeonov disagrees.

The industry is at it again, trying to figure out what to make of Metcalfe's Law. This time it's IEEE Spectrum with a controversially titled "Metcalfe's Law is Wrong". The main thrust of the argument is that the value of a network grows as O(n log n) rather than O(n²). Unfortunately, the authors' O(n log n) suggestion is no more accurate or insightful than the original proposal.
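To get a feel for how far apart these two growth rates are, here is a minimal sketch (the absolute numbers are meaningless since neither proposal specifies units of value; only the ratio matters):

```python
import math

# How differently the two proposals scale: Metcalfe's n^2 versus the
# IEEE authors' n*log(n). Only the ratio between them is the point.
for n in (10, 100, 10_000, 1_000_000):
    metcalfe = n * n
    nlogn = n * math.log(n)
    print(f"n={n:>9,}  n^2={metcalfe:>15,}  n*log(n)={nlogn:>15,.0f}  ratio={metcalfe / nlogn:>9,.0f}")
```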

There are three issues to consider:

  • The difference between what Bob Metcalfe claimed and what ended up becoming Metcalfe’s Law
  • The units of measurement
  • What happens with large networks

The typical statement of the law is "the value of a network increases proportionately with the square of the number of its users." That's the formulation you'll find on Wikipedia. It happens not to be what Bob Metcalfe claimed in the first place. These days I work with Bob at Polaris Venture Partners. I have seen a copy of the original (circa 1980) transparency that Bob created to communicate his idea. IEEE Spectrum has a good reproduction, shown here.

The original Metcalfe's Law graph

The unit of measurement along the X-axis is "compatibly communicating devices", not users. The credit for the "users" formulation goes to George Gilder, who wrote about Metcalfe's Law in Forbes ASAP on September 13, 1993. However, Gilder's article talks about machines, not users. Either way, both the "users" and "machines" formulations miss the subtlety imposed by the "compatibly communicating" qualifier, which is the key to understanding the concept.

Bob, who invented Ethernet, was addressing small LANs where machines are visible to one another and share services such as discovery and email. He recalls that his goal was to persuade companies to install networks with at least three nodes. That's a far cry from the Internet, where most machines cannot see one another and have nothing to communicate about anyway. So, if you're talking about a smallish network whose nodes are indeed "compatibly communicating", I'd argue that the original suggestion holds up pretty well.

The authors of the IEEE article take the "users" formulation and suggest that the value of a network should grow on the order of O(n log n) rather than O(n²). Are they correct? It depends. Is their proposal a meaningful improvement on the original idea? No.

To justify the log n factor, the authors apply Zipf's Law to large networks. Again, the issue I have is with the unit of measurement. Zipf's Law applies to homogeneous populations (the original research was on natural language). You can apply it to books, movies and songs. It's meaningless to apply it to the population of books, movies and songs put together or, for that matter, to the Internet, which is perhaps the most heterogeneous collection of nodes, people, communities, interests, etc. one can point to. For the same reason, you cannot apply it to MySpace, which is a group of sub-communities hosted on the same online community infrastructure (OCI), or to the Cingular / AT&T Wireless merger.
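For context, here is a sketch of how the authors' log n factor falls out of Zipf's Law (a reconstruction of their reasoning, not an endorsement of applying it to heterogeneous populations):

```python
import math

# Under a Zipf assumption, the value a node gets from its k-th most
# valuable connection falls off as 1/k. Summing over its n-1 peers
# gives the harmonic sum, which is ~ln(n); multiplying by n nodes
# yields the authors' O(n*log(n)) total.
def per_node_value(n: int) -> float:
    return sum(1.0 / k for k in range(1, n))

for n in (100, 10_000, 1_000_000):
    print(f"n={n:>9,}  harmonic sum={per_node_value(n):7.3f}  ln(n)={math.log(n):7.3f}")
```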

The main point of Metcalfe's Law is that the value of networks exhibits super-linear growth. If you measure the size of a network in users, the value definitely does not grow as O(n²), but I'm not sure O(n log n) is a significantly better approximation, especially for large networks. A better approximation of value would be something along the lines of O(Σ_{c∈C} m_c·log m_c), where C is the set of homogeneous sub-networks/communities and m_c is the size of sub-community c. Since the same user can be a member of multiple social networks, and since |C| is a function of n (there are more communities in larger networks), it's not clear what the total value ends up being. That's a Long Tail argument if you want one…
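A minimal sketch of this community-based valuation, with made-up community sizes (the point is only that the same number of memberships can produce very different totals depending on how they split across communities):

```python
import math

# Value as a sum over homogeneous sub-communities: sum of m_c * log(m_c).
# The community sizes below are invented purely for illustration.
def community_value(sizes):
    return sum(m * math.log(m) for m in sizes if m > 1)

n = 1_000_000  # total memberships (a user may belong to several communities)

one_big = [n]                            # treat the network as homogeneous
fragmented = [100_000] + [90] * 10_000   # same memberships, many communities

print(f"one community:    {community_value(one_big):>13,.0f}")
print(f"many communities: {community_value(fragmented):>13,.0f}")
```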

Very large networks pose a further problem. Size introduces friction and complicates connectivity, discovery, identity management, trust provisioning, etc. Does this mean that at some point the value of a network starts going down (as another good illustration from the IEEE article shows)? It depends on the infrastructure. Clients and servers play different roles in networks. (For more on this in the context of Metcalfe's Law, see Integration is the Killer App, an article I wrote for XML Journal in 2003, having spent less time thinking about the problem ;-) ). P2P sharing, search engines and portals, anti-spam tools and federated identity management schemes are just a few of the myriad technologies that have come about to address scaling problems on the Internet. MySpace and LinkedIn have very different rules of engagement and policing schemes. These communities will grow and increase in value very differently. That's another argument for the value of a network aggregating across a myriad of sub-networks.

Bottom line, the article attacks Metcalfe’s Law but fails to propose a meaningful alternative.

More Stories By Simeon Simeonov

Simeon Simeonov is CEO of FastIgnite, where he invests in and advises startups. He was chief architect or CTO at companies such as Allaire, Macromedia, Better Advertising and Thing Labs. He blogs at blog.simeonov.com, tweets as @simeons and lives in the Greater Boston area with his wife, son and an adopted dog named Tye.



Most Recent Comments
wsanders 08/13/06 08:41:43 AM EDT

I sort of see the key insight of Briscoe, Odlyzko, and Tilly: if you are going to pull a function out of your ass, it makes more sense if the derivative of the function flattens out rather than sloping linearly upwards forever, because there is ultimately decreasing value in each connection as the number of connections increases.

So they were correct to pull a log function out of their ass, but they could just as easily have pulled out n*ln(n) or some other base. They made no attempt to "calibrate" the model.

A good insight is this quotation:

"If Metcalfe's Law were true, it would create overwhelming incentives for all networks relying on the same technology to merge, or at least to interconnect. These incentives would make isolated networks hard to explain. Consider two networks, each with n members. By Metcalfe's Law, each one's value is on the order of n 2, so the total value of both of these separate networks is roughly 2n 2. But suppose these two networks merge. Then we will effectively have a single network with 2n members, which, by Metcalfe's Law, will be worth (2n)2 or 4n 2--twice as much as the combined value of the two separate networks.

Surely it would require a singularly obtuse management, to say nothing of stunningly inefficient financial markets, to fail to seize this obvious opportunity to double total network value by simply combining the two."

Inflating these "synergies" was exactly what led to the Bombing Off of the Bubble.
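A quick numeric check of the quoted merger arithmetic, and of how the n·log(n) alternative defuses it (a sketch; n is arbitrary):

```python
import math

n = 1_000  # members in each of the two separate networks

# Metcalfe: merging doubles total value out of thin air.
separate_m = 2 * n**2        # two isolated networks: 2n^2
merged_m = (2 * n) ** 2      # one merged network: (2n)^2 = 4n^2

# n*log(n): merging gains far less, so isolated networks are explicable.
separate_l = 2 * (n * math.log(n))
merged_l = (2 * n) * math.log(2 * n)

print(f"Metcalfe gain from merging:  {merged_m / separate_m:.2f}x")  # 2.00x
print(f"n*log(n) gain from merging:  {merged_l / separate_l:.2f}x")  # ~1.10x
```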

MountainLogic 08/13/06 08:38:03 AM EDT

This all came about because Metcalfe was trying to make a case for networking (i.e., Ethernet).

Back then the Ethernet cards he was selling were expensive. The decision maker would go, "Gee, if it costs $x to network two people, why can't Bob just walk down the hall to Jan's office?" If x is greater than the cost of Bob "walking down the hall" (or snail mailing, or flying...) then there is no business case for installing a network. More to the point:
If the node cost, x, is $100 and there are 100 users, n, then the cost of the network is $10,000.

If the single-user business value, v, of the network is $10, then the ROI for the different valuation methods is:

Linear: n·v = $1,000 -- no business case, don't even think about it

Metcalfe's Law: (n(n-1)/2)·v = $49,500 -- winner

Metcalfe's Law as misused by dot-bombers: n²·v = $100,000 -- "proves" selling frozen mud on the net is a winner

As restated by the authors: n·log(n)·v = $2,000 -- no business case, but better than flat linear

There really are two problems here: the scaling formula and setting the business value. If you set the business value of a single connection higher than the cost of the network, then it is a no-brainer, but back when Metcalfe was pushing networking that was a hard case to make.
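A minimal sketch reproducing the numbers above; note that the restated formula only works out to $2,000 with a base-10 log:

```python
import math

n, v, node_cost = 100, 10, 100  # users, per-user value ($), cost per node ($)

cost = n * node_cost                 # $10,000 to build the network
linear = n * v                       # $1,000
metcalfe = (n * (n - 1) / 2) * v     # $49,500 (distinct pairs of users)
metcalfe_misused = n**2 * v          # $100,000 (counts self and double links)
restated = n * math.log10(n) * v     # $2,000

for name, value in [("linear", linear), ("Metcalfe", metcalfe),
                    ("misused Metcalfe", metcalfe_misused),
                    ("n*log10(n)", restated)]:
    print(f"{name:>17}: ${value:>9,.0f}  vs network cost ${cost:,}")
```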

Scott Allen 08/13/06 08:20:00 AM EDT

Metcalfe's Law has always been understood to mean "the theoretical potential value", since a very large number of the links between nodes will never be made.

This same idea can be applied to Reed's Law, which states that the potential value of social networks, i.e., those with self-organizing groups, grows as 2^N (actually, 2^N - N - 1). In reality, of course, this never even comes close, because real groups don't usually organize like that ("let's make a sub-group within our group that excludes one person", etc.).
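For reference, a tiny sketch of where the 2^N - N - 1 count comes from (all subsets of N members, minus the N singletons and the empty set):

```python
# Number of possible self-organizing groups among N members.
def reeds_groups(n: int) -> int:
    return 2**n - n - 1

for n in (3, 10, 30):
    print(n, reeds_groups(n))  # 4, 1013, 1073741793
```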

Also, virtual groups, in particular, don't tend to break off into ad hoc sub-groups if they don't have the tools to do so, e.g., the participants in one thread don't typically say "Let's go create a new group to discuss this." It's theoretically possible, but only occasionally occurs. Of course, the participants in a particular thread can be considered a sort of ad hoc sub-group, in which case you might achieve something closer to the potential, but still...

Interesting stuff - thanks for the link.

Jon Rubin 08/13/06 08:05:56 AM EDT

Metcalfe's Law said that the value of a network, x, as it increases in network members, n, is described by the equation x=n^2.

In other words, it said the value of a network was proportional to the square of the network's users.

Instead, the IEEE article declares it should be x = n·log(n). My suspicion - and this has no basis in anything and I haven't even graphed it and my total knowledge of information theory is that it was started by a guy named Claude - is that Euler has to be involved somewhere and maybe x = n·ln(n) would be more correct.

Peter O'Kelly 08/13/06 08:03:17 AM EDT

insightful analysis

Johannes Ernst 08/13/06 07:58:19 AM EDT

Of course Metcalfe's Law is overstated! If I have a fax machine and somebody in central China, who I have never heard of and will never interact with, buys another fax machine, the value of the network will not grow proportionally to N (say, hundreds of millions: the number of fax machines in existence today) but by some much smaller number such as a couple of hundred at the maximum (the number of fax machines that person will ever send a fax to or receive a fax from)...
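A sketch of this marginal-value point (the fax-machine counts are illustrative, not real figures):

```python
# Marginal value of one more fax machine, under Metcalfe versus the
# bounded-contacts picture described above.
N = 300_000_000   # fax machines already in the network (made-up figure)
contacts = 200    # machines a new owner will realistically ever fax

metcalfe_marginal = (N + 1)**2 - N**2  # = 2N + 1, grows with network size
bounded_marginal = contacts            # constant, regardless of N

print(f"Metcalfe says the new machine adds ~{metcalfe_marginal:,} units of value")
print(f"The bounded-contacts argument says ~{bounded_marginal} at most")
```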
