How Can Metcalfe's Law Be Updated for Web 2.0?

Metcalfe's Law is More Misunderstood Than Wrong

"Metcalfe's Law is Wrong," contended Bob Briscoe, Andrew Odlyzko, and Benjamin Tilly recently in a much-discussed IEEE Spectrum article, in which they wrote: "Of all the popular ideas of the Internet boom, one of the most dangerously influential was Metcalfe's Law." Sim Simeonov disagrees.

The industry is at it again – trying to figure out what to make of Metcalfe’s Law. This time it’s IEEE Spectrum, with an article controversially titled “Metcalfe’s Law is Wrong”. The main thrust of the argument is that the value of a network grows O(n log n) as opposed to O(n²). Unfortunately, the authors’ O(n log n) suggestion is no more accurate or insightful than the original proposal.
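
For a sense of how far apart the two growth claims are, here is a purely illustrative sketch – the function names and network sizes are mine, not from either article, and constant factors are ignored – that prints both curves for a few network sizes:

```python
# Compare the two proposed growth models for network value.
# Constants are omitted; only the shape of each curve matters.
import math

def value_metcalfe(n: int) -> float:
    """Metcalfe-style valuation: proportional to n^2."""
    return float(n * n)

def value_nlogn(n: int) -> float:
    """The IEEE Spectrum authors' proposal: proportional to n*log(n)."""
    return n * math.log(n) if n > 1 else 0.0

for n in (100, 10_000, 1_000_000):
    print(f"n={n:>9,}  n^2={value_metcalfe(n):.2e}  n*log(n)={value_nlogn(n):.2e}")
```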

There are three issues to consider:

  • The difference between what Bob Metcalfe claimed and what ended up becoming Metcalfe’s Law
  • The units of measurement
  • What happens with large networks

The typical statement of the law is “the value of a network increases proportionately with the square of the number of its users.” That’s the formulation you’ll find on Wikipedia. It happens not to be what Bob Metcalfe claimed in the first place. These days I work with Bob at Polaris Venture Partners. I have seen a copy of the original (circa 1980) transparency that Bob created to communicate his idea. IEEE Spectrum has a good reproduction, shown here.

The original Metcalfe's Law graph

The unit of measurement along the X-axis is “compatibly communicating devices”, not users. The credit for the “users” formulation goes to George Gilder, who wrote about Metcalfe’s Law in Forbes ASAP on September 13, 1993. However, Gilder’s article talks about machines, not users. Anyway, both the “users” and “machines” formulations miss the subtlety imposed by the “compatibly communicating” qualifier, which is the key to understanding the concept.

Bob, who invented Ethernet, was addressing small LANs where machines are visible to one another and share services such as discovery, email, etc. He recalls that his goal was to have companies install networks with at least three nodes. Now, that’s a far cry from the huge Internet, where most machines cannot see one another and/or have nothing to communicate about… So, if you’re talking about a smallish network whose nodes really are “compatibly communicating”, I’d argue that the original suggestion holds up pretty well.
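
To see why, here is a minimal sketch – my own illustration, not anything taken from the original transparency – counting the potential pairwise links among n compatibly communicating devices:

```python
# Potential pairwise links among n compatibly communicating devices: n*(n-1)/2.
# In a small LAN every link can plausibly carry value, so value grows ~n^2
# while the cost of adding nodes grows only ~n.
def potential_links(n: int) -> int:
    return n * (n - 1) // 2

for n in (2, 3, 10, 50):
    print(f"{n:>3} devices -> {potential_links(n):>5} potential links")
```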

The authors of the IEEE article take the “users” formulation and suggest that the value of a network should grow on the order of O(n log n) as opposed to O(n²). Are they correct? It depends. Is their proposal a meaningful improvement on the original idea? No.

To justify the log n factor, the authors apply Zipf’s Law to large networks. Again, the issue I have is with the unit of measurement. Zipf’s Law applies to homogeneous populations (the original research was on natural language). You can apply it to books, movies and songs. It’s meaningless to apply it to the population of books, movies and songs put together or, for that matter, to the Internet, which is perhaps the most heterogeneous collection of nodes, people, communities, interests, etc. one can point to. For the same reason, you cannot apply it to MySpace, which is a group of sub-communities hosted on the same online community infrastructure (OCI), or to the Cingular / AT&T Wireless merger.
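
For reference, here is the reasoning behind the log n factor as I read the IEEE article, written out as a short derivation (the notation V(n) for network value is mine):

```latex
% If a user's k-th most valuable contact is worth roughly 1/k (a Zipf
% distribution), each user's total value is a harmonic sum, and the whole
% network sums that over all n users:
\sum_{k=1}^{n-1} \frac{1}{k} \;\approx\; \ln n
\qquad\Longrightarrow\qquad
V(n) \;\propto\; n \ln n \;=\; O(n \log n)
```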

The main point of Metcalfe’s Law is that the value of networks exhibits super-linear growth. If you measure the size of networks in users, the value definitely does not grow O(n²), but I’m not sure O(n log n) is a significantly better approximation, especially for large networks. A better approximation of value would be something along the lines of O(Σ_C m_c log m_c), summing over the set C of homogeneous sub-networks/communities, where m_c is the size of the particular sub-community/network. Since the same user can be a member of multiple social networks, and since |C| is a function of N (there are more communities in larger networks), it’s not clear what the total value will end up being. That’s a Long Tail argument if you want one…
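
As a toy illustration of that aggregate (the community sizes below are invented and constant factors are ignored):

```python
# Toy version of the suggested approximation: sum m_c * log(m_c) over the set C
# of homogeneous sub-communities. Sizes are invented for illustration; the same
# user may belong to several communities, so sizes can overlap.
import math

def community_value(m: int) -> float:
    """Value contributed by one homogeneous sub-community of m members."""
    return m * math.log(m) if m > 1 else 0.0

def network_value(community_sizes: list[int]) -> float:
    """Aggregate value across all sub-communities in C."""
    return sum(community_value(m) for m in community_sizes)

# Treating a million users as one undifferentiated community...
print(f"{network_value([1_000_000]):.3e}")
# ...versus the same users spread across many overlapping interest groups.
print(f"{network_value([50_000] * 40):.3e}")
```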

Very large networks pose a further problem. Size introduces friction and complicates connectivity, discovery, identity management, trust provisioning, etc. Does this mean that at some point the value of a network starts going down (as another good illustration from the IEEE article shows)? It depends on infrastructure. Clients and servers play different roles in networks. (For more on this in the context of Metcalfe’s Law, see Integration is the Killer App, an article I wrote for XML Journal in 2003, having spent less time thinking about the problem ;-) ). P2P sharing, search engines and portals, anti-spam tools and federated identity management schemes are just a few examples of the myriad technologies that have come about to address scaling problems on the Internet. MySpace and LinkedIn have very different rules of engagement and policing schemes. These communities will grow and increase in value very differently. That’s another argument for the value of a network aggregating across a myriad of sub-networks.

Bottom line, the article attacks Metcalfe’s Law but fails to propose a meaningful alternative.

More Stories By Simeon Simeonov

Simeon Simeonov is CEO of FastIgnite, where he invests in and advises startups. He was chief architect or CTO at companies such as Allaire, Macromedia, Better Advertising and Thing Labs. He blogs at blog.simeonov.com, tweets as @simeons and lives in the Greater Boston area with his wife, son and an adopted dog named Tye.
