|By Dennis D. McDonald||
|April 15, 2006 03:00 AM EDT||
That doesn’t mean there’s agreement yet on what the term means. This is one of the reasons we’re hearing about “enterprise resistance” to Web 2.0 applications (more on this below).
Web 2.0, after all, means different things to different people:
- To the programmer, it’s a set of tools and techniques with the potential to fundamentally alter how network-based applications and data are developed, managed, and delivered.
- For start-ups and venture capitalists, it’s an opportunity to get in on the ground floor of another “bubble.”
- For the corporate CIO or IT manager, it’s another set of technologies and architectures to be adopted and supported in an era of continued I.T. department budget strains.
- For newer or smaller companies, it’s an opportunity to acquire technical and business process infrastructure at a fraction of the cost of the investments made by older and legacy companies.
- For the marketing manager, it’s an opportunity to “end-run” a traditionally unresponsive I.T. department.
- For the customer, it’s an opportunity to establish and maintain relationships that are both personally fulfilling and empowering in the face of the traditional power of larger institutions.
- For the CEO of an established legacy industry company, it’s a threat of loss of control over customer relations.
With so many perspectives, it's no wonder that it's difficult to get a clear picture. We’re dealing not only with shifting technical architectures but also with shifts in how individuals and organizations use the Internet. We know that different industries adopt technology at different rates. In the case of Web 2.0, we're talking not just about changes in technology and associated business processes, but also about changes in the relationships that are built around how systems are developed and used.
Reality: Things Don't Always Work
My personal interest in this subject is anything but academic. As a management consultant I help companies plan and manage changes to technology and processes. With Web 2.0 the opportunity for change is massive.
This change is not going to happen overnight. There’s always a risk when a new technical architecture is introduced. Part of the risk is getting components of the architecture into the hands of the users. Another is making sure those components work reliably.
I was reminded of this recently when my wife called about an email she had received that contained a link to something she couldn't read: "I'm trying to view an attachment to an e-mail I got from a client but I can't. I called her to ask what to do and she told me to make sure that 'flash was turned on.' What does she mean by 'flash'?"
I explained what "Flash" from Macromedia is and fixed her problem, but not until she had experienced a significant delay in communicating with a prospective client -- all because a popular web browser add-in wasn't properly configured.
This got me to thinking about the current evangelizing that is swirling around Web 2.0, AJAX, SOA, and tools like Ruby on Rails, a web application development framework. I've been impressed with the mass of available applications that offer sophisticated functionality without requiring a "heavy client footprint." Just check out Christian Mayaud's list of Web 2.0 applications if you want to be amazed (and amused).
I thought back to my wife's question. "Flash" is one of those "helper" applications around which an entire industry and developer community has grown up. It's now firmly a part of the Internet infrastructure. That wasn't always the case. As I saw with my wife's question, there are still "pockets" of users where an unknown -- but simple -- configuration setting can cause the final step in a complex communication channel to fail.
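Her problem is, at bottom, a failed capability check: the page assumed Flash was present and rendered nothing useful when it wasn't. A hedged sketch of the defensive alternative -- the function names and the plugin-string match are my own simplifications, not code from any real Flash detector:

```typescript
// Illustrative sketch: decide whether rich content can render, and fall
// back gracefully instead of failing silently like the e-mail attachment.

// The list of installed plugin names, as a browser exposes via
// navigator.plugins; passed in explicitly so the logic is testable anywhere.
function canRenderRichContent(pluginNames: string[]): boolean {
  return pluginNames.some((name) =>
    name.toLowerCase().includes("shockwave flash")
  );
}

// Choose what to show the user: the rich version, or a plain fallback with
// an explanation -- the step that was missing in the anecdote above.
function pickContent(pluginNames: string[]): string {
  return canRenderRichContent(pluginNames)
    ? "rich-flash-view"
    : "plain-fallback-with-install-hint";
}
```

A page built this way degrades to a plain fallback -- with a hint about what to install -- instead of leaving the recipient staring at an attachment she can't open.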
Parallels with Web 2.0
This got me to thinking about what has to happen each time an AJAX-based application is used. Some current "mashups" might be combining widely available public data. With "enterprise" types of applications, we might be talking about the over-the-web handling of valuable -- or sensitive -- personal or financial data. The reliability and stability of every part of the server, net, and client will be critical. All will have to work together to ensure reliable two-way interaction.
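The chain named above -- server, net, and client -- can be made concrete with a small defensive wrapper around every remote call. This is an illustrative sketch in modern TypeScript, not period code; the `Result` type and `callWithTimeout` name are my own:

```typescript
// Illustrative sketch: every remote call in an AJAX-style application
// should expect failure at the server, on the network, or in the client,
// and report it explicitly rather than hanging silently.

type Result<T> = { ok: true; value: T } | { ok: false; reason: string };

// Wrap any asynchronous fetch-like call with a timeout, so a stalled
// network leg surfaces as an explicit error instead of a frozen interface.
async function callWithTimeout<T>(
  call: () => Promise<T>,
  timeoutMs: number
): Promise<Result<T>> {
  const timeout = new Promise<Result<T>>((resolve) =>
    setTimeout(() => resolve({ ok: false, reason: "timed out" }), timeoutMs)
  );
  const attempt = call().then(
    (value): Result<T> => ({ ok: true, value }),
    (err): Result<T> => ({ ok: false, reason: String(err) })
  );
  // Whichever settles first -- the call itself or the timeout -- wins.
  return Promise.race([attempt, timeout]);
}
```

A client built on a wrapper like this can at least tell the user which leg of the interaction failed, rather than leaving them to wonder.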
Is uncertainty about this reliability one of the reasons why some corporate IT managers are taking a wait-and-see attitude toward "Web 2.0"? My wife's temporary problem with a supposedly mature piece of the web infrastructure is probably not unusual. Many users work day in and day out without paying special attention to extensions, helper applications, thin clients, RSS feeds, and the like. For a lot of them, managing the vicissitudes of a commonly available component such as Flash is, at best, an annoyance.
In theory, the population of users like my wife might be a prime target for the rich functionality that Web 2.0 delivers through remotely served applications, such as those sometimes referred to as "Web Office" or "Office 2.0." Technologies such as AJAX (and Flash, ActiveX, and Java as well) help deliver near-desktop-quality functionality (some may argue with the adjective "near") without requiring the permanent installation of massive amounts of (expensively licensed) software on the client machine. I’m sure that many corporate I.T. folks view potential simplification of the client’s configuration as A Good Thing, right?
But lots of dice have to roll the right way for all parts of the channel to work every time. Sometimes problems occur, as my wife's experience attests -- and she was dealing with one of the oldest and most venerable components for delivering rich media content.
(I think of issues related to this every time I switch over from Yahoo's old-but-serviceable Web Mail interface to the Yahoo! Mail Beta. I'm using the Yahoo! Mail Beta via my office DSL connection, and I always have to wait for data to load. And while I love the "drag and drop" functionality the Yahoo! Mail Beta interface provides, the hesitation of the interface grates after a while, especially when handling as much daily e-mail as I do.)
Are Internet users willing to accept such performance glitches "outside the firewall" in order to gain access to an attractive interface and functionality that looks like it's running from a local client?
I suspect my wife won't. She may not be a “power user,” but her standards for performance are high. (I know - I hear about it whenever our home network slows down!)
Web 2.0 Enterprise Hurdles
If Web 2.0 applications built around AJAX and related technologies are to succeed in the "enterprise," several dice have to roll the right way:
- Tools, development, and testing processes must continue to mature (this is happening).
- Those tools and processes must be accepted into the enterprise -- in addition to, or in place of, the architectures that are already there (e.g., how many development platforms is an IT department willing to support?).
- Data security and stability issues must be solved -- especially when it comes to handling sensitive customer and financial data.
- The new architecture must deliver -- and have documented -- (a) reduced costs, (b) added benefits, or (c) both.
- Company executives must be willing to accept a new network architecture paradigm, along with its frequent association with "social networking" functionality that many people are still not comfortable with.
Except for the last bullet point, the issues here are similar to those associated with the introduction of any new programming language or development framework into the enterprise. The costs of changes to processes and technologies have to be outweighed by the promised benefits. In that sense, AJAX is no different from other evangelized technologies that have come before, except that the Web now provides (potentially) the data, the delivery platform, and the medium for promotion (and hype) -- outside the firewall.
More Than Technical Architecture Challenges
I’m going to make a leap of faith here and predict that the issues of security, reliability, maintainability, privacy, and functionality associated with web-based Web 2.0 applications are going to be successfully addressed and resolved. Call me optimistic, but there’s a lot of creativity out there, and I believe that the current technical challenges will be overcome, and quickly.
But let's return to that last bullet point in the above section. One of the most astute descriptions of Web 2.0 adoption is the recent blog article by Thad Scheer of Sphere of Influence called Monetizing Value of Social Computing in Traditional Industries. (Author disclosure: Thad is a friend of mine.)
Thad is the CEO of a DC-area consulting firm that serves both government and corporate customers. He bases his statements on what he sees as resistance from traditional "brick and mortar" industries to adopting a new customer relationship paradigm. According to Thad, the executives of these industries see Web 2.0 as reducing their control over their data and their customer relationships, especially when "social networking" functionality is included in the mix. (Some of my own Web 2.0 Management Survey interviews have borne this view out.)
While Thad goes on to assess the more willing acceptance of Web 2.0 by newer firms and more consumer-product-oriented firms, the issues he points out are not issues of technical architecture; they are issues of functionality and of the expectations executives have about how they relate to their customers.
It’s entirely reasonable to expect that corporate executives will regard certain population segments and certain types of business transactions as lying outside the realm of either existing or Web 2.0 infrastructures. Not every business transaction can -- or should -- be handled electronically, nor does it need to be handled in a fashion that increases a sense of intimacy.
Change Comes Fast
Changes to social interactions and to network infrastructure usually can't happen overnight. But pressure to adopt more collaborative and interactive techniques based on "social software," even with populations that might have traditionally been thought of as being resistant to such approaches, might develop faster than some people think. In particular, knowledge workers inside and outside even the traditional industries will expect more conversational and interactive communications both within their companies and with the companies -- and customers -- they deal with. Management will need to adapt to the fact that employees are now able to engage with customers more frequently and on a more personal level than ever before. This engagement can lead to loyalty.
Companies will respond differently depending on their structures, management styles, and regulatory constraints. Some will have a small number of "CEO-style" blogs with commenting "turned off." Some will organize groups of staff members to interact in a structured fashion with different groups of users with management structures potentially modeled after high end professional call centers. Still others will engage all staff to interact as a normal part of their job (just like answering the phone). Some may adopt Web 2.0 technologies but only within the corporate organization itself.
In many cases, the I.T. department will be called upon to evaluate and, if necessary, support technology platforms that must interface -- reliably and safely -- with other corporate systems.
Competition Spurs Change
Will the "brick and mortar" businesses that Thad writes about just sit by while smaller and more agile competitors, who have much less invested in legacy infrastructure, nibble away at their businesses?
I don't think so. It's from such competitors that the pressure on traditional companies to adopt "Enterprise Web 2.0" most likely will come.
I am not prepared to rule out the ability of even large “legacy” companies to adopt the faster, more open and agile ways of smaller competitors. In fact, the ability to effectively innovate and manage using collaborative technologies that incorporate social networking methods may turn out to be the next “big thing” in management, comparable to the quality improvement movements spawned in the 20th century by foreign competition to U.S. manufacturing. This might even rival -- and compete with -- the current resources being put into process and system outsourcing.
The wave of change will not be limited to manufacturing but will impact all sizes and types of industries that need to communicate with their customers. That’s everyone.
How to Move Forward
OK, so much for analysis; let’s get practical. The following is my advice to managers, executives, and employees who want to take advantage of Web 2.0 technologies:
- Start small. Do a prototype, not an enterprise rollout. Focus on early results based on well-defined, achievable goals.
- Involve both business and IT. Even if you can negotiate deals with external software vendors without the involvement of your IT department, don’t do it. Your IT department probably has more experience with what might go wrong than you do!
- Minimize integration complexity. Don’t start with a project that requires manipulation of the company’s “crown jewel” financial or customer data. Keep it simple.
- Focus on business benefits. Don’t do something because it’s free, easy, or nice to know. Focus on a system that supports the financial or strategic goals of your organization. It’s always easier to justify costs associated with money making activities than costs associated with overhead and administration.
- Know your costs. Even if you’re using existing staff, existing networks, existing hardware, and free software, don’t pretend that time and infrastructure are free. Keep track of staff, user, and support staff time. If you’re successful with your initial project you’ll have to translate these into dollars.
- Use the technology. Practice what you preach. If you’re experimenting with web based tools for customer communication, use web based tools to support project management and staff communication.
- Face issues head-on. Have a process to track and resolve all business, personal, and technical issues that arise during the course of the project.
- Don’t demonize the opposition. Opposition to Web 2.0 may be justified. Listen and find out what the core issues really are, surface them, and address them.
- Remember it’s a business. Even if you’re working with collaborative software that supports personalization, relationships, and intimacy, remember that there needs to be a reasonable line drawn between business and personal matters.
- Manage. Don’t just toss the technology into the user community. Be prepared to guide, give feedback and, where necessary, lead. Remember: no matter what project management philosophy you follow, projects don’t manage themselves, especially projects where you are breaking new technological or cultural ground.
|Mahendrap 11/25/09 03:09:00 AM EST|
"2.0" denotes adding "interactivity" to web-based applications. And many things enable this interactivity, whether Ajax or something else.
One can cut through the noise of debating what 2.0 means and focus on how adding "interactivity" to existing contexts like content, users, and commerce can throw up interesting solutions.
It is possible to roll out sub-15K proofs of concept in Web 2.0 in any context before the investor worries about ROI. A lot of social media consultants, and sometimes Web 2.0 platform vendors, are all doing it. Keep looking.
|jgo 05/04/06 02:10:37 PM EDT|
LAMP is good. AJAX is evil.
|Dennis D. McDonald 04/18/06 12:39:03 PM EDT|
AUTHOR NOTE: My "acknowledgement" paragraph was inadvertently omitted from the above article: "Rod Boothby, Chris Law, Jeremiah Owyang, Luis Suarez, and Ken Yarmosh were all kind enough to share their insights with me as I refined the article."