The next era in IT

The history of IT is usually characterized in terms of technological change.  The mainframe era was succeeded by distributed systems and client/server computing.  User interaction has moved from terminals to thick clients to thin browser-based interfaces, which, with the emergence of Web 2.0 technologies, combine the benefits of ubiquity and a rich end-user experience.  Networking has seen a shift from proprietary protocols and wire-level standards to the near-universal adoption of TCP/IP and Ethernet.  On the software front, SOA represents another major technological shift beyond earlier distributed architectures, the ramifications of which will likely take years to unfold.

At a broader level, however, I see IT evolving through three major eras, with the industry currently transitioning from the second era into the third.

The first era, from IT’s inception through the 1970s, revolved around data processing (a name that was in fact given to some IT departments).  During this period, the primary focus was turning data – about customers, taxpayers, materials, and so on – from a paper-based form into an electronic form, and performing batch operations on that data, like calculating interest for savings account holders.
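The batch model of that era is easy to sketch: read every record, apply the same computation, write the result back.  The account holders, balances, and rate below are invented purely for illustration – this is a toy sketch, not any real system’s code:

```python
# Hypothetical batch interest run: one pass over every savings record.
accounts = [
    {"holder": "Smith", "balance": 1000.00},
    {"holder": "Jones", "balance": 2500.00},
]
MONTHLY_RATE = 0.005  # assumed 0.5% per month, for illustration only

# The defining trait of a batch job: no interactivity, just a full
# sweep applying the same calculation to every record in the file.
for acct in accounts:
    acct["balance"] = round(acct["balance"] * (1 + MONTHLY_RATE), 2)

print(accounts[0]["balance"])  # 1005.0
```

The same logic once ran as overnight jobs against tape or disk files; the shape of the computation has barely changed.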

Electronic record-keeping was a huge advance, although getting information into electronic form in the first place was a non-trivial challenge given that almost all data originated in an analog format and had to be manually keyed in or entered through other mechanical processes.  Representing, storing, and manipulating data efficiently was especially important – particularly given the limitations of the hardware of the time – giving rise to the rivalry between hierarchical, relational, and other database technologies.  In fact, while many presume relational technology won out, IBM’s Information Management System (IMS), a hierarchical database system introduced in the 1960s, is still going strong and continues to manage a large chunk of the world’s business data, in part because of its superior performance relative to relational databases, including IBM’s own DB2.
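The difference between the two camps can be sketched with ordinary data structures.  This is a toy illustration – not IMS or SQL, and the customer and account records are invented – but it captures the trade-off: hierarchical systems store each record inside its parent, so access along the hierarchy is direct and fast, while relational systems keep flat tables linked by keys, trading some speed for the flexibility to query and join any table directly:

```python
# Hierarchical model: child records live inside their parent, so reaching
# an account means walking down from its owning customer.
hierarchical = {
    "C001": {
        "name": "Acme Corp",
        "accounts": [
            {"id": "A1", "type": "savings", "balance": 5000.0},
            {"id": "A2", "type": "checking", "balance": 1200.0},
        ],
    },
}

# Relational model: flat tables related by keys; any table can be
# filtered or joined independently of the others.
customers = [{"cust_id": "C001", "name": "Acme Corp"}]
accounts = [
    {"acct_id": "A1", "cust_id": "C001", "type": "savings", "balance": 5000.0},
    {"acct_id": "A2", "cust_id": "C001", "type": "checking", "balance": 1200.0},
]

# Hierarchical access: follow the parent-child path.
savings_h = [a["balance"] for a in hierarchical["C001"]["accounts"]
             if a["type"] == "savings"]

# Relational access: filter on keys, as a join would.
savings_r = [a["balance"] for a in accounts
             if a["cust_id"] == "C001" and a["type"] == "savings"]

print(savings_h == savings_r)  # True: both yield [5000.0]
```

For workloads that always traverse the same parent-to-child paths – exactly the pattern of much transaction processing – the hierarchical layout wastes no work, which is one reason IMS remains competitive on raw throughput.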

By the early 1980s, most large companies and government organizations had a firm grasp on the task of “electronicizing” data and its processing and – while new data processing requirements never stopped arriving – IT’s focus shifted to implementing real-time applications to automate and manage various aspects of a company’s operations.  This era of enterprise application deployments, in which SAP, PeopleSoft, Siebel, Oracle, and others established themselves in the market, was characterized by huge investments in packaged applications.  While companies continued to do custom development, the tide shifted from IT departments building the organization’s core applications – as they mostly did in the data processing era – toward buying off-the-shelf application functionality.

Fast forwarding to this point in the new millennium, the wave of investment in data processing and core business applications has largely run its course, at least within the big global corporations.  They have their databases containing customer, product, and other information, and they have their systems to manage their customer relationships, sales, finances, inventory, production, logistics, human resources, and every other major aspect of the business.  For the most part, large companies have filled the big areas of whitespace within their IT portfolios, a fact evident in SAP and Oracle’s push into small and medium enterprises, which are less IT-saturated, and into non-transactional application areas such as business intelligence and reporting.

So companies have spent the last few decades filling in pieces of the IT puzzle.  Almost by definition, the next era will focus on ensuring that the pieces work together effectively and on reshaping them to meet future business needs.  Of course, organizations will continue to deploy new applications, but these will mainly fill gaps or niche needs not addressed by the big enterprise packages, and the initiatives will generally be smaller in scope than, say, an R/3 implementation.  For most companies, the application portfolio in 2015 will not be very different from the applications already in place today.

The suggestion that “asset optimization” – for lack of a better phrase – will become IT’s defining mission isn’t a pitch for SOA, although SOA will clearly play a role.  Rather, it is a recognition of IT’s evolving circumstances.  Increasingly, IT value won’t be as simple as dropping in an ERP package or creating a customer datamart, because most businesses already have these capabilities (as do their competitors).  Instead, the next era of IT advantage will come not from the IT assets that companies own, but from how effectively they are able to exploit these assets.

I use the word “optimize” because it implies achieving the best result within a set of constraints, which is a fitting description of IT today.  IT’s constraints are many.  Budget increases are modest, maintenance consumes an ever-increasing portion of the spend, the labor picture is tightening despite offshoring and globalization, the technology landscape is becoming more diverse and complex, and almost all initiatives have to be accomplished within the context of the existing installed base of applications.  At the same time, there is an ever-increasing business appetite for IT capability, requirements are coming from new sources (for example, regulatory compliance), and, as businesses seek to differentiate how they do things versus simply what they do, there is a greater need for IT to empower the line worker.  IT’s challenge, in a nutshell, is finding ways to deliver on these requirements in the face of some not insignificant restrictions.

Thus, just as data processing and enterprise application deployments required different skill sets, organizational structures, and technology capabilities, the new era of “doing more with what you have” entails some significant adaptations on IT’s part, and in my next post, I’ll get into some of these changes.

On a final note, my characterization of IT’s three eras is obviously a generalization.  Different companies, industries, and even countries will be at various stages along this timeline and, furthermore, the eras overlap to some extent (think of three bell curves overlapping each other).  Nevertheless, I think the progression is pretty clear, and even companies that aren’t as far along this path will at some point want to think about how they equip themselves for the road ahead.

Anyway, more thoughts on this to come.

