Adventures in Digital Real-Time Wonderland

The only way we’ll truly achieve real-time behavior is by understanding the connections the wonderland of real-time requires

In last week’s Forbes article I discussed various senses of the term real-time: low-latency user interfaces, up-to-date information, live human interactions, and high-performance data processing – to name but a few. Today, for the Cortex audience (as well as the Wired Innovations and SYS-CON audiences), it’s time to channel Lewis Carroll and have a wondrous adventure to shed light on the true significance and challenges of real-time.

As we venture down the rabbit hole of our technology-infused world, it’s easy to see that everything is getting faster and bigger and, well, just more. Moore’s Law has just taken a big swig from a bottle that says drink me, as we have more memory, more storage, faster networks, more network-connected doodads, and more and faster processors than ever before. And of course, we’re also getting better at everything: better apps. Better operating systems. Better ways of abstracting every element of our environment to provide even greater performance and flexibility. We are truly living in a time of plenty, if not excess. It’s no wonder that we want real-time in everything we do.

Today’s challenge isn’t only making stuff go faster. It’s figuring out how the acceleration of all the bits and pieces fits together. Furthermore, this push to achieve a holistic perspective on all this gear drives our quest for real-time, as it only takes one bottleneck to slow everything else down. The only way we’ll truly achieve real-time behavior is by understanding the connections this wonderland of real-time requires. So let’s get started; oh my ears and whiskers, we’ll be late!

Real-Time Starting Point: Reactive Programming

Google “real-time.” Right after Bill Maher’s HBO show and a general Wikipedia entry comes Wikipedia’s “real-time computing” page – presumably the real-time we’re talking about here. Load that page and you’ll notice two curiouser and curiouser facts right off the bat: first, the Wikipedia article itself has serious issues – as though no one who cares about real-time computing actually wants anybody else to understand it. Second, “real-time computing” is apparently synonymous with “reactive computing,” a much less familiar term. The rest of the article focuses on the sort of real-time we want from our antilock brakes – useful to be sure, but not the enterprise context we were looking for.

Maybe reactive computing is closer to the mark? Well, there’s inexplicably no Wikipedia page for that. The closest we can come is the reactive computing mock turtle: reactive programming. The basic idea with reactive programming is that the behavior of pieces of software can be declaratively defined, and thus evaluated in real-time – just as spreadsheet cells update automatically when a value they refer to changes.
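
To make the spreadsheet analogy concrete, here is a minimal Python sketch of a reactive "cell" whose value is declared once as a formula and recomputed whenever an input changes. The Cell and derive names are purely illustrative, not any particular reactive library's API.

```python
# A minimal sketch of the spreadsheet analogy: a derived "cell" whose value
# is declaratively defined and recomputed whenever an input cell changes.
# Class and function names (Cell, derive, on_change) are illustrative only.

class Cell:
    def __init__(self, value=None):
        self._value = value
        self._subscribers = []      # callbacks to notify on change

    @property
    def value(self):
        return self._value

    def set(self, new_value):
        self._value = new_value
        for notify in self._subscribers:
            notify()                # push the change to dependents

    def on_change(self, callback):
        self._subscribers.append(callback)


def derive(formula, *inputs):
    """Create a cell whose value is always formula(*inputs), spreadsheet-style."""
    derived = Cell(formula(*(c.value for c in inputs)))

    def recompute():
        derived.set(formula(*(c.value for c in inputs)))

    for c in inputs:
        c.on_change(recompute)
    return derived


# Usage: total reacts to price or quantity changes, like =A1*B1 in a spreadsheet.
price, quantity = Cell(10), Cell(3)
total = derive(lambda p, q: p * q, price, quantity)
quantity.set(5)
print(total.value)  # 50
```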

There’s more to the reactive story than software that updates automatically, however, as a visit to the reactive manifesto illustrates. This wise caterpillar of a manifesto calls out four key reactive traits: event-driven, scalable, responsive, and resilient – essentially calling for Cloud-friendly, event-driven architectures that have the declarative behavior definition we know and love from the spreadsheet – only now across a hybrid enterprise context. Mushroom, anyone?
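
As a rough illustration of the event-driven trait, the sketch below shows a toy in-process publish/subscribe bus in which subscribers react to published events concurrently rather than being called synchronously in line. The EventBus name and the handlers are hypothetical; a production system would use a real messaging backbone rather than an in-process dictionary.

```python
# Toy event-driven sketch: components react to published events rather than
# being invoked directly. Names (EventBus, subscribe, publish) are illustrative.

import asyncio
from collections import defaultdict

class EventBus:
    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._handlers[topic].append(handler)

    async def publish(self, topic, payload):
        # All handlers for the topic run concurrently; a failing subscriber
        # does not prevent the others from completing.
        await asyncio.gather(*(handler(payload) for handler in self._handlers[topic]),
                             return_exceptions=True)

async def update_inventory(order):
    await asyncio.sleep(0.1)                  # stand-in for real work
    print(f"inventory updated for order {order['id']}")

async def notify_customer(order):
    await asyncio.sleep(0.1)
    print(f"customer notified for order {order['id']}")

async def main():
    bus = EventBus()
    bus.subscribe("order.placed", update_inventory)
    bus.subscribe("order.placed", notify_customer)
    await bus.publish("order.placed", {"id": "A-42"})

asyncio.run(main())
```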

It’s no coincidence that Bloomberg Agile Architecture™ (BAA) also calls out responsiveness and resilience, although the BAA contexts for these terms are aspects of business agility rather than software. Suffice it to say, if your software doesn’t have these traits, it’s unlikely your organization is agile. Alas, we thought we saw the Cheshire Cat of agility, but all we saw was its smile. The people behind the reactive manifesto, however, have a far more technical context for these terms – Play Framework, an open source web application framework for Java and Scala that bills itself as lightweight, stateless, and Web-friendly.

At this point this Cortex might have gone down the Scala rabbit hole – but I’ll save that for a future issue (Through the Looking Glass, perhaps?). Just for fun, however, let’s follow the stateless thread of this adventure to the beach where the Walrus and the Carpenter entertain their oysters. I’ve discussed statelessness over the years in many contexts, from the challenge of maintaining business process instance state with stateless Services, to the role hypermedia plays in transferring state to the client if you actually follow REST properly (which almost no one does), to the challenges state management presents to Cloud-based applications. Understanding the relationship between statelessness and real-time behavior, however, ties all these concepts together in a nice package. The oysters, however, aren’t nearly so satisfied.

State, in fact, is the Queen’s tarts of real-time computing. Sure, sometimes your software behavior can be completely reactive: event happens, do some stuff, give some kind of result, and never keep track of anything or wait around for somebody else to finish something. Such processing can be blisteringly fast, of course. It’s when you have to keep track of something that problems arise: where do you do the tracking? Do you have to keep track of multiple things at once when they might interact somehow? How permanent does the tracking have to be? And most importantly: won’t all this tracking slow everything down?

Time to hide the tarts: we could simply keep track of everything in the database. We get unlimited persistence, but databases are relatively slow and scaling them can be difficult. So let’s call upon the knave of hearts to spirit away those tarts to some piece of infrastructure in our middle tier, like an application server or an ESB. The database breathes a sigh of relief, but now we have a Cloud-unfriendly centralized state management approach. So instead, let’s pass the tarts to the client – after all, that’s where REST got its name (Representational State Transfer, natch). We now have scalability and Cloud friendliness, but this approach doesn’t deal well with shared state (as we would need for any type of collaborative application), and nobody likes HATEOAS, even when they understand it.
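
Here is a minimal sketch of the pass-the-tarts-to-the-client option: the handler below keeps no state between calls, so each request must carry the full cart representation, and the response hands the updated representation back along with hypermedia links. The shape of the representation and the link names are illustrative assumptions, not a standard.

```python
# Stateless, REST-flavored sketch: the server remembers nothing between
# calls; state travels with every request and response.

import json

def add_item(request_body: str) -> str:
    """Stateless handler: everything it needs arrives with the request."""
    cart = json.loads(request_body)
    cart["items"].append(cart.pop("new_item"))
    cart["total"] = sum(item["price"] for item in cart["items"])
    # HATEOAS-ish: the response tells the client what it can do next,
    # instead of the server remembering where the client is in the workflow.
    cart["_links"] = {"checkout": "/cart/checkout", "add": "/cart/items"}
    return json.dumps(cart)


# The client round-trips the whole cart on every call, so any server
# instance in a scaled-out cloud tier can handle the next request.
request = json.dumps({"items": [], "new_item": {"sku": "tart", "price": 3}})
print(add_item(request))
```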

Enter caching. The idea of a cache is to temporarily store those pesky state tarts, thus lightening the load on the persistence tier. And calloo callay! We can now cache in memory, making it wicked fast. But we still have the Cloud-friendliness problem, so enter from the Queen’s croquet pitch the distributed in-memory cache. Cloud-friendly, check. Wicked fast, check. Responsive? Well, it depends on the color of the roses. The problem here, of course, is the problem caches always have: if all your data are always changing or every interaction always requires different data, caches are worse than useless, since caching something only makes sense if somebody is going to use it a few times before you need to refresh it.
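
Here is a toy version of the cache-aside pattern this paragraph describes, assuming a simple time-to-live policy; a real deployment would put this behind a distributed in-memory cache rather than a local dictionary.

```python
# Cache-aside sketch with a time-to-live: repeat reads skip the slow
# persistence tier. Names (TTLCache, get_or_compute) are illustrative.

import time

class TTLCache:
    def __init__(self, ttl_seconds=30):
        self._ttl = ttl_seconds
        self._entries = {}                     # key -> (expires_at, value)

    def get_or_compute(self, key, compute):
        expires_at, value = self._entries.get(key, (0, None))
        if time.monotonic() < expires_at:
            return value                       # cache hit: no trip to the database
        value = compute(key)                   # cache miss: pay the slow path once
        self._entries[key] = (time.monotonic() + self._ttl, value)
        return value


def slow_database_read(key):
    time.sleep(0.5)                            # stand-in for a slow query
    return f"row for {key}"

cache = TTLCache(ttl_seconds=5)
print(cache.get_or_compute("customer:42", slow_database_read))  # slow: misses
print(cache.get_or_compute("customer:42", slow_database_read))  # fast: hits
# If every request asked for a different key, every call would miss --
# exactly the "worse than useless" case described above.
```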

At this point there’s only one more place for the tarts to go: back to the database. We need faster databases that are both Cloud-friendly and deal well with the continually exploding nature of Big Data. It’s no wonder, therefore, that the database marketplace is undergoing a dramatic period of innovation. Yes, another rabbit hole for yet another day – but let’s tease out a single architectural tea party that relates directly to real-time: immutability.

Your mad hatter of a database is immutable if it only supports writes that append data but no updates or deletes. Instead, to handle these pesky changes, additional records are added that indicate a previous record has changed. Immutability is essential for solving some knotty problems with concurrency – a mischievous dormouse for distributed computing since the client/server days and still a hassle in today’s Cloud-enabled world. As anyone who has used GitHub or a similar immutable data store can attest, immutability is the key to scaling a database that supports a large number of users who can add information, since all changes are handled as new data, and furthermore, the data store maintains a complete audit trail of everything that has ever happened, regardless of whether we all move down one seat at the table in search of clean dishes.
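
A minimal sketch of an append-only store, with hypothetical record shapes: "updates" and "deletes" are just new records, so the current value is a fold over the history, and the history itself is the audit trail.

```python
# Append-only (immutable) store sketch: records are never modified in place.

import time

class AppendOnlyStore:
    def __init__(self):
        self._log = []                                   # records are never changed

    def append(self, key, op, payload=None):
        self._log.append({"key": key, "op": op,          # op is "put" or "delete"
                          "payload": payload, "ts": time.time()})

    def current(self, key):
        """Fold the history for a key into its latest state."""
        value = None
        for record in self._log:
            if record["key"] == key:
                value = record["payload"] if record["op"] == "put" else None
        return value

    def history(self, key):
        return [r for r in self._log if r["key"] == key]  # built-in audit trail


store = AppendOnlyStore()
store.append("tart", "put", {"owner": "Queen"})
store.append("tart", "put", {"owner": "Knave"})          # "update" = new record
store.append("tart", "delete")                           # "delete" = new record too
print(store.current("tart"))                             # None
print(len(store.history("tart")))                        # 3 -- nothing was erased
```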

GitHub additionally works well with caching because it must assemble the current version of each stored file by applying all the changes, or diffs, to that file. Temporary storage of each current version thus lightens the load on the underlying data store. But in other situations where the underlying data are always in flux, immutability still helps to address the real-time need. Reads can be extraordinarily fast compared to traditional databases, because the database can look to the index to identify the latest version of a record.
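
Building on the append-only sketch above (again with hypothetical names), the following shows why reads stay fast: a small in-memory index maps each key to the log position of its newest version, so a read is a lookup plus one log access rather than a replay of the whole history.

```python
# Append-only store plus an index to the latest version of each key.

class IndexedAppendOnlyStore:
    def __init__(self):
        self._log = []          # immutable history of records
        self._latest = {}       # index: key -> position of newest record

    def append(self, key, payload):
        self._log.append({"key": key, "payload": payload})
        self._latest[key] = len(self._log) - 1      # index updated on every write

    def read(self, key):
        position = self._latest.get(key)
        if position is None:
            return None
        return self._log[position]["payload"]       # one lookup, no replay


store = IndexedAppendOnlyStore()
for version in range(10_000):
    store.append("white-rabbit", {"version": version})
print(store.read("white-rabbit"))   # {'version': 9999}, without scanning 10,000 records
```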

And so our adventure through real-time computing brings us to indexing in all its glory – not just for finding the desired record, but also for all the metadata necessary to assemble the various diffs in order to deliver the current version of a record in real-time. The metadata story for real-time, however, doesn’t stop at indices, as the army of metadata playing cards is central to the notion of declarative programming.

We have thus come full circle to the notion of reactive programming, which includes declaratively defining the behavior of pieces of software as simply as entering formulas into cells in a spreadsheet. And while a single worksheet may have tens of thousands of cells, extending the role metadata play to a distributed enterprise context ups the ante on the relationship between reactive programming and metadata: being able to resolve the desired behavior of any software given the combination of all metadata in the relevant environment – for every interaction, in real-time.

We call such resolution dynamic constraint satisfaction, where the metadata describe the relevant constraints, even though they may be fully dynamic. Calculating the result, therefore, must take place in real-time. Envision one massive spreadsheet, only instead of formulas in the cells, you have any reactive software you might find anywhere in your IT environment. The cell with your final answer is always correct, and always up to date – in real-time. Off with their heads!
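
As a toy illustration of the idea – and emphatically not Intellyx's or any product's actual algorithm – the sketch below treats the "metadata" as declarative formulas over live facts and resolves any requested value on demand, so the final answer is recomputed correctly every time an input changes. All names and formulas are hypothetical.

```python
# "One massive spreadsheet" sketch: behavior declared as metadata (formulas
# over named facts), resolved on every lookup so answers are always current.

definitions = {
    # declarative formulas, evaluated lazily against the live facts
    "vat":      lambda env: env["net_price"] * env["vat_rate"],
    "gross":    lambda env: env["net_price"] + env["vat"],
    "can_ship": lambda env: env["in_stock"] and env["gross"] < env["credit_limit"],
}

facts = {"net_price": 100.0, "vat_rate": 0.2, "in_stock": True, "credit_limit": 200.0}

class Resolver(dict):
    """Facts come straight from the dict; derived names are recomputed from
    their formulas on every lookup, so nothing is ever stale."""
    def __missing__(self, name):
        return definitions[name](self)

env = Resolver(facts)
print(env["can_ship"])        # True  (gross = 120.0 < 200.0)
env["net_price"] = 180.0      # a "cell" changes...
print(env["can_ship"])        # False (...and the final answer is still correct)
```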

The Intellyx Take

Our adventure down the real-time rabbit hole in this enterprise IT wonderland took us many different places. And while each of the critters we met had its own real-time story, our adventure tied all the individual stories together. Such is the nature of real-time: we have many moving parts and they must all be working at top form together in order to deliver a true real-time experience to each user.

Real-time behavior, therefore, is an important challenge for any digital professional, as there is more to digital transformation than meets the eye. Your customers, partners, and broader audience expect such behavior from your digital efforts, and to keep them happy you need the right technology and most importantly, the right architecture to glue everything together in real-time.

More Stories By Jason Bloomberg

Jason Bloomberg is a leading IT industry analyst, Forbes contributor, keynote speaker, and globally recognized expert on multiple disruptive trends in enterprise technology and digital transformation. He is ranked #5 on Onalytica’s list of top Digital Transformation influencers for 2018 and #15 on Jax’s list of top DevOps influencers for 2017, the only person to appear on both lists.

As founder and president of Agile Digital Transformation analyst firm Intellyx, he advises, writes, and speaks on a diverse set of topics, including digital transformation, artificial intelligence, cloud computing, devops, big data/analytics, cybersecurity, blockchain/bitcoin/cryptocurrency, no-code/low-code platforms and tools, organizational transformation, internet of things, enterprise architecture, SD-WAN/SDX, mainframes, hybrid IT, and legacy transformation, among other topics.

Mr. Bloomberg’s articles in Forbes are often viewed by more than 100,000 readers. During his career, he has published over 1,200 articles (over 200 for Forbes alone), spoken at over 400 conferences and webinars, and he has been quoted in the press and blogosphere over 2,000 times.

Mr. Bloomberg is the author or coauthor of four books: The Agile Architecture Revolution (Wiley, 2013), Service Orient or Be Doomed! How Service Orientation Will Change Your Business (Wiley, 2006), XML and Web Services Unleashed (SAMS Publishing, 2002), and Web Page Scripting Techniques (Hayden Books, 1996). His next book, Agile Digital Transformation, is due within the next year.

At SOA-focused industry analyst firm ZapThink from 2001 to 2013, Mr. Bloomberg created and delivered the Licensed ZapThink Architect (LZA) Service-Oriented Architecture (SOA) course and associated credential, certifying over 1,700 professionals worldwide. He is one of the original Managing Partners of ZapThink LLC, which was acquired by Dovel Technologies in 2011.

Prior to ZapThink, Mr. Bloomberg built a diverse background in eBusiness technology management and industry analysis, including serving as a senior analyst in IDC’s eBusiness Advisory group, as well as holding eBusiness management positions at USWeb/CKS (later marchFIRST) and WaveBend Solutions (now Hitachi Consulting), and several software and web development positions.
