IoT Software Releases | @ThingsExpo #IoT #M2M #BigData #InternetOfThings

Developing software for the Internet of Things (IoT) comes with its own set of challenges

Developing software for the Internet of Things (IoT) comes with its own set of challenges. Security, privacy, and unified standards are a few of the key issues. In addition, each IoT product comprises at least three separate application components: the software embedded in the device, the backend big-data service, and the mobile application that gives the end user control. Each component is developed by a different team, using different technologies and practices, and deployed to a different stack or target, which makes integrating these separate pipelines and coordinating software updates for IoT especially problematic. How do you coordinate the diverse moving parts that must come together when your IoT product is updated?

Getting IoT to Flow
Electric Cloud helps our customers solve software delivery problems at very large scale. Our integrated, end-to-end DevOps platform, ElectricFlow, has proven to be a natural fit for orchestrating the complex pipelines that are common with IoT.

Pipelines
ElectricFlow 6.0 introduces Pipelines, which let you automate end-to-end software development and delivery processes, enabling Application Release Automation (ARA) and Continuous Delivery (CD). You can easily manage the software delivery of one or more applications using pipelines.

A pipeline is a series of high-level, reusable steps that run automated processes. Pipelines consist of one or more stages. Each stage has a stage plan with one or more tasks, an entry gate, and an exit gate.

Tasks run automated build, test, and deployment processes. The tasks within a stage plan run in sequential order.

Each stage has an entry gate and an exit gate. A gate may consist of one or more automatic or manual approvals. When the tasks for a stage are complete and the exit gate requires approval before moving to the next stage, the approvers are notified by email.
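As a rough illustration of this structure (using hypothetical names, not ElectricFlow's actual API), a pipeline can be modeled as stages whose tasks run in order, bracketed by entry and exit gates that only open once their required approvals are in:

```python
# Illustrative sketch only - names and structure are assumptions,
# not ElectricFlow's real object model.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Gate:
    # An empty approver list models an automatic gate.
    approvers: List[str] = field(default_factory=list)

    def passed(self, approvals: set) -> bool:
        # A gate opens once every required approver has signed off.
        return all(a in approvals for a in self.approvers)

@dataclass
class Stage:
    name: str
    tasks: List[Callable[[], None]]
    entry_gate: Gate = field(default_factory=Gate)
    exit_gate: Gate = field(default_factory=Gate)

def run_pipeline(stages: List[Stage], approvals: set) -> List[str]:
    """Run stages in order; stop at the first gate that is not approved."""
    completed = []
    for stage in stages:
        if not stage.entry_gate.passed(approvals):
            break
        for task in stage.tasks:  # tasks within a stage plan run sequentially
            task()
        if not stage.exit_gate.passed(approvals):
            break
        completed.append(stage.name)
    return completed

stages = [
    Stage("Build", [lambda: None]),
    Stage("Test", [lambda: None], exit_gate=Gate(approvers=["qa-lead"])),
    Stage("Production", [lambda: None]),
]
print(run_pipeline(stages, approvals={"qa-lead"}))  # → ['Build', 'Test', 'Production']
```

Without the "qa-lead" approval, the same pipeline would stop after the Build stage, which is exactly the gating behavior described above.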

Pipelines provide several benefits:

  • Orchestration - see who (or what) is working on the release process, which part they are handling, the status of that part, and what the next step is.
  • Visibility - see how the software is performing, whether it is ready for beta or production, where there are performance issues, and which versions are available within the release process.
  • Control - determine whether the software is ready for integration, beta, preproduction, or production, whether it passes the quality criteria at stage gates, and whether it has been approved by the appropriate users or groups.

Pipelines in Action
Let's walk through a real use case to see how ElectricFlow is used to coordinate an IoT release.

The Pieces
In our example, there are three teams that work on the major components of a car-based IoT product:

  • The embedded team develops the software deployed to electric cars. This software collects vehicle-specific information (battery charge, MPGe, etc.) and uploads it to the data center.
  • The backend data team develops the software deployed to the data center. This software collects, analyzes, and provides visualization for the data uploaded by the vehicles.
  • The mobile team develops the app deployed to the app store. This mobile app shows car owners real-time information about their vehicle, such as battery utilization.

For the demo, we assembled the following pieces running "v1" of the product:

The vehicle - the embedded software in the vehicle is simulated by a Raspberry Pi that pushes information to the DB, with a Blink(1) light indicating the battery charge

The data backend - simulated by a database running in AWS EC2, and a Dashing dashboard to visualize the data

The mobile app - an iOS application running on an iPhone and showing real-time data from the DB in EC2
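The vehicle piece above can be sketched in miniature. The endpoint URL, payload fields, and sensor reading below are assumptions for illustration, not the demo's actual code:

```python
# Hypothetical sketch of the embedded component: a device periodically
# sampling battery charge and pushing it to the backend over HTTP.
import json
import time
import urllib.request

BACKEND_URL = "https://backend.example.com/telemetry"  # placeholder URL

def read_battery_charge() -> float:
    # On real hardware this would read the vehicle's sensors;
    # here we return a fixed value for illustration.
    return 87.5

def push_sample(vehicle_id: str) -> bytes:
    """Build the telemetry payload the device would upload."""
    payload = json.dumps({
        "vehicle_id": vehicle_id,
        "battery_pct": read_battery_charge(),
        "timestamp": time.time(),
    }).encode()
    req = urllib.request.Request(
        BACKEND_URL, data=payload,
        headers={"Content-Type": "application/json"})
    # urllib.request.urlopen(req)  # network call skipped in this sketch
    return payload
```

A real device would actually send the request (and the backend would feed the database that the dashboard and mobile app read from), but the shape of the data flow is the same.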

Working in Isolation
Each team uses the appropriate automation mechanism to build, test, and deploy their software in isolation from the other teams.

The vehicle and backend data teams:

These teams use ElectricFlow to:

  1. Model their applications' tiers and components
  2. Define their processes used to build, test, and deploy their applications
  3. Define and run pipelines to execute these processes across the various stages of development
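The three steps above can be sketched as plain data - a hypothetical, simplified stand-in for ElectricFlow's actual modeling features, with all tier, component, and process names invented for illustration:

```python
# Illustrative application model for the backend data team:
# tiers/components, named processes, and the pipeline stages that run them.
backend_app = {
    "name": "backend-data",
    "tiers": {
        "db":        {"components": ["telemetry-schema"]},
        "dashboard": {"components": ["dashing-dashboard"]},
    },
    "processes": {
        "build":  ["compile", "package"],
        "test":   ["unit-tests", "integration-tests"],
        "deploy": ["provision-ec2", "migrate-db", "start-dashboard"],
    },
    "pipeline": ["Dev", "QA", "Staging"],  # stages that run the processes
}

def components(app: dict) -> list:
    # Flatten the tier/component model, e.g. for inventory reporting.
    return [c for tier in app["tiers"].values() for c in tier["components"]]

print(components(backend_app))  # → ['telemetry-schema', 'dashing-dashboard']
```

Keeping the model as data makes the later steps - running processes per stage and reporting what is deployed where - straightforward to automate.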

The backend data application model

The vehicle deployment process

The backend data pipeline

The mobile application team:
Using Ship.IO, the mobile team builds, tests, and deploys their solution to various test devices as well as to the App Stores.

The mobile app jobs - builds are triggered automatically with each commit to Git

The mobile app build/test/deploy process

Bringing it all together
While development and testing in lower environments are done using isolated pipelines, all three applications must converge and be tested before the final push to production. ElectricFlow manages this Release Pipeline.

The Release Pipeline stages and gates in ElectricFlow

For the purpose of the demonstration, we update each of the three components that make up the IoT service to form a "v2" of our product. Once all three updates have passed their pipelines, we are ready to stage our coordinated release. To do that, we run the Release Pipeline.

A run-time instance of the Release Pipeline

The applications use snapshots to deploy the exact same bits and processes in each stage of both the individual team pipelines and the coordinated Release Pipeline. The exact same applications are deployed and tested from Development through to Production. This ensures repeatability and consistency, and greatly reduces the risk of failures when deploying to Production. Once the applications converge, a successful deployment to the Staging environment is followed by a successful deployment to Production.
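The snapshot idea can be sketched as follows; the function names and version labels are illustrative assumptions, not ElectricFlow's API:

```python
# Sketch: capture the exact component versions once, then deploy that
# identical, immutable snapshot to every subsequent environment.
import hashlib
import json

def take_snapshot(component_versions: dict) -> dict:
    """Freeze a version set and fingerprint it so drift is detectable."""
    blob = json.dumps(component_versions, sort_keys=True).encode()
    return {"versions": dict(component_versions),
            "digest": hashlib.sha256(blob).hexdigest()}

def deploy(snapshot: dict, environment: str, inventory: dict) -> None:
    # Every environment receives the same version set - no rebuilds.
    inventory[environment] = snapshot

snap = take_snapshot({"vehicle-sw": "v2", "backend": "v2", "mobile-app": "v2"})
inventory = {}
for env in ["QA", "Staging", "Production"]:
    deploy(snap, env, inventory)

# What was tested in Staging is bit-for-bit what reaches Production.
assert inventory["Staging"]["digest"] == inventory["Production"]["digest"]
```

Recording each deployment in an inventory keyed by environment is also what makes the "what version is where" visibility described below possible.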

A successful Release pipeline

The ElectricFlow environment inventory shows the version of each component that is currently deployed. This visibility is powerful both when debugging failures and when providing data for audit reports.

The inventory for a Staging environment

Automate everything
This example represents the moving parts that are typical of an IoT service. Coordinating software delivery across these moving parts poses a challenge, particularly when cross-team integration is required, and especially at scale. Manual coordination is typically used to address this, but manual tasks are slow and extremely error-prone. Automation is the key to delivering high-quality software at a rapid pace.

In the above demonstration, we see how ElectricFlow models an entire end-to-end IoT delivery lifecycle.  Pipelines orchestrate the development and subsequent convergence of multiple applications - from commit through test and deployment to production.  The simple and intuitive UI, along with the scalable automation of CI and deployment processes, makes ElectricFlow a natural solution for IoT software delivery challenges.

More Stories By Anders Wallgren

Anders Wallgren is Chief Technology Officer of Electric Cloud. Anders brings with him over 25 years of in-depth experience designing and building commercial software. Prior to joining Electric Cloud, Anders held executive positions at Aceva, Archistra, and Impresse. Anders also held management positions at Macromedia (MACR), Common Ground Software and Verity (VRTY), where he played critical technical leadership roles in delivering award winning technologies such as Macromedia’s Director 7 and various Shockwave products.
