Applying Advanced Agile Methodologies

Our big challenge now is no longer the speed of code propagation. It’s how we can manage effective communication among streams

In the five years since I co-founded Bonitasoft with Miguel Valdes Faura and Rodrigue Le Gall, our organization has come a long way.

We started with seven developers. We now have 17 developers dedicated full time to Bonita BPM - along with a systems architect, a QA team, a documentation team, and a "human factors" engineer. We've logged 2.75 million downloads, booked 875 customers, and built a community of 60,000 contributors.

How do you triple the size of your development team in less than five years and keep consistent control over your processes? Well, even for a company that's in the business of helping others improve processes, it's been a challenge, a learning experience - and a great opportunity to apply some interesting "advanced" agile methodologies.

How We Started with Agile
Our initial small team focused on development of the Bonita Execution Engine, the Bonita User Experience (web), and the Bonita Studio, with each of these groups having a specific skill set and a technical leader. From the very start we applied agile development practices - with the entire team working together in the same two-week sprint, participating in the daily scrum meetings, and so on.

With a small team all working on the same code, we were able to make very efficient progress - we got the first release of Bonita Open Solution out in six months.

But as we grew our development team and as we dealt with the inevitable errors that crept in, we found ourselves being held up. If the build chain broke, everyone's progress was affected.

With the growing team, to avoid these compilation issues, we broke R&D into three individual teams (still focused on the Engine, the Web, and the Studio components of the Bonita BPM suite) and gave each team an independent release process for its component. This greatly helped us isolate bugs, but for fixes, the Studio team was always last in line - they needed a stable build from the Web team, who in turn needed a stable build from the Engine team. It could take as long as two weeks before a bug discovered and fixed on the same day by the Engine team actually propagated to the Studio team.
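
To make that delay concrete, here is a rough sketch - purely illustrative, not our actual build tooling, and assuming roughly one-week stabilization cycles per team - of how chained stable builds stretch out propagation time:

```java
import java.util.List;

// Illustrative model only: assumes each team publishes a stable build once
// per stabilization cycle, and a fix must cross one full cycle per hand-off.
public class ChainedReleaseModel {

    static int worstCasePropagationDays(List<String> buildChain, int cycleLengthDays) {
        int handOffs = buildChain.size() - 1;   // Engine -> Web, Web -> Studio
        return handOffs * cycleLengthDays;      // one full cycle per hand-off
    }

    public static void main(String[] args) {
        List<String> chain = List.of("Engine", "Web", "Studio");
        // Assuming ~one-week cycles purely for illustration: 2 hand-offs * 7 days = 14 days.
        System.out.println(worstCasePropagationDays(chain, 7) + " days");
    }
}
```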

The Business Pressure to Change Our Development Approach
The growth of our team was only one aspect of the pressures we faced in engineering. As we moved through our Bonita Open Source version 5 product releases and began to prepare for the release of our new product, Bonita BPM version 6, we began to work more and more closely with the Product Committee. Together we started looking at ways to allow R&D to work on multiple features simultaneously, end to end, without pulling resources from one team to another. We wanted to reduce the time needed to fully develop new features of better quality, and to fix bugs. Bonitasoft's use of Value Streams at the strategic level offered a logical possibility: link R&D to corporate strategic goals for innovation and improvement.

The New R&D Organization: Agile Streams
Our development team is now organized into four streams: Innovation, Core Product, Integration, and Fast-Track. Strategically speaking, Innovation development keeps us at the leading edge of BPM suite capability, Core product development keeps us competitive in the current market, Integration remains one of our key differentiators, and Fast-Track helps ensure that users' needs are given appropriate priority.

The Product Committee's guidance heavily influences the priorities of the first three streams. The Fast-Track development priorities come from Support, Customer Success, Pre-Sales, and Delivery, the customer-facing groups inside Bonitasoft. In this way we continue to improve our product through both radical innovation and incremental improvements (new and improved features).

Each stream is composed of Engine, Web, and Studio developers, plus a Product Manager and members of the documentation and Quality Assurance teams. Our systems architect and human factors engineer work across all four streams.

When a feature or improvement is developed in a stream, it is fully developed and tested on the stream's dedicated continuous integration server. A feature is "done" when the language translation is done, the documentation is done, and the tests are done. When the entire code stream is stable, then and only then is it pushed to the shared continuous integration server, where it can be accessed and used by the other streams.

When it is time for a major release, the code is pushed to another dedicated server where the final QA is done.
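
Conceptually, the gate works something like the following sketch (the class and method names here are hypothetical, not our internal tooling): a feature counts as done only when all three checks pass, and nothing leaves a stream's dedicated server until every feature in it is done.

```java
import java.util.ArrayList;
import java.util.EnumSet;
import java.util.List;
import java.util.Set;

// Hypothetical model of the promotion workflow described above - not Bonitasoft's real tooling.
public class StreamPromotion {

    enum Check { TRANSLATION, DOCUMENTATION, TESTS }

    static class Feature {
        final Set<Check> completed = EnumSet.noneOf(Check.class);

        boolean isDone() {
            // "Done" means translation, documentation, and tests are all finished.
            return completed.containsAll(EnumSet.allOf(Check.class));
        }
    }

    static class AgileStream {
        final List<Feature> features = new ArrayList<>();

        boolean isStable() {
            // The whole stream must be stable before anything leaves its dedicated CI server.
            return features.stream().allMatch(Feature::isDone);
        }
    }

    /** Push to the shared continuous integration server only when the entire stream is stable. */
    static void promoteToSharedCi(AgileStream stream) {
        if (!stream.isStable()) {
            throw new IllegalStateException("Stream not stable - nothing is shared with other streams");
        }
        // ... push the stream's code to the shared CI server, where other streams can use it ...
    }

    /** For a major release, code from the shared CI server moves to a dedicated server for final QA. */
    static void promoteToReleaseQa(AgileStream stream) {
        promoteToSharedCi(stream);
        // ... push to the dedicated release server where the final QA is done ...
    }
}
```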

The advantages of this development approach are already being realized: the isolation of each stream and the involvement of QA inside each one mean that code is shared only when it is ready - and no stream depends on work outside itself in order to advance.

It's also much cleaner to always have one stream dedicated to maintenance. We use a round-robin approach so each stream takes a turn, and only one stream is working on maintenance fixes at a time.
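
Purely as an illustration (this is a sketch, not our actual planning tool), the rotation can be expressed as a simple round robin over the four streams:

```java
import java.util.List;

// Illustrative only: assigns exactly one stream to maintenance duty per cycle, in turn.
public class MaintenanceRotation {

    private static final List<String> STREAMS =
            List.of("Innovation", "Core Product", "Integration", "Fast-Track");

    /** Returns the single stream on maintenance duty for a given cycle (e.g., a monthly release). */
    static String onMaintenanceDuty(int cycleIndex) {
        return STREAMS.get(cycleIndex % STREAMS.size());
    }

    public static void main(String[] args) {
        for (int cycle = 0; cycle < 6; cycle++) {
            System.out.println("Cycle " + cycle + ": " + onMaintenanceDuty(cycle));
        }
    }
}
```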

There's Always a Challenge
Our big challenge now is no longer the speed of code propagation. It's how we can manage effective communication among streams. Development may be appropriately isolated, but clear and timely communication on big changes is critical. We're addressing this challenge by sharing information frequently through informal presentations, and each team has a team leader whose responsibility includes sharing information across teams. Their entire mornings are pretty much dedicated to coordination tasks while their afternoons are dedicated to development tasks.

Looking Ahead
We are already seeing excellent results from our agile stream approach. Our maintenance releases are coming regularly each month, and the implementation of the development roadmap is better balanced among the four strategic Value Streams. Bonita BPM had two versions released in 2013, with two more on the way for 2014. With the Fast-Track stream, we have been able to respond quickly to customers' and users' innovative suggestions and business needs - with a flexibility that underscores and confirms the very concept of agile.

More Stories By Charles Souillard

Charles Souillard co-founded Bonitasoft in 2009 with Miguel Valdes Faura and Rodrigue Le Gall. As VP of Engineering and CTO, Charles leads the Bonitasoft product development organization. He was previously head of the Bonita core development team within Bull Information Systems. He has significant experience developing mission-critical applications with BPM and SOA technologies. He serves on a number of European Community projects. He holds a Master’s degree in Computer Science from Polytech de Grenoble.
