i-Technology Viewpoint: The Future of Software Tools

In a recent press interview I was asked what I thought were some of the important trends for the future of software tools.

It's an interesting question with many facets, so I wasn't sure at first how to respond. After some thought, here are the five areas I chose to highlight from my vantage point of design and construction tool strategy. They have been occupying much of my thinking lately, and my discussions with customers, IBM Research, and the Rational teams, and they are changing both the kinds of software tools we deliver and the features those tools support.

1. Connecting business with IT: Business-driven development. The importance of understanding the business context for IT investment has never been more obvious than it is today. More organizations are recognizing the role of IT as a determining factor in the efficiency of their operations, and a bottleneck in their ability to innovate.

I am spending a lot of time with customers who want to explore business alternatives, drive IT projects more directly from business needs with well-established business goals and ROI, choreograph services to realize their business processes, and monitor those services in execution to relate operations to the needs of the business. Support for that flow (in its myriad variations) is essential. As we use the current generation of tools in this context, we are seeing the emergence of new roles, usage scenarios, and support needs. The lessons from this work are leading to a complete refactoring of tooling capabilities.


2. Greater transparency in the software development process: Auditing, traceability, and accountability. Software plays a pivotal role in all our lives. It runs our financial institutions, controls the power and utility infrastructure, is embedded in almost every useful device we use, and so on. With this important role comes a certain responsibility.

Government regulators, lawyers, and auditors are paying increasing attention to the software industry to verify that the software we all rely on has been developed according to provable quality standards. Sarbanes-Oxley and Basel II are just the tip of a very large iceberg. For example, in discussions with people in the auto industry I was overwhelmed by the role software plays in the design, manufacture, control, and management of automobiles, and by the kinds of requirements they need fulfilled by the software tools they use.

Suppose there is a major design flaw in the software managing the anti-lock brakes on a popular model of car, and that flaw results in injuries to a number of people. How does the manufacturer of the braking system prove that it was not negligent in the design and implementation of that software? Were the engineers developing the software certified against recognized standards? Were the processes used to develop the software audited for quality? How were software designs analyzed and validated before they were put into production? And so on.

This kind of rigour and auditability will become the norm, and tools must permit this level of access and control. I refer to this as transparency of process, design, realization, and so on. New tooling will emerge that supports and enforces these design principles. Traceability and reporting at all levels will become essential.
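As one illustration of what tool support for traceability might look like at the API level, here is a minimal sketch of recording which artifacts realize which requirement, so that an auditor can later walk the chain. All type and method names here are hypothetical and do not come from any shipping tool.

```java
import java.time.Instant;
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: a traceability link records which artifact realizes
// which requirement, and who created the link, so the chain can be audited.
public class TraceabilityLog {
    record TraceLink(String requirementId, String artifactId,
                     String author, Instant recordedAt) {}

    private final List<TraceLink> links = new ArrayList<>();

    void link(String requirementId, String artifactId, String author) {
        links.add(new TraceLink(requirementId, artifactId, author, Instant.now()));
    }

    // Report every artifact that claims to realize a given requirement.
    List<String> artifactsFor(String requirementId) {
        return links.stream()
                .filter(l -> l.requirementId().equals(requirementId))
                .map(TraceLink::artifactId)
                .toList();
    }

    public static void main(String[] args) {
        TraceabilityLog log = new TraceabilityLog();
        log.link("REQ-42", "design/brake-controller.emx", "a.engineer");
        log.link("REQ-42", "src/BrakeController.java", "a.engineer");
        System.out.println(log.artifactsFor("REQ-42"));
    }
}
```

A production tool would of course persist such links, sign them, and report across the whole requirement-to-deployment chain, but the basic record-and-query shape is the same.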


3. RAD using new programming models: As Grady Booch likes to say, software drives the world's economies, and in some regards we can consider software development to be the most important job in the world. Yet we all know that the skills and qualities of the best software engineers are in short supply.

It must be possible for a larger community of people to develop sophisticated enterprise solutions and deploy them to heterogeneous runtime environments. We have a long way to go to make this happen. The gap between the way domain-focused users view the problem space and the way in which they must describe systems in the solution space is far too great. In the past, various ways of addressing this gap with CASE tools and 4GLs solved part of the problem, but created their own challenges in return (e.g., proprietary runtime layers, non-standard artifacts, lack of openness to integrate with other systems and services, inflexible high-ceremony design approaches, and so on).

Over the last few years we have seen ways to overcome these limitations with the emergence of robust patterns and frameworks for application development in many technology and business domains. We can raise the abstraction level of the programming model to be closer to the end-users' mental model of their problem space, and use the patterns and frameworks to transform that into the solution space for today's technologies. Techniques such as generative programming and MDA are a realization of this. We are seeing a lot of innovation in the software tools here.
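To make the idea concrete, here is a minimal sketch of template-based generation, one simple flavor of the generative/MDA approach: a tiny "model" of a business entity (just a name and some typed fields) is transformed mechanically into Java source text. The class and method names here are invented for illustration and do not come from any real MDA tool.

```java
import java.util.List;

// Illustrative sketch of model-to-text transformation: a simple entity
// model is turned into the skeleton of a Java class.
public class EntityGenerator {
    record Field(String type, String name) {}

    static String generateClass(String entity, List<Field> fields) {
        StringBuilder src = new StringBuilder("public class " + entity + " {\n");
        for (Field f : fields) {
            src.append("    private ").append(f.type).append(' ')
               .append(f.name).append(";\n");
        }
        src.append("}\n");
        return src.toString();
    }

    public static void main(String[] args) {
        String code = generateClass("Customer",
                List.of(new Field("String", "name"), new Field("int", "accountId")));
        System.out.println(code);
    }
}
```

A real MDA toolchain starts from far richer models (e.g., UML) and applies much more sophisticated transformations, but the model-to-text step follows this same shape.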


4. Collaboration among individuals and teams: Much of the inefficiency in software development is a result of the friction between individuals and teams as they work together to share a common understanding of some element of concern, investigate issues from multiple perspectives to make a balanced decision, solve multi-dimensional problems, and so on.

There are many great advances in collaborative tools for interaction and sharing. It's great to be able to start a chat session in a new window while working to understand a new piece of code, to view a remote desktop to see the problem a customer is experiencing in their environment, or to create a teleconference as needed to resolve a design issue among colleagues. But there is much more to be done to make those kinds of capabilities part of a team's software tools workbench. We'll see those ideas become much better aligned with software development tools so that software engineers can more easily work together on all aspects of the design process, and we'll see design practices evolve to take better advantage of those capabilities.


5. "Pay-per-use" software tools: New licensing and subscription offerings. There are many pressures on software tool vendors to change the way in which tools are packaged and delivered. Initiatives such as open source software and hosted services via ASPs challenge conventional thinking on software tools.

We've seen some reaction in the marketplace (e.g., open source development tool workbenches such as Eclipse, and online testing services from different vendors). Customers are demanding more: greater flexibility in how software tools are delivered, less overhead in upgrading them, and more creative pricing based on how the tool is used, when it is used, and how much of it is used.

We are working on different kinds of software tool offerings in response: refactoring the products we offer, increasing the ease with which different tool capabilities can be interchanged, and allowing access to tool capabilities in a variety of modes (various flavors of fat-client and thin-client access). It's safe to say that many of the people building software in the future will not be buying, installing, and using tools in the way they do today.
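To make the pricing idea concrete, here is a hypothetical sketch of per-use metering: each tool capability reports its invocations to a meter, and the bill is the metered use priced at a per-capability rate. The capability names and rates are invented for illustration; no real licensing service or vendor API is implied.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of pay-per-use metering: count invocations of each
// tool capability and price them at a per-capability rate at billing time.
public class UsageMeter {
    private final Map<String, Integer> invocations = new HashMap<>();
    private final Map<String, Double> ratePerUse;

    UsageMeter(Map<String, Double> ratePerUse) {
        this.ratePerUse = ratePerUse;
    }

    void record(String capability) {
        invocations.merge(capability, 1, Integer::sum);
    }

    // Unpriced capabilities are billed at zero in this sketch.
    double bill() {
        return invocations.entrySet().stream()
                .mapToDouble(e -> e.getValue() * ratePerUse.getOrDefault(e.getKey(), 0.0))
                .sum();
    }

    public static void main(String[] args) {
        UsageMeter meter = new UsageMeter(Map.of("model.edit", 0.05, "test.run", 0.10));
        meter.record("model.edit");
        meter.record("model.edit");
        meter.record("test.run");
        System.out.printf("Amount due: $%.2f%n", meter.bill());
    }
}
```

A real offering would also need metering integrity, reporting, and subscription tiers, but the essential shift is the same: billing follows measured use rather than a one-time license.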

Alan W. Brown blogged these comments originally at developerWorks.com. Reproduced here with the kind permission of the author.

More Stories By Alan W. Brown

Alan W. Brown is a Distinguished Engineer at IBM Rational software responsible for future product strategy of IBM Rational's Design and Construction products. He defines technical strategy and evangelizes product direction with customers looking to improve software development efficiency through visual modeling, generating code from abstract models, and systematic reuse.

