
i-Technology Viewpoint: Is Model Driven Architecture Coming Into Its Own?

MDA has been with us for years; will the arrival of meta-data bring it closer to the mainstream?

JDJ's Bill Dudney writes: With the popularity of Object Relational Mapping tools like Hibernate and Cayenne, developers are increasingly giving control of some of their code to models. Will this help raise MDA into the mainstream? Will MDA take its hoped-for place as the next level of abstraction for developers?

What about MDA?

Model Driven Architecture, also known as MDA, started in late 2000 with a white paper. The basic idea is that we define the software we want to build in sophisticated models that capture the detail of the application, and then apply a series of transformations to turn those abstract models into a running application. The highest-level model is referred to as a Platform Independent Model (PIM). There is an even more abstract model, the Computation Independent Model, but we won't discuss that here. The PIM is, as the name suggests, independent of the deployment platform (e.g., .NET, Java EE 5, etc.). In this model the business is specified and the classes that make up the domain are fleshed out and detailed. The PIM is then transformed into one or more Platform Specific Models (PSMs), which can be elaborated with more detail specific to the platform. From the PSM a running application can be generated.
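
To make that concrete, here is a rough sketch (the Customer class and the pim/psm package names are hypothetical, not the output of any particular tool): the PIM describes a domain class purely in business terms, and the transformation to a Java EE 5 PSM adds the platform-specific detail while leaving the business structure intact.

    // pim/Customer.java -- platform-independent: business vocabulary only.
    package pim;

    public class Customer {
        private String name;
        private String email;
        // business attributes and associations; nothing about persistence or EJB
    }

    // psm/Customer.java -- the Java EE 5 PSM adds platform meta-data; from this
    // model a running application can be generated (or simply compiled).
    package psm;

    import javax.persistence.Entity;
    import javax.persistence.GeneratedValue;
    import javax.persistence.Id;

    @Entity
    public class Customer {
        @Id @GeneratedValue
        private Long id;

        private String name;
        private String email;
    }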

Now of course there is a need to put in your own business logic. Most tools today provide a way for you to edit the business logic apart from the fully generated code. For example, AndroMDA (an open source MDA tool) generates certain files only once; that is where your business logic is written. Other tools, like OptimalJ, take a different approach, giving you code you can edit plus 'protected blocks' that are part of the generated code. I'm sure other tools take still other approaches.
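
As a minimal sketch of the 'generate once' style, assuming a generation-gap layout (the class names are hypothetical, not actual AndroMDA output): the base class is regenerated from the model on every run, while the subclass is generated a single time and then belongs to the developer.

    import java.math.BigDecimal;

    // OrderServiceBase.java -- regenerated from the model on every run; do not edit.
    abstract class OrderServiceBase {

        // Plumbing derived from the model; the interesting part is delegated
        // to a hook that the developer fills in.
        public final BigDecimal priceWithDiscount(BigDecimal listPrice) {
            return handleDiscount(listPrice);
        }

        // Hook reserved for hand-written business logic.
        protected abstract BigDecimal handleDiscount(BigDecimal listPrice);
    }

    // OrderServiceImpl.java -- generated only once; afterwards owned by the developer.
    class OrderServiceImpl extends OrderServiceBase {

        protected BigDecimal handleDiscount(BigDecimal listPrice) {
            // Hand-written logic here survives regeneration of the base class.
            return listPrice.multiply(new BigDecimal("0.90"));
        }
    }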

The Promise

  • Productivity - the ability to raise the level of abstraction so that developers can become more productive is one of the greatest promises of MDA. Just as Java raised the level of abstraction over C/C++, MDA raises the level of abstraction over Java EE and .NET.
  • Portability - greater ease of migration between technologies (such as .NET to Java EE) or between versions of the same technology (e.g., EJB 2.1 to EJB 3.0). Once developed, your PIM is the repository of knowledge about your application, so moving to another underlying technology is 'easy'.
  • Consistency - greater consistency in an application's architectural principles. This benefit is hard to ignore: many enterprise-level projects have divergent architectures within the same project, and those projects are certainly very difficult to maintain.

The Problems

At its core MDA is about using meta-data to drive program creation. The idea is that if we can develop a sophisticated enough model to express software, then we can fully generate the running program from the model (or even create a virtual machine that executes the model directly). The problem with taking this idea too far, of course, is that we end up with just another platform. Probably worse, it amounts to 'programming with pictures', which was already tried at least once in the late 1980s and failed miserably. Few are willing to try programming with pictures again.

Many proponents of the MDA approach like to say that, given a PIM with sufficient detail, one would be free to move between .NET, Java EE, or something like Hibernate and Spring, assuming you had the proper set of transformations for those other PSMs. While this is a great marketing pitch for MDA and the PIM, it is just not that straightforward. A PSM has too much platform-specific knowledge buried in it to simply move between different technologies. After all, the PSM is where the business logic that makes the application unique actually resides, and to make the move all that logic must be changed to fit the new target technology.

Finally, and probably most significantly, MDA must overcome grass-roots resistance to the idea. Developers like to develop. They don't like to have control taken from them. There is a fundamental distrust of code generation and a resistance to this type of abstraction.


There are a couple of moves afoot that make me think we might be on the verge of a change in perception about MDA. The first is a more pragmatic approach being taken by vendors: instead of expecting developers to program in pictures, they expect developers to build more familiar UML class diagrams and annotate them. Few expect a full-blown executable model.

Second, and more significantly, is the emergence of meta-data as a normal part of everyday life for Java developers. Several years ago the XDoclet project started bringing meta-data into the mainstream. The 'killer app' for XDoclet was that EJBs only needed the bean class; the rest of the required files (remote/local interfaces, deployment descriptor entries, value objects, etc.) were generated and kept up to date automatically by XDoclet. Many developers embraced this approach because it reduced the tedious work that had to be done. With XDoclet, developers no longer had to keep the remote and local interfaces in sync with the implementation methods; XDoclet generated the remote and local interfaces automatically.
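
A rough sketch of the XDoclet style (the bean and tag values are illustrative, and attribute details vary across XDoclet versions): the developer maintains only the bean class, and the doclet tags carry the meta-data from which the interfaces and deployment descriptor entries are generated.

    import javax.ejb.SessionBean;

    /**
     * @ejb.bean name="AccountManager"
     *           type="Stateless"
     *           view-type="remote"
     */
    public abstract class AccountManagerBean implements SessionBean {

        /**
         * Exposed on the generated remote interface.
         *
         * @ejb.interface-method view-type="remote"
         */
        public double getBalance(String accountId) {
            // Business logic lives here; interfaces and descriptors are generated.
            return 0.0;
        }
    }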

At this point many MDA vendors and proponents should be saying to themselves, hey, that is exactly what we have been doing for years! And it is; the difference is that there is no visual model. The developers write the meta-data that would normally be in stereotypes and/or tagged values right into their code. For many this bridges a semantic gap that is missing in the visual modeling paradigm.

Back to meta-data becoming more 'normal'. The other major change that brings meta-data front and center is the addition of annotations to the Java SE 5 platform, and especially the use of this meta-data in Java EE 5. Developers will be using meta-data on a daily basis.
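
For example, the kind of bean sketched above in XDoclet style can be written in the EJB 3 style with the meta-data expressed as annotations right in the source (class names are again hypothetical):

    import javax.ejb.Remote;
    import javax.ejb.Stateless;

    @Remote
    interface AccountManager {
        double getBalance(String accountId);
    }

    // The annotation is the meta-data: the container recognizes this class as a
    // stateless session bean without a separate deployment descriptor entry.
    @Stateless
    public class AccountManagerBean implements AccountManager {
        public double getBalance(String accountId) {
            return 0.0;
        }
    }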

Another thing that developers typically don't like about the MDA approach is the feeling of lack of control over what is generated. I have often heard the assertion that the developer could do better than the code generator, and other such comments. While it is probably true that a hand-crafted piece of code would be 'better' in many respects, it is also true that the generated code can be produced in a fraction of the time and is 'good enough'.

An area where developers have resisted a meta-data driven approach in the past is persistence. The same 'I could do better' argument was used quite often. More recently, though, Object Relational Mapping frameworks like Hibernate and Cayenne have been gaining momentum. On the surface you might think I'm nuts to draw a comparison between R/O mapping and MDA, but the comparison is not that far off. Most MDA tools rely on code generation instead of a framework, but basically it's the same kind of thing: Hibernate happens to use a runtime framework, but it could just as well generate code (or imagine aspects being attached to your POJOs). Either way, framework or code generation, meta-data is driving the way objects are mapped to rows in tables.
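
As a sketch of what 'meta-data driving the mapping' means in practice, here is a hypothetical Account class using the JPA-style annotations that Hibernate also understands; a runtime framework reads this meta-data to build SQL on the fly, and a code generator could just as well read the same meta-data and emit persistence code ahead of time.

    import javax.persistence.Column;
    import javax.persistence.Entity;
    import javax.persistence.Id;
    import javax.persistence.Table;

    // The annotations are the meta-data that map this class to rows in a table.
    @Entity
    @Table(name = "ACCOUNT")
    public class Account {

        @Id
        @Column(name = "ACCOUNT_ID")
        private Long id;

        @Column(name = "BALANCE")
        private double balance;

        public Long getId() { return id; }
        public double getBalance() { return balance; }
        public void setBalance(double balance) { this.balance = balance; }
    }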

With the current set of R/O mapping tools a lot of control is removed from the developer; exactly what SQL is executed is no longer under the developer's direct control. Many are willing to give up this control, however, for the productivity gained by using something like Hibernate. Who really wants to write all that JDBC code anyway?
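
A rough before-and-after using the hypothetical Account class from the previous sketch (the second method uses the Hibernate 3-era Session API; the first shows the JDBC plumbing the framework takes over):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    import org.hibernate.Session;
    import org.hibernate.SessionFactory;

    public class AccountLoader {

        // Hand-written JDBC: the developer controls the exact SQL, and also owns
        // every line of the plumbing around it.
        public static Account loadWithJdbc(Connection con, long id) throws Exception {
            PreparedStatement ps =
                con.prepareStatement("SELECT BALANCE FROM ACCOUNT WHERE ACCOUNT_ID = ?");
            try {
                ps.setLong(1, id);
                ResultSet rs = ps.executeQuery();
                Account account = null;
                if (rs.next()) {
                    account = new Account();
                    account.setBalance(rs.getDouble("BALANCE"));
                }
                rs.close();
                return account;
            } finally {
                ps.close();
            }
        }

        // Hibernate: the mapping meta-data decides what SQL actually runs.
        public static Account loadWithHibernate(SessionFactory factory, long id) {
            Session session = factory.openSession();
            try {
                return (Account) session.get(Account.class, new Long(id));
            } finally {
                session.close();
            }
        }
    }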

So where does this leave us? Will MDA take its hoped-for place as the next level of abstraction for developers, or will it become another 'next big thing' relegated to the dustbin of history? It's hard to say for sure, but one thing is certain: meta-data is becoming more and more a part of developers' everyday lives.

More Stories By Bill Dudney

Bill Dudney is Editor-in-Chief of Eclipse Developer's Journal and also serves as JDJ's Eclipse editor. He is a Practice Leader with Virtuas Solutions and has been doing Java development since late 1996, after he downloaded his first copy of the JDK. Prior to Virtuas, Bill worked for InLine Software on the UML bridge that tied UML models in Rational Rose, and later XMI, to the InLine suite of tools. Before getting hooked on Java he built software on NeXTStep (the precursor to Apple's OS X). He has roughly 15 years of distributed software development experience, starting at NASA where he built software to manage the mass properties of the Space Shuttle.

Most Recent Comments
Joe Gaber 10/15/05 07:27:38 PM EDT

Here are comments related to snippets of your article:

1. "Developers like to develop" - this is true; however, developers have developed using a wide variety of programming languages and OCL (instrumental to the fulfillment of MDA) is another programming language (in fact its similar to the much aclaimed language smalltalk), and in the same way that Java didn't become a big deal until Java 2, OCL and UML (now at 2.0) are likely to now gain the same acceptance. Also, see my blog for a way for developers and architects to work in pairs in a version of Agile Modeling that I am professing is the missing link between modeling and programming collaboration.

2. "There is a fundamental distrust of code generation and a resistance to this type of abstraction" - isn't any 3rd GL (i.e., Java) an abstraction, or two, from 0s and 1s?? And, doesn't the entire J2EE array of APIs provide even more abstraction from code writting?? And, doesn't every IDE on the market today provide all types of functionality that helps write and refactor code?? Conclusion: If it wasn't for increasing abstractions from the 0s and 1s a CPU uses, we would be producing the same amount of 0s and 1s today as 30 years ago. The fact that we (developers) produce vastly greater amounts of machine code through the "abstract" languages and tools of today then we did before is what produces "productivity". At the point productivity ceases to increase, programming (wirtting code) becomes a commodity given to the lowest bidder.

3. "The developers write the meta-data that would normally be in stereotypes and/or tagged values right into their code. For many this bridges a semantic gap that is missing in the visual modeling paradigm" - whether you write meta-data in the code or in a model it only matters from a business standpoint not an engineering standpoint. What I mean by this is that business models far outlive the applications that fulfill the business's objective. Modeling the business domain, its processes, and transforming that into code provides the business with a much longer lived artifact than code. You can use XDoclet, Java 5 annotation, or an external transformation/mapping file (as with MDA), it doesn't matter, the fact is you are still using the meta-data to generate specific code. Also, as far as the idea of a "visual modeling paradigm", once you add the rich semantics into the visual elements of the model, you know have much more than pictures. You have a programmically rich artifact with much more power than just written code.

4. "Another thing that developers typically don't like about the MDA approach is the feeling of lack of control over what is generated" - in the case of a MDA tool like the open-source Andromda, what is generated is exactly what you specify in the templates, metafades, and specific cartridge descripter files used. All of which is under the complete control of whoever the busines decides, including developers. I am currently writting a cartridge for Andromda to generate specific applications using the Sprng Framework, including Spring's MVC implementation as well as to include DWR(AJAX) integrated with Spring. There is no restrictions to the way I decide for the code to be generated as I am the one creating the templates that produce the code.

The argument that because MDA, CASE tools, CORBA, meta-data, etc. have not been embraced in the past we can predict the future of MDA is, IMO, absurd. In order for MDA to succeed, numerous different technologies and methodologies have had to converge, and continue to converge. MDA is in its infancy in terms of being a practical approach to software development. The fact is that MDA is highly dependent upon tools, and quite frankly, no tool of the past has had the power, functionality, vision, etc. to truly meet the needs of MDA. That is changing as we speak, with virtually every major tool vendor, and the two leading OS IDEs (Eclipse and NetBeans), introducing new products, or projects, that include UML modeling, OCL, BPM, and MDA functionality all wrapped up into a complete high-productivity tool.

Best regards,
