i-Technology Viewpoint: Is Model Driven Architecture Coming Into Its Own?

MDA has been with us for years; will the arrival of meta-data bring it closer to the mainstream?

JDJ's Bill Dudney writes: With the popularity of Object Relational Mapping tools like Hibernate and Cayenne, developers are more often than not giving control of some of their code to models. Will this help raise MDA into the mainstream? Will MDA take its hoped-for place as the next level of abstraction for developers?

What about MDA?

Model Driven Architecture, also known as MDA, started in late 2000 with a white paper. The basic idea is that we define the software we want to build in sophisticated models that capture the detail of the application, and then apply a series of transformations to turn that abstract model into a running application. The highest-level model is referred to as a Platform Independent Model (PIM). There is an even more abstract model, called the Computation Independent Model, but we won't discuss that one here. The PIM is, as the name suggests, independent of the deployment platform (e.g., .NET, Java EE 5, etc.). In this model the business is specified: the classes that make up the domain are fleshed out and their details captured. The PIM is then transformed into one or more Platform Specific Models (PSMs) that can be elaborated with more detail specific to the platform. From the PSM a running application can be generated.
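To make that pipeline concrete, here is a minimal sketch in Java. The PimClass and JavaPsmGenerator types are invented for illustration and are not the API of any actual MDA tool; a toy platform-independent model element is turned into platform-specific Java source by a simple generator.

import java.util.LinkedHashMap;
import java.util.Map;

// A toy platform-independent model element: a named class with typed attributes.
class PimClass {
    final String name;
    final Map<String, String> attributes = new LinkedHashMap<String, String>();
    PimClass(String name) { this.name = name; }
    PimClass attribute(String attrName, String type) {
        attributes.put(attrName, type);
        return this;
    }
}

// A toy "transformation" that targets one platform: plain Java source.
class JavaPsmGenerator {
    String generate(PimClass model) {
        StringBuilder src = new StringBuilder("public class " + model.name + " {\n");
        for (Map.Entry<String, String> attr : model.attributes.entrySet()) {
            src.append("    private ").append(attr.getValue())
               .append(" ").append(attr.getKey()).append(";\n");
        }
        return src.append("}\n").toString();
    }

    public static void main(String[] args) {
        PimClass order = new PimClass("Order")
                .attribute("id", "long")
                .attribute("total", "java.math.BigDecimal");
        // Prints a generated Java class derived entirely from the model.
        System.out.println(new JavaPsmGenerator().generate(order));
    }
}

A real MDA tool works from UML/XMI rather than hand-built objects and targets far richer PSMs, but the shape of the pipeline (model in, code out) is the same.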

Now of course there is a need to put in your own business logic. Most tools today provide a way for you to edit the 'business logic' apart from the fully generated code. For example, AndroMDA (an open-source MDA tool) generates certain files only once; those are where your business logic is written. Other tools, like OptimalJ, take a different approach, giving you code you can edit along with 'protected blocks' that remain part of the generated code. I'm sure other tools take still other approaches.
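A minimal sketch of the first pattern, with names invented for illustration rather than taken from any tool's output: an abstract base class that the generator owns and regenerates freely, and a subclass generated only once that the developer then owns.

// Regenerated on every run of the tool; never edited by hand.
abstract class CustomerServiceBase {
    // Plumbing owned by the generator: lookups, transactions, logging, etc.
    protected Customer findCustomer(long id) {
        return new Customer(id); // a real tool would generate a persistence lookup
    }
    // The business operation is declared but left abstract for the developer.
    public abstract double creditLimitFor(long customerId);
}

// Generated only once, then owned by the developer; it survives regeneration.
class CustomerServiceImpl extends CustomerServiceBase {
    @Override
    public double creditLimitFor(long customerId) {
        Customer c = findCustomer(customerId);
        // Hand-written business logic lives here, safely outside the generated base.
        return c.isLongStanding() ? 50000 : 5000;
    }
}

class Customer {
    private final long id;
    Customer(long id) { this.id = id; }
    boolean isLongStanding() { return id < 1000; } // placeholder business rule
}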

The Promise

  • Productivity - the ability to raise the level of abstraction so that developers can become more productive is one of the greatest promises of MDA. Just as Java raised the level of abstraction above C/C++, MDA raises the level of abstraction above Java EE and .NET.
  • Portability - greater ease of migration between technologies (such as .NET to Java EE) or between different versions of the same technology (e.g., EJB 2.1 to EJB 3.0). Once developed, your PIM is the repository of knowledge about your application, so moving to another underlying technology is 'easy'.
  • Consistency - greater consistency in applying architectural principles. This promise is hard to ignore: many enterprise-level projects have divergent architectures within the same project, and maintenance of such projects is very difficult, to be sure.

The Problems

At its core MDA is about using meta-data to drive program creation. The idea is that if we can develop a sophisticated enough model to express software, then we can fully generate the running program from the model (or even create a virtual machine that executes the model directly). The problem with taking this idea too far, of course, is that we end up with just another platform. Probably worse, though, is that it becomes 'programming with pictures,' which was already tried at least once in the late 1980s and failed miserably. Few are willing to try programming with pictures again.

Many proponents of the MDA approach like to say that, given a PIM with sufficient detail, one would be free to move between .NET, Java EE, or something like Hibernate and Spring, assuming you had the proper set of transformations for these other PSMs. While this is a great marketing pitch for MDA and the PIM, it is just not that straightforward. A PSM has too much platform-specific knowledge buried in it to be moved between technologies that easily. After all, the PSM is where the business logic that makes the application unique actually resides; to make the move, all of that logic must be reworked to fit the new target technology.

Finally, and probably most significantly, MDA must overcome grassroots resistance to the idea. Developers like to develop. They don't like to have control taken from them. There is a fundamental distrust of code generation and a resistance to this type of abstraction.

Sea-Change?

There are a couple of moves afoot that make me think we might be on the verge of a change in perception about MDA. The first is a more pragmatic approach from vendors: instead of expecting developers to program in pictures, they expect developers to build familiar UML class diagrams and annotate them. Few expect a full-blown executable model.

Second, and more significant, is the emergence of meta-data as a normal part of everyday life for Java developers. Several years ago the XDoclet project started bringing meta-data into the mainstream. The 'killer app' for XDoclet was that EJBs needed only the bean class; the rest of the required files (remote/local interfaces, deployment descriptor entries, value objects, etc.) were generated and kept up to date automatically by XDoclet. Many developers embraced this approach because it eliminated tedious work: they no longer had to keep the remote and local interfaces in sync with the implementation methods by hand, because XDoclet generated those interfaces automatically.
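A rough sketch of the XDoclet style, written from memory: the exact tag and attribute names vary by XDoclet release, so treat the @ejb.* tags below as illustrative rather than authoritative. The meta-data lives in javadoc comments on the one class the developer maintains, and the tool generates the interfaces and deployment descriptor entries from it.

import javax.ejb.SessionBean;
import javax.ejb.SessionContext;

/**
 * Order management session bean. The javadoc tags are the XDoclet meta-data.
 *
 * @ejb.bean name="OrderManager"
 *           type="Stateless"
 *           view-type="both"
 */
public class OrderManagerBean implements SessionBean {

    /**
     * Exposed on the generated remote and local interfaces.
     *
     * @ejb.interface-method
     */
    public double totalFor(long orderId) {
        return 42.0; // business logic would go here
    }

    // Boilerplate required by the EJB 2.x SessionBean contract.
    public void ejbCreate() {}
    public void ejbRemove() {}
    public void ejbActivate() {}
    public void ejbPassivate() {}
    public void setSessionContext(SessionContext ctx) {}
}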

At this point many MDA vendors and proponents are probably saying to themselves, 'Hey, that is exactly what we have been doing for years!' And it is; the difference is that there is no visual model. The developers write the meta-data that would normally live in stereotypes and/or tagged values right into their code. For many, this closes a semantic gap that the visual modeling paradigm leaves open.

Back to meta-data becoming more 'normal': the other major change that brings meta-data front and center is the addition of annotations to the Java SE 5 platform, and especially the use of that meta-data throughout Java EE 5. Developers will be using meta-data on a daily basis.
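A brief sketch of what that looks like in Java EE 5. The class and field names are invented for illustration; in a real project each class would be public and live in its own file.

import javax.ejb.Stateless;
import javax.persistence.Entity;
import javax.persistence.EntityManager;
import javax.persistence.Id;
import javax.persistence.PersistenceContext;

// The annotations are the meta-data: no hand-maintained deployment
// descriptor or generated interface files are needed for this much.
@Entity
class Order {
    @Id
    private long id;
    private double total;
    // getters/setters omitted for brevity
}

@Stateless
class OrderService {
    @PersistenceContext
    private EntityManager em;

    public Order find(long id) {
        return em.find(Order.class, id);
    }
}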

Another thing developers typically don't like about the MDA approach is the feeling of losing control over what is generated. I have often heard the assertion that the developer could do better than the code generator, and other comments in that vein. While it is probably true that a hand-crafted piece of code would be 'better' in many respects, it is also true that the generated code can be produced in a fraction of the time and is 'good enough'.

One area where developers have resisted a meta-data-driven approach in the past is persistence; the same 'I could do better' argument was used quite often. More recently, though, Object Relational Mapping frameworks like Hibernate and Cayenne have been gaining momentum. On the surface you might think I'm nuts to draw a comparison between O/R mapping and MDA, but the comparison is not that far off. Most MDA tools rely on code generation instead of a framework, but it is basically the same kind of thing. Hibernate happens to be a framework, but it could just as well generate code at run time (or imagine aspects being attached to your POJOs). Either way, framework or code generation, meta-data drives the way objects are mapped to rows in tables.
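To illustrate that point, here is a toy example; the @Col annotation and TinyMapper are invented for this sketch and are not Hibernate's or JPA's API. The same meta-data could be read reflectively at run time, as below, or read at build time by a tool that emits equivalent JDBC code.

import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Field;

// A made-up mapping annotation, standing in for real O/R mapping meta-data.
@Retention(RetentionPolicy.RUNTIME)
@interface Col { String value(); }

class Account {
    @Col("account_id") long id = 7;
    @Col("owner_name") String owner = "Ada";
}

class TinyMapper {
    // Derives SQL from the meta-data instead of having the developer write it.
    static String insertFor(Object entity) throws IllegalAccessException {
        StringBuilder cols = new StringBuilder();
        StringBuilder vals = new StringBuilder();
        for (Field f : entity.getClass().getDeclaredFields()) {
            Col col = f.getAnnotation(Col.class);
            if (col == null) continue;
            if (cols.length() > 0) { cols.append(", "); vals.append(", "); }
            f.setAccessible(true);
            cols.append(col.value());
            vals.append("'").append(f.get(entity)).append("'");
        }
        String table = entity.getClass().getSimpleName().toLowerCase();
        return "INSERT INTO " + table + " (" + cols + ") VALUES (" + vals + ")";
    }

    public static void main(String[] args) throws Exception {
        // Prints: INSERT INTO account (account_id, owner_name) VALUES ('7', 'Ada')
        System.out.println(insertFor(new Account()));
    }
}

A real mapper would of course use bind parameters and handle types, identity, and caching; the point is only that the meta-data, not hand-written SQL, decides the mapping.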

With the current set of O/R mapping tools a lot of control is removed from the developer; exactly what SQL gets executed is no longer under the developer's direct control. However, many are willing to give up this control for the productivity gained by using something like Hibernate. Who really wants to write all that JDBC code anyway?
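For contrast, here is roughly the kind of hand-written JDBC a mapping framework takes off your plate; the connection URL, credentials, and schema are placeholders.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

class AccountJdbcDao {
    // Placeholder connection settings.
    private static final String URL = "jdbc:hsqldb:mem:demo";

    String findOwnerName(long accountId) throws SQLException {
        Connection con = null;
        PreparedStatement ps = null;
        ResultSet rs = null;
        try {
            con = DriverManager.getConnection(URL, "sa", "");
            ps = con.prepareStatement(
                    "SELECT owner_name FROM account WHERE account_id = ?");
            ps.setLong(1, accountId);
            rs = ps.executeQuery();
            return rs.next() ? rs.getString("owner_name") : null;
        } finally {
            // Every resource has to be closed by hand, in the right order.
            if (rs != null) rs.close();
            if (ps != null) ps.close();
            if (con != null) con.close();
        }
    }
}

Multiply that by every query, insert, and update in an application and the appeal of letting meta-data drive persistence becomes obvious.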

So where does this leave us? Will MDA take its hoped-for place as the next level of abstraction for developers, or will it be yet another 'next big thing' relegated to the dustbin of history? It is hard to say for sure. One thing is certain, though: meta-data is becoming more and more a part of developers' everyday lives.

More Stories By Bill Dudney

Bill Dudney is Editor-in-Chief of Eclipse Developer's Journal and also serves as JDJ's Eclipse editor. He is a Practice Leader with Virtuas Solutions and has been doing Java development since late 1996, after he downloaded his first copy of the JDK. Prior to Virtuas, Bill worked for InLine Software on the UML bridge that tied UML models in Rational Rose, and later XMI, to the InLine suite of tools. Before getting hooked on Java he built software on NeXTSTEP (the precursor to Apple's OS X). He has roughly 15 years of distributed software development experience, starting at NASA building software to manage the mass properties of the Space Shuttle.

Most Recent Comments
Joe Gaber 10/15/05 07:27:38 PM EDT

Here are comments related to snippets of your article:

1. "Developers like to develop" - this is true; however, developers have developed using a wide variety of programming languages and OCL (instrumental to the fulfillment of MDA) is another programming language (in fact its similar to the much aclaimed language smalltalk), and in the same way that Java didn't become a big deal until Java 2, OCL and UML (now at 2.0) are likely to now gain the same acceptance. Also, see my blog for a way for developers and architects to work in pairs in a version of Agile Modeling that I am professing is the missing link between modeling and programming collaboration.

2. "There is a fundamental distrust of code generation and a resistance to this type of abstraction" - isn't any 3rd GL (i.e., Java) an abstraction, or two, from 0s and 1s?? And, doesn't the entire J2EE array of APIs provide even more abstraction from code writting?? And, doesn't every IDE on the market today provide all types of functionality that helps write and refactor code?? Conclusion: If it wasn't for increasing abstractions from the 0s and 1s a CPU uses, we would be producing the same amount of 0s and 1s today as 30 years ago. The fact that we (developers) produce vastly greater amounts of machine code through the "abstract" languages and tools of today then we did before is what produces "productivity". At the point productivity ceases to increase, programming (wirtting code) becomes a commodity given to the lowest bidder.

3. "The developers write the meta-data that would normally be in stereotypes and/or tagged values right into their code. For many this bridges a semantic gap that is missing in the visual modeling paradigm" - whether you write meta-data in the code or in a model it only matters from a business standpoint not an engineering standpoint. What I mean by this is that business models far outlive the applications that fulfill the business's objective. Modeling the business domain, its processes, and transforming that into code provides the business with a much longer lived artifact than code. You can use XDoclet, Java 5 annotation, or an external transformation/mapping file (as with MDA), it doesn't matter, the fact is you are still using the meta-data to generate specific code. Also, as far as the idea of a "visual modeling paradigm", once you add the rich semantics into the visual elements of the model, you know have much more than pictures. You have a programmically rich artifact with much more power than just written code.

4. "Another thing that developers typically don't like about the MDA approach is the feeling of lack of control over what is generated" - in the case of a MDA tool like the open-source Andromda, what is generated is exactly what you specify in the templates, metafades, and specific cartridge descripter files used. All of which is under the complete control of whoever the busines decides, including developers. I am currently writting a cartridge for Andromda to generate specific applications using the Sprng Framework, including Spring's MVC implementation as well as to include DWR(AJAX) integrated with Spring. There is no restrictions to the way I decide for the code to be generated as I am the one creating the templates that produce the code.

The argument that MDA, CASE tools, CORBA, meta-data, etc. have not been embraced in the past, and that this is an indication of MDA's future, is, IMO, absurd. For MDA to succeed, numerous different technologies and methodologies have had to converge, and they continue to converge. MDA is in its infancy as a practical approach to software development. The fact is that MDA is highly dependent upon tools, and quite frankly no tool of the past has had the power, functionality, or vision to truly meet the needs of MDA. That is changing as we speak, with virtually every major tool vendor, and the two leading open-source IDEs (Eclipse and NetBeans), introducing new products or projects that include UML modeling, OCL, BPM, and MDA functionality all wrapped up into a complete high-productivity tool.

Best regards,
