
MaaS applied to Healthcare – Use Case Practice

MaaS (Model as a Service) can help build and control shared, Cloud-ready healthcare data, affording agile data design and economies of scale while maintaining a trusted environment and scalable security. With MaaS, models map the infrastructure and govern persistent storage and deployment audits, certifying that data remain coherent and linked to specific storage. As a consequence, models make it possible to check where data is deployed and stored. MaaS can therefore play a crucial role in supplying healthcare services: the model carries infrastructure properties that classify the on-premise data Cloud service in terms of data security, coherence, outage, availability and geo-location, and that secure an assisted service deployment and virtualization.

Introduction
Municipalities are opening new information exchanges with healthcare institutes. The objective is to share medical research, hospital admissions by pathology, assistance and hospitalization data with doctors, hospitals, clinics and, of course, patients. This open data [6] should improve patient care, prevention, prophylaxis and appropriate medical booking and scheduling by making information sharing more timely and efficient. From the data management point of view, this means the service must assure data elasticity, multi-tenancy, scalability and security, together with the physical and logical architectures that serve as guidelines for designing healthcare services.

Accordingly, healthcare services in the Cloud must primarily secure the following data properties [2]:
- data location;
- data persistence;
- data discovery and navigation;
- data inference;
- confidentiality;
- availability;
- on-demand data secure deleting/shredding [4] [5] [11] [12].

These properties should be defined during service design, and data models play the "on-premise" integral role in defining, managing and protecting healthcare data in the Cloud. When a healthcare data model is created, the service is created as well, and its properties for confidentiality, availability, authenticity, authorization, authentication and integrity [12] have to be defined inside it: this is how MaaS provides preconfigured service properties.
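To make this concrete, here is a minimal, hypothetical sketch in plain Python (not the ERwin or MaaS tooling) of a healthcare data model that carries these service properties "on-premise", so they are fixed before anything reaches the Cloud; all names and values are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ServiceProperties:
    # Security and service properties defined inside the model, not after deployment.
    confidentiality: str = "tenant-isolated"     # who may read the data
    availability: str = "99.9% target"           # availability class
    authenticity: bool = True                    # records must be signed
    authorization: str = "role-based"            # RBAC, defined per tenant
    authentication: str = "federated identity"   # e.g. a hospital identity provider
    integrity: str = "checksum per record"

@dataclass
class HealthcareDataModel:
    name: str
    data_location: str                           # geo/legal constraint, e.g. "EU"
    persistent_storage: str                      # storage the provider must use
    partitions: List[str] = field(default_factory=list)
    properties: ServiceProperties = field(default_factory=ServiceProperties)

# Defining the model also defines the service: the properties travel with it.
clinical_model = HealthcareDataModel(
    name="ClinicalRecords",
    data_location="EU",
    persistent_storage="provider-A/block-store-3",
    partitions=["Surgery", "Radiology", "Cardiology"],
)
print(clinical_model.properties.authorization)   # -> role-based
```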

Applying MaaS to Healthcare – Getting Practice
Applying MaaS to design and deploy healthcare services means showing how to apply the DaaS (Database as a Service, see [2] and [4]) lifecycle in order to achieve a faster, positive impact on go-live preparation with Cloud services. The Use Case introduces practices for defining the healthcare service and then translating them into the appropriate guidelines. The DaaS lifecycle service practices we apply are those described in [4].

Keep in mind that healthcare is a dynamic, complex environment with many actors: patients, physicians, IT professionals, chemists, lab technicians, researchers, health operators and more. The Use Case we introduce tries to consider the whole system. It lays out the main tasks along the DaaS lifecycle, and thus how medical information might be managed and securely exchanged [12] among stakeholders across multiple entities such as hospitals, clinics, pharmacies, labs and insurance companies.

The Use Case
Here is how MaaS might cover the Use Case, and how the DaaS lifecycle best practices integrate the properties and directions above:

Objective: To facilitate services to healthcare users and to improve the information-exchange experience among stakeholders. The Use Case aims to reduce the cost of services through rapid data design, update and deployment, to provide data audit and control, and to improve the user experience with healthcare knowledge.

Description: Current costs of data design, update and deployment are high, and healthcare information (clinical, pharmaceutical, prevention, prophylaxis and so on) is not delivered fast enough based upon user experience. Costs for hospitalization and treatment information should be predictable based upon user experience and interaction.

Actors:
- Clinical and Research Centres;
- Laboratories;
- Healthcare Institute/Public Body (Access Administrators);
- Healthcare Institute/Public Body (Credentials, Roles Providers);
- Patients;
- IT Operations (Cloud Providers, Storage Providers, Clinical Application Providers).

Requirements:
- Reducing costs and rapidly delivering relevant data to users, stakeholders and healthcare institutes;
- Enabling decision-making information for actors who regularly need access [11] [12] to healthcare services but lack the scale to exchange (and require) more dedicated services and support;
- Rapidly supplying and updating healthcare data for users, given a large reference base with many locations and disparate applications;
- Ensuring compliance and governance directions are continuously applied, revised and supervised;
- Data security, confidentiality, availability, authenticity, authorization, authentication and integrity to be defined "on-premise".

Pre-processing and post-processing:
- Implementing and sharing data models;
- Designing data model properties according to private, public and/or hybrid Cloud requirements;
- Designing the data storage model "on-premise";
- Modeling data to calculate physical resource allocation "a priori";
- Modeling data to predict usage early and to optimize database handling;
- Covering outages through versions and changes archived on the basis of model partitioning;
- Using content discovery to identify and audit data, to restore the service to previous versions and, where regulations require it, to destroy data irrecoverably.

Included and extended use case:
- Deployment is guided by model properties and the architecture definition;
- The mapping of data is defined and updated, checking whether the infrastructure provider has persistence and whether outages are related to on-line tasks (a minimal audit sketch follows this Use Case);
- Deploying and sharing are guided by model properties and the architecture definition.
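As referenced above, here is a minimal, hypothetical sketch of the data-location audit implied by the Use Case: it compares where the provider actually stores each partition with the location declared in the model, so drift can be traced back to the model version. All partition and storage names are placeholders.

```python
from typing import Dict

# Locations declared in the data model (placeholder values).
declared_locations: Dict[str, str] = {
    "Surgery": "eu-west/storage-1",
    "Radiology": "eu-west/storage-1",
    "InsuranceClaims": "eu-central/storage-4",
}

def audit_deployment(observed: Dict[str, str]) -> Dict[str, str]:
    """Return the partitions whose observed storage differs from the model."""
    return {
        partition: location
        for partition, location in observed.items()
        if declared_locations.get(partition) != location
    }

# Example run: Radiology has drifted to a storage site the model never declared.
drift = audit_deployment({
    "Surgery": "eu-west/storage-1",
    "Radiology": "us-east/storage-9",
    "InsuranceClaims": "eu-central/storage-4",
})
print(drift)   # {'Radiology': 'us-east/storage-9'}
```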


Next, we apply a subset of MaaS properties to the healthcare Use Case above. Conversely, a subset of Data Model properties is applied along the DaaS lifecycle states:


MaaS Property: Data Location
DaaS Lifecycle States: Create Data Model; Model Archive and Change; Deploy and Share
Healthcare Data Model Properties: Data models contain partitioning properties and can include data location constraints. User tagging of data (a common Web 2.0 practice, through clinic user-defined properties) should be managed. Support for compliant storage of preventative-care data records should be provided.

MaaS Property: Data Persistence
DaaS Lifecycle States: Create Data Model; Model Archive and Change; Secure Delete
Healthcare Data Model Properties: For any partition, sub-model or version of the model, the data model has to label and trace data location. The model defines a map specifying where data are stored (ambulatory care and clinical files have different storages). Provider persistence can be registered. Data discovery can update partition properties to identify where data are located.

MaaS Property: Data Inference
DaaS Lifecycle States: Create Data Model
Healthcare Data Model Properties: The data model has to support inference and special data aggregation: ambulatory data might allow inferring a patient's insurance file, for example. All inferences and aggregations are defined, updated and tested in the model.

MaaS Property: Confidentiality
DaaS Lifecycle States: Create Data Model; Populate, Use and Test
Healthcare Data Model Properties: The data model guides rights assignment, access controls, rights management and application data security. As different tenants (hospitals, clinics, insurance companies and pharmacies) access the data, users and tenants should be defined inside the model. Logical and physical controls have to be set.

MaaS Property: High Availability
DaaS Lifecycle States: Deploy and Share; Model Archive and Change
Healthcare Data Model Properties: The data model and partitioning configuration, together with model changes and versions, permit mastering a recovery scheme and restoration when needed. Data inventory (classified by Surgery, Radiology, Cardiology, for example) versus discovery has to be traced and set.

MaaS Property: Fast Updates at Low Cost
DaaS Lifecycle States: Create Data Model; Generate Schema/Update Data Model
Healthcare Data Model Properties: Data reverse and forward engineering permits change management and version optimization in real time, directly on the deployed data properties.

MaaS Property: Multi-Database Partitioning
DaaS Lifecycle States: Create Data Model; Deploy and Share
Healthcare Data Model Properties: Bi-directional partitioning in terms of deployment, storage and evolution through model versioning has to be set. Multi-DBMS version management helps in sharing multi-partitioning deployments: for example, Insurance and Surgery data for the same Patient are normally partitioned and belong to different tenants and different databases.

MaaS Property: Near-Zero Configuration and Administration
DaaS Lifecycle States: Create Data Model; Generate Schema/Update Data Model
Healthcare Data Model Properties: Data models cover and contain all data properties, including scripts, stored procedures, queries, partitions, changes and all configuration and administration properties. Administrative effort therefore decreases, leaving more time for data design, update and deployment. Regulation compliance can be a frequent administration task: models ensure that healthcare compliance and governance stay aligned.
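The mapping above lends itself to a plain-data representation. The following sketch simply encodes, in Python, which DaaS lifecycle states each MaaS property is handled in; the property and state names mirror the table, and nothing beyond that is assumed.

```python
# Mapping of MaaS properties to the DaaS lifecycle states in which they are addressed.
MAAS_TO_DAAS = {
    "Data Location": ["Create Data Model", "Model Archive and Change", "Deploy and Share"],
    "Data Persistence": ["Create Data Model", "Model Archive and Change", "Secure Delete"],
    "Data Inference": ["Create Data Model"],
    "Confidentiality": ["Create Data Model", "Populate, Use and Test"],
    "High Availability": ["Deploy and Share", "Model Archive and Change"],
    "Fast Updates at Low Cost": ["Create Data Model", "Generate Schema/Update Data Model"],
    "Multi-Database Partitioning": ["Create Data Model", "Deploy and Share"],
    "Near-Zero Configuration and Administration": ["Create Data Model", "Generate Schema/Update Data Model"],
}

def states_for(maas_property: str) -> list:
    """Lifecycle states in which the given MaaS property must be designed and checked."""
    return MAAS_TO_DAAS.get(maas_property, [])

print(states_for("Confidentiality"))   # ['Create Data Model', 'Populate, Use and Test']
```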



The Outcome
MaaS defines the service properties through which the DaaS process can be implemented and maintained. Consequently, applying the Use Case along the directions introduced above should yield the following outcomes.

Qualitative Outcomes:
1) Healthcare actors share information on the basis of data models defined "on-premise": models can be implemented and deployed using a model-driven paradigm;
2) Data Models are standardized in terms of naming conventions and conceptual templates (Pharma, Insurance, Municipality and so on): models can be modified and updated with respect to the knowledge they were initially designed for;
3) Storage and partitioning in the Cloud can be defined "a priori", and periodic audits can be set up to certify that data remain coherent and linked to specific sites;
4) Users consult the information and perform two tasks:
4.1) they search and navigate the knowledge for personal and work activities;
4.2) they give back information about user experience and about practices/procedures that should be updated, rearranged, downsized or extended depending on community needs, types of interaction, events or specific public situations.
5) Models are "on-premise", policy-driven tools. Regulation compliance rules can be included in the data model, so a change in a compliance constraint means a change in the data model before the new version is deployed (a minimal sketch follows this list).
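As a sketch of outcome 5, the snippet below illustrates, with hypothetical names only, how compliance rules embedded in the model could force a new, auditable model version whenever a constraint changes, so deployment always picks up the updated version rather than mutating the old one.

```python
from dataclasses import dataclass, field, replace
from typing import Dict

@dataclass(frozen=True)
class ModelVersion:
    # Frozen: an archived model version is never modified in place.
    version: int
    compliance_rules: Dict[str, str] = field(default_factory=dict)

def update_compliance(model: ModelVersion, rule: str, value: str) -> ModelVersion:
    """Changing a compliance constraint always yields a new, auditable model version."""
    rules = dict(model.compliance_rules)
    rules[rule] = value
    return replace(model, version=model.version + 1, compliance_rules=rules)

v1 = ModelVersion(version=1, compliance_rules={"retention": "10 years"})
v2 = update_compliance(v1, "retention", "15 years")
print(v1.version, v2.version)   # 1 2 -- deployment picks up v2; v1 stays in the archive
```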

Quantitative Outcomes:
1) Measurable and traceable cost reduction (to be calculated as a function of the annual Cloud fee, resource tuning and TCO);
2) Time reduction in terms of fast knowledge design, update, deployment, portability and reuse (to be calculated as a function of SLAs, data and application management effort and ROI);
3) Risk reduction according to "on-premise" Cloud service design and control (to be calculated as a function of recovery time and of chargeback on the cost of applied countermeasures, compared with periodic audits based upon model information). A purely illustrative calculation is sketched below.
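Since the article leaves the cost function unspecified, the following is only an illustrative calculation with placeholder figures; it is not a formula from the source, merely an example of how the cost-reduction outcome might be expressed and tracked.

```python
# All figures are placeholders; the formula is an illustrative example, not the article's method.
annual_cloud_fee = 120_000.0        # EUR/year paid to the Cloud provider
resource_tuning_saving = 15_000.0   # EUR/year saved by model-driven tuning
tco_before = 480_000.0              # EUR/year, pre-Cloud total cost of ownership
tco_after = 360_000.0               # EUR/year, TCO including the Cloud fee

cost_reduction = (tco_before - tco_after) + resource_tuning_saving
reduction_pct = 100.0 * cost_reduction / tco_before
print(f"Illustrative annual cost reduction: {cost_reduction:,.0f} EUR ({reduction_pct:.1f}%)")
```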

Conclusion
MaaS might provide a real opportunity to offer a unique, utility-style model lifecycle that accelerates Cloud data optimization and performance across the healthcare network. Applied to healthcare services, MaaS might be the right way to transform medical service delivery in the Cloud. MaaS defines "on-premise" data security, coherence, outage handling, availability and geo-location, as well as an assisted service deployment. Models are adaptable to various departmental needs and organizational sizes; they simplify and align healthcare domain-specific knowledge by combining the data-model approach with the on-demand nature of cloud computing. MaaS agility is the key requirement for data service design, incremental data deployment and progressive data structure provisioning. Finally, the model approach allows the validation of service evolution: model versions and configurations form a catalogue for managing both data regulation compliance [12] and the clauses of data contracts in the Cloud among IT, Providers and Healthcare actors [9].

References
[1] N. Piscopo – ERwin® in the Cloud: How Data Modeling Supports Database as a Service (DaaS) Implementations
[2] N. Piscopo – CA ERwin® Data Modeler's Role in the Relational Cloud
[3] D. Burbank, S. Hoberman – Data Modeling Made Simple with CA ERwin® Data Modeler r8
[4] N. Piscopo – Best Practices for Moving to the Cloud using Data Models in the DaaS Life Cycle
[5] N. Piscopo – Using CA ERwin® Data Modeler and Microsoft SQL Azure to Move Data to the Cloud within the DaaS Life Cycle
[6] N. Piscopo – MaaS (Model as a Service) is the emerging solution to design, map, integrate and publish Open Data, http://cloudbestpractices.net/2012/10/21/maas/
[7] N. Piscopo – MaaS Workshop, Awareness, Courses Syllabus
[8] N. Piscopo – DaaS Workshop, Awareness, Courses Syllabus
[9] N. Piscopo – Applying MaaS to DaaS (Database as a Service) Contracts. An Introduction to the Practice, http://cloudbestpractices.net/2012/11/04/applying-maas-to-daas/
[10] N. M. Josuttis – SOA in Practice
[11] H. A. J. Narayanan, M. H. Güneş – Ensuring Access Control in Cloud Provisioned Healthcare Systems
[12] Kantara Initiative – UMA Scenarios and Use Cases, http://kantarainitiative.org/confluence/display/uma/UMA+Scenarios+and+Use+Cases

Disclaimer
This document is provided AS-IS for your informational purposes only. In no event will the contents of "How MaaS might be applied to Healthcare – A Use Case" be liable to any party for direct, indirect, special, incidental, economic (including lost business profits, business interruption, loss or damage of data, and the like) or consequential damages, without limitation, arising out of the use or inability to use this documentation or the products, regardless of the form of action, whether in contract, tort (including negligence), breach of warranty, or otherwise, even if advised of the possibility of such damages. Specifically, any warranties are disclaimed, including, but not limited to, the express or implied warranties of merchantability, fitness for a particular purpose and non-infringement, regarding this document or the products' use or performance. All trademarks, trade names, service marks and logos referenced herein belong to their respective companies/offices.

