Cloud Platforms: The Crucial Foundation | @CloudExpo #IoT #Microservices

How a cloud environment can offer the flexible resources, elastic scalability and system reliability

Cloud Platforms: The Crucial Foundation to Safeguard the Pharmaceutical Supply Chain

In today's pharmaceutical supply chain, counterfeit activity is thriving. As pharma companies have expanded target markets and outsourced production over the last decade, the supply chain has become increasingly global, virtual, and vulnerable. Illicit activity has thrived, and patients have suffered, with hundreds of thousands dying each year from counterfeit and contaminated drugs.

More than 40 countries have responded with new laws that regulate prescription medications as they travel through the supply chain. While this is a quantum leap forward for patient health, the implications for supply chain stakeholders, from pharmaceutical companies and their contract manufacturing partners to the pharmacies that serve patients, are daunting: they must master each country's disparate track and trace requirements; create a system architecture capable of generating, managing, and storing unprecedented volumes of regulated data; and figure out how to efficiently exchange that data with hundreds to tens of thousands of direct and indirect supply chain partners.

In the process, traditional technology will be put to the test. Traditional, on-premises data center and enterprise software infrastructure will prove incapable of meeting the enormous data management and transactional processing demands that compliance will introduce. Cloud computing networks offer the only practical solution for this challenge.

This article discusses how a cloud environment can offer the flexible resources, elastic scalability and system reliability required to meet compliance regulations and patient safety protections that necessitate tracking and tracing medications through every step of the supply chain.

From Local to Global: Increasingly Virtual and Vulnerable
Emerging market demand for medicines has skyrocketed in the past decade, and a better understanding of patient health has driven parallel growth in increasingly specialized medicines. In response, the pharmaceutical supply chain has shifted from a predominantly local presence, with local pharmaceutical companies producing medicines for the local market, to a truly global supply network linking diverse members across multiple continents to serve global patient populations. On this global network now flows an increasingly broad range of medicines, driving pharmaceutical companies to virtualize many operations and establish complex new supply relationships in order to reach and serve a global, increasingly diverse market.

Many have built relationships with local contract manufacturers, pharmaceutical companies and third-party logistics organizations for production and distribution efficiencies. These partnerships have allowed them to decrease fixed costs, improve agility and enhance access to the global market by cost-effectively producing and distributing product around the world. This growth also creates vulnerability: a longer supply chain, with more changes of possession and ownership from drug manufacturing to patient dispensation, creates more opportunity for counterfeiting and diversion.

The true scope of counterfeiting - which encompasses everything from inactive or incorrect ingredients to drugs with high levels of contaminants or fake packaging - is unknown, but the World Health Organization (WHO) estimates that between 1% and 10% of drugs sold around the world are impacted, and the overall worldwide sale of counterfeit drugs is valued at more than $75 billion a year. The cost to patient safety: immense.

The Response: Track and Trace Regulation
In response to increasing drug integrity concerns, more than 40 countries, including the U.S., China, India, South Korea, Brazil and the member states of the European Union, have introduced track and trace laws to identify, track, and verify product as it passes through the supply chain. By 2019, more than 75% of the world's prescription medications will be protected by legislation. From a patient health perspective, this is unequivocally good news.

For the industry, however, these diverse regulations introduce a web of challenges. While most regulations involve some combination of generating and managing serial numbers for each unit of drug product, tracking product through the supply chain, verifying data and transactions at various points in the network, and reporting associated events to government authorities, no two countries have passed the same requirements or mandated the same information formats. This is particularly problematic when you consider that tens of billions of units of drug product are produced and distributed annually across a global supply network made up of well over one million members.
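Most of these laws require a unique, typically randomized serial number on every saleable unit. As a minimal, purely illustrative sketch (actual encodings follow standards such as GS1's SGTIN, with country-specific variations; the function and parameters here are hypothetical), randomized serial generation might look like:

```python
import secrets

def generate_serials(count, length=12, alphabet="0123456789"):
    """Generate `count` unique randomized serial numbers.

    Randomization, rather than sequential numbering, makes valid
    serial numbers hard for counterfeiters to guess - which is why
    many track and trace regulations recommend or require it.
    """
    serials = set()
    while len(serials) < count:
        serials.add("".join(secrets.choice(alphabet) for _ in range(length)))
    return serials

# One small commissioning batch; real deployments generate millions.
batch = generate_serials(1_000)
```

Generating, allocating and reconciling such numbers across internal and contract packaging lines is exactly the kind of bookkeeping the regulations multiply by country.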

Global Track and Trace Breaks Traditional Enterprise Approaches
Track and trace compliance demands test - and surpass - the limits of what traditional technology and enterprise infrastructures can manage in two key areas: big data management and trading partner integration. Let's look at what a mid-sized pharmaceutical company serving the United States market may face under the Drug Supply Chain Security Act (DSCSA). Our model company may produce 50 million units of drug product annually for the U.S. market. All 50 million units need unique serial numbers, along with the 1 million cases they are packed in. Each serialized item generates an average of 5 serialization events to be managed, resulting in 255 million transaction events and, at 1 kilobyte per event, more than 255 gigabytes of compliance data annually. Across a 10-year record retention period, the company would be faced with securely managing over 2.5 billion compliance events and over 2.5 terabytes of compliance data.
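The arithmetic behind the model company's numbers can be sketched in a few lines (all figures are the assumptions stated above, not measured data):

```python
# Back-of-envelope compliance data volumes for the DSCSA model company.
units_per_year = 50_000_000     # serialized saleable units produced annually
cases_per_year = 1_000_000      # serialized cases they are packed in
events_per_item = 5             # avg serialization events per serialized item
event_size_kb = 1               # approx size of one compliance record
retention_years = 10            # DSCSA record retention period

items_per_year = units_per_year + cases_per_year          # 51 million items
events_per_year = items_per_year * events_per_item        # 255 million events
data_per_year_gb = events_per_year * event_size_kb / 1_000_000   # ~255 GB

total_events = events_per_year * retention_years          # ~2.55 billion
total_data_tb = data_per_year_gb * retention_years / 1_000       # ~2.55 TB
```

Note that this is one mid-sized company in one market; multi-national firms face this calculation once per regulatory regime.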

The network challenges are just as daunting. A mid-sized company may use a dozen or more contract partners to augment internal sites in producing serialized products for different markets. All of these sites, each with its own serialization line management system, need to be integrated across multiple transaction types. Even more difficult are the downstream network connection challenges. Dozens to hundreds of direct trading partners, including wholesale distributors, repackagers and dispensers, need to be connected for lot-level traceability and serialized product data exchange as products are sold, distributed, returned and verified during investigations. Add in the requirement to receive and respond to serialized product verification requests from thousands of indirect trading partners, such as independent pharmacies, and the pharmaceutical manufacturer faces an unprecedented challenge. This intricate, highly connected serialized supply network will need high-performance electronic connections over which millions of transactions and gigabytes of compliance data will flow.

The U.S. DSCSA example is just one of many scenarios facing multi-national pharmaceutical companies. All in all, an in-house infrastructure to support this network ecosystem would be prohibitively expensive and complicated from a capital, personnel, and expertise perspective. Because of the dynamic nature of global track and trace regulations, companies making large capital investments in physical IT infrastructure also take on unacceptable risk, as regulatory environments and their related data management requirements may change on a yearly or even quarterly basis. Given the volume and complexity of required partner connections, an individual, point-to-point approach built upon traditional enterprise infrastructures using standard SQL databases is simply not feasible. This is why Life Sciences must look to the cloud for a solution.
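The connection math alone illustrates why point-to-point integration fails. As a rough sketch, assuming a worst case in which every network member must be able to exchange data with every other, direct links grow quadratically, while a shared network platform needs only one integration per member:

```python
def point_to_point_links(n):
    # Full-mesh worst case: every member builds and maintains a
    # direct integration with every other member.
    return n * (n - 1) // 2

def network_links(n):
    # Network-based model: each member integrates once, with the
    # shared platform, and reaches everyone through it.
    return n

# Even modest networks diverge quickly.
assert point_to_point_links(100) == 4_950
assert network_links(100) == 100

# At the scale of the global supply network (~1 million members),
# a full mesh would require roughly 500 billion direct links.
assert point_to_point_links(1_000_000) == 499_999_500_000
```

Real networks are sparser than a full mesh, but the asymmetry holds: every new direct partner costs a mesh participant a new integration, while a network member's cost stays constant.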

Network-based Cloud Platforms Are the Only Way to Meet Emerging Needs
In order to succeed in the new global, virtual, and regulated landscape, the Life Sciences supply chain needs a new data exchange model and supporting platform that can scale to manage trillions of objects and transactions, and handle seamless exchange of this data amongst global supply chain partners that number over a million worldwide. Plus, as personalized medicine, patient regimen compliance and other patient engagement programs are developed across the industry, this platform will need to connect to the billions of patients globally receiving medicines. The industry needs a platform that can facilitate communications, generate efficiencies, guarantee compliance and protect information - and do all of that in a cost-effective manner.

Cloud computing infrastructures such as Amazon's AWS, utilizing global, on-demand data and transaction processing resources, tick all of the requisite boxes. Yet cloud environments come in several different flavors, so it's important to understand which designs and configurations will meet the emerging challenges facing the industry.

Public vs. Private
A private cloud - dedicated to only your company - offers limited elasticity within your private data center, while a public cloud is a true utility computing model. For the reasons of scalability, along with solid disaster recovery and security, a public cloud model makes the most sense to support the pharmaceutical supply chain.

Single, Multi-, and Network-Tenant Environments
In the new pharmaceutical regulatory environment, understanding the attributes of single-, multi-, and network-tenant clouds is critical to a company's ability to exchange required data with their full complement of internal sites, supply and trading partners. The goal is to find the right infrastructure that supports the necessary data, transactional and connectivity requirements.

A single-tenant architecture is a legacy approach that creates a separate software instance and supporting infrastructure for each customer and their data. This type of architecture will not support sharing compliance data with many partners, or keep pace with evolving regulations that necessitate regular updates.

In a multi-tenant setup, a single instance of software and the infrastructure serves multiple parties. Multi-tenancy is a step forward from single tenancy, but it does not explicitly facilitate connectivity and communication between companies.

Network-Tenancy for Mission-Critical Life Sciences
Network tenancy is a whole new dimension that builds on multi-tenancy, but was expressly created to meet the challenges of today's Life Sciences companies.

The fundamental purpose of a network-tenant environment is to enable interactions and share business processes across multiple companies on the network, establishing a new dimension of collaboration with existing partners and allowing for discoverability of new partners within a directory structure.

In a network-tenant solution, interactions with other companies are a paved road and a single process: you integrate into the network once, and can then interact with everyone. Network tenancy also provides data translation and transportation, enabling seamless information exchange in a regulatory world with incomplete standards and diverse corporate data preferences. Think of it like the Interstate highway system or the electrical grid, ensuring free flow across the network while enabling diverse entities to join.
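A toy sketch of the translation idea, with entirely hypothetical field names on both sides (real networks lean on standards such as GS1 EPCIS where they exist, and translate between formats where they do not):

```python
def to_partner_format(event: dict) -> dict:
    # Map a (hypothetical) internal event record to a partner's
    # (equally hypothetical) preferred schema. In a network-tenant
    # platform, a translation like this lives in the network once,
    # rather than being rebuilt for every point-to-point link.
    return {
        "serialNumber": event["sn"],
        "eventType": event["event"].upper(),
        "timestamp": event["ts"],
        "shipFromGLN": event["site"],
    }

shipment = {"sn": "1098765432", "event": "ship",
            "ts": "2016-03-01T09:30:00Z", "site": "0614141000005"}
partner_record = to_partner_format(shipment)
```

Each member keeps its own internal format and integrates once; the network handles the many-to-many mapping.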

Service providers distribute software updates to all network residents simultaneously, ensuring that all companies remain in compliance as regulations continue to evolve. Network tenancy builds on the security protocols of multi-tenancy, protecting data and private network interactions while enhancing interoperability and business growth by adding new forms of public discovery and inquiry across the network, similar to those seen in consumer networks like LinkedIn. Well-defined network-tenant platforms add additional layers of security by invoking a rigorous process of vetting new members prior to adding them to the network.

Specifically for Life Sciences, network-tenant platforms must go one step further. The systems deployed must be designed to meet regulatory requirements in areas like validation while ensuring overall system integrity and reliability as critical medicines are produced and distributed. The platform, and the processes used to build solutions on it, must minimize and isolate changes that impact validated systems while providing detailed IQ and OQ artifacts to support validation testing on new platform releases. In addition, inherent fail-over and redundancy must be designed in from the start for both data management and transaction processing.

Done right, network tenancy enables residents to share compliance-related business processes with diverse global supply and trading partners; seamlessly exchange data in a world without format standards; accommodate the massive scale demands of global serialization; and grow their businesses by supporting global supply relationships and discovery of new prospective partners. It also lays the foundation for new business value as any one network member now has global connection and reach to leverage for new forms of supply planning and patient engagement. It is the only solution that can accommodate the evolving needs of the fast growing Life Sciences supply chain.

Choosing the Right Solution for Success
By 2017, the global prescription drug market is projected to grow to $1.2 trillion, up from $900 billion in 2014. Counterfeit activities are also poised to accelerate. The stakes are high - and there's no question that network-tenant cloud platforms not only change the game but are a necessary underpinning for this highly regulated and fast-evolving environment.

As outlined here, a public, network-tenant cloud environment that offers flexible interoperability and global elasticity will support businesses as they roll out serialization and track and trace solutions. It will make connecting with global trading partners and exchanging data as simple as integrating with the platform once, and then interoperating with everyone. It will create a global network for collaboration and enable the smallest emerging biopharma or street-corner pharmacy to enjoy the same global computing firepower and capabilities as the largest global pharmaceutical company.

The right platform will also level the playing field in terms of technical resources and support tighter financial accountability. Most importantly, though, the right networked-cloud environment is the only way to comprehensively track medications through every step of the supply chain, and truly protect patient safety.

More Stories By Brian Daleiden

Brian Daleiden is Vice President of Industry Marketing at TraceLink, a company focused on helping pharmaceutical manufacturers, distributors and dispensers deliver safe prescription medicines to patients everywhere with the largest track and trace network in the Life Sciences industry. In this capacity, he leads the company’s thought leadership, global regulatory analysis and market education programs that help industry stakeholders understand and respond to emerging regulatory, business and technical requirements.

Brian holds an MBA from Vanderbilt University and a BS from the University of Wisconsin. Brian can be reached at [email protected]

