The rapid growth of biopharma, health care, and regulatory systems has resulted in process and information anarchy. Business process management (BPM) has become too complex, geographically dispersed, time-dependent, and bureaucratic to be managed by people alone. Although new telecommunications and data standards are slowly emerging, in general, each organization within the pharmaceutical value chain has been left to determine for itself how and when to integrate its information systems platforms -- computers, operating systems, applications, storage, and so on. In traditional enterprise systems models, most processes associated with business management are performed by individuals, not software. However, in industries where software facilitates the management of business processes (for example, financial services and manufacturing), the strategic focus has been on developing reference IT architectures using Internet-based Web services and BPM driven by software, with humans cast in the supporting role. Dell Computer Corp. and Fidelity Management are examples of enterprises using software-driven BPM.
From the bio-IT perspective, this means having an enterprise-scale IT architecture that can manage the integrated complex of organizations within the life science value chain -- partners, regulators, clinical, manufacturing, distribution, and patients. Let's talk about how strategically applying Web services -- a distributed computing architecture that enables access to data and applications using Internet protocols -- can dramatically reduce costs and complexity of software and BPM.
The regulatory and partnership data "chains" within discovery and product development have resulted in extraordinarily complex systems integration requirements with regard to security, access, and control of applications and databases. This complexity creates two major problems: registering users, business and software processes, and data in real time; and granting access to those processes and data selectively, only to those who need it, under auditable circumstances. Together, these constitute the Regulated Data Exchange (RDE) problem.
Solving the RDE problem requires:
-- Integrating computers, operating systems, business and software processes, databases, and registration information into Internet-based systems using Web services protocols.
-- Creating enterprise registries for sharing business processes and data that will be used within the enterprise and accessed outside the enterprise.
-- Implementing security, access, and control processes and Web services that facilitate what Eli Lilly and Co. CIO Roy Dunbar has dubbed the "selective transparency" of information systems. Selective transparency allows processes and data to become, temporarily or permanently, "visible" and accessible to an authorized party for an audited period, for a known purpose.
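Selective transparency can be illustrated with a minimal sketch: an access grant that makes one resource visible to one authorized party, for a known purpose, within a bounded time window, with every decision written to an audit trail. All class and field names here are illustrative, not drawn from any vendor product.

```java
import java.time.Instant;
import java.util.ArrayList;
import java.util.List;

// Sketch of "selective transparency": data becomes temporarily visible to an
// authorized party, for a known purpose, for an audited period.
public class SelectiveTransparency {

    record Grant(String party, String resource, String purpose,
                 Instant notBefore, Instant notAfter) {}

    static final List<String> auditLog = new ArrayList<>();

    // Returns true only if the party holds a grant covering the resource at
    // the moment of the request; every decision, allowed or not, is audited.
    static boolean isVisible(List<Grant> grants, String party,
                             String resource, Instant when) {
        boolean allowed = grants.stream().anyMatch(g ->
                g.party().equals(party)
                && g.resource().equals(resource)
                && !when.isBefore(g.notBefore())
                && !when.isAfter(g.notAfter()));
        auditLog.add(when + " " + party + " -> " + resource
                + " : " + (allowed ? "GRANTED" : "DENIED"));
        return allowed;
    }
}
```

The point of the sketch is that visibility is a revocable, time-boxed property of the grant, not a permanent property of the account.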
Johnson & Johnson, for example, is developing a system to enable Web-based access to more than 200 operating divisions covering more than 120,000 application end users. This RDE enterprise application uses a Web services-based directory to provide secure, authenticated access to regulated applications and Web services. In this way, J&J is able, with its systems integration partner, Northrop Grumman Mission Systems (formerly TRW Healthcare), to rapidly create XML wrappers around enterprise applications like Oracle, SAP, and internal custom programs.
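An "XML wrapper" in this sense is a thin layer that accepts an XML request, calls an existing internal routine, and returns an XML reply, so the legacy program itself never changes. The following is a hypothetical sketch; the element names and the order-status lookup are invented for illustration, not J&J's actual interfaces.

```java
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;

// Hypothetical XML wrapper around a legacy routine: the wrapper decodes the
// request, delegates to existing code, and encodes the reply.
public class OrderStatusWrapper {

    // Stand-in for a call into a legacy ERP program.
    static String legacyLookup(String orderId) {
        return orderId.equals("42") ? "SHIPPED" : "UNKNOWN";
    }

    static String handle(String requestXml) {
        try {
            DocumentBuilder b =
                    DocumentBuilderFactory.newInstance().newDocumentBuilder();
            Document doc = b.parse(new ByteArrayInputStream(
                    requestXml.getBytes(StandardCharsets.UTF_8)));
            String orderId = doc.getElementsByTagName("orderId")
                    .item(0).getTextContent();
            return "<orderStatus id=\"" + orderId + "\">"
                    + legacyLookup(orderId) + "</orderStatus>";
        } catch (Exception e) {
            return "<fault>malformed request</fault>";
        }
    }
}
```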
Security Is Job 1
Any Web services architecture must be founded on security, access, control, and metadata resources that facilitate the integration of enterprise systems. New systems and data are registered in the enterprise, and data access is provided through controls that allow departments, partners, and others selective access to data depending on their validated needs and rights. Data is secured, integrated, and available for reuse at all times.
Selective transparency allows an employee, research partner, regulator, or any authorized entity or person access to databases or the ability to execute a transaction, program, or report, based on role and security, access, and control criteria. Thus, selective transparency drives the need for integration of systems and data.
Web-based application and systems integration is focused on the ability of Internet-enabled applications to exchange messages in real time through a remote procedure call (RPC) using Web software services. These services register applications, software services, and security, access, and control information as enterprise data in directories that are securely available within the enterprise.
Being conservative, life science organizations are only now beginning to implement Web services. Many are waiting on improvements in development tools, security software, and standards.
Getting started with Web services requires a strategy, a security model, and software and tools, typically from IBM Corp., Sun Microsystems, or Microsoft Corp. Sun is offering Sun One, its platform for building and deploying services on the Web. IBM has the most advanced libraries of software development tools for handling things ranging from security to data management. Microsoft is still developing .NET but can be expected to be a major contributor. Operating systems ready for Web services include IBM AIX, Microsoft Windows 2000, and especially Linux.
When starting a Web services design, focus on architecture and requirements. Web services are easier to code than application programming interfaces (APIs), so the emphasis should be on strategy, business functions, and requirements.
Time, labor, and costs should be 50 percent to 90 percent below standard systems integration expenses. A small application can begin at about $250,000, but regulated life science applications with validation and verification will start at around $1 million.
To begin, pass all users through a security, access, and control (SAC) application, itself implemented as a Web services program that provides for common verification, validation, and access to production applications (see "Web Services Life Sciences Systems Integration"). This gives users and software processes access to the Internet to send XML messages among applications. Since XML is a simple text-based markup language using standard Internet TCP/IP transport protocols, there is no "hard-wired" API to deal with among applications. Nor are there hard-wired telecommunications network links to maintain.
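The SAC checkpoint described above can be sketched as a single gateway that every XML message passes through: the caller is validated against the role the target service requires before the message is forwarded to a production application. This is a minimal illustration, assuming a simple user-to-role mapping; the class and service names are invented.

```java
import java.util.Map;
import java.util.function.UnaryOperator;

// Sketch of a security, access, and control (SAC) front end: one checkpoint
// validates every caller before any message reaches a production service.
public class SacGateway {

    private final Map<String, String> userRoles;               // user -> role
    private final Map<String, String> serviceRole;             // service -> required role
    private final Map<String, UnaryOperator<String>> services; // service -> handler

    SacGateway(Map<String, String> userRoles,
               Map<String, String> serviceRole,
               Map<String, UnaryOperator<String>> services) {
        this.userRoles = userRoles;
        this.serviceRole = serviceRole;
        this.services = services;
    }

    // Forwards the XML payload only when the caller's role matches the role
    // the target service requires; otherwise returns an XML fault.
    String dispatch(String user, String service, String xmlPayload) {
        String required = serviceRole.get(service);
        if (required == null || !required.equals(userRoles.get(user))) {
            return "<fault>access denied</fault>";
        }
        return services.get(service).apply(xmlPayload);
    }
}
```

Because every application sits behind the same checkpoint, verification and validation logic is written once rather than once per application.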
The network becomes decoupled from hard-wired database and API applications. Protocols can be coded for a fraction of the cost of coding APIs. Geographical and functional dispersion is inherently enabled since all messages utilize the Internet, thus providing ubiquitous and global instant access to each Web service.
This approach differs from traditional systems integration in that the focus moves away from coding APIs that must traverse telecommunications systems or send SQL commands to communicate among applications. When dealing with multiple applications, APIs, and dozens of different vendors, it becomes impractical to develop and implement specific APIs for every possible combination. Coding, and therefore maintenance, costs grow with every pairwise combination. Not so for Web services: one message, one XML form, one decoding rule set in Java or .NET.
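The scaling argument above can be made concrete with a little arithmetic (the function names are ours): with n applications, dedicated point-to-point interfaces number n(n-1)/2 in the worst case, while a shared XML message format needs only one adapter per application.

```java
// Worst-case interface counts for n applications: every pair needs its own
// point-to-point API, versus one adapter per application for a shared format.
public class InterfaceCount {
    static long pointToPoint(int n) { return (long) n * (n - 1) / 2; }
    static long sharedFormat(int n) { return n; }
}
```

For 20 applications that is 190 point-to-point interfaces versus 20 adapters, and the gap widens as the enterprise grows.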
Organizations implementing Web services can realize several financial and functional benefits:
-- Internal and external divisions, partnerships, and regulatory agency relationships can be realistically automated for the first time, because security, access, control, and business functionality can be defined by a Web service.
-- Systems integration costs are reduced, by as much as an order of magnitude, and interfaces are standardized.
-- Most important, use of BPM software is enabled so that business processes and latency can be monitored.
In the final state of the global enterprise network, applications, files, persistent data objects, and other databases are divided among production, transaction-processing, and business-intelligence applications and, most importantly, enterprise departments, external agents, and partners.
The systems integration architecture is responsible for ensuring that local applications and databases can transfer data to the central enterprise storage network (ESN) for registration and integration. Using enterprise-level security, access, and control solutions, regulated data are integrated within the ESN or local network storage architecture (NSA) for use by others.
This ensures that the data are registered, secured, and validated so all authorized local and global users, either locally in the NSA or globally through the ESN, can access them. It enables effective, regulated, and secure data exchange with dramatic reduction in systems integration costs.
Moving data through the life science R&D value chain and monitoring business processes is difficult, time consuming, and costly. Security, access, and control systems are needed across the entire regulatory, contractual, and organizational command structure to facilitate sharing the right data with the right party under the right circumstances. This is now mandated by the FDA, the Centers for Medicare and Medicaid Services (the agency handling HIPAA rules and regulations), and EMEA (the European Agency for the Evaluation of Medicinal Products). Clearly, a Web services architecture, when combined with emerging data standards such as the electronic common technical dossier (eCTD), can become a standard for creating, storing, and sharing local and enterprise-scale applications and data within the life science industry.
The current state of many enterprise IT architectures is business process chaos, as shown by escalating costs and inefficiency of biopharmaceutical discovery and clinical development. Such chaos will become unacceptable to senior management, regulators, and shareholders, if it hasn't already.
For more than a thousand years, the construction industry has used architecture and standardization to facilitate building. Given the clinical and economic risks associated with discovery, manufacturing, and distribution, no less should be done among life science enterprises, partners, regulatory agencies, and health-care providers.
The tools exist, but the will to use them must also be there.
Web Services Life Sciences Systems Integration
New data or requests for data arrive from transaction-processing systems through Web service requests. Essentially, the following protocol is used: A software service -- for example, access to a particular database -- is registered as a Web service in a directory on the Internet (or intranet). All user software processes (services) are registered with security, access, and control rights. Application interfaces are coded as XML messages sent and received over RPC-style TCP/IP protocols. An application need only know the XML message protocol and how to respond.
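The sidebar's protocol can be sketched end to end: a service is registered in a directory under a name, and a caller looks it up by name and exchanges XML messages with it. Here an in-memory map stands in for a real registry (such as a UDDI directory), and the handler function stands in for a remote RPC endpoint; all names are illustrative.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.UnaryOperator;

// Sketch of the register-then-call protocol: the caller needs to know only
// the service name and the XML message format, not the implementation.
public class ServiceDirectory {

    private final Map<String, UnaryOperator<String>> registry = new HashMap<>();

    // Registration step: publish a named service in the directory.
    void register(String name, UnaryOperator<String> handler) {
        registry.put(name, handler);
    }

    // Invocation step: look up the service by name and send it an XML message.
    String call(String name, String xmlRequest) {
        UnaryOperator<String> handler = registry.get(name);
        if (handler == null) return "<fault>no such service</fault>";
        return handler.apply(xmlRequest);
    }
}
```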
Bernard P. Wess Jr. is founder and president of Perseid Software Ltd. He can be reached at email@example.com. -- Bio-IT World