eHealth Revolution - The transformation and strategies

Stephen Chu,  Adjunct Professor, Multi-Media University, Malaysia

Since the automation of administration and financial functions in the 1950s, the healthcare industry has been experiencing a new wave of digitalisation (Biesdorf S., Niedermann F. 2014), which many consider an eHealth revolution. The winners will be those who effectively conquer the disruption of emerging technologies, recognise the gaps, embrace relevant technologies to create new and differentiated opportunities and innovative services, and reach new markets.

The introduction of Diagnosis Related Groups (DRGs) in 1983 ushered in serious disruption to the healthcare industry and has been credited by different authors with causing the closure of many hospitals and many consolidations and mergers (Garner C.B. 2011; Jacobsen-Wells J. 1989). The healthcare industry today is impacted, probably on a more serious scale, by many different disruptive forces: upward pressure on healthcare costs, shrinking healthcare resources (financial and professional), increasing regulatory and professional compliance requirements, consumer demands for choice, better quality and a bigger role in decision making, and, above all, eHealth technology-induced changes.

The Choices: Transform or Perish

Many healthcare organisations are caught at the intersection of these disruptive forces and realise that their existing business models and core services are becoming ineffective, inefficient and financially unviable. Consequently, the healthcare industry is being forced to undergo significant structural and functional transformations that may help health organisations recognise gaps in their core businesses and create new and innovative services. Those that fail to transform successfully may find themselves following the path of the DRG casualties of the 1990s.

Quality Reform

Since the publication of the Institute of Medicine (IOM) Crossing the Quality Chasm report in 2001, the healthcare industry has been working to realise the elusive goal of improving quality of care.

A more recent analysis by a US researcher of studies published between 2008 and 2011 revealed that preventable harm still contributes to an estimated 210,000 to 400,000 deaths per year in US hospitals (James JT, 2013).

For quality improvement to be effective, it needs to be addressed through multiple dimensions:

Improvement of care coordination

The prevalence of chronic diseases, especially in developed nations, is reaching epidemic levels. Epidemiologic studies reveal that 85 per cent of patients 65 years and older have more than one chronic condition; 14 per cent of patients 65 years and older have six or more chronic conditions; and these 14 per cent consume 45 per cent of annual healthcare costs (Anderson G, Horvath J, 2004). Fuelled by lifestyle problems, the upward trend and cost pressure show no signs of receding.

These patients are managed by multiple healthcare providers. Each of these conditions has serious risk impacts on many others (e.g. diabetes has serious adverse impacts on cardiovascular conditions, arthritis, and mental health), hence requiring careful and well-coordinated planning and delivery of effective management strategies.

Healthcare is tribal in nature, especially in the private sector. Most providers operate in silos, resulting in very poor information sharing and care coordination. Financial factors tend to accentuate such practice.

Some countries (e.g. the UK and New Zealand) have implemented policies requiring patients to register with nominated primary care providers. However, care coordination failure is not uncommon even under such a model; compounded by lack of access to relevant patient information during out-of-hours care, it can result in serious patient harm, including death (Pallister D, 2007).

The shared care plan is considered an effective facilitating tool for disease management planning and care coordination (ONC, LCC). Strategic use of health IT to facilitate authorised sharing of electronic patient information such as health summaries, test results and care plans may help improve the quality of care decisions.

Clinical process-based continuous improvement

Quality assurance / improvement programmes have produced some promising results (Brown GE, 1998; Canovas JJ, Hermandez PJ, Botella JJ 2009). However, such programmes are retrospective in nature, focusing on collection and analysis of variance or sentinel event data (ECRI Institute, 2009). They may help prevent future adverse events, but bring no comfort to those identified as having already suffered harm.

A far superior mechanism is the incorporation of Deming's continuous quality improvement model, 'plan-do-check-act' (PDCA), into continuous clinical process improvement (Chu S, Thom J, 1994). With the PDCA steps built into clinical information systems or organisational electronic health record systems, data reflecting patient clinical status or outcomes are continuously monitored and compared to benchmark/outcome measures in real time as the care processes occur. The approach is designed to pre-emptively identify risks and prevent harm. Any real-time patient data indicating deviation from the benchmark, after patient-specific threshold adjustments, may trigger alerts and prompt clinicians to start micro-management of the patient before he/she falls off the critical path. The outcome will always be far superior to treating the patient when damage or complications start to emerge.

Performance/outcome-based payment

Healthcare payment models fall into several categories (Miller HD, 2009; Silversmith J, 2011). The most common include: fee for service (the amount of payment is predetermined for each discrete service); episode-of-care payment (a single payment for a group of services related to a treatment or condition that may involve multiple providers in multiple settings for a single care episode); and comprehensive care payment (a total-cost-of-care model that provides a single risk-adjusted payment for the full range of healthcare services needed by a specified group of people for a fixed period of time).

It is widely recognised that healthcare systems in most countries perform poorly despite the rapidly escalating costs (Scheffler RM, 2010).

‘Payment for performance’ has been promoted as a better model to fund health services and is gaining traction in a number of developed economies such as the USA, Australia, the UK, Canada, New Zealand and Germany (Eijkenaar F, et al, 2013; Perrin B, 2013).

Payment for performance (P4P) is attractive to funders (governments included) due to its high potential to lift provider performance and care quality. International evidence so far, however, has shown only marginal or little benefit in quality improvement (Appleby J, et al, 2012; Jha AK, et al, 2012; Mehrotra A, et al, 2009; Petersen LA, et al, 2006).

One cause of the relatively inconclusive evidence on P4P benefits is the notorious difficulty of determining a set of robust, evidence-based performance measures that can be used effectively in real-time outcome measurement.

Strategic use of health IT can support the collection, aggregation and analysis of large-scale clinical outcome data (at organisational, regional and national levels) to help determine which data can best predict outcomes, and which types of healthcare interventions, and the timing of their application, may produce the best/optimal clinical outcomes for specific types of patients and conditions.

Providers should be able to analyse the pool of ‘big data’ to determine which intervention worked best for a certain disease in patients with similar health profiles and to predict outcomes for those with different profiles. For example, it is possible to evaluate the clinical outcomes of breast cancer interventions against those of other specialists nationally or internationally and refine management strategies quickly. The analysis can also produce reliable, evidence-based outcome parameters for incorporation into clinical software to support real-time PDCA care quality improvement.
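As a toy illustration of this kind of comparative analysis, the sketch below groups pooled outcome records by patient profile and intervention, then picks the intervention with the best mean outcome for a given profile. All profile names, intervention names and outcome scores are fabricated for illustration only.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical pooled outcome records: (patient_profile, intervention, outcome_score).
# Higher outcome_score means a better clinical outcome. Purely illustrative data.
records = [
    ("age65+_diabetic", "drug_A", 0.72), ("age65+_diabetic", "drug_A", 0.68),
    ("age65+_diabetic", "drug_B", 0.81), ("age65+_diabetic", "drug_B", 0.79),
    ("age<65",          "drug_A", 0.85), ("age<65",          "drug_B", 0.77),
]

def best_intervention(records, profile):
    """Average outcomes per intervention for one patient profile and
    return the intervention with the highest mean score, plus all means."""
    scores = defaultdict(list)
    for prof, intervention, score in records:
        if prof == profile:
            scores[intervention].append(score)
    means = {i: mean(s) for i, s in scores.items()}
    return max(means, key=means.get), means

winner, means = best_intervention(records, "age65+_diabetic")
print(winner, means)  # the best-performing intervention for this profile
```

A production analysis would of course use risk adjustment and proper statistical testing rather than raw means; the point is only that pooled outcome data, once structured, supports exactly this kind of profile-specific comparison.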

Analytics on population-based intervention-outcome relationships may also help settle the debate on whether generic drugs are as effective as brand-name drugs.

When equipped with accurate organisational analytic data, healthcare institutes can effectively refine their clinical processes, perfect their health intervention and investment strategies, and significantly improve quality and performance.

Such analytic results will have far reaching health economic implications. The quality and economic benefits can be enormous.

Increase Consumer Choices and Participation

In nations that embrace the market economy, the principle of choice is intended to stimulate competition and hence deliver cheaper, better-quality products and services to consumers. This principle is gaining greater traction among astute and informed healthcare consumers.

In healthcare, where tribal and silo practices are common, greater choice often brings unintended consequences. Choice means that as consumers switch between providers, their health data are likely to be spread further among providers. Poor information sharing and care coordination are likely to increase the risk of compromised quality of decisions and care (Bellard K, 2011).

Choice and participation in health decisions are predicated on a high level of consumer health literacy. Even among those with a high level of health / medical knowledge, making the optimal choice for health problems can be extremely challenging (Rosenbaum L, 2003).

For the less informed, the Internet may provide a wealth of health-related information, but the volume and quality of the information (or the lack of quality) can cause havoc; one common and well-known problem is cyberchondria (White R, Horvitz E, 2009; WebMD 2002).

Effective consumer participation also requires consumers to take responsibility for their own health (Coulter A, et al, 2008). This means that (a) relevant patient-specific health information in plain language needs to be available to consumers, (b) resources are readily available to assist consumer understanding of such information (and to avoid unintended consequences such as cyberchondria), and (c) appropriate alerts and reminders to engage in healthy lifestyle behaviours and self-care (e.g. correct medication management and regime compliance) are provided to consumers when and where they are needed.

Home-based health status monitoring, supported by Internet-based, clinician-guided electronic health information delivery with the smartness to respond to real-time patient-specific health data, is likely to give visionary organisations a competitive edge.

Specialisation

The explosive advances in medical science have spawned rapidly increasing medical specialisation. Healthcare institutions respond by offering more and more services around the growing number of clinical specialties and subspecialties.

Systematic reviews appear to indicate that organisations focusing on highly specialised services delivered at high patient volumes produce far better patient outcomes and lower costs (Joynt KE, et al, 2003; Ley O, 2014). The findings are in sharp contrast to the conventional belief of tertiary and quaternary referral institutions that collecting all clinical specialties under the same entity produces the highest efficiency, care quality and revenue.

Around the world, specialist institutions such as cardiac/cardiothoracic, eye, head & neck, and orthopaedic hospitals are examples of hospitals electing to adopt the highly specialised pathway. This strategy will, however, reduce their scope of services and their capability to provide care for patients with multiple health conditions.

The strategic decision to take the core specialisation path creates a set of challenges, including how to determine the best location and the most suitable/niche clinical specialty (i.e. picking the winner), how to develop the required clinical expertise, and how to cater for the needs of patients with comorbidities that fall outside the organisation’s core expertise.

Super-specialisation limits a hospital’s ability to provide care to those with multiple complex healthcare needs, especially the 14 per cent with six or more chronic conditions.

The use of health IT to link external experts into the hospital, providing specialist care for patient problems outside the organisation’s core competency, will be a key strategic decision to ensure the capability to deliver appropriate quality care while keeping focus on the path of differentiation.

The adoption of an appropriate telemedicine model and adequate investment in supporting technologies (e.g. home-based patient monitoring) are examples of important strategic decisions.

New Services, New Market

Worldwide, medical tourism is estimated to grow at a rate of 15-25 per cent per annum, with revenue estimated at US$55 billion (Medical Tourism Statistics 2014).

Many Asian-Pacific nations (e.g. Singapore, Taiwan, South Korea, Thailand) are tapping into the rapidly growing medical tourism business. The industry competes for the business on price and/or quality bases.

Some nations/institutions choose to compete on price alone. Other countries or institutions may not be price competitive, but their differentiating factors will be specialisation, quality and strategic deployment of enabling information technologies.

One of the critical quality measures (and success factors) is pre- and post-treatment support to patients back in their home nations. Early adoption of new and emerging technologies, especially those enabling non-invasive clinical status monitoring, can provide the much-needed service differentiation and quality care enhancement, and hence a competitive edge.

‘Smart lens’ glucose monitoring technology for use in diabetes patients is one such example (Smart Lens Technology, 2014).

The strategic use of telehealth technologies to provide excellent expert support to local healthcare services in the patient’s home base is considered paramount.

IT Adoption: The challenges and strategies

The history of IT implementation is littered with project failures. In 2005, it was reported that about 5 to 15 per cent of IT projects were abandoned before or shortly after delivery (Charette RN, 2005).

As IT systems become more ubiquitous and complex, the failure rate increases. According to a Standish report on 3,555 projects analysed from 2003 to 2012, only 6.4 per cent were successful, while 41.4 per cent failed: they were either abandoned or restarted anew from scratch (Thibodeau P, 2013).

Health IT projects are very complex; their failure rate is estimated to be much higher.

Key challenges to health IT adoption and strategies are explored in this section.

Vision and Clear Realisation Pathway

The initiation of the British NHS National Programme for IT (NPfIT) in 2003 (eHealth Insider, 2003) typified a national eHealth programme with a grand and ambitious vision. Nations including Australia, Canada, the Netherlands and other EU nations, Singapore and the USA have all embarked on their own national eHealth programmes.

Grand visions need to be supported by clear realisation pathways with realistic objectives and milestones. The pathways need to acknowledge and address the difficult eHealth challenges born of the mismatches between the clinical and technical worlds, the unique socio-economic and cultural characteristics, and the high complexity of healthcare ecosystems.

eHealth programmes without clear, well-managed realisation pathways, and/or with poor insight into the associated challenges, are more likely to produce casualties than to deliver benefits.

The pathways should also be aligned with and capitalise on advancements in new and emerging technologies.

Clinical and Engineering Impedance Mismatch

Healthcare organisations are complex. Clinical processes are complex. The fundamental characteristics of complexity are imprecision and unpredictability. The human body exemplifies complex machinery that modern medicine is still struggling with. The difficulty of modelling knowledge and information on how best to manage multiple comorbidities is akin to that of modelling weather predictions for the next month. The socio-economic and cultural dynamics between the professional groups, and the imprecise, often convoluted healthcare languages used, add more layers of complexity to the ecosystem.

The practice of clinical documentation is characterised by expressiveness, with negations and double negations embedded in clinical statements. The use of propositional content that carries different illocutionary forces and context/user-specific perlocutionary intents further increases the difficulty of machine processing of clinical statements.

Engineering approaches to software design require precision and predictability. Even the more advanced fuzzy logic (Novák V, 1999) is inadequate for dealing with the imprecise, unpredictable and context-laden languages of modern medicine.

To optimise the power and benefits of big data analytics and clinical data reusability, precision is required. Processing fuzzy clinical statements ridden with negations and double negations, multiple illocutionary forces, perlocutionary intents, and changing contexts has presented seemingly insurmountable challenges for current technologies.

Most health IT system designers lack a reasonable understanding of the clinical information management problems at hand, let alone the inherent complexity of clinical medicine. They view clinical problems within the boundaries of current technological capabilities. Given the significant clinical and engineering impedance mismatch, there is little surprise that these systems are often declared unfit for purpose when they first meet their clinical users.

Clinical user engagement has been acknowledged to be an important strategy and success factor for health IT projects. However, it is often reduced to an ‘empty slogan’ and given lip-service treatment.

Technical experts often adopt a highly arrogant approach such as ‘show me your forms and I will design a system for you in less than a week’, or ‘train your clinicians to limit medication orders to within 50 characters’. Such arrogance has underpinned countless eHealth project disasters.

Compounding the health IT application design problem is the fact that clinicians are not trained to clearly articulate information and system requirements. Clinical-speak and technology language are diametrically different. Engagement exercises often end in escalating frustration, mistrust and, at times, outright hostility between the clinicians and the system analysts.

So what is the solution?

Engagement should be treated as a dynamic discovery process. The discovery should include setting the scope and boundaries of the target system, and communicating the limitations of technologies, how to address those limitations, complexity, fuzzy requirements and out-of-scope requirements. Roadmaps for realisation of the out-of-scope requirements, and for incremental realisation of complex requirements in step with technological advances and in alignment with organisational strategic directions, are also critically important, especially to avoid alienating passionate stakeholders. The unintended consequences of different design approaches should be explored and clearly communicated to stakeholders.

Selected clinical domain experts, who may also act as champions, should receive adequate training in writing high-quality clinical storyboards. These storyboards are effective communication tools between clinical users and the system analysts developing information and system requirements.

Clinical informaticians should be brought in as key conduits between clinical users and technical experts. Their ability to translate between the languages used by clinical and technical experts will help minimise or eliminate confusion, misunderstanding and mistrust between these two key groups.

Health IT Standards

It is commonly accepted that standards are critical for interoperability between the thousands of IT systems that must exchange health data. There are three significant challenges in eHealth standards:

There are too many standards. Internationally, there are many eHealth standards development organisations on the playground. A few examples include Health Level 7 (HL7), the International Organization for Standardization Technical Committee (ISO/TC 215), Integrating the Healthcare Enterprise (IHE), and the European Committee for Standardisation (Comité Européen de Normalisation) Technical Committee 251 (CEN/TC 251). Each develops a set of often similar and competing standards.

Standards are like toothbrushes: everyone needs and wants one (or a few), but no one is willing to use anyone else’s. This is the consequence of (a) the belief that ‘my requirements are unique and sufficiently different from others’, and (b) the ‘not invented here’ syndrome at play.

Standards are designed to accommodate the requirements of a large number of stakeholders; as such, they are necessarily broad in nature and subject to implementation-specific variations. This infamous statement is often repeated in standards meetings: ‘when you have seen one implementation of a particular HL7 v2.x standard, you have seen only one implementation’ (i.e. another implementation of the same HL7 v2.x standard is almost guaranteed to be different; the only question is how different).
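The v2.x variability problem can be made concrete with a toy sketch. The fragment below is illustrative only: the segment contents and the idea of a per-partner field index are fabricated to show why two interfaces that both claim conformance can still require partner-specific configuration.

```python
def parse_segment(segment: str) -> list[str]:
    """Split one HL7 v2-style segment into its pipe-delimited fields (toy parser;
    real v2 parsing also handles component, repetition and escape delimiters)."""
    return segment.split("|")

# Two fabricated PID segments from two trading partners that populate
# the patient-name field in different positions (illustrative values only):
pid_site_a = "PID|1||12345||Smith^John"   # name carried in field index 5
pid_site_b = "PID|1||12345|Smith^John|"   # name carried in field index 4

def patient_name(fields: list[str], name_index: int) -> str:
    """Per-partner configuration: which field index carries the patient name."""
    return fields[name_index].replace("^", " ")

print(patient_name(parse_segment(pid_site_a), 5))
print(patient_name(parse_segment(pid_site_b), 4))
```

Both calls recover the same name, but only because each interface is configured for its specific partner, which is exactly the integration cost that implementation variation imposes.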

What should be the optimal strategy on standards?

Standards are required to ensure an organisation’s capability to interoperably share and reuse health data to support collaborative care, to coordinate care amongst different specialist providers, and especially to support cross-border exchange of interoperable information for medical tourism requirements.

There are three strategies for effectively navigating the jungle of competing and often incompatible standards:

(1) Adopt and adapt standards that have the widest international acceptance and implementation footprints. The Health Level 7 (HL7) Clinical Document Architecture (CDA) is one such example. These standards are likely to be implemented in software by international health IT vendors, thus reducing the costs of implementation;

(2) adopt and adapt the standards that best fit the organisational business and clinical requirements;

(3) influence the development of international standards to optimise the incorporation of national or organisational requirements into standards design.

Creating one’s own standards is a non-option and should be strongly resisted (Figure 1).

Investment in engagement and active contributions to influence international standards development is the most desirable strategy.

There are three components in standards: (a) what is allowed / supported, (b) what is disallowed, and (c) the grey area.

Negotiating what is supported, what is not, and the technical implementation is difficult. It often involves ‘horse-trading’ of competing interests and agendas at social and political levels. The persons who champion the organisational or national interests need to be credible, respected and trusted, and to possess a very high level of people skills.

The negotiations will also cover the standard mechanisms required to constrain/profile the international standards such that organisational/national requirements are met without having to repurpose the international standards.

The organisation will need to establish a set of rules on how to deal with requirements that fall into the ‘grey areas’ (which are often placed in the ‘too hard’ bucket and excluded from the standards). Typically, ‘grey area’ content should be limited to requirements that can be safely ignored by information exchange partners without compromising patient clinical safety. Otherwise, bilateral or multilateral agreements with the exchange partners will be required.

Political and Cultural / Social Challenges

eHealth technologies are disruptive in nature. They impact multiple stakeholders whose responses are often unpredictable. The challenge of winning clinical support is highlighted by Massaro’s description of clinician adjustment to an eHealth system as passing through the ‘Kubler-Ross phases of mourning’ (Massaro TA 1993; Berg M, 2001).

Technology-induced changes inevitably trigger varying degrees of change in the balance of power among different user groups. For example, the introduction of IT-enabled case managers may change the coordination authority of some senior clinicians.

eHealth is a double-edged sword: it is intended to facilitate the work of clinicians, but it also imposes unfamiliar constraints on their work and facilitates scrutiny of their work by outsiders (Berg M, et al., 2000). Many clinicians react forcefully when required to change their workflow, or when required to be more structured and precise in entering their orders (Massaro TA 1993).

There is no question that disruptive technologies such as eHealth necessitate changes in business and clinical processes.

Business Process Redesign (BPR) requires that stakeholders be willing to radically redesign their work processes so as to optimise business effectiveness and efficiency (Davenport TH, 1993). However, healthcare’s core business processes consist of highly knowledge-sensitive activities typified by a complexity that defies the predictability and standardisation of simple engineering approaches. Forcing top-down work pattern changes is doomed from the start.

The best strategic approach is to accept the inherent complexity and unpredictability of the healthcare ecosystem, and to accept that these characteristics need to be nurtured and carefully managed in system development and deployment. Unforeseen spontaneous uses of the system should be investigated carefully with an open mind, and the unforeseen benefits of such uses drawn out.

eHealth technology implementation should be viewed in the light of organisational development. Implementing new technologies involves the mutual transformation of work practices and of the technologies themselves (Berg M, 1997; Bijker WE & Law J, 1992). Management and executives need to understand and accept that the system also has to evolve in the light of knowledge and experience gained during implementation. The costs of IT system refactoring driven by implementation experience and organisational transformation should be budgeted into the costs of IT development and deployment from project inception.

Measuring Project Success

IT project success is traditionally measured on two variables: (a) whether it is successful economically, i.e. on time and on budget, and (b) the number of users willing to use the system. Both are very poor indicators or measures for eHealth projects.

eHealth project success measurement is much more complex. There are at least two perspectives and several dimensions of measurement.

The organisation perspective

Economic measure – The project is delivered on budget and on time, and meets the requirements stated in the functional and technical design specifications.

Business effectiveness and efficiency measure – The IT system improves the competitive edge of the organisation’s business (which includes patient/customer satisfaction), and supports pan-organisational performance analysis and executive strategic decisions.

Professional user perspective

Clinical outcomes – The IT system facilitates real-time continuous quality improvement: e.g. it helps detect early warning signs of deteriorating patient clinical status (adjusted to the patient’s existing morbidity and comorbidities), supports early micro-management of the patient, and contributes to improvement of patient clinical outcomes; it also supports cross-patient and cross-provider comparative analysis of clinical effectiveness.

Clinical processes / workflow – The IT system reduces clinical documentation burden; enables seamless interoperable exchange and reuse of clinical data across clinical, research, epidemiological and cross-organisational secondary use requirements; facilitates easy access to evidence-based knowledge for improved decision making and effective patient management; optimises system commands to enable synergy between ordering, documentation and research functions; and provides intuitive navigation that preserves the context of use.

A sensible balance between the need for clinical expressiveness and precision (structured data entry) is important to ensure that the burden of data entry does not disrupt clinical workflow unnecessarily. Within the boundary of technical constraints, fully structured data entry is best limited to data components that produce the maximum return on documentation effort. Examples of these data components include: clinical findings, diagnoses, allergies/intolerances, adverse reactions, medications, and diagnostic test orders and results. Ethnographic methods of analysis to identify clinician documentation preferences, the burden of structured data entry, and how and to what extent clinical workflow may be interrupted by the IT system will help determine the optimal balance.

Patient satisfaction – An indirect measure of IT system success. Measures include a happier patient journey, greater efficiency in the care process (e.g. less waiting time for patients), minimisation or elimination of adverse medical events/harm to patients, and better clinical outcomes.

Acceptance and sense of ownership of the system – The sense of ownership is often enhanced when clinical user requirements and participation are nurtured and acted upon. This often gives users a sense of control and pride (Berg M, 2001).

Determination of project success measures requires the willingness of management and project leaders to fully understand and accept professional users’ views of success measures. Project success measures/criteria based on a socio-clinical-technological synergy balancing the organisational and professional user perspectives will have a much better chance of being accepted by the majority, if not all, of the stakeholders.

Conclusion

eHealth is disruptive, but is also strategic and transformational. The benefits can be significant. But the hurdles are also very challenging.

While health information technologies may offer considerable administrative efficiency gains and cost savings, the empirical evidence available so far indicates that any clinical cost savings realised from health IT system adoption are likely to be marginal. Health IT should be used as a strategic and quality improvement tool, not directed at clinical cost cutting.

eHealth programme design and implementation should be viewed as socio-cultural-technical alignment exercises, not as projects driven by management key performance indicators or IT-led deliverables.

Implementation strategies should be driven by frameworks that coherently incorporate top-down visions while also articulating bottom-up views that reflect user needs.

A clear vision of the capabilities and limitations of health IT, and the design of technological solutions that maximise IT capabilities and embrace user requirements, with a clear pathway to incrementally realise overall organisational and user visions in alignment with the pace of technological advances, are steps in the right direction.

Quantum computing and multi-dimensional rendition of complex clinical data, when matured out of research laboratory environments, will provide some highly promising solutions to clinical computing. The incorporation of these emerging technologies, and of cutting-edge technologies such as fighter-cockpit situation awareness, into health hardware and software applications should help resolve some of the complex computational problems that have plagued clinical computing over the past decades.

It is important to accept that while technologies transform an organisation, they are also inevitably transformed by the organisation’s culture and stakeholders. Achieving socio-cultural-technological synergy is important not only for project success, but also for organisational success.

References are available at www.asianhhm.com

-- Issue 30 --

Author Bio

Stephen Chu

Stephen Chu is a leading international health informatics expert and Adjunct Professor at Multi-Media University, Malaysia. He has extensive expertise in many clinical and informatics domains. He is co-chair of the HL7 Patient Care Workgroup, chair of the Standards Australia Pharmacy committee, and co-lead of many international informatics standards projects.
