Technology has touched every aspect of our lives, but nowhere does it make more of a difference than in healthcare, where it literally saves lives. We have seen tremendous improvements to healthcare through technology, and now in an era where data is king, we are seeing the ethical use of data take patient care standards to the next level.
Artificial intelligence (AI) and analytics are providing clinicians and researchers with actionable insights, from early detection to end-of-life care, and are changing the way research is done and diagnoses are made. However, unlocking this data treasure trove is not a simple exercise for any healthcare organisation.
With Asia-Pacific (APAC) expected to become the global leader in IoT spending according to IDC1, healthcare is unsurprisingly becoming increasingly connected in the region. However, it is this connectivity that adds complexity to the data challenge.
Healthcare data is now growing at a rate of 48 per cent every year. According to a study by Stanford University, 2,314 exabytes of healthcare data will be produced every year by 2020. This exponential growth is driven by a multitude of sensors and devices, including mobile applications and wearables, supporting everything from diagnostics to insurance. Beyond storing this growing amount of data, the underlying data architecture must also support seamless accessibility as researchers run increasingly complex analyses on bigger sets of data.
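To put a 48 per cent annual growth rate in concrete terms, a minimal sketch of the compounding involved (the 500 TB starting volume and five-year horizon here are hypothetical illustrations, not figures from any study):

```python
# Sketch: compound annual growth of a healthcare data estate.
# The 48 per cent growth rate comes from the article; the starting
# volume (500 TB) and the five-year horizon are hypothetical.

def project_growth(start_tb: float, annual_rate: float, years: int) -> list:
    """Return the projected data volume (TB) at the end of each year."""
    volumes = []
    current = start_tb
    for _ in range(years):
        current *= 1 + annual_rate
        volumes.append(round(current, 1))
    return volumes

# 500 TB compounding at 48% a year more than septuples in five years.
print(project_growth(500, 0.48, 5))
```

Even a modest estate of a few hundred terabytes crosses into multi-petabyte territory within five years at that rate, which is why capacity planning under linear assumptions tends to fall short.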
For example, radiologists in academic environments sometimes work with 20,000 to 30,000-slice CT and MR exams2 and require smooth, consistent performance with low latency for quick results. More healthcare professionals are also relying on Picture Archiving and Communication Systems (PACS) for more extensive analyses, a trend that will undoubtedly expand with the introduction of new data-intensive modalities such as digital tomosynthesis.
These modern demands of healthcare infrastructure have begun to highlight how legacy data storage and management solutions are fast becoming ill-equipped to support the operation of AI systems. In many of these solutions, data exists in silos, and cannot be accessed or easily moved around different applications. This separation puts restrictions on AI systems that need to sift through data across applications to draw insights quickly and effectively.
The data architecture challenge is further complicated as cloud adoption becomes increasingly common in the healthcare sector. Organisations now find their IT systems operating in multicloud or hybrid cloud environments, where data and workloads are scattered across various platforms. These cloud approaches have grown out of a demand to keep up with end-users’ rising expectations, particularly for always-on applications and connected systems.
While the advantages of scalability and versatility are clear, the emergence of multiple cloud systems can result in even more silos. This further complicates an already complex storage ecosystem, making it difficult for healthcare to realise the innovation benefits of cloud.
Additionally, poorly managed storage usage and sprawl hampers real-time data access and analytics, impeding accuracy and speed – two factors critical in the treatment of patients. From physicians through to senior management, healthcare professionals need fast, reliable access to vital information, so that they can make better-informed decisions.
To ensure that innovation is not restricted by the data deluge, healthcare organisations need a data-centric architecture backed by an enterprise-wide data strategy. A data-centric architecture breaks down silos and enables applications to store data on demand, share it easily, support high-performance computing and deliver continuous improvements. It also allows data and applications to move freely between multiple clouds and on-premises storage, driving application and data compatibility so that providers can benefit from real-time analytics and insights.
This new flexible and agile storage must be able to manage larger workloads without compromising on performance. All-flash storage platforms that consolidate data in a centralised data hub and enable tasks to be performed simultaneously are primed for AI. Building such an environment would unlock innumerable benefits, from the ability to increase sample sizes in clinical studies, to automating decision making and data visualisation.
Getting this right is especially important given the growing interest in AI among healthcare companies in Asia Pacific. A 2018 IDC3 report ranked the healthcare industry in the region third in AI spending, after banking and retail. Healthcare companies spent an estimated US$87.6 million on AI in 2018, with most of the investment going into diagnosis and treatment systems.
Despite the burgeoning interest in AI, many organisations remain uncertain about how best to deploy and leverage this next-generation technology. In a survey4 conducted by MIT Technology Review and commissioned by Pure Storage, 83 per cent of senior leaders agreed that AI will bring significant enhancements to processes across multiple industries. Within the Asia-Pacific and Japan region, an overwhelming 87 per cent of companies say that data is critical to delivering results for customers. Despite this, 78 per cent of businesses say they experience challenges with digesting, analysing, and interpreting the data they have. This points to an urgent need for an agile, scalable data-centric architecture to unlock the true value of AI.
For early adopters, the transition to an all-flash data-centric architecture platform is yielding benefits.
The University of California, Berkeley, which is working on genomic analysis, was unable to manage high data volumes with its legacy storage. A single sample of a person’s DNA generates about 300 GB of raw data. Considering that such projects involve 50,000 to 100,000 participants, and each participant’s DNA is sampled several times over the course of a study, a single study can mean processing petabytes of data. After moving to a data-centric architecture underneath its real-time analytics engine, Apache Spark, researchers at Berkeley can now process heavy workloads faster and generate fresh insights into personalising patient treatment based on genetic profiling.
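The back-of-envelope arithmetic behind the petabyte claim can be made explicit. The 300 GB per sample and 50,000–100,000 participant figures come from the article; the assumption of three samples per participant is a hypothetical stand-in for "sampled several times":

```python
# Rough sizing of a genomic study, using the article's figures:
# ~300 GB of raw data per DNA sample, 50,000-100,000 participants.
# The three-samples-per-participant figure is our assumption.

GB_PER_SAMPLE = 300
SAMPLES_PER_PARTICIPANT = 3  # hypothetical

def study_size_pb(participants: int) -> float:
    """Total raw data for one study, in petabytes (1 PB = 1,000,000 GB)."""
    total_gb = participants * SAMPLES_PER_PARTICIPANT * GB_PER_SAMPLE
    return total_gb / 1_000_000

print(study_size_pb(50_000))   # 45.0 PB
print(study_size_pb(100_000))  # 90.0 PB
```

Even at the low end, a single cohort implies tens of petabytes of raw data, before counting intermediate and derived datasets from the analytics pipeline.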
Another example is Paige.AI. The company is digitising enormous amounts of pathological data to train machines to detect cancer cells, with the aim of helping pathologists analyse samples more accurately. Paige.AI requires an advanced deep learning infrastructure that can quickly turn massive amounts of data into clinically validated AI applications. It achieved this over an all-flash data-centric architecture.
In Taiwan, the Linkou Chang Gung Memorial Hospital, which specialises in genomics, is leveraging Pure Storage’s AIRI, an AI-ready infrastructure that enables AI at scale. The hospital also has 29 speciality centres that treat over 10,000 local and international patients a year, and thus needs to store and access a vast amount of data. Its researchers can now quickly access large datasets, identify disease-carrying genes, and develop new treatments and preventive measures at a much faster rate. This has enabled the hospital to share key insights from its genomic research with other hospitals and local medical providers in the country, amplifying the impact of its work.
Even outside of primary care and research, other players in the healthcare ecosystem can benefit from speedier access to data. For instance, Australia-based Catholic Church Insurance provides insurance, workers’ compensation and asset management support for a large number of hospitals, churches and schools. By modernising its legacy storage system, Catholic Church Insurance now enjoys faster data processing and a high level of data reduction. This, in turn, has given the insurer the ability to deliver faster database transactions and a seamless experience on customer portals and insurance systems.
Storage systems themselves also benefit from the use of AI. Integrated healthcare provider Carilion Clinic serves over one million residents, and its IT services are used by 16,000 people internally and across partner organisations. Carilion Clinic’s switch to all-flash storage had already unlocked faster response times, but the introduction of an intuitive, AI-driven solution on its storage system5 has simplified administration and maintenance more than ever before. Traditionally, employees on the storage management team are responsible for ensuring uptime and sufficient data capacity, speeding up sluggish performance, and handling backup and disaster recovery. With the new AI solution, Carilion Clinic can now forecast storage capacity utilisation, as well as the load on storage systems and the resulting performance, and make adjustments to accommodate growth in data or performance demands.
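The capacity forecasting described above can be sketched with a simple least-squares fit over historical utilisation. Everything here is hypothetical: the monthly utilisation samples, the 90 per cent threshold, and the function itself, which stands in for what an AI-driven storage tool would do with far richer telemetry:

```python
# Sketch: estimate when a storage array will hit a utilisation
# threshold, via least-squares linear extrapolation. The monthly
# utilisation samples and 90% threshold below are hypothetical.

def months_until_threshold(history: list, threshold: float) -> float:
    """Fit utilisation = slope*t + intercept over the history
    (t in months) and return months from now until the threshold."""
    n = len(history)
    ts = list(range(n))
    t_mean = sum(ts) / n
    u_mean = sum(history) / n
    slope = sum((t - t_mean) * (u - u_mean) for t, u in zip(ts, history)) \
            / sum((t - t_mean) ** 2 for t in ts)
    intercept = u_mean - slope * t_mean
    t_cross = (threshold - intercept) / slope
    return t_cross - (n - 1)  # months beyond the latest sample

# Six months of utilisation (fraction of capacity), growing steadily:
usage = [0.52, 0.56, 0.61, 0.65, 0.70, 0.74]
print(f"~{months_until_threshold(usage, 0.90):.1f} months to 90%")
```

A real system would account for seasonality, data reduction ratios and workload mix, but even this linear view shows how forecasting turns capacity management from reactive firefighting into planned procurement.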
These cases highlight the potential of AI to solve some of healthcare’s most pressing challenges, improving critical patient outcomes and the customer experience for millions. The wealth of data available represents a boon, not a bane, for clinicians and researchers, who can now study and analyse it with greater speed and ease thanks to machine learning and analytics.
In order to unlock the true value of data, healthcare organisations will need to build a data-centric architecture that meets the needs of tomorrow’s computing potential. Only by adopting a long-term approach to infrastructure and innovation can the healthcare sector reap the benefits of the AI revolution and take healthcare to the next level.
3 https://www.idc.com/getdoc.jsp?containerId=prAP43696818