AI can transform healthcare at every touchpoint, but clinicians and researchers need to first address their data infrastructure to meet computing needs in research and treatment.
by Chua Hock Leng
Asia Pacific’s (APAC) healthcare sector is flourishing as the region undergoes a data and innovation boom. Efforts to digitize patients’ health records1 are well underway, even in emerging markets. In a bid to speed up the development of cutting-edge medical products, countries like Singapore are also ramping up partnerships between researchers and the private sector.2
All of this paves the way for greater AI adoption within the sector. A report by International Data Corporation (IDC) ranks APAC’s healthcare industry third in AI spending,3 after banking and retail. In 2018, healthcare organizations spent an estimated US$87.6 million on AI, with most of the investment going into diagnosis and treatment systems. Used effectively, AI can deliver improved outcomes in areas such as disease prevention, integrated care coordination and the innovation of care teams.
The Anatomy of AI in Healthcare
Data and AI share a symbiotic relationship – where one thrives, so does the other. AI can process enormous volumes of data, and the more data an AI system sorts through, the more finely its algorithms can be tuned and the more accurate it becomes.
The healthcare sector has no lack of data, providing the perfect environment for the use of AI. A study by Stanford University4 estimated that by 2020, 2,314 exabytes of healthcare data would be produced every year. APAC is expected to become the biggest market for the Internet of Things, accounting for 48 per cent of global spending by 2023.5 To accelerate 5G adoption, Singapore has also set aside S$40 million6 for research and innovation into 5G use cases. These trends will exponentially drive up both the amount of data generated and the speed at which it moves from one point to another.
Already, radiologists in academic environments work with 20,000 to 30,000-slice CT and MR exams.7 More healthcare professionals are also relying on Picture Archiving and Communication Systems (PACS) for more extensive analyses, a demand that will grow with the introduction of new data-intensive modalities such as digital tomosynthesis. Any delay or outage in these imaging systems could be disastrous, especially for healthcare professionals assessing emergency cases in trauma centres.
Healthcare organisations are struggling with the data-intensive workloads required for AI. Overwhelmed by the sheer amount of data, legacy data infrastructure suffers high latencies that slow down AI systems. Without addressing this, healthcare organisations cannot innovate quickly or derive insights in time to treat time-sensitive cases.
Laying the Data Groundwork
In a survey8 conducted by MIT Technology Review and commissioned by Pure Storage, 83 per cent of senior leaders agreed that AI will bring significant enhancements to processes across multiple industries. Despite this, many organisations still struggle to deploy and leverage AI. Within Asia Pacific and Japan (APJ), 78 per cent of businesses say they experience challenges in digesting, analysing and interpreting the data they have.
This indicates an urgent need for an agile, scalable data infrastructure to unlock the true value of AI. Such an infrastructure should break down silos, enable applications to store and move data on demand between multiple clouds and on-premises storage, support high-performance computing and offer continuous improvement.
To ensure that innovation is not restricted by the data deluge, the storage layer within this data infrastructure needs to scale linearly and non-disruptively. The system must also operate in real time, be available on demand and be self-managing. This is all the more vital in critical conditions, where researchers and physicians must devote their full attention to the work at hand in a time-sensitive, high-risk environment.
Early Successes Show Promise
For early adopters, the transition to an all-flash data infrastructure is yielding benefits.
The University of California, Berkeley, which works on genomic analysis, was unable to manage high data volumes with its legacy storage. Sequencing a single person’s DNA produces about 300 GB of raw data, which means a project involving 50,000 to 100,000 participants would require petabytes of data to be processed in a single study. After moving to a data-centric architecture underpinning its real-time analytics engine, Apache Spark, researchers at Berkeley can now process heavy workloads faster and generate fresh insights into personalizing patient treatment based on genetic profiling. Not only does this optimize medication and improve post-operative care, it also lowers patient costs.
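The scale involved can be sanity-checked with back-of-the-envelope arithmetic, using the ~300 GB-per-sample figure cited above (a rough sketch; real studies also carry overheads for intermediate and derived data not counted here):

```python
# Rough estimate of raw storage for a genomic study,
# based on the ~300 GB-per-sample figure cited above.
GB_PER_SAMPLE = 300

def study_size_pb(participants: int) -> float:
    """Total raw data in petabytes (decimal units: 1 PB = 1,000,000 GB)."""
    return participants * GB_PER_SAMPLE / 1_000_000

print(study_size_pb(50_000))   # -> 15.0 PB
print(study_size_pb(100_000))  # -> 30.0 PB
```

At 50,000 to 100,000 participants the raw data alone runs to 15–30 petabytes, which is why legacy storage struggles with such studies.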
In Taiwan, the Linkou Chang Gung Memorial Hospital, which specializes in genomics, is leveraging Pure Storage’s AIRI, an AI-ready infrastructure that enables AI-at-scale. The hospital also has 29 specialty centers that treat over 10,000 local and international patients a year, and thus needs to store and access a vast amount of data. Its researchers can now quickly access large datasets, identify disease-carrying genes, and develop new treatments and preventive measures at a much faster rate. This has enabled the hospital to spread key insights from its genomic research to other hospitals and local medical providers in the country – amplifying the impact of the work that they do.
Even outside of primary care and research, other players in the healthcare ecosystem can benefit from speedier access to data. Australia-based Catholic Church Insurance provides insurance, workers’ compensation and asset management support for many hospitals, churches and schools. By modernizing its legacy storage system, Catholic Church Insurance now enjoys faster data processing and a high level of data reduction. This, in turn, has given the insurer the ability to deliver faster database transactions and a seamless experience on customer portals and insurance systems.
Getting to the Next Medical ‘Eureka!’
From research and diagnosis to end-of-life care, AI has the potential to transform healthcare at every touchpoint. The wealth of data available represents a boon, not a bane, for clinicians and researchers, who will be able to study and analyze it with greater speed and ease, thanks to machine learning and analytics.
To unlock the true value of data, healthcare organizations will need to build a data infrastructure that meets tomorrow’s computing needs. Only by adopting a long-term approach to infrastructure and innovation can the healthcare sector reap the benefits of the AI revolution and take healthcare to the next level. [APBN]
About the Author
Chua Hock Leng, Managing Director, Singapore, Pure Storage