There are few (if any) industries that can be transformed by artificial intelligence to the degree that healthcare can. As initiatives around the world seek to digitize healthcare data, there are huge opportunities for game-changing tools and platforms. However, with potentially a yottabyte (10²⁴ bytes) of healthcare data in the United States alone, we are no closer to this utopian, data-driven world of healthcare. Existing data is not standardized, is highly fragmented, and is stored in incompatible legacy platforms. The technology exists, so why is healthcare so far behind other sectors in using it, and what needs to be done to catch up?
The problems lie in two very different but equally challenging areas. The first is privacy: patient data is an incredibly sensitive asset in most developed countries, and a huge range of policies dictates how it must be stored, shared, and secured. The second is mission-critical systems: much as mainframe technology from the 1970s still processes many of the world's banking transactions, legacy electronic medical records and hospital management systems still run on applications with no APIs and brittle data frameworks.
Solving these challenges will not be easy. But consider the opportunity for companies like Google's DeepMind to dramatically lower the operational costs of healthcare services, and the opportunity to mine huge datasets for patterns that advance disease prevention and treatment, and it becomes clear that these problems are worth solving.
When governments and healthcare providers think about how to open the gates for AI to transform their industry, they must first consider how to shift data from these monolithic systems to cloud-based, big-data systems that allow for interoperability and secure data exchange while maintaining patient privacy. These transformations won't be easy, but they will be worth it.