AI Can’t Fix All of Healthcare’s Problems
By: Dr. Tim Wetherill, Chief Medical Officer, Machinify
AI in healthcare has been heralded as a panacea for many of the industry’s biggest problems. The last decade has seen AI-based solutions designed to help with everything from taking notes and managing data to monitoring patients, along with a bevy of other use cases, and it’s true that AI presents exciting possibilities. However, before we declare the technology a cure-all, it’s important to step back and ask whether AI is really prepared to deal with the complexity of healthcare in 2025.
The reality of AI in healthcare, at least right now, is that it does a lot of important things well, from aiding drug development to processing medical images. But just because AI can help with certain healthcare tasks does not make it the golden ticket that many in the industry have made it out to be. In fact, there are reasons to believe AI might struggle to take the next step in solving healthcare’s problems. Let’s take a look at some of them.
Bad Data
The general idea behind how AI learns is relatively straightforward: it ingests data and uses that data to inform its outputs. The trouble in healthcare is not only a dearth of good data but an abundance of bad data. The average medical record, for instance, is riddled with mistakes, contradictions, and convoluted language, and a single hospital stay can produce a record thousands of pages long. The information that’s actually needed becomes nearly impossible to find.
“Garbage in, garbage out” is a phrase often applied to AI models, and I’d struggle to find a setting where it’s more apt than healthcare – especially the “garbage in” part. Feeding these records, full of bad and irrelevant data, to an AI model will do more harm than good. The model may produce an output, but the odds that the output is accurate and grounded in useful context are slim to none.
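To make the point concrete, here is a minimal sketch, using purely synthetic data and scikit-learn rather than any real clinical dataset, of how label noise alone degrades a model: the same classifier is trained on progressively more corrupted labels and loses test accuracy accordingly.

```python
# Toy illustration of "garbage in, garbage out": one model, trained on
# clean labels and then on corrupted labels, loses accuracy in rough
# proportion to the noise it ingests. Synthetic data only; no real
# medical records are involved.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for structured clinical features and diagnoses.
X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for noise_rate in (0.0, 0.2, 0.4):
    # Flip a fraction of training labels to mimic error-riddled records.
    y_noisy = y_train.copy()
    flip = rng.random(len(y_noisy)) < noise_rate
    y_noisy[flip] = 1 - y_noisy[flip]

    model = LogisticRegression(max_iter=1000).fit(X_train, y_noisy)
    print(f"label noise {noise_rate:.0%}: "
          f"test accuracy {model.score(X_test, y_test):.3f}")
```

The exact numbers will vary, but the direction is reliable: corrupt the inputs and the outputs follow.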
Chasing Profits
Healthcare is an essential service, but that by no means makes it impervious to the same profit motives as any big business. Most healthcare organizations have to stay in the black to keep operating, which means profit can take precedence over the patient. You might think that AI, theoretically devoid of human ambition, would help neutralize the tendency to squeeze every dollar out of the system, but in fact the opposite is often true. AI learns what humans train it to learn, so if humans want AI to become a profit-maximizing machine (and, spoiler alert, many do), that’s what it will try to become.
Just as AI can help doctors find the most effective treatment for a patient, it can just as easily steer doctors toward more expensive, if unnecessary, diagnoses that come with higher payouts. Even if AI is designed and implemented with the best intentions, there will always be bad actors who try to game the system.
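To illustrate, here is a deliberately simplistic sketch, with invented treatment names and numbers, showing how the choice of objective alone flips a recommendation: rank the same candidates by expected patient benefit and one option wins; rank them by reimbursement and another does.

```python
# Toy sketch of how the objective shapes behavior: the same candidate
# treatments, ranked under two different objectives. All names and
# numbers are invented for illustration.
from dataclasses import dataclass

@dataclass
class Treatment:
    name: str
    expected_benefit: float  # hypothetical patient-outcome score (0-1)
    reimbursement: float     # hypothetical payout in dollars

candidates = [
    Treatment("conservative_therapy", 0.85, 400),
    Treatment("advanced_imaging_plus_procedure", 0.80, 9500),
    Treatment("watchful_waiting", 0.75, 100),
]

# Objective A: maximize expected patient benefit.
best_for_patient = max(candidates, key=lambda t: t.expected_benefit)

# Objective B: maximize reimbursement -- the "profit-maximizing machine".
best_for_revenue = max(candidates, key=lambda t: t.reimbursement)

print("patient-outcome objective picks:", best_for_patient.name)
print("revenue objective picks:", best_for_revenue.name)
```

Nothing about the model changed between the two rankings; only the objective did. That is the whole risk in miniature.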
Dehumanizing the Patient
Test results and statistics are an important part of practicing medicine, but ultimately, medicine is about treating people, and people are often more complex than what can be captured in a chart. While AI can be very good at interpreting neatly organized data that follows certain patterns, patients rarely provide this kind of information. They might misremember something, detail symptoms that may have nothing to do with their actual ailment, or outright contradict themselves.
AI doesn’t do nearly as well with these kinds of messy inputs, which can’t be sorted into neat buckets to yield a coherent story. That’s a skill that, at least for now, is best left to humans.
Patients also give all sorts of physical cues that today’s AI models may not pick up on. Subtle body language can give doctors big hints, sometimes even more valuable than medical imaging. It’s possible that AI will evolve to read these cues, but for now it misses them, and that miss can mean the difference between life and death.
Putting AI in Perspective
AI has its flaws, but neither should it be blindly avoided at all costs. Rather, it needs to be approached with the right mindset. AI is a tool that, deployed in the right circumstances, can pay dividends for the healthcare industry. But a lot goes into making sure the circumstances are right, including respecting messy data, putting patients before profits, and recognizing that in some situations AI will never replicate the work of a human clinician.
For now, AI is a worthy co-pilot in the right environment. Healthcare leaders should make the effort to understand where it belongs and where it doesn’t.