If you’re anything like me, over the past few years you may have developed the masochistic habit of WebMD-ing yourself within an inch of your life. It was an especially easy habit to pick up at the pandemic’s height, when we were all left with nothing but our own thoughts, a deadly global health crisis, and conflicting COVID information to occupy our time. And if you’re even more like me, before you set foot in the hospital waiting room or call your health insurance provider, you likely spend time researching symptoms, medication side effects, and surgical procedures.
The Patient Experience Is The Most Important Customer Experience A Person Can Have
Many patients view this act not just as one borne of stress or curiosity, but as a personal security measure. “Knowledge is power,” as the adage goes. Those looking to maintain, improve, or make substantial changes to their health want to be as informed as possible. Patients want to become their own best advocates. Research allows them to succinctly convey their concerns to a healthcare professional, nurse practitioner, or doctor without feeling they’re being needlessly tested, ignored, or prescribed treatments that don’t address their health concerns.
Despite differences in vocabulary, the customer experience and the patient experience are very much aligned: personalization, transparency, and empathy are essential to receiving quality care. The most important distinction to keep in mind is that while a retail customer can buy products at a discount or return, refund, or exchange purchases, a patient’s life doesn’t have a price point. The invaluable nature of the patient experience is something doctors and the infirm ought to agree on, but such an ethos is not so cut-and-dried.
RELATED SPECIAL REPORT: Patient Experience Trends In A Hypercompetitive Healthcare Landscape
Mismatched care in the medical field is a top concern among these “customers”: patients who interact with companies in search of life-saving care for themselves or a loved one often worry that their agency as people will be cast aside by the business behemoth that is modern medicine. On the flip side, focus on the patient experience is growing rapidly alongside not just healthcare technologies, but the information technology meant to help healthcare workers provide quality care. In the age of the digitized customer experience, omnichannel service, and artificial intelligence, the patient experience has many options to weigh in shaping what its future looks like.
Is Generative AI A Medical Miracle? Not Quite, But It Can Help Put Patients In Control Of Their Health, Spine Surgeon Says
In recent months, use of generative text AI such as ChatGPT has increased across industries, and its role in medical technology has grown in turn: according to the World Economic Forum, when asked about AI-driven robots that can perform parts of surgery, 56% of those familiar with the technology say it’s a major advance for medical care. Dr. Rahul Shah, a board-certified orthopedic spine and neck surgeon, tells CCW Digital that public opinion on medically focused generative AI tools doesn’t speak to the nuances of advanced technology in the operating room, or outside of it. “Everybody wants the same quality of healthcare,” he explains. “Everybody wants individualized care, everybody wants individualized attention. But the healthcare system is being more and more squeezed, such that individualized attention is not possible as much as it may have once been.”
He lists a leaner workforce, cuts to Medicare, and less frequent insurance reimbursements among the key obstacles the healthcare industry faces today. And for patients who want to make sense of things, deliberate, and think through the different care options available to them, “it’s become more of a challenge to serve the needs that patients have,” Dr. Shah notes. Those needs include medicinal treatments, surgeries, patient history overviews, and more. While the public may view AI as a futuristic solution to staff shortages at the operating table and beyond, Dr. Shah sees advanced technology not as the answer to these issues, but as something that may, over time, help streamline the process that leads to palatable healthcare solutions.
REGISTER NOW: CCW Online 2023 | Generative AI and Chatbots for Customer Contact
“It can help people summarize what they need to,” he says, “and make sure that they’re talking in a relevant manner so that people don’t feel as though they’re not in control, or that they’re going off on rabbit trails that they don’t necessarily need to be going off on.”
How Oversimplifying Tech’s Role In PX Can Lead To A Lack Of Oversight
Generative AI has the potential to help patients have more informed conversations and make the medical community’s knowledge base more succinct and even conversational for the average patient. Even so, “I think that the risks are when there is no oversight,” says Dr. Shah.
“Judgment is still something that is a uniquely human tendency. And the reason is because not everything is based on a grid or on a binary... judgment is part of what makes it [generative AI] relevant and what makes it unique, what makes it important and valuable.”
When it comes to analyzing CAT scans and MRIs or understanding the gravity of patient symptoms, Dr. Shah says that reviewing such information may yield a short list of possible conditions, known as a differential diagnosis. “It is really the best guess given the available data, and so AI can help with making guesses,” he tells CCW Digital. Judgment, he adds, is key to good decision making when using generative AI in healthcare, not just now but going forward.
Patients often worry that they may be missing something important about their health, or that a doctor might not truly understand their circumstances. While utilizing generative AI as a supportive tool has the potential to better prepare patients for important, and at times life-saving, conversations, it also carries data security and privacy risks.
“What you disclose to the technology will certainly depend on how it can be linked to the rest of your patient profile,” Dr. Shah cautions. “It will certainly make it a lot easier to profile you or to know more about you… it’s a risk for providers to outsource their diagnostic tools to programs like ChatGPT.”
In terms of his own line of work, asking a generative AI tool “I am scared about spine surgery. What sort of questions should I ask?” is a lower-risk use case than relying on it to tell doctors how to perform surgery. “You’re not giving anything more than trying to get a better idea,” Dr. Shah explains. “In theory, you can even do it for a family member that’s going to see a physician, and you can have that be helpful.”
Generative AI and Spinal Surgery Concerns: A Real-Time Use Case Of Advanced Technology And PX
For the purpose of this report, CCW Digital posed Dr. Shah’s suggested prompt to OpenAI’s ChatGPT.
He finds the AI tool’s response in line with what he would hope a patient might ask him. “They’re exactly what I would want to have a conversation with when I talk to a patient,” he says. “This is exactly what the elements of my conversation are when I talk to patients trying to make sure that they understand their diagnosis and know exactly how it’s different from normal, as well as what we’re going to try and do to salvage what we can.”
‘Garbage In = Garbage Out’: Measuring The Risk And Reward Of Generative AI
In terms of privacy and metadata concerns, tools like ChatGPT draw on publicly available information, including accounts from patients who have disclosed their own experiences with health issues online. However, if the information generative AI consumes isn’t rooted in medical or scientific fact, the model’s output can be a bit off. “I think that where it can become a problem is when it can make assumptions that are just flat out incorrect. Garbage in definitely generates garbage out, and probably at a greater scale,” Dr. Shah suggests. “That initial curation of what is put in, be careful with that.”
RELATED SPECIAL REPORT: Patient Experience
Some tell-tale signs that generative AI isn’t ingesting the full scope of a given topic are ones many patients have experienced in their daily lives, he notes. “You know if you’ve ever had a conversation with somebody who’s totally not listening, they’re just totally distracted, which just gives you nonsensical information, right? It’s very difficult to make progress.” If ChatGPT’s response to a medical concern sounds like such a person, “in that scenario, you’re actually hurting yourself, not helping yourself,” Dr. Shah warns.
For that reason, Dr. Shah sees generative AI as useful for handling and synthesizing administrative tasks and information in healthcare. Nurse practitioners and employees behind computer screens can deploy advanced tech to gather information and profiles. Dr. Shah even considers what it might look like for him to utilize it in the future:
“I can use it to be able to summarize lots and lots of records… It can help me to understand or generate a plan to be able to talk to patients about their condition. It can help me empathize… These are all things that it could potentially work with, and it can potentially take areas of conflicting diagnoses and summarize them in a way that makes it easy for folks to follow.”
When it comes to ChatGPT’s place in the operating room, Dr. Shah says that generative AI’s use case is a bit more complex, and one he feels the technology isn’t quite ready for. “The stakes are much higher in the operating room... I think we’re still in the very early stages there,” he explains. “But until you put them into practice, it’s going to be difficult to know.”
All things considered, when it comes to generative AI’s place in healthcare, Dr. Shah has one thing he wants everyone, providers and patients alike, to keep in mind:
“Judgment is the most important thing in healthcare and ChatGPT doesn't have judgment. People have judgment.”