
NEW YORK - At the HIMSS AI Forum on Friday, during the talk "The future of people-centered care: earning (or losing) trust by leveraging AI and synthetic research," Dr. Adrienne Boissy, chief medical officer at Qualtrics, discussed people-centered care, the importance of trust in healthcare and how AI can help clinicians earn patients' trust.
"We are not living in an era of trust; we are living in an era of mistrust and distrust, and there is evidence to support that. If any of you follow the Edelman Trust Barometer, as well as these studies, which are very concerning, people used to trust us about 70% of the time. Now, it's 40% of the time in healthcare, and they don't trust us to use AI responsibly," Boissy explained.
Boissy noted that there are many different models of trust.
"I like very specific definitions, and this is one of the most beautiful definitions I have found: Trust is choosing to make something important to you vulnerable to the actions of someone else," she said.
That definition relates to AI, according to Boissy, because there is vulnerability in putting something important in the hands of someone or something else. Trust is also about who you are as an organization, she added.
Boissy challenged the idea that AI can be trusted as much as humans.
"You cannot look AI in the eye. You cannot talk to it. AI cannot read your facial expression. AI cannot interpret," she said. "And this will make it incredibly difficult, but I do think there are different decisional factors in trusting AI versus humans."
Boissy also pointed to research being done on how people decide to trust AI.
"Does it align with my values and integrity? Do I buy it? Is the resonance okay? If that is true, then I decide that the process of the AI model is trustworthy. [That] does not mean I made the decision to trust," Boissy stated.
Only when one begins to see outcomes associated with that model, or it consistently does what it said it would do, will it earn trust, she said.
AI is going to advance people-centered care through access and revenue cycle, conversational means, analytics and dynamics, personalization and predictive analysis, Boissy added. "This is through my lens; this is from an experienced enthusiast lens: performance improvement and coaching. And then lastly, clinical and marketing transformation."
"When you look at the appetite of clinicians to embrace AI … we are clearly on a slope. We are fine with it for back-office functions; we are not fine with it for intimate patient activities."
Boissy noted that AI addresses a major pain point for clinicians around documentation: constant typing during a visit prevents the clinician from engaging with the patient.
When it comes to trust, integrity is essential, she asserted.
"When we mess up, at a bare minimum, we should work to recover it and apologize in an authentic manner and fix the problem in the long term," Boissy said.