Who is liable when things go wrong? - AI or Doctors

A new peer-reviewed brief was published on Friday in JAMA Health Forum. It discusses the legal risks doctors could face in this new era of AI in healthcare.

Here is an excerpt:
“Emerging research highlights the disproportionate moral responsibility placed on physicians in assistive AI decision-making. A vignette study found that laypeople assign greater moral blame to physicians when advised by AI than by human colleagues, as human operators are seen as having control over the technology. Consequently, physicians are often viewed as the most liable party in adverse outcomes, more so than AI vendors or healthcare organizations. This perceived liability creates a significant burden for physicians in deciding how to incorporate AI inputs.”

Now the question is - Who is liable for what when things go wrong?

Scenario 1
The AI was right, the physician was wrong, and the physician did not follow the AI's recommendation.

Scenario 2
The physician was right, the AI was wrong, and the physician followed the AI's recommendation.

This seemingly creates a "no-win" scenario for clinicians when it comes to the use of AI in healthcare.

Without proper legal protections and support structures, healthcare workers might get caught in legal battles.

What are your thoughts on this?

  1. Do you think legal issues could arise from scenarios like 1 and 2 in the future?
  2. When the time comes, should AI companies also be held responsible for their product's errors, or does it all fall on the physician involved?

Doctors will face legal challenges in either scenario, one or two.
AI companies should be made liable, but that mindset will not take hold easily. However, if only AI is involved and no doctor, then the patient is more likely to put the blame on the AI too. Example: a person consults an AI without the intervention of a doctor.

AI is evolving fast, and we do not know what life will be like in the coming years, so we have to wait. It is possible that AI will surpass human intelligence, and might even take over.


Yes, legal issues can arise in both scenarios. Without clear guidelines, physicians may face liability whether they follow AI recommendations or not.

AI companies should share responsibility for errors, especially if their models influence clinical decisions. A balanced legal framework is essential to protect both healthcare providers and patients.


Actually, if I think from the patient's perspective, neither scenario is acceptable; both raise legal issues. We learn to practice well only after many years of training and exposure. The human body behaves differently even in the same disease, and even in surgery there are many anatomical variations, all of them normal. A physician or surgeon acts instantly with knowledge, experience, and efficiency. That level of skill comes from many years of practice. Although they can make mistakes, those mistakes should not be life-threatening.

On the other hand, AI is programmed. It analyses and correlates data and gives an answer, so individualisation is difficult for AI. It cannot replace the expert's position; rather, it will help sort out common diagnoses. But it should be kept in mind that its output should always be checked by a human.
So, no patient should be handled by AI alone. AI can reduce workload but cannot replace the doctor. Only appropriate handling can keep legal issues away.


No patient should be handled by AI alone - that sums up everything.


In my view, if a patient comes in with AI-derived knowledge, it is the doctor's duty to help them understand the disease. Patients do not know exact medical terms, and neither Google nor AI can give them what a doctor learns over the course of training. Neither party is solely to blame, yet both doctor and patient share responsibility: doctors do not have enough time due to workload, while patients want instant results and explanations. It needs to be balanced.


It is always a good idea to explain further when patients come in with questions or their own narrative of the disease.
