
Be careful what you search on Google: false diagnoses and how artificial intelligence has put patients' lives at risk

Answers generated by Google's artificial intelligence on blood tests contained inaccurate and false information, a Guardian investigation has found. The company has decided to remove some AI-generated summaries.

The investigation by British journalists showed that people were being put at risk by false and misleading information generated in response to their searches.

Google has removed some of its AI-generated health summaries. The company had claimed that the summaries, which use generative AI to deliver instant results with key information about a topic or question, are “useful” and “reliable”.

But some of the summaries, which appear at the top of the search results, contained false information.

“Dangerous” and “alarming”

Experts have found a case, which they described as “dangerous” and “alarming”, in which Google provided false information about crucial liver function tests that could lead people with serious liver disease to mistakenly believe they are healthy.

The Guardian found that a search containing the terms “what is the normal range for liver blood tests” produced a lot of numbers, without much context and without taking into account the patients' nationality, gender, ethnicity or age.

Experts say that what Google's AI Overviews present as normal may in fact be the opposite of what is actually considered normal.

The summaries could cause seriously ill patients to mistakenly believe they had normal test results and to miss follow-up appointments.

What Google says

After the investigation, the company removed AI Overviews for the search terms “what is the normal range for liver blood tests” and “what is the normal range for liver function tests.”

“We do not comment on individual search removals. In cases where AI Overviews misses certain contexts, we work to make general improvements and also take action in accordance with our policies where appropriate,” said a Google spokesperson.

Vanessa Hebditch, director of communications and policy at the British Liver Trust, a liver health charity, said: “This is excellent news and we are pleased to see AI Overviews removed from Google in these cases. However, if the question is phrased differently, a potentially misleading AI Overview may still be displayed and we remain concerned that other medical information generated by AI may be inaccurate and confusing.”

The Guardian found that entering slight variations of the original queries into Google, such as “lft reference range (liver function test, nr)” or “lft test reference range”, generated AI Overviews. That was a big concern, Hebditch said.

She says that understanding the results and deciding what to do next is complex and involves far more than comparing numbers on a lab report.

“Furthermore, AI Overviews does not warn that someone can get normal results on these tests when they have severe liver disease and need further medical care. This false security could be very harmful,” Hebditch pointed out.

Google is still reviewing the examples

Google, which has a 91 percent share of the global search engine market, said it was looking into the new examples provided by The Guardian.

Millions of adults around the world already struggle to access reliable health information, says Sue Farrington, president of the Patient Information Forum, which promotes evidence-based medical information for patients, the public and healthcare professionals.

AI Overviews continue to appear for other examples that The Guardian originally flagged to Google. They include summaries of information about cancer and mental health which experts have described as “completely wrong” and “very dangerous”.

When asked why these AI Overviews were also not removed, Google replied that they contain links to known and trusted sources and inform people when it is important to seek expert advice.

photo source: Dreamstime.com

Ashley Davis

I’m Ashley Davis. As an editor, I’m committed to upholding the highest standards of integrity and accuracy in every piece we publish. My work is driven by curiosity, a passion for truth, and a belief that journalism plays a crucial role in shaping public discourse. I strive to tell stories that not only inform but also inspire action and conversation.
