Google has disabled AI Overviews for some medical queries
Google began disabling AI Overviews for select medical search queries after an investigation was published in which experts called the system’s responses “dangerous” and “alarming.” The cases in question were instances where the AI-generated summaries provided misleading or outright incorrect medical information.
The trigger was an investigation published in early January that cited specific examples of incorrect advice. In one, the system advised people with pancreatic cancer to avoid fatty foods. Experts noted that this recommendation is the direct opposite of clinical practice and can increase patients’ risk of death. In another case, AI Overviews gave false information about key liver function tests, which could lead people with serious illnesses to mistakenly believe they were healthy.
Which queries no longer show AI Overviews
As of the morning of January 11, AI summaries were completely disabled for a number of medical queries, such as “what is the normal range of liver blood test values.” Instead of automatically generated summaries, users now see standard search results.
The company did not comment on the removal of specific AI Overviews. However, a Google spokesperson said the company invests significant resources in developing and quality-controlling AI Overviews, especially on health topics. According to the spokesperson, an internal team of clinicians analyzed the examples relayed by journalists and in some cases concluded that the information was not inaccurate and relied on reputable sources. In situations where the system lacked context, Google is making general improvements and applying internal moderation rules.
Another scandal surrounding AI Overviews
This isn’t the first time AI Overviews has found itself at the center of criticism. The feature has previously given absurd and potentially dangerous advice, including recommendations to put glue in food or consume inedible substances. It has also been the subject of several lawsuits.
The medical query situation underscores the difficulty of applying generative AI in sensitive areas where errors can have direct health consequences. The temporary disabling of AI Overviews for some queries shows that Google is being forced to reconsider the boundaries of the technology’s use, despite its aggressive push into search.