‘Garlic rectal insertion for immune support’: Medical chatbots confidently give disastrously misguided advice, experts say

Popular AI chatbots often fail to recognize false health claims when those claims are delivered in confident, medical-sounding language, leading to dubious advice that could be dangerous to the public, such as a recommendation that people insert garlic cloves into their buttocks, according to a January study in the journal The Lancet Digital Health. Another study, published in February in the journal Nature Medicine, found that chatbots were no better than a regular internet search.

The results add to a growing body of evidence suggesting that such chatbots are not reliable sources of health information, at least for the general public, experts told LiveScience.
