Tech
Google restricts AI tool after bot’s ‘nonsensical answers’ to users’ queries
Google has been forced to restrict its artificial intelligence search function after scores of users reported “odd, inaccurate or unhelpful” responses to their search queries on Thursday. Some of the “nonsensical” answers included recommending people “eat rocks” or put “glue on pizza” and these answers went viral on social media.
This is not the first controversy surrounding Google’s AI functions. Earlier this year, the AI bot produced historically inaccurate search results. AI Overviews, the Google AI tool, was subsequently scaled back, according to Google’s head of search, Liz Reid.
Rolled out to users in the United States two weeks ago, Overviews will now have additional guards added to prevent such incidents from happening again. Designed to improve search, the tool uses generative AI to summarize search queries, reports Forbes.
These guards include curtailing the bot’s use of user-generated content from social media sites and forums. Using these outlets to generate responses can “offer misleading advice,” according to Reid.
While Reid hailed the tool’s value to Google, she did acknowledge some of its high-profile blunders. These include results telling people that Barack Obama was Muslim, advising people to eat rocks, and suggesting they put glue on their pizzas.
The tool will also pause offering summaries on certain topics, such as health. Summaries for “nonsensical,” humorous, and satirical queries, which are asked in order to elicit silly responses, will also be curtailed.
Overall, Reid said that Google has “made more than a dozen technical improvements.” These include “added triggering restrictions for queries where AI Overviews were not proving to be as helpful.” Even with the high-profile mistakes, Reid says there is “higher satisfaction” among users, including those asking “longer, more complex questions that they know Google can now help with.”
The AI rollout pushes the links associated with Google searches further down the page, with AI-generated responses taking prime position. However, these automatic responses have stirred controversy with ridiculous answers to queries, including telling people with kidney stones to drink liters of urine to help pass them, or telling people to eat rocks.
However, a number of fake responses were also bandied about online and were debunked by fact-checkers, with Reid urging “anyone encountering these screenshots to do a search themselves to check.” Some odd searches fall into a “data void” or “information gap,” areas where there is little verifiable information and, therefore, ridiculous answers can filter through.