Google's AI search tells people to glue pizza and eat rocks

Google's new AI search tool is being criticized for giving weird and wrong answers.

When some people searched for how to make cheese stick to pizza better, the tool suggested adding "non-toxic glue" to the sauce.

The AI also said that geologists tell people to eat one rock every day.

Google said these were just a few unusual cases.

Some answers appeared to be based on Reddit comments or articles from the satirical site The Onion.

A lot of people made fun of these answers on social media.

But Google said the tool is working well most of the time.

They said, "The strange answers are from very uncommon searches and don't happen for most people."

"Most AI overviews give good information and links to learn more," they added.

Google said they fixed problems as they found them and are using these examples to make the AI better.

This isn't the first time Google has had issues with its AI products.

In February, they had to pause Gemini's image generator after people criticized its historically inaccurate, "woke" pictures.

Bard, the earlier version of Gemini, also got off to a bad start: in its first demo, it gave a wrong answer about the James Webb Space Telescope.


Questions

According to the article, what kind of answers has Google's new AI search feature been providing?

What did the AI tool suggest when people searched for ways to make cheese stick to pizza better?

Where did some of the AI's answers seem to come from?

How did Google respond to the criticism of its AI search feature?

What happened to Google's chatbot Gemini in February, and why?