
BBC Cries Foul Over Apple Intelligence Notification Headline Summaries

Examples of iOS 18.1 notification summaries

The UK's BBC has complained to Apple about the iOS 18 notification summary feature completely misrepresenting one of the broadcaster's news stories. Here's what happened, and why.

The Apple Intelligence rollout included summarization features, intended to save users time by surfacing the key points of a document or a batch of notifications. On Friday, notification summarization became a serious problem for one major news organization.

The BBC complained to Apple about how the summarization misinterprets news headlines and draws incorrect conclusions when condensing them. A spokesperson said the broadcaster had contacted Apple “to raise this concern and fix the problem.”

In one example cited in the complaint, a notification summarizing BBC News alerts read “Luigi Mangione has shot himself,” referring to the man arrested for the murder of UnitedHealthcare CEO Brian Thompson. Mangione, who is in custody, is alive and well.

“It is essential to us that our audiences can trust any information or journalism published in our name, and that includes notifications,” a BBC spokesperson said.

Incorrect summaries are not unique to the BBC, as the New York Times has also been a victim. A report on Bluesky about a November 21 summary said “Netanyahu has been arrested,” when the story actually reported that the International Criminal Court had issued an arrest warrant for the Israeli prime minister.

Apple declined to comment on the BBC's complaint.

Hallucinating the News

Instances of incorrect summaries like these are called “hallucinations,” cases where an AI model produces answers that are not factual, even when it is given clear and unambiguous source material such as a news article.

Hallucinations can be a big problem for AI services, especially in cases where consumers expect a simple, straightforward answer to a query. This is something that companies other than Apple have to deal with, too.

For example, early versions of Google’s Bard AI, now Gemini, somehow conflated AppleInsider writer Malcolm Owen with the late singer of the same name from the band The Ruts.

Hallucinations can occur in models for a variety of reasons, such as problems with the training data or the training process itself, or learned patterns being applied incorrectly to new data. A model may also lack enough context in its prompt and source material to produce a fully correct answer, or it may make a wrong assumption about that source data.

It's not clear what exactly is causing the headline summarization issues in this case. The original reporting clearly referred to the shooting suspect, and said nothing about him shooting himself.

It's an issue that Apple CEO Tim Cook acknowledged as a potential problem when he announced Apple Intelligence. In June, he conceded that its accuracy would be “not 100%,” but said the results would still be “very high quality.”

In August, it was revealed that Apple Intelligence's prompts include instructions specifically written to combat hallucinations, including the lines “Do not hallucinate. Do not make up factual information.”
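
Apple hasn't detailed exactly where those instructions sit, but guardrail text like this is typically placed in a system prompt prepended to the content being summarized. Below is a minimal sketch of that general pattern in Python; the function name, the surrounding prompt wording, and the sample headlines are illustrative assumptions, not Apple's actual code or API.

    # Minimal sketch of how anti-hallucination instructions are typically
    # prepended to a summarization request. Hypothetical names throughout;
    # Apple has not published its actual pipeline.
    SYSTEM_PROMPT = (
        "Summarize the following notifications in one short sentence. "
        "Do not hallucinate. Do not make up factual information."
    )

    def build_summary_request(headlines):
        """Assemble a chat-style request: the guardrail instructions go in
        the system message, the raw notification text in the user message."""
        return [
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": "\n".join(headlines)},
        ]

    if __name__ == "__main__":
        # Illustrative headlines only.
        request = build_summary_request([
            "Suspect arrested in UnitedHealthcare CEO shooting",
            "Weather warning issued for northern England",
        ])
        for message in request:
            print(f"[{message['role']}] {message['content']}")

Even with instructions like these in place, a model can still mispair subjects and actions when it compresses several headlines into a single sentence, which is consistent with the BBC example above.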

It's also unclear whether Apple will be willing, or able, to do much about the hallucinations, since it has chosen not to monitor what users see on their devices. Apple Intelligence prioritizes on-device processing where possible, a privacy measure that also means Apple gets little feedback on the summaries actually being generated.
