
ChatGPT search shows 76.5% error rate in attribution study

According to a study from the Tow Center for Digital Journalism at Columbia University, OpenAI’s ChatGPT search has difficulty accurately quoting news publishers.

The report found frequent misquotations and misattributions, raising concerns among publishers about brand visibility and control over their content.

Moreover, the findings challenge OpenAI’s commitment to the responsible development of AI in journalism.

ChatGPT search background

OpenAI launched ChatGPT search last month, saying it was actively collaborating with the news industry and taking into account feedback from publishers.

This contrasts with the initial rollout of ChatGPT in 2022, when publishers discovered that their content was being used to train AI models without prior notice or consent.

OpenAI now allows publishers to specify via a robots.txt file whether they want to be included in ChatGPT search results.
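As a rough illustration, a publisher that wants to appear in ChatGPT search while opting out of model training might use robots.txt directives like the following (the `OAI-SearchBot` and `GPTBot` user agents come from OpenAI’s published crawler documentation; actual behavior depends on OpenAI honoring these rules):

```
# Allow OpenAI's search crawler so pages can appear in ChatGPT search results
User-agent: OAI-SearchBot
Allow: /

# Block OpenAI's training crawler so content is not used for model training
User-agent: GPTBot
Disallow: /
```

Note that, per the Tow Center’s findings described below, these directives influence crawling but do not guarantee how (or whether) a publisher’s content is ultimately represented.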

However, the Tow Center’s findings show that publishers face the risk of misattribution and misrepresentation, regardless of their choice to participate.

Accuracy problems

The Tow Center evaluated ChatGPT search’s ability to identify the sources of quotes taken from 20 publications.

Key findings include:

  • Out of 200 queries, 153 answers were incorrect.
  • The AI rarely admitted its mistakes.
  • Phrases like “maybe” were used in only seven responses.

ChatGPT often prioritizes user satisfaction over accuracy, which can mislead readers and harm publishers’ reputations.

Additionally, the researchers found that ChatGPT search performed inconsistently when the same question was asked multiple times, likely due to the randomness built into its language model.
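That run-to-run inconsistency is a predictable consequence of temperature sampling, the standard mechanism language models use to pick each next token. The minimal sketch below is illustrative only (it is not OpenAI’s implementation): identical inputs can yield different outputs on repeated calls, and only a temperature near zero makes the choice effectively deterministic.

```python
import math
import random

def sample_with_temperature(logits, temperature=1.0, rng=random):
    """Pick one index from `logits` via temperature-scaled softmax sampling."""
    # Scale logits by temperature: lower temperature sharpens the distribution.
    scaled = [l / temperature for l in logits]
    # Softmax (shifted by the max for numerical stability).
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw one index at random according to those probabilities.
    r = rng.random()
    cumulative = 0.0
    for i, p in enumerate(probs):
        cumulative += p
        if r < cumulative:
            return i
    return len(probs) - 1

# The same input logits can produce different choices across repeated calls
# at temperature 1.0, mirroring why identical queries get varying answers.
logits = [2.0, 1.5, 0.3]
varied = {sample_with_temperature(logits, temperature=1.0) for _ in range(50)}
# Near-zero temperature collapses onto the highest-scoring option.
deterministic = sample_with_temperature(logits, temperature=0.01)
```

With 50 draws at temperature 1.0, `varied` will typically contain more than one index, whereas `deterministic` is essentially always the argmax (index 0).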

Citing copied and syndicated content

Researchers found that ChatGPT search sometimes cites copied or syndicated articles instead of original sources.

This is likely due to publisher restrictions or system limitations.

For example, when asked to quote from a New York Times article (the Times is currently suing OpenAI and blocks its crawlers), ChatGPT linked to an unauthorized copy of the article on another site.

Even in the case of MIT Technology Review, which allows OpenAI’s crawlers, the chatbot linked to a syndicated copy rather than the original.

The Tow Center found that every publisher is at risk of being misrepresented in ChatGPT search:

  • Enabling crawlers does not guarantee visibility.
  • Blocking crawlers does not prevent content from being displayed.

These issues raise concerns about OpenAI’s content filtering and approach to journalism, which could drive people away from original publishers.

OpenAI’s response

OpenAI responded to the Tow Center’s findings by saying it supports publishers through clear attribution and helps users find content through summaries, citations and links.

An OpenAI spokesperson stated:

“We support publishers and authors by helping ChatGPT’s 250 million weekly users find quality content through summaries, citations, clear links and attribution. We’ve worked with partners to improve real-time citation accuracy and accommodate publisher preferences, including ensuring they appear in search by managing OAI-SearchBot in their robots.txt file. We will continue to improve search results.”

While OpenAI is working to improve citation accuracy, the company says it has struggled to address the specific misattribution issues raised in the report.

OpenAI remains committed to improving its search product.

Looking to the future

If OpenAI wants to collaborate with the news industry, it should ensure that publisher content is accurately represented in ChatGPT search.

Publishers currently have limited leverage and are closely monitoring the legal cases against OpenAI, whose outcomes could shape content rights and give publishers more control.

As generative search products like ChatGPT change the way people interact with news, OpenAI must demonstrate a commitment to responsible journalism to earn the trust of users.


Featured image: Robert Way/Shutterstock