Dan Milmo reports via The Guardian: The Guardian has accused Microsoft of damaging its journalistic reputation by publishing an AI-generated poll speculating on the cause of a woman’s death next to an article by the news publisher. Microsoft’s news aggregation service published the automated poll next to a Guardian story about the death of Lilie James, a 21-year-old water polo coach who was found dead with serious head injuries at a school in Sydney last week.
The poll, created by an AI program, asked: "What do you think is the reason behind the woman's death?" Readers were then asked to choose from three options: murder, accident or suicide. Readers reacted angrily to the poll, which was subsequently taken down, although highly critical reader comments on the deleted survey were still online as of Tuesday morning. One reader said that a Guardian reporter bylined on the adjacent story, who had nothing to do with the poll, should be sacked. Another wrote: "This has to be the most pathetic, disgusting poll I've ever seen."
The chief executive of the Guardian Media Group, Anna Bateson, outlined her concerns about the AI-generated poll in a letter to Microsoft’s president, Brad Smith. She said the incident was potentially distressing for James’s family and had caused “significant reputational damage” to the organization as well as damaging the reputation of the journalists who wrote the story. “This is clearly an inappropriate use of genAI [generative AI] by Microsoft on a potentially distressing public interest story, originally written and published by Guardian journalists,” she wrote. Bateson added that it had demonstrated “the important role that a strong copyright framework plays in enabling publishers to be able to negotiate the terms on which our journalism is used.”
A Microsoft spokesperson said: “We have deactivated Microsoft-generated polls for all news articles and we are investigating the cause of the inappropriate content. A poll should not have appeared alongside an article of this nature, and we are taking steps to help prevent this kind of error from reoccurring in the future.”