NewsGuard, the American agency that monitors news outlets and publishes periodic reports on their reliability, has launched an AI-Generated Information Monitoring Centre. Its aim is to keep an eye on unreliable AI-generated news sites, of which at least 150 have been identified to date, and to produce reports, analyses and fact-checks on misinformation and false narratives involving AI.
The number of these sites has more than tripled in the past month. NewsGuard is investigating in particular how Russian and Chinese state media have used AI-generated texts to spread false claims, presenting them as accurate: ‘The China Daily,’ reads a NewsGuard note, ‘cited ChatGPT as a source supporting the false report that the United States operates a bio-lab in Kazakhstan that is conducting research into the transmission of viruses from camels to humans to develop a biological weapon to be used against China.’ In April, another AI-generated site broke the false news of the death of the current US President, Joe Biden.
AI has increased an existing problem
The monitoring centre also provides a constantly updated count of the sites that NewsGuard analysts have labelled Uain (the acronym stands for Unreliable AI-Generated News): unreliable news sites created, in part or in full, by artificial intelligence and operating with little or no human supervision. As mentioned, 150 Uain sites have been identified so far, and analysts will continue to update the Monitoring Centre as new sites and new AI-generated false narratives emerge.
In May, when NewsGuard began monitoring Uain sites, there were 49 of them. ‘One of the reasons we created NewsGuard five years ago was the ease and low cost of producing content farms,’ said Steven Brill, co-CEO of NewsGuard. ‘Today’s AI-generated news sites are similar to the Macedonian content farms that spread disinformation a few years ago, with the difference being that production costs are even lower, and these new sources can become even more prolific due to advances in artificial intelligence.’
Business models for profit
Gordon Crovitz, NewsGuard’s other co-CEO, recalled that ‘most of these sites were created on the business model of generating revenue from programmatic advertising, which seems to work well. Brands have no intention of placing ads on sites that are unsafe for their reputation, yet the lack of transparency in the programmatic advertising system means that their advertisements, served by advertising technology companies like Google, appear on these sites anyway’.
Against this backdrop, estimates suggest that generative AI will further swell the $2.6 billion in advertising revenue that flows each year to sites publishing disinformation. These sites, reads the NewsGuard report, usually have generic names, such as Top News Press and World Today News, and often publish a steady stream of articles on various topics, including politics, technology, entertainment and travel. What they omit is that they operate with little or no human supervision and publish articles written mainly or entirely by bots, rather than offering traditionally reported and edited journalism. The published articles generally feature the bland, repetitive language characteristic of texts produced by generative AI models. According to NewsGuard’s analysis, the content sometimes contains false information on various topics, such as references to events that never happened or fake medical treatments. Just enough to attract easy clicks and money.