The official explanation for this anonymity is buried in the site's How It Works subtitle:
Our editorial team identifies timely topics and invites contributors with relevant, first-hand experience to share their perspectives through structured conversations. Those viewpoints are synthesized and edited into stories that show where contributors align, where they differ, and what it all means – providing depth, balance, and clarity beyond the headline.
But when journalist Tyler Johnston ran the site's content through Pangram, an AI detection tool that claims a 99.98% accuracy rating, the results showed just how heavily the site leaned on AI: “Of the 94 articles, 69% were marked as completely AI-generated, with another 28% marked as partially AI-generated. Only three articles were classified as human-authored.”
Johnston's skepticism grew when he noticed the site published material that both championed AI development and dismissed AI's critics. One piece warns of “The rise of anti-AI fundamentalism,” while another asks, “Will Republicans let blue states set America’s AI rules?”
The deeper Johnston dug, the clearer the picture became. As a new site with very little social media presence, The Wire's articles are rarely shared, but Johnston found that half of the site's engagement on X came from Patrick Hines, president of the PR firm Novus Public Affairs. A quick look at the firm's client list shows it works on behalf of Targeted Victory, the consulting firm at the center of OpenAI's lobbying for its regulatory interests in Washington.
Generative AI has already created cracks in our collective perception of reality. With enough computing power, you can make a fake trailer for films that were never made and never will be, steal a politician's voice for a deepfake, or even invent an absurd, unbelievable scenario, like a shark attacking a plane, and fool at least a few credulous internet users.
If Johnston's reporting is correct and his estimates are accurate, we may have an example of an AI firm deliberately passing off its own work as “independent journalism” in order to lobby on its own behalf (something Johnston notes would violate OpenAI's own usage policies).
Disclosure: Mashable’s parent company Ziff Davis filed a lawsuit against OpenAI in April, alleging it infringed Ziff Davis copyrights in the training and operation of its AI systems.
Topics: Artificial Intelligence