The image of Taylor Swift supporting Donald Trump? Fake. Kamala Harris in communist garb? AI-generated. Misinformation is now easy to weaponize. That’s why the Content Authenticity Initiative—over 3,700 tech and media companies, including Adobe, TikTok, and the Associated Press—and the Coalition for Content Provenance and Authenticity, which includes Google, Microsoft, OpenAI, and the BBC, have created Content Credentials, a system of watermarks and metadata intended to verify authenticity and flag AI-generated content. Under the system, a participating company's digital camera could embed provenance metadata in an image at capture, while Adobe Photoshop could track any AI edits made afterward. Andy Parsons, the senior director of the Content Authenticity Initiative at Adobe, calls Content Credentials "a way to provide a ‘nutritional label’ for digital content."
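The real system relies on cryptographically signed C2PA manifests; the toy sketch below only illustrates the core idea the paragraph describes — binding a content hash and an edit log to a file so later tampering is detectable. All names here (`make_credential`, `ExampleCam DSLR`, `ai.generative_fill`) are hypothetical, not part of the actual specification:

```python
import hashlib

def make_credential(content: bytes, creator: str) -> dict:
    """Create a provenance record binding a content hash to its origin."""
    return {
        "creator": creator,
        "content_hash": hashlib.sha256(content).hexdigest(),
        "edits": [],  # each edit appends a tool name and a new hash
    }

def record_edit(credential: dict, new_content: bytes, tool: str) -> None:
    """Append an edit entry, e.g. an AI-assisted change made in an editor."""
    credential["edits"].append({
        "tool": tool,
        "content_hash": hashlib.sha256(new_content).hexdigest(),
    })

def verify(credential: dict, content: bytes) -> bool:
    """Check that the content matches the most recent recorded hash."""
    latest = (credential["edits"][-1]["content_hash"]
              if credential["edits"] else credential["content_hash"])
    return hashlib.sha256(content).hexdigest() == latest

# A camera records provenance at capture; an editor logs an AI edit.
original = b"raw camera image bytes"
cred = make_credential(original, creator="ExampleCam DSLR")
edited = b"image after generative fill"
record_edit(cred, edited, tool="ai.generative_fill")
print(verify(cred, edited))    # True: content matches the edit log
print(verify(cred, original))  # False: record shows the file was changed
```

Production Content Credentials additionally sign each manifest with a certificate chain, so the record itself cannot be forged — the hash-only sketch above omits that step for brevity.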