The image of Taylor Swift supporting Donald Trump? Fake. Kamala Harris in communist garb? AI-generated. Misinformation is now easy to weaponize. That's why the Content Authenticity Initiative (over 3,700 tech and media companies, including Adobe, TikTok, and the Associated Press) and the Coalition for Content Provenance and Authenticity, which includes Google, Microsoft, OpenAI, and the BBC, have created Content Credentials, a system of watermarks and metadata intended to verify authenticity and flag AI-generated content. Under the system, a participating company's digital camera can embed metadata in an image at the moment of capture, while Adobe Photoshop can record any AI edits made afterward. Andy Parsons, senior director of the Content Authenticity Initiative at Adobe, calls Content Credentials "a way to provide a 'nutritional label' for digital content."
Learn More at Content Credentials