The image of Taylor Swift supporting Donald Trump? Fake. Kamala Harris in communist garb? AI-generated. Misinformation is now easy to weaponize. That’s why the Content Authenticity Initiative—over 3,700 tech and media companies, including Adobe, TikTok, and the Associated Press—and the Coalition for Content Provenance and Authenticity, which includes Google, Microsoft, OpenAI, and the BBC, have created Content Credentials, a system of watermarks and metadata intended to verify authenticity and flag AI-generated content. Under the system, a participating company's digital camera could embed provenance metadata in an image at the moment of capture, while Adobe Photoshop could record any AI edits made afterward. Andy Parsons, the senior director of the Content Authenticity Initiative at Adobe, calls Content Credentials "a way to provide a ‘nutritional label’ for digital content."
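The core idea—bind a cryptographic fingerprint of the pixels plus an edit history to the file, signed so tampering is detectable—can be illustrated with a minimal sketch. This is not the actual C2PA format or Adobe's implementation; the HMAC key, field names, and `ExampleCamera` tool name are all hypothetical stand-ins for the certificate-based signatures the real system uses.

```python
import hashlib
import hmac
import json

# Hypothetical shared key; real Content Credentials use certificate-based signing.
SECRET_KEY = b"demo-signing-key"


def make_manifest(image_bytes: bytes, tool: str, edits: list) -> dict:
    """Build a provenance manifest: content hash, originating tool, edit history."""
    manifest = {
        "content_hash": hashlib.sha256(image_bytes).hexdigest(),
        "tool": tool,
        "edits": edits,  # e.g. ["crop", "generative_fill"]
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    manifest["signature"] = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return manifest


def verify(image_bytes: bytes, manifest: dict) -> bool:
    """Re-derive the hash and signature; any tampering breaks the check."""
    body = {k: v for k, v in manifest.items() if k != "signature"}
    if body["content_hash"] != hashlib.sha256(image_bytes).hexdigest():
        return False  # pixels no longer match the manifest
    payload = json.dumps(body, sort_keys=True).encode()
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, manifest["signature"])


image = b"\x89PNG...stand-in image bytes"
m = make_manifest(image, "ExampleCamera", ["generative_fill"])
print(verify(image, m))          # True: untouched image, intact manifest
print(verify(image + b"x", m))   # False: altered pixels fail verification
```

Because the manifest records that `generative_fill` was applied, a viewer checking the credential can see not only that the file is unaltered since signing but also that AI editing was involved—the "nutritional label" Parsons describes.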