The image of Taylor Swift supporting Donald Trump? Fake. Kamala Harris in communist garb? AI-generated. Misinformation is now easy to weaponize. That's why the Content Authenticity Initiative—over 3,700 tech and media companies, including Adobe, TikTok, and the Associated Press—and the Coalition for Content Provenance and Authenticity, which includes Google, Microsoft, OpenAI, and the BBC, have created Content Credentials, a system of watermarks and metadata intended to verify authenticity and flag AI. Under the system, a participating company's digital camera could embed provenance metadata in an image at capture, while Adobe Photoshop could record any AI edits made afterward. Andy Parsons, the senior director of the Content Authenticity Initiative at Adobe, calls Content Credentials "a way to provide a 'nutritional label' for digital content."
Learn More at Content Credentials