- Open the file in Photoshop.
- Save the file.
- You’re done!
As an AI safety researcher, I want to like C2PA, but I’ve long been skeptical of its real utility. Why is it being touted as the savior of all things internet when all you need to do to bypass it is re-save the file? Don’t believe me? Make an image in DALL·E 3, download it, test it here, then re-save it in Photoshop in the same image format and test again. I’ll wait.
As The Verge reported a couple of days ago:
OpenAI points out that C2PA’s metadata can “easily be removed either accidentally or intentionally,” especially as most social media platforms often remove metadata from uploaded content. Taking a screenshot omits the metadata.
The Verge, in that same article, also calls it a “watermark,” which I think is wrong: a watermark suggests some kind of encoding in the pixels themselves. That isn’t the case with C2PA, which is just metadata, easily (and often automatically) stripped by the very networks where it’s intended to have some kind of impact, an impact that is still murky in my opinion.

I know it’s still “early days,” but I’ve seen all too often in life how temporary solutions end up becoming permanent ones, even long after we’ve outgrown them. In this case, I feel like we’ve already outgrown this one. I’m also not so sure that information traceability is an entirely beneficial social thing all the time, either: I can see plenty of ways the whole scheme can be not just gamed, but used exactly as designed, with dystopian outcomes, especially for political dissidents. More work needs to happen here.
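To make the “it’s just metadata” point concrete: per the C2PA spec, in a JPEG the manifest lives in APP11 (JUMBF) marker segments alongside the compressed image data, not in the pixels. A minimal stdlib-only sketch (the function name is mine, and this only handles well-formed JPEGs up to the start-of-scan marker) can check whether those segments are even present; run it before and after a re-encode and the segments are simply gone:

```python
def find_app11_segments(data: bytes) -> list:
    """Return (offset, length) for each APP11 segment in a JPEG byte stream.

    C2PA manifests in JPEG are carried as JUMBF boxes inside APP11
    (0xFF 0xEB) marker segments. No pixels are involved, which is why
    a plain re-encode or screenshot drops the provenance data entirely.
    """
    segments = []
    i = 2  # skip the SOI marker (FF D8)
    while i + 4 <= len(data):
        if data[i] != 0xFF:
            break  # not a marker; malformed or we've lost sync
        marker = data[i + 1]
        if marker in (0xD9, 0xDA):  # EOI, or SOS (entropy-coded data follows)
            break
        length = int.from_bytes(data[i + 2:i + 4], "big")
        if marker == 0xEB:  # APP11: JUMBF / C2PA manifest store
            segments.append((i, length))
        i += 2 + length  # 2 marker bytes + length field (which counts itself)
    return segments
```

Calling `find_app11_segments(open("image.jpg", "rb").read())` on a freshly generated DALL·E 3 image should report the manifest segments; on the Photoshop re-save it returns an empty list, and with it goes the entire provenance story.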