Deepfakes: Microsoft and others in big tech are working to bring authenticity to videos, photos (Security News, July 2021)
If you want people to trust the photos and videos your business puts out, it might be time to start learning how to prove they haven't been tampered with.
Microsoft has a quiz you can take to see if you can spot deepfakes yourself; it's less a training tool than an attempt to raise awareness and media literacy.
Tools like the Microsoft Video Authenticator look for the subtle artefacts that altering an image leaves behind, giveaways that you might not be able to see yourself, but they won't spot everything.
The Content Authenticity Initiative, led by Adobe, is a broad group of organizations interested in content authenticity. Adobe is building a tool that lets Photoshop and Behance users save location data, details of the creator and even the history of every edit made to an image inside the image itself, again as metadata, so people looking at the image later can see how it was edited.
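To make the idea concrete, here is a minimal sketch of how provenance metadata of this kind can be bound to an image. This is not the Content Authenticity Initiative's actual format: the function names, the manifest fields and the edit-log structure are all hypothetical, and real provenance standards also cryptographically sign the manifest rather than relying on a bare hash. The sketch only illustrates the core trick of recording a creator and an edit history alongside a hash of the exact image bytes, so that any later alteration is detectable.

```python
import hashlib
import json

def make_manifest(image_bytes, creator, edits):
    """Build a toy provenance manifest: a creator, an edit log,
    and a SHA-256 hash binding the manifest to these exact bytes."""
    return {
        "creator": creator,
        "edits": edits,  # e.g. [{"action": "crop"}, {"action": "adjust-levels"}]
        "image_sha256": hashlib.sha256(image_bytes).hexdigest(),
    }

def verify_manifest(image_bytes, manifest):
    """True only if the image bytes still match the hash recorded in
    the manifest, i.e. the image was not altered after it was issued."""
    return hashlib.sha256(image_bytes).hexdigest() == manifest["image_sha256"]

# Hypothetical image bytes standing in for a real file's contents.
original = b"\x89PNG...raw image bytes..."
manifest = make_manifest(original, "Alice", [{"action": "crop"}])

print(json.dumps(manifest, indent=2))
print(verify_manifest(original, manifest))              # True: untouched
print(verify_manifest(original + b"tamper", manifest))  # False: bytes changed
```

A real scheme would sign the manifest with the creator's private key and embed it in the image file's metadata, so a viewer can both check the hash and confirm who issued the edit history.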
To bring all these pieces together, the Coalition for Content Provenance and Authenticity is combining them into a standard for proving where content comes from, authored by Adobe, Arm, the BBC, Intel, Microsoft, TruePic and now Twitter.
You can see the kind of information that will be available by uploading an image to the Verify tool on the Content Authenticity Initiative site.