6 Comments

Good callout re: tradeoff between control and verifiability.

The tough bit here seems to be that, in the long term, building trust in the security of hardware signing would require the device to have remote attestation capabilities, i.e. the ability to prove that the signing key is properly secured and that the right application code is running (that it signs raw sensor data and nothing else). Those proofs would be useful for ensuring that a device only ever signs raw images, but I could also see concerns arising about the device information revealed in the proofs being used for other kinds of filtering or gating.

These are the same concerns that were at the root of the backlash against Google's now-defunct Web Environment Integrity proposal (which would have let website developers check what device and browser a visitor is running), led by proponents of consumer device freedom and network endpoint agnosticism (i.e. servers shouldn't discriminate based on what device an end user is using). They are also at the root of concerns about remote attestation in consumer devices more broadly.

But, at the moment, there doesn't seem to be a way around this tradeoff if the goal is to verify that an image was actually taken by a camera (although there could be workarounds if the goal is softened to something like establishing that an image was *probably* taken by a camera, based on information from your trusted circle).
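To make that concrete, here is a minimal sketch in Python (using the `cryptography` package) of what an attestation-style check might look like. The field names (`device_model`, `firmware_hash`) and the firmware allow-list are hypothetical, not any real attestation scheme; the point is that the verifier has to learn something about the device in order to trust that it only signs raw sensor data, and that same information could be used for gating.

```python
# Hypothetical sketch of a remote-attestation-style check for a signing camera.
# Field names and the firmware allow-list are illustrative, not any real scheme.
import hashlib
import json

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

# Placeholder: hashes of firmware builds known to sign only raw sensor data.
TRUSTED_FIRMWARE_HASHES = {"sha256-of-a-known-good-build"}


def device_sign(priv: Ed25519PrivateKey, raw_image: bytes,
                device_model: str, firmware_hash: str) -> dict:
    """Device side: sign the image hash together with an attestation statement."""
    statement = {
        "image_sha256": hashlib.sha256(raw_image).hexdigest(),
        "device_model": device_model,    # <- device info the verifier learns
        "firmware_hash": firmware_hash,  # <- ditto
    }
    payload = json.dumps(statement, sort_keys=True).encode()
    return {"statement": statement, "signature": priv.sign(payload)}


def verify(pub: Ed25519PublicKey, raw_image: bytes, attested: dict) -> bool:
    """Verifier side: check the signature and that the firmware is allow-listed."""
    statement = attested["statement"]
    payload = json.dumps(statement, sort_keys=True).encode()
    try:
        pub.verify(attested["signature"], payload)
    except InvalidSignature:
        return False
    if statement["image_sha256"] != hashlib.sha256(raw_image).hexdigest():
        return False
    # The only way to trust "signs raw data and nothing else" is to check the
    # attested firmware, which is exactly the information that could also be
    # used to gate or filter devices.
    return statement["firmware_hash"] in TRUSTED_FIRMWARE_HASHES
```

Everything the verifier needs in order to trust the signature is also everything a server would need in order to discriminate by device.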

Almost seems like a reasonable use for a… blockchain? At least for high-value, human-made assets.
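For what it's worth, the mechanical core of that idea is small: hash the asset and anchor the hash plus a timestamp in some append-only public log (a blockchain being one candidate). A rough sketch, with the log stood in for by an in-memory list:

```python
# Minimal sketch of hash anchoring; a real system would append to a blockchain
# or timestamping service rather than an in-memory list.
import hashlib
import time

ledger: list[dict] = []  # stand-in for an append-only, publicly auditable log


def anchor(asset_bytes: bytes) -> dict:
    """Record the asset's hash and the time it was anchored."""
    entry = {
        "sha256": hashlib.sha256(asset_bytes).hexdigest(),
        "anchored_at": time.time(),
    }
    ledger.append(entry)
    return entry


def was_anchored(asset_bytes: bytes) -> bool:
    """Check whether this exact asset was anchored earlier (any change to the bytes fails)."""
    digest = hashlib.sha256(asset_bytes).hexdigest()
    return any(entry["sha256"] == digest for entry in ledger)
```

Of course, this only shows that the asset existed in exactly this form at the anchoring time; it says nothing about whether a camera (or a human) actually produced it.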

Yes! I think so.

C2PA and other such authentication schemes will need to record the location and time a photo is taken, or else they will be easily subverted by simply photographing a deepfake image (same with video) with a supported camera to add the authentication data. And even if location is required, the location and time reported by the camera must not be spoofable. That might be achievable on a smartphone, but most standalone cameras like SLRs don't have GPS and instead rely on the user to add a location or set the clock manually.
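As a rough illustration of the kind of check this metadata would enable (this is not the actual C2PA manifest format, and every field name below is invented), a verifier could compare the signed capture time and coordinates against the event the image purports to document:

```python
# Illustrative check of capture metadata in a signed provenance claim.
# This is NOT the C2PA data model; all field names here are invented.
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class CaptureClaim:
    captured_at: datetime  # only meaningful if the camera's clock can't be set by the user
    latitude: float        # only meaningful if the camera has GPS (most SLRs don't)
    longitude: float


def consistent_with_event(claim: CaptureClaim,
                          event_time: datetime,
                          event_region: tuple[float, float, float, float],
                          max_delay: timedelta = timedelta(hours=24)) -> bool:
    """Check that the signed capture time/place match the event the image purports to show.

    A deepfake re-photographed off a screen later and somewhere else would fail
    these checks; a camera that lets the user set its own clock and location
    would pass them trivially, which is the point of the comment above.
    """
    delay = claim.captured_at - event_time
    if delay < timedelta(0) or delay > max_delay:
        return False
    min_lat, max_lat, min_lon, max_lon = event_region
    return (min_lat <= claim.latitude <= max_lat
            and min_lon <= claim.longitude <= max_lon)
```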

Yup. There is no such thing as both pure privacy and verifiably authentic media.

Comment deleted (Jun 24)

I agree with you that the whole “social media deepfakes” panic is probably overblown.

However, what does strike me as a more realistic near-term concern is the effect of this technology on evidence admitted in court. I think we should stress-test any technical standards against that use case, rather than trying to police all content put on the internet, which is a much more dubious pursuit in my view.
