As the fediverse grows, moderators and administrators will need better tooling to support their work. In particular, it is important to be able to detect and remove child sexual abuse material (CSAM) from the fediverse.
[PhotoDNA](https://www.microsoft.com/en-us/PhotoDNA/CloudService) is a service developed by Microsoft in partnership with Dartmouth College that can detect known CSAM images. It is used by many social networks to detect and remove CSAM. Fediverse server software can use PhotoDNA to detect and report CSAM uploaded to the server before it is distributed via ActivityPub.
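As an illustration of where such a check could sit in the upload path, here is a minimal Python sketch. The endpoint URL, the subscription-key header, and the `IsMatch` response field are assumptions about a PhotoDNA-style matching API, not a confirmed interface; the real service requires onboarding with Microsoft and has its own documentation.

```python
import requests

# Illustrative placeholders only: the real PhotoDNA Cloud Service endpoint and
# authentication scheme are defined by Microsoft and require onboarding.
PHOTODNA_MATCH_URL = "https://example.photodna.invalid/v1.0/Match"
SUBSCRIPTION_KEY = "<your-subscription-key>"


def image_contains_known_csam(image_bytes: bytes, content_type: str = "image/jpeg") -> bool:
    """Ask a PhotoDNA-style matching service whether the image matches a known hash."""
    response = requests.post(
        PHOTODNA_MATCH_URL,
        data=image_bytes,
        headers={
            "Content-Type": content_type,
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
        },
        timeout=10,
    )
    response.raise_for_status()
    result = response.json()
    # Field name is assumed for illustration; consult the service's API docs.
    return bool(result.get("IsMatch", False))


def handle_upload(image_bytes: bytes) -> None:
    if image_contains_known_csam(image_bytes):
        # Reject the upload, notify moderators, and file any required report
        # instead of federating the object.
        raise ValueError("Upload rejected: matched known CSAM hash")
    # ...otherwise continue with normal processing and ActivityPub delivery.
```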
Thiel and DiResta, in their report on [Child Safety on the Federated Social Web](https://stacks.stanford.edu/file/druid:vb515nd6874/20230724-fediverse-csam-report.pdf), note that if each recipient server needs to check every incoming message from a sending server, this will generate thousands or even tens of thousands of extraneous calls to PhotoDNA.
Instead, they propose an architecture in which the origin server makes a single call to PhotoDNA and then relays an attestation to all recipient servers. Each recipient server can verify the attestation and decide whether to accept the message, without making additional calls to PhotoDNA.
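To make the flow concrete, here is a hedged Python sketch of both halves: the origin server attaching a service-issued attestation to an outgoing activity, and a recipient server verifying it. The property name `csamAttestation`, the attestation fields, SHA-256 as the media digest, and Ed25519 as the signature scheme are all placeholders for illustration; the actual hash function and signature verification mechanism are the open questions noted at the end of this section.

```python
import hashlib
import json

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey


def attach_attestation(activity: dict, attestation: dict) -> dict:
    """Origin side: embed the service-issued attestation in the outgoing activity.

    `attestation` is whatever signed blob the scanning API returned, e.g.
    {"mediaDigest": "...", "scannedAt": "...", "signature": "..."} (illustrative shape).
    """
    activity = dict(activity)
    # Extension property name is a placeholder; the real name would be
    # defined by this FEP's JSON-LD context.
    activity["csamAttestation"] = attestation
    return activity


def verify_attestation(image_bytes: bytes, attestation: dict,
                       service_public_key_bytes: bytes) -> bool:
    """Recipient side: check the attestation without calling PhotoDNA again."""
    # 1. The digest in the attestation must match the media we actually received.
    #    SHA-256 is a placeholder; the real hash function is still unconfirmed.
    digest = hashlib.sha256(image_bytes).hexdigest()
    if attestation.get("mediaDigest") != digest:
        return False

    # 2. The signature must verify against the scanning service's public key.
    #    Ed25519 over the canonicalized attestation body is assumed here.
    signed_payload = json.dumps(
        {k: v for k, v in attestation.items() if k != "signature"},
        sort_keys=True,
        separators=(",", ":"),
    ).encode()
    public_key = Ed25519PublicKey.from_public_bytes(service_public_key_bytes)
    try:
        public_key.verify(bytes.fromhex(attestation["signature"]), signed_payload)
    except InvalidSignature:
        return False
    return True
```

In this sketch the recipient trusts the scanning service's published key rather than the origin server, which is what allows it to skip its own PhotoDNA call while still getting an independent assurance that the media was checked.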
This FEP expands their proposal into an implementable ActivityPub extension.
This FEP is a work in progress; I need to confirm with the team from the Stanford Internet Observatory (SIO) and/or PhotoDNA what the hash function should be, and how a recipient server can verify the signature from the API.