A catalog of naturally occurring images whose Apple NeuralHash is identical.
This Rocket.Chat app validates uploaded images against the Microsoft PhotoDNA cloud service and quarantines those identified as child abuse imagery (child pornography or CSEM).
Forensic Tool for Detecting Child Pornography Using Deep Neural Networks
Modtools Image is a simple, multi-user image moderation platform for Trust & Safety professionals.
The source repository for SNU CSAM-K, the official Child Sexual Abuse Material (CSAM) hashing, detection, and reporting tool for the SNU Framework, with the addition of an evidence locker to prevent abuse of the abuse-reporting process.
Article listing domains related to ad providers, with tutorials for recognizing URLs using OSINT techniques.
Semester-wise notes for the courses offered to the CSAM branch at IIIT-Delhi.
Blockchain proof of concept for ValliNamChain, aimed at preventing the spread of Child Sexual Abuse Material (CSAM).
The documentation source repository for SNU CSAM-K, the official Child Sexual Abuse Material (CSAM) hashing, detection, and reporting tool for the SNU Framework.
Exploring the limits of social media transparency data
A modern, promise-based Node.js wrapper around the Project Arachnid CSAM API.
PDNAScrubber is a script that deletes the erroneous PhotoDNA property found in Project VIC JSON exports generated by Cellebrite Physical Analyzer.
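The core operation here is walking the export's media records and dropping one property. A minimal sketch, assuming a Project VIC-style layout where records sit in a top-level `value` array and the bad field is literally named `PhotoDNA` (both are assumptions about the export format, not confirmed details of the tool):

```javascript
// Minimal sketch: remove an erroneous "PhotoDNA" property from each
// media record of a Project VIC-style JSON export. The "value" array
// and the exact property name are assumptions for illustration.
function scrubPhotoDNA(vicExport) {
  const records = Array.isArray(vicExport.value) ? vicExport.value : [];
  for (const record of records) {
    delete record.PhotoDNA; // drop the erroneous property if present
  }
  return vicExport;
}
```

In practice such a script would read the export with `fs.readFileSync`, run the scrub, and write the cleaned JSON back out.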