Government officials have been quoted saying that X's 'safe harbor' status could be revoked because of Grok's CSAM content ...
LAION, the German research organization that created the dataset used to train Stable Diffusion and other generative AI models, has released a new dataset that it claims has been “thoroughly cleaned of known ...
xAI has restricted Grok image generation to paid subscribers following the CSAM backlash. The UK government calls the move ...
It seems that instead of updating Grok to prevent it from generating sexualized images of minors, X is planning to purge users ...
For years, hashing technology has made it possible for platforms to automatically detect known child sexual abuse material (CSAM) and stop kids from being retraumatized online. However, rapidly ...
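As a rough illustration of the hash-matching approach that snippet refers to, here is a minimal Python sketch. It does an exact SHA-256 lookup against a hypothetical KNOWN_HASHES set; the hash value and function names are placeholders, not any platform's real API. Production systems instead match perceptual hashes (Microsoft's PhotoDNA is the best-known example) sourced from clearinghouses such as NCMEC, so that resized or re-encoded copies of known images still match, which exact cryptographic hashing cannot do.

```python
import hashlib

# A minimal sketch of hash-based matching, assuming an exact-hash lookup.
# KNOWN_HASHES is a hypothetical placeholder set (the value below is just
# the SHA-256 of empty input). Real deployments match perceptual hashes
# (e.g., PhotoDNA) from clearinghouse databases rather than SHA-256.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def file_sha256(path: str) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_match(path: str) -> bool:
    """True if the file's digest appears in the known-hash database."""
    return file_sha256(path) in KNOWN_HASHES
```

The design point worth noting is that this approach can only flag material that has already been catalogued and hashed; content that has never been seen before, including newly generated imagery, produces no match.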
Two major developments reignited regulatory and technological discourse around Child Sexual Abuse Material (CSAM) this year: the first was Visa and MasterCard cracking down on adult sites that contained ...
In August 2021, Apple announced a plan to scan photos that users stored in iCloud for child sexual abuse material (CSAM). The tool was meant to be privacy-preserving and allow the company to flag ...
Thousands of CSAM victims are suing Apple for dropping plans to scan devices for child sexual abuse material. In addition to facing more than $1.2B in penalties, the company could be ...