2024-04-16 16:55:08

Ars Technica on Nostr

UK seeks to criminalize creation of sexually explicit AI deepfake images without consent

Under new law, those who create the "horrific images" would face a fine and possible jail time.

https://arstechnica.com/information-technology/2024/04/uk-seeks-to-criminalize-creation-of-sexually-explicit-ai-deepfake-images-without-consent/?utm_brand=arstechnica&utm_social-type=owned&utm_source=mastodon&utm_medium=social

Author Public Key: npub15wh3wyz0j8mlndtx0v28zlgagdynzx27f3hqwkmacy7ca4cmc3hszke9hw