TikTok and Bumble join the initiative to prevent sharing of non-consensual images
Two major social media platforms, TikTok and Bumble, have joined an initiative to prevent the sharing of non-consensual intimate images.
The two platforms have partnered with StopNCII.org (Stop Non-Consensual Intimate Image Abuse) to detect and block intimate images that could be used to abuse others.
According to a report by Engadget, StopNCII.org uses on-device hashing technology: people being threatened with intimate image abuse can convert their images into unique identifiers, or digital fingerprints, without the images ever leaving their devices. Those hashes are shared with participating partners, and TikTok and Bumble can then block any upload that matches StopNCII.org's bank of hashes.
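To make that flow concrete, here is a minimal sketch of how on-device hashing and hash-bank matching might work. It is not StopNCII.org's actual implementation: the perceptual "average hash" from the open-source Pillow and imagehash libraries stands in for whatever algorithm the service really uses, and the function names, file names, and hash bank are hypothetical.

```python
# Minimal sketch of an on-device hashing flow (assumptions noted above).
from PIL import Image
import imagehash

def fingerprint(image_path: str) -> str:
    """Compute a perceptual hash of an image entirely on-device.

    Only this short hex string would ever leave the device; the
    image itself is never uploaded anywhere.
    """
    return str(imagehash.average_hash(Image.open(image_path)))

# Hypothetical bank of hashes a participating platform might hold.
hash_bank = {fingerprint("photo_the_victim_wants_blocked.jpg")}

def should_block(upload_path: str, max_distance: int = 5) -> bool:
    """Flag an upload whose hash is close to any banked hash.

    Perceptual hashes of visually similar images differ in only a
    few bits, so partners would compare Hamming distance rather
    than require an exact match.
    """
    candidate = imagehash.hex_to_hash(fingerprint(upload_path))
    return any(
        candidate - imagehash.hex_to_hash(banked) <= max_distance
        for banked in hash_bank
    )
```

The key design point is that only the short hash string is shared, so a platform can recognize a known image without the victim ever having to hand the image itself to anyone.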
Sometimes, people upload intimate images of an ex-partner to several different platforms as "revenge porn." To protect victims of this abuse, the two social media platforms joined the project Meta launched.
According to Bloomberg, StopNCII.org is an effective tool for cracking down on this problem. The platform is run by independent experts on intimate image abuse.
Reportedly, more than 12,000 people have used the platform to prevent intimate videos and images from being shared without permission, creating more than 40,000 hashes to date, according to the report.
The initiative began in Australia in 2017 as a way to help victims and was launched in partnership with Meta. At the time, Meta asked users to upload revenge-porn images to a Messenger chat with themselves in order to create the hash, promising to delete the images after hashing them. However, that approach raised obvious privacy concerns, which on-device hashing sidesteps by keeping the images on the user's own device.