MASHABLE – In 2019, Bumble launched its AI tool 'Private Detector', which alerts users when they've been sent an obscene photo and automatically blurs the image. Now, Bumble is making a version of the tool available to the wider tech community on GitHub. Rachel Haas, Bumble's VP of member safety, said in a statement, "Open-sourcing this feature is about remaining firm in our conviction that everyone deserves healthy and equitable relationships, respectful interactions, and kind connections online."
by Rachel Thompson
