Bumble open-sources its lewd-spotting AI tool

Dating app Bumble is open-sourcing its lewd-spotting AI tool that was first introduced in 2019.

The tool helps to protect users from certain unsolicited photos – and we’re not just talking about genitalia, but also shirtless selfies and photos of firearms.

When a user receives a suspect image, it is blurred, leaving the recipient free to choose whether to view it, block it, or report the sender.
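In practice, this amounts to a classifier scoring each incoming image and a blur being applied above some confidence threshold, with the final decision left to the recipient. The snippet below is a minimal sketch of that receive-side gating in Python using Pillow; the threshold, function name, and score argument are illustrative assumptions rather than Bumble’s actual code.

```python
# Minimal sketch of the receive-side flow described above. The threshold,
# the score argument, and the function itself are illustrative assumptions,
# not Bumble's implementation.
from PIL import Image, ImageFilter

BLUR_THRESHOLD = 0.5  # hypothetical score above which an image is hidden


def prepare_incoming_image(path: str, lewd_probability: float) -> Image.Image:
    """Return the image to display: blurred if flagged, untouched otherwise."""
    image = Image.open(path)
    if lewd_probability >= BLUR_THRESHOLD:
        # Heavy Gaussian blur so the recipient can still choose to reveal
        # the original, block the image, or report the sender.
        return image.filter(ImageFilter.GaussianBlur(radius=30))
    return image
```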

Online harassment is difficult to counter entirely, but AI is proving a powerful tool for protecting users, especially the most vulnerable. Shocking research in 2020 found that 75.8 percent of girls between the ages of 12 and 18 had been sent unsolicited nude images.

By open-sourcing its AI tool, Bumble can help to protect more online users.

“It’s our hope that the feature will be adopted by the wider tech community as we work in tandem to make the internet a safer place,” explained Bumble in a blog post.

Human content moderators see the worst of the web day in and day out. Spending your days reviewing abuse, torture, massacres, beheadings, and more is bound to take a serious mental toll. As a result, content moderators often require therapy, and the role is associated with one of the highest suicide rates.

Relying solely on AI moderation is problematic. Human moderators, for example, can understand context and tell the difference between content exposing war crimes and terrorist propaganda glorifying hate and violence.

AI tools like the one open-sourced by Bumble can help to protect moderators by blurring the content while still harnessing the unique skills of humans.

In its blog post, Bumble explained how it navigated the trade-off between performance and the ability to serve its user base at scale:

“We implemented (in its latest iteration) an EfficientNetv2-based binary classifier: a convolutional network that has faster training speed and overall better parameter efficiency. It uses a combination of better-designed architecture and scaling, with layers like MBConv (that utilizes 1×1 convolutions to widen the space and depth-wise convolutions for reducing the number of overall parameters) and FusedMBConv (that merges some steps of the vanilla MBConv above for faster execution), to jointly optimize training speed and parameter efficiency.

The model has been trained leveraging our GPU powered data centers in a continuous exercise of dataset, network and hyperparameters (the settings used to speed up or improve the training performance) optimization.”
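For a concrete picture of what such a model looks like, the sketch below puts a sigmoid head on a Keras EfficientNetV2 backbone. The backbone variant, input size, and optimizer are assumptions made for illustration; the authoritative architecture is the one in Bumble’s released code.

```python
# Rough sketch of an EfficientNetV2-based binary classifier of the kind
# Bumble describes. Backbone variant, input size, and optimizer are assumed.
import tensorflow as tf


def build_classifier(input_shape=(224, 224, 3)) -> tf.keras.Model:
    backbone = tf.keras.applications.EfficientNetV2B0(
        include_top=False,        # drop the 1000-class ImageNet head
        weights="imagenet",       # start from pretrained features
        input_shape=input_shape,
        pooling="avg",            # global average pooling into a feature vector
    )
    # Single sigmoid unit: probability that the image should be blurred.
    outputs = tf.keras.layers.Dense(1, activation="sigmoid")(backbone.output)
    model = tf.keras.Model(backbone.input, outputs)
    model.compile(
        optimizer=tf.keras.optimizers.Adam(1e-4),
        loss="binary_crossentropy",
        metrics=["accuracy"],
    )
    return model
```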

Bumble says its tool achieves “world class performance” of over 98 percent accuracy in both upsampled and production-like settings, offline and online.

Bumble’s tool is available on GitHub. It has been released under the Apache License, so anyone can implement it as-is to blur lewd images or fine-tune it with additional training samples.
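Because the license allows modification, teams with their own moderation data can retrain the model on extra examples. Below is a minimal fine-tuning sketch that reuses the hypothetical build_classifier from the earlier snippet and assumes a folder of newly labeled images with one subdirectory per class; neither detail is part of Bumble’s release.

```python
# Hypothetical fine-tuning pass on additional labeled samples.
# The directory layout ("extra_samples/safe", "extra_samples/lewd") is assumed.
import tensorflow as tf

train_ds = tf.keras.utils.image_dataset_from_directory(
    "extra_samples",
    label_mode="binary",       # 0/1 labels to match the sigmoid head
    image_size=(224, 224),
    batch_size=32,
)

model = build_classifier()     # from the sketch above
model.fit(train_ds, epochs=3)  # short fine-tuning pass on the new data
model.save("private_detector_finetuned.keras")
```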

(Image Credit: Bumble)

