NSFW Filter Introduction: Building a Safer Internet Using AI

Written by neo_matrix | Published 2020/08/23
Tech Story Tags: artificial-intelligence | computer-vision | machine-learning | tensorflow | tensorflowjs | web-development | javascript | future-of-ai

TLDR: NSFW Filter is a web extension for Chrome and Firefox that uses Machine Learning to filter out NSFW images. The extension picks up all the images loaded on a web page and hides them from the user. The images are then checked by a Machine Learning model that detects NSFW content; images found to be free of NSFW content are made visible again. All the source code for this project is available in our GitHub repo. The project could be improved greatly by contributions from people with diverse ideas. Contributions are always welcome!

Filtering out NSFW images with a web extension built using TensorFlow JS.
All the source code used in this project is available here.
The Internet is an unfiltered place. There is no guarantee what you would stumble across while you are casually scrolling through your feeds.
You could stumble across inappropriate or “Not-Safe-For-Work” images even in unassuming places on the Interweb.
This led me to think of a solution that could filter out such content from the web. There were a few points to consider:
  • All images on the websites the user loads should be monitored.
  • The images should be processed without having to leave the client machine.
  • It should be fast and should work on all websites.
  • It should be open-source.
The solution? A web extension that would check if the images loaded in a page are NSFW and only display them if they are found to be safe.

Enter NSFW Filter

NSFW Filter is a web extension, currently available for Chrome and Firefox, that uses Machine Learning to filter out NSFW images.
The nsfwjs model is used to detect the content of the loaded images. This model was trained for the sole purpose of detecting NSFW images.
Using this model, which has been optimised to run in a browser, we built an extension that reads the images on a loaded web page and only makes them visible if they are found to be safe.
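To make the idea concrete, here is a minimal sketch of classifying one image with nsfwjs. The `nsfwjs.load()` and `model.classify(img)` calls are the library's documented API; the `isSafe` helper, its class choices, and the 0.5 threshold are our own assumptions, not the extension's exact code:

```javascript
// Decide whether a prediction list looks safe. nsfwjs returns an
// array like [{ className: 'Neutral', probability: 0.98 }, ...]
// over five classes: Drawing, Hentai, Neutral, Porn, Sexy.
// The unsafe-class set and threshold here are illustrative choices.
function isSafe(predictions, threshold = 0.5) {
  const unsafeClasses = new Set(['Porn', 'Hentai', 'Sexy']);
  return !predictions.some(
    (p) => unsafeClasses.has(p.className) && p.probability >= threshold
  );
}

// Browser-only glue (skipped outside a browser; assumes nsfwjs is
// bundled with the extension as a global):
if (typeof document !== 'undefined' && typeof nsfwjs !== 'undefined') {
  nsfwjs.load().then(async (model) => {
    const img = document.querySelector('img');
    const predictions = await model.classify(img);
    console.log(isSafe(predictions) ? 'safe' : 'NSFW');
  });
}
```

Because the model runs entirely in the browser via TensorFlow.js, no image ever has to leave the client machine.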

Open-source development model

NSFW Filter is completely open-source and it always will be.
With an awesome list of contributors from around the world, we will continue to improve it and add more features.
Since such a product does not exist yet, there is plenty of room for improvements and features that the current team may have missed entirely.
So, we released it into the wild and started accepting contributions from developers around the world.
Now, we have our first release.

How it works under the hood

When a web page is loaded, the extension picks up all the images loaded on the page and hides the images from the user.
These images are then checked by the Machine Learning model which will detect NSFW content.
If the images are found not to have NSFW content, then they are made visible. The NSFW images remain hidden.
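The hide-then-reveal workflow described above can be sketched as a small content-script helper. This is a simplified assumption of how such a script might be structured, not the extension's actual code; `classifyIsSafe` is a hypothetical wrapper around the model:

```javascript
// Hide an image immediately, before the user can see it.
function hideImage(img) {
  img.style.visibility = 'hidden';
}

// Reveal an image once the model has cleared it.
function revealImage(img) {
  img.style.visibility = 'visible';
}

// Given a classifier (img -> Promise<boolean>, true = safe),
// hide every image first, then reveal only the safe ones.
// NSFW images simply stay hidden.
async function filterImages(images, isSafeFn) {
  images.forEach(hideImage);
  await Promise.all(
    images.map(async (img) => {
      if (await isSafeFn(img)) revealImage(img);
    })
  );
}

// Browser-only usage (skipped outside a browser):
if (typeof document !== 'undefined') {
  const imgs = Array.from(document.querySelectorAll('img'));
  // `classifyIsSafe` would wrap the nsfwjs model (hypothetical):
  // filterImages(imgs, classifyIsSafe);
}
```

Hiding everything up front matters: classification takes a moment, and the user should never see an image before it has been checked.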
The basic workflow of NSFW Filter

Use cases

The uses of the extension are broad, whether you are running it on your work computer or on your kids’ computer.
You don’t want to stumble across NSFW content while you are at work, right?
You wouldn’t want your kids to accidentally stumble across such content while they are doing a report for school, right?
The solution: use NSFW Filter!
All the source code for this project is available in our GitHub repo.
The project could be improved greatly by contributions from people with diverse ideas. 
Any and all kinds of contributions are always welcome!
Happy coding!

Written by neo_matrix | Artificial Intelligence Engineer with a hint of Natural Stupidity
Published by HackerNoon on 2020/08/23