Filter NSFW Images from Your App with AI – NSFW Filter

An AI-powered library to filter out inappropriate images and help you maintain a clean digital environment.

NSFW Filter is an open-source TypeScript library that allows developers to check if an image contains adult content before displaying or processing it further.

Built on TensorFlow.js and nsfwjs, it helps developers easily block NSFW images in their apps.

GitHub Repo

How to use it:

1. Install the NSFW Filter package from npm:

npm i nsfw-filter

2. Import NSFWFilter:

import NSFWFilter from 'nsfw-filter';

3. Check if an image contains adult content:

const isSafe = await NSFWFilter.isSafe(image);
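
Here, image is typically the user-supplied File you want to screen (as in the React example in the next step). As a rough sketch — assuming isSafe also accepts a Blob fetched over the network, which this article only demonstrates for File objects — you could screen a remote image before rendering it:

import NSFWFilter from 'nsfw-filter';

// Sketch: fetch an image, check it, and only attach it to the page if it is safe.
// Assumes NSFWFilter.isSafe accepts a Blob; the URL is just a placeholder.
async function showIfSafe(url) {
  const response = await fetch(url);
  const blob = await response.blob();

  const isSafe = await NSFWFilter.isSafe(blob);
  if (!isSafe) {
    console.warn('Blocked an inappropriate image:', url);
    return;
  }

  const img = document.createElement('img');
  img.src = URL.createObjectURL(blob);
  document.body.appendChild(img);
}

showIfSafe('https://example.com/photo.jpg');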

4. Here is an advanced React example demonstrating how to verify if a user-uploaded image is safe:

import { useState } from 'react';
import NSFWFilter from 'nsfw-filter';

function ImageUploader() {
  const [imageUrl, setImageUrl] = useState('');

  const handleImageUpload = async (event) => {
    const file = event.target.files[0];
    if (!file) return;

    // Check whether the uploaded image is appropriate
    const isSafe = await NSFWFilter.isSafe(file);
    if (!isSafe) {
      alert('Image is not appropriate');
      return;
    }

    // The image is safe, so read it and display a preview
    const reader = new FileReader();
    reader.onloadend = () => {
      setImageUrl(reader.result);
    };
    reader.readAsDataURL(file);
  };

  return (
    <div>
      <input type="file" accept="image/*" onChange={handleImageUpload} />
      {imageUrl && <img src={imageUrl} alt="Uploaded" />}
    </div>
  );
}

export default ImageUploader;

5. NSFW Filter is already used in production: it processes hundreds of thousands of images for restorePhotos, a well-known image restoration service, where it blocks inappropriate uploads and keeps the user experience clean and respectful.
