
Moderate NSFW Images and Video with Clarifai API [Tutorial]

Let’s face it: nudity is an inevitable part of the internet. Spend enough time online and you’re bound to come across some raunchy imagery (whether you intend to or not!). If you’re building an app or website, it’s important to filter out any “not safe for work” (NSFW) images or video that may be uploaded. After all, no one likes a surprise!


Enter the Clarifai API! This tool uses deep-learning technology to recognize categories, objects and tags from images and videos.
[Image: Clarifai API overview]

The API is free for up to 5,000 calls a month. The Clarifai team also built an NSFW model so that the API can recognize nudity. With this functionality, you could:

  • Filter out NSFW image submissions for your Instagram app
  • Protect your comments section from naughty spammers
  • Limit your app’s Giphy searches to G-rated material

Read how it works below or head straight to the Clarifai API package page to start making calls for yourself.

While you can connect to the Clarifai API directly, today we’ll use our free tool, RapidAPI. Why? The live coding feature lets you start making calls to the API right away from within the browser. By the end of this tutorial, you’ll be able to upload your own images, call the API, and see whether they pass the NSFW test.

Step 1. Get the Clarifai Access Token and Credentials

Clarifai requires an access token before you start making calls to their API. No worries! It’s easy (and more importantly, free) to get one. Here’s how:

  1. Go to Clarifai’s developer page
  2. Sign up for an account
  3. Click the Create Application button (or head to the Developer Dashboard and click “Create a New Application”)
  4. Copy and save your client_id and client_secret
  5. Press the Generate Access Token button

Voila! You should now have your client_id, client_secret, and Access Token for the Clarifai API.
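If you’d rather script this step than click through the dashboard, here’s a minimal sketch in Python. It assumes Clarifai exposes an OAuth-style client-credentials token endpoint at https://api.clarifai.com/v1/token/, so treat the URL and parameter names as assumptions and check them against the Developer Dashboard docs.

```python
# Hypothetical sketch: exchange client_id/client_secret for an access token.
# The token endpoint URL and parameter names are assumptions -- verify them
# against Clarifai's developer docs before relying on this.
import requests

CLIENT_ID = "YOUR_CLIENT_ID"          # from the Developer Dashboard
CLIENT_SECRET = "YOUR_CLIENT_SECRET"  # keep this out of version control!

response = requests.post(
    "https://api.clarifai.com/v1/token/",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
    },
)
response.raise_for_status()
access_token = response.json()["access_token"]
print("Access token:", access_token)
```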

Step 2. Call the API from RapidAPI

Next, head over to RapidAPI.com to run the API and start testing images! Here’s an overview of what you’ll need to do.

  1. Visit the Clarifai package page on RapidAPI
  2. Go to the “blocks” category and select the getTags endpoint
  3. Fill the getTags endpoint with relevant data
    • urls: Add the image URL in brackets and quotes ["http://IMAGE.jpg"]
    • model: Type in nsfw-v1.0
    • accessToken: Copy the Access Token that you got from Step 1
  4. Log in and select your backend language
  5. Click “Test” to make the call
  6. Check the response to see how probable it is that the image is NSFW
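If you’d like to see the same call as code instead of form fields, here’s a rough Python sketch. RapidAPI will generate the equivalent snippet for whatever backend language you pick; the request below goes straight to Clarifai’s tag endpoint, and the URL and parameter names are assumptions, so compare them with the snippet RapidAPI generates for you.

```python
# Hypothetical sketch of the getTags call: ask the nsfw-v1.0 model to tag an image URL.
# The endpoint URL and parameter names are assumptions -- double-check them
# against the code RapidAPI generates for your language.
import requests

ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"  # from Step 1
IMAGE_URL = "http://IMAGE.jpg"      # the image you want to check

response = requests.post(
    "https://api.clarifai.com/v1/tag/",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    data={
        "model": "nsfw-v1.0",
        "url": IMAGE_URL,
    },
)
response.raise_for_status()
print(response.json())  # tags plus a probability for "sfw" vs. "nsfw"
```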

That’s the overview, but now, for some fun. Let’s try out the Clarifai API with an example.

Example: The Questionable Lamps Photo

The Clarifai API is pretty good at distinguishing safe-for-work content from the raunchier stuff, even when it’s hard to tell at first glance.

Take this light shade photo, for instance.

[Image: the questionable lampshades photo]

Those shadows really make it look like…well, you know what it looks like. Let’s see if Clarifai’s API will be fooled!

First stop, RapidAPI! We’ll go to RapidAPI.com and find the Clarifai package page.


Next, we’ll input the data on the blocks page. On the getTags endpoint, type in the following data.

  • urls: ["http://i.imgur.com/Acgjygu.jpg"]
  • model: nsfw-v1.0
  • accessToken: Your Access Token from Step 1

Finally, log in, hit Test, and see if it works! The final test should look something like this.


What was the final verdict on the risqué lamps? Clarifai said with 97% certainty that the image is, in fact, safe for work. We couldn’t fool them this time!
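In the raw response, that verdict shows up as probabilities attached to the sfw and nsfw tags. Here’s a small sketch of how you might pull them out and apply a threshold in your own app; the JSON shape below is an assumption based on what the tag call returns, so inspect a real response and adjust the keys to match.

```python
# Hypothetical sketch: read the sfw/nsfw probabilities out of a tag response
# and decide whether to accept the image. The JSON structure is an assumption --
# adjust the keys to match what the API actually returns.
def is_safe_for_work(tag_response: dict, threshold: float = 0.85) -> bool:
    tag = tag_response["results"][0]["result"]["tag"]
    scores = dict(zip(tag["classes"], tag["probs"]))  # e.g. {"sfw": 0.97, "nsfw": 0.03}
    return scores.get("sfw", 0.0) >= threshold

# Example using the lampshade verdict from above:
sample_response = {
    "results": [{"result": {"tag": {"classes": ["sfw", "nsfw"], "probs": [0.97, 0.03]}}}]
}
print(is_safe_for_work(sample_response))  # True -- 97% sure it's safe for work
```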

Go ahead and try it out. You can either play around with Clarifai’s API on the RapidAPI website or export the code directly into your app. Let us know what you think.
