This API detects NSFW (Not Safe for Work) content in images. It uses AI to analyze each image and assign a score from 0.0 to 1.0 reflecting the likelihood that it contains explicit or sensitive content.
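
Below is a minimal sketch of how a client might call the API and act on the returned score. The endpoint URL, request parameter names, and response shape shown here are assumptions for illustration only; consult the actual API reference for the real values.

```python
# Sketch of an NSFW-detection request; endpoint, field names, and response
# shape are assumed, not confirmed by this documentation.
import requests

API_URL = "https://api.example.com/v1/nsfw/detect"  # hypothetical endpoint
API_KEY = "YOUR_API_KEY"                            # placeholder credential


def score_image(image_path: str) -> float:
    """Upload an image and return its NSFW score (0.0 = safe, 1.0 = explicit)."""
    with open(image_path, "rb") as f:
        response = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"image": f},
            timeout=30,
        )
    response.raise_for_status()
    # Assumed response body: {"score": 0.87}
    return response.json()["score"]


if __name__ == "__main__":
    score = score_image("photo.jpg")
    print(f"NSFW score: {score:.2f}")
    if score > 0.7:  # threshold chosen purely for illustration
        print("Image flagged as likely NSFW")
```

How you threshold the score is application-specific: a stricter service might flag anything above 0.5, while a more permissive one might only act above 0.9.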