From PC Mag: Meta is experimenting with a new feature that uses machine learning, an application of AI, to detect images with nudity sent via direct message (DM) on Instagram. The offending images will then be blurred out and marked with the label "Photo may contain nudity," according to a Meta post published Thursday.
Meta users will also see pop-up warnings when sending images that contain nudity. Instagram plans to remind senders that unsending a photo doesn't guarantee the recipient hasn't already seen it, and that any image sent could be screenshotted or forwarded to others.
Instagram users who try to forward or reshare nudity-flagged images within the app will also get a pop-up warning asking them to "be responsible" and "be respectful" before forwarding the sensitive images. It will still allow them to share the images, though.
When Instagram users receive photos believed to contain nudity, the app will blur out the image and tell users via a pop-up message: "Don't feel pressured to respond." The pop-up also reminds users that they can block the person they're DM-ing, and tells users they can stop any conversations that make them "uncomfortable."