In a few weeks, Instagram will begin testing new tools to combat "sextortion", a form of online blackmail involving intimate images.
One of the tools, "nudity protection", blurs nude photos sent in direct messages. It will be enabled by default for users under 18.
Pop-ups directing possible victims to support resources will also be trialled.
Governments everywhere have issued warnings about the growing harm that sextortion poses to youth.
Typically, the victim is sent a nude photo and asked to supply one of their own in return. Once the victim complies, they are told the picture will be made public unless the blackmailer's demands are met.
On Wednesday, two Nigerians pleaded guilty to sexually extorting teenage boys and young men in the US, including one who took his own life.
The crime has also recently been linked to the suicide of an Australian teenager.
Horrifying incidents of pedophiles abusing and coercing youngsters through sextortion have also been reported.
The nudity protection feature, first announced in January, uses artificial intelligence (AI) running entirely on the user's smartphone to detect nude photographs in direct messages, giving users the choice of whether or not to view them.
It was created "not only to protect people from seeing unwanted nudity in their DMs, but also to protect them from scammers who may send nude images to trick people into sending their own images in return," according to Instagram.
This article was originally published on the BBC.