Facebook testing a tool for non-consensual intimate images; Alert if someone is impersonating you

Facebook is testing a feature that will send you an alert when it suspects another account is impersonating you by using your name and profile photo.

Facebook will then reach out and ask you to confirm whether the account is actually impersonating you, or to flag it as a troll.

The feature is currently live for 75% of users across the globe.

If successful, this tool may reduce the number of impersonation complaints and harassment reports Facebook receives from users on its platform.

"We heard feedback prior to the roundtables and also at the roundtables that this was a point of concern for women," Davis told Mashable. "And it's a real point of concern for some women in certain regions of the world where it [impersonation] may have certain cultural or social ramifications."

Facebook is also testing two other safety features as a result of the talks: new ways of reporting non-consensual intimate images and a photo checkup feature. Facebook has explicitly banned the sharing of non-consensual intimate images since 2012, but the feature it's currently testing is meant to make the reporting experience more compassionate for victims of abuse, Davis said.

Under the test, when someone reports nudity on Facebook they'll have the additional option of not only reporting the photo as inappropriate, but also identifying themselves as the subject of the photo. Doing so will surface links to outside resources — like support groups for victims of abuse as well as information about possible legal options — in addition to triggering the review process that happens when nudity is reported.

Davis said initial testing of these reporting processes has gone well, but Facebook is still gathering feedback and conducting research before rolling them out more broadly.