Facebook has launched an initiative to help prevent suicide attempts. It is rolling out "Proactive Detection," an artificial intelligence-based tool that identifies posts, live streams, and videos expressing suicidal thoughts and then connects the affected user with mental health resources. Guy Rosen, Vice President of Product Management at Facebook, said the company wants to build a safe community on and off Facebook. He added that the system has already identified videos that would otherwise have gone unreported. Facebook has also compiled phrases and hashtags associated with online challenges related to suicide or self-harm.
Facebook is first testing the feature in the US and, after successful tests there, plans to bring it to other countries worldwide, with the exception of the European Union. In the initial testing phase, the company scanned the text of Facebook posts and comments for references to suicide or self-harm, with review handled by a dedicated group of experts trained in self-harm issues. The AI tool also looks for comments such as "Are you OK?" and "Can I help?", since these often appear in response to suicidal posts.
The company uses AI to prioritize the order in which flagged posts and videos are reviewed, surfacing the most distressing ones first; in these cases, speed matters most. Once distress or self-harm is detected, the tool alerts Facebook's review team, and the company then suggests resources to the person or their friends. Depending on the situation, Facebook may even call local authorities to intervene.
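Facebook's actual models are proprietary and far more sophisticated, but the workflow described above, scanning text for concerning phrases and sorting flagged items so reviewers see the most urgent first, can be illustrated with a toy sketch. The phrase lists, function names, and scoring weights below are all hypothetical, invented purely for illustration:

```python
# Illustrative sketch only: not Facebook's real system or lexicon.
# Flags text containing assumed self-harm-related phrases and
# assigns a crude priority score so the most urgent items sort first.

# Hypothetical phrase lists (invented for this example).
DISTRESS_PHRASES = ["want to end it", "can't go on", "hurt myself"]
CONCERN_COMMENTS = ["are you ok", "can i help"]

def score_post(post_text, comments):
    """Return a rough priority score: higher means review sooner."""
    text = post_text.lower()
    score = 0
    for phrase in DISTRESS_PHRASES:
        if phrase in text:
            score += 2  # direct expressions of distress weigh more
    for comment in comments:
        c = comment.lower()
        if any(p in c for p in CONCERN_COMMENTS):
            score += 1  # friends reacting with concern add signal
    return score

def triage(posts):
    """Keep only flagged posts, sorted with the most urgent first."""
    scored = [(score_post(text, comments), text) for text, comments in posts]
    return sorted((item for item in scored if item[0] > 0), reverse=True)

posts = [
    ("Great day at the beach!", ["nice!"]),
    ("I just can't go on anymore", ["Are you OK?", "Can I help?"]),
]
for score, text in triage(posts):
    print(score, text)  # only the second post is flagged, with score 4
```

A real classifier would use trained machine-learning models rather than keyword matching, but the triage idea is the same: rank the queue by estimated urgency so reviewers reach the highest-risk cases fastest.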