At times, I can be an extremely harsh critic of Facebook.  The whole idea of the platform can seem negative.  Perhaps that's just how the internet is these days, but there is something about Facebook that seems to draw in negativity.  In this post, though, I am going to grant them amnesty.  I can't guarantee that I will extend this to other posts about Facebook, but let's see how it goes.  Facebook is going to launch a new "proactive detection" feature, which is essentially artificial intelligence scanning posts for patterns of suicidal thoughts.  If necessary, Facebook will be able to automatically send mental health resources to the at-risk user or their friends.  The extremely interesting part, and the one I want to highlight, is that Facebook can also contact local first responders.

This is proactive because Facebook is using AI to flag worrisome posts to a human moderator, instead of waiting for another person to flag them in the first place.  That means Facebook can decrease how long it takes to send help.  I always say that I love when technology can be used to benefit people in real and meaningful ways, and this is an example of that.  I love technology in general and love how it can enhance our lives, but I think it should also play a role (where possible) in genuinely helping people.
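To make that concrete, here is a rough sketch of what "proactive" means in practice.  This is entirely my own illustration, not Facebook's actual system; the scoring function, threshold, and review queue are all hypothetical stand-ins for whatever trained model they really use.

```python
# Toy illustration of "proactive detection": a classifier scores every post
# and escalates risky ones to a human moderator automatically, rather than
# waiting for a friend to file a report.  All values here are assumptions.

RISK_THRESHOLD = 0.85  # assumed cutoff for escalating to a human moderator

def risk_score(post_text: str) -> float:
    """Stand-in for a trained classifier that scores a post from 0.0 to 1.0."""
    warning_phrases = ("i can't go on", "goodbye forever", "end it all")
    hits = sum(phrase in post_text.lower() for phrase in warning_phrases)
    return min(1.0, hits / len(warning_phrases) + (0.5 if hits else 0.0))

def scan_post(post_text: str, review_queue: list) -> None:
    """Flag risky posts automatically instead of waiting for a user report."""
    score = risk_score(post_text)
    if score >= RISK_THRESHOLD:
        review_queue.append({"post": post_text, "score": score})

review_queue: list = []
scan_post("Had a great day at the beach!", review_queue)   # not flagged
scan_post("goodbye forever, i can't go on", review_queue)  # flagged
print(review_queue)  # reaches a moderator without anyone reporting it
```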


Facebook had previously tested AI to detect troubling posts, and more prominently it has surfaced suicide reporting options to friends, but only in the United States.  Now Facebook will scour all types of content around the world, except in the EU, where the General Data Protection Regulation prevents this kind of technology from profiling users based on sensitive information.  They do have a point in the EU, but I guess this is a question of whether this kind of invasion of privacy is better than the alternative.  You could also make the argument that the ends don't justify the means, which makes this kind of technology a moral conundrum.

Facebook is going to use AI to prioritize particularly risky or urgent user reports so they're more quickly addressed by moderators.  It is also dedicating more moderators to suicide prevention, training them to handle cases 24/7, and it now has 80 local partners, like save.org, the National Suicide Prevention Lifeline, and Forefront, through which it can provide resources to at-risk users and their networks.
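Purely as a guess at how that prioritization might work (Facebook hasn't published its implementation), a simple priority queue would let the most urgent reports jump to the front of the moderation line:

```python
# Hypothetical sketch of report prioritization; the urgency scores and
# report text are made up for illustration.
import heapq

class ReportQueue:
    """Moderators always pop the most urgent report first."""

    def __init__(self) -> None:
        self._heap = []
        self._counter = 0  # tie-breaker so equal urgencies keep arrival order

    def add(self, report: str, urgency: float) -> None:
        # heapq is a min-heap, so negate urgency to pop the highest first
        heapq.heappush(self._heap, (-urgency, self._counter, report))
        self._counter += 1

    def next_report(self) -> str:
        return heapq.heappop(self._heap)[2]

queue = ReportQueue()
queue.add("friend reported a worrying status update", urgency=0.40)
queue.add("live video flagged by the classifier", urgency=0.95)
print(queue.next_report())  # the live video reaches a moderator first
```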


The purpose behind the AI is that Facebook believes it will shave minutes off every step of the process.  In Facebook Live situations, there have been cases where a first responder arrived while the person was still live streaming their pain and agony, which is why I think this is so incredibly amazing.  But there are some people who feel like this is an issue.  I can see their point as we hover incredibly close to a dystopian society, but again, what's worse?  I mean, we are already basically living in a dystopian society; how would this make it worse?

I think that people struggling with their mental health need help.  Attempting self-harm or suicide can be a cry for help.  So rather than letting these people go through with such a senseless act, why can't we allow Facebook into our homes?  If you're really that concerned about it, perhaps you should disconnect your Facebook account.  Again, I think the good outweighs the bad, and we should be looking at it from that perspective instead of ragging on Facebook for invading our privacy.  (Cherish that, as you may never hear those words from me again.)
