Can Social Media Detect Suicide Risk?

Project Analyzes Veterans' Posts, Texts for Troubling Signs

Can real-time analysis of social media and mobile device text data help detect suicide risk among U.S. service veterans? That's what a new research collaboration involving Facebook is trying to figure out.


But some privacy experts raise concerns about whether individuals will be accurately informed about how their data will be collected, used, and safeguarded.

The effort, called the Durkheim Project, involves collecting and analyzing social media and mobile text data from veterans, who can opt in to or out of the initiative at any time, says Chris Poulin, lead investigator of the project, which is being conducted by his analytics firm, Patterns and Predictions, and the nonprofit Veterans Education and Research Association of Northern New England.

The data is analyzed in real time for suicide risk. An intervention effort also under development aims to automatically alert individuals' mental health professionals or other designated caregivers when a participant appears to be at risk for suicide so that help can be offered.

A report released by the Department of Veterans Affairs last year found that while the percentage of all suicides reported among veterans has decreased, the number of suicides nationally has increased. The VA report attributes some of the decline to outreach efforts that have been stepped up in recent years, including the VA's own suicide prevention programs, such as its crisis hotline and online chat services. However, the report states that among vets at risk, "the first four weeks following service require intensive monitoring and case management."

How Project Works

The data collected for the Durkheim Project includes participating veterans' networking profiles and updates on Facebook, Twitter and LinkedIn, as well as texts from their mobile devices, Poulin says.

"It's a search engine technology" that's being used to identify data relevant to the analysis, he says. That includes a variety of keywords as well as a collection of other factors that together could be red flags for suicide, he says. The analysis, performed by an artificial intelligence system, looks for "subtle cues" over time, including the use of certain words with increased frequency, he says. Once suicide risk is detected, the aim is to automatically alert mental health professionals, family or friends of the individual so that intervention can be offered.
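The frequency-based cue Poulin describes could, in its simplest form, look something like the sketch below. This is purely a hypothetical illustration, not the project's actual model: the risk terms, threshold and scoring are made up for demonstration.

```python
import re

# Hypothetical list of risk-associated terms; the project's actual
# lexicon and predictive models are not public, so these are stand-ins.
RISK_TERMS = {"hopeless", "alone", "burden", "goodbye"}

def risk_signal(posts, baseline_rate=0.01):
    """Flag when the frequency of risk terms across recent posts
    exceeds an assumed baseline rate."""
    words = re.findall(r"[a-z']+", " ".join(posts).lower())
    if not words:
        return False
    hits = sum(1 for w in words if w in RISK_TERMS)
    return hits / len(words) > baseline_rate

# Example: flagged-term frequency (3 of 10 words) exceeds the baseline.
posts = ["feeling hopeless and alone lately", "can't shake it, alone again"]
print(risk_signal(posts))  # prints True
```

A real system would of course weigh far more than raw word counts, tracking changes in frequency over time, as the article notes, rather than a single snapshot.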

Project researchers developed linguistic-driven suicide prediction models to analyze that social media and text data. Those prediction models are based on work by the Defense Advanced Research Projects Agency, Poulin says.

"Facebook is putting the word out to its veteran user base" that individuals can sign up for the project, Poulin says. The researchers are aiming to sign up at least 100,000 veterans to participate. A mobile app to collect vets' text data will become available later this summer, he says. As part of privacy measures in collecting data from vets' text conversations, "the identity of the other individuals [communicating with the vets] will be stripped out," he says.

The data gathered for the Durkheim Project is collected via application programming interfaces for Facebook, LinkedIn, Twitter and mobile applications, says Poulin. That data is stored on Durkheim Project servers temporarily, then moved for longer-term storage to a database behind a firewall at the Geisel School of Medicine at Dartmouth College, he says. The data will be safeguarded in compliance with HIPAA privacy and security rules, he says.

Three Stages

The project is in the third of three phases, says Poulin. The first phase involved researchers establishing the text mining methods and predictive models of suicide.

Phase two is the data collection through social media and mobile text "that's on autopilot now," he says.

Phase three is work that's under way to develop an automated intervention system that could, for instance, notify designated "buddies" of participating veterans, such as family members, friends or mental health professionals, that an individual is at risk, says Poulin.

Poulin acknowledges that many veterans who are suffering from serious mental health issues or who could be most at risk for suicide or other destructive behavior are unlikely to hear about or sign up for the Durkheim Project. However, he's hopeful that the project can help many other vets, such as those who are suddenly faced with difficult issues that might push them to suicide.

"Maybe we could help that guy who's just back from Iraq and is trying to transition back to normal life from going off the rails when suddenly things fall apart," he says. On the other hand, "some people who are most private or reclusive could be most at risk" and unwilling to participate in a project like this, let alone social media, he says.

The Durkheim Project's concept of using social media data analysis to aid suicide prevention efforts could be expanded to other populations beyond vets, Poulin says. "Teen suicide prevention is an application area we'd be happy to help out in."

Privacy Worries

While Poulin says the data that is collected will not be sold or used for purposes other than the project, and veterans can voluntarily opt in or out of the effort, not everyone is convinced that individuals' privacy will be properly protected.

Privacy advocate Deborah Peel, M.D., who is also a practicing psychoanalyst, has doubts that individuals will be "informed" enough about the use and flow of their data before giving consent to have their information collected and analyzed for this effort.

"The best result of this research would be [to collect] more data on why the U.S. needs to build a mental health system," she says, arguing that not enough resources are allocated to mental health programs. "The worst results are very dark - we get another hidden technology that spies on everyone, and secretly decides the state of their mental health, which may not be accurate, and sells that information to various hidden users.

"The designers of this research have good intent, but what they are building can be sold and used for harm," she worries.

Social media data analysis for health-related research is part of other efforts in the U.S. That includes a project under way by the Department of Homeland Security and Accenture to scan social media sites, collecting and analyzing health-related data that could help identify infectious disease outbreaks, bioterrorism or other public health and national security risks (see: Using Social Media for Biosurveillance).


About the Author

Marianne Kolbasuk McGee


Executive Editor, HealthcareInfoSecurity, ISMG

McGee is executive editor of Information Security Media Group's HealthcareInfoSecurity.com media site. She has about 30 years of IT journalism experience, with a focus on healthcare information technology issues for more than 15 years. Before joining ISMG in 2012, she was a reporter at InformationWeek magazine and news site and played a lead role in the launch of InformationWeek's healthcare IT media site.



