How the death of a British teenager changed social media

Instagram also hides search terms – but only if the term or phrase itself promotes or encourages self-harm, says Tara Hopkins, director of EMEA public policy at Instagram. “For other search terms related to suicide/self-harm that are not inherently offensive, we display a support message before displaying any results.” The company declined to share how many search terms were blocked.

Instagram’s parent company Meta says it is trying to balance concerns about children’s safety with young people’s freedom of expression. The company admitted that two posts Molly saw, which were shown to the court, would have violated Instagram’s policies at the time. But Elizabeth Lagone, head of health and wellbeing at Meta, told the inquest last week that it was “important to give people that voice” if they are struggling with suicidal thoughts. When the Russell family’s lawyer, Oliver Sanders, asked Lagone whether she agreed that the content Molly saw, and which the court had reviewed, was “not safe,” Lagone replied, “I think it’s safe for people to be able to express themselves.”

These comments embody what researchers say are major differences between the two platforms. “Pinterest is much more concerned with being decisive, being clear, and de-platforming content that doesn’t meet its standards,” says Samuel Woolley, program director of the Propaganda Research Lab at the University of Texas at Austin. “Instagram and Facebook … tend to be much more wary of anything that could be seen as running afoul of free speech.”

Pinterest hasn’t always worked this way. Hoffman told the inquest that Pinterest’s guidance used to be “when in doubt, lean toward … lighter content moderation.” But Molly’s death in 2017 coincided with the fallout from the 2016 US presidential election, when Pinterest was implicated in spreading Russian propaganda. Around that time, the platform started banning entire topics that didn’t fit its mission, such as vaccines and conspiracy theories.

That approach stands in stark contrast to Instagram’s. “Meta’s platforms, including Instagram, are governed by the dictum that they want to exist as infrastructural information tools, [like] the phone or [telecoms company] AT&T, rather than as a social media company,” says Woolley. Facebook shouldn’t be the “arbiter of truth,” Meta founder and CEO Mark Zuckerberg argued in 2020.

The inquest also revealed differences in how transparent the two platforms were willing to be. “Pinterest provided helpful material about Molly’s activities on Pinterest early on, including not only Pins that Molly had saved, but also Pins that she [clicked] on and rolled over,” says Varney. Meta never provided the court with that level of detail, and much of the information the company did share has been redacted, she adds. For example, the company revealed that in the six months before her death, Molly was recommended 30 accounts with names that referred to sad or depressing themes. But the actual names of those accounts were redacted, with the platform citing user privacy.

Varney acknowledges that both platforms have made improvements since 2017. Pinterest’s results for self-harm search terms no longer contain the level of graphic content they did five years ago, she says. But Instagram’s changes have been too little, too late, she argues, noting that Meta didn’t ban graphic images of self-harm and suicide until 2019.
