A leading psychologist who advises Meta on suicide prevention and self-harm has quit her role, accusing the tech giant of “turning a blind eye” to harmful content on Instagram, repeatedly ignoring expert advice and prioritising profit over lives.
Lotte Rubæk, who has been on Meta’s global expert group for more than three years, told the Observer that the tech giant’s ongoing failure to remove images of self-harm from its platforms is “triggering” vulnerable young women and girls to further harm themselves and contributing to rising suicide figures.
Such is her disillusionment with the company and its apparent unwillingness to change that the Danish psychologist has resigned from the group, claiming Meta does not care about its users’ wellbeing and safety. In reality, she said, the company is using harmful content to keep vulnerable young people hooked to their screens in the interest of profit.
In her resignation letter, she wrote: “I can no longer be part of Meta’s SSI [suicide and self-injury] expert panel, as I no longer believe that our voice has a real positive impact on the safety of children and young people on your platforms.”
In an interview with the Observer, Rubæk said: “On the surface it seems like they care, they have these expert groups and so on, but behind the scenes there’s another agenda that is a higher priority for them.”
That agenda, she said, was “how to keep their users’ interaction and earn their money by keeping them in this tight grip on the screen, collecting data from them, selling the data and so on.”
A Meta spokesperson said: “Suicide and self-harm are complex issues and we take them incredibly seriously. We’ve consulted with safety experts, including those in our suicide and self-harm advisory group, for many years and their feedback has helped us continue to make significant progress in this space.
“Most recently we announced we’ll hide content that discusses suicide and self-harm from teens, even if shared by someone they follow, one of many updates we’ve made after thoughtful discussion with our advisers.”
Rubæk’s warning comes as new research by Ofcom published last week found that violent online content is “unavoidable” for children in the UK, many of whom are first exposed when still in primary school. Among the main apps mentioned by those interviewed was Instagram.
Rubæk, who leads the self-injury team in child and adolescent psychiatry in the Capital Region of Denmark, was first approached about joining the select group of experts – which has 24 publicly listed members – in December 2020. The invitation came after she publicly criticised Meta, then known as Facebook, over an Instagram network linked to the suicides of young women in Norway and Denmark, following a documentary by the Danish broadcaster DR.
She agreed to join in the hope of helping to change the platform to make it safer for young people. After a couple of years of having her suggestions ignored – the original network she was critical of still exists – she came to the conclusion that the panel was just for show.
Now she believes the invitation could have been an attempt to silence her. “Maybe they wanted me to be a part of them so I wouldn’t be so critical of them in the future.”
In emails seen by the Observer, Rubæk raised with Meta in October 2021 the difficulties users faced in trying to report potentially triggering images. In correspondence with Martin Ruby, Meta’s head of public policy in the Nordics, she said she had tried to report an image of an emaciated female but received a message from Instagram saying they did not have enough moderators to look at the image, which stayed on the platform.
In response, Ruby said in November 2021: “Our people are looking at it, but it is not that simple.” In the same email, he mentioned the secret Instagram network that Rubæk had originally criticised, saying that Meta was “taking a closer look”.
But despite its well-documented links to suicides, Rubæk says the network remains up and running today.
Rubæk’s patients tell her they have tried to report self-harm images on Instagram, but the images often remain. One patient said that after she reported an image it vanished, but she later saw it via a friend’s account, suggesting it had only been hidden from her view. Meta “does a lot of tricks” to get around removing content, said Rubæk.
“The AI is so clever, finding even the smallest nipple in a photo.” But when it comes to graphic pictures of self-harm that are proven to inspire others to harm themselves, she added, it appears to be a different story.