Tech firms "need to stop monetising misery", the father of Molly Russell says as her inquest ends.
Coroner Andrew Walker said the images of self-harm and suicide Molly viewed "shouldn't have been available for a child to see". A member of his staff had to leave the room while they were viewed. His narrative conclusion continued: "Molly subscribed to a number of online sites. She died from an act of self-harm while suffering from depression and the negative effects of online content." The coroner found that Molly also reached out to celebrities on Twitter with pleas for support, not realising they were unlikely to notice her messages, let alone reply, while other content sought to isolate and discourage discussion with those who may have been able to help. "These binge periods are likely to have had a negative effect on Molly." The coroner will compile a report outlining his concerns; he is to write to Meta - the owner of Instagram - and Pinterest, as well as the government and Ofcom. Secretary of State for Digital, Culture, Media and Sport Michelle Donelan said in a statement: "The inquest has shown the horrific failure of social media platforms to put the welfare of children first. Our Online Safety Bill is the answer and through it we will use the full force of the law to make social media firms protect young people from horrendous pro-suicide material." Following the conclusion of the inquest, a spokeswoman for Meta said the company was "committed to ensuring that Instagram is a positive experience for everyone, particularly teenagers" and that it would "carefully consider the coroner's full report when he provides it".
Fourteen-year-old died from self-harm, and web material probably contributed to death 'in more than minimal way', inquest finds.
Molly’s father ended a press conference after the hearing with a tribute to his daughter: “And if you’re struggling, please speak to someone you trust or one of the many wonderful support organisations rather than engage with online content that may be harmful.” Asked if he had a message for Mark Zuckerberg, the founder and chief executive of Meta, he said: “Listen to people who use his platform.” He said Meta had to change its “toxic corporate culture”, and he called for the government to implement its long-delayed online safety bill.

The coroner said some of the content “romanticised” acts of self-harm and sought to discourage users from seeking professional help. The inquest heard that Molly compiled a digital pinboard on Pinterest with 469 images related to similar subjects. Elizabeth Lagone, the head of health and wellbeing policy at Meta, had described one batch of depression, suicide and self-harm content seen by Molly before her death as “safe” because it adhered to content guidelines at the time.

Sir Peter Wanless, the NSPCC chief executive, said the ruling would “send shock waves through Silicon Valley”. “For the first time globally, it has been ruled that content a child was allowed and encouraged to see by tech companies contributed to their death,” said Andy Burrows, the charity’s head of child safety online policy. The culture secretary said the online safety bill, which will require tech platforms to protect children from harmful content, “is the answer”.
A senior coroner has concluded schoolgirl Molly Russell died after suffering from “negative effects of online content”. Coroner Andrew Walker said online ...
Concluding it would not be “safe” to rule Molly’s cause of death was suicide, Mr Walker said the teenager instead “died from an act of self-harm whilst suffering from depression and the negative effects of online content”. The inquest heard Molly accessed material from the “ghetto of the online world” before her death in November 2017, with her family arguing sites such as Pinterest and Instagram recommended accounts or posts that “promoted” suicide and self-harm.
On Friday the senior coroner at North London coroner's court ruled at the end of a two-week hearing that Molly had died from an act of self-harm while suffering ...
Molly, 14, from Harrow, north-west London, had killed herself after falling, unbeknown to her family, into a vortex of despair on social media. Ian Russell emphasised the happier side of Molly’s life as he paid an emotional tribute to her at the start of the inquest at North London coroner’s court, talking of a “positive, happy, bright young lady who was indeed destined to do good”. But Russell said the family had noticed a change in Molly’s behaviour in the last 12 months of her life. In September 2017 Russell told his daughter the family was concerned about her, but she described her behaviour as “just a phase I’m going through”.

The court also heard Molly had a Twitter account that she used to contact Salice Rose, an influencer who has discussed her experience of depression online, in an attempt to gain help. Much of the material Molly saved related to anxiety and depression, while it emerged that Pinterest had sent content recommendation emails to Molly with titles such as “10 depression pins you might like”. Some videos contained scenes drawn from film and TV, including 13 Reasons Why, a US drama about a teenager’s suicide that contained episodes rated 15 or 18 in the UK. Some content, such as the video clips, was repeated more than once in court, giving those present an idea of how Ian Russell felt when he said the “relentless” nature of the content “had a profound adverse impact on my mental health”.

Meta’s Elizabeth Lagone defended the suitability of some of the posts, saying they were “safe” for children to see because they represented an attempt to raise awareness of a user’s mental state and share their feelings. Oliver Sanders KC, representing the Russell family, said “this is Instagram literally giving Molly ideas”. He questioned how posts containing slogans like “I don’t want to do this any more” could be appropriate for a 14-year-old to view. Raising his voice at one point, he said Instagram was choosing to put content “in the bedrooms of depressed children”, adding: “You have no right to.”
Content on social media sites, including Instagram and Pinterest, is “likely” to have contributed to the death of British teenager Molly Russell, ...
Ian Russell's campaigning after his daughter's death has made case for online safety bill unavoidable, says peer.
The online safety bill places a duty of care on tech companies to shield children from harmful content and systems. Ofcom, the communications watchdog, will vet the companies’ proposals for doing so and monitor their adherence to them. The bill’s progress through parliament has been paused but it is expected to resume in late October with the child safety provisions staying intact, if not strengthened. “The Russell family have made an unavoidable case for the online safety bill,” says Beeban Kidron, a crossbench peer who sat on the joint parliamentary committee that scrutinised the bill. “What they did then is now a contravention of the code.” Kidron paid tribute to Russell, a 59-year-old TV director who has become an important voice on internet safety.
The coroner said online material viewed by the 14-year-old on sites such as Instagram and Pinterest 'shouldn't have been available for a child to see' ...
The coroner said some of the content Molly viewed was “particularly graphic” and “normalised her condition”, focusing on a “limited” view without any counter-balance. “At the time that these sites were viewed by Molly, some of these sites were not safe as they allowed access to adult content that should not have been available for a 14-year-old child to see.” The inquest also heard details of emails sent to Molly by Pinterest, with headings such as “10 depression pins you might like” and “new ideas for you in depression”. Molly’s father said: “Sadly, there are too many others similarly affected right now. Please do what you can to live long and stay strong. I hope this will be an important step in bringing about change.”
Concluding it would not be “safe” to rule Molly's cause of death as suicide, Mr Walker said the teenager “died from an act of self-harm while suffering ...
Concluding it would not be “safe” to rule Molly’s cause of death as suicide, Mr Walker said the teenager “died from an act of self-harm while suffering depression and the negative effects of online content”. At North London Coroner’s Court on Friday, he said: “At the time that these sites were viewed by Molly, some of these sites were not safe as they allowed access to adult content that should not have been available for a 14-year-old child to see. The platform operated in such a way using algorithms as to result, in some circumstances, in binge periods of images, video clips and text – some of which were selected and provided without Molly requesting them. Molly Rose Russell died from an act of self-harm whilst suffering from depression and the negative effects of online content.”
Other witnesses included child psychiatrist Dr Navin Venugopal, Molly's headteacher Sue Maguire and deputy headteacher Rebecca Cozens.
– What was said about Molly Russell’s activity on Twitter?
– What was said about Molly Russell’s activity on Pinterest?
– What was said about Molly Russell’s activity on Instagram?
William spoke out after a coroner ruled social media content contributed to the death of the 14-year-old.
He spoke out after a coroner ruled social media contributed to the death of the 14-year-old. “They have been so incredibly brave,” the prince said of Molly’s family. “Online safety for our children and young people needs to be a prerequisite, not an afterthought.” Andy Burrows, head of child safety online policy at the children’s charity the NSPCC, said it was the “first time globally it has been ruled that content a child was allowed and encouraged to see by tech companies contributed to their death”.
Baroness Beeban Kidron said she will table a change to the legislation in the House of Lords after the coroner's conclusion on Friday.
He said his message to Instagram – and Facebook – boss Mark Zuckerberg would be: “Just to listen. Andy Burrows, head of child safety online policy at the NSPCC, said: “This is social media’s big tobacco moment. For the first time globally, it has been ruled content a child was allowed and even encouraged to see by tech companies contributed to their death.
The Prince of Wales says online safety for young people should be "a prerequisite, not an afterthought".
Molly took her own life in 2017, and coroner Andrew Walker said the images of self-harm and suicide she viewed online "shouldn't have been available for a child to see". A member of his staff had to leave the room while they were viewed. The inquest was only shown a small sample of the thousands of images that algorithms served up to Molly - dark, miserable and depressing, of nooses, pills and razor blades. The content she then kept watching more and more of was likely to have had a negative effect on the teenager, and "contributed to her death in a more than minimal way", Mr Walker said. His narrative conclusion continued: "Molly subscribed to a number of online sites. She died from an act of self-harm while suffering from depression and the negative effects of online content." At the conclusion of the hearing, the coroner said he would compile a report outlining his concerns. Pinterest executive Judson Hoffman told the court that the platform "should be safe for everyone", and accepted that "there was content that should have been removed that was not removed" when Molly was using it. A spokeswoman for Meta said the company was "committed to ensuring that Instagram is a positive experience for everyone, particularly teenagers" and that it would "carefully consider the coroner's full report when he provides it". The prince said: "No parent should ever have to endure what Ian Russell and his family have been through." And at a later news conference Mr Russell was on the edge of tears as he concluded his remarks by paying tribute to Molly - thanking her for being his daughter.
Prince of Wales says 'no parent should ever have to endure' what her family went through.
The Prince of Wales has said online safety for children “needs to be a prerequisite” after a coroner ruled social media contributed to the death of Molly Russell. Concluding it would not be “safe” to rule Molly’s cause of death as suicide, Mr Walker said the teenager “died from an act of self-harm while suffering depression and the negative effects of online content”.
These companies make decisions that harm children. The government must take action, say the NSPCC's Sir Peter Wanless and 5 Rights' Lady Beeban Kidron.
The senior coroner, Andrew Walker, concluded that 14-year-old Molly Russell “died from an act of self-harm while suffering from depression and the negative effects of online content”. Thanks to the incredible courage, bravery and determination of Molly’s father, Ian, and her family, they finally have answers to what caused the death of the daughter and sister they adored. Answers that they had to fight for against the inscrutable indifference of tech companies. These young people are the collateral damage of a “move fast and break things” culture in the tech industry, where tragedy is met with a wilful corporate blindness.

It’s also been five years that we have been waiting for new online safety laws. The online safety bill needs to be a robust piece of legislation that sets out children’s rights and needs in enforceable codes of practice, which ensure child safety is not optional but simply a price of doing business – just like any other sector. Basic product safety is the aim. It is time to ensure that child safety is at the top of the corporate inbox, ahead of profit or indeed any other consideration. The world is watching.

Lady Beeban Kidron is the founder and chair of 5 Rights