Meta's supercomputer, called the RSC (Research SuperCluster) (Archive photo)
In a July 2020 document titled Child Safety – State of Play, Meta listed immediate product vulnerabilities that could harm children, including difficulty in reporting disappearing videos, and confirmed that the protections offered on Facebook were not always present on Instagram.
At the time, Meta's reasoning was that it didn't want to prevent parents and older relatives on Facebook from contacting their younger family members, according to the suit. The report's author called the reasoning unconvincing and said Meta had sacrificed children's safety for a bet on growth.
However, in March 2021, Instagram announced that it was barring adults over the age of 19 from messaging minors.
Meanwhile, during an internal conversation in July 2020, an employee asked: What are we doing specifically about child grooming [something I just heard about that happens a lot on TikTok]?
Another employee responded: somewhere between zero and nothing. Child safety is not an explicit goal this half [which presumably means a six-month planning period], according to the lawsuit.
In a statement, Meta said it wants teens to have safe, age-appropriate online experiences. The company says it has spent a decade working on these issues and hiring people who have dedicated their careers to keeping young people safe and supported online.
The complaint misrepresents our work using selective citations and documents that support its position.
— Extract from the Meta press release
Instagram began restricting the ability of adults to message minors in 2021.
Nor did Instagram resolve the problem of inappropriate comments under minors' posts, the complaint states. Arturo Béjar, a former director of engineering at Meta, raised the issue before a committee. Béjar, known for his expertise in combating online harassment, recounted his own daughter's disturbing experiences on Instagram.
I stand before you today as a father with first-hand experience of a child receiving unwanted sexual advances on Instagram, he told a panel of U.S. senators in November. She and her friends began having horrible experiences, including repeated unwanted sexual advances and harassment.
A March 2021 child safety presentation noted that Meta is not sufficiently invested in addressing the sexualization of minors on [Instagram], including sexualized comments on content posted by minors. Not only is this a terrible experience for creators and internet users, but it is also a vehicle for bad people to identify and connect with each other.
Mark Zuckerberg, CEO of Meta (Archive photo)
Meta, based in Menlo Park, California, has updated its protections and tools for younger users, although critics say it hasn't done enough. Last week, the company announced that it would begin hiding inappropriate content from teen accounts on Instagram and Facebook, including posts about suicide, self-harm and eating disorders. New Mexico's complaint follows a lawsuit filed in October by 33 states that claim Meta harms young people and contributes to their mental health problems by knowingly and deliberately designing features on Instagram and Facebook that get children addicted to its platforms.
For years, Meta employees tried to sound the alarm about how decisions made by Meta executives exposed children to dangerous solicitation and sexual exploitation, Raúl Torrez declared in a press release.
Even as the company continues to minimize the illegal and harmful activities children are exposed to on its platforms, internal data and Meta presentations show the problem is serious and pervasive.
— Raúl Torrez, Attorney General of New Mexico
Meta founder Mark Zuckerberg, along with executives from Snap, Discord, TikTok and X, is expected to testify before a US Senate committee on child safety in late January.