Online Safety: Desired Takedown Obligations for Certain Content

Experts recommend imposing takedown requirements for harmful content for children.

Experts tasked with advising the federal government on drafting an online safety bill recommend imposing takedown requirements for content harmful to children, but not all agree that such requirements should apply to every form of potentially hateful posting.

Some experts have expressed concern about a requirement to remove all forms of content, with the possible exception of content that explicitly calls for child abuse and sexual exploitation, according to a summary of the discussions of the 12-member expert panel published by Ottawa on Friday.

Panel members reported concerns that marginalized groups would be particularly affected by a takedown-based approach rather than a risk-management one.

They explained that instead of being forced to take down content, services would be required to manage their risk, which they could do by taking down content, but also through other tools.

Other experts favoured broader takedown obligations, explaining that it was best to err on the side of caution. They expressed a preference for excessive removal of content rather than insufficient removal, the summary notes.

In any case, everyone agrees that the legislative framework on which the Minister of Heritage, Pablo Rodriguez, is working must include rigorous special protection measures for children.

[Experts] explained that children need stronger protections because of their inherent vulnerability, an area where existing laws governing digital spaces fall short. They stressed that online services must be required to assess and mitigate any risk their services pose to children in particular.

They also discussed the idea of user reporting mechanisms and the introduction of timelines for the removal of content specifically harmful to children.

Justin Trudeau's government has been promising action on harmful online content for several years. Shortly before the last election was called, the Liberals introduced Bill C-36, which aimed to give tools to citizens who were victims of online hate. The initiative, which promptly died on the order paper, was to be accompanied by another bill, which was ultimately never tabled before the election campaign.

However, a legislative and regulatory framework had been presented and submitted for consultation. It targeted five categories of content: hate speech, incitement to violence, terrorism, non-consensual sharing of intimate images, and the sexual exploitation of children.

Experts told the government that they consider it necessary to include disinformation in the legislation, but that finding the right way to do so is not simple.

They expressed […] extreme caution with regard to defining disinformation in legislation for a number of reasons, including because it would put the government in a position to tell right from wrong, which it simply cannot do, reads the summary of their deliberations.

The expert group completed its work last month. Minister Rodriguez is continuing public consultations throughout the summer.