Meta to Expand Child Safety Measures

In response to mounting concerns and accusations, Meta has announced plans to bolster its child safety measures across three key areas: Recommendations and Discovery; Restricting Potential Predators and Removing their Networks; and Strengthening Enforcement.

Under the umbrella of its Child Safety Task Force, Meta is taking proactive steps to fortify its child safety policies. In the first area, labelled 'Recommendations and Discovery,' the company aims to broaden its list of child safety-related terms and to use machine learning to identify new, potentially harmful connections between those terms, improving its ability to detect harmful content.

The second area, 'Restricting Potential Predators and Removing their Networks,' focuses on identifying potentially suspicious adults and preventing them from engaging with teens on Instagram and Facebook. Actions include removing Instagram Reels and Facebook Groups that violate Meta's child safety policies. Law enforcement specialists will monitor online networks for evolving behaviours and coded language, and their findings will feed into ongoing improvements to Meta's detection technology.

The third and final focus area, 'Strengthening Enforcement,' emphasizes Meta's commitment to refining its systems. The company has deployed technology designed to identify and act on child-exploitative imagery. Meta has also joined Lantern, a Tech Coalition program that lets participating companies share signals about accounts and behaviour that violate child safety policies, so each company can investigate and take action on its own platforms.

These measures are a direct response to accusations that the company has failed to adequately prevent the spread of child sexual abuse material (CSAM). A June report in The Wall Street Journal detailed how Instagram's recommendation algorithms were connecting accounts engaged in the illicit trade of CSAM. Subsequent investigations found similar problems in Facebook Groups, where accounts and groups were involved in comparable activity.

Under pressure from lawmakers in the United States, and facing an EU deadline of December 1, 2023 to provide more information on its child protection measures, Meta is taking these steps to address the serious concerns raised. CEO Mark Zuckerberg, alongside other key executives from major tech companies, is slated to testify before the Senate in January 2024 regarding online child exploitation.
