Meta’s Deeper-Than-Brand Problem Continues to Raise Researchers’ Concerns

Meta’s Facebook Inc. knows, and has admitted, that its platforms were used to incite violence in Myanmar. Facebook has further admitted that it wrongfully removed and suppressed content posted by Palestinians and their supporters, including content about human rights abuses carried out by Israel against Palestinians during the May 2021 aggression on Gaza.


There is damning evidence that Facebook has a two-tier justice system, that it knew Instagram was worsening body-image issues among girls, and that it had a bigger vaccine misinformation problem than it let on, among other issues.

The company’s acknowledgment of errors and its attempts to correct some of them are insufficient: they do not address the extent and scope of the reported content restrictions, nor do they adequately explain why those restrictions occurred in the first place. Facebook’s own researchers have identified the platform’s adverse effects, yet despite congressional hearings, its own pledges, and numerous press briefings, the company has not corrected them. The documents offer perhaps the clearest picture to date of how widely Facebook’s issues were known within the company, right up to the CEO himself.

Facebook’s Strategic Moves

To mitigate its troubles, Facebook has recently made, among other things, two strategic moves:

  • Rebranded itself as Meta, a “metaverse” company. The metaverse is a term typically used to describe a future iteration of the internet, made up of persistent, shared, 3D virtual spaces linked into a perceived virtual universe.
  • Shut down its facial recognition software. This means that people who opted in to the service will no longer be automatically recognized in photos and videos, and the company will delete more than a billion people’s individual facial recognition templates. The promised change will also affect the image descriptions generated for blind and visually impaired people: after the change, descriptions will no longer include the names of people recognized in photos, but will otherwise function normally.

Scholars’ Letter to Facebook

A coalition of scholars from around the world, along with a number of influential signatories, has joined forces to send an open letter to Meta CEO Mark Zuckerberg. The group calls for Meta to finally do its part in understanding the mental health of children and adolescents, and offers three concrete steps Meta must take if it is indeed serious about the mental health of its users.

The letter outlines three key points:

Commit to gold-standard transparency on child and adolescent mental health research

The foundation of modern science is best captured by the Royal Society’s motto, Nullius in verba, Latin for ‘take nobody’s word for it’. This principle applies equally to independent scientists and to those who work for Meta. Science only works when the methods, analysis pipeline, and data for a given research project are open to independent verification. The studies we have seen in recent weeks fall well short of this basic standard.

Contribute to independent research on child and adolescent mental health around the globe

Large-scale studies in dozens of countries track cohorts of young people through every phase of life, using genetic, social, psychological, nutritional, educational, and economic data to understand human development. As the lines between online and offline blur, these sources of information increasingly fail to capture the full determinants of mental health. Meta’s platforms capture a wide swath of behaviours that are critical to advancing scientific understanding of child and adolescent mental health in general, and effects linked to Meta platforms in particular.

Establish an independent oversight trust for child and adolescent mental health on Meta platforms

The time is right for a new global trust dedicated to credible, independent, and rigorous oversight of the mental health implications of Meta’s platforms. Expanding on the Facebook Oversight Board model, the trust would conduct independent scientific oversight rather than issue quasi-judicial rulings.

What can Facebook do?

It’s possible that there is very little Facebook can do. However, some advocates are calling for one of three options:

  1. Dismantling of sensitive functions and deletion of sensitive data, including facial recognition data.
  2. Government regulation or self-regulation by internet platforms.
  3. Strong enforcement by consumer protection agencies.

What do you think?
Let us know by tweeting your response to the article.
