A series of investigative reports published by The Wall Street Journal puts the spotlight on Facebook's inner workings. Covering everything from rule exemptions for high-profile users to Instagram's effect on teenagers' mental health, the Journal's "Facebook Files" reveal internal Facebook research that appears to show how aware the company was of the platform's ill effects.
The Journal's reports, three so far, are based on a review of internal Facebook documents, including research reports, online employee discussions and draft presentations to senior management. They allege that the company failed to address many issues it has long known about, and in some cases made them worse.
"Time and again, the documents show, Facebook's researchers have identified the platform's ill effects. Time and again, despite congressional hearings, its own pledges and numerous media exposés, the company didn't fix them," the Journal reports. "The documents offer perhaps the clearest picture thus far of how broadly Facebook's problems are known inside the company, up to the chief executive himself."
Here's what you need to know about the Journal's ongoing revelations about Facebook.
The XCheck program
Thanks to an internal program known as "cross check" or "XCheck," high-profile Facebook users are exempt from some or all of the platform's rules, according to the debut installment of the "Facebook Files."
Published Monday, the Journal's initial report examines how the program, originally intended as a quality-control measure for "elite" users such as celebrities, politicians and journalists, has instead allowed those users to avoid moderation. According to the report, protected users are either "whitelisted," meaning they are not subject to Facebook enforcement actions at all, or allowed to post rule-violating content pending further review by Facebook employees. An internal document from 2019 reviewed by the Journal indicated that less than 10% of the posts flagged to XCheck were actually reviewed.
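The two-tier enforcement flow the report describes can be sketched as a toy decision function. Everything here (the names, the sets, the return labels) is invented for illustration; the Journal's reporting describes the behavior, not Facebook's actual code or data structures.

```python
# Toy sketch of the two-tier enforcement flow attributed to XCheck.
# All identifiers and example users are hypothetical.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    violates_rules: bool

WHITELISTED = {"celebrity_a"}        # hypothetical: exempt from enforcement entirely
XCHECK_PROTECTED = {"politician_b"}  # hypothetical: violations held for employee review

def moderate(post: Post) -> str:
    if not post.violates_rules:
        return "allowed"
    if post.author in WHITELISTED:
        return "allowed"          # no enforcement action at all
    if post.author in XCHECK_PROTECTED:
        return "pending_review"   # content stays up until an employee reviews it
    return "removed"              # ordinary users face immediate enforcement

# A violating post by an ordinary user is removed; the same post by a
# whitelisted user stays up.
assert moderate(Post("ordinary_user", True)) == "removed"
assert moderate(Post("celebrity_a", True)) == "allowed"
assert moderate(Post("politician_b", True)) == "pending_review"
```

The point of the sketch is the asymmetry: identical content is handled differently depending solely on who posted it, which is the behavior the internal audit criticized.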
The program reportedly shielded at least 5.8 million people as of 2020, including former President Donald Trump, Donald Trump Jr., Senator Elizabeth Warren and Candace Owens, and has allowed misinformation, harassment, incitements to violence and revenge porn to remain on the platform.
"We are not actually doing what we say we do publicly," reads a 2019 internal review cited by the Journal. "Unlike the rest of our community, these people can violate our standards without any consequences."
Facebook spokesman Andy Stone told the Journal that criticism of XCheck is fair, and that Facebook has been working to address the issues with the program, which is intended to "apply policies to content that may require more understanding."
Instagram's toll on teens' mental health
Facebook, which owns Instagram, has known for several years that the photo-sharing platform is harmful to the mental health of a sizable share of its young users, especially teenage girls, according to a Journal report published Tuesday.
Citing internal studies conducted by Facebook over the past three years, the report details how the company downplayed its own research on Instagram's negative effects, ranging from eating disorders to depression to suicidal thoughts, on the millions of young people who make up almost half of its user base. An internal Facebook presentation reviewed by the Journal showed that among teens who reported suicidal thoughts, 13% of U.K. users and 6% of U.S. users traced the issue to the photo-sharing app.
The report says Facebook's own research showed that teenage girls are particularly susceptible to Instagram's mental toll, as the platform's highly filtered and edited photos promote unrealistic and often unattainable body standards. In a 2019 presentation, researchers said Instagram worsens body-image issues for one in three teenage girls. A presentation from the following March, reviewed by the Journal, reported that 32% of teenage girls who felt bad about their bodies felt worse when they looked at Instagram.
"Comparisons on Instagram can change how young women view and describe themselves," one slide reads.
Teenage boys are not immune to the psychological effects of using Instagram either. Facebook's research reportedly showed that 40% of teenage boys experience negative social comparison when using Instagram, while 14% of boys in the U.S. said Instagram made them feel worse about themselves.
Despite these findings, Facebook never made its research public or available to academics or lawmakers who requested it, the Journal reported. Researchers also found that some of the features central to Instagram's success and addictive appeal, such as the Explore page, which serves users curated content based on their interests, are among the most harmful to young people. "Aspects of Instagram exacerbate each other to create a perfect storm," the research reportedly said.
In response to a request for comment on the Journal's report, an Instagram spokesperson pointed TIME to a blog post published Tuesday, which says that while the story "focuses on a limited set of findings and casts them in a negative light," the company stands by the research, as it "demonstrates our commitment to understanding complex and difficult issues young people may struggle with, and informs all the work we do to help those experiencing these issues."
The blog post goes on to say that Instagram's research mirrors external findings that "the effects of social media on people's well-being are mixed" and that "[s]ocial media isn't inherently good or bad for people. Many find it helpful one day, and problematic the next. What seems to matter most is how people use social media, and their state of mind when they use it."
The Journal's report comes on the heels of Facebook confirming its intention to continue developing a version of Instagram for children under 13, despite pressure from lawmakers to abandon the plan.
Asked about the link between children's declining mental health and social media platforms at a congressional hearing in May, Facebook CEO Mark Zuckerberg said the research was not "conclusive."
An algorithm that rewards divisive content
When Facebook overhauled its News Feed algorithm in 2018 to promote "meaningful social interactions," the change instead produced a system that amplifies divisive content on the platform, the Journal reported Wednesday.
While Zuckerberg said the goal of the change was to strengthen bonds between users and improve their well-being, Facebook staffers reportedly warned that the algorithm's heavy weighting of reshared material was pushing polarizing and false content, which provoked outrage and therefore drew more comments and reactions, to the top of users' news feeds.
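The dynamic staffers described can be sketched as a toy scoring function. The signal names and weights below are invented for illustration; the reporting says only that reshares and reactions were weighted heavily, not what formula Facebook actually used.

```python
# Toy illustration of engagement-weighted feed ranking.
# Weights are hypothetical, not Facebook's actual values.

def rank_score(likes: int, comments: int, reshares: int) -> float:
    """Score a post by weighted engagement; reshares dominate."""
    return 1.0 * likes + 15.0 * comments + 30.0 * reshares

# Two hypothetical posts with the same total number of interactions (100):
calm_post = rank_score(likes=90, comments=8, reshares=2)       # -> 270.0
outrage_post = rank_score(likes=20, comments=30, reshares=50)  # -> 1970.0

# The heavily reshared post ranks roughly 7x higher, showing how a
# reshare-weighted objective can surface provocative content even when
# raw engagement volume is identical.
assert outrage_post > calm_post
```

The design tension the staffers flagged falls out directly: if reshares are the cheapest signal for provocative content to earn, optimizing this score optimizes for provocation.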
"Misinformation, toxicity and violent content are inordinately prevalent among reshares," researchers said in an internal memo reviewed by the Journal.
Zuckerberg also reportedly resisted some proposed fixes, such as eliminating a boost the algorithm gave to content likely to be reshared by long chains of users, out of concern that it would reduce user engagement.
In response, Lars Backstrom, Facebook's vice president of engineering, told the Journal that any algorithm can promote harmful content and that the company has an "integrity team" working to mitigate those issues. "Like any optimization, there's going to be some ways that it gets exploited or taken advantage of," he said.