Facebook Files Highlight Need for Government Regulation – Massachusetts Daily Collegian
Facebook is deeply flawed
Facebook and other major social media platforms have become essential aspects of our daily lives, while somehow escaping many regulations. The Facebook Files – a series of internal Facebook documents recently obtained by the Wall Street Journal – show in detail just how bad this lack of regulation has been for the mental and civic health of individuals.
Jeff Horwitz, one of the reporters working on the Facebook Files, summed up their findings: “Time and time again, documents show that in the United States and abroad, Facebook’s own researchers have identified the negative effects of the platform, in areas such as health, political discourse and human trafficking.” He continues: “Time and time again, despite congressional hearings, its own commitments, and numerous media briefings, the company has not corrected them.”
These findings confirm what many have been saying for years: Social media cannot remain unregulated when its impact on society is so vast.
Regulation might seem like the opposite of what Facebook wants. The company has set up an independent oversight board and run a vaccine-awareness campaign, both efforts to show that its platform can ultimately be a social good. The Facebook Files, however, reveal that Facebook has, to say the least, not lived up to these aspirations.
A good example of this is Facebook’s effort to improve social interactions on its platform. In January 2018, Facebook CEO Mark Zuckerberg announced a new plan to advance Facebook’s goal of “helping people find relevant content to help them engage more with friends and family.” Zuckerberg acknowledged that these changes could reduce user engagement, but noted that the time users spend on Facebook “will be more valuable. And if we do the right thing, I think it will be good for our community and our business in the long run as well.” But while Facebook’s motives were ostensibly well-intentioned, what it did in practice made matters worse.
Facebook attempted to increase user interaction with a new post-ranking algorithm that prioritized “meaningful social interactions.” But by the summer of 2018, it became clear that the new algorithm was having unexpected effects. Users reported that the quality of their news feeds had declined, while news organizations like BuzzFeed and ABC News saw their online traffic drop by more than 10 percent. Facebook researchers also noted that a group of Polish politicians told them that “the proportion of [their party’s] posts [shifted] from 50/50 positive/negative to 80% negative, explicitly as a result of the change to the algorithm.” These negative consequences stem from the algorithm’s emphasis on user engagement.
While the algorithm succeeded in slowing the decline in comments and improving Facebook’s “daily active users” metric, it inadvertently boosted the rankings of posts that fueled outrage and debate. Because these posts tend to generate more engagement, outrage, anger, and controversy became the most effective way for publishers to reach audiences on Facebook.
One of the most egregious disclosures from the Facebook Files concerns Instagram, which is owned by Facebook. According to the documents, Facebook had internal research detailing Instagram’s harmful effects on the mental health of adolescents, especially girls. The research found that Instagram worsened body-image problems for one in three teenage girls. Other research on teenagers in the U.S. and U.K. found that “40% of Instagram users who said they felt ‘unattractive’ said the feeling started on the app.” Psychology professor Jean Twenge notes that these mental health effects can often include “clinical-level depression that requires treatment,” as well as self-harm that brings people to the emergency room. Rather than sharing this research with academics and lawmakers, Facebook publicly downplayed the detrimental effects its platforms have on teens.
The documents clearly show that Facebook has made real efforts to improve user experiences, but has largely failed to produce positive change. Some of these failures stem from the fact that Facebook, as a private company, is primarily focused on profit. Tristan Harris, a former Google employee featured in the Netflix documentary “The Social Dilemma,” has argued that Facebook employees are pressured against making necessary reforms to the platform because such reforms would inevitably reduce user engagement, Facebook’s most important metric for success.
Still, it would be incorrect to say that Facebook is driven solely by profit. Facebook was founded on the idealistic belief that the world would be a better place if more people were connected to each other, and many within the company sincerely believe that Facebook is ultimately a good thing. The problem the Facebook Files point to is not that Facebook is an inherently harmful platform, but that it often lacks the means or incentives to reform itself.
This is why the United States must regulate social media companies like Facebook to ensure that steps are taken to improve these platforms. Facebook’s lack of transparency has cast doubt on its ability to improve on its own. Rather than depending on the goodwill of social media executives, the United States must play a more active role in mitigating the adverse effects of these platforms.
Benjamin Schnurr can be contacted at [email protected]