But “had” is the keyword here. According to internal documents inspected by the Wall Street Journal, Facebook shut down the team in 2019.
Worse, the team appears to have been on the cusp of making major recommendations that could have benefited users—at the cost of time spent on the platform, which is crucial to Facebook’s ability to earn revenue. Before it was disbanded, the team surveyed 20,000 users and found that one in eight engaged in “problematic use” of Facebook. Problematic use, the researchers said, produced a variety of negative effects on key aspects of users’ lives. Some users reported a loss of productivity, while others said their sleep suffered from late-night scrolling and viewing disturbing content. Users also reported deteriorating interpersonal relationships; some parents even avoided their children in favor of spending more time online. One user missed a family member’s wedding because they were watching a video on Facebook. Another said it was common for them to browse the app until 2 AM, making it difficult to wake up feeling rested the next morning.
Anyone familiar with the scientific method will tell you correlation isn’t causation—as will Facebook itself, whose parent company, Meta, denies the WSJ’s interpretation of its research. But Facebook’s association with unhealthy social media use isn’t exactly encouraging. It also isn’t comforting in light of Facebook’s recent rebrand, through which Zuckerberg has publicly aimed to blur the lines between the “real” world and the virtual one by building out a metaverse.
Facebook’s internal documents reveal that the company knew its platform was more frequently associated with addictive use than other virtual experiences, including Reddit, YouTube, and World of Warcraft. Whistleblower Frances Haugen testified just last month about how Facebook is designed to reward controversial (and sometimes downright hateful) content because its algorithms favor engagement above all else. One of Facebook’s subsidiaries, Instagram, was also found this year to have a uniquely poor impact on its users thanks to its algorithms and user interface.
Despite this, the company has made only half-hearted attempts at addressing these issues. It added a time-management tool to its mobile app in 2018, as well as a “quiet mode” that muted push notifications in 2020. But the latter feature was buried in the app’s settings, and Facebook’s algorithms still push unsavory content to the top of users’ news feeds. Facebook recently squashed an outside attempt at helping people curb overuse of the app, so it’s unlikely we’ll see any real strides toward user well-being in the near future.
Now Read:
- Facebook Planned to Target Six Year Olds to Compensate for Teen Departures
- Facebook Teases New High-End ‘Project Cambria’ VR Headset
- SEC Complaint: Facebook Concealed ‘Shrinking’ User Base to Investors