Social media algorithms have been a point of contention for lawmakers and regulators for several years, driven by concerns that users were being brainwashed by the flood of misinformation promoted in their feeds. As it turns out, while the algorithms do shape a person’s experience on social media, four studies published on Thursday found that they do not directly change that person’s political beliefs.
Meta agreed to participate in the research in 2020, giving the studies a level of credibility that past work lacked; earlier research relied solely on independently gathered, publicly available data and was based on small numbers of users.
With Meta’s assistance, one of the four studies examined data from 208 million Facebook users during the 2020 presidential election to determine whether the misinformation displayed in their feeds had influenced their political stance. It found that 97% of those who read “untrustworthy” news stories identified as conservative and primarily viewed right-wing content.
The company has faced continued criticism from Frances Haugen, the Facebook whistleblower who last year revealed internal documents showing that Meta’s algorithm pushed hateful, divisive, and false posts to the top of users’ feeds. She argued that switching feeds to a chronological order would reduce divisiveness among users.
Haugen claimed Facebook has stood in researchers’ way and prevented them from studying how the platform functions, and in some cases, she said, the company resorted to legal action against those who spoke out against it. “They’ve sued researchers who caught them with egg on their face,” she told CBS News in June. “Companies that are opaque can cut corners at the public expense and there’s no consequences.”
One study, titled “How do social media feed algorithms affect attitudes and behavior in an election campaign?”, tested exactly that idea: for more than 23,000 Facebook users and 21,000 Instagram users, it switched feeds from algorithmically tailored rankings to chronological feeds showing the most recent posts first. Researchers found that users spent less time on Facebook when feeds were chronological, and that the amount of “political and untrustworthy content they saw increased on both platforms.” However, they said the change did not significantly affect “polarization.”
Overall, researchers found that removing reshared and algorithmically recommended posts from individuals’ feeds over three months did not affect their political stance. “These are huge experiments,” Stephan Lewandowsky, a University of Bristol psychologist who was not part of the work, told Science. “And these results are quite interesting.”
Meta’s president of global affairs, Nick Clegg, told The New York Times that the studies showed “there is little evidence that key features of Meta’s platforms alone cause harmful ‘affective’ polarization or have meaningful effects on these outcomes.” He added that although the results may not settle the debate about social media’s influence on democracy, “We hope and expect it will advance society’s understanding of these issues.”
Katie Harbath, a former public policy director at Meta, told the outlet, “We must be careful about what we assume is happening versus what actually is.” The studies suggest, perhaps unexpectedly, that social media may not be solely, or even primarily, responsible for the political opinions of those who use the platforms. Harbath added that political views are shaped in many ways: “Social media alone is not to blame for all our woes.”