For some time, the effectiveness of Facebook and other social media platforms at accurately disseminating information has been a hot topic of debate. The credibility of the rapidly growing number of news sources on Facebook and similar platforms is also contested. These platforms rely on algorithms that select and display the information they deem suitable for each user based on that user's personal behavior. A new study lends fresh support to critics' long-held view that social media algorithms amplify misinformation over more reliable sources, thereby distorting public debate on important issues.

According to the Washington Post, researchers from New York University and Université Grenoble Alpes in France analyzed user behavior on Facebook before and after the 2020 U.S. presidential election. They found that between August 2020 and January 2021, news publishers known for misinformation received six times more likes, shares, and interactions on the platform than reliable news sources such as CNN or the World Health Organization (WHO).

The researchers also found that, compared with factual pages, pages trafficking in misinformation on both the far left and the far right attracted more engagement from Facebook users. The findings reinforce concerns about "fake news" that first gained wide attention after the divisive and bitterly contested 2016 U.S. presidential election. Social media has often been accused of amplifying calls to violence, including Trump supporters' violent attack on the Capitol, the seat of the U.S. legislature, on January 6.

Two months later, Facebook CEO Mark Zuckerberg appeared before Congress, seeming to imply that he was not responsible for misinformation campaigns on social media platforms. Also testifying at the hearing were Twitter CEO Jack Dorsey and Sundar Pichai, CEO of Google's parent company Alphabet. Legislators criticized all three platforms' handling of false content.

Rebekah Tromble, director of the Institute for Data, Democracy and Politics at George Washington University, reviewed the study's results. She said that despite many mitigation measures, there is growing evidence that misinformation has "found a comfortable home" on Facebook.

Facebook responded that the study measured how many people engaged with content, not how many people actually viewed it. A Facebook spokesperson told the Washington Post: "When you view the most impactful content on Facebook, it's not at all what this research shows." Facebook does not make public the number of impressions, that is, how many people view content on its platform, nor did it provide that data to the researchers.