Social media platforms like Facebook “have played a major role in exacerbating political polarization that can lead to such extremist violence,” according to a report from researchers at New York University’s Stern Center for Business and Human Rights.
That may not seem like a surprising conclusion, but Facebook has long tried to downplay its role in fueling divisiveness. The company says that recent research shows that “social media is not a primary driver of harmful polarization.” But in their report, NYU’s researchers write that “research focused more narrowly on the years since 2016 suggests that widespread use of the major platforms has exacerbated partisan hatred.”
To make their case, the authors highlight numerous studies examining the links between polarization and social media. They also interviewed dozens of researchers, as well as at least one Facebook executive: Yann LeCun, Facebook’s top AI scientist.
While the report is careful to point out that social media isn’t the “original cause” of polarization, the authors say that Facebook and others have “intensified” it. They also note that Facebook’s own attempts to reduce divisiveness, such as de-emphasizing political content in News Feed, show the company is well aware of its role. “The introspection on polarization probably would be more productive if the company’s top executives were not publicly casting doubt on whether there is any connection between social media and political divisiveness,” the report says.
“Research shows that social media is not a primary driver of harmful polarization, but we want to help find solutions to address it,” a Facebook spokesperson said in a statement. “That is why we continually and proactively detect and remove content (like hate speech) that violates our Community Standards and we work to stop the spread of misinformation. We reduce the reach of content from Pages and Groups that repeatedly violate our policies, and connect people with trusted, credible sources for information about issues such as elections, the COVID-19 pandemic and climate change.”
The report also notes that these problems are difficult to address “because the companies refuse to disclose how their platforms work.” Among the researchers’ recommendations is that Congress compel “Facebook and Google/YouTube, to share data on how algorithms rank, recommend, and remove content.” Platforms releasing that data, and the independent researchers who study it, should be legally protected as part of that work, they write.
Additionally, Congress should “empower the Federal Trade Commission to draft and enforce an industry code of conduct,” and “provide research funding” for alternative business models for social media platforms. The researchers also propose several changes that Facebook and other platforms could implement immediately, including adjusting their internal algorithms to further de-emphasize polarizing content, and making those changes more transparent to the public. The platforms should also “double the number of human content moderators” and make them all full employees, in order to make decisions more consistent.
All products recommended by Engadget are selected by our editorial team, independent of our parent company. Some of our stories include affiliate links. If you buy something through one of these links, we may earn an affiliate commission.