Posted on: October 22, 2021 Posted by: Kairi
Twitter says its algorithms favor tweets with right-leaning political content

Twitter has published the results of research analyzing its recommendation algorithm for potential political bias. The company notes that users whose home timeline is sorted algorithmically see a combination of tweets from the accounts they follow and recommended tweets based on their Twitter activity. This initial study compares the political content users see in an algorithmically ranked timeline with what they would see in a reverse-chronological one.

Earlier this year, Twitter said it would study the fairness of its algorithms and the ways in which they might inadvertently contribute to harm. This new study is part of that effort; it focuses on tweets from elected officials in seven countries, as well as recommended content from news outlets surfaced by the algorithm.

Among other things, the company examined whether the algorithm amplifies certain political groups more than others and, if so, whether that pattern holds consistently across countries. It also explored whether the algorithm amplifies certain political news outlets more than others. The analysis covered millions of tweets published between April 1 and August 15, 2020.

Twitter shared several findings from the analysis, including that, compared to the chronological timeline, the algorithmically ranked timeline amplifies tweets about political content regardless of party. Notably, in six of the seven countries included in the study, tweets containing right-leaning political content were amplified more than others.

In addition, Twitter found that the algorithm also amplified content from right-leaning news publications. Twitter doesn't appear to know why this happens, with the company noting in a blog post about the study that "further root cause analysis is required to determine what, if any, changes are required to reduce adverse impacts by our Home timeline ranking algorithm."

Twitter software engineering director Rumman Chowdhury explained:

In this study, we identify what is happening: certain political content is amplified on the platform. Establishing why these observed patterns occur is a significantly more difficult question to answer, as it is a product of the interactions between people and the platform. The ML Ethics, Transparency and Accountability (META) team's mission, as researchers and practitioners embedded within a social media company, is to identify both, and mitigate any inequity that may occur.
