
This is how Facebook can manipulate you


Former Facebook product manager Frances Haugen testified before the United States Senate on October 5, 2021, that the company’s social media platforms “harm children, fuel division and weaken our democracy.”

Haugen was the primary source for a Wall Street Journal exposé on the company. She called Facebook’s algorithms dangerous, said Facebook executives were aware of the threat but put profits before people, and called on Congress to regulate the company.

Social media platforms rely heavily on people’s behavior to decide what content you see. In particular, they look for content that people respond to or “engage with” by liking, commenting on, and sharing it. Troll farms, organizations that spread provocative content, exploit this by copying high-engagement content and posting it as their own, which helps them reach a wide audience.

As a computer scientist who studies the ways in which large numbers of people interact using technology, I understand the logic of using the wisdom of crowds in these algorithms. I also see substantial difficulties in the way social media companies do it in practice.

From lions on the savanna to likes on Facebook

The wisdom-of-crowds concept assumes that using signals from the actions, opinions, and preferences of others as a guide will lead to sound decisions. For example, collective predictions are usually more accurate than individual ones. Collective intelligence is used to predict sports outcomes, financial markets, elections and even disease outbreaks.
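As a rough illustration of why collective predictions tend to beat individual ones, here is a minimal sketch, assuming each person’s guess is the true value plus independent random noise (all numbers are invented):

```python
import random

# Toy illustration (parameters invented): many noisy individual
# estimates of a quantity, versus the average of the whole crowd.
random.seed(42)
true_value = 100.0
guesses = [true_value + random.gauss(0, 20) for _ in range(1000)]

crowd_error = abs(sum(guesses) / len(guesses) - true_value)
avg_individual_error = sum(abs(g - true_value) for g in guesses) / len(guesses)

print(f"crowd error:              {crowd_error:.2f}")           # typically well under 1
print(f"average individual error: {avg_individual_error:.2f}")  # roughly 16 at this noise level
```

The averaging only helps because the individual errors are independent, and that is exactly the assumption that, as the rest of this article shows, breaks down on social media.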

Over millions of years of evolution, these principles have been encoded in the human brain in the form of cognitive biases that come with names like familiarity, mere exposure and the bandwagon effect. If everyone starts running, you should start running too; perhaps someone saw a lion coming, and running could save your life. You may not know why, but it is wiser to ask questions later.

Your brain picks up cues from the environment, including your peers, and uses simple rules to quickly translate those signals into decisions: go with the winner, follow the majority, copy your neighbor. These rules work very well in typical situations because they are based on sound assumptions. For example, they assume that people tend to act rationally, many are unlikely to be wrong, the past predicts the future, and so on.

Technology allows people to access signals from a much larger number of people, most of whom they do not know. Artificial intelligence applications make heavy use of these popularity or “engagement” signals, from selecting search engine results to recommending music and videos, and from suggesting friends to ranking posts in news feeds.

Not everything viral deserves to be

Our research shows that virtually all web technology platforms, such as social media and news recommendation systems, have a strong popularity bias. When apps are driven by signals like engagement rather than explicit search engine queries, popularity bias can lead to harmful unintended consequences.

Social networks like Facebook, Instagram, Twitter, YouTube, and TikTok rely heavily on artificial intelligence algorithms to rank and recommend content. These algorithms take as input what you like, comment on and share; in other words, the content you engage with. The goal of the algorithms is to maximize engagement by discovering what people like and ranking it at the top of their feeds.
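As a sketch of what engagement-based ranking amounts to, consider the following; the fields and weights are hypothetical, not any platform’s actual formula:

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    comments: int
    shares: int

def engagement_score(post: Post) -> float:
    # Hypothetical weights: shares count most because they push
    # content to new audiences. Real ranking models are far more
    # complex and undisclosed.
    return post.likes + 2 * post.comments + 3 * post.shares

def rank_feed(posts: list[Post]) -> list[Post]:
    # Whatever people engage with most rises to the top of the feed.
    return sorted(posts, key=engagement_score, reverse=True)
```

Note that nothing in this score measures whether a post is true or high quality; engagement is the only input.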

At first glance, this seems reasonable. If people like credible news, expert opinions, and funny videos, these algorithms should surface that high-quality content. But the wisdom of crowds makes a key assumption here: that recommending what is popular will help high-quality content “bubble up.”

We tested this assumption by studying an algorithm that ranks items using a mix of quality and popularity. We found that, in general, popularity bias is more likely to lower the overall quality of content. The reason is that engagement is not a reliable indicator of quality when few people have been exposed to an item. In these cases, engagement generates a noisy signal, and the algorithm is likely to amplify this initial noise. Once the popularity of a low-quality item is large enough, it will keep getting amplified.
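A minimal simulation of this dynamic, with made-up parameters: a hypothetical platform shows items in proportion to their prior engagement, while each item’s true quality stays hidden.

```python
import random

random.seed(7)
N_ITEMS, N_USERS = 20, 10_000
quality = [random.random() for _ in range(N_ITEMS)]  # hidden true quality
engagements = [0] * N_ITEMS

for _ in range(N_USERS):
    # Popularity bias: the platform shows items in proportion to
    # their prior engagement ("+1" so unseen items can still appear).
    weights = [e + 1 for e in engagements]
    shown = random.choices(range(N_ITEMS), weights=weights)[0]
    # The user engages with probability equal to the item's quality,
    # so engagement is only a noisy reflection of quality.
    if random.random() < quality[shown]:
        engagements[shown] += 1

best = max(range(N_ITEMS), key=quality.__getitem__)
winner = max(range(N_ITEMS), key=engagements.__getitem__)
print(f"highest-quality item: {best} (quality {quality[best]:.2f})")
print(f"most-engaged item:    {winner} (quality {quality[winner]:.2f})")
```

Run it a few times with different seeds: the most-engaged item is often not the highest-quality one, because an early, lucky burst of noisy engagement compounds through the rich-get-richer feedback loop.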

Algorithms are not the only thing affected by engagement bias; it can affect people too. Evidence shows that information is transmitted via “complex contagion,” which means that the more times people are exposed to an idea online, the more likely they are to adopt and reshare it. When social media tells people an item is going viral, their cognitive biases kick in and translate into the irresistible urge to pay attention to it and share it.
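A toy sketch of complex contagion under an assumed threshold rule (the network and the threshold are invented for illustration): a person adopts an idea only after multiple friends have shared it, so repeated exposure, not a single one, drives the spread.

```python
# Toy complex-contagion model: adoption requires exposure from at
# least THRESHOLD friends (parameters and network are illustrative).
THRESHOLD = 2
friends = {
    "ana": ["ben", "cara", "dan"],
    "ben": ["ana", "cara"],
    "cara": ["ana", "ben", "dan"],
    "dan": ["ana", "cara"],
}
shared = {"ben", "dan"}  # initial sharers

changed = True
while changed:
    changed = False
    for person, their_friends in friends.items():
        exposures = sum(f in shared for f in their_friends)
        if person not in shared and exposures >= THRESHOLD:
            shared.add(person)  # enough repeated exposure -> adopt and reshare
            changed = True

print(sorted(shared))  # ana and cara join after seeing two sharing friends
```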

Not so wise crowds

We recently conducted an experiment with a news literacy app called Fakey. It is a game developed by our lab that simulates a news feed like those of Facebook and Twitter. Players see a mix of current articles from fake news, junk science, hyperpartisan and conspiratorial sources, as well as mainstream sources. They get points for sharing or liking news from reliable sources and for flagging low-credibility articles for fact-checking.

We found that players are more likely to like or share, and less likely to flag for fact-checking, articles from low-credibility sources when they can see that many other users have engaged with those articles. Exposure to the engagement metrics thus creates a vulnerability.

The wisdom of crowds fails because it is based on the false assumption that the crowd is made up of diverse and independent sources. There may be a number of reasons why this is not the case.

First, because of people’s tendency to associate with similar people, their online neighborhoods are not very diverse. The ease with which social media users can unfriend those with whom they disagree pushes people into homogeneous communities, often referred to as echo chambers.

Second, because many people’s friends are friends with each other, they influence one another. A famous experiment showed that knowing what music your friends like affects your own stated preferences. Your social desire to conform distorts your independent judgment.

Third, popularity signals can be gamed. Over the years, search engines have developed sophisticated techniques to counter so-called “link farms” and other schemes to manipulate search algorithms. Social media platforms, on the other hand, are just beginning to learn about their own vulnerabilities.

People seeking to manipulate the information market have created fake accounts, such as trolls and social bots, and organized fake networks. They have flooded the network to create the appearance that a conspiracy theory or a political candidate is popular, fooling both the platforms’ algorithms and people’s cognitive biases at the same time. They have even altered the structure of social networks to create illusions about majority opinions.

Reducing engagement

What can be done? Tech platforms are currently on the defensive. They are becoming more aggressive during elections in taking down fake accounts and harmful misinformation. But these efforts can amount to a game of whack-a-mole.

A different, preventive approach would be to add friction; in other words, to slow down the process of spreading information. High-frequency behaviors such as automated liking and sharing could be inhibited by CAPTCHA tests, which require a human to respond, or by fees. Not only would this reduce opportunities for manipulation, but with less information people would be able to pay more attention to what they see. It would leave less room for engagement bias to affect people’s decisions.
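A minimal sketch of this kind of friction, assuming a hypothetical per-user rate limit that falls back to a human check (the limits and the CAPTCHA hand-off are invented):

```python
import time
from collections import defaultdict, deque

# Hypothetical friction layer: rate-limit shares per user and fall
# back to a human check. Limits are invented for illustration.
WINDOW_SECONDS = 60
MAX_SHARES_PER_WINDOW = 5

_recent_shares = defaultdict(deque)

def allow_share(user_id: str) -> bool:
    """Return True to let the share through; False means the client
    should demand a CAPTCHA (or a small fee) before retrying."""
    now = time.time()
    window = _recent_shares[user_id]
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()          # forget shares outside the window
    if len(window) >= MAX_SHARES_PER_WINDOW:
        return False              # suspiciously fast: add friction
    window.append(now)
    return True
```

A bot firing dozens of shares a minute hits the limit immediately, while human-paced use would rarely notice it.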

It would also help if social media companies adjusted their algorithms to rely less on engagement signals and more on quality signals to determine the content they serve you. Perhaps the disclosures of the whistleblowers will provide the necessary impetus.

This is an updated version of an article originally published on September 20, 2021.

Filippo Menczer, Luddy Distinguished Professor of Informatics and Computer Science, Indiana University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


