A physicist has applied the methods of many-body physics to social networking platforms, hoping to predict and prevent extreme events by monitoring and modeling hate speech. The studies found that the rapid growth of user groups on social platforms resembles the divergence of the correlation length in a physical system undergoing a phase transition, while at the microscopic level the growth of these groups resembles the formation of a gel.

Written by Qu Lijian

Shootings are not uncommon in the United States, and two tragic shootings recently occurred in quick succession. On May 14, a shooting in Buffalo, New York left 13 people shot and 10 dead. What makes it worse is that this shooting was a racist massacre. Inspired by white supremacist ideology, the young white gunman Gendron drove across half of New York State to a supermarket in a predominantly Black neighborhood, intending to take as many Black lives as possible.

For such extreme events, can scientific means detect clues in advance and prevent, or even stop, the tragedy? One physicist is working on exactly this: Professor Neil Johnson of George Washington University.

[Photo: Neil Johnson]

A close relationship with terrorism

Johnson was born in England in 1961 and grew up amid the repeated terrorist attacks of the Irish Republican Army. He received his Ph.D. from Harvard University in 1989. In 1991 he spent a year as a professor at the University of the Andes, a top university in Colombia, where he witnessed guerrillas killing innocent people indiscriminately. He then returned to England as a professor, and moved to the United States in 2007. Even in the most developed and affluent societies, mass shootings increasingly make headlines. Johnson began to wonder whether physics could be used to understand what violence is and to predict when it might occur.

Johnson studies many-body physics, the collective behavior of interacting particles. For example, electrons drifting in one direction through a conductor form an electric current, which is impeded by scattering; but if the electrons can cooperate and form pairs, they become immune to those obstacles, and the result is superconductivity. Johnson applied this way of thinking to human collective behavior. People are not particles, but their behavior does form significant patterns: individual pursuits of beauty converge into fashion, individual thoughts and opinions converge into social movements, and individual traders' buying and selling converge into financial trends. Johnson felt that many of the principles he had learned from many-body physics could be used to think about social questions, and he obtained many results along these lines.

Johnson then began to ponder a pressing practical question: can clues be detected before an act of violence makes headlines? Can violent incidents be predicted in advance?

In 2014, Johnson began researching extremist violence. He asked a group of students to track pro-ISIS groups on the Russian social network VKontakte (VK, a service similar to Facebook in the United States). Johnson's team found that the total number of members of pro-ISIS groups on VK had remained roughly constant, but the number of groups suddenly increased in September 2014. Shortly afterwards, ISIS forces laid siege to the Syrian city of Kobani. The same pattern appeared in Brazil.
Before the Brazilian protest movement peaked in June 2013, there was a surge in Brazilian political groups on Facebook. The evolution of group membership over time is shown in the figure below, and physicists will find this kind of curve familiar: the growth rate of the groups increases dramatically, much as the correlation length diverges when a physical system approaches the critical point of a phase transition.

Johnson and his team analyzed the evolution of about 200 pro-ISIS groups, involving roughly 100,000 member accounts, with time resolution down to the second, and extracted the laws governing group evolution. The rise and fall of these groups closely resembles the aggregation and fragmentation of molecular clusters in physics and chemistry, and the resulting mathematical equations agree well with the observations.

[Figure: The evolution of group membership on social media and its dynamics]

Johnson's results are instructive for predicting extreme events. Anti-ISIS agencies do not need to monitor millions of pro-ISIS accounts; monitoring a few hundred groups is enough, which greatly reduces the workload. They can also prevent pro-ISIS ideology from spreading widely by keeping the groups from growing. If group membership is found to be soaring, a large-scale attack may be imminent, just as the correlation length diverges before a phase transition occurs in a physical system.

When we see that an extreme event is about to happen, is it possible to identify the extremists in advance and avert the tragedy?

Microphysics of groups

What we have so far is a macroscopic "thermodynamic" theory; can we also construct a microscopic "statistical" theory? One obvious difficulty is that people are not molecules. All molecules of a kind are identical, but people differ endlessly from one another. Physicists studied the collective behavior of molecules and developed many-body theory; what kind of "many-body" theory can describe the collective behavior of extremists?

Johnson and his collaborators took inspiration from models of gel formation. Curd is a kind of gel: uniform milk first condenses into many lumps of different sizes, and these lumps finally aggregate into a single block, forming a gel. In analogy with such gelation models, Johnson and collaborators developed a theoretical model for the growth of online extremist groups, while also accounting for the individuality of the "human molecules". Johnson treats people at a "mesoscopic" level: instead of modeling personal details, each person is directly assigned a characteristic parameter between 0 and 1. The theory shows that differences in these characteristic parameters strongly affect the growth and diversity (i.e., the size distribution) of groups. At the onset of extremist support, the number of groups of a given size follows a power law in group size with exponent close to -5/2, and the theoretical predictions agree with the data collected on social networking sites.

[Figure: In January 2015, the number of pro-ISIS groups on the Russian social networking site VK suddenly increased dramatically. Small white circles are individual users, small yellow circles are banned individual users, large red circles are banned groups, and large green circles are groups that have not been banned.]
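To make the aggregation picture concrete, here is a minimal sketch of a generic coalescence-fragmentation ("shard") model of the kind described above: agents repeatedly merge their groups, and occasionally an entire group disbands into isolated individuals. This is an illustrative toy model in the spirit of the gelation analogy, not a reimplementation of Johnson's exact equations; the parameter names and values (N, NU, STEPS) are assumptions chosen for the demonstration. In the steady state, models of this type are known to produce a group-size distribution close to a power law with exponent -5/2.

```python
import random
from collections import Counter

N = 10_000        # total number of agents ("potential recruits"); illustrative value
NU = 0.01         # probability that a selected agent's whole group disbands
STEPS = 500_000   # number of simulation steps

# cluster[i] = id of the group agent i belongs to; groups[g] = set of members of group g
cluster = list(range(N))
groups = {i: {i} for i in range(N)}

def step():
    i = random.randrange(N)
    gi = cluster[i]
    if random.random() < NU:
        # Fragmentation: the whole group dissolves into isolated agents.
        members = groups.pop(gi)
        for m in members:
            groups[m] = {m}
            cluster[m] = m
    else:
        # Coalescence: merge the group of i with the group of a random agent j.
        j = random.randrange(N)
        gj = cluster[j]
        if gi == gj:
            return
        if len(groups[gi]) < len(groups[gj]):
            gi, gj = gj, gi          # always absorb the smaller group into the larger
        members = groups.pop(gj)
        for m in members:
            cluster[m] = gi
        groups[gi] |= members

for _ in range(STEPS):
    step()

# Group-size distribution in the steady state.
sizes = Counter(len(members) for members in groups.values())
for s in sorted(sizes):
    print(s, sizes[s])
```

Plotting the printed counts against group size on a log-log scale should give an approximately straight line with slope near -5/2, the signature that the theory associates with the onset of extremist support.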
This simple theory dashes the hope of counter-terrorism authorities to identify and take down "lone wolves" before they strike. According to Johnson's theory this is an almost impossible task, just as, when water is about to boil, it is impossible to say which molecule will vaporize first. However, the theory does provide a precursor of an extreme event. Just as a rapid, power-law growth in the number of curds tells us that the milk is about to set, on social networking sites, when the distribution of extremist group sizes begins to follow a power law with exponent close to -5/2, it is a sign that extremists may be about to act (a rough sketch of such an exponent check appears at the end of this section). Extreme events are carried out by individuals, but their cause may be the collective behavior of extremists.

Why go to so much trouble? Wouldn't it be simpler, cruder, and effective to just delete the extremist accounts and their groups? Johnson's research found that this is indeed simple and crude, but not effective.

How to take down online hate networks

[Photo: Facebook moderators in Berlin removing hate posts]

There are two obvious ways to eliminate hate speech. One is the microscopic approach: find the bad actors and remove them directly. The other is the macroscopic approach: immediately block anything that touches on hateful content, preferring to wrongly remove a thousand posts rather than let one slip through. Neither works. The former is like looking for a needle in a haystack; the latter undermines freedom of speech.

Johnson and his collaborators studied the global dynamics of hate speech on social networks and found that hate speech forms clusters, and the clusters link up into a network. This "network of networks" spans national borders, continents, platforms, languages, and cultures, and it is decentralized. These characteristics make hate speech hard to eliminate: deleted extremist accounts and groups reappear under new names, and communities wiped out on one platform revive on another. Worse, once alerted, extreme speech becomes more covert, for example by replacing sensitive words with slang, making it even harder to combat.

[Figure: The spread of hate speech within and across platforms]

How, then, can hate speech on social networks be combated effectively? Based on their theoretical model, Johnson and colleagues proposed four strategies, organized along two dimensions, the method of intervention and its granularity, as shown in the figure below.

[Figure: Strategies for intervening in online hate speech]

Strategies 1 and 2 are carried out directly by the platform, while strategies 3 and 4 require the participation of organized third-party users (including bot accounts).

Strategy 1: ban small, isolated hate groups directly.
Strategy 2: ban a small random sample of hateful users, so the intervention goes largely unnoticed.
Strategy 3: encourage anti-hate users to organize and confront hate groups; bots can be deployed as "nucleation sites" that seed the formation of anti-hate groups.
Strategy 4: exploit the contradictions among hate groups, stirring up confrontation between hateful users and groups and sowing doubt in the minds of extremists.

These strategies may still be far from implementation, but they provide a reference framework for regulatory policy.
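As a rough illustration of the early-warning signal mentioned above, the sketch below estimates the power-law exponent of a list of observed group sizes using a standard maximum-likelihood (Hill-type) estimator with the discrete correction of Clauset et al.; an estimate drifting toward 5/2 would be the warning sign. The function name, the cutoff s_min, and the synthetic test data are illustrative assumptions, and a production analysis would use a more careful fitting and goodness-of-fit procedure.

```python
import math
import random

def power_law_exponent(sizes, s_min=6):
    """Maximum-likelihood estimate of alpha for a power-law tail n(s) ~ s^(-alpha),
    using the discrete approximation alpha = 1 + n / sum(ln(s_i / (s_min - 0.5)))
    over all observed group sizes s_i >= s_min."""
    tail = [s for s in sizes if s >= s_min]
    if len(tail) < 2:
        raise ValueError("not enough groups above s_min to estimate an exponent")
    return 1.0 + len(tail) / sum(math.log(s / (s_min - 0.5)) for s in tail)

# Self-check on synthetic data: draw integer group sizes whose tail follows a
# power law with exponent 2.5 and confirm the estimator recovers a nearby value.
random.seed(0)
alpha_true = 2.5
synthetic = [int((1 - random.random()) ** (-1 / (alpha_true - 1))) for _ in range(5000)]
print(f"estimated exponent on synthetic data: {power_law_exponent(synthetic):.2f}")

# In a monitoring setting, one would instead feed in the observed sizes of
# extremist groups on a platform at regular intervals; an estimate approaching
# 5/2 would correspond to the precursor discussed in the text.
```

The -0.5 shift in s_min is the usual correction for treating integer-valued sizes with the continuous estimator; without it the exponent is systematically biased for small cutoffs.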
From theory to practice

These academic studies are valuable for understanding the spread of hate speech, but can they actually be used to suppress hate speech and prevent extreme incidents? Johnson has worked with the FBI and other agencies to apply these insights in practice, but how they have been implemented has not been reported. Extreme shootings in the United States have continued to increase, including white supremacist racist shootings, some of which have even been live-streamed online. This suggests that these academic results have not yet produced obvious practical effects. One can only hope that the academic community and public safety agencies will find ways to prevent extreme events as soon as possible.

References

https://www.aps.org/publications/apsnews/200611/backpage.cfm
https://physics.aps.org/articles/v11/76
https://www.aps.org/publications/apsnews/201611/backpage.cfm
https://physicsworld.com/a/the-dark-side-of-social-media/
Johnson, Neil F., et al. "New online ecology of adversarial aggregates: ISIS and beyond." Science 352.6292 (2016): 1459-1463. DOI: 10.1126/science.aaf0675
Manrique, Pedro D., et al. "Generalized gelation theory describes onset of online extremist support." Physical Review Letters 121.4 (2018): 048301. DOI: 10.1103/PhysRevLett.121.048301
Johnson, Neil F., et al. "Hidden resilience and adaptive dynamics of the global online hate ecology." Nature 573.7773 (2019): 261-265. DOI: 10.1038/s41586-019-1494-7