How to stay away from the "calculation" of algorithms?

—— A survey of netizens' algorithm literacy under the new regulations

Editor's note

On March 1, the "Regulations on the Administration of Internet Information Service Algorithm Recommendations" (hereinafter the "Regulations"), jointly issued by four departments including the State Internet Information Office, officially came into force.

The "Regulations" comprehensively regulate the algorithm recommendation service, clarify the algorithm governance system and mechanism, and scientifically build an accountability system for the network platform.

The 2022 "Qinglang·Comprehensive Algorithm Governance" special action, which was subsequently launched, requires supervision and rectification of problems such as "information cocoon room", "algorithmic discrimination", and "big data killing" caused by the unreasonable application of algorithms.

These new regulations and measures are an effective response to the challenge of algorithmic governance in the era of digital transformation.

In the era of "coexisting with algorithms", as the main body of online behavior, how can netizens improve their algorithmic literacy, stay away from the "calculation" of algorithms, and enjoy online life more safely and independently?

In view of this, Guangming Daily, together with a research team from Minzu University of China, conducted a questionnaire survey (912 valid questionnaires) and in-depth interviews. Based on the experiences of individual users, it sorted out the difficulties and challenges of living with algorithms and users' concrete coping behaviors, and offers suggestions on how to improve algorithm literacy.

  "I used to be 'arranged' to see the content I 'want to see'. After turning off the personalized recommendation, I felt like I saw a wider world." I can't help but buy, buy, buy; don't open it, because I'm afraid of missing the right product." "Before I closed the recommendation, the short videos I often read always made me want to stop. There are beauty tutorials and exquisite life. Scenario spoofs, earthy love words... I quickly and silently turned on personalized recommendations again."

  …………

With the release of the "Regulations", Douyin, WeChat, Taobao, Baidu, Jingdong, Weibo and other apps have launched a one-click function for turning off "personalized recommendation".

However, some surveyed users were ambivalent about whether to turn off "personalized recommendation": some felt that turning it off "opened the door to a new world", while others felt that it degraded the user experience and was hard to adapt to.

From itinerary tracking in epidemic prevention and control to road-condition calculation in smart travel, from information content that suits one's tastes to product recommendations that match individual preferences... in the era of big data, ubiquitous algorithms have penetrated every level of society and become humanity's "technology companion".

We need algorithms in order to enjoy the convenience they bring; at the same time, we need sufficient ability to recognize and resist the risks they bring, such as manipulation.

The survey results show that 56.9% of respondents believe algorithms have improved their experience of using electronic products. Yet the "China Great Security Perception Report (2021)", released by the Internet Development Research Center of Peking University, also shows that 70% of respondents worry that their likes and interests are being "calculated" by algorithms, and 50% said that, hemmed in by algorithms, they want to escape the Internet and stay away from their mobile phones.

  Concerns:

In an algorithmic society, what dilemmas do we face?

  The "differential treatment" hidden under the algorithm recommendation.

At the beginning of March, the Beijing Consumers Association released a survey on the problem of "big data killing" in Internet consumption. After investigating 16 platforms and collecting 32 simulated consumption samples, it found that in 14 samples the prices shown to new and old accounts were inconsistent.

The survey found a clear "information gap" between users and algorithm platforms. The platform masters the algorithm's model formulas and presentation rules; through the personalized design of information-screening criteria and "data tracking" across other platforms, it leaves users at the bottom of the information pyramid.

Precise recommendation stems from the platform's labeling of personal data, which in turn sorts users' spending power into different grades, giving rise to algorithmic discrimination.
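
To make the mechanism concrete, the "label, then price" logic described above can be sketched in a few lines of Python. This is a hedged illustration only: the labels, thresholds, and markups below are invented for the example, not taken from any platform's actual system.

```python
# Hypothetical sketch of "personal data -> spending-power label -> price".
# All thresholds and markups are invented for illustration; real platforms'
# pricing models are proprietary and far more complex.

def spending_tier(avg_monthly_spend: float) -> str:
    """Assign a coarse spending-power label from behavioral data."""
    if avg_monthly_spend > 3000:
        return "high"
    if avg_monthly_spend > 800:
        return "mid"
    return "low"

def quoted_price(base_price: float, tier: str, is_old_account: bool) -> float:
    """Differential pricing: the 'big data killing' pattern quotes
    loyal (old) accounts a higher price than new ones."""
    markup = {"high": 1.10, "mid": 1.05, "low": 1.00}[tier]
    if is_old_account:
        markup += 0.03
    return round(base_price * markup, 2)

tier = spending_tier(2000)                               # -> "mid"
print(quoted_price(100.0, tier, is_old_account=False))   # new account: 105.0
print(quoted_price(100.0, tier, is_old_account=True))    # old account: 108.0
```

The point of the sketch is that once a user is reduced to a label, a one-line markup is all it takes to produce the inconsistent new-versus-old-account prices the Beijing Consumers Association observed.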

Algorithmic discrimination not only harms users' interests but also intensifies existing social prejudices.

The survey found that vulnerable groups such as women and the elderly attract disproportionate attention in news feeds. 68.7% of respondents believe that gendered terms such as "female driver" and "female college student" are more likely to be searched, and that gender-related issues are more likely to spark discussion.

Platforms use "user portraits" and "labels" to sort individuals, flattening users and reinforcing real-life stereotypes, which can set off storms of public opinion.

Inescapable information collection and "privacy threats".

There is a scene in the popular black-humor sketch series "Aunt's World": the square-dancing aunts are fretting over what color costumes to buy. Aunt Wang casually mentions having scrambled eggs with tomatoes for dinner and asks Aunt Yang to pick up her grandson from the "Panda Class"; the mobile shopping app then pushes her clothes in red and yellow, and in black and white, respectively.

Platforms "eavesdropping" to harvest data, calculating user preferences, and then "ambushing" users with recommendations has attracted widespread attention.

Many respondents said that being "eavesdropped on" within the system jointly built by platforms and algorithms has become "commonplace".

Many companies, apps, and third-party organizations try to grab ever more private information when sharing users' personal data.

In interviews, many users said that their browser always "remembers" their search history, synchronizes it to other devices, and repeatedly pushes previously searched content onto the home page, giving them a sense of "being leaked".

  The troubles caused by algorithmic recommendation services based on the collection of personal information do not end there.

On the one hand, inappropriate recommendation shows in its "quantity": an excess of homogeneous content creates information redundancy, easily causing psychological fatigue and privacy anxiety. On the other hand, the "degree" of algorithmic recommendation often exceeds the proper range of recommended content: pornographic, vulgar, and other material that violates mainstream values is often placed on the start-up page to induce users to click.

  "Progressive dependencies" filtered and controlled by algorithms.

"I usually find restaurants and playgrounds that are ranked at the top of the app when I eat and walk my baby," said Sun Ying, a 31-year-old mother with a second child. Although she has "stepped on thunder" by relying on the recommendation of the app platform, such a choice is very difficult for her. It saves time and effort.

In the Internet era, many netizens increasingly hand over the judgment of the pros and cons of things to algorithms.

Algorithms largely construct "people's perception of and reality on the Internet" through mechanisms such as "prioritization", "classification", "association", and "filtering".

The survey shows that more than 60% of respondents make personal decisions based on platform rankings, ratings, and other such data, even though they do not agree that top-ranked products or content are necessarily better.

The survey also found that many platforms providing algorithmic recommendation services achieve "entrance monopoly" and "self-preferencing" by placing their own related content in prominent positions and blocking links to other platforms.

Through keyword settings, targeted pushes, and similar means, these platforms lift specific content to the top in key venues such as hot-search lists, the first screen, and pop-up windows, quietly controlling the range of information people can attend to.

72.1% of the respondents believed that the platform would recommend content similar to their own interests.

When people grow accustomed to the information supplied by platforms, they may fall into the "filter bubble" set by the algorithm, willingly becoming "appendages" of the data and gradually losing opportunities to encounter diverse information.
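
A deliberately simplified sketch can show how such "filtering" narrows exposure. The ranker below is hypothetical, not any platform's real system: it scores items purely by similarity to past clicks, so topics the user has never clicked can never surface.

```python
# Hypothetical sketch: a feed that ranks only by similarity to past clicks
# keeps narrowing what a user sees -- a "filter bubble" in miniature.

from collections import Counter

CATALOG = [
    {"id": 1, "topic": "beauty"},  {"id": 2, "topic": "finance"},
    {"id": 3, "topic": "beauty"},  {"id": 4, "topic": "science"},
    {"id": 5, "topic": "beauty"},  {"id": 6, "topic": "travel"},
]

def rank_feed(click_history: list[str], k: int = 3) -> list[dict]:
    """Score each item by how often its topic appears in the click history."""
    topic_counts = Counter(click_history)
    ranked = sorted(CATALOG, key=lambda item: topic_counts[item["topic"]],
                    reverse=True)
    return ranked[:k]

history = ["beauty", "beauty", "travel"]
for item in rank_feed(history):
    print(item)   # all three slots go to "beauty"; finance and science
                  # are never shown, so they can never be clicked
```

Run on a history dominated by one topic, the top of the feed is that topic alone: the bubble is a direct consequence of ranking by past behavior only.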

Response:

User survival: from "escaping" algorithms to "living with" them

From criticizing "big data" to growing familiar with the "information cocoon", ordinary users, facing the overwhelming application of algorithmic technology, have had little choice but to endure, in a compromising way, the various problems algorithms cause.

In the survey, more than half of respondents said they were aware that platforms or businesses collect personal information, but if they did not tick the "user informed consent" box, they could not use the functions the software provides.

Giving up privacy in exchange for services has become the "helpless and self-consistent" mentality of ordinary people when they coexist with algorithms.

However, beyond the many cases of "submitting to the algorithm", quite a few users still choose to "battle wits" with the algorithm to avoid its negative effects.

"Digital stealth": making good use of "anti-tracking" strategies.

College student Gao Yu is a "heavy user" of social platforms. In her day-to-day dealings with algorithmic recommendations, she has developed the ability to "dodge" algorithms: "Whenever something I don't want to see appears on my homepage, I swipe past it at lightning speed." She has also registered a "small account" on the same platform: "The big account is for following hot news, the small one for daily life, so when I don't want to see certain content, I have somewhere to 'hide'."

At present, ordinary individuals living among all kinds of default-settings agreements have developed a variety of "algorithm avoidance" strategies.

They change phone numbers or create multiple accounts to confuse the algorithm; or refrain from liking, posting, and commenting to reduce their digital traces online; or, disliking a piece of software's recommendation mechanism, turn it off or stop using it until an algorithm-friendly alternative appears; or find ways to disable the phone's recording function to block possible algorithmic monitoring at the hardware level.

These methods embody a common algorithmic survival strategy: "disconnection" and "invisibility".

However, many users in the survey said that although they can stay away from algorithm software, they cannot avoid the chain reaction of algorithmic thinking.

The survey shows that only 54.9% of respondents think such evasion has had any effect; problems such as overbearing terms of service, privacy theft, and forced location tracking remain difficult to alleviate through individual effort.

  The "reverse discipline" of the "adapted to local conditions" feeding algorithm.

Some users formulate usage strategies suited to how the algorithm behaves: they actively explore the rules and logic of its operation, declare their preferences through deliberate actions such as "like", "favorite", and "not interested", and even leave messages under posts that interest them: "Big data, please remember me and send me more posts like this"... The survey found that some users choose to actively "feed" the algorithm in search of a way to get along with it.
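
In recommender-system terms, this "feeding" works on the explicit-feedback signals that rankers typically consume. The sketch below is a toy model under invented assumptions: the signal weights and learning rate are hypothetical, not any platform's real values.

```python
# Hypothetical sketch of explicit feedback nudging a topic-preference score.
# Signal weights and the update rate are invented; real systems blend many
# more signals than these.

FEEDBACK_WEIGHTS = {"like": 1.0, "favorite": 2.0, "comment": 1.5,
                    "not_interested": -3.0}

def update_preference(score: float, signal: str, rate: float = 0.1) -> float:
    """Simple online update: move the score by the signal's weight."""
    return score + rate * FEEDBACK_WEIGHTS[signal]

score = 0.0
for signal in ["like", "favorite", "comment"]:   # deliberate "feeding"
    score = update_preference(score, signal)
print(round(score, 2))                           # 0.45 -> topic ranked higher

score = update_preference(score, "not_interested")
print(round(score, 2))                           # 0.15 -> topic demoted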

Among them, the "data fans" once active in "fan circles" are a group who rely on this give-and-take with the algorithm to try to influence how information is ranked.

The "data fans" group usually clicks and publishes information, likes idols' works, forwards them with topics, etc., to increase the popularity of related content, thereby affecting the algorithm, making idol-related content more prominent, and grabbing the public with a high profile attention.

  A "living coexistence" that seeks dividends through algorithmic platforms.

"In the past, I only selected the content I was interested in to create, but once I accidentally edited and produced an introduction to a popular film and television drama, and the traffic was very good. Since then, I began to think about how to make it more popular." Yu Jiahao is an amateur video website. Content creators, led by similar feelings to him, more and more MCN (multi-channel network) and self-media creators are wholeheartedly "living" in the content world dominated by algorithms, catering to code rules, grasping traffic passwords, and realizing Commercial realization constitutes another model of algorithmic survival.

In the survey, more than two-thirds of the respondents who answered the question "Do you actively cater to the recommendation mechanism when publishing content?" chose "Yes".

The survey also found that some content producers do not hesitate to resort to vulgar content and "edge-ball" tactics that skirt the rules in order to satisfy the algorithm's criteria.

Reflection:

  How difficult is it to get out of the algorithmic dilemma?

Platform values are embedded in algorithmic programs, making bias harder for ordinary individuals to detect.

During the survey, an algorithm engineer interviewed said that algorithms are not an absolutely objective, neutral technology: the scope of data collection and the platform's commercial intentions are written into the code, shaping the sample distribution or the data set and thereby embedding "tendencies" and "stances" into the program.

For ordinary users, because the algorithm presents itself as numbers and formulas and carries a kind of "technical unconsciousness", it is all the harder to detect value bias and trace it to its source.

For example, the deformed content evaluation system represented by "traffic first" is the result of the subtle effect of algorithmic logic.

  There is a knowledge gap between algorithm designers and the public, making algorithms difficult to understand and interpret.

By industry practice, most algorithm designers do not disclose operational details to the public, so an invisible "black box" sits inside the algorithm.

For a long time, this opaque mechanism has been regarded as a "naturalized" existence, and few people have questioned its rationality, which has seriously affected the social supervision of algorithms.

Open, continuous channels for explaining algorithms are lacking between algorithm engineers and other social actors; establishing a multi-party mechanism for communicating about algorithms is urgent.

  Algorithm users are in a weak position, which exacerbates the problem of individual rights protection.

The research team found that few people truly understand how algorithms operate: only 43.0% consider themselves familiar with algorithms, while more respondents said they "know a little" or are "not really familiar" with them.

In China's current media-literacy education, the cultivation of "algorithmic literacy" is still absent.

In 2020, the European Commission began calling on member states to foster public awareness and literacy regarding the functions and implications of algorithmic systems.

  The supporting measures and mechanisms to carry out algorithmic governance are not yet in place.

As an everyday technology, algorithms have profoundly changed people's lives, but society has yet to keep up with the pace of algorithm development.

In other words, there is no perfect "infrastructure" to support the development of technology.

At the same time, ethical awareness in the algorithm industry remains weak, and the industry lacks self-regulatory ethical standards.

Although regulations governing algorithm-recommendation information services have been promulgated, the systems, mechanisms, and safeguards for algorithm governance have yet to be perfected, and society-wide awareness of, and channels for participating in, the "co-governance" of algorithms still need continued exploration and advancement.

Future:

Comprehensively improve algorithm literacy to complement the "Regulations"

The promulgation of the "Regulations" provides a legal basis for top-down algorithmic governance, but a large gap remains in the bottom-up improvement of algorithm literacy. Relevant work should proceed along the following lines.

Integrate multiple channels to give the whole population a sound "algorithm popularization course", forming social consensus and shared knowledge and breaking "algorithm worship".

Algorithmic thinking is not only applied in the field of algorithms, but also affects practice at a broader level.

Enhancing algorithmic thinking requires, on the one hand, a deep understanding of algorithmic logic and a better ability to analyze and solve problems with computational and data thinking; on the other hand, it requires preserving independent, innovative thinking rather than being confined by algorithmic logic.

Today, systematic education in algorithmic thinking is still absent, and it is urgent to offer the whole population "algorithm popularization courses" to fill this gap.

Since 2018, algorithm-related courses have been included in the "new curriculum standard" for senior high schools. Going forward, algorithm courses should be added in primary and secondary schools to cultivate algorithmic thinking "from childhood"; the creation of popular-science books, films, and television works about algorithms should be strengthened so that the cultural industry promotes education in algorithmic thinking; and an evaluation index system for algorithmic thinking should be established to promote the scientific development of that education.

The improvement of algorithm literacy should be "effectively connected" and "precisely matched" with the existing systems protecting consumer rights, privacy rights, and the rights of special groups, so as to construct a healthy, upward-tending pattern of algorithmic dissemination.

In line with the "Regulations"' requirement that algorithm rules be transparent and interpretable, platforms should open up their algorithmic mechanisms, strengthen users' ability to discern algorithm-filtered information, and reinforce manual intervention in functions such as hot-topic ranking. In response to the "privacy crisis", awareness of personal-information protection should be established and the boundary between ordinary and sensitive personal information clarified; privacy-disclosure reminders should be placed on pages where personal information is entered, to prevent blind disclosure; and media coverage of privacy-related cases should be strengthened to build public awareness of rights protection. Knowledge of rational and fair consumption should be publicized, complaint channels for differential pricing opened, reward and punishment mechanisms clarified, and violations of consumer rights such as "big data killing" strictly rectified, helping the public steer clear of "consumption traps".

Build a collaborative governance system of "mainstream values + industry ethics + public supervision", strengthen algorithm accountability, and create an algorithm-development environment that is law-abiding, humane, and traceable.

The subjects of algorithm literacy include both algorithm users and algorithm developers.

Improving the algorithm literacy of algorithm developers requires, all the more, multi-party cooperation and concerted effort.

The implementation effects of regulations such as the "Regulations on the Administration of Algorithm Recommendations for Internet Information Services" should be tracked, and companies that misuse algorithmic technology should be punished and summoned for rectification talks, achieving rigid constraints. Extensive publicity and discussion of algorithm ethics should be carried out within the industry, strengthening the promotion of mainstream values and bringing social morality and professional ethics into the industry's value system, achieving flexible constraints. The Internet industry should be urged to establish operational norms at the practical level, consciously guarding against phenomena such as "algorithmic bias" and "algorithmic discrimination", pursuing algorithmic fairness from the development and design stage onward, and striking a balance between technological innovation and ethics. The openness and transparency of the algorithm-development process should be promoted and public supervision introduced, creating a law-abiding, prudent, and ethical industry environment.

  Author: Guangming Daily Joint Research Group

(Writers: Mao Zhanwen, special researcher at the Beijing Research Center for Xi Jinping Thought on Socialism with Chinese Characteristics for a New Era, and Bai Xuelei, reporter for this newspaper; other members of the research team: Zhao Anqi, Nie Yidan, Yu Hao, Ji Yun, Dai Ning)