More than 70 percent of surveyed users say they have been troubled by "algorithmic" recommendation services and believe that "algorithms" carry hidden risks.

  Don't let the "algorithm" become "calculation" (An "Integrated" View of China)

  ——The second report in the "Algorithms in Perspective" series

  Reporter Lu Zehua

  Have you ever had this experience: after you read a news item, similar news keeps being pushed to you; after you browse a product, ads for that product arrive one after another; even when you casually mention a topic in conversation, related information pours in, as if your phone were recording your every word... These are not coincidences, but the results of "algorithmic" calculation.

  Because of "computing", the cost for users to obtain information is greatly reduced, and network services are more accurate and efficient. However, more and more users are worried about being "calculated" and are increasingly concerned about issues such as "information cocoon rooms", "big data killing" and "inducing addiction". "Don't let 'algorithm' become 'calculation'" has become the common aspiration of netizens.

  Pain points

  "Traffic first" causes chaos

  Recently, Beijing resident Wang Lan ran into something absurd: her son Ruirui is not yet 10 years old, yet his mobile phone kept receiving matchmaking messages. It turned out that the boy had accidentally clicked on a dating advertisement a few days earlier. This worries Wang Lan: "The internet's collection of user information has become pervasive."

  College student Song Fan recently downloaded a reading app. As soon as he opened it, the interface pushed a novel labeled "You may be interested". "Isn't this exactly my favorite author?" He clicked the link and read for more than an hour. When he finished, the system recommended the novel's spin-off chapters. Before he knew it, he had scrolled for several hours and even forgot the assignments his teacher had set.

  "This is the role of 'algorithms.'" Xiang Yang, an algorithm engineer at an Internet company in Hangzhou, said, "Mobile phone software can capture user data at any time, analyze people's behavior, habits and preferences through intelligent models, and filter out content that users are not interested in. Recommend information that users are interested in.”

  He revealed that for some commercial platforms, the main purpose of using "algorithms" is to drive traffic and boost revenue; whether the pushed information is high-quality or shoddy, or whether it fosters internet addiction, is simply not considered. "It's like children who love sweets: some businesses' only goal is to sell children as much candy as possible, with no concern for whether it harms their health."

  This "traffic first" business logic prompts some online platforms to use "algorithms" to frequently push homogeneous content that users are interested in, and even worse, to please users with eye-catching low-quality and vulgar content. Over time, the scope for users to receive information will become narrower and the channels will become single, thus falling into what is often called an "information cocoon".

  In addition to "information cocoon room", "algorithmic discrimination" is also frequently criticized. The so-called "algorithm discrimination" refers to the "unfair results" caused by bias, discriminatory data sets or other factors in the design and application process of "algorithms".

  Ms. Liu, who lives in Shijiazhuang, has a child attending university in Beijing. Whenever she comes to Beijing to visit, she stays at a hotel next to the school. At the beginning of this year, she booked a room through a mobile app at nearly 800 yuan per night, only to discover at check-in that another guest had booked the same room type for just 600 yuan. When they compared, the prices shown on the same app were completely different.

  "I am still a platinum card customer of this hotel. Instead of enjoying the discount, I was 'slaughtered'. My child told me that this is called 'big data killing'!"

  "Big data killing" is one of the manifestations of "algorithmic discrimination". It means that the platform uses "algorithms" to "portray" users. If the user is already a regular customer, there is no need to attract them at a low price. At the same time, the platform may detect the user's spending power by collecting consumption data, thereby "targeting" raising prices. Xiang Yang said that "algorithmic discrimination" was once rampant. Buying air tickets, taking online ride-hailing, and even shopping online were all "killed by big data." Later, the relevant departments came under strong supervision, and this phenomenon has converged. However, it is still different in the industry. degree exists.

  Difficulties

  "Algorithm" violations are not easy to judge

  Some netizens have summed up three characteristics of the inferior information that "algorithms" recommend: truth and falsehood are hard to tell apart, depth is lacking, and the value orientation is muddled. Many believe this muddled value orientation is the biggest difficulty in regulating "algorithms".

  Wang Jun is a self-media writer who recently planned an article on the question "Should young people live off their parents?" When he searched the topic on his phone, he found that most netizens did not support young people living off their parents, which closely matched his own view.

  Unexpectedly, a colleague told him that most young people now believe "there is nothing wrong with living off your parents; life is meant to be enjoyed!" This surprised Wang Jun. The colleague held out his phone: "Look, the whole screen supports living off one's parents." When Wang Jun showed his own phone to the colleague, the two could only stare at each other.

  "It is not difficult to judge the user's value orientation. There are many ways to evaluate. For example, capturing the time users spend watching a certain topic, analyzing whether comments are positive or negative, counting user like records, etc. can all be used to determine user opinions, and then Based on 'content similarity' and 'user similarity', the most recognized content is recommended to the page." Xiang Yang further explained that the "algorithm" plays the role of content "editor" to a certain extent, whether to push or push Who to push and how much to push are all decided by pre-set programs.

  In recent years, in response to the chaos caused by "algorithms", regulators have taken multiple measures, using interviews, penalties, rectification orders, app removals and other means to clean up the online environment. But how to make "algorithms" effectively promote correctly oriented information remains an unsolved problem.

  Xiang Yang said that although the "algorithm" is a neutral technology, its model design and data analysis are inseparable from the designer's personal choices, whose subjective consciousness is subtly embedded in the "algorithm" system. At the same time, because "algorithm" design has a high technical threshold, the design process is like a black box: outsiders find it hard to understand its principles and logic, and therefore hard to determine what orientation an "algorithm" embodies.

  Xiang Yang's point is constantly borne out in reality. Most "algorithm" designers do not disclose the details of their calculations, and even when details are disclosed they are hard to assess accurately. Lacking professional technical knowledge, users whose rights are harmed face high costs in seeking legal redress, and many lawsuits over "algorithms" have not ended in satisfactory rulings.

  "'Algorithm' recommendation is a kind of trade secret and technical secret, and the application process often involves social and public interests. How to balance and grasp the boundaries of legal regulations on 'algorithm' recommendation is a major problem." Peking University Law School Vice President Xue Jun believes that in terms of judicial and administrative concepts, the application of "algorithms" needs to be an extension of the behavior of "algorithm" operators. Regardless of whether the operator is at fault or whether there is any administrative violation, etc., the "algorithm" factor should be included in the consideration of accountability.

  Focus

  Building consensus to regulate "algorithms"

  As public attention to "algorithms" grows, calls to strengthen "algorithm" regulation are growing louder. One survey shows that more than 70 percent of respondents say they have been troubled by "algorithmic" recommendation services, and more than 60 percent believe "big data killing" is common in daily life. The vast majority of users believe that "algorithmic" recommendation technology poses major risks of snooping on and leaking users' personal privacy. The abuse of "algorithms" infringes on consumers' rights to know, to choose and to personal information protection, and also disrupts the order of communication in cyberspace.

  Against this backdrop, some users want to turn off "algorithmic" recommendation and stop sharing personal information with platforms. Under the supervision of the relevant departments, major online platforms have launched switches for turning off "algorithm" recommendations, returning to users the choice of whether their preferences are collected.

  In practice, however, personalized recommendation remains hard to switch off. The Shanghai Consumer Protection Commission ran an eight-month special evaluation of 10 mobile apps commonly used by consumers; the results showed that turning off personalized recommendation took as many as 7 steps.

  The reporter logged into several mainstream apps at the top of the download rankings. At present, the switch for "algorithm" recommendations is mainly buried in "personal settings", and the steps match the earlier survey results, with most apps still requiring 5 to 7 steps. Of 20 mobile phone users the reporter interviewed at random, most were unaware that the software offered a switch for "algorithm" recommendations at all; only one knew of the function and how to use it.

  "This phenomenon reflects the ambivalence of the industry. On the one hand, it does not want users to shut down the 'algorithm' on a large scale, but on the other hand, it also realizes that only standardization can last long." Xiang Yang said that standardizing "algorithms" requires joint efforts from all sectors of society. Find the greatest common divisor.

  "Effective governance of 'algorithm' chaos is an inevitable requirement for building a safe and clear network ecosystem." Zheng Yushuang, associate professor at China University of Political Science and Law, said that it is necessary to improve the legal regulation model of "algorithms", solve the legal regulation problems of "algorithms", and achieve good governance of algorithms .

  "As to whether the use of 'algorithms' has adverse consequences, a social review mechanism must be established to conduct continuous follow-up research. If the social review mechanism has sufficient reasons to determine that the use of a specific 'algorithm' has led to adverse consequences, it should be initiated The accountability mechanism requires the corresponding 'algorithm' operator to explain and optimize it." Xue Jun suggested.