With the development of information technology and the widespread application of big data, algorithmic recommendation has made information dissemination more personalized, customized, and intelligent, but it has also given rise to problems.


What kind of "algorithms" do we need?

Have you ever had an experience like this: you fill out a survey about job preferences on a recruiting website, and the site automatically pushes matching positions; you open a shopping app and find that most of the page is filled with products you recently searched for or browsed; after reading one piece of health information in an app, you keep receiving advertisements for health tips and health-care products... With the rapid development of information technology and the wide application of big data, algorithmic recommendation technology is ushering in an era of personalized, customized, and intelligent information dissemination.

Thanks to algorithmic recommendation, internet platforms increasingly win users over, helping people obtain information more conveniently and accurately while firmly holding their attention.

According to incomplete statistics, algorithm-based personalized content push now accounts for about 70 percent of all information content distributed on the internet.

But as algorithmic recommendation becomes a "standard feature" across platforms, problems have also become prominent, such as the precise pushing of vulgar, low-quality information and "big data killing the familiar", the practice of using data profiles to charge regular customers more.

  As an important driving force of the digital economy, how can algorithms achieve higher-quality development?

In the era of mobile internet, what kind of "algorithms" do we need?

Internet platforms that increasingly "understand" their users

"After I watch a short video, the platform automatically recommends a lot of related videos, which is very convenient." Chen Hui, who works at a company in Beijing, is a devoted fan of a popular online battle game and likes to watch related short videos on his phone to improve his play.

He found that the more often he refreshed his short-video feed, the more related videos the platform recommended. "Game guides, hero introductions, match videos, it has them all. E-commerce platforms even recommend gaming mice and keyboards to me."

The automatic recommendation features of these platforms rely on a technology called algorithmic recommendation.

It uses artificial-intelligence-based analysis and filtering mechanisms to mine massive amounts of data and precisely match information content with users.

Since a University of Minnesota research group launched the first automated recommendation system in 1994, algorithmic recommendation technology has spread into news, social networking, e-commerce, short video, search engines, and other platforms and internet applications.
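
To illustrate the basic idea behind such systems, here is a minimal collaborative-filtering sketch in Python; the toy ratings data, user names, and cosine-similarity scoring are assumptions made for illustration and do not describe the 1994 system or any platform mentioned in this article.

```python
from math import sqrt

# Toy ratings matrix: user -> {item: rating}. Purely illustrative data.
ratings = {
    "alice": {"news_a": 5, "news_b": 3, "video_c": 4},
    "bob":   {"news_a": 4, "news_b": 3, "video_c": 5, "video_d": 4},
    "carol": {"news_b": 2, "video_d": 5},
}

def cosine(u: dict, v: dict) -> float:
    """Cosine-style similarity over the items two users have both rated."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    dot = sum(u[i] * v[i] for i in common)
    return dot / (sqrt(sum(x * x for x in u.values())) *
                  sqrt(sum(x * x for x in v.values())))

def recommend(target: str, k: int = 2) -> list:
    """Rank items the target user has not seen by similarity-weighted
    ratings from other users ("people like you also liked...")."""
    scores = {}
    for other, their_ratings in ratings.items():
        if other == target:
            continue
        sim = cosine(ratings[target], their_ratings)
        for item, r in their_ratings.items():
            if item not in ratings[target]:
                scores[item] = scores.get(item, 0.0) + sim * r
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("alice"))  # e.g. ['video_d']
```

The principle is simply "people similar to you also liked these items"; production systems layer far more signals on top of it.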

Internet platforms "understand" users better and better. While this greatly eases access to information, it can also lead some users, especially teenagers, to become addicted to varying degrees.

"He means to watch for five minutes, but ends up watching for hours." Lin Zhongxin of Guiyang, Guizhou Province, said his 12-year-old son has recently become obsessed with scrolling through prank short videos, which has hurt his schoolwork and left him less time for outdoor activities with his peers.

Since the end of May last year, an anti-addiction system for young people has been rolled out on major short-video platforms across the country.

In "youth mode", most platforms mainly push educational, knowledge-based, and other beneficial content.

However, in the absence of guardian care and supervision, the effect of the anti-addiction system is easily compromised.

Some interviewees reported that the personalized feeds of certain news and social platforms suffer from "three excesses": too much light-entertainment information, too much vulgar content, and too much unverified content. Some social platforms also offer few anti-addiction measures, which can easily lead to addiction and blind imitation among teenagers.

Qiu Zeqi, director of the Research Center for Chinese Society and Development at Peking University, told reporters that favoring what one already likes is part of human cognition, and reading only by "preference" may accelerate the formation of "information cocoon" and "emotional contagion" effects: the former narrows one's vision, and the latter makes personal emotions easily swayed by others.

Some people who enjoy online shopping may also lose out because of "big data killing the familiar".

Some platforms use algorithms to build "profiles" from different types of consumer data, inferring preferences, stickiness, price sensitivity and so on, so that different users see different prices or search results.

Typically, existing users see higher prices than new users, or fewer search results.
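
To make the mechanism concrete, here is a minimal, hypothetical sketch; the profile fields, thresholds, and markup rule are assumptions made for illustration and do not describe any particular platform's logic.

```python
from dataclasses import dataclass

@dataclass
class UserProfile:
    # Hypothetical profile fields a platform might infer from behavior logs.
    orders_last_year: int       # rough proxy for "new" vs. "regular" customer
    days_since_last_visit: int  # data freshness: regulars visit often
    price_sensitivity: float    # 0.0 = insensitive, 1.0 = highly sensitive

def quoted_price(base_price: float, user: UserProfile) -> float:
    """Toy illustration of 'big data killing the familiar': regular,
    price-insensitive customers are quoted a higher price.
    The thresholds and markups below are arbitrary assumptions."""
    is_regular = user.orders_last_year >= 5 and user.days_since_last_visit <= 7
    if is_regular and user.price_sensitivity < 0.3:
        return round(base_price * 1.15, 2)   # quietly mark up for loyal users
    if not is_regular:
        return round(base_price * 0.90, 2)   # discount to attract new users
    return base_price

# A new user and a loyal user see different prices for the same hotel room.
print(quoted_price(1000.0, UserProfile(0, 60, 0.8)))   # 900.0
print(quoted_price(1000.0, UserProfile(12, 2, 0.1)))   # 1150.0
```

Real platforms' pricing logic is far more opaque, which is precisely why the practice is hard for consumers to detect.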

During this year's "Double 11" shopping festival, a Beijing consumer, Ms. Han, found when booking a hotel through an app that, at the same time but on different mobile phones, the prices shown differed by about 1,000 yuan.

A poll launched on Weibo in mid-September showed that about 15,000 respondents, nearly 80 percent of all voters, believed they had encountered noticeable price differences.

The "algorithm" is neutral; the problem lies with people

The significance of algorithmic technology is that it turns the old, labor-intensive model of "people looking for information" into "information looking for people", driven by automated computation. This not only frees up manpower but also matches people with information more efficiently.

Since 2012, the internet platform Toutiao has been among the earliest in the industry to apply algorithmic recommendation systems to information products, enabling the system to learn and recommend automatically.

According to Cao Huanhuan, Toutiao's algorithm architect, the recommendation system weighs content features, user features, environment features, and other factors when making decisions.

Environment features, for example, include a user's interest preferences in different scenarios, such as at work, on the commute, or after work.

To help users discover more interests, Toutiao keeps introducing high-quality content creators in many fields and uses algorithms to recommend them to users, and it has launched the "Spirit Dog" anti-vulgarity assistant to filter out vulgar information.

The recommendation system also applies de-duplication and "scattering" strategies: the former avoids repeatedly recommending articles with similar content, and the latter reduces how often articles from the same field or topic are recommended.
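
As a rough, hypothetical sketch of how such a pipeline could fit together (not Toutiao's actual system; the feature names, the 0.7/0.3 scoring weights, the per-topic cap, and the similarity threshold below are assumptions made for illustration):

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    item_id: str
    topic: str                      # content feature: field or topic
    tags: set = field(default_factory=set)

def score(item: Item, user_interests: dict, scene_boost: dict) -> float:
    """Combine a user-interest signal with an environment (scenario) signal.
    The 0.7 / 0.3 weights are arbitrary assumptions for illustration."""
    return (0.7 * user_interests.get(item.topic, 0.0)
            + 0.3 * scene_boost.get(item.topic, 0.0))

def rerank(candidates, user_interests, scene_boost,
           max_per_topic=2, sim_threshold=0.8):
    """Rank by score, then apply de-duplication (skip near-identical items)
    and a simple 'scattering' rule (cap how many items share one topic)."""
    ranked = sorted(candidates,
                    key=lambda it: score(it, user_interests, scene_boost),
                    reverse=True)
    shown, per_topic, feed = [], {}, []
    for item in ranked:
        # De-duplication: skip items whose tag overlap (Jaccard similarity)
        # with an already selected item exceeds the threshold.
        if any(len(item.tags & s.tags) / max(len(item.tags | s.tags), 1) > sim_threshold
               for s in shown):
            continue
        # Scattering: limit items from the same field or topic.
        if per_topic.get(item.topic, 0) >= max_per_topic:
            continue
        per_topic[item.topic] = per_topic.get(item.topic, 0) + 1
        shown.append(item)
        feed.append(item.item_id)
    return feed

# Example: a near-duplicate game video is dropped, and a commute scenario boosts news.
feed = rerank(
    [Item("v1", "games", {"moba", "guide"}),
     Item("v2", "games", {"moba", "guide"}),   # near-duplicate of v1
     Item("n1", "news", {"tech"})],
    user_interests={"games": 0.9, "news": 0.2},
    scene_boost={"news": 0.5},                 # e.g. commuting favors news
)
print(feed)  # ['v1', 'n1']
```

Here the per-topic cap stands in for the "scattering" strategy and the tag-overlap check for de-duplication; the real system's features and rules are, of course, far richer.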

However, while the leading internet platforms exercise strict self-discipline, some social media services and news apps with irregular editorial processes and poor management still go astray in their business orientation.

This mainly shows up in the following ways:

—— Recommending low-quality information to users.

To retain users, some platforms keep recommending content similar to what users have already followed, mixed with material that is vulgar and kitschy, pornographic or violent, hard to verify, shallow, or confused in its values.

To boost click-through rates and traffic, some platforms also make manual recommendations, actively pushing attention-grabbing or borderline content that many users bluntly call "hard on the eyes".

This reflects some platforms' lack of social responsibility and their neglect of value-building.

—— Making it harder to protect users' rights.

Some algorithmically recommended content over-reinforces users' existing preferences, interfering with their independent choice of information, aggravating the "information cocoon" effect, and tending to isolate individuals from society and deprive them of a deep understanding of and judgment about current national conditions.

Because they rely on data such as browsing records, algorithmic recommendations may also infringe on users' personal privacy if they are not properly designed.

—— Carrying out "big data killing the familiar".

According to Shen Hao, a professor at the Big Data Research Center of Communication University of China, the volume of a user's data and how often it is updated make it easy for an algorithm to tell a "new visitor" from a "regular".

The result is that the platform profits handsomely while the interests of merchants and consumers are harmed, and monopoly becomes more likely.

As a technology, algorithmic recommendation itself is neutral; the problem lies with its designers and operators.

On the one hand, some pursue the single value orientation of "traffic first".

In order to cope with competition, some platforms regard user dwell time as an important assessment indicator, ignoring their role as a "gatekeeper".

A "gatekeeper" should guide the design and application of algorithmic recommendation with positive, healthy values that conform to public order and good morals, ensuring that pushed content has the right value orientation.

On the other hand, some excessively pursue "profit first", exploiting their information-asymmetry advantage to infringe on consumers' legitimate rights and interests.

This is a legal issue that deserves full attention.

"There are problems in the top-level design thinking behind some algorithms." Jiang Qiping, director of the Information Research Center at the Chinese Academy of Social Sciences, told reporters that when the famous British mathematician and logician Turing and other scholars first proposed the vision of artificial intelligence, the relationship between humans and machines was conceived as two-way and mutually advancing; yet the human-machine relationship reflected in some of today's algorithm designs is one-way.

For example, in working with big data, designers trust correlation analysis while neglecting causal analysis.

And in defining algorithmic efficiency, only the efficiency of specialization is counted as efficiency, when in fact the efficiency of diversification is efficiency too.

Big data and algorithmic recommendation should have more "warmth"

A netizen recently posted on a question-and-answer site that he had mentioned a robot vacuum while chatting with friends on a social platform, and an advertisement for a robot vacuum then appeared on that platform.

Under the post, many netizens asked: "Could the platform be using algorithms and other technologies to capture users' chat records for targeted advertising?"

He Yanzhe, an expert with the App Special Governance Working Group set up by four ministries and commissions, said in September this year that the group had not found any "eavesdropping" behavior in multiple rounds of app inspections.

However, app "eavesdropping" is technically feasible to a certain degree. The relevant authorities should issue regulations clarifying whether companies may use personal voice information when building big data "profiles", so that users can feel more assured.

Viewed from another angle, the public's questions actually point to issues of value orientation in applications of big data, algorithms, and similar technologies.

How can the use of big data and algorithmic technology be regulated so that it has more "warmth" and puts people more at ease?

Effective monitoring and evaluation systems are needed to ensure that algorithm designers and operators guide the design and application of algorithmic recommendation with healthy, correct, and positive values.

  Relevant legislation and supervision need to be strengthened urgently, especially the legal supervision of algorithm recommendation itself.

For example, the draft Personal Information Protection Law, now open for public comment, stipulates that individuals who believe automated decision-making has a significant impact on their rights and interests have the right to refuse to let personal information processors make decisions solely by automated means.

On November 10, the Guidelines for Anti-Monopoly in the Field of the Platform Economy (Draft for Comment), issued by the State Administration for Market Regulation, stipulated that using big data and algorithms to impose differential transaction prices or other transaction conditions based on a counterparty's ability to pay, consumption preferences, or usage habits; applying differential prices or conditions to new versus existing counterparties; applying differential standards, rules, or algorithms; or applying differential payment terms and transaction methods may be identified as unfair competition. Practices such as "big data killing the familiar" thus face stricter supervision.

  Xue Jun, director of the Research Center for Electronic Commerce Law of Peking University, told reporters that algorithms should be supervised at different levels according to the scenarios in which the algorithms are used and their impact on the basic rights of citizens.

Beyond clearer legal requirements, a society-wide review mechanism could be established to assess the consequences of platforms' use of algorithms and to require platforms to optimize them on the basis of public values.

Platforms' social responsibility must also be firmly enforced.

Cao Huanhuan said that today's Toutiao does not rely entirely on algorithmic recommendation; it is a general information platform combining "algorithms + trending topics + search + follows" to help users broaden their interests.

Users can also turn off the "personalized recommendation" switch or choose to "permanently clear history", and decide for themselves how they obtain information.

"Algorithmic technology should stress ethics and values, give the human dimension as much weight as the technology itself, and encourage enterprises to fulfill their social responsibilities in business activities." Jiang Qiping believes that supervision of recommendation algorithms and platforms should strike a balance between fairness and efficiency, between the development and protection of personal information, and between personal information and platform information; to promote the healthy development of digital-economy services, policies can be adjusted according to the principle of balancing the gains and costs of personalized information services.

He suggested ensuring that consumers have the right to judge the quality of the services provided by information collectors, so that they always hold the initiative.

Some experts believe that regulators should urge companies to adjust their business logic, building positive value orientation and users' pursuit of quality into top-level algorithm design as key labels, and that governments, schools, parents, and platforms should share responsibility for continuously improving young people's internet literacy.

Peng Xunwen
