Today, "algorithm" is a familiar term to the general public.

As the Internet becomes ever more deeply woven into daily life, everything from advertising on social media to shopping recommendations on e-commerce platforms depends on the support of algorithms.

Internet companies capture user data and, based on user profiles, provide customized information, entertainment, and consumer services. On one hand, this gives users a more "personalized" experience; on the other, it brings companies greater profits. It seems like a "win-win."

In recent years, however, public concern about companies' collection of private user data has deepened, and secondary problems such as big-data price discrimination against regular customers and "information cocoons" have also drawn public attention.

Recently, Douyin, WeChat, Taobao, and other apps have introduced a switch that lets users turn off the "personalized recommendation" function with one click in the settings.

Once recommendations are turned off, the app no longer generates recommended content or advertisements based on the user's browsing history.

This is one of the major self-inspection and self-correction measures taken by Internet companies following the formal implementation of the Provisions on the Administration of Algorithm Recommendations for Internet Information Services.

Government intervention and public oversight can not only open up the algorithmic "black box," once invisible and intangible to the public, but also make algorithms better serve the public interest.

The rapid development of the Internet has brought the public an unprecedented flood of information. How an individual can extract what they need from this huge influx is a problem unique to the Internet era.

Algorithmic recommendation can accurately deliver services tailored to users' preferences and needs, saving them the time of collecting and filtering information.

Seen in this light, algorithms are not monsters; they have their own unique value.

However, the crux of the problem is that Internet companies' algorithmic recommendation mechanisms are essentially "black boxes."

In practice, users can neither choose whether to use the service, nor learn what personal information the algorithm has collected, nor understand the logic and rules by which it makes recommendations.

Many people have had a great deal of private information collected without their knowledge, and some have even suffered price discrimination as a result.

Moreover, algorithmic recommendation is tightly bound to Internet services, so that in many cases refusing the algorithm means forgoing the service altogether. This deprives consumers of the right to choose: they can only accept the algorithm's intrusion into their privacy and watch helplessly as their legitimate rights and interests are harmed.

Once algorithms are abused, they not only infringe on individual rights but may also have a negative impact on the public sphere.

When social media and news platforms filter and recommend information based entirely on personal preferences, the formation of "information cocoons" accelerates.

Users wrapped all day in the same kind of information, clustering with people who share their positions and rejecting everything else, naturally become "isolated islands." In the end, the network is no longer a bridge for communication but a space in which different groups attack one another.

The dilemmas created by algorithmic recommendation force us to think deeply about what "personalization" really means.

Today's "personalized" service is, in essence, passive "personalization."

Users can only passively accept the algorithm's recommendations, passively allow it to collect their data, and passively let it tell them what services and information they need and like, until they are trapped in a "comfort zone" they cannot escape.

Under such "personalization," algorithmic recommendation becomes a means of "domesticating" users.

The "personalization" users really need is active "personalization."

Users should be able to actively choose whether to use the algorithm, customize the services they need with a clear understanding of its recommendation rules and logic, and know what content the algorithm has filtered out, so that they can break out of the "information cocoon" and encounter a different world.

In the final analysis, algorithmic recommendation should be a tool in the user's hands, not a shackle on the user's head.

On September 20, 1987, China successfully sent its first email, whose content was "Across the Great Wall, we can reach every corner in the world."

The advancement of Internet technology should make people's lives more convenient, independent, and free. It should make companies' algorithm settings more transparent and give users fuller rights to choose and to know, so that true "personalization" can become the future of the Internet.

Luo Guangyan (Source: China Youth Daily)