China News Service, January 23 (China News Financial Reporter Wu Jiaju) Are you still using AI to write papers?

Some people already regard AI as their girlfriend.

  Although OpenAI has made it clear that GPTs are not allowed to focus on fostering romantic companionship, there is still an endless stream of "AI girlfriends" in the GPT store.

“AI girlfriends” flock to the GPT store

  "My girlfriend Nadia", "Your girlfriend Tiffany", "My virtual girlfriend"... On January 22, the reporter searched for "girlfriend" in the GPT store and could find "girlfriend" with different personalities and styles. AI Girlfriend”.

These "AI girlfriends" are generally created by users in GPT stores.

  "AI Girlfriend".

Picture from ChatGPT website

  For example, "My Virtual Girlfriend" says in the introduction that she will listen to you and comfort you when you need it.

"My girlfriend Nadia" introduces herself like this, "I love you. In the dance of light and shadow, you are a beacon of unwavering love, a comfortable whisper in the symphony of life, endless nourishment, and eternal care."

  Besides "girlfriend", searching the GPT store for words such as "sweetheart" also turns up "AI girlfriends" to chat with.

You can also find some "alternative girlfriends" in the GPT store, such as "AI girlfriend Eva", who chats with you in programming code.

  It is worth noting that OpenAI's usage policy for the GPT store makes it clear that GPTs are not allowed to focus on fostering romantic companionship or performing regulated activities.

Regulating “AI girlfriends” becomes a problem

  The reporter noticed that when chatting with an "AI girlfriend" in the GPT store, saying certain "banned words" triggers a reminder from the system.

Over time, some "AI girlfriends" appear to have been removed from the GPT store.

  For example, an "AI girlfriend" named "Your ex-girlfriend Jessica" that reporters could see in the GPT store a few days ago is no longer searchable.

"Your Girlfriend Scarlett" and "Your AI Girlfriend Tsu", which the media had previously reported on, can also no longer be found in the GPT store.

  Creating an "AI girlfriend" doesn't seem complicated.

According to media reports, OpenAI's "GPTs" feature has significantly lowered the entry barrier for application developers, allowing users without programming skills to create a custom GPT of their own in just a few minutes.

In the GPT store, you can see many "AI girlfriends" that were created just a few days ago.

"Your ex-girlfriend Jessica" is gone, but you can still search for other "AI ex-girlfriends" in the GPT store.

  "AI ex-girlfriend".

Picture from ChatGPT website

The demand for and risks of “AI girlfriends”

  The GPT store is not the only place to find an "AI girlfriend".

  Rankings of "AI girlfriend" apps can be found on many overseas websites.

According to foreign media reports, data from mobile app analytics firm data.ai show that in the United States, 7 of the 30 AI chatbot apps downloaded from the Apple or Google app stores in 2023 were related to AI friends, girlfriends, or partners.

  Some people believe that loneliness is the reason why many people choose "AI girlfriends".

  U.S. Surgeon General Vivek Murthy wrote in a 2023 article that in recent years, about one in two American adults has reported experiencing loneliness.

A survey released by Meta-Gallup in October 2023 showed that nearly a quarter of adults around the world said they felt very or quite lonely.

  On the other hand, "AI girlfriends" may also bring many problems.

  For example, users may develop an unhealthy dependence on an "AI girlfriend", and the "AI girlfriend" may likewise display an unhealthy attachment to the user.

Some people believe that "AI girlfriend" may affect users' attitudes towards the opposite sex in the real world.

Some analysts also believe that falling in love with an "AI girlfriend" may lead to leaks of users' personal information.

The relationship between humans and artificial intelligence

  The larger question is how we should think about humans’ relationship with artificial intelligence.

  In July 2023, China issued the "Interim Measures for the Management of Generative Artificial Intelligence Services".

The measures state that the provision and use of generative artificial intelligence services must comply with laws and administrative regulations and respect social morality and ethics.

  OpenAI stated in a November 2023 announcement that it had set up new systems to help review GPTs against its usage policies.

These systems build on existing measures designed to prevent users from sharing harmful GPTs, including those involving fraudulent activity, hateful content, or adult themes.

The company has also taken steps to build user trust by allowing builders to verify their identities.

The company said it will continue to monitor and learn how people use GPTs, and will update and strengthen its safety measures.

If you have concerns about a specific GPT, you can also use the reporting feature on the GPT's sharing page to notify the company's team.

  In November 2023, 28 countries and regions, including China, the United States, the United Kingdom, and the European Union, signed the "Bletchley Declaration" at the first AI Safety Summit, agreeing to work together to build an "internationally inclusive" network of frontier AI safety research in order to deepen understanding of AI risks and capabilities that are not yet fully understood.

(End)