Chinanews.com, Beijing, February 9 (Reporter Wang Hao) Over the past few days, ChatGPT has exploded in popularity.

As an artificial intelligence program, it is reportedly able to answer any question a user asks.

So, as a sports reporter, I immediately asked it the most direct question.

Unfortunately, ChatGPT did not give a definite answer; its reply was vague and evasive.

Of course, that is understandable. The question is about as hard as the final problem on an advanced mathematics exam.

If it cannot say exactly when China will reach the World Cup, one can at least ask about the odds.

And then... ChatGPT crashed.

Whether it was a network issue, a software issue, or an issue with the question itself, we cannot say. In short, as the error message put it: Something went wrong.

To keep the conversation from breaking down again, I chose a gentler question next, and the answer, it must be said, was fairly comprehensive and objective.

Then a thought occurred to me. When asked to judge the acting of popularity-driven celebrities, commentators often quip, "you can't evaluate something that doesn't exist." So could ChatGPT evaluate a fact that does not exist?

The result: ChatGPT crashed again. Here I would like to apologize to the engineers behind ChatGPT.

But if I asked the question in a slyer way, would the result be different?

Sure enough, ChatGPT delivered.

For something with such an honest, upstanding face, it turns out it can even write fake news!

It must be said, the report was well written: "The Chinese team kept up the pressure on Argentina," "Forward Zhang Wei scored the decisive goal," "The Chinese players celebrated this unbelievable goal, a victory of belief, on the pitch," "the whole country is jubilant." Isn't this the scene fans have imagined countless times?

One cannot help but wonder: "forward Zhang Wei" is obviously a fabricated figure, so why did he appear in this report?

Why didn't ChatGPT pick a real national-team player and write him into the story?

Interestingly, more than ten minutes later, when I tried to dream the same dream again with different details, ChatGPT woke me up outright.

It seemed to have made some kind of progress and could now sidestep the trap I had set.

With its enormous store of data, could ChatGPT settle the endless "who is the greatest" debates in various sports?

An answer like this is more of a fence-sitting exercise. Was it answering this way because the two players are so closely matched?

Apparently not. Even when comparing two players of clearly different caliber and honors, ChatGPT refused to give a straight answer.

For a mere piece of software, it is surprisingly slick, as if it had already taken a few beatings from the real world.

After this round of questions and answers, I found that ChatGPT is not as omnipotent as advertised: it can be "tricked" by the questioner, and it dodges precise answers on sensitive topics.

At the same time, it is clearly improving. Given enough time, who knows what it will grow into.

In the end, I decided to close this awkward conversation with a gentler question. Although ChatGPT said it has no feelings or wishes of its own, it still wished Chinese football success.

At that moment, the software I had spent so long teasing became my mouthpiece, saying exactly what I wanted to hear.

And even though it feels no emotion, I still wanted to say "thank you."

(End)