
Biden rival and Democrat: Dean Phillips at a campaign event

Photo: Timothy A. Clary / AFP

Is it too risky for the Democrats to let Joe Biden run against Donald Trump again in the US presidential election in November?

Some in the Democratic camp and among its supporters believe so.

Congressman Dean Phillips is one of them.

He is campaigning for the party not to go into the November race with 81-year-old Joe Biden again, but ideally with himself instead.

Phillips' campaign has its supporters.

They call their initiative "We Deserve Better".

Behind it are two Silicon Valley entrepreneurs named Matt Krisiloff and Jed Somers.

As the Washington Post writes, hedge fund billionaire Bill Ackman is also among its backers.

According to the newspaper, to promote Phillips' candidacy the initiative developed, among other things, a chatbot intended to carry his campaign messages to voters.

They call it “Dean.Bot”.

The program was built on ChatGPT, the AI platform from OpenAI, and that is apparently what doomed the experiment: the company has blocked the initiative's access to ChatGPT.

It is the first action the Microsoft partner has taken against the misuse of its artificial intelligence (AI) tools in a political campaign, the Washington Post reported on Saturday.

The fear of election interference

"We recently removed a developer account that knowingly violated our usage guidelines, which prohibit political campaigning or impersonating a person without consent," OpenAI said in a statement.

The company blocked the account late on Friday (local time), pointing to its rules prohibiting the use of its technology in political campaigns.

According to the newspaper, the political action committee “We Deserve Better” has not yet commented on the matter.

Just last week, OpenAI had responded to fears that artificial intelligence could be used to interfere with elections.

The company emphasized that its policies prohibit uses of its technology that it considers potentially abusive.

That includes creating chatbots that impersonate real people, sway public sentiment and thereby influence voting behavior.

beb/Reuters