The fake account garnered 20,000 subscribers before it was shut down, a takedown that experts say remains all too rare.

“For Telegram, accountability has always been a problem,” says Oleksandra Tsekhanovska, of the Kyiv-based Ukraine Crisis Media Center.

"That's why (the app) was so popular even before the full-scale war with far-right extremists and terrorists around the world," she told AFP from a safe location near of the Ukrainian capital.

Telegram has 500 million users, who share information in one-to-one chats or in groups with relative security.

But the fact that Telegram is also used as a one-way broadcast channel – which subscribers can join but not respond to – means that content from inauthentic accounts can easily reach large audiences.

False information often spreads through public groups, and the consequences can be serious.

“Someone pretending to be a Ukrainian citizen joins the chat and starts spreading misinformation or collecting data, like the location of shelters,” Ms Tsekhanovska says, noting how fake messages have urged Ukrainians to turn off their phones at a specific time of night in the name of cybersecurity.

However, such instructions could actually endanger residents, who receive airstrike alerts on their smartphones.

"Far west"

The way Telegram is designed also limits the ability to curb the spread of misinformation: the fact that comments are easily disabled in channels, for example, reduces the space for public criticism.

And although some channels have been deleted, moderation is considered opaque and insufficient by analysts.

"Back when content moderation was in its Wild West period, like 2014 or 2015, they might have gotten away with it, but that's in stark contrast to how other companies handle themselves today. 'today," said Emerson Brooking, disinformation expert at the Atlantic Council.

WhatsApp, a rival messaging platform, introduced new measures to combat misinformation as Covid-19 began to spread.

WhatsApp restricted the number of times a user can forward a message, and developed automated systems that detect and flag problematic content.

Unlike Facebook or Twitter, which talk openly about their anti-misinformation programs, "Telegram is notoriously lax or even absent in its content moderation policy," according to Mr. Brooking.

"Very weak"

Telegram founder Pavel Durov runs his company discreetly from Dubai.

On February 27, he acknowledged on his Russian account that Telegram channels were becoming "more and more a source of unverified information in connection with events in Ukraine".

He said that since his platform does not have the means to monitor all channels, it might restrict some in Russia and Ukraine "for the duration of the conflict", but he later backtracked after many users complained, arguing that Telegram was an important source of news.

Oleksandra Matviichuk, a Kyiv-based lawyer and head of the Center for Civil Liberties, called the position "very weak".

"He needs to start being more proactive and find a real solution to this situation, not just sitting on the sidelines and not intervening. It's a very irresponsible position," she said.

In the United States, where Telegram is less popular, the app has generally escaped close scrutiny.

But it was on this platform that some people organized ahead of the attack on the Capitol on January 6, 2021, and last month Senator Mark Warner sent a letter to Mr. Durov asking him to curb Russian information operations on Telegram.

"The sheer volume of information shared on channels makes it extremely difficult to verify, so it's important for users to double-check what they read," Telegram spokesperson Remi Vaughn told AFP.

But with communications often disrupted in war zones, such checks are a luxury for many users, Ms Tsekhanovska responds.

© 2022 AFP