The company has updated its FAQ page to say that private chats are no longer shielded from moderation.
Telegram has quietly removed language from its FAQ page that said private chats were protected from moderation requests. The change comes nearly two weeks after its CEO, Pavel Durov, was arrested in France for allegedly allowing “criminal activity to go on undeterred on the messaging app.”
Earlier today, Durov issued his first public statement since his arrest, promising to step up content moderation on the platform, a noticeable change in tone after the company initially said he had “nothing to hide.”
“Telegram’s abrupt increase in user count to 950M caused growing pains that made it easier for criminals to abuse our platform,” he wrote in the statement shared on Thursday. “That’s why I made it my personal goal to ensure we significantly improve things in this regard. We’ve already started that process internally, and I will share more details on our progress with you very soon.”
Translation: Durov is completely compromised and will do whatever NATO tells him to do. Do not trust in the security of Telegram, which frankly was never that good to begin with. And do not trust anything else even remotely connected to the company or Durov personally.
Helps with local cops, for sure. But disappearing messages also give a false sense of security IMO: there’s nothing technically stopping someone from using a modified client that just keeps them (some do exist and generally work despite the hostility), and screenshots work too…
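The client-enforcement point can be sketched as a toy model (this is not Telegram’s or Signal’s actual protocol; all names here are made up): a “disappearing” message is just a payload with an expiry field that the receiving client is trusted to honor.

```python
import time

def make_message(text: str, ttl_seconds: int) -> dict:
    # The "disappearing" part is just a timestamp attached to the payload.
    return {"text": text, "expires_at": time.time() + ttl_seconds}

def compliant_client(inbox: list) -> list:
    # A well-behaved client drops anything past its expiry time.
    now = time.time()
    return [m["text"] for m in inbox if m["expires_at"] > now]

def modified_client(inbox: list) -> list:
    # Nothing cryptographic enforces deletion: a patched client can
    # simply ignore the expiry field and keep everything.
    return [m["text"] for m in inbox]

inbox = [make_message("meet at 6", ttl_seconds=-1),   # already "expired"
         make_message("ok", ttl_seconds=3600)]

print(compliant_client(inbox))  # ['ok']
print(modified_client(inbox))   # ['meet at 6', 'ok']
```

The asymmetry is the whole point: deletion is a courtesy the honest client extends, not a guarantee the sender can enforce.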
I mean, yeah, but I don’t think this is realistic. If you offer people bulletproof, un-censorable security, they’re going to take you up on it, even if you don’t like them. But Signal isn’t that.
Signal, like every mainstream service, has some amount of control and uses it to crack down on things like spam. In the long term they will likely use that control to censor other things too. To me that’s a bad thing. If it were federated, that power and responsibility would sit with the instance/homeserver, not with one centralized organization.
This ties back to my point about metadata. There are plenty of reasons to want to trust the server, and with Signal, you can’t.
I do agree, though, that feds doing targeted surveillance have easier ways. The issue is more one of bulk collection, and of principle.
And frankly the whole argument about open-source safety goes out the window when the source and distribution are centralized, development is done behind closed doors (I’m not sure to what extent this is true of Signal clients, but it was true of the server), and updates are pushed out automatically.
There are big advantages to the Linux-distro-with-maintainers model in that regard: maintainers are well-versed people who track upstream development and act as a filter between users and a malicious update.
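The filter idea can be sketched like this, a minimal illustration and not any distro’s actual tooling (the package name and contents are hypothetical): instead of auto-applying whatever the vendor pushes, an update is accepted only if it matches a hash a trusted reviewer pinned after inspecting the release.

```python
import hashlib

# Hypothetical pin list: a maintainer inspects a release, then records
# its digest. Anything the vendor ships later that doesn't match is
# rejected instead of being installed automatically.
PINNED = {
    "app-1.0.2": hashlib.sha256(b"reviewed release contents").hexdigest(),
}

def accept_update(name: str, blob: bytes) -> bool:
    # Accept only if the downloaded blob matches the reviewed digest.
    digest = hashlib.sha256(blob).hexdigest()
    return PINNED.get(name) == digest

print(accept_update("app-1.0.2", b"reviewed release contents"))   # True
print(accept_update("app-1.0.2", b"silently modified contents"))  # False
```

With auto-updates there is no such checkpoint: whatever the central distributor signs is what runs, which is exactly the trust concentration the comment is objecting to.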