I don’t want to see PGP rejected based on usability. So, to level the field at the user level, let’s take Delta Chat, which uses PGP, if I understand that correctly.
I have no knowledge of telegram security at all.
Beyond the fact that security on Telegram is a joke (E2EE not enabled by default, only available in 1-to-1 chats, group chats are all unencrypted, homespun encryption algo), they have never had a full, independent audit of their encryption standard.
It looks like there are a handful of papers that looked at parts of the earlier standard Telegram used (MTProto 1), but nothing on the current version (MTProto 2).
https://courses.csail.mit.edu/6.857/2017/project/19.pdf
https://eprint.iacr.org/2015/1177.pdf
Anyway, long story short, Delta Chat has had independent audits several times. I’d say that says it all, really.
https://delta.chat/en/help#security-audits
(Also, thanks for introducing me to Delta Chat, was unaware of the project up to now. Neat stuff.)
While I don’t disagree with you, I don’t believe governments would be putting on the shit show they’re putting on right now if MTProto 2 were breakable.
while true, that doesn’t mean it isn’t weak, only that nobody has publicly broken it yet, or that a weakness won’t be found in the future. I would heed the advice of those in the field of cryptography and stay away from Telegram and MTProto
breakable for the NSA doesn’t mean the police have access
also the current issue is with moderation: telegram is refusing to take down CSAM channels etc
And what about Signal? If some government finds a group chat they don’t like, will they take it down? How would they even know, if all the content is encrypted?
CSAM? More like copyright infringement. CSAM is the usual cheap excuse to shut down everything because of the obvious social implications.
if a govt seizes a device and discovers channel IDs to be taken down, I’m sure that Signal would do so - there have been no arrest warrants, after all… however, the problem is also significantly smaller for Signal because Signal can’t host enormous broadcast groups
it’s kinda irrelevant what it is - you have to comply with police orders to moderate your platform… if this were Musk and X, Lemmy would be cheering on the arrest! no matter who you are, you shouldn’t get to just break the law

and you’re right, CSAM is frequently used as an excuse, and no, I don’t have evidence - that would require actually looking for said content, which I have no inclination to do. the only information I have is that multiple independent news outlets have referenced Telegram for years - not proof, but a more convincing argument than simple denial - because let’s not kid ourselves: unless you’ve gone looking for that content, you’ve got no proof against it either (and even if you didn’t find it, that’s no guarantee - it’s unlikely to be easy to find)
Your points are fair; however, where does it stop? If the police say “make it all plaintext”, then what happens? It’s a police request, after all.
This thing where chat platforms and others “need” to comply with police / govt orders and remove content is very tricky… should platforms really censor everything governments ask for? What if it’s a group chat about a corrupt political party in power (with proof)? The govt will say it’s CSAM, then Signal will shut it down and our democracies are gone.
To make it really clear: I’m not for breaking the law, and I don’t think such content should be on these platforms. The problem is that once you start removing that content, the precedent will be abused to remove other genuinely important material because “it is CSAM”, and with E2EE the platform has no way to check whether it really is CSAM, nor should it be the judge of the content.
this is the slippery slope fallacy… “where does it stop” is not a valid argument to not start