This is useful if you want to keep the content of your messages secure, but if you need to keep your identity, social graph and the fact that you conversed with certain people obfuscated, I don't think Delta Chat via email is a good solution.
It's also only decentralized as much as public email infrastructure is decentralized.
It's basically GPG with better UX.
That’s great, but I’m not holding my breath. PGP isn’t architecturally well-equipped to provide forward secrecy. In the meantime, I think it’s borderline negligent to put this in the category of secure messaging; the world’s expectations for security baselines have moved on beyond the mid-2000s.
(My reference point here is Keybase, which built a very user-friendly and misuse-resistant encrypted chat on top of PGP in the mid-2010s. They couldn’t get to forward secrecy either with PGP as their substrate.)
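(For readers wondering what "forward secrecy" buys: a toy Python sketch of a symmetric hash ratchet, loosely modeled on the chain-key step in OTR/Signal-style protocols and not a real implementation. Each message key is derived from a chain key that is immediately ratcheted forward, so a compromise today cannot decrypt yesterday's traffic; a long-lived PGP key, by contrast, decrypts everything ever sent to it.)

  import hashlib, hmac

  def ratchet(chain_key: bytes):
      """Derive a one-time message key and the next chain key (toy construction)."""
      message_key = hmac.new(chain_key, b"message", hashlib.sha256).digest()
      next_chain_key = hmac.new(chain_key, b"chain", hashlib.sha256).digest()
      return message_key, next_chain_key

  chain = hashlib.sha256(b"initial shared secret").digest()
  message_keys = []
  for _ in range(3):
      mk, chain = ratchet(chain)
      message_keys.append(mk)

  # Stealing `chain` *now* does not recover message_keys[0..2]: the old chain keys
  # were overwritten and HMAC-SHA256 cannot be run backwards. A stolen long-lived
  # PGP private key, by contrast, decrypts all past mail encrypted to it.
  print([k.hex()[:16] for k in message_keys])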
> as for metadata, as long as the messages are sent from my personal email server to the destination's email server using a TLS connection, the metadata is accessible only on those two servers.
To the best of my knowledge, MTA-STS adoption rates are still abysmal[1]. It’s a move in the right direction, but this kind of shambolic jigsaw approach to communication security isn’t appropriate in 2025. Sensitive messages should go over protocols designed to carry them.
[1]: https://www.uriports.com/blog/mta-sts-survey-update-2025/
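(For the curious, this is roughly what MTA-STS discovery per RFC 8461 looks like from the sending side; a sketch assuming the third-party dnspython package, with the domain as a placeholder. The adoption problem is that step 1 returns nothing for the vast majority of domains, so senders fall back to opportunistic TLS.)

  # Sketch of MTA-STS discovery (RFC 8461). Requires `pip install dnspython`.
  import urllib.request
  import dns.resolver  # third-party: dnspython

  def mta_sts_policy(domain):
      # Step 1: the domain advertises MTA-STS via a TXT record at _mta-sts.<domain>.
      try:
          answers = dns.resolver.resolve(f"_mta-sts.{domain}", "TXT")
      except Exception:
          return None  # no record: the sender falls back to opportunistic TLS
      if not any("v=STSv1" in rr.to_text() for rr in answers):
          return None
      # Step 2: the policy itself is fetched over HTTPS from a well-known URL.
      url = f"https://mta-sts.{domain}/.well-known/mta-sts.txt"
      with urllib.request.urlopen(url, timeout=10) as resp:
          return resp.read().decode()

  print(mta_sts_policy("example.org"))  # prints None for most domains today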
[1]: https://support.delta.chat/t/autocrypt-key-rotation/2936
Which brings up a point I suppose. Delta Chat is not really doing OpenPGP. They are mostly doing Autocrypt. Autocrypt was an attempt to do encrypted email without the bother of identity verification. It has always seemed like a bad idea to me. The Delta Chat project ended up adding identity verification on top of Autocrypt.
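(For context on what "without the bother of identity verification" means mechanically: Autocrypt piggybacks the sender's public key on every outgoing mail in an Autocrypt: header, and receiving clients remember the latest key seen per address, trust-on-first-use style. A rough sketch of parsing such a header; the key data below is a placeholder, not a real key.)

  # Sketch of Autocrypt (Level 1) header handling: the sender's key rides along in
  # every mail and is remembered per address, with no fingerprint verification.
  from email import message_from_string

  raw = (
      "From: alice@example.org\n"
      "To: bob@example.net\n"
      "Subject: hello\n"
      "Autocrypt: addr=alice@example.org; prefer-encrypt=mutual; "
      "keydata=mQENBF...placeholder\n"
      "\n"
      "hi bob\n"
  )

  msg = message_from_string(raw)
  attrs = {}
  for part in msg["Autocrypt"].split(";"):
      key, _, value = part.strip().partition("=")
      attrs[key] = value

  peer_keys = {}                                  # address -> latest key seen (TOFU)
  peer_keys[attrs["addr"]] = attrs["keydata"]
  print(attrs["addr"], attrs.get("prefer-encrypt"))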
i have no insight into the development, but i suppose that swapping out PGP for something entirely different should technically be possible.
they did develop a peer to peer protocol with forward secrecy for real-time messages that sidesteps SMTP entirely. seems a bit weird given the premise, but the devs are at least not limiting themselves to SMTP and PGP.
That would probably be good, but email is still a terrible substrate for secure messaging. Clear metadata is security poison; you want as little of it revealed to participant servers as possible.
> they did develop a peer to peer protocol with forward secrecy for real-time messages that sidesteps SMTP entirely.
That’s great, but in that case: what’s the value proposition relative to Signal or even Matrix?
I mean this kindly: I wish they would think a little bit more inside the box, and converge onto a proven design.
(It’s worth noting that your “existing infrastructure” argument is exactly why Signal uses phone numbers. Using existing infrastructure is a great idea, so long as it doesn’t compromise the security expectations any reasonable user has. That isn’t currently true for Delta Chat.)
the reason may be the same, but the effect is entirely different. until recently signal did not allow hiding the phone number, failing my privacy expectations. a public phone number is something entirely different than a public email address. signal is also centralized with its own servers. deltachat works completely without dedicated servers. and emails easily allow multiple accounts.
and what are reasonable security expectations? what you and i consider reasonable does not at all match what the general population expects. for most people sending encrypted emails would already be a win. (autocrypt also works with regular email clients, not just deltachat)
the goal here is to raise the general use of encryption in messages. if that is not sufficient then deltachat is not the right tool. but i have friends on telegram and whatsapp. getting them to use deltachat would be an improvement.
There is an inbuilt drive for decentralization, as "anybody" could run a server (I just set one up).
> and what are reasonable security expectations?
End-to-end encryption that the user can’t accidentally downgrade from and that doesn’t spray valuable metadata across the Internet. That’s table stakes; I’m not interested in lowering my standards below that.
> for most people sending encrypted emails would already be a win.
I don’t think this is even remotely true. I think the average person doesn’t know what an encrypted email is. We’re now in at least the third decade of encrypted email techniques, and adoption outside of corporate S/MIME (another can of worms) is marginal.
There’s almost too much to even say here; it’s a disservice to even accept the implicit assumption that users would use encrypted email correctly if they could be made to: the single most common breakage point for all of this stuff is still people replying or forwarding previously encrypted messages in the clear!
> the goal here is to raise the general use of encryption in messages.
No. The goal is security. “General use of encryption” goes back to putting ideology before security. The goal is to actually put users in a position where adversaries struggle to collect the kinds of data and metadata that would allow them to harm people. The US famously kills people based on metadata[1], and we’re the “strict” ones in terms of evidentiary standards.
[1]: https://www.nybooks.com/online/2014/05/10/we-kill-people-bas...
true, i wasn't thinking about security here but reuse of infrastructure. signal doesn't reuse infrastructure because it needs its own servers.
> End-to-end encryption that the user can’t accidentally downgrade from
that's a fair point.
> that doesn’t spray valuable metadata across the Internet
i find that a gross exaggeration. yes, metadata can be read by every server the mail passes through. but in practice most mails only touch the sending and the receiving mail server. if both of those servers are under the control of the sender and recipient and the connection between them is encrypted then the metadata remains private.
also, where i use deltachat, the alternative is to use email.
> I think the average person doesn’t know what an encrypted email is
which is why we need more encryption by default.
> adoption outside of corporate S/MIME is marginal.
because it is too hard to use. deltachat makes it easy to use. next possible step: delta mail. a more traditional mail client that makes encryption as easy as deltachat does.
> The goal is to actually put users in a position where adversaries struggle to collect the kinds of data and metadata that would allow them to harm people
there is a long road to get to that. more encryption is just one step, but a necessary one. i agree with you, but the goal can't be reached if we don't work on multiple fronts. one of those is helping people to learn about encryption and privacy, which only happens by slowly getting them to use better tools and by improving those tools.
rejecting deltachat means rejecting something that improves on the current state in favor of something better that is not attainable for some. sometimes that makes sense, especially if the solution promises more than it delivers. and deltachat would fall into this category if it were to promise complete privacy. but i don't think it does that.
i have friends who outright refuse to sign up to a new service. but deltachat is ok because they can use their existing email for it. technically that sounds the same as saying that with signal you can reuse your existing phone number, but people already have much higher privacy expectations around sharing their phone number, and also deltachat doesn't share your email address except with recipients, so it really isn't the same thing.
Why are we entertaining this hypothetical? It isn’t true in practice; the average user doesn’t control their mail server. The average user is using Gmail or Outlook, where their metadata is a single subpoena away.
And again, it just isn’t true: you need not just control over the server but also strict transport security for this property. This is not widely true of mail servers on the Internet.
> rejecting deltachat is rejecting something that improves the current state for something better that is not obtainable by some.
I don’t agree. I think the average user has multiple high-quality E2EE messaging technologies available to them, and that Delta Chat effectively muddies the water by providing a worse security posture with the trappings of a familiar-but-unsecurable ecosystem (email).
(I also don’t know why people think Signal shares your phone number with people other than recipients. To my knowledge, that has never been the default and presumably never will be, even with their private contact discovery protocol.)
fair point. there are options however. you are not locked into trusting a specific entity. but the critical point is that even signal is able to figure out who is talking to whom: https://sanesecurityguy.com/articles/signal-knows-who-youre-... sure, for SMTP the contact details are directly in the messages, which is worse, but i don't know of any service that works completely without metadata. but signal is at least trying.
> also strict transport security for this property. This is not widely true of mail servers on the Internet
since gmail requires TLS i highly doubt that there are many servers out there that don't support it.
> the average user has multiple high-quality E2EE messaging technologies available to them
available and willing to switch are different. as i said, my friends are not willing to sign up to yet another messaging service. it's social media fatigue.
> why people think Signal shares your phone number with people other than recipients
that's not the point, at least for me. i am hesitant to share my number with signal or any other service, and worse, i do not want to share my number with the people i talk to. i refused to use signal until the latter was fixed. i refused whatsapp too, but too many people that i need to reach demand it, so i had no choice.
these are all trade-offs. not everyone agrees on the same, and while i understand and principally agree with your arguments, for me they don't work because i can't convince my friends. i also have other friends who do run their own mail servers. i have contacts who require whatsapp and others who can only use wechat. most often i don't have a choice. i am using whatever i can get people to agree to, and for that deltachat is a good option. signal could have been a better option but unfortunately their requirement to share phone numbers until recently made them a worse option than deltachat or even telegram for anything but 1:1 communication with trusted friends (those who i trusted to have my number). that has changed now, and i started to use it. but it will take time to build up my contacts there. btw, in some countries it is not even possible to sign up to signal. the number gets rejected.
Gmail doesn’t require TLS, unless by that you mean that their webmail interface is TLS only. Like every other mail provider, they do opportunistic TLS on external delivery, and TLS on MUA connections (SMTP and IMAP) is largely at the mercy of user configuration.
The fact that people seem to think that TLS is a mainstay of the email ecosystem is clearly part of the problem here.
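(To make "opportunistic TLS" concrete: this is roughly the decision an MTA makes on delivery, sketched with Python's smtplib; host names are placeholders and real MTAs are of course not built this way. Note the fallback branch: if STARTTLS isn't advertised, or a middlebox strips the advertisement, the mail, its headers and its addressing go over the wire in the clear unless something like MTA-STS or DANE forbids it.)

  import smtplib, ssl

  def deliver(mx_host, sender, rcpt, data):
      # Opportunistic TLS as most MTAs practice it: use STARTTLS if offered,
      # otherwise carry on in the clear. A downgrade attacker only has to hide
      # the STARTTLS capability from the EHLO response.
      with smtplib.SMTP(mx_host, 25, timeout=30) as smtp:
          smtp.ehlo()
          if smtp.has_extn("starttls"):
              ctx = ssl.create_default_context()
              ctx.check_hostname = False       # opportunistic TLS typically skips
              ctx.verify_mode = ssl.CERT_NONE  # certificate validation entirely
              smtp.starttls(context=ctx)
              smtp.ehlo()
          # else: plaintext delivery; headers, addresses and body visible on the wire
          smtp.sendmail(sender, rcpt, data)

  # deliver("mx.example.net", "a@example.org", "b@example.net",
  #         "From: a@example.org\r\nTo: b@example.net\r\nSubject: hi\r\n\r\nhello\r\n")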
As for the rest of this: I’ve hammered on about Signal because it’s the naive right choice, but it’s ultimately up to you to decide whether your phone number is an acceptable public identifier. But even if it isn’t, there is so much out there that’s indisputably better than this mess: Matrix or even iMessage (with an email identifier instead of a phone) would be better.
according to this article it does:
https://www.valimail.com/blog/the-new-requirements-for-email...
and for one i think this is a good thing.
otoh, according to this it doesn't:
https://support.google.com/mail/answer/6330403
but https://transparencyreport.google.com/safer-email/overview shows that by now almost all emails sent and received by google go through TLS, which i believe can be used as a proxy to assume that most servers out there now support TLS.
signal fixed their phone number problem, so that is no longer an issue.
matrix is not reliable enough. the encryption can break in the sense that messages can no longer be read. i am basically required to have a second unencrypted backchannel (or use a different app, but then why even bother) to make sure i can reach someone. (the issue i experienced could be due to a misconfiguration of a matrix server, but that's a bug in itself. it should not be possible to change the configuration of a server in such a way that my messages arrive but can not be decrypted anymore.)
what server & client are you using?
the situation is as follows:
there are multiple servers and users involved. let me name the servers A, B, C and matrix.org.
i have accounts on A and B, and my friend has an account on C. others have accounts on matrix.org. all of us are in a group on matrix.org (i am in the group with both of my accounts from A and B).
with both my accounts i can see but not decrypt messages from before i joined. yet the group's chat history setting is "visible for all participants" and not "visible from joining"
on account A i can read messages since joining, except for those from my friend on C. my friend on C also can not read messages from A in the group. nor can we talk to each other directly.
now, A is a very restricted server that blocks many other servers as a spam protection measure. as far as i can tell, it does block server C but it does not block B. B doesn't have any blocks.
that i am unable to open a direct connection to C from server A is expected because of the block. from server B this is not a problem. B can also read all messages in the group (after the join date)
what bothers me is this: even if server A blocks server C, why does it block messages that C sends into a group on matrix.org? groups should either be allowed fully or not allowed at all. it doesn't make sense that groups break for members on blocked servers.
now, A blocking C is not intentional and i could ask the admin to remove the block, but let's assume that it is intentional because maybe there are many spammers on C and my friend is an exception.
what i wonder is why even allow blocking in this form at all?
i am the only member from server A in the group. what benefit does server A have from blocking users from C in the group i joined on matrix.org? i could understand if A doesn't want people from C to join groups on server A, or connect to people on server A. so block directly incoming connections. but why block messages in a group that's not on server A? i joined that group. dealing with C should only be my problem. also, the messages aren't even blocked. they just can't be decrypted. so traffic is not even reduced. this is not encryption randomly breaking. this looks more like a problem with how blocking works to me.
also i think it would make sense that despite blocks, individual members from A should be allowed to initiate connections to users on blocked servers. it's connections from C to A we don't trust, but connections from A to C should be fine, because everyone on A is trusted.
the way i see it, if i am allowed to join a group, i should be able to see all messages in the group, and everyone should be able to see my messages, even from people on blocked servers and no blocking rule should be able to prevent that. if i should not see those messages then i should not even be allowed into the group. once i am in a group, there should be no blocks getting in the way.
users from blocked servers should not be able to access groups or contact people on the blocking server. and maybe users from the blocking server should not be allowed to join groups or talk to people on blocked servers. but that would ideally be a separate permission.
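(A hedged aside on the "arrives but can't be decrypted" symptom above: one plausible explanation, based on how Matrix E2EE generally works, is that encrypted room events can reach server A indirectly via the other servers in the room, while the Megolm room keys are sent as to-device messages directly from C's server to A's and are therefore dropped by the block. The toy Python model below only illustrates that split; it is not a diagnosis of these particular servers.)

  # Toy model, not a diagnosis: encrypted room events may reach a server
  # indirectly (relayed/backfilled via other servers in the room), while Megolm
  # room keys travel as to-device messages sent directly from the sender's
  # server to each recipient's server. Block the direct path and the ciphertext
  # still arrives, but the key to read it never does.
  blocked = {("C", "A")}          # server A refuses direct traffic from server C

  def direct(src, dst):
      return (src, dst) not in blocked

  def deliver_room_event(src, servers):
      return {dst: True for dst in servers}              # propagates via the room anyway

  def deliver_room_key(src, servers):
      return {dst: direct(src, dst) for dst in servers}  # point-to-point only

  servers = ["A", "B", "matrix.org"]
  got_event = deliver_room_event("C", servers)
  got_key = deliver_room_key("C", servers)
  for dst in servers:
      state = "readable" if got_event[dst] and got_key[dst] else "ciphertext only"
      print(f"server {dst}: {state}")   # server A: ciphertext only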
another issue is the key handling. i find it confusing as to what i need to back up so that i can reopen a connection from another device. deltachat has a simple export profile. i save that and i import it on another device and i am done.
I think this is counter-productive, limiting the adoption of meaningful security improvements. The engineering and UX implications of PFS and full metadata encryption (in particular social graphs) are severe. Not even Signal has that, and they are above and beyond for a mass consumer product.
From the physical world, it’s like saying that having addresses on the letter is the same as the government opening and scanning the contents of every letter. Of course I don’t like the indiscriminate metadata collection, but there are worse things.
If you’re a spook or dissident, by all means, take extra precautions. You’re gonna need to anyway, in many more disruptive ways than your messaging app. Personally I just want to share shitposts with friends and speak freely without second guessing if I’m gonna be profiled by a data broker, or someone is gonna scan and store the pictures I send forever. Keep in mind that the status quo (Gmail, DM on social media) is incredibly bad.
These kinds of message board discussions invariably pose a dilemma: "send messages in plaintext using normal email, or use whatever secure messaging tool is available regardless of its strength". That's false. People always have a third option: not sending the message electronically. Most of us here have messages we wouldn't send even with our most trusted messaging tools; people who are at serious risk from message interception have much more dangerous messages than that.
Recommending that at-risk people use weak secure messaging as a "better than nothing" step towards real secure messaging isn't just bad advice. It's malpractice.
> Unless your messenger is at pains to make sure people don't use it in life-or-death situations [...] the exact opposite thing is true
Right, this is the false-sense-of-security effect. It exists and it's real. But there are more aspects that weigh in.
> People always have a third option: not sending the message electronically.
I challenge this assumption. In reality the effect is not about what they can do if they listen to the advice of Bruce Schneier, but what they will do. Navel-gazing on security and throwing your hands up if people don't act "the way they should" is what's really irresponsible, imo. I.e. if your contacts are not physically close, they won't (or even can't) schedule a flight to send a message. They'll generally use what's socially convenient, even if they're discussing something like abortion in an oppressive state. If you're lucky non-techies will say "Hey, maybe we should try that app Signal, I heard it's more secure". That's as good of a win as it gets.
The counter-example would be going around saying Signal is worthless because they collect phone numbers, they don't enforce public key validation, and they don't use onion routing to protect your social graph. I don't think we disagree about how ridiculous that would be, even if we disagree on which aspects are most important.
Basically, if you set the weight of all security properties to ∞, you will get something that's so wildly inconvenient that nobody would use it. Even PGP, which is relatively easy to use, was at its peak about as popular as starting a yak farm.
I disagree, people will end up in prison or dead if they let a false sense of security compromise themselves. It should be stressed that certain sensitive activities should not involve computers, phones, etc because of the very real possibility of dire consequences. If someone is desperate enough where they have to resort to using computers to do sensitive activities, they should be given the best advice, caveats emphasized, and not just what someone feels is "good enough".
> Not even signal has that, and they are above and beyond for a mass consumer product
What parts of this do you think are missing from Signal? Signal has had PFS for as long as it’s been called Signal, and has famously minuscule metadata on users.
The social graph isn’t e2ee in any app that works because the server needs to route the message. And the social graph is metadata.
You are welcome to live your privileged life with your privileged friends using any software you feel is good enough. Just don't assume everyone can afford that luxury.
https://pressgazette.co.uk/news/rsf-moves-downgrades-global-... is a decent index to assess in what kind of country you're living in.
That's already a lot more decentralized than most web services we use on a daily basis
Email is an open interoperable standard, owned by nobody.
You can run your own email infrastructure just fine (I do, many do).
So it is fundamentally different from all the proprietary walled gardens which have a single owner that controls everything.
Telling Joe Shmoe that he should run his own email infrastructure instead of using literally anything actually built for E2EE is an ideological argument, not one grounded in Joe’s message security expectations.
See parallel response. Open source is not the same as an open interoperable standard.
> And email is de-facto owned by a small handful of service providers.
No, not really. Yes there are large providers who manage a lot of people, but it is not owned by anyone.
> Telling Joe Shmoe that he should run his own email infrastructure
That's not necessary either. Joe can get his email from any of thousands of providers ranging from large to tiny if he doesn't want to run it. Service can also be delegated in various ways depending on comfort and convenience. For instance, one mixed setup is to manage receiving by one provider (which could be oneself, to guarantee you can't get locked out) and delegating sending to a different provider (self or others).
It's also easy to delegate to a tech-savvy friend or family member. I run email for my own domains but also for most family members and a few consulting businesses in our circle.
This is the power of open standards codified in RFCs. It is what the Internet was meant to be. Walled gardens was never part of the plan.
Sadly, as with many things, Gmail effectively controls it de facto, nowadays ...
Similarly, using SimpleX private message routing via .onion message relays and the fact that the system has no identifiers can also afford you that obfuscation.
According to https://github.com/simplex-chat/simplexmq/blob/stable/protoc...:
> identify that and when a user is using SimpleX.
Does this apply to Cwtch?
Also, is it not possible to obfuscate this traffic? Tor with obfs4?
Related:
#1 - https://security.stackexchange.com/questions/241730/traffic-...
#2 - https://github.com/simplex-chat/simplex-chat/issues/4300
#3 - https://github.com/tst-race/race-docs/blob/main/race-channel...
Heavily sandboxed SimpleX that's firewalled to block any non-Tor traffic. I chose this one because it allows for offline message sending/receiving, despite privacy implications, and because it has clients people will actually use.
Cwtch doesn't let you send messages when the recipient is offline by virtue of how it works, which is more secure, but inconvenient.
When evaluating Cwtch, I think I read somewhere it might send identifying metadata to your recipient, or something similar, but I might just be making that up. I'll have to look up what I was reading.
> > identify that and when a user is using SimpleX.
> Does this apply to Cwtch?
With Cwtch you're running two hidden services, one on either end of the chat, and that happens over Tor with no middleman service, so no. A passive network observer can tell when you're connecting to Tor, but you can attempt to obfuscate that with transports.
Such as obfs4, I presume.
I read about RACE just now, seems interesting:
- https://github.com/tst-race/race-quickstart?tab=readme-ov-fi...
- https://github.com/tst-race/race-destini
Have you heard about it, or have you used it before?
> Cwtch doesn't let you send messages when the recipient is offline by virtue of how it works, which is more secure, but inconvenient.
I agree. How much more secure is that? In the case of Ricochet, this only applies to friend requests. You have to be online to be able to receive friend requests, which I am fine with.
It's much more secure wrt metadata. There is no third party server that's able to amass metadata about the two users conversing. SimpleX doesn't hide your IP address from the server, and given that there are exactly two parent companies hosting ALL of the official servers, it's not too hard for Akamai or https://runonflux.com/ or anyone who compromises their OOBM systems to perform end-to-end correlation between two users.
https://discuss.privacyguides.net/t/simplex-vs-cwtch-who-is-... has a lot of discussion about Simplex vs Cwtch.
Similarly, built-in routing over Tor can make performing correlation attacks difficult for some adversaries, and if you elect to use your own .onion servers instead of the official ones, it adds another layer of obfuscation.
[1] https://github.com/simplex-chat/simplexmq/blob/stable/protoc...
How do you configure SimpleX on Android to use your own SMP servers BTW?
I would also like to know how I would configure SimpleX on Android to use my own SMP servers.
Edit: I found this: https://simplex.chat/docs/server.html.
And I found:
# `socks_mode` can be 'onion' for SOCKS proxy to be used for .onion destination hosts only (default)
# or 'always' to be used for all destination hosts (can be used if it is an .onion server).
# socks_mode: onion
In any case, I believe what I was looking for is https://simplex.chat/docs/server.html. On Android, however, this is not as easy or straightforward and I cannot think of a way to do this, to be honest. That is why I prefer these programs to have Tor bundled and run the hidden service by themselves with a hardened-enough torrc. Ricochet does this on desktop, which I think is the right way to go about this. SimpleX's server (https://github.com/simplex-chat/simplexmq) should do this.
What I do is run Wireguard on my server with a Tor daemon, connect to the WG network on my phone and then access the SOCKS and DNS proxies the Tor daemon exposes.
That way there is no need for Orbot or running Tor on Android at all.
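(For anyone replicating that setup: once the remote Tor daemon's SocksPort is reachable over the WireGuard tunnel, any SOCKS5-capable client can be pointed at it. A rough sketch with the third-party PySocks library; the 10.0.0.1 address, the .onion host and port 5223 are placeholders for your own SMP server details. The rdns=True flag keeps .onion name resolution inside Tor instead of leaking it to a resolver.)

  # Sketch: reach a remote Tor SocksPort over WireGuard and connect to an .onion
  # SMP server through it. Requires `pip install pysocks`. The 10.0.0.1 address,
  # the .onion hostname and port 5223 are placeholders for your own setup.
  import socks  # PySocks

  s = socks.socksocket()
  s.set_proxy(socks.SOCKS5, "10.0.0.1", 9050, rdns=True)  # Tor daemon on the WG peer
  s.settimeout(60)
  s.connect(("yoursmpserveraddressgoeshere.onion", 5223))  # placeholder .onion
  s.close()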
"To mitigate this problem SimpleX Messaging Protocol servers support 2-hop onion message routing when the SMP server chosen by the sender forwards the messages to the servers chosen by the recipients, thus protecting both the senders IP addresses and sessions, even if connection isolation and Tor are not used."
The thing is, like I said, there are only two main companies running all the servers: Akamai and RunOnFlux. So unless Tor is used, it's a 50-50 chance that both users are connecting to servers run by Akamai. It doesn't matter if the two servers don't share with each other the information about the IP address of the user's peer; it's enough that the parent VPS company has access to all traffic coming into the infrastructure. There's nothing "onion" about that routing. It's much closer to just traffic between two nodes of a server farm, which is what practically any scalable IM server does.
Yep, but the author of obfs4 says not to use it; there are more modern transports with fewer flaws.
At the end of the day, the transport lists are public, but sharded, so it's truly just obfuscation no matter what transport protocol you use. Someone observing your connection with the resources to map out transport relays can tell if you're using Tor.
> Have you heard about it, or have you used it before?
I haven't, but it looks interesting. It seems they're doing a similar mixnet approach to SimpleX.
> I agree. How much more secure is that?
If you don't want to rely on a third party to queue and relay your messages when your recipient comes online, it's one less party that you're sharing information with.
I also believe it opens you up to Tor correlation attacks, like what happened with Ricochet. Maybe an overlay mixnet can add some further obfuscation, as with SimpleX and RACE, but I assume those overlays are vulnerable to correlation attacks, as well.
Such as?
So… entirely? What am I missing about your point?
In practice, it's quite centralized and you're always at risk of one of the big providers locking your servers out of their network or putting you on a blocklist they all use.
05-mar-2025 https://news.ycombinator.com/item?id=43262510 100 comments
24-jan-2021 https://news.ycombinator.com/item?id=25893626 148 comments
07-jan-2021 https://news.ycombinator.com/item?id=25674894 4 comments
27-feb-2019 https://news.ycombinator.com/item?id=19263357 11 comments
21-feb-2019 https://news.ycombinator.com/item?id=19216827 56 comments
03-feb-2017 https://news.ycombinator.com/item?id=13560279 1 comment
https://delta.chat/en/help#pfs
It's great they're being open about the implications. But given that there's better protocols out there (Signal protocol for example), it makes no sense to use inferior apps.
(The response here might be that you could run your own mail server, but you’ve now excluded >99% of the world’s population from the essentially reasonable expectation of secure messaging. Plus, you’re then dealing with the ongoing misery of securing your own mail host.)
Not true, because an open standard will always be superior to a company-owned (and controlled) app.
I run all my own email infrastructure. Many of my friends do. We can communicate without any corporate overlord deciding who can say what.
Signal is a company, one that demands a phone number to use their proprietary service and can shut you out in a nanosecond. No thanks.
But, and maybe I'm stating the obvious but it is a critical difference, open source is nice but much inferior to open interoperable standards.
Signal-the-company does not allow any clients other than their proprietary compiled client (I believe they sort of tolerate some, but not supported). So while in theory I could use the open source software to run my parallel signal-protocol network, it won't interoperate with the one run by Signal-the-company which is where most people are. So, not actually useful.
Contrast this with email which is an open standard. I can run any SMTP server I like and any MUA I like (or even write my own for one or both), and interoperate with the whole universe of people who use email.
(But also, this isn’t a good argument! Repressive governments love metadata, and email is an amazing source of unbounded metadata even with these kinds of “secure” layers slapped on top. If I was a government looking to snoop on my citizens, I would absolutely push them towards the protocols I can infer the greatest amount of behavior from.)
I'm not sure your second point holds either - for most nations, an active connection to imap.gmail.com leaks little other than how actively the user uses gmail. Correlating senders and receivers from that data sounds technically challenging enough that I wouldn't expect repressive regimes to be capable. But, to be fair, I base that on nothing.
Yes; the point was not that they’re the same, but that regimes that do the former tend to also do the latter. Moreover, we shouldn’t do insecure things because regimes block the secure things; that’s what the regime wants you to do. The answer might not be Signal if Signal is insufficiently decentralized, but it certainly isn’t email.
> for most nations, an active connection to imap.gmail.com leaks little other than how actively the user uses gmail
This alone is a significantly larger amount of metadata than schemes like Signal leak. But it also isn’t true: a country that controls its internet infrastructure can almost certainly pull much more metadata from plaintext IMAP/SMTP than just access times and addresses. And this isn’t hypothetical: STS is not widely adopted in the email ecosystem, so plaintext downgrades are pervasive.
Nations don't have to do any of that, they can just subpoena the email host for the data, or just ask nicely for it, as companies are wont to work with law enforcement and the regimes they do business with.
The point of many of anonymizing and "private" chat services is the lack of data sitting on third-party hosts that can later be shared with adversaries.
It took me two minutes to figure out DeltaChat connects to the server with SNI "nine.testrun.org". Banana dictatorships can trivially write firewall rules to cut those connections. There are other servers, but if those are going to be usable by anyone, they're going to have to be public, and writing block-rules is trivial compared to spinning up new servers.
I'm not saying Signal is much better in this regard, I'm just saying resilience isn't a useful metric to assess messenger security.
sounds like a bug that can be fixed. it should not need to make that connection unless you create an account on that server.
not quite. the default server feature is only a year old, while deltachat itself goes back to at least 2017, so the majority of users will not be on that default server now. and it would be possible to offer a randomized selection to prevent one default server from dominating.
Also, I'm unsure if it's smart the client just picks a server for you at random. AFAIK this uses email as back-end so it's not like you can just swap your email address host like you can swap telco while keeping your phone number. One option would be to have the user first whitelist the email providers they'd trust, but most users usually prefer trusting the app vendor as they're trusting it with the client anyway.
No forward secrecy and will automatically switch to unencrypted messages if you receive an unencrypted message from a contact.
I wonder if it's vulnerable to downgrade attacks from adversaries falsifying the sending address. If an adversary sends an unencrypted email imitating a contact will delta chat reject it or will it silently switch the chat with that contact over to unencrypted email?
https://delta.chat/en/help#how-can-i-ensure-message-end-to-e...
and it's not just pgp with email, it's more akin to an overlay system.
JFC. There's a reason Signal dropped SMS support. What an insane design decision.
Private key login, encrypted private chats and contacts, encrypted group chats, and lightning payments. Decentralised, built on Nostr. Available on all platforms.
Also, the direct messages have three types
1) NIP-04 DM: "Most widely used", but also "not recommended". Reeks of Telegram, which also has non-secret chats as the most popular option
2) Gift-Wrapped DM: Uses a different encryption algorithm but no forward secrecy? Forward secrecy has been around for 20 years.
3) Secret DM: Can't be recovered on different devices. Why can't the backup be a self-contained database like Signal has?
Also "Secret chat requires consent from peer." Like what :D You have to wait for the contact's approval to have a private conversation with them. Sounds like it incentivizes all chats to start with less secure protocols.
The nice part about writing your own chat system is the security agility in that you can bump any security property without having to fight with protocol standardization bodies. Having three DM protocols inside the same app is wild.
Using an email address as an identifier for IM is a great idea (I hate that everything uses phone numbers for this, which are not internationally portable and not possible to reasonably “self-custody” the way TLDs are).
But using the actual email protocol as a backing protocol for instant messaging seems like a weird contortion and still makes this effectively a separate protocol, the split being servers that do and don’t support all necessary extensions. The overhead must also be staggering; just look at an email header to see how much is going on for each message these days.
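(To make the header point concrete, here is a rough sketch using a made-up but heavily trimmed message. Even when the body is a PGP blob, every header this prints, including the Received chain, DKIM signature, From, To, Subject, Date and Message-ID, is plaintext to every server that handles the mail.)

  from email import message_from_string

  # A heavily trimmed stand-in for a real delivered message; real ones routinely
  # carry a dozen Received lines plus ARC, List-* and spam-scoring headers.
  raw = (
      "Received: from mail-wr1-f53.example.com by mx.example.net;"
      " Mon, 2 Jun 2025 10:11:12 +0000\n"
      "Received: from [192.0.2.10] by mail-wr1-f53.example.com;"
      " Mon, 2 Jun 2025 10:11:11 +0000\n"
      "DKIM-Signature: v=1; a=rsa-sha256; d=example.org; s=sel; bh=...; b=...\n"
      "From: alice@example.org\n"
      "To: bob@example.net\n"
      "Subject: chat: groceries?\n"
      "Date: Mon, 2 Jun 2025 10:11:10 +0000\n"
      "Message-ID: <abc123@example.org>\n"
      "Content-Type: multipart/encrypted; boundary=b\n"
      "\n"
      "(PGP-encrypted body goes here)\n"
  )

  msg = message_from_string(raw)
  headers = msg.items()
  print(f"{len(headers)} plaintext headers before any payload")
  for name, value in headers:
      print(f"  {name}: {value[:60]}")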
Certainly if no one can implement these two things it is functionally a closed source project. It also is a security failure from the standpoint of control, validation, and also future security and vulnerability patching (there's a graveyard of dead "secure" messaging apps.)
Is DeltaChat perfect from a security standpoint? No, but it's certainly well above the hurdle most people are at now. Most people are using non-encrypted communication that is actively scanned & stored, or e2e on paper stuff where one party controls the client, server, application, and storage (trust me e2e security.)
Telegram, Discord, Facebook Messenger, stop using that shit.
It's less safe compared to Signal, and Signal is the gold standard recommendations for average Joes. "Better than Telegram" is a low bar.
Telegram is a walking time bomb with 900 million users' data waiting to be leaked from the servers.
>and, deltachat.
That must be why I've never heard of anyone using it.
>deltachat is the only one that doesn't require a smartphone and a phone number.
It leaks the IP address to the server, which by default (defaults matter) is nine.testrun.org. That server can amass metadata about users conversing, and any government entity that comes knocking can look at telco records to see which user the IP address was assigned to at the time.
If you're going to try to address metadata privacy against the service provider, you're going to have to address it properly, and DeltaChat isn't the one at that point. Neither is Signal. You'll want Cwtch for that.
Russia probably has all the Telegram data, considering they officially intervened in the recent Romanian presidential elections taking the side of the local MAGAs.
https://www.reuters.com/world/europe/telegram-founder-says-h...
What the article doesn't say is that they sent a message in Romanian to all Romanian Telegram users with the above claim, signed Durov.
So I don't think their "security" can be trusted.
the question is not what is the best, most secure, most private option, but what has the right balance between easy onboarding, ease of use, security and privacy. and maybe deltachat is not the best possible, but it is pretty good. remember, when security and privacy are too onerous then you don't have security or privacy, because people will refuse to use the tool.
Which doesn't really work in practice. The closer you move to the user, the more the threat grows of a creepy buddy watching over the metadata of people they know. A medium-sized institution like a university or a company might run their own, but that's also somewhat risky.
>the question is not what is the best, most secure, most private, option, but what has the right balance between easy onboarding, ease of use, security and privacy.
No. The question is: given an architecture that imposes fundamental limitations on what can be achieved, which tool in that domain has the best privacy-by-design system, with UX and features maximized through ingenious design.
Fundamental architectural limitations:
Does Delta Chat use data diodes? No? Then it can't have key exfiltration security, but it can have message forwarding.
Does Delta Chat use Tor Onion Services? No? Then it can't have proper metadata privacy for users' identity from the server, but it can have offline messages.
These are fundamental trade-offs.
DeltaChat is content-private by design. It might be metadata-private by policy (internal policy that server on nine.testrun.org does not collect metadata), but until that is tested in court like Signal is, we can't know for sure.
Signal is metadata-private by policy. Cwtch uses Tor Onion Services so it's metadata-private by design.
Now, it's fine to argue which is the best inside one league.
Element/Matrix is E2EE with double ratchet protocol, so it has both forward secrecy and future secrecy, which DeltaChat doesn't have.
It's only once security is more or less exactly on par, that you should be comparing general UX. Really usable but insecure tool might turn into really unusable tool when you sit in prison for your political opinions, or because you revealed your ethnicity and ICE caught on.
>maybe deltachat is not the best possible, but it is pretty good
It's not the worst out there. At least it tries to do things properly. It's just that, given the insane obstacle of moving people to a safe platform, DeltaChat is just another distraction. Until it does what the competition does security-wise, and improves on their UX, it doesn't get the top podium.
>when security and privacy are to onerous then you don't have security or privacy
Sure, but when you're in prison for using a crap tool, you won't have liberty, security, or privacy.
ideally yes, but that is not what the average user will do, and it is not what i can use as an argument to get people to switch to something more secure. convenience over security is still a user preference.
i get your point, but that falls on deaf ears among family and friends. especially using prison as an argument is really not helping. i mean by the same argument we should not be having this conversation on hackernews, because clearly we are trying to subvert the authorities by suggesting that people should keep their communication secret.
actually i don't follow that argument. it is more likely that my data gets caught up with someone accessing a larger server than my own server. if someone targets my own server they may as well target all my messaging clients and get all the data from there.
The proper way to address this is with p2p messaging, like Cwtch, where each user runs a server for their own account. Cwtch also experimentally supports caching ciphertexts on a server that's hosting the group chats that all members will have access to anyway, so there's no peer metadata to eavesdrop on.
in fact this particular threat that you describe is more likely to happen at a university server where a rogue admin may use their privilege to snoop on people they want to stalk for whatever reason, as opposed to the friend that i chose because i trust them, like say the admin of the server of the local linux user group or the hackerspace that i am a member of.
in fact i am more likely to trust anyone that i know in person, simply because even if that person decides to snoop on me we can work that out in person, and the likelihood of it happening is low because it would affect our friendship. and i would guess that this is true for most people.
at some point you have to trust someone, and the closer you are to that person, the easier it will be to resolve problems.
University students don't get to run the infrastructure of the facility, and at least in my uni, the greybeard IT staff members and faculty don't really hang out with the students aside from course environments or support groups, so there's a bigger gap. There are also salaries and careers on the line.
But bickering about who's trustworthy is pointless when there's trustless architectures for those situations already.
i am not saying it can't happen, but that the smaller the group the easier it is to assess the risk and the consequences. and for that reason i prefer smaller groups.
in austria and germany hiring students for part time sysadmin work is very common. i did those jobs, and on the other hand, stories of staff stalking that cute student they saw one day do exist.
> But bickering about who's trustworthy is pointless
agreed. it all comes down to personal experience and preference.
> when there's trustless architectures for those situations already
the problem is that the choice is not made in a vacuum. what good is a system if my friends don't want to use it. for almost all my contacts i had to follow the choices of others. very rarely someone followed my choice. and when they do i have to consider their technical capacity and tolerance to difficulties.
Anyone up to the challenge?
still i like the idea. but deltachat also has a nice UI, and for matrix i use fluffychat which is also quite nice.
edit: I didn't downvote you and I don't think someone asking an honest question like this should be downvoted
Notes and Other Stuff Transmitted by Relays.
It’s just signed json messages distributed by [websocket] relays.
> While nostr offers the ability to send encrypted DMs to user pubkeys, the metadata of these messages are broadcast publicly via relays. This is the same as a bitcoin transaction being viewable on the public ledger. The contents of the direct message will be encrypted, but other metadata like the sender and recipient can be viewed by anyone.
It’s worth highlighting as there are many affinity scammers spinning up tokens/blockchains called “Nostr”.
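(Concretely, a NIP-04 DM event sitting on a relay looks roughly like the placeholder below. Only the content field is ciphertext; the sender pubkey, the recipient pubkey in the p tag, the kind and the timestamp are readable by every relay and subscriber.)

  # Sketch of a kind-4 (NIP-04) encrypted DM event as stored on relays.
  # Every value here is a placeholder; only the `content` field is ciphertext.
  import json

  event = {
      "id": "hex event id...",
      "pubkey": "sender pubkey hex...",                  # sender: public
      "created_at": 1717315870,                          # timestamp: public
      "kind": 4,                                         # "this is an encrypted DM": public
      "tags": [["p", "recipient pubkey hex..."]],        # recipient: public
      "content": "base64ciphertext...?iv=base64iv...",   # the only encrypted part
      "sig": "schnorr signature hex...",
  }
  print(json.dumps(event, indent=2))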
Effectively nostr objects lie around on relays until they reach some server-side expiry policy, and sometimes forever if one can't figure out how to delete them. Nostr clients (web and mobile) are the wild west of good luck with which features and NIPs they support (XMPP all over again). My experience here led to dissatisfaction as well. Relays are another wild west - it's choice paralysis and a helping of good luck with that.
The real problem for IM: Nostr does not ensure all relays sync; instead a user chooses their preferred relays (some you pay for in crypto). It is entirely realistic (I experienced it) that you're on relays other people aren't on and you can't share content. This is death for an IM app and just trying to use Nostr became frustrating. (not to mention it's flooded with porn and crypto shills)
Edit: a nostr client has to keep open network connections to all these relays, as objects can be stored on multiple relays and it's up to the client - not the relays - to query all relays and then de-dupe the JSON responses. There are combobouncers in use (who will de-dupe) but they're not the default and tend to be read-only (because you can't choose which relays to post to when using a combobouncer).
But - has there been a security audit done?
Why does every system insist on having persistent names as network identifiers? Practice shows that the main threat for the vast majority of users is state censorship. In the case of Delta, Matrix, XMPP and others, once you're cut off from your home server, your account is basically toast. The only thing you can do, besides circumventing, is a cumbersome and messy account migration[1], where available.
In the case of Matrix, I feel very bitter, as I managed to onboard a considerable chunk of my personal network but most of them can't log in anymore without using VPNs. I'm not sure if I have enough social capital to convince them to repeatedly register on different servers as they get blocked. P2P[2] still feels too far away.
Why can't we use key pairs as identifiers and simply request a desired username upon first login? In the case of federated networks this would allow seamless server switching and allow users to continue their conversations. Servers shouldn't care what server a particular user's messages are coming from as long as they are verifiably theirs.
You can even add username propagation between servers (a new server requests the username from the old one, supplied with the user's login request). I know about Matrix identity servers but I don't see how they help in this case.
It took me a couple of clicks to find: https://github.com/deltachat
The "Internet Standards" link points to a URL under https://github.com/deltachat/deltachat-core-rust/ but when you go there it actually redirects and you end up at https://github.com/chatmail/core which is also confusing.
Anyone who hasn't tried it really ought to.
To the haters talking about PGP: giving your entire social graph to Meta or even Signal is considerably worse.
(Delta Chat markedly does leak your social graph, because it's email and email has no way to protect sender metadata from each user's email provider. That means full social graph recovery is one low-effort subpoena away in your attacker's municipality of choice.)
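(A rough sketch of how little effort that recovery takes: anyone with mailbox access, whether a provider or a subpoena recipient, can rebuild the graph from headers alone, along these lines. The mbox path is a placeholder.)

  # Sketch: rebuild a sender -> recipient graph purely from plaintext headers.
  # Bodies are never touched; PGP or Autocrypt does not change this picture.
  # "/path/to/sample.mbox" is a placeholder for any mailbox you have access to.
  import mailbox
  from collections import Counter
  from email.utils import getaddresses

  edges = Counter()
  for msg in mailbox.mbox("/path/to/sample.mbox"):
      senders = [addr for _, addr in getaddresses(msg.get_all("From", []))]
      rcpts = [addr for _, addr in getaddresses(msg.get_all("To", []) + msg.get_all("Cc", []))]
      for s in senders:
          for r in rcpts:
              edges[(s.lower(), r.lower())] += 1

  for (s, r), n in edges.most_common(10):
      print(f"{s} -> {r}: {n} messages")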
Their contact discovery uses SGX, which has a long list of vulnerabilities [1], and is even deprecated by Intel.
With access to the server, my guess is that getting someone's social graph is not entirely impossible.
[1]: https://en.wikipedia.org/wiki/Software_Guard_Extensions#List...
> I’ve tested Delta Chat with my own mail server, which uses Postfix and has everything configured for public e-mail, like DKIM signing, spamd, IP blocklist checks and so on, and each message took about 2 seconds from one device to another. Using a public server it sure feels below 300ms, so there is room for improvement when self-hosting a dedicated chatserver.
https://www.kassner.com.br/en/2025/05/08/delta-chat-encrypte...
In my test, both clients were ~80ms away from the IMAP server, but the server was delivering to itself. I’m also not sure if the port 587 has an idle/keepalive mechanism, or if it has to go around the entire TLS handshake at each message.
I don’t think 2 seconds is bad, most of my contacts will take at least triple that to read and type in an answer, so not a big deal.
The OpenPGP crypto can never be "outdated" because it is constantly being updated.
There's no PGP equivalent of TLSv1.3. The last time people tried that it created a huge drama.
> The OpenPGP crypto can never be "outdated" because it is constantly being updated.
Yet it hasn't been, it's not there in the implementations, it's not there in the defaults.
> giving your entire social graph to Meta or even Signal
1) Signal does not have your social graph
2) you are not required to give the app access to your contacts
Stop spreading this misinformation; it is only making it harder to get people onto secure messaging systems. You need two people using secure systems to communicate, and the result of all this horseshit is a bunch of armchair experts who haven't bothered to look into the actual security of the app making strong, confident statements. Just stop.
Even if it had half the issues people pretend it does, let's be honest: my grandma can use Signal. That's a fuck ton better than most of the alternatives out there. Frankly, that's what 99% of people need, the app that everyone can use. Not the app that some techie says is trivial...
Side note) Comparing Signal to WhatsApp is wildly disingenuous.
Side side note) there's a 30 yo pgp hack. If you reply to a gpg email with "could not decrypt" you'll get back the email in clear text. (Joke is older than the average HN user)
IMO people freak out about spam way too much. I'd rather have something that works with occasional spam than have to put up with the insanity of modern IM. Having push notifications from 10 proprietary IM apps is worse spam than a couple of emails a day from some retard trying to get me to download a "pdf." I don't block spam at all in my personal email (although I have a couple of tools automatically label it.) I'd rather have everything delivered.
I got spam to postmaster once for some reason. That's a nice way to make admins aware of your spam campaign.
Spam is presumably more of a problem when you're more well-known and you don't have the option to control your own filters.
Other than that it looks like I get like 4 spams per week.
Mind, i don't publish my email anywhere. If you look at my profile on here you'll get a gmail address.
Or at least via a proxy.
So contact invitation can just be handled with use-once codes (or at least trivially burnable ones).
So I would say it's a low priority feature in the backlog.
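(The use-once-code idea is simple enough to sketch: issue an unguessable code per invitation, accept the first contact that presents it, then burn it. This is purely illustrative and not how Delta Chat's actual invite links work.)

  # Illustrative use-once contact-invite codes: unguessable, burned on first use.
  import secrets

  class InviteBook:
      def __init__(self):
          self._open = set()

      def issue(self):
          code = secrets.token_urlsafe(16)
          self._open.add(code)
          return code

      def redeem(self, code, sender):
          if code in self._open:
              self._open.discard(code)   # burn it: a leaked code can't be reused
              print(f"accepting first contact from {sender}")
              return True
          return False

  book = InviteBook()
  c = book.issue()
  assert book.redeem(c, "friend@example.org")
  assert not book.redeem(c, "spammer@example.com")  # second use is rejected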
1. Manually screen who can send you messages like Hey[^1] and Apple[^2]
2. Basic filtering to ensure the promotional stuff gets blocked or put in a separate list [^3]
3. Rate-limit senders who are showing robot-like behaviour (a rough sketch of all three ideas follows after the footnotes)
---
[^1]: https://www.hey.com/features/spam-corps/
[^2]: https://support.apple.com/en-il/guide/iphone/iph203ab0be4/io...
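(A rough, purely illustrative sketch of the three ideas above as they might apply to an email-backed messenger; the thresholds, marker strings and classification buckets are all assumptions, not anything Delta Chat implements.)

  # Sketch: first-contact screening, basic promotional filtering, and per-sender
  # rate limiting for an email-backed messenger. All heuristics are placeholders.
  import time
  from collections import defaultdict, deque

  approved = set()                 # contacts the user has screened in
  pending = {}                     # first-time senders awaiting a decision
  recent = defaultdict(deque)      # sender -> timestamps of recent messages

  RATE_LIMIT = 20                  # max messages per hour from one sender
  PROMO_MARKERS = ("unsubscribe", "view in browser")

  def classify(sender, subject, body, has_list_headers):
      now = time.time()
      q = recent[sender]
      q.append(now)
      while q and q[0] < now - 3600:
          q.popleft()
      if len(q) > RATE_LIMIT:
          return "rate-limited"                       # robot-like volume
      if has_list_headers or any(m in body.lower() for m in PROMO_MARKERS):
          return "promotions"                         # separate list, not the chat view
      if sender not in approved:
          pending[sender] = (subject, body)
          return "screening"                          # Hey/Apple-style first-contact gate
      return "inbox"

  print(classify("shop@example.com", "SALE", "click to unsubscribe", True))   # promotions
  print(classify("friend@example.org", "hi", "lunch?", False))                # screening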
Edit: Also this wasn’t about collecting phone numbers, but about providing one for your business if you host a publicly accessible site
You see, most EU countries decided some time ago that allowing people to own mobile numbers without a background check was simply too dangerous. What if someone used a burner phone to commit fraud, or worse — say something mildly controversial on the internet? To prevent such dystopian chaos, SIM registration laws were born. Now, whenever you purchase a SIM card in France, Germany, Spain, or pretty much anywhere with croissants, you have to offer your passport, soul, and, ideally, a letter of recommendation from your local constable.-
The result? Your phone number in the EU is no longer just a string of digits—it’s basically your name, address, and social security number all rolled into one. It’s like a little snitch in your pocket, ready to identify you at the first sign of online mischief. Online platforms know this. That’s why so many of them, from social networks to AI models, insist on a phone number. They’re not just trying to text you cute security codes — oh no, they’re trying to make sure there’s a warm, squishy, legally-recognizable human on the other end. Preferably one without too many fake Twitter accounts.-
Technically, GDPR is supposed to protect your data. That includes your phone number. But there’s a loophole the size of Luxembourg: if the phone number is used to stop terrorism, fraud, bots, or people being mean in the comments, then suddenly it’s all hands on deck. Platforms benefit from the comforting knowledge that EU phone numbers are like digital dog tags: traceable, trackable, and just annoying enough to prevent the average troll from spinning up 50 accounts to yell into the void.-
Of course, this all raises philosophical questions. Like: should your right to privacy hinge on your desire to play Candy Crush in peace? Is a SIM card a person? Could it run for European Parliament? And should we perhaps explore more civilized alternatives to this “one phone number equals one identity” system, like zero-knowledge proofs or just asking nicely?
In the meantime, welcome to the EU: where the cheese is soft, the bureaucracy is hard, and your SIM card knows more about you than your therapist.-
There are several countries that didn't buy into the madness of registering SIMs, luckily. Most strangely, the UK, the master of CCTV. Apparently they realized that it's a useless measure and will just anger the people.
Briar supports communication over multiple mediums, including wifi & Bluetooth, has forward secrecy, and feels quite 'signal-like', so it's not impossible to get people to use it.
Briar looks really nice for people that really need the extra security and privacy, though.
I've been interested in SimpleX lately, but I don't really have anyone to properly test it with.
Delta Chat is an instant messaging scheme. It is still good to use preexisting standards where possible.
Maybe something Proton should build on for its own chat app.
Maybe with AI there could be a sort of decentralized antispam filtering. But maybe not.
Have there been any third-party security audits done by reputable companies?
If not, it's not safe to use - who knows what's buried in the source code (even if the source code is open).
Their FAQ answers this:
> Yes, multiple times. The Delta Chat project continuously undergoes independent security audits and analysis
I also built OTR on top of Discord but it requires Nitro because the messages for OTR end up being way too long. :(
i am using element/matrix and i have tried briar. the usability of deltachat and the ease of onboarding beat both of those. briar was especially difficult to get started with and only has very limited usefulness compared to the others. and matrix is simply very complex and easier to misconfigure.
signal does not use a standardized protocol, and it requires a phone. that's not an alternative. my children have deltachat on their laptop. i can talk to them when i am not at home without needing to give them a phone.
OTR has had forward secrecy for 21 years. The effin headline stated PGP was a faulty model https://dl.acm.org/doi/10.1145/1029179.1029200
Why anyone would implement something PGP-like, without forward secrecy, 13 years later beats my understanding. I mean, 13 years is also the time difference between OTR and PGP. I guess some devs don't read the cornerstone papers of the field they supposedly specialize in :)