• 0 Posts
  • 43 Comments
Joined 1 year ago
Cake day: June 13th, 2023

  • If you use GnuPG or one of the GUI implementations it does.

    No, because it’s the server that terminates the TLS connection, not the recipient’s client. TLS is purely a security control to protect the transport between you and the server you are talking to. It doesn’t have anything to do with e2ee. It’s still important, of course, but not for e2ee.

    You do realize e2ee merely means that two users share public keys when they communicate in order to decrypt the messages they receive, right?

    And how does TLS between you and your mail server help with this? Does it give you any guarantee that the public key was not tampered with before it reached your server? Or do you instead use the fingerprint, generally transmitted through another medium, to verify that?
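
    To make the out-of-band check concrete, here is a minimal sketch (hypothetical helper names, not any GnuPG API): the fingerprint seen over the untrusted channel only counts as verified if it matches one obtained through a separate medium.

```python
def normalize(fp: str) -> str:
    """Strip spaces and unify case so formatting differences don't matter."""
    return fp.replace(" ", "").upper()

def fingerprints_match(received: str, out_of_band: str) -> bool:
    """True only if the fingerprint seen over the untrusted channel matches
    the one obtained through a separate, trusted medium."""
    return normalize(received) == normalize(out_of_band)

# TLS to your mail server cannot detect a swapped key; this comparison can.
print(fingerprints_match(
    "1234 5678 9ABC DEF0 1234 5678 9ABC DEF0 1234 5678",  # from the email
    "123456789abcdef0123456789abcdef012345678",           # read over the phone
))
```

    Whether the fingerprint arrives via a phone call, a business card, or a keysigning party, the point is that it does not travel through the same server that delivered the key.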

    Nothing to stop you from hosting your own on an encrypted drive.

    An encrypted drive protects against physical attacks only while the server is powered off. While the server is powered on (which is when it gets breached, if we exclude physical attacks), the data is still in the clear.

    EteSync does E2E already

    And…it requires a specialized client anyway. In fact, they built a DAV bridge (https://github.com/etesync/etesync-dav). Now tell me, if you use this on, say, your phone, can you use other DAV tools without that bridge? No, because it does something very similar to what Proton does. If Proton Bridge gets calendar/contacts functionality too (if, because I have no idea how popular a feature request it is), you are in exactly the same situation.


  • It doesn’t matter that your private key is stored on their servers encrypted/hashed or whatever. If you were simply storing it there, that would not be an issue. The problem is that you’re also logging in and relying on whatever JS is sent to you to only happen client-side.

    I feel like I covered this point? They make the client tool you are using; there is zero need for them to steal your password to decrypt your key. Of course you are trusting them: you are viewing your unencrypted email in their webpage, where they can run arbitrary code. Their clients are open source, but that doesn’t mean much. You are always exposed to supply-chain risk in your client software.

    Most users aren’t sending emails from their Proton to other Proton users either.

    So…? The point is that if they do, encryption happens without them having to do anything, hence transparently. That was the point of my argument: my mom can create a Proton account, send me an email, and benefit from PGP without even knowing what PGP is.

    Furthermore, the users that want encryption seek it out.

    And that’s the whole point of the conversation: these users are techies and a tiny minority. This way, they made a product that allows mainstream users to have encryption.

    Thunderbird or other mail clients that are open source, whose apps are signed, or which you can reproducibly build from source.

    And this control is worth zilch if they get compromised. It is a control against a MitM who intercepts your download; it’s not a control against “the maker of Thunderbird” deciding to screw you over, in the same way Proton could by serving malicious JS code. If the threat actor you are considering is a malicious software supplier, you have exactly the same issue: there can be pressure from government agencies, the vendor might decide to go bananas, or the vendor might get compromised.

    However, once that is built it doesn’t change. With Proton, every time you visit their site you don’t know for sure that it hasn’t changed unless you’re monitoring the traffic.

    Yes, this is true, and it’s the only real difference. I consider it a corner case, something that only affects the time needed to compromise your emails, not the feasibility, but it’s true. On the other hand, I am counting on a company that has a business interest in not letting that happen, and a security team to support that work.

    A government is much more likely to convince Proton to send a single user a custom JS payload, than to modify the source code of Thunderbird in a way that would create an exploit that bypasses firewalls, system sandboxing, etc.

    Maybe…? If government actors are in your threat model, you shouldn’t use email in the first place: metadata are unencrypted and cannot be encrypted, and there are better tools. That said, government agencies have the resources to target the supply chain for individuals and simply “encourage” software distributors to distribute patched versions of the software. This is also a much better strategy, because they can likely get access to the whole endpoint and maintain easy persistence (while with JS you are inside the browser sandbox and potentially a system sandbox), potentially letting them compromise other tools too (say, Signal). So yeah, the likelihood might be higher with JS-based software, but the impact is smaller. Everyone has their own risk appetite and can decide what they are comfortable with, but again, if you are considering the NSA (or equivalent) as your adversary, don’t use email.

    You mean their PWA/WebView clients that can still send custom JS at anytime, or their bridge?

    Yes.

    First, explain what you mean by a fat client? GnuPG is not a fat client.

    In computer networking, a rich client (also called heavy, fat or thick client) is a computer (a “client” in client–server network architecture) that typically provides rich functionality independent of the central server.

    What I mean is this: a client that implements substantial functionality beyond what the server requires to work. In this case, the client handles key management, encryption, decryption, signature verification, etc.: all functionality the server doesn’t even know exists. This is normal, because the encryption is done on top of regular email protocols, which requires a lot of logic on the client side.

    Being able to export things is a lot different than being able to use Thunderbird for Calendars, or a different Contacts app on your phone.

    For sure it’s different; I didn’t say it’s the same thing. I am saying that you can migrate away easily if your needs change and you’d rather have interoperability.

    DAV is as secure as the server you run it on and the certificate you use for transport.

    Exactly. Which is why in the very comment you quoted I said:

    There is a security benefit, and the benefit is trusting the client software more than a server, especially if shared.

    Are you trusting your Nextcloud instance (yours or hosted by someone else) not to get pwned/seized/physically accessed/etc. more than you trust Proton not to get pwned? Then *DAV tools might be for you.


  • Why would anyone be interested in efforts on a platform with a closed-source backend and that is not developer focused?

    Because most people don’t care about those particular things. Almost the whole world uses completely proprietary tools (Gmail) that also violate your privacy.

    Not to mention, entirely unnecessary why you should have to use a bridge gateway in the first place with IMAPS & PGP/GPG, CalDav & CardDav. Like I said, Proton is engaged in some questionable practices.

    It’s not unnecessary, it’s the result of a technical choice. A winning technical choice, actually. PGP has a negligible user base, while Proton already has 100 million accounts; I would be surprised if there were 10 million people actually using PGP. They sacrificed the flexibility and composability of separate tools (which almost always results in complexity) and made an opinionated solution that works well enough for the mainstream population, which has no interest in picking its own tools and simply expects a Gmail-like experience.

    And if you really have stringent requirements, they provide the bridge anyway, so you can have that flexibility if it’s really important to you.

    IMAPS & PGP/GPG, CalDav & CardDav

    • IMAPS is just IMAP over TLS, so it does not have anything to do with e2ee in this context.
    • PGP/GPG is what they use. They just made a tool that is opinionated and just works, rather than one which is more flexible but also more complex. Good choice? Bad choice? It’s a choice.
    • *DAV clients expect cleartext data on the server. If you encrypt the data, you need to build all this logic into the clients, and you are no longer following the standard, which means you will be bound to your own client anyway (plus any that implement compatibility). Proton decided that they wanted an e2ee calendar, and they decided to roll their own thing. It’s up to everyone to decide whether e2ee is a more important feature than interoperability with other tools. I don’t care about interoperability, for example, and I’d take e2ee over it.

  • Proton stores your keys

    Proton stores an encrypted blob.

    All they need now is your decryption password & they can read your messages

    “All they need now is your private key.” It’s literally a secret: they run your password through bcrypt and then use it to encrypt the key. Also, “they” are generally not in the threat model. “They” can serve you JS that simply exfiltrates your email, because the emails are displayed in their web app; they have no need to steal your password to decrypt your key and read your email…
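
    As a toy sketch of that “encrypted blob” model (an illustration only, not Proton’s actual scheme: they use bcrypt and OpenPGP, while this uses stdlib PBKDF2 and a throwaway XOR keystream so the example stays self-contained):

```python
import hashlib
import os

def derive_key(password: bytes, salt: bytes) -> bytes:
    # Stand-in KDF: Proton uses bcrypt; PBKDF2 is used here only because it
    # ships with the Python standard library.
    return hashlib.pbkdf2_hmac("sha256", password, salt, 200_000)

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Toy keystream cipher: symmetric, so the same call also decrypts.
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

salt = os.urandom(16)
private_key = b"-----BEGIN PGP PRIVATE KEY BLOCK----- ..."
blob = xor_cipher(derive_key(b"user passphrase", salt), private_key)

# The server stores only (salt, blob): without the passphrase the blob is
# opaque, so "having the key" still means brute-forcing the passphrase.
print(xor_cipher(derive_key(b"user passphrase", salt), blob) == private_key)
```

    The server never sees the passphrase, only the salt and the blob, which is why an attacker who dumps the database still has to brute-force the passphrase per user.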

    It isn’t transparent, because most users aren’t running their own frontend locally and tracking all the source code changes.

    We probably disagree about what “transparent” means in this context. What I mean is that the average user will not perform any PGP operation, in general. Encryption happens transparently for them, which is the whole point of Proton: make encryption easy and the default.

    Now you’re merely trusting them to not send you a custom JS payload to have your decryption password sent to the server.

    Again, as I said before, they control the JS; they can get the decrypted data without the password…? You always trust your client tooling. There is always a point where I trust someone, be it the enigmail maintainers, the Thunderbird maintainers (it has access to messages post-decryption!), the CLI tool of choice, etc.

    How many users are actually utilizing their hidden API to ensure that decryption/encryption is only done client-side?

    I mean, their clients are open-source and have also been audited?

    If they have your private key, how many users do you think are using long enough passwords to make cracking their password more challenging?

    I don’t know. But here we are talking about a different risk: someone compromising Proton, getting your encrypted private key, and then brute-forcing bcrypt-hashed-and-salted passwords. I find that risk acceptable.

    This is just entirely inaccurate and you’ve failed to provide any “proof” for your generalizations here.

    See other post.

    If you actually understood PGP you’d know you can generate and use local-only keys with IMAPS and have support to use any IMAP client.

    Care to share a practical example/link, and explain how exactly this means not needing a fat client that does the encryption/decryption for you?

    There is no security benefit in their implementation other than to lock you into a walled garden and give you a false sense of security.

    Right, because *DAV protocols are so secure. They all support e2ee, right…? There is a security benefit, and the benefit is trusting the client software more than a server, especially a shared one. You can export data and migrate whenever you want easily, so it’s really a matter of preference.


  • There are certain things that are known facts, there is no need to prove them every time.

    The simple fact that:

    • There is no common, standard tool
    • The number of people who use PGP is ridiculously low, including within tech circles. To make one example, even a famous cryptographer such as Filippo Valsorda mentions receiving maybe a couple of PGP-encrypted emails a year. I work in security and I have never received one, nobody among my colleagues has a public key to use, and I have never seen anybody who was not a tech professional use PGP.

    You can also see:

    We can’t say this any better than Ted Unangst: “There was a PGP usability study conducted a few years ago where a group of technical people were placed in a room with a computer and asked to set up PGP. Two hours later, they were never seen or heard from again.” If you’d like empirical data of your own to back this up, here’s an experiment you can run: find an immigration lawyer and talk them through the process of getting Signal working on their phone. You probably don’t suddenly smell burning toast. Now try doing that with PGP.

    A recent talk, I will quote the preamble:

    Although OpenPGP is widely considered hard to use, overcomplicated, and the stuff of nerds, our prior experience working on another OpenPGP implementation suggested that the OpenPGP standard is actually pretty good, but the tooling needs improvement.

    And you can find as many opinion pieces as you want, by just searching (for example: https://nullprogram.com/blog/2017/03/12/).

    However, if you really believe I am wrong, and you disagree that PGP tooling is widely considered bad, complex and almost a meme in the security community, you are welcome to show where I am wrong. Show me a simple PGP setup that non-technical people use.

    P.s.

    I also found https://arxiv.org/pdf/1510.08555.pdf, an interesting paper which is a follow-up to another paper from ten years earlier about the usability of PGP tools.




  • It’s actually fairly simple: if the server never has access to the keys or the plaintext of messages (or calendar events, etc.), then you need a client tool to handle decryption and encryption operations.

    They use PGP, and they have implemented this feature in a way that is completely transparent to the user, to make it mainstream. So they chose to build dedicated tools (bridge, web client) rather than letting users use their own tools, because PGP tooling sucks hard and is extremely inaccessible to the general population.

    This means that you need a fat client whatever you do; otherwise the server will have access to the data and there is no e2ee. Instead of using enigmail or other PGP plugins/tools, they built the bridge.






  • Hey, that’s actually a very nice project! To be honest, I can imagine the time saving is minimal, if there is one at all; partially, that’s because we are talking about tiny amounts of time anyway. Moving files around with a mouse is totally fast, and in general I still do it like that. For me speed is really secondary: it’s about ergonomics and limiting my movements. Chances are I am already typing when I want to do something, so it might not be faster to switch to the browser with mod+2 and back to the terminal with mod+1, but it’s less movement than finding the mouse, rotating the shoulder (my split keyboard is open at shoulder width), etc. I would also argue it requires less focus, because it’s inherently more mechanical as an action compared to finding a button to click or dragging and dropping something. Either way, it’s for sure something interesting to look at!


  • Oh no, I get it. I do have a work-issued MacBook Pro which I am currently not using in favour of a Linux machine. The main reason for me is ergonomics: my laptop is closed in a vertical stand, and I cannot imagine moving my hands so much to do stuff. I do basically everything the trackpad does with i3 keybindings, which I find not only faster, but which also lets me reduce arm movement and ultimately limit wrist/arm stress.

    Obviously I completely agree that if one has to, or prefers to, work with trackpads, Apple’s are honestly great.






  • On the other hand, you can approach the dramatic cut of emissions from both angles, as in “you are not free to do whatever you want just because you can pay for it, and you have a responsibility to minimize emissions”.

    The internet does generate a lot of emissions: streaming quality, website size. Whatever we do to reduce energy demand is a good idea, as long as we don’t think of it as “The Solution”, but as part of a wide range of actions aimed at slashing energy consumption.


  • I want to add a small bit of info that might be useful in the future. Your script doesn’t really need to run with root privileges. Your backup script likely needs access to parts of the filesystem which are only readable by root, but that’s all it needs. Root privileges are essentially a combination of capabilities (see man capabilities) attached to processes. In your case, what you want is CAP_DAC_READ_SEARCH, which allows read access to every file. You can, for example, add this capability to rsync (or more likely to Borg, restic, or rustic, which are backup tools I recommend you look at! They do encryption, deduplication, etc.) and then use that binary as a low-privileged user while still having that slice of root privileges. Obviously there is a risk in this too, but it can be compensated for in other ways (for example, running the backup job in a sandbox, etc.; probably out of scope for now).

    While in this particular case it might not be super relevant (backups are often executed as root, or as a backup user with read access), it might be useful in the future to know that full root privileges are very rarely needed; you can run tools with only the specific capability required for that privileged action. Check the setcap and getcap commands.
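
    As a small Linux-only illustration (the bit index comes from capabilities(7), where CAP_DAC_READ_SEARCH is bit 2), a process can check which capabilities it actually holds by parsing its own status file:

```python
import os

CAP_DAC_READ_SEARCH = 2  # bit index, per capabilities(7)

def effective_caps(status_path: str = "/proc/self/status") -> int:
    """Return the CapEff bitmask of the current process (Linux only)."""
    with open(status_path) as f:
        for line in f:
            if line.startswith("CapEff:"):
                return int(line.split()[1], 16)
    return 0

def has_cap(caps: int, bit: int) -> bool:
    return bool((caps >> bit) & 1)

if os.path.exists("/proc/self/status"):
    # An unprivileged process typically prints False; a binary blessed with
    #   setcap cap_dac_read_search+ep /path/to/backup-tool
    # runs with this bit set and can read root-only files without being root.
    print(has_cap(effective_caps(), CAP_DAC_READ_SEARCH))
```

    The same check works for any capability bit, which makes it a quick way to verify that a setcap-blessed binary actually received the slice of privilege you intended.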