130 points by dogtype 2 days ago | 19 comments
  • Sat_P 18 hours ago
    I was using Boxcryptor with OneDrive for over 5 years and once they shut it down, I moved everything back to my local SSD. This had a number of advantages, the biggest one being that I could now use macOS search to find files at lightning speed. I’ll never go back to cloud storage for files again due to latency. As a precaution, I now back up all of my data to an external HDD daily, then to a separate one on the 1st of each month. Critical financial data is archived to a Blu-ray on the first day of each quarter.
  • jszymborski a day ago
    I wonder how Cryptomator [0], EncFS [1] or gocryptfs [2] stack up.

    [0] https://cryptomator.org/

    [1] https://vgough.github.io/encfs/

    [2] https://github.com/rfjakob/gocryptfs

    • tptacek a day ago
      If they aren't multitenant systems it doesn't make sense to compare them to the targets of this paper.
      • jszymborski a day ago
        I suppose I meant for the specific use case where you store and sync the encrypted file systems with cloud providers like Dropbox or pCloud.

        But perhaps I've misunderstood you.

        • traceroute66 18 hours ago
          > I suppose I meant for the specific use case where

          You're missing the point.

          Anybody can download $insert_name_of_your_favourite_software and use that to encrypt data before uploading to cloud storage.

          The point here is the discussion about multi-tenant cloud-based solutions. You know, the sort of thing you use in a work environment when you share documents with your colleagues.

          In that context, the DIY $my_favourite_tool pre-encrypt route is simply not feasible or scalable and would be hell on earth to manage and maintain.

  • triyambakam a day ago
    Hmm, I wish the author had reviewed Proton. I think it's kind of seen as a meme here? But I heavily rely on it and generally the Proton ecosystem is getting better and better from a UX perspective
    • canadiantim a day ago
      I think Proton is more viewed as a honeypot
      • 3np 19 hours ago
        Which would make it all the more interesting to look at.
      • ziddoap a day ago
        I have not seen this take before, do you have any pointers to someone making this claim?
        • Founders with US affiliation (a physicist creating crypto products) [1], faulty claims about how the relevant Swiss law (BÜPF) applies to them [2], doing crypto in JavaScript on the client side, etc. To me, this smells like Crypto AG [3][4].

          [1] https://proton.me/about/team

          [2] https://steigerlegal.ch/2019/07/27/protonmail-transparenzber...

          [3] https://en.wikipedia.org/wiki/Crypto_AG

          [4] https://en.wikipedia.org/wiki/Operation_Rubicon

          • nl a day ago
            Doing crypto on the client side in JS is absolutely the correct way to do this if you want E2EE with a web client. You need to be careful about supply chain attacks etc.

            > To me, this smells like Crypto AG

            It's easy to throw around unsubstantiated, impossible to disprove theories.

          • devman0 a day ago
            How else would you do client side crypto for a website if not with JavaScript? Isn't that kind of the point of how Proton does E2EE?
            • stavros a day ago
              Crypto for websites is completely broken (because the server can serve you whatever it wants), so doing crypto for websites at all is suspicious.
              • iknowstuff a day ago
                I guess they have this for local email decryption: https://proton.me/mail/bridge

                idk if they have anything like that for their other products like calendar or file storage

                Presumably if you stick to mobile apps you won't be using JavaScript served by their server? Unless they're just HTML wrappers

              • ranger_danger 15 hours ago
                It's not "broken", please don't spread FUD. It's a whole lot more transparent than doing it on the server side. Client code can be inspected and publicly audited, and many times you can save/cache it so that it doesn't change. Also opens up the possibility for third party standalone apps that don't change often.
            • ranger_danger 15 hours ago
              WASM? I have seen it used a lot for this.
          • Is the suggestion that founders who have US affiliation are automatically in bed with three letter agencies?
            • justinclift a day ago
              If they're physically located in the US, they have no way to stop (legal) coercion by the TLAs yeah?
        • thephyber a day ago
          From my reading, the “ProtonMail is a honey trap” meme seems to be a popular rumor. Seems like there might be some smoke, but I haven’t seen any fire.

          Interesting breakdown[1] of one of the claims that E2E encryption on ProtonMail is broken.

          I’m assuming that Proton storage is a product from the same team as ProtonMail.

          [1] https://lemmygrad.ml/post/4177

        • yieldcrv a day ago
          During account creation over Tor, they required a phone number for “spam prevention”

          There was some deanonymizing like that, via phone or credit card

          • ranger_danger 15 hours ago
            KYC for a business is the smart legal move IMO whether it's technically required or not. Yes Proton is required to cooperate with law enforcement and government requests. Mullvad has been raided and Tutanota servers have been seized before too. Nobody is going to jail for you.
      • triyambakam a day ago
        So what's the alternative?
        • The alternative is to store Cryptomator vaults on any cloud. I’m looking forward to Proton Drive allowing Cyberduck compatibility to manage Cryptomator vaults
        • opengears a day ago
          Selfhosting
          • dartos 19 hours ago
            If that 3+ Tbps attack CF just mitigated starts aiming at the entire IPv4 range (probably more spread out and cyclical), self-hosting could die :(

            At least anything on UDP

          • triyambakam a day ago
            Hmm, yeah that's true
      • traceroute66 19 hours ago
        > I think Proton is more viewed as a honeypot

        Honestly, this FUD goes round more often than the seasons.

        As with most FUD, I'm still waiting for someone to prove it.

        And no, I don't buy the "smells like crypto AG" FUD, because you could use that sort of FUD for any of the US-cloud providers ....

        For example, when AWS says "trust us" in relation to their KMS or HSM services, can you, really ... eh eh eh .... how do you know KMS or HSM isn't just a software proxy that pretends to be what it is ? :)

        The fact is that if you are going to use someone else's servers to do something for you, whether that someone is Proton or AWS or anyone else, you are, by definition, forced to abstract away your trust boundaries.

  • xarope a day ago
    I like the way you can use the tabs to check the results of each reviewed cloud storage service, and the exposition on each. Anybody know what the authors used to create this website? Custom built, or a templated version?
  • fguerraz a day ago
    It's too bad they focused on commercial closed-source providers. The ecosystem would have really benefited if they had put their efforts into doing the same work with, for example, Nextcloud.
  • iknowstuff 2 days ago
    curious about iCloud with Advanced Data Protection enabled
    • MichaelZuo a day ago
      Considering iCloud does have some documented cases of silent corruption, such as of original-resolution media stored in Photos, it might not be the best choice.
  • V__ a day ago
    Since ente.io's server is just object storage, I feel at some point either ente or someone else is going to make a drive app for it.
  • https://dropbox.tech/security/end-to-end-encryption-for-drop...

    Dropbox has been mentioned in the article, and I think the author is drinking the kool-aid and throwing around random facts

  • CPAhem a day ago
    We use Syncdocs (https://syncdocs.com) to do end-to-end Google Drive encryption.

    The keys stay on the client. It is secure, but means the files are only decryptable on the client, so keys need to be shared manually. I guess security means extra hassle.

  • ThePhysicist a day ago
    Nice to see that Tresorit didn't have any serious issues in this analysis. I've been using it for a long time and it works really well; it's also one of the few players with a really good Linux client.

    The two vulnerabilities they found seem pretty far-fetched to me. The first is basically that a compromised CA server would be able to create fake public keys, and I honestly don't know how one could defend against that. Transparency logs maybe, but even that wouldn't solve the issue entirely when sharing keys for the first time. The second one, around unencrypted metadata, is hard to assess without knowing what metadata is affected; it seems that it's nothing too problematic.
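
One classic defense against a server (or its in-house CA) substituting public keys is out-of-band fingerprint comparison, which is how Signal's safety numbers work. A minimal sketch in Python; `key_fingerprint` is a hypothetical helper, not part of any product discussed here:

```python
import hashlib

def key_fingerprint(public_key_bytes: bytes) -> str:
    """Short, human-comparable digest of a public key (hypothetical helper)."""
    digest = hashlib.sha256(public_key_bytes).digest()
    # First 8 bytes as grouped hex, in the spirit of Signal's safety numbers
    return ":".join(f"{b:02x}" for b in digest[:8])

# Both parties compute the fingerprint locally and compare it over an
# independent channel (phone call, in person). A key forged by the
# service's CA yields a different fingerprint, exposing the substitution.
genuine = key_fingerprint(b"alice-public-key-bytes")
forged = key_fingerprint(b"server-substituted-key-bytes")
print(genuine == forged)  # → False: the substitution is detectable
```

This only helps if users actually compare fingerprints over a channel the server doesn't control, which is exactly the friction most products avoid.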

    • tptacek a day ago
      Tresorit had a game-over vulnerability: public keys aren't meaningfully authenticated (the server can forge keys; the CA the paper discusses is operated by the service) and any attempt to share a directory allows the server to share that directory with itself.
      • traceroute66 20 hours ago
        > Tresorit had a game-over vulnerability:

        I would still (for now, at least) trust Tresorit over any of the US jurisdiction services. I wouldn't put my data on US jurisdiction servers no matter how much money you gave me.

        I am, for now, tempted to say we should get a detailed explanation from Tresorit before jumping to conclusions.

        It seems to me the author of the website made many assumptions; it is not clear if they entered into any sort of meaningful dialogue before publishing.

        > any attempt to share a directory allows the server to share that directory with itself

        Surely this is by definition required?

        If you wish to share a file or a directory with somebody external to your organisation via a simple link, how, exactly, do you envisage that happening without granting the Tresorit server permission to be the intermediary?

        Sure, you could, theoretically, mandate that those third-party people install software on their devices, or register an account or whatever. But let's face it: in the real world, if you want to share a file or directory as a one-off with someone, forcing them to do extra steps just introduces friction. Also some people can't install random software on their computers due to corporate policies.

        • tptacek 16 hours ago
          I really don't care about this jurisdiction stuff; I'm just here to talk about the cryptography, which, in the case of Tresorit, is not great.
    • traceroute66 15 hours ago
      Worth pointing out for balance that Tresorit are aware of the paper and have published a statement on their LinkedIn[1].

      [1] https://www.linkedin.com/posts/tresorit_end-to-end-encrypted...

  • eemil 18 hours ago
    One downside to encryption is that it prevents the server operator from doing any deduplication (file- or block-level) on their end.

    Maybe that's one reason why cloud providers aren't pushing it that heavily. Especially the big players, since more data = more duplication = more efficient deduplication.

    • idle_zealot 18 hours ago
      Is that true? Couldn't you run dedupe on blocks of encrypted files? I assume there would be fewer duplicate blocks compared to the cleartext, but if you have a bunch of blocks full of random bits there are bound to be repeats with a large enough number of blocks.
      • hansvm 18 hours ago
        If you can, you've effectively broken the encryption. Any scheme that takes random data and stores it in less space, when accounting for the overhead of the scheme itself, is astronomically unlikely to succeed by more than a few bits saved in any specific example (and on average across all such random streams cannot save space at all).
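
hansvm's point can be shown concretely: any randomized (semantically secure) cipher makes identical plaintexts unlinkable, so block-level dedup finds nothing to match. A toy sketch using a SHA-256 counter-mode keystream as a stand-in for a real cipher — illustration only, not real cryptography:

```python
import hashlib
import os

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Toy randomized stream cipher (SHA-256 keystream); illustration only."""
    nonce = os.urandom(16)  # fresh randomness per encryption
    out = bytearray()
    for i in range(0, len(plaintext), 32):
        block = hashlib.sha256(key + nonce + i.to_bytes(8, "big")).digest()
        chunk = plaintext[i:i + 32]
        out += bytes(a ^ b for a, b in zip(chunk, block))
    return nonce + bytes(out)

key = os.urandom(32)
file_contents = b"identical file stored by two different users" * 100

# Encrypt the very same file twice (e.g. two users, or two backup runs)
c1 = toy_encrypt(key, file_contents)
c2 = toy_encrypt(key, file_contents)

# Identical plaintexts, yet the ciphertexts share no 32-byte blocks:
blocks1 = {c1[i:i + 32] for i in range(16, len(c1), 32)}
blocks2 = {c2[i:i + 32] for i in range(16, len(c2), 32)}
print(len(blocks1 & blocks2))  # → 0: nothing for server-side dedup to find
```

Convergent encryption (deriving the key from a hash of the content) re-enables dedup, but at the cost of letting the server confirm whether you hold a known file — which is why most E2EE designs avoid it.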
      • mprime1 16 hours ago
        Indeed. Borg for example is e2e but able to dedupe.

        My bookmark archive is 10TB but deduped on-disk size is 100GB because most files are the same across backups!

        https://www.borgbackup.org/

        • orf 15 hours ago
          That’s not the same thing at all.
      • RcouF1uZ4gsC 18 hours ago
        Even 32 bytes of random data has an astronomically low chance of ever producing a collision.
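
The birthday bound backs this up: among n random 256-bit values, the chance of any collision is roughly n²/2²⁵⁷, which stays negligible for any realistic block count. A quick check:

```python
from math import expm1

# Probability of at least one collision among n random `bits`-bit values,
# via the birthday approximation p ≈ 1 - exp(-n^2 / 2^(bits+1)).
def collision_probability(n: int, bits: int = 256) -> float:
    # expm1 keeps precision for probabilities far below float epsilon
    return -expm1(-(n * n) / 2.0 ** (bits + 1))

# Even a trillion trillion (10^24) random 32-byte blocks:
p = collision_probability(10 ** 24)
print(p)  # ≈ 4.3e-30, i.e. effectively zero
```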
    • tjpnz 18 hours ago
      Double-edged sword. Megaupload was doing it and it was argued (successfully) in court that they therefore had knowledge of what they were hosting.
  • mr_toad a day ago
    If you don’t trust your cloud provider to not look at your data, why would you trust them with encryption?

    It’s not hard to encrypt it before you upload it.

    • tptacek a day ago
      Because not having to trust the provider is the entire premise of these services, and without that premise, you might as well just store things in GDrive.
  • cobbzilla a day ago
    The sad state of E2E encryption for cloud storage is a big part of why I wrote mobiletto [1]. It supports transparent client-side encryption for S3, B2, local storage and more. Rekeying is easy: set up a new volume, mirror to it, then remove the old volume.

    [1] https://github.com/cobbzilla/mobiletto

  • paulgerhardt a day ago
    My understanding was that Tarsnap was just fine and that was the preferred solution for Hacker News outside Dropbox.
  • megous a day ago
    That was a good skim for me, as someone who implemented one of the first independent mega.nz clients. Especially useful to know about structure authentication, and the ability to swap metadata on files and move files/chunks of files around when the server is compromised and there's no e2e authentication for these operations. Lots of traps all around. :)

    Looks like the safest bet is still to just tar everything and encrypt/sign the result in one go.

    I wonder how vulnerable eg. Linux filesystem level encryption is to these kinds of attacks...
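
The "tar everything and encrypt/sign in one go" approach above can be sketched with the standard library. Here HMAC authenticates the whole archive as a single unit, so the server can't reorder, swap, or truncate individual files; encryption itself is omitted for brevity (a real setup would add, e.g., AES-GCM or age), and `pack_and_sign`/`verify_and_unpack` are hypothetical names:

```python
import hashlib
import hmac
import io
import os
import tarfile

MAC_LEN = 32  # bytes of the appended SHA-256 HMAC tag

def pack_and_sign(paths, mac_key: bytes) -> bytes:
    """Tar the given paths into one blob and append an HMAC over the whole
    archive, so tampering with any member breaks verification.
    (Sketch: a real setup would also encrypt the blob before uploading.)"""
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w:gz") as tar:
        for p in paths:
            tar.add(p, arcname=os.path.basename(p))
    blob = buf.getvalue()
    tag = hmac.new(mac_key, blob, hashlib.sha256).digest()
    return blob + tag

def verify_and_unpack(data: bytes, mac_key: bytes, dest: str) -> None:
    """Refuse to extract anything unless the whole-archive tag verifies."""
    blob, tag = data[:-MAC_LEN], data[-MAC_LEN:]
    expected = hmac.new(mac_key, blob, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("archive failed authentication; refusing to extract")
    with tarfile.open(fileobj=io.BytesIO(blob), mode="r:gz") as tar:
        tar.extractall(dest)
```

The point is granularity: one tag over one blob leaves no per-file metadata for a compromised server to shuffle, at the cost of having to download and verify the whole archive to read anything.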

  • slac a day ago
    Google Drive has allowed for client-side encryption since 2022... This paper's first paragraph is false.
  • java-man 2 days ago
    I want to see the response from sync.com on this, especially about

      Unauthenticated Key Material
    
      Unauthenticated Public Keys
    
    attacks.
  • swijck a day ago
    The world changes once you realize why usually encryption is capped at AES256...
    • oconnore a day ago
      256-bit symmetric keys are a bit like picking one atom in the universe (~10^80 atoms). Your opponent would have to test on the order of half the atoms in the universe to have a reasonable chance of getting the right key.

      That's generally understood to be not feasible.
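
The scale argument is easy to check with Python's arbitrary-precision integers:

```python
# Sanity-checking the "one atom in the universe" analogy.
keyspace = 2 ** 256
atoms = 10 ** 80  # common estimate of atoms in the observable universe

print(len(str(keyspace)))  # → 78 digits (~1.16e77 possible keys)
print(atoms // keyspace)   # → 863: roughly a thousand atoms per key
```

So the keyspace is about three orders of magnitude smaller than the atom count, which is close enough for the analogy to hold.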

    • ziddoap a day ago
      Care to enlighten us? What did you realize?
        • levkk a day ago
          My guess: it's too CPU-heavy and your webservers crash under load, for no added benefit [1] of course.

        [1] https://security.stackexchange.com/questions/14068/why-most-...

          • swijck a day ago
            Correct. Anything higher is an order of magnitude more computationally expensive for no real gain. Multiple layers of encryption get you far enough. Better to dig deeper into other cryptographic methods than to try to increase AES beyond 256. It's already rather insane how quickly decryption happens.
            • coppsilgold a day ago
              You can trivially modify the AES key schedule to have a key of any length (e.g. replace it with a hash function or a sponge construction) and use any number of additional rounds in the AES permutation. The performance impact will scale linearly with the number of rounds.

              You can even have no key schedule at all and just make your AES key size in bits = 128 * num_of_rounds. This doesn't mean the brute-force complexity is going to be that high, but that would hardly matter...

            • ziddoap 5 hours ago
            Hmm, not sure how this was supposed to change my world. I thought you had some secret cabal conspiracy or something to share.