So it's a cool project, but not really what I'd call a Dropbox replacement.
Which mobile OS would that be?
The big reason I stopped being excited about cloud storage is that on mobile, from what I can tell, none of the major providers care about the "folder that syncs" experience. You only get an app that lets you view remote storage. The only proper "folder that syncs" I've had working on my phone so far was provided via Syncthing, but maintaining that turned out to be more effort than my tiny attention span can afford these days.
(Unfortunately, both mobile platforms themselves are actively fighting this approach, and instead encourage apps to keep data in private databases and never expose them as actual files.)
Which is, in the end, true of a lot of tools: the underlying 'things' aren't particularly spectacular; rather, it's the user experience that sells them.
I ended up creating https://github.com/nickjj/bmsu which calls rsync under the hood but helps you build a valid rsync command with no surprises. It also codifies each of your backup / restore strategies so you're not typing out massively long rsync commands each time. It's one shell script with no dependencies except rsync.
Nothing leaves my local network since it's all local file transfers.
Free, open source, works on computers and phones, can in most cases punch through NAT, and supports local discovery (LAN, multicast).
No googles, no dropboxes, no clouds, no AI training, no "my kid likes the wrong video on youtube, now our whole family lost access to every google account we had, so we lost everything, including family photos", just sync!
(not affiliated, just really love the software)
File sync can't be that hard! Then the first 3-way conflict enters the picture and everything explodes.
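To make the pain concrete, here's a toy 3-way merge with diff3 (file names and contents invented): it only stays automatic while the two replicas touch different lines.

```shell
# base.txt is the last synced version; local and remote each diverged from it.
printf 'apples\nbread\n'         > base.txt
printf 'apples x6\nbread\n'      > local.txt    # laptop edited line 1
printf 'apples\nbread, sliced\n' > remote.txt   # phone edited line 2

# Non-overlapping edits: diff3 merges both changes cleanly.
diff3 -m local.txt base.txt remote.txt

# Now both sides have edited the same line: diff3 exits non-zero and emits
# conflict markers, i.e. a human has to decide.
printf 'apples x2\nbread\n' > remote2.txt
diff3 -m local.txt base.txt remote2.txt || echo "conflict: needs a human"
```

And that's the easy case: a real sync tool also has to notice the conflict across devices that are offline at different times, which is where the explosions come from.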
Don't misunderstand me, this is a cool idea. But if your turnaround time between ideating a project and pushing it to HN is a week, you don't understand the problem space. You didn't go through the pain of realizing its complexity. You didn't test things properly with your own data, lose a bunch of it and fix the issues, or realize it was a bad idea and abandon it. I have no guarantee you'll still be there in a month to patch any vulnerabilities.
Not that any open-source project has ever come with that kind of guarantee, but the effort invested in getting one to that point was at least a secondary indicator of who built it, their dedication, and their understanding of the space.
I didn't realize I've been reading HN nearly its whole existence. For all my complaining about what's happened to the internet since those days, HN has managed to stay high quality without compromising.
My solution so far has been NextCloud, but I'm getting pretty fed up with it. But not enough to actually do anything about it... yet.
I too would like the answer to this concern because the features page doesn’t mention it. I want to be able to handle file version history.
I’m currently using Filen, which I find very reasonable and, critically, it has a Linux client. But I wish it were faster, and I wish the local file explorer integration were more like Dropbox, where it's seamless to the OS, rather than the current setup where you mount a network share.
1 TB is roughly 20-30 USD per month at AWS/GCP for storage alone, plus traffic and operations. R2 is slightly cheaper and includes traffic.
Compare that to e.g. a Google AI plan, where you get 5 TB of storage for the same price (25 USD/month) with Gemini Pro thrown in.
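Sanity-checking the arithmetic against the published S3 Standard rate (about $0.023 per GB-month on the first-50 TB tier in us-east-1 as of writing; prices change):

```shell
awk 'BEGIN { printf "1 TB of S3 Standard: 1024 GB x $0.023 = $%.2f/month\n", 1024 * 0.023 }'
```

Requests and egress come on top of that, which is how you land at the upper end of the 20-30 USD range.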
Feature request: Google Drive for desktop.
That is the feature that exposes your drive as a mounted file system that streams files as you need them.
It gives me easy access to the giant pile of files stored in my gdrive without having to worry about the space they'd take up locally, or about moving files up and down.
Actually, what solutions to that might already exist? I don't really use the web UI of gdrive so much as use it as a cloud disk drive.
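One thing that already exists in this space is rclone's FUSE mount. Assuming a remote named `gdrive` has been set up with `rclone config` (the name is a placeholder), something like this streams files on demand and caches them locally:

```shell
# Mount the remote as a local folder; files are fetched as they're read.
# --vfs-cache-mode full keeps a local cache so opens and writes behave
# like a normal filesystem; --daemon backgrounds the mount process.
rclone mount gdrive: ~/gdrive --vfs-cache-mode full --daemon
```

It's not as polished as the official Drive for desktop client, but it works on Linux and with basically any cloud backend rclone supports.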
How much on S3? A LOT more.
Maybe you use 1TB, maybe just 10GB. As a user on this site, I expect you know that a 10GB plan and a 1TB plan won't be priced that differently.
I'd rather control the whole stack, even if it means deploying my own hardware to one or more redundant, off-site locations.
Edit: Are there robust, open source, self-hosted, S3-compliant engines out there reliable and performant enough to be the backend for this?
But then you still need a bazillion dependencies and a db just to manage files already on your filesystem.
I would have considered it when rebuilding my media infra, but I haven't seen anything close to this.
It was something like that. https://www.kimsufi.com/pl/?range=kimsufi&storage=12000%7C11...
As you can see, the price is 180% of that now for a bit more storage.
Yeah, I thought it would be the KS range, but didn't see anything close in terms of pricing.
16TB for $35 is a no-brainer!
I'm currently migrating away from a KS because the disks are almost dead now, so I had to go with another solution for TB-scale storage.
But it was great when it was working!
Right now I don't have time, but it would be nice to move all of my services' storage there, so that in case of trouble with one server I could instantly spin them up on another machine by mounting the S3 storage. Performance probably won't be great, but if the main machine goes down I'll still be able to run my home automation, for example, on some secondary machine without much hassle.
Anyway, having a dedicated server and backup storage for $30/mo does not seem unreasonable.
Old technology still works, even if it is old!
And so easy to set up on a home computer. Except it's not always on and doesn't come with backups.
I'm not saying S3 is where it's at, but you might need a bit more than just Samba. Or maybe you don't, but people who need Dropbox do.
Turning on SMB is usually just a click of a button; even macOS supports it.
Any user technical enough to set up an S3 bucket, Syncthing, Nextcloud, or this "Locker" tool from OP can also set up an SMB share.
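And on Linux the server side really is small; a minimal Samba share is a few lines of smb.conf (the share name, path, and user below are placeholders):

```ini
[shared]
   path = /srv/shared
   read only = no
   guest ok = no
   valid users = alice
```

Plus a `smbpasswd -a alice` to give that user a Samba password, and the share shows up on every OS's file manager.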
I was responding to the above thread, where sharing files on an offline network is being discussed. Backups were not mentioned as a requirement.
But sharing a folder on my Mac with my wife's MacBook has been a Google-diving, arcane-command-line headache.
I would have thought sharing the folder, and marking ‘everyone’ for all the read/write modes would be enough. But, no.
I guess with APFS it’s a lot more fiddly. It’s not intuitive, and not in the info panel.
Sure, ChatGPT can help, but to use it reliably, you still need enough medical knowledge to ask good questions and evaluate the answers.
(and regarding contributors for all of his projects, it's mostly vibe-coded)
The comment is disingenuous, though, since Locker doesn't need AWS S3 to function.
This is in Go and exposes both WebDAV and SFTP servers, with user and admin web interfaces. You can configure remotes, then compose each user's space from various locations; some can be local, others remote.
For everything else I use a paid OneDrive subscription. The biggest problems with S3-like storage are the user interface and predictable pricing, because remember, you also pay for data retrieval and other storage APIs; with Dropbox etc. you pay a fixed amount. Every year or so I roll data over into the bucket.
But for infrequently accessed data it's fine.
For a better alternative, run MinIO on a cloud provider of your choice, or stick with a secure option like Proton Drive.
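If you do go the MinIO route, the usual container invocation is roughly the following (the data path and credentials are placeholders; check the official docs for current flags):

```shell
# S3 API on :9000, web console on :9001, data persisted to a host directory.
docker run -d \
  -p 9000:9000 -p 9001:9001 \
  -v /srv/minio:/data \
  -e MINIO_ROOT_USER=admin \
  -e MINIO_ROOT_PASSWORD=change-me-please \
  quay.io/minio/minio server /data --console-address ":9001"
```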
I use a mini PC with small SMB shares (less than 1 TB). This thing is on 24/7, but runs energy-efficiently.
When it's time to move data, I copy it to a Synology NAS that holds lots of TBs. Then it's also time to back up the really important stuff, which goes to a Hetzner Storage Box[2].
[1]: https://en.wikipedia.org/wiki/Backup#3-2-1_Backup_Rule
[2]: https://www.hetzner.com/storage/storage-box/
> run MinIO
When people say "S3", in my experience they mean "any S3-compatible storage", not "Amazon S3 specifically" or just "S3 as a protocol".
Doesn’t require an external database (just an S3 bucket) and is a single binary. A web UI is shipping in the next few days.