4 points by prabhavsanga 4 hours ago | 1 comment
  • palata 3 hours ago
    I personally choose not to depend on more wrappers. If I need to clone a git repo, I `git clone` it. Then I build and run the project using the build system of the project.

    If the project properly uses a build system I am familiar with, then I don't really need to think. If the project does something exotic, then chances are that I will just give up on that project. But I don't think that your tool would help here: if it is exotic, then your tool probably won't know how to automate it.

    • prabhavsanga 3 hours ago
      That’s a totally fair take, and I agree with you more than it might sound.

      I’m not trying to replace cloning or proper build systems, and I don’t expect this to handle exotic setups. If a repo has a custom toolchain and good docs, I’ll still clone it locally. The problem I keep running into is before that point: when I’m skimming 10–20 repos to decide which ones are even worth the effort. A surprising number either don’t run anymore, depend on unstated versions, or silently assume a local setup that isn’t obvious from the README.

      For me, even a fast failure with a clear reason (“missing env var”, “custom toolchain”, “expects GPU”, etc.) is useful: it tells me whether to invest time or move on, without polluting my machine or context-switching into setup mode.

      So I think of this less as a wrapper around build systems and more as a disposable “is this repo alive?” check — something you use before you decide it’s worth cloning.
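      To make the idea concrete, here is a minimal sketch of that kind of triage check, purely illustrative and not the actual tool: guess the build system from marker files in a checkout and return either a command worth trying or a fast-failure reason. The marker files, commands, and reason strings are all assumptions for the example.

      ```python
      import os
      import tempfile

      # Hypothetical mapping from build-system marker files to a command
      # worth attempting; illustrative only, not the real tool's logic.
      MARKERS = {
          "package.json": "npm install && npm test",
          "Cargo.toml": "cargo build",
          "pyproject.toml": "pip install .",
          "Makefile": "make",
      }

      def triage(repo_dir):
          """Return ('try', cmd) if a known build system is detected,
          else ('skip', reason) as a fast failure with a clear reason."""
          for marker, cmd in MARKERS.items():
              if os.path.exists(os.path.join(repo_dir, marker)):
                  return ("try", cmd)
          if not os.path.exists(os.path.join(repo_dir, "README.md")):
              return ("skip", "no README and no recognized build system")
          return ("skip", "custom toolchain: build steps not obvious from the tree")

      # Example: a checkout containing only a README is flagged, not built.
      with tempfile.TemporaryDirectory() as d:
          open(os.path.join(d, "README.md"), "w").close()
          print(triage(d))
      ```

      The point of the sketch is the shape of the answer, not the detection heuristics: a disposable check that ends in either "here is a command to try" or a one-line reason to move on.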

      That said, I’m genuinely curious: when you give up on an exotic repo today, is it because the setup is unclear, or because you’ve already decided it’s not worth the effort? That distinction is what I’m trying to understand better.