The BSL-1.1 license choice is interesting too: it shows you're thinking about sustainability and intent, not just shipping fast. That kind of explicit decision-making is what separates vibe coding projects that survive from ones that collapse: the human behind the project keeps thinking about architecture, maintainability, and accountability.
Somewhat related: the Agile Vibe Coding Manifesto (https://agilevibecoding.org) is trying to formalize exactly these principles — that customer value and human accountability still drive everything even when AI is writing most of the code. Your project is a good example of vibe coding done with intention.
Good luck with Skales — the accessibility angle (no Docker, no CLI) is genuinely underserved.
One thing worth considering as you build out the agent UX: the quality of the default prompts/instructions you ship with Skales will matter a lot for first impressions. A non-technical user can't debug a bad system prompt — they'll just think the AI is dumb. Structuring those instructions carefully (role, constraints, examples) makes a huge difference.
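To make the "role, constraints, examples" structure concrete, here's a minimal sketch of how default instructions could be assembled from explicit sections rather than one opaque blob. All names here (`PromptSpec`, `buildSystemPrompt`) are hypothetical illustrations, not Skales' or flompt's actual API:

```typescript
// Hypothetical sketch: building a system prompt from explicit sections
// so each part (role, constraints, few-shot examples) can be reviewed
// and tweaked independently. Names are illustrative only.
interface PromptSpec {
  role: string;                                    // who the assistant is
  constraints: string[];                           // hard rules it must follow
  examples: { user: string; assistant: string }[]; // few-shot pairs
}

function buildSystemPrompt(spec: PromptSpec): string {
  const constraints = spec.constraints.map((c) => `- ${c}`).join("\n");
  const examples = spec.examples
    .map((e) => `User: ${e.user}\nAssistant: ${e.assistant}`)
    .join("\n\n");
  return [
    `Role:\n${spec.role}`,
    `Constraints:\n${constraints}`,
    `Examples:\n${examples}`,
  ].join("\n\n");
}

const prompt = buildSystemPrompt({
  role: "You are a helpful desktop assistant for non-technical users.",
  constraints: [
    "Never run a step without approval.",
    "Explain errors in plain language.",
  ],
  examples: [
    { user: "Rename my files", assistant: "Sure, here is my plan first..." },
  ],
});
console.log(prompt);
```

The benefit for non-technical users is that a bad default fails visibly in one section (e.g. a missing constraint) instead of somewhere inside a wall of text.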
Built flompt (https://flompt.dev / https://github.com/Nyrok/flompt) for exactly that — visual prompt structuring that helps get the instructions right before they become a UX liability.
Thanks for the thoughtful feedback!
For Execute Mode (multi-step autonomous tasks), Skales queues steps sequentially, so there's no parallel bottleneck: it plans, you approve, then it runs through each step. There's also a desktop buddy (think Microsoft's Clippy, but actually useful) that sits in your system tray, if activated, as soon as you minimize or close the main window; you can ask it quick questions without even opening the main interface. It runs within the same Electron process, so there's zero additional RAM overhead. Idle RAM sits around ~300MB (earlier builds were at 400MB or more), which keeps things snappy.
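The plan-approve-execute flow described above can be sketched as a strictly sequential loop. This is a simplified illustration under my own naming (`Step`, `executePlan`), not Skales' actual implementation:

```typescript
// Sketch of a plan -> approve -> execute loop that runs steps strictly
// one after another (no parallelism). Illustrative names only.
type Step = { description: string; run: () => Promise<string> };

async function executePlan(
  steps: Step[],
  approve: (plan: string[]) => Promise<boolean>
): Promise<string[]> {
  // Present the whole plan up front and wait for a single approval.
  const ok = await approve(steps.map((s) => s.description));
  if (!ok) return [];
  const results: string[] = [];
  // Each step awaits the previous one, so there is never more than
  // one in-flight LLM/tool call at a time.
  for (const step of steps) {
    results.push(await step.run());
  }
  return results;
}

// Usage: a two-step plan, auto-approved for the demo.
const plan: Step[] = [
  { description: "collect files", run: async () => "collected 3 files" },
  { description: "rename files", run: async () => "renamed 3 files" },
];
executePlan(plan, async () => true).then((r) => console.log(r.join("; ")));
```

The single up-front approval keeps the UX simple for non-technical users while the sequential `await` guarantees ordering.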
The main speed factor is honestly the LLM provider, not Skales itself. With local Ollama models, it comes down purely to your hardware.
Happy to answer more specific questions; thanks for asking, jlongo78!