5 points by yincrash 13 hours ago | 1 comment
  • newsdeskx 5 hours ago
    Does this work with purely local models through Ollama, or do you still need the Ollama server running on another machine? I've been looking for something that actually works offline for basic voice commands.
    • yincrash 2 hours ago
      Still needs a server. You could run the server locally if you have a model your device can handle, then point aide at the localhost URL.