1 point by Wewoc 2 hours ago | 2 comments
  • maiconburn an hour ago
    This looks really interesting.

    Quick question: which models are you using with Ollama for analysis? I’ve been testing a few locally, and hardware requirements vary a lot, so not everyone can run the heavier ones.

    Have you done any benchmarking or have a recommended model for this use case?

  • Wewoc 2 hours ago
    I wanted to ask an AI questions about my Garmin health data — sleep, HRV, stress, recovery — without sending it to OpenAI or any other cloud service. Built this with Claude as my coding partner. I can't write Python. It works anyway.

    What it does:
    - Downloads your complete Garmin Connect data locally
    - Exports to Excel and interactive HTML dashboards
    - Generates RAG-optimised JSON summaries for Ollama/AnythingLLM
    - Includes a Windows desktop app, so no terminal is needed

    Everything runs on your own machine. Nothing leaves it.

    GitHub: github.com/Wewoc/garmin_collector
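
    The repo's actual export schema isn't shown in the post, but the "RAG-optimised JSON summaries" step can be sketched roughly like this: flatten each day's metrics into one small, self-contained JSON chunk so a local model retrieves whole days at a time. All field and function names here (summarize_day, sleep_seconds, avg_hrv_ms, and so on) are my own illustration, not the project's API.

    ```python
    import json

    def summarize_day(date: str, metrics: dict) -> str:
        """Flatten one day of health metrics into a compact JSON line,
        one self-contained chunk per day for RAG ingestion."""
        summary = {
            "date": date,
            # keep only fields an LLM is likely to be asked about
            "sleep_hours": round(metrics.get("sleep_seconds", 0) / 3600, 1),
            "avg_hrv_ms": metrics.get("avg_hrv_ms"),
            "stress_avg": metrics.get("stress_avg"),
            "resting_hr": metrics.get("resting_hr"),
        }
        # drop missing fields so chunks stay small and unambiguous
        return json.dumps({k: v for k, v in summary.items() if v is not None})

    # Example: 27000 s of sleep -> "sleep_hours": 7.5
    line = summarize_day(
        "2024-05-01",
        {"sleep_seconds": 27000, "avg_hrv_ms": 62, "stress_avg": 31},
    )
    ```

    One JSON object per day keeps retrieval simple for AnythingLLM: each chunk carries its own date, so the model never has to stitch context across chunks to answer "how did I sleep last Tuesday?".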