Quick question: which models are you using with Ollama for analysis? I’ve been testing a few locally, and hardware requirements vary a lot, so not everyone can run the heavier ones.
Have you done any benchmarking or have a recommended model for this use case?
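In case it's useful for comparing, here's a rough sketch of how one might measure throughput against a local Ollama server. The `eval_count` and `eval_duration` fields are part of Ollama's documented `/api/generate` response; the model name and prompt below are just placeholders, and this assumes the server is running with the model already pulled.

```python
import json
import urllib.request

def tokens_per_second(eval_count: int, eval_duration_ns: int) -> float:
    """Throughput from Ollama's eval_count / eval_duration (nanoseconds) fields."""
    return eval_count / (eval_duration_ns / 1e9)

def benchmark(model: str, prompt: str, host: str = "http://localhost:11434") -> float:
    """Run one non-streaming generation and return tokens/sec.

    Assumes a local Ollama server is up and the model has been pulled.
    """
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps({"model": model, "prompt": prompt, "stream": False}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return tokens_per_second(body["eval_count"], body["eval_duration"])

# Example (placeholder model name):
# benchmark("llama3", "Summarize this log line: ...")
```

Nothing fancy, but running the same prompt across a few models gives a quick apples-to-apples tokens/sec number per machine.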