3 points by sujayk_33 14 days ago | 1 comment
  • parthsareen 12 days ago
    Hey! I'm one of the maintainers of Ollama. 8GB of VRAM is a bit tight for coding agents, since their prompts are quite large. You could try qwen3 with at least a 16k context length and see how it works.
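    As a rough sketch (assuming the official ollama Python client is installed and the qwen3 tag has been pulled locally), you can bump the context window per request with the num_ctx option instead of editing a Modelfile:

        import ollama  # pip install ollama

        # Ask qwen3 with a 16k context window; num_ctx is passed as a
        # runtime option, so the model's default context is overridden
        # just for this request.
        response = ollama.chat(
            model="qwen3",
            messages=[{"role": "user", "content": "Summarize this diff: ..."}],
            options={"num_ctx": 16384},
        )
        print(response["message"]["content"])

    Keep in mind that a larger num_ctx grows the KV cache, so on 8GB of VRAM some layers may end up offloaded to CPU and generation will slow down.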