Hacker News
Ask HN: A good Model to choose in Ollama to run on Claude Code
3 points
by
sujayk_33
14 days ago
1 comment
parthsareen
12 days ago
Hey! One of the maintainers of Ollama here. 8GB of VRAM is a bit tight for coding agents, since their prompts are quite large. You could try qwen3 with at least a 16k context length and see how it works.
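For anyone trying this, one way to get a 16k context in Ollama is to bake it into a Modelfile. This is a minimal sketch, assuming the model tag `qwen3` is available locally and that 16384 tokens fits in your VRAM alongside the weights; the derived model name `qwen3-16k` is just an illustrative choice.

```
# Modelfile — sketch, not a tested recommendation
FROM qwen3
PARAMETER num_ctx 16384
```

Then create and run the derived model:

```
ollama create qwen3-16k -f Modelfile
ollama run qwen3-16k
```

Alternatively, inside an interactive `ollama run` session you can set the same parameter on the fly with `/set parameter num_ctx 16384`.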