  • msoul 4 hours ago
    Running an LLM locally on your own laptop is no longer a problem. And the models are improving every month.

    “But which model should I choose?” That’s exactly why I built a detailed benchmark that measures not just output quality, but also generation speed on a MacBook Pro.
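    The speed side of such a benchmark usually comes down to tokens per second. A minimal sketch of that measurement (the `generate` callable and the `fake_generate` stand-in are hypothetical, not part of the actual benchmark):

    ```python
    import time

    def tokens_per_second(generate, prompt, runs=3):
        """Average generation throughput (tokens/sec) over several runs."""
        rates = []
        for _ in range(runs):
            start = time.perf_counter()
            tokens = generate(prompt)  # assumed to return the generated tokens
            elapsed = time.perf_counter() - start
            rates.append(len(tokens) / elapsed)
        return sum(rates) / len(rates)

    # Stand-in for a real local model call (e.g. llama.cpp bindings);
    # it just sleeps and returns 50 dummy tokens.
    def fake_generate(prompt):
        time.sleep(0.01)
        return ["tok"] * 50

    rate = tokens_per_second(fake_generate, "Hello")
    ```

    A real harness would also pin the context length and sampling settings, since both strongly affect throughput.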

    The current best all-rounder: Qwen-3.6 35B