4 points by scottyeager 11 hours ago | 3 comments
  • SlavikCA 3 hours ago
    The HuggingFace link is published, but not working yet: https://huggingface.co/MiniMaxAI/MiniMax-M2.1

    Looks like this is 10 billion activated parameters / 230 billion in total.

    So this is the biggest open model that can be run on your own host / own hardware at a somewhat decent speed. I'm getting 16 t/s on my Intel Xeon W5-3425 / DDR5-4800 / RTX4090D-48GB.

    And looking at the benchmark scores, it's not far from SOTA (it matches or exceeds the performance of Claude Sonnet 4.5).
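
    The numbers above can be sanity-checked with a back-of-envelope calculation. The 230B total / 10B active figures are from the model card; the bits-per-weight and memory-bandwidth numbers below are illustrative assumptions, not measured values for this box:

```python
# Rough sizing for a MoE model with 230B total / 10B active parameters.
# Quantization level (4.5 bits/weight, typical of Q4_K-style quants) and
# memory bandwidth (250 GB/s) are assumptions for illustration.

def weights_gb(total_params: float, bits_per_param: float) -> float:
    """Approximate in-memory size of the quantized weights."""
    return total_params * bits_per_param / 8 / 1e9

def decode_tps_ceiling(active_params: float, bits_per_param: float,
                       mem_bw_gbs: float) -> float:
    """Memory-bandwidth upper bound on decode speed: each generated
    token must stream every *active* parameter from memory once."""
    bytes_per_token = active_params * bits_per_param / 8
    return mem_bw_gbs * 1e9 / bytes_per_token

print(f"{weights_gb(230e9, 4.5):.0f} GB of weights")        # ~129 GB
print(f"<= {decode_tps_ceiling(10e9, 4.5, 250):.0f} t/s")   # ~44 t/s
```

    Under those assumptions the weights fit in a workstation's RAM and the observed 16 t/s sits comfortably below the bandwidth ceiling, which is what makes the small active-parameter count the key number here.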

  • onebitwise 9 hours ago
    Right now it returns "Internal Server Error".

    Here's the saved page: http://archive.today/nDUc4

  • scottyeager 11 hours ago
    "Significantly Enhanced Multi-Language Programming, Built for Real-World Complex Tasks"