5 points by nreece 7 hours ago | 2 comments
  • operatingthetan 6 hours ago
    I'm using minimax m2.7 and it's good enough. What I'd like to understand, though, is how these models are so cheap. Surely it costs them just as much for the compute? Do US-based AI companies have that much overhead?
    • yorwba 5 hours ago
      There are US-based companies offering inference for MiniMax models that charge slightly less than MiniMax itself does, and MiniMax claims to be using data centers in the US. US companies training their own closed-weight models charge so much more because they can: they're monopoly providers for their own models, so they can ask for whatever people are willing to pay.
  • gostsamo 6 hours ago
    tbh, models in a pipeline are cheaper; saying local is comparable is only true in the sense that warm water is nice and relaxing. The Cursor case is a bit different: Cursor cannot be profitable while competing with its own providers, and it is not yet clear whether it will survive at all, or whether the Kimi model will prove to be good competition.