24 points by bigwheels 6 hours ago | 3 comments
  • lemonish97 6 hours ago
    Some of the Nemotron models are really good. Hope this encourages more open-weight/open-source models from the West.
  • gigatexal 6 hours ago
    Why would they do this? Ahh to keep the AI bubble afloat. Got it.
    • Havoc 4 hours ago
      I think it’s small change for them and they realise tinkering keeps the momentum going.

      There are also some people who have an aversion to Chinese models, so NVIDIA-backed is good there.

    • Skyy93 6 hours ago
      Do you really think it's still only a bubble? The progress Anthropic has made with Claude Code over the last few weeks is tremendous.
      • owebmaster 5 hours ago
        What progress?
        • saulpw 5 hours ago
          I mean, are you using it? Things have really moved in the past few months.
  • ivanvoid 5 hours ago
    I'll just leave this article here arguing that open-weight is not open-training.

    When I use a model, I want to be able to see and modify it; I don't want another 12 GB black box.

    https://www.workshoplabs.ai/blog/open-weights-open-training