19 points by wildcatqz 7 hours ago | 5 comments
  • mindwok 6 hours ago
    This doesn’t appear to be an official website. The official release only mentions Huawei is supported for inference. So… pretty sure this is not true.
    • stingraycharles 5 hours ago
      Which would make a whole lot more sense as a stepping stone. I believe Anthropic also only used Google TPUs for inference until the last generation.
      • mindwok 5 hours ago
        Inference on Huawei chips is not new either; they've had DeepSeek running on them since last year.
  • raincole 5 hours ago
    WTH is this website? Why a domain specifically for glm5? Isn't the official site z.ai? Scammy af.
    • Lennie 2 hours ago
      https://deepseek.net/ was exactly the same last year.

      My guess: pick a popular keyword from Google Trends for which the Chinese company has only released Chinese content, grab the domain, and put up English content.

    • CamelCaseName 5 hours ago
      Claude.ai and Anthropic.com?
      • raincole 5 hours ago
        Have you heard about sonnet37.ai? The infamous chatgpt4o.net? Yeah, me neither.
  • readitalready 7 hours ago
    Welp. NVidia had a good run while it lasted. RIP.

    BTW, so far in my GLM-5 evals it's performing qualitatively as well as Opus 4.5/4.6. The only issue is maybe speed. I will likely incorporate it into daily use. The previous versions were trash, full of syntax errors and instruction-following mistakes.

    • chasing0entropy 6 hours ago
      I wouldn't say NVDA is completely out, but the chess move in response is tough: release a new chip, obsoleting the existing lines, and absorb billions in defaults on hardware.
      • keyle 6 hours ago
        Let's say nvidia has been de-moated.
        • spwa4 17 minutes ago
          Have they? Nvidia's moat is very different.

          TL;DR: For now, everyone is sold out of tokens. Nearly every Nvidia card sells every token it generates; every token generated by Google's TPUs sells, likewise Amazon's Trainium and Groq's silicon giants (they don't really name their chips, and the chips are about 30 cm in diameter, so let's go with giants). And Nvidia B200s are by far the cheapest way to generate tokens, selling at something like double the rate they can be produced.

          Once the AI craze slows, the most surprising thing will happen: Nvidia sales will go up. Why? Because the older cards will get priced out first, and it will become a matter of survival for datacenter companies to replace that older hardware with the newest Nvidia hardware ...

          That's the bull case. Under unlimited token demand, Nvidia wins big. Under slowing token demand, Nvidia actually wins bigger, for a while, and only then slows. For now, everything certainly seems to indicate demand is not slowing. Ironically, under slowing demand, it's China that will suffer in this market.

          And the threat? Well, it is possible to beat Nvidia's best cards in intelligence, in usefulness, because the human mind is doing it on 20 W per head (200 W for the "full machine"). Long story short: we don't know how, but it's obviously possible. Someone might figure it out.

  • limoce 4 hours ago
    > This site is an independent informational resource and is not officially affiliated with Zhipu AI.
  • Aqua0 5 hours ago
    Spam website.