8 points by ianrahman 2 hours ago | 5 comments
  • nerdjon 2 hours ago
    I am still quite shocked that anyone at Nvidia looked at the side-by-side images and actually thought this was good and that no one would have an issue with it.

    Now, I will admit that if you don't compare them, the final image looks OK. If I did not know what was happening, I likely wouldn't give it a second thought. It looks off, but so many video games already look off that I don't think I would have thought anything other than "well, it's a video game."

    But when compared to the original image, it is so obvious that the artistry and the original intent are just completely lost.

    They claim that developers and artists will have more control over this. Well, if that is actually true (because we all know guardrails on AI have been perfect so far...), they should have used that control for the video showing this off. Otherwise, I honestly hope this never ships.

    But even if it does, the power requirements for this make it kinda DOA anyways.

    • wtallis 38 minutes ago
      As demoed, it's obviously very bad. But before giving up on it completely, I'd like to see a version that can remain faithful to the original color grading and tone mapping. Those changes affecting the overall look of the whole frame really distract from comparing the more subtle lighting differences where they might be onto something good.

      But I'm also skeptical about whether they can pull this off in a way that doesn't exacerbate the already-severe issues DLSS has with latency and temporal stability. Enhancements that make for great screenshots often don't translate to great realtime gameplay.

  • zardo an hour ago
    That first image looks great, but will it always deepfake Aubrey Plaza's face onto that character, or will she morph between different actresses?
  • kanemcgrath an hour ago
    I don't like the Netflix-CGI-slop filter look it gives everything. But that is a more general trend in TV and movies that I just can't stand.

    I do think this will eventually be a major part of the graphics pipeline, but I hope it will be limited and masked to things like hair, which is almost impossible to get right in real-time rendering.

  • nateb2022 2 hours ago
    Previous discussion: https://news.ycombinator.com/item?id=47403044 (1 day ago, 24 comments)
  • BoredPositron 2 hours ago
    What's just a side note in the slides is that they used two 5090s in the demo video: one for the conventional rendering and one for the AI pass. That's too much overhead for what's achieved. If both run at 100%, that's a mind-boggling 1200W.
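    The 1200W figure is roughly consistent as a back-of-envelope estimate; a minimal sketch, assuming the RTX 5090's official 575W board power (system overhead not included):

```python
# Back-of-envelope check on the ~1200W claim (assumed TDP value).
RTX_5090_TDP_W = 575  # Nvidia's rated total board power for the RTX 5090
num_gpus = 2          # one GPU for conventional rendering, one for the AI pass

gpu_power_w = num_gpus * RTX_5090_TDP_W
print(gpu_power_w)  # 1150 W from the GPUs alone, before CPU and system draw
```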
    • jplusequalt an hour ago
      Don't you know that to game in 2026 you need 40000+ shader cores?