50 points by microsoftedging | 3 days ago | 14 comments
  • Night_Thastus3 days ago
    It's complete garbage and not worth buying. It's so cut down it's nearly useless outside of web browsing and very light games. The price is also effectively a lie; it's going to be hard to get it for less than $300. Once we get some proper 3rd-party test data in, I'd be shocked if it's even 5% better than a 4050 in raster without the use of fake frames.
    • pitaj3 days ago
      Agreed. Anybody buying this would be better off spending that money on a used card like a 2070 or 3060, and might even save a buck.
      • jamesgeck03 days ago
        Or an Intel Arc card. They aren't very high end, but they're competitive at MSRP and I suspect they'll demolish this in benchmarks.
    • jekwoooooe3 days ago
      Why is there so much hate against fake frames? I know you can’t tell the difference
      • jamesgeck03 days ago
        NVidia themselves have said that framegen shouldn't be used if the card isn't hitting 60 FPS to start with, because of the latency it introduces. If the card is cut down enough that it's struggling to hit 60 FPS in games, enabling framegen will do more harm than good.

        You can feel additional latency easily in competitive FPS or high speed arcade racing games.

        • unaindz3 days ago
          You can feel less than 50-60 fps even in a management game where you only interact with the UI and move the camera around; it's not game-breaking, but it doesn't feel great. And I used to play Far Cry 3 and CSGO at ~25 fps, so I'm used to a lack of performance.
          • ycombinatrix3 days ago
            "I am accustomed to using crap so it is okay if this brand new graphics card is crap"
            • rcxdude2 days ago
              I think they're saying "I'm used to crap and even I can feel this is crap in cases where people will claim no-one can tell"
        • jekwoooooe2 days ago
          Well, maybe I'm just spoiled by my 4090 then. I never go under 100 anywhere, let alone 60.
      • toast03 days ago
        Fake frames have a big latency penalty, because you can't generate a frame between X and Y until you have Y. Once frame Y has been generated, however many frames you insert before it add that much additional latency, beyond whatever your display adds.

        I guess I can see some utility in situations where latency is not a major factor, but IMHO, that pushes out most gaming.
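
        Rough numbers for illustration, using the simple hold-back model above (my own simplification; real pipelines buffer and pace frames differently):

          # Assumption: real frame Y is held back while the inserted frame(s)
          # are shown, so you pay roughly one base frame interval of extra
          # latency on top of whatever the display and render queue add.
          def added_latency_ms(base_fps: float) -> float:
              return 1000.0 / base_fps

          for fps in (30, 60, 120):
              print(f"{fps} fps base -> ~{added_latency_ms(fps):.1f} ms extra")
          # 30 fps -> ~33.3 ms, 60 fps -> ~16.7 ms, 120 fps -> ~8.3 ms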

        • jhanschoo2 days ago
          I believe DLSS frame gen predicts future frames (X_1, X_2, X_3) given (X_0, X_-1, X_-2, ...), without waiting for X_4. At least that's the impression I get from their marketing.
          • rcxdude2 days ago
            Yeah, but there's still a latency penalty, because X_1, X_2, X_3 won't respond to player input, so your effective latency is still that of your 'real' FPS, and that FPS is lower than it would be without frame gen, because frame gen takes a good fraction of GPU resources.
            • nh23423fefe20 hours ago
              This is where you get to pretend your losses are nvidia's fault.
          • ThatPlayer2 days ago
            Nvidia Reflex 2 does that with async frame warp (that has been used in VR for a while now), but it's separate from DLSS, and is not supported in many games.
      • Night_Thastus3 days ago
        You absolutely can tell the difference. DLSS (upscaling) looks massively different in some games; sometimes it works great, sometimes the result is very ugly. I've tested with several of my favorites.

        And generated frames are far worse than that. If you're running at a very high base framerate (100+) then they can look OK, but the moment the frames get any further apart, the visual quality starts to tank.

      • kllrnohj3 days ago
        Because you can tell the difference, they have quite a few artifacts, and they make latency worse, which is especially problematic in exactly the scenarios where you need the "performance" offered by fake frames. At this price point, that last issue is the real problem. You may get 60fps on an fps counter with DLSS 4, but it'll feel like 15-20fps and not be very playable.
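
        Back-of-the-envelope, assuming DLSS 4's 4x multi frame generation (my arithmetic, not a measurement):

          # Only every 4th displayed frame comes from actual game simulation
          # and input sampling, so responsiveness tracks the base rate,
          # not the number on the fps counter.
          def base_fps(displayed_fps: float, multiplier: int = 4) -> float:
              return displayed_fps / multiplier

          print(base_fps(60))  # 15.0 -> a 60 fps counter backed by ~15 "real" fps
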
      • lithos3 days ago
        You can feel them, and the feeling has a multiplicative effect with Unreal's shortcuts (their tweening and lighting/shadow estimations).

        Publishers are also forcing the settings to be on, to save time on optimization and for false advertising.

      • orphea3 days ago
        This is why: https://news.ycombinator.com/item?id=44368785

        Fake frames are cool tech, but they are horribly mismarketed, indistinguishable from a scam.

      • xcjs14 hours ago
        Try playing on a 43" TV. You can tell.
      • const_cast a day ago
        Because there are other cards available at this price point with significantly better raster performance. So just go with one of those instead.
      • cedws2 days ago
        It’s an escape hatch for developers to ship shitty unoptimised games.
  • LorenDB3 days ago
    I can't believe that nobody has yet mentioned the Intel Arc Battlemage B580. Same $250 MSRP (which has inflated, but every other GPU's price has inflated too, and the 5050's probably will as well), but it has 12 GB of VRAM and sits just below a 4060 Ti 16 GB[0].

    [0]: https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html

    • some_random3 days ago
      I have to assume things are better to some degree but last I looked at Intel's offering the support was still unacceptably bad. That said, I really hope they can get things to a good state because there needs to be more competition at this price point.
      • kllrnohj3 days ago
        The support is still worse, but you're getting a big discount on the hardware by comparison. So it kinda evens out at this price point, where you're deciding between every game running badly or most, but not all, games running decently.
    • jamesgeck03 days ago
      I've been pretty happy with my Arc A770 LE (16 GB). The drivers were rough at launch, but they've gotten much better, and at the time it was the best performance $350 could buy.
      • prossercj3 days ago
        How is it for gaming? Had any compatibility issues?
        • KronisLV3 days ago
          I had both an A580 (not an A770, but at least something from that generation) and then later a B580, at one point even both in the same computer, side by side, when I wanted to use one for games and the other for encoding:

          https://blog.kronis.dev/blog/what-is-ruining-dual-gpu-setups

          https://blog.kronis.dev/blog/more-pc-shenanigans-my-setup-un...

          https://blog.kronis.dev/blog/two-intel-arc-gpus-in-one-pc-wo...

          When paired with a worse CPU like a Ryzen 5 4500, the experience won't always be good (despite no monitoring software actually showing that the CPU is a bottleneck).

          When paired with a better CPU (I got a Ryzen 7 5800X to replace it, eventually with an AIO because the temperatures were too high under full load anyway), either of them is pretty okay.

          In a single-GPU setup, either of them runs most games okay, with not that many compatibility or stability issues, even in older indie titles, though I've had some, like STALCRAFT: X, complain about running on an integrated GPU (the Intel card being detected as one). Most software also works, unless you want to run LLMs locally, where Nvidia has more of an advantage and you'd be going off the beaten path. The most annoying problems I've had were some stability issues near the launch of each card; for example, running the B580 with the Boost functionality enabled in their graphics software sometimes crashed Delta Force, but that no longer seems to be an issue.

          Temperature and power draw seem fine. Their XeSS upscaling is actually really good (I use it on top of native resolution in War Thunder as fancy AA). Their frame generation feels like it has more latency than FSR but also better quality, though that might be subjective, and it isn't even supported in that many games in the first place. Their video encoders are pretty nice, but sometimes get overloaded in intensive games instead of prioritizing encoding over game framerate (which is stupid). Video editing software like DaVinci Resolve also seems okay.

          The games that run badly are typically Unreal Engine 5 titles, such as S.T.A.L.K.E.R. 2 and The Forever Winter, where they use expensive rendering techniques and to get at least 30 FPS you have to turn the graphics way down, to the point where the games still run like crap and end up looking worse than something from 5 years ago. Those were even worse on the A series cards, but with the B series ones become at least barely playable.

          In a dual-GPU setup, nothing works that well, neither in Windows 11 nor Windows 10, and neither with the A580 + B580 nor my old RX 580 + B580: system instability, some games ignoring the Intel GPU preference when an AMD card is available, low framerates when a video is playing on a secondary monitor (I have 4 in total), and the inability to play games on the B580 while encoding on the A580, due to either OBS or the hardware lacking proper support for that (e.g. you can't pick which GPU to encode on, like you can with Nvidia cards; my attempts at patching OBS to do that failed, since I couldn't get a video frame from one GPU to the other). I moved back to running just the B580 in my PC.

          At MSRP, I'd say that the Intel Arc B580 is actually a good option, perhaps better than all the A-series cards. But the more expensive it gets, the more attractive alternatives from AMD and Nvidia become. Personally I wouldn't get an A770 unless I needed the VRAM or the price was really good.

          Also, I'm not sure why the A580 needed two 8-pin connectors if it never drew that much power, or why the B580 has plenty of larger 3-fan versions when I could never really get high temps running FurMark on the 2-fan version.

          • distances2 days ago
            The 5800X is a 105W part, so it should still be quite fine with air cooling. I just built a 9950X3D (170W) with air cooling and it's plenty for that too; temperatures under load are mostly in the 70s, and a stress test gets it up to 85C.
            • KronisLV2 days ago
              I have a pretty bad Kolink case and have to mount the fans in the front instead of the top, otherwise it gets too crowded: https://kolink.eu/Home/case-1/midi-tower-2/others/quantum.ht...

              Without the side panel, the temps are like 10-15C lower than with it: about 78C under full load with the panel off, but up to 90C with it on, at which point the clock frequencies get dialed back.

              That is already with a CO value of -10 across all cores.

              I will probably need a different case altogether, or just get rid of the solid front panel (those vents on it are too small) and replace it with a custom mesh.

              Thankfully, for now, in CPU-Z the scores are ~6500 without the side panel and ~6300 with the panel, so with the AIO and more powerful fans on it, it's pretty close to working optimally, even if not quite there yet.

              I also tried it with 5x120mm case fans and an air cooler; it was slightly worse than the AIO. I also tried multiple different thermal pastes, which didn't make much of a difference. Might also just be cursed and have ghosts in it, go figure.

              • distances2 days ago
                Yep, I guess the case is the limiting factor then; no CPU cooler can do much if the case traps the hot air inside. Though 5 fans should already be enough to move quite a bit of air.

                I had a fully new build so used one of the well reviewed Fractal cases to get good airflow, with 5x140mm case fans.

  • Kon-Peki3 days ago
    This is not going to go well:

    > x50-class GeForce GPUs are among the most popular in the world, second only to the x60-class on Steam. Their price point and power profile are especially popular:

    > For anyone upgrading an older x50-class system

    > Each GeForce RTX 5050 graphics card is powered by a single PCIe 8-pin cable, drawing a maximum of 130 Watts at stock speeds, making it great for systems with power supplies delivering as little as 550 Watts.

    The 1050, 2050, and 3050 were all bus-powered cards. I doubt 95% of these systems even have the cable coming from their power supply. Imagine all the poor saps who excitedly swap out their old card for this, and... nothing works.

    Source link: https://www.nvidia.com/en-us/geforce/news/rtx-5050-desktop-g...

    • toast03 days ago
      I've got a 1650 Super; it's not bus-powered either. I think it's got a 6-pin, but often you can plug a 6-pin into an 8-pin board and it'll just run at a lower current limit (this might not be accurate --- a lot of internet comments say 8-pin boards will detect a 6-pin connector and refuse to work). A whole lot of modern computing gets ~90% of the performance with 50% of the power, so if using a 6-pin lead drops power to 50%, you would still get most of it.

      I've got a ~2006 380W power supply hanging out near my desk and it's got a 6-pin PCIe cable; I really don't think people won't have at least that, and certainly not that 95% of systems with a PCIe x16 slot lack one.
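
      For what it's worth, the budget math with the standard PCIe limits (my numbers for the connectors; whether a given 8-pin board accepts a 6-pin plug is a separate question):

        # Standard PCIe power limits, in watts
        SLOT_W = 75        # x16 slot
        SIX_PIN_W = 75     # 6-pin PCIe cable
        EIGHT_PIN_W = 150  # 8-pin PCIe cable

        CARD_W = 130       # RTX 5050 stock board power, per Nvidia's page

        print("slot + 6-pin:", SLOT_W + SIX_PIN_W, "W")    # 150 W -- covers 130 W on paper
        print("slot + 8-pin:", SLOT_W + EIGHT_PIN_W, "W")  # 225 W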

      • baobun2 days ago
        The non-Super 1650 is typically bus-powered, with no PSU connector.
    • reginald783 days ago
      To bolster this: after the 750 Ti, the x50 products have had pretty lame price-to-performance compared to the next step up, but have remained quite popular. Most people seem to argue that not needing additional power is their main advantage and why they are popular.

      I personally think people remember being happy with the 750 Ti and just keep buying those cards.

    • LorenDB3 days ago
      Hey, at least it's not the infamous 12-pin connector.

      (Full disclosure, I have a 9070 XT with a 12-pin connector. No fires yet, though.)

      • Kon-Peki3 days ago
        Well, maybe a molex-to-8pin-PCIe cable comes in the box!
  • gs173 days ago
    And it's 8GB of last-gen GDDR6 video memory, the exact same as the $249 RTX 3050 from three years ago (same number of CUDA cores too). Technically with inflation that's more per dollar, I guess, but that's not super appealing.
    • happycube3 days ago
      This card might've been forgivable with 16GB or a $149 MSRP, so of course they didn't do either...
    • pama3 days ago
      The performance figure in the link clearly says that it's a significant improvement over the 3050.
      • sidewndr463 days ago
        The charts are from the Verge, not exactly known for their integrity in regards to anything.

        It's also with DLSS on, so you could just as easily have the framerate be 100 FPS, 1000 FPS, or 10000 FPS. The GPU doesn't actually have to render the frame in that case; it just has to have a pixel buffer ready to offload to whatever hardware sends it over the link to the display. Apparently some people actually really like this, but it isn't rendering by any reasonable definition.

      • cogman103 days ago
        This is creative marketing from nVidia. Notice the "With DLSS 4".

        That's AI frame hallucination which the 5050 has.

        Without DLSS, the numbers from independent reviewers have basically been on par with the previous generation (about a 10% increase in performance).

      • kllrnohj3 days ago
        That's Nvidia's marketing slide, and if you note the fine print, they are tested at different settings: the RTX 5050 is using 4x frame gen, which the 3050 isn't. TechPowerUp has the RTX 5050 as being 20% faster than the 3050, give or take, which is certainly not enough to justify upgrading.
    • bigyabai3 days ago
      If you're using less memory, it kinda stands to reason that you can get more mileage out of less bandwidth. I'd be really upset if this was a 16GB or 24GB card, but we've been using GDDR6 for 8GB cards without issues for years now.

      I agree that it's not super appealing, but Team Green has to hit the low price points somehow. This feels more like a hedged bet against Intel trying to muscle their way back into the budget market.

    • ryao3 days ago
      Why did I not see any complaints when AMD used GDDR6 on their current generation products:

      https://www.techpowerup.com/gpu-specs/radeon-rx-9070-xt.c422...

      • unaindz3 days ago
        Because they announced double the VRAM for their mid-range card at around the same price. But there were complaints anyway.
        • ryao2 days ago
          The amount of VRAM is tangential to whether it's last-generation memory.
  • Aurornis3 days ago
    The emphasis on last-gen memory is misplaced. I don't care what memory technology is used as long as the performance is good for the price.
  • lvl1553 days ago
    Just buy AMD AI Max+ no?
    • anonym293 days ago
      Undoubtedly a better system, but for the 395 variant with a full 128GB of (soldered-on) RAM, you're looking at ~$2k. Comparing that to a $250 dGPU (that arguably isn't even worth that much) is a very "apples to oranges" comparison.
  • eighthourblink3 days ago
    Coming from a 2060 Super, would this be a good upgrade? I don't really play newer high-demand games, but I do enjoy my emulation. Currently on a 2060 Super and don't really have any issues with emulation. Ryzen 5 3600X / Linux (of course :))
    • nagisa3 days ago
      It'd likely be a side-grade, unless you specifically care about features introduced with the 30/40/50 series (such as increasingly elaborate upscaling and other AI-driven features).

      Although we don't know how the 5050 will perform, the 50 series has roughly the same render performance as 40-series models at the same tier. The 40 series in turn is only a mild bump over the 30 series. And the 30 series was a reasonable improvement over the 20 series, but mostly in perf/$ rather than raw perf. Extrapolating, the 5050 is likely not going to give much of a boost, if any, and spending money on an 8GB card in 2025 is just throwing money away, as software increasingly expects to be able to work with >8GB of VRAM.

  • throitallaway3 days ago
    Can't wait to buy one for $550.
  • gunalx3 days ago
    The x50 line is the new GT x30 tier.
    • distances2 days ago
      Except for the price, where it's the new x60.
  • dunno74563 days ago
    It will take Nvidia 10 years to release the PMU firmware, and then they will cancel it because it's "too old". Just like they did with Pascal, the P520, and other perfectly working hardware that is barely usable to this day.
  • drcongo3 days ago
    I'm still waiting for my DGX Spark. Starting to wonder if Nvidia have hired Musk for their PR to promise things they'll never deliver.
  • yrcyrc3 days ago
    Not following much hardware news. What can I do with this or the Intel Arc? Play games? Run AI workloads? Genuine question.
    • LorenDB3 days ago
      Both. AI is dependent on available VRAM, so the B580 will be able to run larger models than the 5050.
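
      As a rough sizing sketch (weights only, ignoring KV cache and runtime overhead, and assuming common quantization levels):

        # Approximate VRAM needed just to hold model weights
        def weights_gb(params_billion: float, bits_per_weight: int) -> float:
            return params_billion * 1e9 * bits_per_weight / 8 / 1e9

        for params in (7, 13):
            for bits in (4, 8):
                print(f"{params}B @ {bits}-bit ~= {weights_gb(params, bits):.1f} GB")
        # 7B @ 4-bit ~= 3.5 GB, 13B @ 4-bit ~= 6.5 GB, 13B @ 8-bit ~= 13.0 GB;
        # the difference between 8 GB and 12 GB of VRAM decides which of these
        # still fit once context and overhead are added.
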
  • speed_spread3 days ago
    For such insanity, it should be labeled the RTX 5150
  • neepi3 days ago
    Just bought a 16GB 5060 Ti for twice that. I don't feel disappointed.
    • happycube3 days ago
      Yeah, more so since that's the highest Nvidia card available this gen with a sane power connector.
      • sidewndr463 days ago
        12VHPWR has to be one of the weirdest industry decisions I've seen in a while. I thought I had managed to avoid it so far, but I recently bought a power supply that uses it on the modular cable connector.

        But it isn't really that uncommon either. I had a Suzuki motorcycle that used a connector with 15-amp pins to handle 30 amps of current on one pin. I eventually concluded the only reason that connector was in the harness was to ease assembly, and I just cut it out entirely and soldered the junction together.

      • neepi3 days ago
        Oh yeah totally agree with that one. I hate Molex connectors but the new one is just stupid.