81 points by Arun_Kurian 3 days ago | 8 comments
  • zokier 3 days ago
    Here is the scene in the browser. Runs plenty smooth on my 5-year-old computer. https://splatter.app/s/lzs-xtl
    • Arun_Kurian 3 days ago
      Yeah, Jakub really did an amazing job with Splatter. Martin and Will at SuperSplat also have something similar that works great. The only difference with ours is that it's lightweight, native, supports XR (Vision Pro), and the LOD structure is computed at load time, so you can load any splat file within a couple of seconds.
  • tigranbs 3 days ago
    Is this the "wow" effect of the hardware's price-to-performance ratio? The only significant benefit I can see is that the M-series chips use system RAM as GPU memory, which is slower than dedicated GPU memory, but at least you can run things with that much memory.
  • unbelievably 3 days ago
    Is there some sort of level-of-detail going on to make this possible? I'd think that's the only way but the tweet says no preprocessing.
    • Arun_Kurian 3 days ago
      Yup we compute the LOD structure on load within a couple of seconds using GPU compute shaders.
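      For anyone curious what "computing an LOD structure on load" can look like: this is not Arun's actual implementation (which runs in GPU compute shaders), just a minimal CPU sketch of the same idea, merging splats into progressively coarser voxel grids. All names and parameters here are hypothetical.

```python
import numpy as np

def build_lod_levels(centers, opacities, base_cell=0.1, n_levels=3):
    """Merge splats into progressively coarser levels of detail.

    Each level quantizes splat centers to a voxel grid and averages
    all splats that land in the same cell -- a CPU stand-in for the
    parallel reduction a compute shader would perform.
    """
    levels = [(centers, opacities)]
    for i in range(1, n_levels):
        cell = base_cell * (2 ** i)
        keys = np.floor(centers / cell).astype(np.int64)
        # Group splats by voxel key.
        _, inverse, counts = np.unique(
            keys, axis=0, return_inverse=True, return_counts=True)
        inverse = inverse.ravel()  # keep shape stable across NumPy versions
        # Average the members of each cell.
        merged_c = np.zeros((len(counts), centers.shape[1]))
        merged_o = np.zeros(len(counts))
        np.add.at(merged_c, inverse, centers)
        np.add.at(merged_o, inverse, opacities)
        merged_c /= counts[:, None]
        merged_o /= counts
        levels.append((merged_c, merged_o))
    return levels
```

      Each level has at most as many splats as the one before it, so a renderer can pick a level based on distance from the camera.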
  • christkv 3 days ago
    Runs fine on my iPhone 14
  • lifty 3 days ago
    Why can't they make video games with this tech?
    • lunaticlabs a day ago
      As a video game programmer, I can speak to this. For video games, we generally need geometry: flat planes, things you can collide against, things we can reason about. Gaussian splats work as a bunch of 2D images stacked on top of each other that, in combination, look correct. This is great for rendering, but it makes it very difficult, if not impossible, to figure out whether you are inside some geometry or not, because there is no geometry; splats don't give you any way to reason about the scene as solid surfaces. So in the end, you have to create the geometry for the solid surfaces that you will collide against and move around in, and the Gaussian splats would be independent of that. Once you have all that geometry, it's much easier to just render it.

      There are tools that will generate geometry from splats, but the results are generally messy, and fixing the messiness is often more work than just redoing it from scratch. This is another incredibly difficult problem that hasn't been solved particularly well.
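      To make the "no geometry to reason about" point concrete, here is a toy sketch (hypothetical names, isotropic Gaussians for simplicity; real splats are anisotropic with view-dependent color). All a splat scene gives you at a point is a soft density, so any "collision test" is just an arbitrary threshold on it, unlike an exact ray-triangle hit against a mesh:

```python
import math

def splat_density(p, splats):
    """Sum of Gaussian contributions at point p.

    `splats` is a list of (center, sigma, weight) tuples.
    """
    total = 0.0
    for center, sigma, weight in splats:
        r2 = sum((pi - ci) ** 2 for pi, ci in zip(p, center))
        total += weight * math.exp(-r2 / (2.0 * sigma * sigma))
    return total

def inside_wall(p, splats, threshold=0.5):
    # There is no surface to intersect; "inside" can only mean
    # "the fuzzy density here exceeds a cutoff we picked ourselves".
    return splat_density(p, splats) > threshold
```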

    • jncfhnb 3 days ago
      Real world is a shitty map design
    • thfuran 3 days ago
      Has anyone sorted out a good way to do dynamic lighting or animation with it?
    • meffmadd 3 days ago
      From what I have heard, "Bodycam" uses scans of actual locations for its maps.
    • Arun_Kurian 3 days ago
      I think they're coming; you should see a few pop up in 2026 for sure.
    • deadbabe 3 days ago
      Will be a nightmare to license the use of all this data for commercial purposes. Each house, each building, requires consent.

      Sorry, but the answer is no. Unless you are willing to pay.

      • carlosjobim 3 days ago
        That's why Hollywood movies are so expensive. When they have a scene with Spider-Man jumping around New York, they have to pay a fee to every owner of real estate depicted in the scene.

        Worst of all is of course space documentaries, where you can see the whole Earth. The licensing fees are horrendous.

        • MrSkelter a minute ago
          I work in Hollywood and this is not true. We do not have to pay to have buildings in the background, nor does TV. How would anything be filmed outside if news crews had to pay fees for filming like that? I have made films in NY, London, Paris, and Sydney. We can shoot someone walking through a city as long as we have permits for the space. The skyline is free, as is anything else we capture from a space we have rights to.
        • Arun_Kurian 3 days ago
          Wait... so what about Google Maps?
      • segmondy 3 days ago
        Prove it. Under what law?
  • jncfhnb 3 days ago
    What’s the data?
  • tyleo 3 days ago
    The MacBooks have insane performance and everything else is falling behind.

    It won’t be surprising if Apple overtakes Windows as a gaming platform in the coming decades, IMO, if Intel can’t catch up.

    • honeycrispy 3 days ago
      Apple may have good hardware, but their software support is comically bad. Their support for backwards compatibility (aside from Rosetta), which you need for gaming, has basically been "FU, I'm Apple".

      Apple has had little to no interest in becoming a real gaming platform. Unless that changes, gamers will more likely be moving into the sweet embrace of Gaben on Linux.

      • latexr 3 days ago
        Open the App Store on an iPhone. Of the four tabs, two are game-centric (“Games” and “Arcade”), and another (“Today”) has consistently devoted more than half of its features to games.

        In their most recent operating systems, they have released a separate app specifically for games (look at that domain, even).

        https://games.apple.com

        They created the Game Porting Toolkit.

        https://developer.apple.com/games/game-porting-toolkit/

        When they discontinue Rosetta next year, they’ll continue limited support specifically for old games.

        https://developer.apple.com/documentation/apple-silicon/abou...

        Plus, whenever they announce new chips they feature games and gaming personalities in the keynote.

        Those are clear signs they are interested in gaming happening on their platforms. Whether they’re succeeding at it is another story.

        • abtinf 3 days ago
          And how about all the games no longer on the App Store?

          Say, Flight Control (one of the first games to hit a million unit sales), or the Infinity Blade series (which Wikipedia says was removed due to incompatibility with newer Apple platform changes)?

          • latexr 3 days ago
            Both of those examples are old and precede current efforts. Plus, for the third time:

            > Whether they’re succeeding at it is another story.

            I’m not arguing Apple excels or is even decent at video games, I’m simply pointing out that it’s clear they are interested in having them on their platforms.

        • Induane 3 days ago
          Of course, they also took the route of inventing a new 3D API, Metal, which is at odds with Vulkan. There is HoneyKrisp, but if one wants decent gaming on an M1 or M2 laptop, Asahi Linux is actually the superior choice.

          I don't think one can call it even close to a success when the best way to run AAA games on your hardware is to literally replace the entire operating system with one that uses cobbled-together components like FEX and Wine/Proton. The fact that that works with more games is insane.

          • latexr 3 days ago
            Again:

            > Whether they’re succeeding at it is another story.

            You may disagree with their strategy all you like. You may even think they are doing everything wrong, that’s perfectly legitimate. But they are clearly interested in having gaming happen on their platforms. The claim that they aren’t is the only thing I disputed.

          • fingerlocks 2 days ago
            Metal predates Vulkan.
    • semi-extrinsic 3 days ago
      The Apple hardware is indeed very nice, but it's not a good environment for gaming. They've traditionally been quite gaming-hostile, refusing to support the later generations of OpenGL. There was a wrapper for Windows games called Whisky, but it was finicky and became unmaintained. Apple has their own App Store, which sells some games in direct competition with Steam and others, so those actors are probably a bit wary of spending too many resources on the platform. Also, a lot of gamer culture is tied to building your own hardware, which Apple will never support.

      Meanwhile, gaming on Linux is becoming better than on Windows these days, especially with all the trash to be circumvented on Win11 and Valve working hard on SteamOS, etc.

    • realusername 3 days ago
      They'll never take the gaming segment, as they would need proper backwards compatibility.

      No game developer wants to update their game continuously just to keep the lights on.

      The opposite is happening at the moment: they've fallen below Linux as a gaming platform.

    • sublinear 3 days ago
      I was under the impression that the MacBook was only mentioned because it normally wouldn't be able to render this so well.

      It seems like a post about the software on display, i.e. "look, it can even run on an M2 MacBook Air".

    • bigyabai 3 days ago
      I hate to break it to ya, but Apple Silicon isn't among the top 25 highest-performing consumer GPUs. It's probably not even among the top 25 most efficient either: https://browser.geekbench.com/opencl-benchmarks
      • lunaticlabs a day ago
        The big advantage of Macs when it comes to GPUs isn't their raw speed; it's the unified memory model. If I want to buy a GPU that has 64-128 GB of addressable memory, it will cost me an enormous amount, and the computer itself will be a loud rack-mounted server module, not a consumer PC. You can buy a Mac with a unified memory model, and even though its GPU is not at the top of the rankings, the fact that it can operate on your model in regular memory is what gives it its advantage.
      • rafram 3 days ago
        That chart shows that M4 achieves 25% of the Geekbench scores of GPUs that pull >10x more power. That's definitely efficient.
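        Back-of-the-envelope, using the comment's own rough figures (approximations from the thread, not measured data):

```python
# Hypothetical figures from the comment above: the M4 reaches
# ~25% of a big desktop GPU's score at ~1/10 of the power draw.
m4_relative_score = 0.25
m4_relative_power = 0.10  # ">10x more power" => at most 1/10

# Perf-per-watt relative to the big GPU:
efficiency_ratio = m4_relative_score / m4_relative_power  # ~2.5x
```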
        • bigyabai 3 days ago
          Are you comparing it with other 3nm GPUs? When you normalize for process, Apple Silicon is definitely not the most efficient raster architecture.
          • rafram 3 days ago
            It doesn't seem like Nvidia even has any 3nm GPUs on the market. But sure. When you control for power efficiency, it turns out there's no difference at all!
            • bigyabai 3 days ago
              Process is not equivalent to power efficiency. It's a step change enabling better designs.

              Apple and Nvidia both have 5nm and 4nm GPUs. Take those scores and divide them by the TDP; you'll be shocked at the difference design can make.

              • wtallis 3 days ago
                Please never divide anything by TDP. Use actual power measurements, unless you're trying to ensure your numbers end up being bullshit. (In particular, any number someone claims is a TDP for an Apple processor is made up, because Apple doesn't publish or specify any quantity remotely similar to TDP.)
                • bigyabai 3 days ago
                  Okay, then don't divide by TDP. Measure the GPU wattage frame-by-frame and you'll still end up with similar numbers. The point stands.

                  > because Apple doesn't publish or specify any quantity remotely similar to TDP

                  1) That doesn't mean that power usage isn't measurable.

                  2) They actually do, although it's not a perfect chip-by-chip breakdown: https://support.apple.com/en-us/102027

                  • wtallis 2 days ago
                    Are you seriously trying to claim that Apple's total system wall power numbers are appropriate for comparing against an AMD or Intel processor TDP number? You really are trying to ensure the numbers you calculate are bullshit.
                    • bigyabai 2 days ago
                      I think you did not read the context of this discussion. We're talking about GPU power draw, not SoCs; it can be measured on Apple Silicon and compared against third-party raster workloads.

                      If you think any of my calculations are wrong, please cite them and correct them. GPU-to-GPU, Apple's raster performance is lacking.

      • jiehong 3 days ago
        Neither is that their target; they're more in the perf/watt segment.
        • bigyabai 3 days ago
          Which is why it's confusing that the M3 Ultra is less efficient than several 130W laptop chips.
    • alsetmusic 3 days ago
      I'm a huge fan of Apple's hardware since they introduced their own silicon, but this is just silly. Apple doesn't have the personality needed to court and work with game companies. They're busy expecting everyone to come to them when they'd have to actually work to entice them.