There are three possibilities.
1) Intel is optimising for common cases inside the most dominant desktop operating system. This is like Apple having really good floating point in their CPUs, which makes JavaScript not suck for performance… and is why MacBooks feel snappy with Electron.
2) Intel and Microsoft worked together when designing the CPU, so Windows is able to take advantage of some features that Linux is only just learning how to handle (or learning exactly how they work).
3) The way the operating systems schedule tasks happens, by accident, to be better on Windows than on Linux for this generation.
“It’s better” doesn’t really factor in. Windows has been shown repeatedly over the last half-decade to be so inferior that it is beaten by Linux even when Linux is emulating Windows APIs. It’s difficult to be so slow that you’re slower than someone emulating your platform.
> Or maybe it is just better?
On handhelds, Linux crushes it: Bazzite/SteamOS post roughly 20-36% higher FPS than Windows on the ROG Ally X and Legion Go. Cyberpunk 2077 hits 39 FPS at 20 W on Linux versus less on Windows; Returnal is 33 vs 18. [2][3]
On AMD desktops, CachyOS gains 15-80 FPS over Windows 11 in Cyberpunk and CS2 on an RX 5700 XT, with smoother 1% lows too. [2]
NVIDIA is at parity or mixed, and needs tweaks.
Vulkan plus NTSync beats the DirectX overhead. For maybe 90% of Steam games, Linux wins now.
Anti-cheat blocks the rest.
[0]: https://www.reddit.com/r/linux_gaming/comments/1pxtcv3/rx_57...
[1]: https://gamersnexus.net/gpus/rip-windows-linux-gpu-gaming-be...
[2]: https://arstechnica.com/gaming/2025/06/games-run-faster-on-s...
[3]: https://www.notebookcheck.net/Asus-ROG-Xbox-Ally-with-Bazzit...
I'd really wonder, if one took a game that was on both Xbox and Linux, constructed a Linux box with specs as close as possible to the Xbox, and then benchmarked the games against each other, what we would see.
I'm not saying that Linux is better than Windows or that Windows is better than Linux, just that I think it's very hard to make an apples-to-apples benchmark comparison. There are services constantly running on Windows that one doesn't generally have running on a Linux system, and they can cause problems.
You buy Windows as a product, and those subsystems are so spidered in that turning them off is not possible; even if it were possible, it would have some impact.
You buy Windows for games; that's been the consensus for years. The NT kernel could in theory run games 10x better, but it doesn't mean anything, because you only get it with Windows.
So, an apples-to-apples comparison is Bazzite: the general-purpose operating system you install and play games on. No need to apologise for Microsoft's choices.
Otherwise Windows could make WSL (1) faster than Linux, but they can't, because the underlying operating system paradigms aren't similar enough.
I could give examples, but I think just comparing native Python performance on both platforms is the easiest case I can make without going into details; see the sketch below.
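A minimal sketch of the kind of comparison I mean, assuming the same CPython version on both machines (the workload is illustrative, not a rigorous benchmark; small-file I/O is one place where NT's per-call overhead tends to show):

    import os
    import tempfile
    import time

    def bench_small_files(n=2000):
        # Create, write, and delete n small files; return elapsed seconds.
        start = time.perf_counter()
        with tempfile.TemporaryDirectory() as d:
            for i in range(n):
                path = os.path.join(d, "f%d.txt" % i)
                with open(path, "w") as f:
                    f.write("x" * 256)
                os.remove(path)
        return time.perf_counter() - start

    n = 2000
    print("%.3fs for %d create/write/delete cycles" % (bench_small_files(n), n))

Run the identical script on both OSes on the same hardware and compare.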
- Literally the history of Intel for more than 30 years, and likely why we see this benefit now. Gaming the compiler and hoping they won't get caught bought them a decade against AMD.
"Intel and microsoft worked together when designing the CPU"
- I guess the bitterness of Itanium doesn't last forever.
That's what a good benchmark looks like. From ancient wisdom (the Linux Benchmarking HOWTO): "5.3 Proprietary hardware/software
A well-known processor manufacturer once published results of benchmarks produced by a special, customized version of gcc. Ethical considerations apart, those results were meaningless, since 100% of the Linux community would go on using the standard version of gcc. The same goes for proprietary hardware. Benchmarking is much more useful when it deals with off-the-shelf hardware and free (in the GNU/GPL sense) software. "
But that doesn't align with the last 12~20 laptops I've tested between out-of-the-box Ubuntu Linux and Windows: loading up, say, V-Ray, IndigoBench, Blender, etc., and using the official binaries on each platform, Linux has typically dominated in those workloads on both AMD and Intel laptops. So something isn't aligning quite right with this ThinkPad versus all the other hardware I have tested Windows against Linux on.
Also, have you tried Windows 10?
- Intel optimized something MS asked for, so now X and Y syscalls are faster
or
- MS wrote some super-optimized BLAS/LAPACK libraries for this exact CPU which are not (yet) available on Linux (a quick way to test this is sketched after this list)
or
- Intel added management things specifically for Windows.
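If the BLAS/LAPACK hypothesis were the explanation, a quick check would surface it. A sketch, assuming NumPy is installed on both machines (which BLAS it was built against is exactly the variable in question):

    import time
    import numpy as np

    np.show_config()  # shows which BLAS/LAPACK NumPy was built against

    a = np.random.rand(2000, 2000)
    b = np.random.rand(2000, 2000)
    t = time.perf_counter()
    a @ b  # dense matmul; runtime is dominated by the BLAS gemm
    print("matmul: %.3fs" % (time.perf_counter() - t))

If Windows shipped a specially tuned BLAS for this CPU, the matmul time would diverge while non-BLAS benchmarks stayed put.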
It could be that they chose something bleeding-edge and the hardware drivers were built for Windows but are a couple of revisions behind in their Linux equivalents. It could be the development cycle of Windows versus that of Linux and how they integrate with new hardware. Just a hypothesis.
https://news.ycombinator.com/item?id=46002989
(i.e. no license, so it has to fall back to an unaccelerated software-only implementation)
It's 2025, and I would have expected the Linux Foundation or Canonical to at least create a label like "Linux compatible" or "Linux tested" that brands can license, and maybe spend money collaborating with hardware vendors so they can write correct drivers, but that has not happened.
I am not saying Linux/OSS is at fault here, but I am confused why the situation is still so bad. You can even find several governments ready to use Linux, but it's not reliable enough yet, or maybe they're too tech-illiterate.
Open source/Linux folks are so politicized against capitalism, proprietary software, and patents that they have excluded themselves from the economy. Only Valve and the Steam Machine might have a chance of changing that situation, and it's not even guaranteed.
Then Google gives HSBC the ability to lock people out of their banking app if they installed a third-party password manager from the "wrong" app store, and I start to think RMS was right about everything.
Does it always have to be this extreme?
I always thought he was right about some things, but wrong about others. (And, all in all, not great as a public face for the organisation.)
I opened my browser on the same device and transferred it that way. So much for "security".
A few distros do have something like this. Ubuntu has the "Ubuntu Certified" program https://ubuntu.com/certified and Fedora has "Fedora Ready" https://docs.fedoraproject.org/en-US/marketing/ready/list/ . For a situation like this, that doesn't really matter though. Linux does run on the laptop and Lenovo does officially support running Linux on it. If there's a problem with the CPU scheduling or something for that line of processors, Intel would have to fix it, not Lenovo.
> Open source/Linux folks are so politicized against capitalism, proprietary software, and patents that they have excluded themselves from the economy. Only Valve and the Steam Machine might have a chance of changing that situation, and it's not even guaranteed.
I don't know what you're talking about here. The vast majority of Linux kernel development is done by companies, not unpaid volunteers. This has been the case since at least as far back as the mid 2000s.
It doesn’t. I’ve had Windows laptops that burn power when closed and apparently sleeping (in fact we still have one, a Lenovo Yoga), or just run up the fans when idle.
I’ve also had a MacBook that once in a while would be hot and thrashing its fans when I retrieved it from my bag (Retina MBP 2014, IIRC).
I would say that the Framework is fine for battery life when you’re using it, but it loses like 20-30% of battery per day in sleep mode versus like 1% per day for the MBP.
The workaround I use now is to set the FW to hibernate after 30 minutes of sleep so it’s not dead when I decide to use it again after a few days (roughly the systemd setup sketched below).
The downside of this is that waking up takes a couple of minutes and so I still tend to use the MBP if I need to do something quick and don’t want to wait for the hibernate tax.
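For reference, the usual way to get this behaviour on a systemd-based distro is suspend-then-hibernate with a delay. A minimal sketch, assuming systemd handles your sleep and you have a swap area large enough to hold RAM (exact values to taste):

    # /etc/systemd/sleep.conf: hibernate after 30 minutes asleep
    [Sleep]
    HibernateDelaySec=30min

    # /etc/systemd/logind.conf: lid close uses suspend-then-hibernate
    [Login]
    HandleLidSwitch=suspend-then-hibernate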
Honestly I have no issues on my own AMD laptop, but IIRC NVIDIA drivers are still relatively bad at keeping power consumption low.
It would be nice if Linux got the same vendor support as Windows.
A first step would be if the kernel developers weren't changing the kernel-internal APIs all the time. Since the driver model on Windows rarely changes, hardware vendors don't have to keep rewriting their drivers.
Additionally, it is an open secret that Microsoft provides hardware vendors with internal tools to symbolically execute driver binaries and check them for buffer overflows; otherwise, the quality of drivers on Windows would be a disaster. On the other hand, once a driver passes certification (of which this is a part), it is rather easy for the hardware vendor to make it "official": there is no arguing on the LKML that the architecture of the driver the vendor developed doesn't fit what the subsystem maintainer wants.
This all makes Windows a much more "convenient" system for hardware vendors to develop official drivers for.
On the right laptop, Linux will have decent battery life. On the wrong laptop, Windows will have terrible battery life.
To my knowledge, Linux isn’t that capable on big.LITTLE architectures, and Linux power management (which this intersects with) has always left a little to be desired when comparing battery life to Windows.
Disclaimer: pure speculation. Possibly misinformed :-D
Android uses Linux as its kernel and runs on billions of devices with heterogeneous cores. Linux had this capability for far longer than Windows did; Windows for the most part did not run on devices with heterogeneous cores until the Intel Alder Lake (12th-gen) CPUs.
Win11 outperformed Linux at Alder Lake's release too [1], but eventually this changed and Linux was better on Meteor Lake [2]. Arrow Lake probably has some microarchitectural changes that don't mesh well with Linux's core-scheduling logic; Intel will need to fix that, at which point Linux will probably close the gap again.
[1] https://www.phoronix.com/review/alderlake-windows-linux/9 [2] https://www.phoronix.com/review/intel-meteorlake-windows-lin...
The extra capabilities of Android come from custom patches in the Qualcomm kernels. They have diverged so far from mainline that it is really hard to merge them back; they not only add drivers but patch the kernel itself. Windows NT can take thread-scheduling hints from userspace because Microsoft controls Win32. The question then becomes whether there is a way to patch glibc and all the other system libraries on Linux to give equivalent information to the Linux kernel. Of course the kernel can guess, but that is a lossy information channel.
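To be fair, userspace can already push coarse placement information down today, just nothing as rich as Win32's hints. A minimal sketch in Python (the assumption that CPUs 0-7 are the P-cores is machine-specific; check lscpu first):

    import os

    # Restrict this process to CPUs 0-7. On a hybrid Intel part these are
    # often the P-cores, but the numbering is machine-specific (an
    # assumption here), so verify the topology with lscpu.
    os.sched_setaffinity(0, set(range(8)))
    print("now restricted to CPUs:", sorted(os.sched_getaffinity(0)))

That's a blunt instrument compared to per-thread hints, which is exactly the gap being described.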
Had the same thought: I would also expect this to be an artifact of suboptimal scheduling on Linux or some otherwise unidentified issue.
Linux usually outperforms Windows by a good margin on the same hardware.
Also, in my experience, Windows 11 does not improve performance compared to Windows 10 (I have to use both versions at my dayjob).
I would be very surprised if this isn’t an issue with drivers or scheduling.
As much as I want to use Linux on the desktop, I've had terrible cases of instability:
My hardware on Windows 10 works perfectly well. Literally 11 years of being super stable, running assorted workloads (WSL 2, Docker-based development, browsers, heavy terminal-based workflows with Neovim, tmux, etc., video recording and editing, image editing, gaming). There's no lag, jittering or instability. My system never crashes or has weird issues requiring a reboot on Windows 10.
One day into using Arch Linux, as soon as my GPU's memory gets close to 75% full, apps crash, my Wayland compositor (niri) starts to fail in unpredictable ways, and I have to reboot basically every 3 hours because of GPU memory usage.
It's not stable or very usable IMO.
All I did was open 3 Firefox windows and 2 Ghostty terminals. Both apps are hardware-accelerated, so they use GPU memory.
Windows seems to do something magical: it transparently offloads GPU-requested memory to system memory when GPU memory is full. Linux, at least with the official proprietary 580-series DKMS NVIDIA drivers on my GTX 750 Ti, doesn't seem to do this. Instead, I get kernel errors from the NVIDIA driver when it fails to allocate memory, such as:
    kernel: [drm:nv_drm_gem_alloc_nvkms_memory_ioctl [nvidia_drm]] *ERROR* [nvidia-drm] [GPU ID 0x00000100] Failed to allocate NVKMS memory for GEM object
It's wild that my system can be using 2 GB out of 16 GB of system memory at 5% CPU load with no disk I/O happening, yet I can't run a few apps in parallel. It's especially bad when trying to record 1080p videos with OBS; I recorded literally over 1,000 videos on Windows without a single hiccup. I wrote a lot more details about this Linux issue in an Ask HN, but it didn't gain traction: https://news.ycombinator.com/item?id=46436245
I have to hand it to Windows with how it manages system memory and "just works", especially with older hardware.
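For anyone who wants to watch the cliff happen: a minimal sketch using the nvidia-ml-py bindings (the 75% threshold is just the figure from my experience above, not anything the driver reports):

    # pip install nvidia-ml-py
    import time
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
    try:
        while True:
            mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
            pct = 100.0 * mem.used / mem.total
            mark = " <- apps started crashing near here" if pct > 75 else ""
            print("VRAM: %d/%d MiB (%.0f%%)%s"
                  % (mem.used >> 20, mem.total >> 20, pct, mark))
            time.sleep(5)
    finally:
        pynvml.nvmlShutdown()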
Also, as an aside, Ghostty on this Linux machine is very slow compared to Windows Terminal. Opening half a dozen Neovim splits on my 4K monitor completely tanks its performance, to the point where there's a lot of input lag and jitter. Screen redraws are very slow. Windows Terminal had no issues with the same Neovim version and configs running in Arch Linux within WSL 2; it was buttery smooth. I opened a discussion about this on Ghostty with more information: https://github.com/ghostty-org/ghostty/discussions/10114
In 2019 I tried switching to native Linux and it failed with my Scarlett 2i2 USB audio interface: I got endless crackles and pops, and after 5 days of debugging and trying everything, I gave up and went back to Windows.
In Dec 2025 I tried switching to native Linux with the same hardware, and the audio problems are solved, but now there's this GPU memory problem. I spent another few days debugging as much as I could, but it's looking like it's back to Windows.
I think the GPU issues probably won't be solved in 7 years, because NVIDIA said they are going to end-of-life the 580-series drivers in August 2026 and the 590+ series doesn't support my card. The open-source drivers produced a worse experience: they wouldn't let me use my 4K monitor and hard-locked my machine a few times.
Arch, btw, is notoriously unstable. Back in 2012 I caught them enabling experimental kernel memory-paging modules; soon after, my system got bricked. Maybe you want Fedora or Rocky.
As a scientist myself, I would do my best to figure out why before publishing something like this.
The consumer loses out but that's not something new either.
Responsible articles and journals note these things.