(I'm aware that Battlefield series and League of Legends won't work due to draconian anti-cheat -- but nobody in my group cares to play those I guess.)
For some reason, the Lenovo Legion S's Windows still comes with a lot of baggage and background services etc.
I am just using dm-snapshot for this -- block device level, no fancy filesystems.
One annoying thing is that Linux can't run many different GPU drivers at the same time, so you have to make sure the cards work with the same driver.
Proprietary third-party multi-seat solutions also exist for Windows, but Linux has built-in support and it's free.
Also, amazing house, my friend is enamored of the cat-transit. I used to live not too far from you :)
No, just because the Steam Deck's distro is built on Arch, so you can piggyback on what they are doing.
I download the nvidia drivers directly from nvidia. Their installer script is actually pretty decent and then I don't have to worry about whether the distro packages are up-to-date.
Pretty horrible technology, and unfortunately a good majority of the gaming industry by revenue relies on it.
This is my case with my relatively new/high-end RTX 4080 and OLED monitor. So until I upgrade both, I use HDMI to be able to drive a 1440p 240hz 10-bit HDR signal @ 30 Gbps.
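Back-of-the-envelope for that figure (raw pixel rate only; blanking intervals and FRL encoding overhead, which this ignores, are what push the real link requirement up toward the quoted 30 Gbps):

```python
# Raw video bandwidth for 1440p @ 240 Hz, 10 bits per color channel.
w, h, hz = 2560, 1440, 240
bits_per_channel, channels = 10, 3
gbps = w * h * hz * bits_per_channel * channels / 1e9
print(f"{gbps:.1f} Gbit/s raw")  # ~26.5 Gbit/s before link overhead
```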
I finally got the 240hz 4K uncompressed but it required buying a $1300 Asus OLED monitor and the RTX 5090. It looks amazing though, even with frame gen. Monster Hunter had some particularly breathtaking HDR scenes. I think it uses DisplayPort 2.1? Even finding the cable is difficult, Microcenter didn’t have them in April and the only one that worked was the one that came with the monitor.
(Some games support 120, but it's also used to present a 40hz image in a 120hz container to improve input latency for games that can't hit 60 at high graphics quality.)
The best Valve could do is offer a special locked down kernel with perhaps some anticheat capabilities and lock down the hardware with attestation. If they offer the sources and do verified builds it might even be accepted by some.
Doubt it would be popular or even successful on non-Valve machines. But I'm not an online gamer and couldn't care less about anticheats.
For competitive gaming, I think attested hardware & software actually is the right way to go. Don’t force kernel-level malware on everyone.
FPSs can just say "the console is the competitive ranked machine", add mouse + keyboard support and call it a day. But in those games cheaters can really ruin things with aimbots, so maybe it is necessary for the ecosystem, I dunno.
Nobody plays RTSs competitively anymore and low-twitch MMOs need better data hiding for what they send clients so 'cheating' is not relevant.
We are at the point where camera + modded input devices are cheap and easy enough I dunno if anti-cheat matters anymore.
I could almost get on board with the idea of invasive kernel anti-cheat software if it actually was effective, but these games still have cheaters. So you get the worst of both worlds--you have to accept the security and portability problems as a condition for playing the game AND there are still cheaters!
Other games did similarly. Quake 3 Arena added PunkBuster in a patch. The competitive third-party StarCraft 1 server ICCup had an "anti-hack client" as a requirement.
It's a bit like complaining that these days people just want to watch TV, instead of writing and performing their own plays.
Competition vs other human beings is the entire point of that genre, and the intensity when you’re in the top .1% of the playerbase in Overwatch/Valorant/CSGO is really unmatched.
Case in point from a few years back - Fall Guys. Silly fun, sloppy controls, a laugh. And then you get people literally flying around because they've installed a hack, so other players can't progress as they can't make the top X players in a round.
So to throw it back - it is just a game, it's so sad that a minority think winning is more important than just enjoying things, or think their own enjoyment is more important than everyone else's.
As an old-timer myself, we thought it was despicable when people replaced downloaded skins in QuakeWorld with all-fullbright versions in their local client, so they could get an advantage spotting other players... I suppose that does show us that multiplayer cheating is almost as old as internet gaming.
Making a Valve-only Linux solution would take a lot of the joy of this moment away for many. But it would also help Valve significantly. It's very uncomfortable to consider, imo.
I'm far from an authority on this topic but from my understanding both Sony/MS have introduced mkb support, but so far it looks to be an opt-in kind of thing and it's still relatively new.
But even then, when everyone is trying out a new indie game there’s a chance it won’t work on non-Windows. It’s happened to me.
I am very pro-Linux and pro-privacy, and hope that the situation improves so I don’t have to continue to compromise.
At the same time, Vulkan support is also getting pretty widespread, I think notably idTech games prefer Vulkan as the API.
Id Software do prefer Vulkan but they are an outlier.
DX12 worked decently better than OpenGL before, and all the gamedevs had Windows, and it was required for Xbox… but now those things are less and less true.
The PlayStation was always “odd-man-out” when it came to graphics processing, and we used a lot of shims, but then Stadia came along and was a proper Linux, so we rewrote a huge amount of our renderer to be better behaved for Vulkan.
All subsequent games on that engine have thus had a Vulkan-friendly renderer by default, one that is implemented more cleanly than the DX12 one and works natively pretty much everywhere. So it's the new default.
https://godotengine.org/article/dev-snapshot-godot-4-6-dev-5...
- https://forums.developer.nvidia.com/t/directx12-performance-is-terrible-on-linux/303207/432
- https://indico.freedesktop.org/event/10/contributions/402/attachments/243/327/2025-09-29%20-%20XDC%202025%20-%20Descriptors%20are%20Hard.pdf
- https://www.youtube.com/watch?v=TpwjJdkg2RE
The problem is on multiple levels, so everything has to work in conjunction to be fixed properly.

Sure, except that anyone can just compile a Linux kernel that doesn't allow that.
Anti-cheat systems on Windows work because Windows is hard(er) to tamper with.
This isn't complicated.
Even the CrowdStrike Falcon agent has switched to eBPF because it lowers the risk of a kernel driver bricking downstream machines, like what happened with Windows that one time. I recently configured a corporate single sign-on to simply not work if the eBPF component was disabled.
Anticheat and antivirus are two similar but different games. It's very complicated.
Although even then I'd still have qualms about paying for the creation of something that might pave the path for hardware vendors to work with authoritarian governments to restrict users to approved kernel builds. The potential harms are just not in the same league as whatever problems it might solve for gamers.
Is it possible to do this in a relatively hardware-agnostic, but reliable manner? Probably not.
That way you could use an official kernel from Fedora, Ubuntu, Debian, Arch etc. A custom one wouldn't be supported but that's significantly better than blocking things universally.
I'm not aware that a TPM is capable of hiding a key without the OS being able to access/unseal it at some point. It can display a signed boot chain but what would it be signed with?
If it's not signed with a key out of the reach of the system, you can always implement a fake driver pretty easily to spoof it.
Basically, the TPM includes a key that's also signed with a manufacturer key. You can't just extract it, and the signature ensures that this key is "trusted". When asked, the TPM will return the boot chain (including the bootloader or UKI hash), signed by its own key, which you can present to the remote party. The whole protocol is more complicated and includes a challenge.
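The shape of that challenge-response, as a toy model in Python. Real TPMs use asymmetric keys and X.509 certificates; HMAC stands in for signatures here purely to show the protocol shape, and every name below is made up for illustration:

```python
import hashlib, hmac, os

MANUFACTURER_KEY = b"endorsement-ca"  # what the verifier's trust store holds

def sign(key, msg):
    # Stand-in for a real signature; with HMAC the "public" key isn't
    # actually safe to publish -- this is a toy.
    return hmac.new(key, msg, hashlib.sha256).digest()

class ToyTPM:
    def __init__(self):
        self.ak = os.urandom(32)                        # attestation key, never leaves the "chip"
        self.ak_cert = sign(MANUFACTURER_KEY, self.ak)  # manufacturer vouches for the AK
        # PCRs: digest of the measured boot chain (bootloader, kernel, initrd...)
        self.pcrs = hashlib.sha256(b"bootloader|kernel|initrd").digest()

    def quote(self, nonce):
        # The TPM signs (PCR digest || nonce); the fresh nonce prevents replay.
        return self.pcrs, sign(self.ak, self.pcrs + nonce)

def verify(ak, ak_cert, pcrs, sig, nonce, expected_pcrs):
    if not hmac.compare_digest(ak_cert, sign(MANUFACTURER_KEY, ak)):
        return False  # AK is not vouched for by the manufacturer
    if not hmac.compare_digest(sig, sign(ak, pcrs + nonce)):
        return False  # quote was not produced by that AK over this nonce
    return pcrs == expected_pcrs  # boot chain matches a known-good build

tpm = ToyTPM()
nonce = os.urandom(16)  # remote party's challenge
pcrs, sig = tpm.quote(nonce)
print(verify(tpm.ak, tpm.ak_cert, pcrs, sig, nonce, tpm.pcrs))           # True
print(verify(tpm.ak, tpm.ak_cert, pcrs, sig, os.urandom(16), tpm.pcrs))  # False: stale nonce
```

The point of the fake-driver objection above is exactly the first check: without the manufacturer signature over the attestation key, anything could answer the challenge.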
I don't really care about games, but i do care about messing up people and companies that do such heinous crimes against humanity (kernel-level anti-cheat).
I feel like this is way overstated; it's not that easy to do, and could conceptually be done on Windows too via hardware simulation/virtual machines. Both would require significant investments in development to pull off.
And then you have BasicallyHomeless on YouTube who is stimulating nerves and using actuators to "cheat." With the likes of the RP2040, even something like an aim-correcting mouse becomes completely cheap and trivial. There is a sweet-spot for AC and I feel like kernel-level might be a bit too far.
Valve doesn't employ kernel AC but in practice others have taken that into their own hands - the prevalence of cheating on the official CS servers has driven the adoption of third-party matchmaking providers like FACEIT, which layer their own kernel AC on top of the game. The bulk of casual play happens on the former, but serious competitive play mostly happens on the latter.
And for what it's worth, I'm pretty sure Valorant is the most played competitive shooter at the moment.
https://github.com/JacKeTUs/linux-steering-wheels
Hopefully VR headset support will get better.
I haven’t found a tool that can access all the extra settings of my Logitech mouse, nor my Logitech speakers.
OpenRGB is amazing but I’m stuck on a version that constantly crashes; this should be fixed in the recent versions but nixpkgs doesn’t seem to have it (last I checked).
On the other hand I did manage to get SteamVR somewhat working with ALVR on the Quest 3, but performance wasn’t great or consistent at all from what I remember (RTX 3070, Wayland KDE).
You don’t want a vendor you have to publicly shame to get them to do the right thing. And that’s MS, if any single sentence has ever described them without using curse words.
EAC has support for Linux; you just have to enable it as a developer.
I know this; I worked on games that used it. EAC was used on Stadia (which was a Debian box) for The Division, because the server had to detect that EAC was actually running on the client.
I feel like I bring this up all the time here but people don’t believe me for some reason.
This does not mean it supports the full feature set as from EAC on Windows. As an analogy, it's like saying Microsoft Excel supports iPad. It's true, but without VBA support, there's not going to be many serious attempts to port more complicated spreadsheets to iPad.
Stock price growth is their core business because that is how large firms operate.
MS used to embrace games etc because the whole point was all PCs should run Windows. Now the plan is to get you onto a subscription to their cloud. The PC bit is largely immaterial in that model. Enterprises get the rather horrible Intune bollocks to play with but the goal is to lock everyone into subs.
I thought all of them more or less have operated under Ponzinomics ever since Jack Welch showed that that worked in the short term.
The strength of Linux and Free software in general is not in that it's completely built by unpaid labor. It's built by a lot of paid, full-time labor. But the results are shared with everyone. The strength of Free software is that it fosters and enforces cooperation of all interested parties, and provides a guarantee that defection is an unprofitable move.
This is one of the reasons you see Linux everywhere, and *BSD, rarely.
I doubt it's a large reason. I'd put more weight on eg Linus being a great project lead and he happens to work on Linux. And a lot of other historical contingencies.
This flow is basically the bread and butter for the OSS community and the only way high effort projects get done.
This is a far better user experience for Battlefield players than in Windows.
Have you ever actually attempted to play that half-assed buggy piece of shit?
The one thing I haven’t been able to get working reliably is steam remote play with the Linux machine as host. Most games work fine, others will only capture black screens.
Granted, I don't play online games, so that might change things, but for years I used to have to make a concession that "yeah Windows is better for games...", but in the last couple years that simply has not been true. Games seem to run better on Linux than Windows, and I don't have to deal with a bunch of Microsoft advertising bullshit.
Hell, even the Microsoft Xbox One controllers work perfectly fine with xpad and the SteamOS/tenfoot interface recognizes it as an Xbox pad immediately, and this is with the official Microsoft Xbox dongle.
At this point, the only valid excuses to stay on Windows, in my opinion, are online games and Microsoft Office. I don't use Office since I've been on Unixey things so long that I've more or less just gotten used to its options, but I've been wholly unable to convince my parents to change.
I love my parents, but sometimes I want to kick their ass, because they can be a bit stuck in their ways; I am the one who is expected to fix their computer every time Windows decides to brick itself, and they act like it's weird for me to ask them to install Linux. If I'm the one who has to perform unpaid maintenance, I don't think it's weird for me to try to get them to use an operating system with diagnostic tools that actually work.
As far as I can tell, the diagnostic and repair tools in Windows have never worked for any human in history, and they certainly have never worked for me. I don't see why anyone puts up with it when macOS and Linux have had tools that actually work for a very long time.
(cue superiority complex) I've been using desktop Linux for over 10 years. It's great for literally everything. Gaming admittedly is like 8/10 for compatibility, but I just use a VM with PCIe passthrough to pass in a GPU and load up a game for Windows, or use CAD, etc. Seriously, ez.
Never had issues with NVIDIA GFX with any of the desktop cards. Laptops... sure they glitch out.
Originally Wine, then Proton, and now Bazzite make it super easy to game natively. The only issues I ever had with games were from the bundled kernel-level anti-cheats. The anti-cheats just weren't available for Linux, so the games didn't start. Anyone familiar with those knows it's not a Linux thing, it's a publisher/anti-cheat-mechanism thing. Just lazy devs really.
(cue opinionated anti-corporate ideology) I like to keep Microsoft chained up in a VM where it belongs so it can't do its shady crap. Also, with a VM you can do shared folders and clipboard. Super handy actually.
Weirdly enough, macOS in a VM is a huge pita, and doesn't work well.
That isn't weird. It's by design. macOS is only designed to run on Apple hardware, and a VM, even if the host is Apple hardware, isn't really Apple hardware.
Tried running Worms: instant crash, no error message.
Tried running Among Us: instant crash, had to add cryptic arguments to the command line to get it to run.
Tried running Parkitect: crashes after 5 minutes.
These three games are extremely simple, graphically speaking. They don't use any complicated anti-cheat measure. This shouldn't be complicated, yet it is.
Oh and I'm using Arch (BTW), the exact distro SteamOS is based on.
And of course, as always, those for whom it works will tell you you're doing-it-wrong™.
Hard to say what might be going wrong for you without more details. I would guess there's something wrong with your video driver. Maybe you have an nvidia card and the OS has installed the nouveau drivers by default? Installing the nvidia first-party drivers (downloaded from the nvidia web site) will fix a lot of things. This is indeed a sore spot for Linux gaming, though to be fair graphics driver problems are not exactly unheard of on Windows either.
Personally I have a bunch of machines dedicated to gaming in my house (https://lanparty.house) which have proven to be much more stable running Linux than they were with Windows. I think this is because the particular NIC in these machines just has terrible Windows drivers, but decent Linux drivers (and I am netbooting, so network driver stability is pretty critical to the whole system).
Woah, that is extremely cool. Very nice work, sir.
Crazy—it used to be that nvidia drivers were by far the least stable parts of an install, and nouveau was a giant leap forward. Good to know their software reputation has improved somewhat
SteamOS is based on Arch, but customized and aimed at specific hardware configurations. It’d be interesting to know what hardware you’re using and if any of your components are not well supported.
FWIW, I’ve used Steam on Linux (mostly PopOS until this year, then Bazzite) for years and years without many problems. ISTR having to do something to make Quake III work a few years ago, but it ran fine after and I’ve recently reinstalled it and didn’t have to fuss with anything.
Granted, I don’t run a huge variety of games, but I’ve finished several or played for many hours without crashes, etc.
I've been gaming on linux exclusively for about 8 years now and have had very few issues running windows games. Sometimes the windows version, run through proton, runs better than the native port. I don't tend to be playing AAA games right after launch day, though. So it could be taste is affecting my experience.
I'm not saying "you're doing it wrong", because obviously if you're having trouble then that is, if nothing else, bad UX design, but I actually am kind of curious as to what you're doing different than me. I have an extremely vanilla NixOS setup that boots into GameScope + Tenfoot and I drive everything with a gamepad and it works about as easily as a console does for me.
That probably includes anything that isn't a PC in a time-capsule from when the game originally released, so any OS/driver changes since then, and I don't think we've reached the point where we can emulate specific hardware models to plug into a VM. One of the reasons the geforce/radeon drivers (eg, the geforce "game ready" branding) are so big is that they carry a whole catalogue of quirk workarounds for when the game renderer is coded badly or to make it a better fit to hardware and lets them advertise +15% performance in a new version. Part of the work for wine/proton/dxvk is going to be replicating that instead of a blunt translation strictly to the standards.
With regards to Linux I generally just focus on hardware from brands that have historically had good Linux support, but that's just a rule of thumb, certainly not perfect.
It's still open to customizing, but out of the box it is very damn usable and flexible.
There are people who make stripped-down versions of windows. Is it fair to say that because these releases exist that windows isn't "just works" either?
On the laptop I use to write blog posts, which never ever gets plugged into a second screen? Sure, Wayland's great. On a computer that I expect normal people to be able to use without dumb problems? Hell no!
Unfortunately, Wayland inherently can't be like Pipewire, which instantly solved basically 90% of audio issues on Linux through its compatibility with Pulseaudio, while having (in my experience) zero drawbacks. If someone could make the equivalent of Pipewire for X11, that'd be nice. Probably far-fetched though.
Well you see, you are actually just silly for wanting this or asking for this, because it's actually just a security flaw...or something. I will not elaborate further.
I'd say it pretty much "just works" except less popular apps are a bit more work to install. On occasion you have to compile apps from source, but it's usually relatively straightforward and on the upside you get the latest version :)
For anyone who is a developer professionally I'd say the pros outweigh the cons at this point for your work machine.
Although it was to BSDi then, and then FreeBSD and then OpenBSD for 5 years or so. I can't remember why I switched to Debian but I've been there ever since.
I'm sat here now playing Oxygen Not Included.
Interesting, I've had to switch off from Gnome after the new release changed the choices for HiDPI fractional scaling. Now, for my display, they only support "perfect vision" and "legally blind" scaling options.
Now whether or not this feature should have remained experimental is a different debate. I personally find that similar to the fact that Gmail has labeled itself beta for many years.
So on my Framework 13, I no longer have the 150% option. I can pick 133%, double, or triple. 160% would be great, but that requires a denominator of 5, which Gnome doesn't evaluate. And you can't define your own values in monitors.xml anymore.
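If I follow, the underlying arithmetic is just reduced fractions: a scale only becomes available once Gnome evaluates its denominator, and 160% first appears at d=5. A sketch (my guess at the framing, not Gnome's actual search code):

```python
from fractions import Fraction

# Fractional scales between 100% and 200% whose reduced form has
# denominator d. 150% lives at d=2; 160% (8/5) only shows up at d=5,
# the denominator the post says Gnome doesn't evaluate.
for d in range(2, 6):
    scales = [Fraction(n, d) for n in range(d + 1, 2 * d)
              if Fraction(n, d).denominator == d]   # skip reducible ones like 6/4
    print(d, [f"{float(s):.0%}" for s in scales])
```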
One side benefit is how fast Moonlight + Sunshine is for using a remote desktop. Much smoother than RDP. On those rare times I need to do work in Windows, I just tab into Moonlight.
If you were interested in PC gaming, this could be a great setup for you. Put a Windows or Linux box at the corner of your desk or in a closet. Never have to think of the hardware again after the initial setup.
Have that desktop be reachable with SSH for all your CLI and sys admin needs, use sunshine/moonlight for the remote streaming and tailscale for securing and making sunshine globally available.
Beyond that, Lunar Lake chips are evidently really really good. The Dell XPS line in particular shows a lot of promise for becoming a strict upgrade or sidegrade to the M2 line within a few years, assuming the haptic touchpad works as well as claimed. In the meantime, I'm sure the XPS is still great if you can live with some compromises, and it even has official Linux support.
I don’t exactly understand this setup. What’s the VM tech?
Most VM software (at least all of it that I've tried) doesn't properly emulate this. Instead, after you've moved your fingers some distance, it's translated to one discrete "tick" of a mouse scroll wheel, which causes the document to scroll a few lines.
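A toy model of that quantization; the 40-px-per-tick threshold is invented for illustration, not any particular VM's value:

```python
# Smooth trackpad deltas come in from the host, but the virtual input
# device only speaks discrete wheel "ticks", so small motions vanish
# and then land all at once.
TICK_PX = 40

def quantize(deltas):
    ticks, acc = [], 0.0
    for d in deltas:
        acc += d
        n = int(acc // TICK_PX)   # whole ticks accumulated so far
        ticks.append(n)
        acc -= n * TICK_PX        # carry the remainder forward
    return ticks

smooth = [8.5] * 12               # a gentle two-finger drag, ~102 px total
print(quantize(smooth))           # [0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0]
```

Twelve smooth input events collapse into two abrupt jumps, which is exactly the "few lines at a time" scrolling described above.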
The VM software I use is UTM, which is a frontend to QEMU or Apple Virtualization framework depending on which setting you pick when setting up the VM.
This is an understatement. It is completely impossible to even attempt to install Linux at all on an M3 or M4, and AFAIK there have been no public reports of any progress or anyone working on it. (Maybe there are people working on it, I don’t know).
Sounds like the GPU architecture changed significantly with M3. With M4 and M5, the technique for efficiently reverse-engineering drivers using a hypervisor no longer works.
Thanks, I guess I stand corrected.
> There are screenshots of an M3 machine running Linux and playing DOOM at around 31:34 here
That is encouraging! Still, there is no way for a normal user to try to install it, unless something changed very recently.
I've had Linux running on a variety of laptops since the noughties. I've had no more issues than with Windows. ndiswrapper was a bit shit but did work back in the day.
What issues have you had?
Updated Mesa to the latest and the kernel too.
Not working with Linux is a function of Apple, not Linux. There is a crew who have wasted the last half decade trying to make Asahi Linux, a distro to run on ARM macbooks. The result is after all that time, getting an almost reasonably working OS on old hardware, Apple released the M4 and crippled the whole effort. There's been a lot of drama around the core team who have tried to cast blame, but it's clear they are frustrated by the fact that the OEM would rather Asahi didn't exist.
I can't personally consider a laptop which can't run linux "top notch." But I gave up on macbooks around 10 years ago. You can call me biased.
Amazing that high dpi still doesn’t work. I tried to run linux on 4k in around 2016-2017 and the experience was so bad I gave up.
Prior to that, Windows was better on laptops due to having the proprietary drivers and working ACPI. But it was pretty poor quality in terms of reliability, and the main problem was the included software being incredibly bare-bones, combined with the experience of finding and installing software being so awful (especially if you've not got an unlimited credit card to pay for "big professional solutions").
Every time the year of the Linux desktop arrives, I'm baffled, since not much has changed on this end.
Also, I was basically a child and had no idea what I was doing (I still don't, but that's beside the point). Things have definitely gotten better.
(My customer demographic is seniors & casual users).
Loading Teams can take minutes. I'm often late to meetings waiting for the damn thing to load.
Feels like early 90s computing and that Moore's Law was an excuse for bad coding practices and pushing newer hardware so that "shit you don't care about but is 'part of the system'" can do more monitoring and have more control of 'your' computer.
It’s super annoying!
Bazzite is rough in the way that all distributions are, but I imagine Windows 11 is rougher.
In Fedora Atomic it should be foolishly easy to set up a system account, with access to specific USB devices via group, and attach a volume that can easily be written to by a non-root user inside of the container.
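For the "specific USB devices via group" part, a udev rule sketch; the vendor/product IDs and the `usbdev` group name here are placeholders I made up, not anything from the setup above:

```
# /etc/udev/rules.d/70-usbdev.rules -- placeholder IDs; find real ones with lsusb
SUBSYSTEM=="usb", ATTRS{idVendor}=="1a2b", ATTRS{idProduct}=="3c4d", GROUP="usbdev", MODE="0660"
```

Add the system account to that group and the container can open the device node without root.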
Ubuntu seems to be slowly getting worse.
- Firefox seems to be able to freeze both itself and, sometimes, the whole system. Usually while typing text into a large text box.
- Recently, printing didn't work for two days. Some pushed update installed a version of the CUPS daemon which reported a syntax error on the cupsd.conf file. A few days later, the problem went away, after much discussion on forums about workarounds.
- Can't use more than half of memory before the OOM killer kicks in. The default rule of the OOM killer daemon is that if a process has more than half of memory for a minute, kill it. Rust builds get killed. Firefox gets killed. This is a huge pain on the 8GB machine. Yes, I could edit some config file and stop this, but that tends to interfere with config file updates from Ubuntu and from the GUI tools.
None of these problems existed a year ago.
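On the OOM point: if the daemon in question is systemd-oomd (the default on recent Ubuntu desktops), its thresholds can be overridden with a drop-in, which sidesteps the config-update conflicts mentioned above because the packaged main file is left untouched. Values here are illustrative placeholders, not recommendations:

```
# /etc/systemd/oomd.conf.d/50-local.conf  (placeholder values)
[OOM]
SwapUsedLimit=90%
DefaultMemoryPressureLimit=80%
DefaultMemoryPressureDurationSec=60s
```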
But you can adjust your own system. It'd be unhelpful of me to suggest to an unhappy Windows user that they should switch to another operating system, as that demands a drastic change of environment. On the other hand, you're already familiar with Linux, so the switching cost to a different Linux distribution is significantly lower. Thus I can fairly say that "Ubuntu getting worse" is less of a problem than "Windows getting worse." You have many convenient options. A Windows user has fewer.
1. 10bpp color depth is not supported on RGB monitors, which are the majority of LCD displays on the market. Concretely, ARGB2101010 and XRGB2101010 modes are not supported by current nVidia Linux drivers - the drivers only offer ABGR2101010 and XBGR2101010 (See: https://github.com/NVIDIA/open-gpu-kernel-modules/blob/main/...).
2. Common browsers like Chrome and Firefox have no real support for HDR video playback on nVidia Linux drivers. The "HDR" option appears on YouTube, but no HDR color can be displayed with an nVidia GPU.
Also, video backgrounds in Google Meet on Chrome are broken with nVidia GPUs and Wayland. Ironically it works on Firefox. This has been broken for a few years and no fix is in sight.
The "HDR" toggle you get on Plasma or Mutter is hiding a ton of problems behind the scenes. If you only have 8bpp, even if you can find an app that somehow displays HDR colors on nVidia/Wayland - you'll see artifacts on color gradients.
My monitors (InnoCN 27M2V and Cooler Master GP27U) require RGB input, which means it's limited to 8bpp even with HDR enabled on Wayland. There's another commentator below who uses a Dell monitor and manages to get BGR input working and full HDR in nVidia/Linux.
PC has Manjaro Linux with RTX 3060 12GB
Graphic card driver: Nvidia 580.119.02
KDE Plasma Version 6.5.4
KDE Frameworks Version: 6.21.0
Qt Version: 6.10.1
Kernel Version 6.12.63-1-MANJARO
Graphics Platform: Wayland
Display Configuration High Dynamic Range: Enable HDR is checked
There is a button for brightness calibration that I used for adjustment.
Color accuracy: Prefer color accuracy
sRGB color intensity: This seems to do nothing (even after apply). I've set it to 0%.
Brightness: 100%
The TV is reporting an HDR signal. The AVR is reporting...
Resolution: 4K VRR
HDR: HDR10
Color Space: RGB / BT.2020
Pixel Depth: 10 bits
FRL Rate: 24 Gbps
I compared Interstellar, 19s into the YouTube video, in three different ways on Linux, and at 2:07:26 on the Blu-ray.

For Firefox 146.0.1, by default there is no HDR option on YouTube; the 4K video clearly doesn't have HDR. I enabled HDR in Firefox by going to about:config and setting the following to true: gfx.wayland.hdr, gfx.wayland.hdr.force-enabled, gfx.webrender.compositor.force-enabled. Colors look completely washed out.
For Chromium 143.0.7499.169, HDR is enabled by default. This looks like HDR.
I downloaded the HDR video from YouTube and played it using mpv v0.40.0-dirty with settings --vo=gpu-next --gpu-api=vulkan --gpu-context=waylandvk. Without these settings the video seems a little too bright, like the Chromium playback. This was the best playback of the three on Linux.
On the Blu-ray the HDR is Dolby Vision according to both the TV and the AVR. The AVR is reporting...
Resolution: 4k24
HDR: Dolby Vision
Color Space: RGB
Pixel Depth: 8 bits
FRL Rate: no info
...I looked into this and apparently Dolby Vision uses RGB tunneling for its high-bit-depth (12-bit) YCbCr 4:2:2 data.
The Blu-ray looks like it has the same brightness range, but the color of the explosion (2:07:26) seems richer compared to the best playback on Linux (19s). I would say the colors overall look better on the Blu-ray.
I might be able to calibrate it better if the sRGB color setting worked in the display configuration. Also I think my brightness setting is too high compared to the Blu-ray. I'll play around with it more once the sRGB color setting is fixed.
*Edit: Sorry Hacker News has completely changed the format of my text.
Also, go to YouTube and play this video: https://www.youtube.com/watch?v=onVhbeY7nLM
Do it once on "HDR" on Linux, and then on Windows. The "HDR" in nVidia/Linux is fake.
The brightness you see on Plasma or Mutter is indeed related to the HDR support in the driver. But - it's not really useful for the most common HDR tasks at the moment.
Your Display Configuration
Both monitors are outputting 10-bit color using the ABGR2101010 pixel format.
| Monitor | Connector | Format | Color Depth | HDR | Colorspace |
|------------------------|-----------|-------------|-------------|--------------|------------|
| Dell U2725QE (XXXXXXX) | HDMI-A-1 | ABGR2101010 | 10-bit | Enabled (PQ) | BT2020_RGB |
| Dell U2725QE (XXXXXXX) | HDMI-A-2 | ABGR2101010 | 10-bit | Disabled | Default |
* Changed the serial numbers to XXXXXXX

I am on Wayland and outputting via HDMI 2.1 if that helps.
EDIT: Claude explained how it determined this with drm_info, and manually verified it:
> Planes 0 and 3 are the primary planes (type=1) for CRTCs 62 and 81 respectively - these are what actually display your desktop content. The Format: field shows the pixel format of the currently attached framebuffer.
EDIT: Also note that I am slowbanned on this site, so may not be able to respond for a bit.
EDIT: You should try connecting with HDMI 2.1 (you will need an 8K HDMI cable or it will fall back to older standards instead of FRL).
EDIT: HDR on YouTube appears to work for me. YouTube correctly identifies HDR on only 1 of my monitors and I can see a big difference in the flames between them on this scene: https://www.youtube.com/watch?v=WjJWvAhNq34
Here's what I'm getting on both monitors, with HDR enabled on Gnome 49: https://imgur.com/a/SCyyZWt
Maybe you're lucky with the Dell. But as I understand, HDR playback on Chrome is still broken.
I'm actually surprised that YouTube HDR works on your side - perhaps it's tied to the ABGR2101010 output mode being available.
That's still pretty crappy. Monitors do not say whether they support BGR input signals as opposed to RGB.
The GPU and monitor combination has full 10-bit HDR in Windows. But on Linux it's stuck at 8 bits per channel because the nVidia driver doesn't offer 10-bit RGB output.
EDIT: See my sibling comment.
Here's what I'm getting on an RTX 4090 / InnoCN 27M2V and Cooler Master Tempest GP27U.
Debian is a breath of fresh air in comparison. Totally quiet and snappy.
> The drivers included are just too old.
This can usually be fixed by enabling Debian Backports. In some cases, it doesn't even need fixing, because userland drivers like Mesa can be included in the runtimes provided by Steam, Flatpak, etc.
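(For anyone who hasn't done it: enabling backports is a one-liner plus an explicit install. A sketch assuming Debian 13 "trixie"; substitute your release's codename:)

```shell
# Add the backports repo (codename is an assumption; adjust for your release)
echo 'deb http://deb.debian.org/debian trixie-backports main' | \
  sudo tee /etc/apt/sources.list.d/backports.list
sudo apt update

# Backported packages are never pulled in automatically; request them with -t
sudo apt install -t trixie-backports linux-image-amd64
```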
Once set up, Debian is a very low-maintenance system that respects my time, and I love it for that.
However, despite really, really wanting to switch (and having it installed on my laptop), I keep finding things that don't quite work right, which prevents me from switching some of my machines. On my living room PC, the one my TV is connected to, the DVR software that runs my TV tuner card doesn't quite work right (despite having a native Linux installer), and I couldn't get channels to come through as clearly or as easily. I spent a couple of hours troubleshooting and gave up.
My work PC needs the Dropbox app (which has a Linux installer), but it also needs the "online-only" functionality so that I can see and browse the entire (very large) Dropbox directory without having it all stored locally. This feature has been requested on the Linux version of the app for years, and Dropbox appears unlikely to add it anytime soon.
Both of these are pretty niche issues that I don't expect to affect the vast majority of users (and the dropbox one in particular shouldn't be an issue at all if my org didn't insist on using dropbox in a way that it is very much not intended to be used, and for which better solutions exist, but I have given up on that fight a long time ago), and like I said, I've had linux on my laptop for a couple of years so far without any issue, and I love it.
I am curious how many "edge cases" like mine exist out there though. Maybe there exists some such edge case for a lot of people even while almost no one has the same edge case issue.
But some of the drawbacks really aren't edge cases. Apparently there is still no way for me to have access to most creative apps (e.g. Adobe, Affinity) with GPU acceleration. It's irritating that so few Linux install processes are turnkey the way they are for Windows/Mac, with errors and caveats that cost less-than-expert users hours of experimenting and mucking with documentation.
I could go on, but it really feels like a bad time to be a casual PC user these days, because Windows is an inhospitable swamp, and Linux still has some sharp edges.
It doesn't feel real sometimes. My dotfiles are modularized, backed up in Github and versioned with UEFI rollback when I update. I might be using this for the rest of my life, now.
It's not advisable to switch to one of these paranoid configurations outright, but they're a great introduction to the flexibility provided by the NixOS configuration system. I'd also recommend Xe's documentation of Nix Flakes, which can be used on any UNIX-like system including macOS: https://xeiaso.net/blog/nix-flakes-1-2022-02-21/
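(That flexibility extends to recovery: NixOS records every rebuild as a bootable "generation", so a bad configuration is one reboot and a boot-menu pick away from being undone. A sketch of the standard commands:)

```shell
sudo nixos-rebuild switch            # build and activate the current config
sudo nixos-rebuild switch --rollback # revert to the previous generation

# List the generations kept in the system profile:
sudo nix-env --list-generations --profile /nix/var/nix/profiles/system
```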
It's a slow moving evergreen topic perfect for a scheduled release while the author is on holiday. This is just filler content that could have been written at any point in the last 10 years with minor changes.
I've not seen anything like the current level of momentum, ever, nor this level of mainstream exposure. Gaming has changed the equation, and 2026 will be wild.
On the other hand, on the Linux side, we had the release of COSMIC, which is an extremely user-friendly desktop. KDE, Gnome, and others are all at a point where they feel polished and stable.
The level of momentum feels roughly equivalent to the era of Ubuntu coming around in the mid-2000s. We have been here before.
1. 'office' cloud services - now you just need a browser for majority of docs/sheet/slides tasks
2. gaming - while it was possible back then, it was really hit or miss from game to game. Nowadays the vast majority of games work on Linux out of the box.
The bloat is astounding. This is especially egregious now that RAM costs a fortune.
To be honest, I always figured we'd make it in the long run. We're a thrifty bunch, we aim to set up sustainable organizations, we're more enshittification-resistant by nature. As long as we're reliable and stick around for long enough.
The success measurements are quite strange. How am I supposed to think Linux is finally good when 96.8% of users do not care to adopt it? I can't think of anything else with that high of a rejection rate. The vast majority do not consider it good enough to use over Windows.
E.g. three weeks ago nvidia pushed bad drivers which broke my desktop after a reboot. I had to switch to a TTY (Ctrl-Alt-F3 etc.), since I never got into GNOME at all, and roll back to an earlier version. Automatic rollback of bad drivers would have saved me here.
Are Radeon drivers less shit?
Then again, Arch is one of those distros with the attitude that you need to be somewhat engaged in, and responsible for, the ongoing maintenance of your system, which is why I'm against blind "just use (distro)" recommendations unless the distro is very basic and makes few assumptions about the user.
[0] https://old.reddit.com/r/archlinux/comments/1prm8rl/archanno...
A couple of months ago I bought a second hand RX 7800 XT, and prepared myself for a painful experience, but I think it just worked. Like I got frustrated trying to find out how to download and install the driver, when I think it just came with Linux Mint already.
After a particularly busy OSS event, a non-programmer friend of mine asked me: why is it that the Linux people seem so needy for everyone to make the same choices they make? Trying to answer that question changed my perspective on the entire community. And here we are; after all these years the same question still applies.
Why are we so needy for ALL users and use-cases to be Linux-based and Linux-centric once we make that choice ourselves? What is it about Linux? The BSD people seem not to suffer from this, and I've never heard anyone advocate for migration to macOS in spite of it being superior for specific use cases (like music production).
IMO if you're a creator, operating systems are tools; use the tool that fits the task.
I do understand the evangelism being obnoxious. I don’t advocate for people to switch if they have key use cases that ONLY windows or OS X can meet. Certainly not good to be pushy. But otherwise, people are really getting a better experience by switching to Linux.
Yes, you can get this stuff working, but if you enjoy doing other things in life, have a job, and don't live alone, it is SSSOOOOO much easier to get a Mac mini. Or even Windows 11 if that's your thing.
One big plus with Linux, it's more amenable to AI assistance - just copy & paste shell commands, rather than follow GUI step-by-steps. And Linux has been in the world long enough to be deeply in the LLM training corpuses.
The Linux world is amazing for its experimentation and collaboration. But the fragmentation makes it hard for even technical people like me who just want to get work done to embrace it for the desktop.
Ubuntu LTS is probably the right choice. But it's just one more thing I have to go research.
If using Ubuntu LTS for gaming, you might want to add a newer kernel: https://ubuntu.com/kernel/lifecycle
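(The usual route on Ubuntu LTS is the hardware enablement (HWE) kernel metapackage; a sketch for 24.04, assuming the usual package-naming scheme:)

```shell
sudo apt update
sudo apt install --install-recommends linux-generic-hwe-24.04
```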
Linux Mint would also be a reasonable pick.
I haven't tried Bazzite because I'm not into gaming but Linux Mint is working very well for a lot of people coming from Windows. It just works and has great defaults. Windows users seem to pick it up pretty easily.
Also, Linux Mint upgrades very well. I've had a lot of success upgrading to new versions without needing to reinstall everything. Ubuntu and other distros I've tried often have failed during upgrading and I had to reinstall.
Any reasonably popular distro will have enough other users that you can find resources for fixing hitches. The deciding factor that made me go with EndeavourOS was that their website had cool pictures of space on it. If you don't already care then the criteria don't need to be any deeper than that.
Once you use it enough to develop opinions, the huge list of options will thin itself out.
Linux/x86 still is poor for battery life compared to Apple.
That’s my impression anyway.
One thing that can be annoying is how quickly things have moved in the Linux gaming space over the past 5 years. I have been a part of conversations with coworkers who talk about how Linux gaming was in 2019 or 2020. I feel like anyone familiar with Linux will know the feeling of how quickly things can improve while documentation and public information cannot keep up.
Ubuntu’s default desktop felt unstable in a macOS VM. Dual-booting on a couple of HP laptops slowed to a crawl after installing a few desktop apps, apparently because they pulled in background services. What surprised me was how quickly the system became unpleasant to use without any obvious “you just broke X” moment.
My current guess: not Linux in general, but heavy defaults (GNOME, Snap, systemd timers), desktop apps dragging in daemons, and OEM firmware / power-management quirks that don’t play well with Linux. Server Linux holds up because everything stays explicit. Desktop distros hide complexity and don’t give much visibility when things start to rot.
Does this line up with others’ experience? If yes, what actually works long-term? Minimal bases, immutable distros, avoiding certain package systems, strict service hygiene, specific hardware?
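(One way I've been trying to investigate this is to audit what's actually running and scheduled; these are standard systemd commands, so they should work on any systemd-based distro:)

```shell
systemctl list-units --type=service --state=running  # daemons currently up
systemctl list-timers --all                          # scheduled background jobs
systemd-analyze blame                                # per-unit boot-time cost
systemd-cgtop                                        # live CPU/memory per service
```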
For certain periods I have needed to switch to Fedora, or the Fedora KDE spin, to get access to more recent software when using newer hardware. That has generally also been pretty stable, but the constant stream of updates and the short OS life cycle are not really what I'm looking for in a desktop experience.
There are three issues that linux still has, which are across the board:
- Lack of commercial mechanical engineering software support (CAD & CAE software)
- Inability to reliably suspend or sleep for laptops
- Worse battery life on laptops
If you are using a desktop and don't care about CAD or CAE software I think it's probably a better experience overall than windows. Laptops are still more for advanced users imho but if you go with something that has good linux support from the factory (Dell XPS 13, Framework, etc.) it will be mostly frictionless. It just sucks on that one day where you install an update, close the laptop lid, put it in your backpack, and find it absolutely cooking and near 0% when you take it out.
I also have never found something that gave me the battery life I wanted with linux. I used two XPS 13's and they were the closest but still were only like 75% of what I would like. My current Framework 16 is like 50% of what I would like. That is with always going for a 1080p display but using a VPN which doesn't help battery life.
My experience with FOSS has mostly been that mature projects with any reasonable-sized userbase tend to break things in updates less often than proprietary software does, whether it's an OS or just some SaaS product. YMMV. However, I think the most potent way to keep problems like this from ever mattering is a combination of doing my updates manually (or at least on an opt-in basis) and being willing to go back a version if something breaks. Usually this isn't necessary for more than a week or so for well-maintained software, even in the worst case. I use Arch with downgrade (which lets you go back to an old version of any given package) and need to actually use it maybe once a year on average, less in the last five years.
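(For anyone curious, that rollback workflow on Arch looks roughly like this; downgrade is an AUR package, and the cache path shown is pacman's default:)

```shell
# Interactively pick an older version from the local cache or the Arch archive
downgrade nvidia

# Or manually reinstall a previously cached package version:
sudo pacman -U /var/cache/pacman/pkg/nvidia-*.pkg.tar.zst
```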
No, not really. A Linux desktop with a DE will always be slower and more brittle than a headless machine due to the sheer number of packages/components, but something like Arch + Plasma Shell (without the whole KDE ecosystem) should be very stable and snappy. The headaches caused by immutable distros and flatpaks are not worth it IMO, but YMMV.
Not really, no. What did you install that slowed things down?
> If yes, what actually works long-term?
Plain ordinary Ubuntu 24.04 LTS, running on an ancient Thinkpad T430 with a whopping 8GB of RAM and an SSD (which is failing, but that's not Linux's fault, it's been on its way out for about a year and I should probably stop compiling Haiku nightlies on it).
Can you give an example of which desktop apps are "dragging in daemons"?
If you think Gimp is terrible you'll hate something like DaVinci Resolve.
I've run Void Linux + Xmonad for many years without any such issues. I also recently installed CachyOS for my kid to game on (KDE Plasma) and it works super well.
So far all the games I want to play run really well, with no noticeable performance difference. If anything, they feel faster, but it could be placebo because the DE is more responsive.
This is more about what you choose as your operating environment, not what your work imposes as your working environment.
Most places of work, mine included, run Microsoft services that lock them into the ecosystem incredibly tightly.
As per the article title, "if you want to feel like you actually own your PC", this is about your PC, not the one provided to you by your workplace (since it's likely owned by them).
One thing I'm worried about in my work environment is Microsoft enforcing the web versions of Office and deprecating the standalone desktop applications. The web versions are a massive step down in functionality and ease of use. Your mention of OWA makes me suspect that Outlook will be sacrificed the same way at some point anyway.
IMO the next important unblocker for Linux adoption is the Adobe suite. In a post-mobile world one can use a tablet or phone for almost any media consumption. But production is still in the realm of the desktop UX, and photo/video/creative work is the most common form of output. An Adobe CC Linux option would enable that set of "power users". And regardless of their actual percentage of desktop users, just about every YouTuber or streamer talking about technology is by definition a content creator, so opening Linux up to them would have a big effect on adoption.
And yes I've tried most of the Linux alternatives, like GIMP, Inkscape, DaVinci, RawTherapee, etc. They're mostly /fine/ but it's one of the weaker software categories in FOSS-alternatives IMO. It also adds an unnecessary learning curve. Gamers would laugh if they were told that Linux gaming was great, they just have to learn and play an entirely different set of games.
I've used Mint in the past and loved it, until I spent a day trying to get scanner drivers to work. Don't know if that's changed since; that was 4 years ago.
I am using Fedora on machines with new hardware and liking it as well. It has small pluses/minuses vs Mint.
People dual boot SSD OS for very good reasons, as kernel permutation is not necessarily progress in FOSS. Linux is usable by regular users these days, but "Good" is relative to your use case. YMMV =3
And if you are running Chrome, and something starts taking a lot of memory, say goodbye to the entire app without any niceties.
(Yes, this is a mere pet peeve, but it has been causing me so much pain over the past year, and it's such an inferior way to deal with memory limits than what came before it. I don't know why anybody would have taken OOM logic from systemd services and applied it to user-launched processes.)
If anybody can help me out with a better solution with a modern distribution, that's about 75% of the reason I'm posting. But it's been a major pain and all the GitHub issues I have encountered on it show a big resistance to having better behavior like is the default for MacOS, Windows, or older Linux.
Regardless, I believe EarlyOOM is pretty configurable, if you care to check it out.
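(A sketch of that configurability: earlyoom's thresholds and victim preferences are plain command-line flags, typically set in /etc/default/earlyoom; the values and regexes below are illustrative, not recommendations:)

```shell
# /etc/default/earlyoom
# Act when free RAM < 5% and free swap < 10%;
# prefer killing browsers, avoid killing sshd or systemd.
EARLYOOM_ARGS="-m 5 -s 10 --prefer '(^|/)(chrome|chromium|firefox)$' --avoid '(^|/)(sshd|systemd)$'"
```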
If you want a distro that really cares about the desktop experience today, try Linux Mint. Windows users seem to adapt to it quite quickly and easily. It's familiar and has really good defaults that match what people expect.
Try doing less at once, or getting more memory.
If your solution is "don't ever run out of memory" my solution is "I won't ever use your OS unless forced to."
Every other OS handles this better, and my work literally requires pushing the bounds of memory on the box, whether it's 64GB or 1TB of RAM. Killing an entire cgroup is never an acceptable solution, except for the long-running servers that systemd is meant to run.
Windows is unstable even if you have more than enough memory but your swap is disabled, due to how its virtual memory works. It generally behaves much worse than others under heavy load and when various system resources are nearly exhausted.
There are several advanced and very flexible OOM killers available for Linux, you can use them if it really bothers you (honestly you're the first I've seen complaining about it). Some gaming/realtime distros are using them by default.
Of course, if it's absolutely not compatible with your work, you can just disable systemd-oomd. I'm wondering though, what sort of work are you doing where you can't tune stuff to use 95% of your 1TB of memory instead of 105% of it?
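(For completeness, opting out or loosening the limits looks like this; the keys are from oomd.conf(5), and the values here are illustrative:)

```shell
# Turn systemd-oomd off entirely:
sudo systemctl disable --now systemd-oomd.service

# Or raise its pressure thresholds in /etc/systemd/oomd.conf:
#   [OOM]
#   DefaultMemoryPressureLimit=80%
#   DefaultMemoryPressureDurationSec=60s
sudo systemctl restart systemd-oomd.service
```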
Using hardware at least 6-12 months old is a good way to get better compatibility.
Generally Linux drivers only start development after the hardware is available and in the hands of devs, while Windows drivers usually get a head start before release. Brand new hardware on a LTS (long term support) distro with an older kernel is usually the worst compatibility combo.
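(A quick way to see whether the running kernel actually has a driver bound to a new device:)

```shell
# List PCI devices with the kernel driver currently in use for each;
# a device missing its "Kernel driver in use:" line has no bound driver yet.
lspci -nnk

# Narrow it down to the GPU:
lspci -nnk | grep -EiA3 'vga|3d controller'
```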
I recently switched to using a thumb drive to transfer files to and from my phone/tablet; I became demoralized when faced with getting it all set up.
No, thank you! The "smooth, effortless [, compulsory, mandated, enforced] integration" between my Apple devices is the very worst thing about them.
And it mostly works! At least for my games library. The only game I wasn't able to get to work so far is Space Marine 2, but on ProtonDB people report they got it to work.
As for the rest: I've been an exclusive Linux user on the desktop for ~20 years now, no regrets.
I tried Cinnamon, and while it was pleasantly customizable, the single-threadedness of the UI killed it for me. It was too easy to do the wrong thing and lock the UI thread, including via several desktop or tray Spices from the official repo.
I'm switching to KDE. Seems peppier.
Biggest hardware challenge I've faced is my Logitech mouse, which is a huge jump from the old days of fighting with Wi-Fi and sound support. Sound is a bit messy with giving a plethora of audio devices that would be hidden under windows (like digital and analog options for each device) and occasionally compatibility for digital vs analog will be flaky from a game or something, but I'll take it.
Biggest hassle imho is still installing non-repo software. So many packages offer a flatpak, a snap, and build-from-source instructions where you have to figure out the local package names for each dependency, plus one .deb for each different version of Debian and its derivatives, and it's just so tedious to figure out which is the right one.
In case it helps:
If I have an issue with an application, or if I want to install one, I must use the terminal. I can't imagine a Mac user bothering to learn it. Linux is for people who want to maximize the use of their computer without being spied on and without weird background processes. Linux won't die, but it won't catch Windows or Mac in the next 5 decades. People are too lazy for it; forget about learning. I bet you $100 that 99% of the people on the street have never even seen Linux, let alone heard of it. It's not because of marketing; it's because people who tried it returned to Windows or Mac after deciding that installing a driver or an application was too hard to learn.
On one hand we have Steam, which will make thousands of games available on an easy-to-use platform based on Arch.
For developers, we have Omarchy, which makes the experience much more streamlined, pleasant, and productive. I moved both my desktop and laptop to Omarchy and keep one Mac laptop; it's a really good experience. Not everything is perfect, but when I switch to the Mac after Omarchy, I often discover how hard the Mac is to use, and how many clicks it takes to do something simple.
I think both Microsoft and Apple need some serious competition. And again, this is coming from Arch, which turned out to be more stable and serious than Ubuntu.
It's funny they would choose this phrasing.
This is exactly the way I described my decision to abandon windoze, and switch to linux, over 20 years ago...
Can I get a laptop to sleep after closing the lid yet?
Not that long ago the answer to these questions was mostly no (or sort of yes... but very painfully)
On Windows all of this just works.
> on windows all of this just works
Disagree on the sleep one - my work laptop doesn’t go to sleep properly. The only laptop I’ve ever used that behaves as expected with sleep is a macbook.
It’s more than fine for people to dislike Apple products but this is simply not an area where other platforms have them beat.
All in all, I've given up on sleep entirely and default to suspend/hibernate now.
Laptop sleep and suspend can still be finicky unfortunately.
I will say my experience using CAD or other CAE software on windows has gotten progressively worse over the years to the point that FEA is more stable on linux than on windows.
We do really need a Solidworks, Creo or NX on linux though. My hope has been that eventually something like Wine, Proton, or other efforts to bring windows games to linux will result in us getting the ability to run them. They are one of the last things holding me back from fully moving away from windows.
I added a couple VMs running windows, linux, and whatever else I need in proxmox w/ xrdp/rdp and remina, and it's really the best of both worlds. I travel a good deal and being able to remotely connect and pick up where I left off while also not dealing with windows nagware has been great.
As many have pointed out, the biggest factor is obviously the enshittification of Microsoft. Valve has crept up in gaming. And I think understated is how incredibly nice the tiling WMs are. They really do offer an experience which is impossible to replicate on Mac or Windows, both aesthetically and functionally.
Linux, I think, rewards the power user. Microsoft and Apple couldn't give a crap about their power users. Apple has seemed to devolve into "Name That Product Line" fanboy fantasy land and has lost all but the most diehard fans. Microsoft is just outright hostile.
I'm interested to see what direction app development goes in. I think TUIs will continue to rise in popularity. They are snappier and overall a much better experience. In addition, they work over SSH. There is now an entire overclass of power users who are very comfortable moving around in different servers in a shell. I don't think people are going to want to settle for AI SaaS Cloudslop after they get a taste of local-first, and when they realize that running a homelab is basically just Linux, I think all bets are off as far as which direction "personal computing" goes. Also standing firmly in the way of total SSH app freedom are iPhone and Android, which keep pushing that almost tangible utopia of amazing software frustratingly far out of reach.
It doesn't seem like there is a clear winner for the noob-friendly distro category. It seems like they're all pretty good. The gaming distros seem really effective. I finally installed Omarchy, having thought "I don't need it, I can rice my own Arch", etc., and I must say the experience has been wonderful.
I'm pretty comfortable at the "cutting edge" (read: with all my stuff being broken), so my own tastes in OS have moved from Arch to the systemd-free Artix or OpenBSD. I don't really see the more traditional "advanced" Linuxes like Slackware or Gentoo pulling much weight. I've heard interesting things about users building declarative Nix environments and I think that's an interesting path. Personally, I hope we see some new, non-Unix operating systems that are more data- and database-oriented than file-oriented. For now, OpenBSD feels very comfortable; it feels like I have a prayer of understanding what's on my system, and I learn things by using it, the latter of which is also a feature of Arch. The emphasis on clean and concise code is really quite good, and serves as a good reminder that for all the "memory safe" features of these new languages, it's tough to beat truly great C developers for code quality. If you're going to stick with Unix, you might as well go for the best.
More and more I find myself wanting to integrate "personal computing" into my workflow, whether that's apps made for me and me alone, Emacs lisp, custom vim plugins, or homelab stuff. I look with envy at the smalltalks of the world, like Marvelous Toolkit, the Forths, or the Clojure based Easel. I really crave fluency - the ability for code to just pour out - none of the hesitation or system knowledge gaps which come from Stack Overflow or LLM use. I want mastery. I've also become much more tactical on which code I want to maintain. I really have tried to curb "not invented here" syndrome because eventually you realize you aren't going to be able to maintain it all. Really I just want a fun programming environment where I can read Don Knuth and make wireframe graphics demos!
I also play a decent amount of Flight Simulator 2024 and losing that is almost a non-starter for switching.
turn on anticheat if you want to join no cheat sessions.
if you want a cheat game turn off anticheat and you join sessions with other cheat players.
the whole dilemma comes out of malignant users that enjoy destruction of other users ability to enjoy the game.
go nuclear on clients that manage to join anticheat sessions with cheats turned on.

Instead of distro upgrades, spend 3 minutes disabling the newest AI feature using regedit.
But, as the author rightly notes: It's more about a "feeling." Well then, good luck.
If I remember correctly, after the Crowdstrike BSOD-all-windows-instances update last year Microsoft wanted to make some changes to their kernel driver program and these anti-cheat measures on Windows might need to find a new mechanism soon anyway. That's a long way of saying, it's plausible that even that last barrier might come down sooner rather than later.
Some interesting reads on what modern anticheats do:
https://github.com/0avx/0avx.github.io/blob/main/article-3.m...
https://github.com/0avx/0avx.github.io/blob/main/article-5.m...
https://reversing.info/posts/guardedregions/
https://game-research.github.io/ (less in detail and less IDA pseudo)
Not up close due to the vast number of inconsistencies.
This could only be fixed by a user experience built from the ground up by a single company.
I get that you're making a Windows joke, but this describes Linux equally well.
The UX leaves a lot to be desired.
Even modern macs fall short of the UX Apple has traditionally been known for...
MacOS is highly consistent compared to Windows.
Perhaps Linux operating systems like Steam or ChromeOS might finally create a beautiful and consistent UI.
Please revert this submission to use the correct title.
Even with imperatively configured distros like Ubuntu, it's generally much easier to recover from a "screen of death" than in Windows, because the former is less of a black box than the latter. This means it's easier to work out what the problem is and find a fix for it. With LLMs that's now easier than ever.
And, in the worst case that you have to resort to reinstalling your system, it's far less unpleasant to do that in a Linux distro than in Windows. The modern Windows installer is painful to get through, and then you face hours or days of manually reinstalling and reconfiguring software which you can do with a small handful of commands in Linux to get back to a state that is reasonably similar to what you had before.
Incidentally, I can now honestly say I've had more driver issues with Windows than Linux.
The result was that from day 1 of using Linux I never looked back.
Fun aside: I had a hardware failure a few years ago on my old workstation where the first few sectors of every disk got erased. I had Linux up and running in 10 minutes. I just had to recreate the efi partition and regenerate a UKI after mounting my OS from a live USB. Didn't even miss a meeting I had 15 minutes later. I spent hours trying to recover my Windows install. I'm rather familiar with the (largely undocumented) Windows boot process but I just couldn't get it to boot after hours of work. I just gave up and reinstalled windows from scratch and recovered from a restic backup.
Windows has recently been a complete shitshow - so even if Linux hasn't gotten any better (it has) it is now likely better than fiddling around with unfucking Windows, and Windows doing things like deleting all your files.
That's exactly my point.
There's an ever growing list of things to do in order to fix Windows, and that list is likely longer than Linux. This whole "your time is free" argument hinges on Windows not having exactly the same issue, or worse.