I had set up a dual boot when I swapped my old GTX 1080 for an RX 5700 XT, figuring the "open source" drivers would give me a good Linux experience... they didn't. Every other update was a blank/black screen, and me without a good remote config to try to recover it. After about six months it was actually stable, but by then I'd gone ahead and paid too much for an RTX 3080 and gone back to my Windows drive...
I still used WSL almost all day, mostly with VS Code and a browser open: terminal commands through WSL remoting in Code, and results etc. in the browser.
Then, one day, I clicked the trusty super/win menu and started typing the name of the installed program I wanted to run... a freaking ad. In the Start menu search results. I mean, it was a beta channel of Windows, but the fact that anyone thought this was a good idea and it got implemented meant I was out.
I rebooted my personal desktop back to Linux... ran all the updates, and it's run smoothly since. My current RX 9070 XT is better still; couldn't be happier. It does everything I want it to do, and there are enough games on Steam through Proton that I can play what I want, when I want. Even the last half year on Pop!_OS COSMIC pre-release versions was overall less painful than a lot of my Windows experiences of the past few years. Still not perfect, but at least it's fast and doesn't fail in the ways Windows now seems to regularly.
Whoever is steering Windows development at Microsoft is clearly drunk at the wheel of something that should be the most "done" and polished product on the planet, and it just keeps getting worse.
I know it's a "meme" to talk about how great Arch is, but when you want the latest of something, Arch has it. I use EndeavourOS since it has a nicer, simpler installer (idk why Arch doesn't invest in what's standard in every other major distro), and if you just use "yay" you don't run into pacman woes.
Alternatively, I'm only buying Macs as well, but for my gaming rigs, it's straight to Arch. Steam and Proton work perfectly; if you don't sell your games on Steam, or in some way I can run them on Linux, I am not buying or playing them.
So much this. People like to moan about "oh, game XYZ doesn't run, so it's not reasonable for gaming". More games run on GNU/Linux than on any gaming console. There are simply too many games that do run to give a second thought about the ones that don't, and it's been that way for years.
People tend to generalize, but what they probably mean is "it's not reasonable for gaming for the games I play".
I haven't fully switched over yet because the combo of the hardware I have + the games I play regularly still gives me issues vs. Windows. Getting them to run isn't the problem, but I haven't been able to solve miscellaneous crashes, lag, lower frame rates, etc.
My next PC upgrade will probably be getting rid of my Nvidia 1660 super and getting something AMD for less headaches.
This. The corollary is also that people take such quips way too literally.
I, personally, don't play that many games, and those that I do play tend to run faster on Linux (with an AMD GPU, which I bought specifically to avoid nvidia headaches).
But I still game on Windows. Why? Because I still have a Windows box, "because Linux is not reasonable for photo editing". I actually daily drive Linux, but I can't be assed to move from Lightroom and Photoshop, so I still keep a Windows PC under my desk. I just play games on it because it's much beefier than my 5-year-old Ryzen U laptop, and since I don't interact with that box all that much, I didn't feel like partitioning my smallish drive for no tangible benefit. My laptop is more than enough for all my other needs.
Can you elaborate on this?
For example, it was convoluted getting StarCraft 2 to run. Then it did eventually work, though it felt ever so slightly laggy.
Anno 1800 ran though it occasionally slowed way down, occasionally crashed, and multiplayer never worked.
Hogwarts Legacy ran but crashed, and ran massively slower / at lower quality settings than on the same hardware in Windows.
All of those were not binary "runs / doesn't".
I haven't booted Windows in months, but there are definitely some caveats for gamers.
I've been meaning to set up Bazzite on an older desktop.
And specifically, the state of multiplayer games with anticheat is tracked here (a much less favorable % of working games):
https://areweanticheatyet.com/
I personally wouldn't install any kernel anticheat on a computer that I intend to use for anything important, so I would personally refuse to install the incompatible games even if I were using Windows.
11/12 of the top selling new releases run (the exception is Battlefield 6, because its anticheat blocks Linux)
9/12 of the top sellers (COD, BF6 and Apex block Linux)
11/12 of the most played (Apex blocks Linux)
So if you’re into competitive ranked games (especially FPS), you might face problems due to anticheat blocks, but practically everything else works.
That said, I haven't tried getting the same kit working on windows so I can't say if it's any better.
Most people had never even heard of Linux. It has taken a lot of very bad things on Windows for it to get to this point. It’s classic frog in a slowly heating up pot territory.
My experience is that people fear Linux rather than not knowing about it. I've been the lone Linux user since c. 2005, and people see that half my screen is always a console, the other half a browser. So they fear Linux is for console wizards, not for regular users. Nothing will convince them otherwise, even when they spend 100% of their time using online webapps. I have some coworkers using browser + VS Code + WSL2 all the time, but they don't switch because they fear having to configure everything from the console instead of a Control Panel.
Simplicity, among other reasons. Installers force the user's hand and need maintenance. Having no installer but rather a detailed installation guide offers unlimited freedom to users. Installation isn't difficult either: you just pacstrap a root filesystem and configure the bootloader, mounts and locale.
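For the curious, the core of a manual Arch install really is only a handful of commands. A sketch, assuming UEFI with systemd-boot and the target partitions already mounted at /mnt; the timezone and hostname are placeholder examples:

```shell
# from the live ISO, with the target partitions already mounted at /mnt
pacstrap -K /mnt base linux linux-firmware   # -K initializes a fresh pacman keyring
genfstab -U /mnt >> /mnt/etc/fstab           # generate fstab from current mounts
arch-chroot /mnt                             # enter the new system
# inside the chroot: timezone, hostname, root password, bootloader
ln -sf /usr/share/zoneinfo/Europe/Berlin /etc/localtime
echo archbox > /etc/hostname
passwd
bootctl install                              # systemd-boot, on a UEFI system
```

Everything else (users, networking, desktop) gets configured after first boot, which is exactly the freedom the guide preserves.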
ArchLinux does now have an installer called archinstall, but it's described more as a library than a tool. It allows you to automate the installation using profiles.
* A user configured through systemd-homed with luks encryption
* The limine bootloader
* snapper from openSUSE with pacman hooks
* systemd-networkd and systemd-resolved
* sway with my custom Ruby-based bar
If you follow the installation guide, it will happily tell you to consider all of these networking/bootloader/encryption options. But trying to create an installer that supports all these bleeding-edge features is futile.
Nowadays there are so many ways to partition the drive (LVM, LUKS, either one on top of the other; ZFS with native encryption or through dm-crypt), having the EFI boot a unified kernel image directly, or fiddling with some bootloader (among a plethora of options)...
One of the principal reasons I love Arch is being able to have a say in some of these base matters, and I would hate to have to fight an installer to attain my goals. I remember when Ubuntu supported root on ZFS but the installer didn't; it was rather involved to get the install going. All it takes with Arch is a few minutes reading the wiki and you're off to the races. The actual installation part is trivial.
But then again, if you have no idea what you want to do, staring at the freshly-booted install disk prompt can be daunting. Bonus points for it requiring internet for installation. I would have to look up the correct incantation to get the wifi connected on a newer PC with no wired ethernet, and I've been using the thing for a very long time.
Exactly, Arch allows you to do many bleeding edge things. An installer would never keep up or give you that freedom.
> I remember when Ubuntu supported root on ZFS but the installer didn't; it was rather involved to get the install going.
That's why many installers allow you to drop a shell when it's time to partition.
> I would have to look up the correct incantation to get the wifi connected on a newer PC
To be honest, that would largely be helped if archiso started using NetworkManager.
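For reference, the incantation on the current archiso isn't too bad once you know it ships iwd. A sketch; `wlan0` and the SSID are placeholders for whatever your hardware reports:

```shell
iwctl device list                      # find the wireless interface name
iwctl station wlan0 scan
iwctl station wlan0 get-networks
iwctl station wlan0 connect "MySSID"   # prompts for the passphrase
ping -c 3 archlinux.org                # confirm you're online
```

But the point stands: none of this is discoverable when you're staring at a bare prompt with no second device to look it up on.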
I love my Arch installs to death, but I feel like I'm the odd one out regarding the mess that is the AUR. The main repositories have a lot of things, but I always end up getting pushed to the AUR, and then it just feels like I bolted on a hack rather than pacman/the Arch base supporting the AUR as just another normal package source.
They largely have now: archinstall.
It's still text based/TUI but it's pretty simple and intuitive, anyone already familiar with installing a Linux distro (especially any sort of -server variant) will be comfortable with the archinstall script.
I actually tried Fedora first (thinking dev-first workflows) but ended up switching to Ubuntu with X11 for gaming. A lot of that had to do with Fedora's faster release schedule (vs. Ubuntu's 2-year LTS) breaking working GOG/Steam/Wine-based apps on a rotating basis. Since switching to a defaults lifestyle, Ubuntu with X11, I deal with NVIDIA driver compatibility issues every 6 months or so instead of once a month. The 22 -> 24 upgrade went better than I expected, and I didn't lose more than a couple of hours of life appeasing the shell gods.
In any case Fedora and a once/month problem would still beat the Windows update nonsense, which I am still supporting since my spouse hasn't switched yet :/
I had bad experiences with Arch before because of Manjaro, but in hindsight I think the main issues I had were more to do with how insanely nuanced pacman can get. When you update packages you have to know what you're doing, or it will update things in odd ways; it's not like Debian or Ubuntu upgrades, which install/uninstall what you do and don't need without you having to be that nuanced.
My homeserver is Ubuntu, my gaming PC is Arch.
I get that maybe that was the final straw or something, but come on, “I switched to Linux because I didn’t want to take an hour to set up Windows” really sounds like you never really wanted Windows in the first place, you were just looking for an excuse.
1) very stable due to rolling-release producing small changes
2) the skill barrier to getting a full system is “basic literacy, to read the wiki”
Eventually I switched to Ubuntu for some reason, it has given me more headaches than Arch.
Having very frequent updates to bleeding-edge software versions, often requiring manual intervention, is not "stable". An Arch upgrade may, without warning, replace your config files and update software to versions incompatible with the previous ones.
That's fine if you're continuously maintaining the system, maybe even fun. But it's not stable. Other distributions are perfectly capable of updating themselves without ever requiring human intervention.
> 2) the skill barrier to getting a full system is “basic literacy, to read the wiki”
As well as requiring you to be comfortable with the Linux command line and to have plenty of time. My mom has basic literacy; she can't install ArchLinux.
ArchLinux is great, but it's not a beginner-friendly operating system in the same way that Fedora/Linux Mint/openSUSE/Pop!_OS/Ubuntu/elementary OS are.
Can you elaborate on the chain of thought here? Small changes at high frequency mean that something is nearly constantly in a changed state, quite the opposite of stable. Rolling release typically means that updates are not really snapshotted, so unless one pulls updates constantly, they risk pulling a set of incompatible updates. Again, quite different from stable.
if GenZ knew how to read they would be very disappointed right now
in the age of tablets and tiktok, basic literacy is quite a big ask
* UI framework balkanization has always been, and remains, a hideous mess. And now you don't just have different versions of GTK vs Qt to keep track of, but also X vs Wayland, and their various compatibility layers.
* Support for non-standard DPI monitors sucks, mostly because of the previous point. Wayland has fractional scaling as a sort-of workaround if you can tolerate the entire screen being blurry. Every other major OS can deal with this.
* Anything to do with configuring webcams feels like you're suddenly thrown back 20 years into the past. It'll probably work fine out of the box, but if it doesn't. Hoo boy.
* Audio filtering is a pain to set up.
At least things look more or less the same over time. With commercial offerings one day you open your laptop and suddenly everything looks different and all the functions are in a different submenu because some designer thought it was cool or some manager needed a raise.
> It'll probably work fine out of the box, but if it doesn't. Hoo boy.
LLMs are actually very useful for Linux configuration problems. They might even be the reason so many users made the switch recently.
I'd take balkanization over "we force-migrate everyone to the hot new thing where nothing works".
> It'll probably work fine out of the box, but if it doesn't.
Drivers are a pain point and will probably stay so until the market share is too large for the hardware vendors to ignore. Which probably isn't happening any time soon, sadly.
I think Windows is the only other one which really does this properly, macOS also does the hack where they simulate fractional scales by rendering with an integer scale at a non-native resolution then scaling it down.
Windows is the only one that does this properly.
Windows handles high pixel density on a per-application, per-display basis. This is the most fine-grained. It's pretty easy to opt in on reasonably modern frameworks, too; just add the necessary key in the application manifest and you're done [1].
Linux + Xorg has a global pixel density scale factor. KDE/Qt handles this OK; GNOME and GTK break when the scaling factor is not an integer multiple of 96.
Linux and Wayland have per-display scaling factors, but Chromium breaks, and GTK breaks the same way as the Xorg setup.
On macOS, if the pixel density of the target display is at least some Apple-blessed number that they consider 'Retina', then the 'Retina' resolutions are enabled. At resolutions that are not integer multiples of the physical resolution, the framebuffer is four times the resolution of the displayed values (twice in each dimension), and then the final result is raster-scaled with some sinc/Lanczos algorithm back down to the physical resolution. This shows up as ringing artifacts, which are very obvious with high-contrast, thin regions like text.
On non-retina resolutions, there is zero concept of 'scaling factor' whatsoever; you can choose another resolution, but it will be raster-scaled (usually up) with some bi/trilinear filtering, and the entire screen is blurry. The last time Windows had such brute-force rendering was in Windows XP, 25 years ago.
[1]: https://learn.microsoft.com/en-gb/windows/win32/hidpi/settin...
I'd blame Linux for only a very small percentage of the problem here. This is on NVIDIA ensuring their hardware doesn't last too long, forcing you to eventually throw it away. Open source drivers can make the monitor 'work' but really aren't efficient, and really can never be, because NVIDIA doesn't release the needed information and directly competes with them via its proprietary driver.
- Kind of ties to the previous point: KDE on Wayland does this extremely well.
- You're thrown back 20 years because the problems are exactly the ones from 20 years ago: vendors refusing to support Linux with drivers.
- Audio filtering? Interesting. I know people who use PipeWire + JACK quite reasonably. But maybe you have a use case I'm not aware of? I'd be happy to hear some.
This has happened to me a couple of times. I put the PC to sleep and the next morning I discover it has decided to close everything to install an update.
Not using Windows ever again to do any work. Say what you will about Apple but at least they don't do crap like this.
You want proprietary programs? Alright, fine, one can argue for that. But the central, core operating system of general purpose computers should be free and fully controllable by the users that own them!
We've been hearing this for decades and yet the home Linux userbase is microscopic and somehow even smaller than ever. Unless we're going to count Google's Android and Chrome OS. Those are the only Linux-based distributions that have ever gained market share over desktop Windows.
On the other hand, I am also a realist and I don't think that Linux will take over the desktop, but it will certainly have its biggest growth year ever in 2026.
It's slightly larger in the US at 5.28%: https://gs.statcounter.com/os-market-share/desktop/united-st...
In India, where I live, it's surprisingly at 6.51%: https://gs.statcounter.com/os-market-share/desktop/india
Take this with a grain of salt, because numbers from Statcounter are not fully accurate. However, none of those numbers are small. 3.86% of the entire PC market is not something to scoff at.
https://gs.statcounter.com/os-market-share/desktop/worldwide
Not insignificant at all.
Must admit, I'm not sure if the data torrents are up to date now that Reddit anti-scrapes so hard to raise the premium on its exclusive contract with the highest bidder, OpenAI.
It is true, they could not do this themselves and sometimes my mom can test my patience, but this is the way if you can do it. (Hint: get a remote desktop with shared view working first :).
Really, the stronghold for Windows is their office suite (other family members require Word/Excel for work), enterprise domain integration (work-to-home PC familiarity), and, to a weaker extent, gaming. Gaming is why I still keep an install of Windows on my PC.
Linux should consider paying Microsoft and Apple for new customers. Perhaps the customer acquisition funnel is quite long, at least it took 20 years of using Apple in my case before switching to Debian (Xfce), but it was worth it!
I hear this from a lot of people when they get their first Mac. When they get specific about what their issues are, it tends to be that macOS doesn't do a thing how they are used to doing it, which is more of a learning curve issue, or rigid thinking. Apple software can be quite opinionated, those who fight against those opinions tend to have a hard time. This is true of any opinionated software.
Linux/Windows (historically) were straightforward: each tool did exactly what it said it would do, and it was up to you to learn how to use the tools available.
On Linux/Windows, if a button was "capture image", it would just capture the image on the screen. On a Mac, a "capture image" button could do anything from displaying the image on the screen, to saving it in a photos folder, to saving and syncing it to an iCloud account. Whatever the Apple PM decided the most common use case was, and god help you if you want to do something different.
If you've been in the mac ecosystem for a while, you've grown used to this and don't notice any longer. You may even occasionally express happiness when a function does something unexpected and helpful!
If you're coming from anywhere else, it's unbelievably painful.
Not sure about other people, but in my case I spend 99% of the time using software made by 3rd parties so my exposure to the OS is very limited.
The latest OS is making life miserable though, compared to all the previous releases.
Six years ago everything was stable and solid, but Apple's board of directors seems to have decided that new Mac users can't handle a computer interface anymore and started merging it with mobile OS interfaces. And the result is absolutely terrible.
Who or what is the "Linux" entity in this context?
So I can either get a top tier tool when I upgrade this year or I can buy a subpar device, and the power management is going to likely be even worse on Linux.
Most of my personal development these days is done on my home server - 9995wx, 768GB, rtx 6000 pro blackwell GPU in headless mode. My work development happens in a cloud workstation with 64 cores and 128GB of ram but builds are distributed and I can dial up the box size on demand for heavier development.
I use laptops practically entirely as network client devices. Browser, terminal window, perhaps a VS Code based IDE with a remote connection to my code. Tailscale on my personal laptop to work anywhere.
I'm not limited by local compute, my devices are lightweight, cheap(ish) and replaceable, not an investment.
Apple is disabling downgrading across all of iOS, and starting to do the same with macOS. So you need to keep old hardware to run older macOS versions, and it's only a matter of a few years before Tahoe is the oldest OS you can run on your Mac.
I must have taken some shrooms before I downgraded from Tahoe to Sequoia a few hours ago then.
What you did is a downgrade within the range of still-supported OS versions.
However, if you decide to downgrade to Catalina on an M1 Mac, it's not possible — Big Sur is the earliest version that runs on Apple Silicon.
Anyway, you cannot downgrade to a macOS version older than what your Mac originally came with. So if you buy a Mac now, Tahoe will be the minimum option.
Uhh, I guess.
AFAIK iOS has been very locked down wrt rolling back upgrades since forever and isn't super relevant to this thread. Happy to be corrected.
I'm not even sure what macOS has of its own, since I basically open either the browser or the terminal. I'm vaguely aware that Finder exists when I accidentally open it, maybe twice a month.
"Also, is it weird that I still remember the specs of my first computer, 22 years later?"
My first computer was a TRS-80 Model 1, 1.78 MHz Z80 with 16 KB RAM.
That was 48 years ago. Is it weird that I remember that?
First own "PC": Atari ST 1040e, 1MB, with supercharger to run DOS and a 30MB hard disk the size of a regular PC. Donation from a family member.
Same for my first computer I built myself out of a TigerDirect order. Made a few mistakes there (K6 generation.)
Having these computers was such a change in our lives that they should be somewhat indelible memories.
32MB RAM <-- no way. 4 and 8MB were the standard (8MB being grand); you could find 16MB on some Pentiums. So a 40MB drive and 32MB RAM is an exceptionally unlikely combo.
32MB became the norm around the Pentium MMX and K6(-2).
A few months after taking possession, I upgraded the disk to a luxurious 400MB.
As for a new computer and price - it was like $1000 to get AMD 486DX2-80 with 4MB RAM in '95...
Then a 286 with 1MB.
Then a 386 with 4MB, then a 486DX with 8MB, then a P166 with 64MB (that was awesome), then a P4 with 1GB (hot and the first to burn the Motherboard), then an i3 with 4GB, gradually upgraded to 16GB.
And so on and so forth...
First modern PC (dos/win3.1) I had a 12mhz 286, 1 meg of ram, AT keyboard, 40MB hard drive. This progressed via a 486/sx33/4m/170mb and at one point a pentium2 600 with (eventually) 96mb of ram, 2g hard drive, then a p3 of some sort, but after that it's just "whatever".
In all honesty, it was easy for me to switch to Linux because I was always more interested in the computer itself rather than what useful things I could do with it, so I actually never missed a particular application. I also was more interested in making a game run in Wine with maximum effort rather than actually playing it (I did play countless hours of World of Warcraft though...)
The installer also completely broke the Windows partition that came with the workstation even though I was planning on dual booting, but oh well, no great loss there.
Other than that, there are some small conveniences and apps that I miss from macOS (the Mac calendar and mail apps are just so nice!), but the Niri window manager is just so amazing that at this point I don't think there's anything that could make me switch back.
> Linux is the preferred platform for development
Honestly I'm surprised he was using a non-Unix system this long. I guess it kind of proves his point that switching costs can seem huge.
This does have downsides, and the author lists many. It also has some marginal upsides. For example, running multiple distros for testing is trivial. And while the Windows File Explorer might be a shitshow that peaked over two decades ago, it somehow still seems leagues ahead of the options in Linux GUI land. And of course, the situation in gaming and content creation used to be way worse just a couple of years ago, so for many, switching only became viable relatively recently.
Huh... use Dolphin?
There's literally nothing special about Linux when it comes to development. And there are quite a few downsides especially when it comes to some specialized tooling because many vendors often only have Windows tools for their devices.
Funny enough, the bluetooth stack works better on a bare metal Linux box than a Windows one. Audio starts being played sooner.
For anyone hosting a product on servers (almost everything web related)... there IS something special about linux: It's where your product is going to run in production.
For folks who are doing work in other spaces, especially development that involves vendor provided physical devices: Then yes, I agree with you. Vendor support is almost always better for Windows, and sometimes entirely non-existent otherwise. I'll note this is starting to change, but it's not yet over the hump.
The only place I'd consider macOS as a "perfectly fine" linux alternative is mobile (and mainly because Apple forces it with borderline abusive policy/terms). Otherwise it's just a shittier version of linux on nice hardware, riddled with incompatible tooling, forced emulation problems, and a host of other issues. It's not really even "prettier" anymore.
Linux for desktop is a joke, always has been, since at least Slackware 7.1 running on my 486.
(a) gives us back 2000/XP/7/11 options for UI,
(b) gives us a desktop-first experience when we have keyboard/mouse plugged in,
(c) stops turning every OS feature into an ad, and makes it utilitarian again,
(d) and focuses 100% on making a stable OS and high quality dev/office apps?
It would be so nice if they just forked a commit from ~2005 and started from there.
(Maybe Copilot will mess up & erase commits so they have to? One can only dream.)
For me the biggest thing is control: with Windows there are some things, like updates, that you have zero control over. It's the same issue with macOS; you have more control than on Windows, but you're still at the whims of Apple's design choices every year when they decide to release a new OS update.
Linux, for all its issues, gives you absolute control over your system, and as a developer I've found this one feature outweighs pretty much all the issues and negatives of the OS. Updates don't run unless I tell them to run; the OS doesn't upgrade unless I tell it to. Even when it comes to bugs, at least you have the power to fix them instead of waiting on an update and hoping it will resolve the issue. Granted, in reality I wait for updates to fix various small issues, but for bigger ones that impact my workflow I will go through the trouble of fixing them myself.
I don't see regular users adopting Linux anytime soon but I'm quickly seeing adoption pickup among the more technical community. Previously only a subset of technical folks actually ran Linux because Windows/MacOS just worked but I see more and more of them jumping ship with how awful Windows and MacOS have become.
It would help a lot if there were a distro that was polished and complete enough that most people – even those of us who are more technical and are more demanding – rarely if ever have any need to dive under the hood. Then the control becomes purely an asset.
> I don't see regular users adopting Linux anytime soon...
I can see why you think the second statement is true based on the first. When Ubuntu switched their desktop to Gnome, they gave up on being the best Linux desktop distro. I'd recommend you try Linux Mint.
It annoyed me so much that I switched to mint.
I should really do more to evangelize. It's not ok to use an OS monopoly to degrade and squeeze your users' often primary career and creative tool to your own short term ends, making their lives worse and worse. And it's such a delight to get out from under.
Not sure the situation for normies currently, but for power users, definitely dual boot and give it a try.
MS in general has an idea of consent worthy of an average rapist.
Yes / Remind me later is basically the norm in their dark UI patterns; it pestered me for months about adding the Copilot button to Teams.
I don't think I ever had to reinstall Windows 2000 but here we are.
After the last "update" the setting for turning windows "game optimization" on and off doesn't work anymore and made factorio unplayable (it MUST be off, otherwise it optimized lag and stuttering and it automatically turns on after every larger update). Since games was the only reason I still had a pc with windows it was time to move. For funzies it tried installing some updates on the last shutdown (it got wiped afterwards).
The only PC I now have with Windows on it is an early-00s PC with 98SE on it.
I think the most recent 'production' Windows issue I've had was OneDrive failing to recognize it was syncing my data even though it was syncing. The status symbols for the files and folders weren't showing up. But that's about it.
My gaming desktop is stable, my PC is rock solid, I run VMs on it (game servers, dev/test environments), and overall just absolutely 0 problems with Windows or my OS at all.
I do, however, have hardware issues semi often. One of my monitors doesn't turn off its backlight, for example. I've had Razer devices just flat out quit on me over the years (multiple Razer mice, at least a couple of Nagas, etc.).
I contend that most people would do better with Windows if they just didn't mess with it (don't run any of those tools proclaiming to "debloat" your OS), and read the hardware compatibility list of your system REALLY carefully. Incompatible RAM can cause significant problems, a lot of which is completely avoidable if you just read the RAM QVL.
The only thing that I wish vendors would do more is work closer with Microsoft to provide BIOS updates over Windows Update. But, most of these motherboard IHVs are absolutely terrible about doing BIOS updates anyway and require specific mechanisms to keep going correctly. This is in contrast to the Enterprise/Business devices released by HP or Dell which have a usually solid BIOS update track. And again, the only issue I've ever had there was incompatible RAM.
Only to see this article today. lol
I guess at this point, whatever works for you and your situation is what you should use; ignore all the static. I use Linux for the majority of my dev work, but I'm unable to move off Adobe products for the photo and video processing work I do. That's an area where I've found Linux doesn't compete very well with MS and Apple. I would love to finally get off one or the other, but I have one foot in each because they both excel in different areas.
I still have yet to hear any non-technical person I know encounter issues on Windows and seriously consider switching away. The learned helplessness instilled by Microsoft is very difficult to get people to shake off.
This tends to be better overall anyway, if you are really looking to switch. Dual booting is enough of a hassle that I've always ended up staying in whatever OS I felt required me to think I needed to dual boot, and the other aspirational OS gets forgotten.
Going all-in requires that you figure out new workflows, find new software, or in some cases change what you use the computer for and accept it.
I tried building a gaming PC, but I hated PC gaming. It felt like it was half sys admin work, half gaming... if the sys admin work went well that day. I dual booted it for a while, then ran straight Linux on it, and eventually sold it. I liked the idea of one box that did everything, but the reality of it wasn't so great. I now have computers I don't care about gaming on, and have consoles that require 0 effort and let me play games when I feel like playing games.
The most reliable way I've run dual-boot systems is to put each OS on its own separate drive, and then choose which one to boot from the UEFI boot menu instead of choosing in GRUB off a single drive.
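For anyone who wants to do this without mashing a firmware hotkey at boot, the same separate-drives setup can be driven from inside Linux with `efibootmgr` (assuming an EFI install with the tool available); a minimal sketch, where entry number 0003 is hypothetical:

```shell
# List firmware boot entries; with one OS per drive, each drive
# shows up as its own entry. Guarded so it degrades gracefully
# on systems without EFI variables.
command -v efibootmgr >/dev/null 2>&1 && efibootmgr || echo "efibootmgr unavailable here"

# To boot a specific entry just once on the next reboot (the entry
# number 0003 is made up -- read the real one from the listing above):
#   sudo efibootmgr --bootnext 0003
#   sudo reboot
```

The nice part of `--bootnext` is that it only applies to the next boot, so your default OS stays the default and a bad experiment can't leave you stuck.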
As for games, plug them into protondb (https://www.protondb.com) to see compatibility & read through the comments
I know that some things are not as nice on Linux (e.g. you need to use MS365 in a browser, and MS365 files from a NAS in OnlyOffice are not great, etc.). But other than that, I just love living in Gnome. What more do you need than a clean desktop with some tiling, some virtual desktops, a clock, a battery indicator, and windows with your stuff? I don't even know. I like that I can set up Linux in 10 minutes.
I recently set up a Windows 11 machine for a neighbor, and it took so long! It offered dozens of things I didn't want, to the point that I started feeling a bit nervous in front of my neighbor (no, you don't need that; no, not that; no, that's just tracking; no, why would you want your desktop in the cloud?). Then, when I finished... it wasn't finished: I needed printer drivers, an HP package with drivers and BIOS utilities, etc. So much time.
I saw the amount of ads they were getting on their laptops, and OneDrive was even advertising to them on Samsung Android phones.
Esoteric errors are now resolvable with a simple query. Often with just a few cut and paste commands.
This smooths the rough edges to the point that Linux is now a reasonable option for a larger cohort of users for whom it previously wasn't feasible.
2. Linux is already to the point of giving you about as many esoteric errors as Windows or macOS will.
People don't switch either because they are comfortable where they are and don't want to put forth the effort of changing their OS, or they are afraid of outdated criticism of Linux Desktops being error-filled nightmares.
It's happening right now. Maybe you're so opposed to the concept that you hate to imagine it, but it's the reality.
> or they are afraid of outdated criticism of Linux Desktops being error-filled nightmares
Your picture of what installing Linux is like is out of date, because even just over the last 12 months things have changed a lot.
2025 has had some of the biggest Linux hype in recent times:
- Windows 10 went EOL and triggered a wave of people moving to Linux to escape Windows 11
- DHH's adventures in Linux inspired a lot of people (including some popular coding streamers/YouTubers) to try Linux
- Pewdiepie made multiple videos about switching to Linux and selfhosting
- Bazzite reported serving 1 PB of downloads in one month
- Zorin reported 1M downloads of ZorinOS 18 in one month and crossed the 2M threshold in under 3 months
- I personally recall seeing a number of articles from various media outlets of writers trying Linux and being pretty impressed with how good it was
- And don't forget Valve announced the Steam Machine and Steam Frame, which will both run Linux and have a ton of hype around them
In fact, I think that we will look back in 5 or 10 years and point at 2025 as the turning point for Linux on the desktop.
"And worst of all, you're like a pit bull that has lock-jawed onto OpenAI's ballsack, and you're not letting go, no matter how much we tell you to."
jk, I wanted to install Ableton and now it's been 15 years.
Ease in gently, with Ubuntu or Fedora. Get familiar. Then go crazy.
Pacman is _amazing_. Apt broke dependencies for me every few months & a major version Ubuntu upgrade was always a reformat. Plus, obviously, the Arch wiki is something else. I would go as far as to say you'll have an overall better Linux experience on Arch than Ubuntu and friends, even as a beginner.
It's definitely the superior OS for modern development and general system admin, WSL/Docker always felt like an uncanny valley kludge.
Oof. It's frustrating to see the community starting (again) to gain purchase in the public mindshare at exactly the moment when it's least prepared to accept new users.
The Linux desktop right now is a wreck. EVERYONE has their own distro, EVERYONE has their own opinions and customizations, and so everyone is being pulled in like 72 different directions when they show up with search terms for "How do I install Linux?"
For a while, 15-ish years ago, the answer was "Just Install Ubuntu". And that was great! No one was shocked. Those of us with nerd proclivities and strong opinions knew how to install what we wanted instead. But everyone else just pulled from Canonical, a reasonably big and reasonably funded organization with the bandwidth to handle that kind of support.
Now? CachyOS. Yikes.
The moment your Commodore 64 made you old.
Yes, I am aware there are alternatives that others think are as good or better. No, I have not personally found that to be true.
Jovian is a NixOS module that sets up a SteamOS-like experience on top of your existing NixOS config. I was able to build & tweak the config before even building my PC. It booted first try and has since been working without hiccups. Now I am setting up emulators, which is relatively straightforward with nixpkgs :)
"The only real limitation is that some games with anti-cheat like Valorant, Call of Duty or League of Legends won't run. But honestly I think not being able to launch League of Legends is actually a feature - one final reason to install Linux."
Fair point though :P
Nothing wrong with staying on Windows if compatibility is an issue, though.
The disjointed WebView UI mixed with old WinForms for navigating simple things is infuriating on its own. I've had a problem where the WebView wouldn't render any of the display settings, so my machine was stuck at a certain resolution and scale.
Simple things like accessing environment variables are now atrocious, hidden in the most obscure, unintuitive way. That's to say nothing of the crashing. Linux desktop environments have come such a long way that it's really a wonder anyone would put up with Windows anymore.
But then again, Microslop doesn't seem to care much about the consumer market anymore anyway.
I'm definitely most productive building software with .NET, which is kind of why I feel locked in. And although it's cross-platform now, Visual Studio is definitely the one for me; it just feels like a good pair of jeans. I've tried Rider/VS Code for .NET development but never really got along with them for .NET stuff.
Maybe I should just learn python/django, grow a beard and install nix or something. And get into espressos.
Microsoft had a chance to make an even better OS than XP and 7 and convince millions of users to stick with Windows.
Okay, maybe with Office products the ocean was already red, but still: instead of disgusting its millions of users, they could have made them happy.
I am not a firm believer that GNU/Linux distributions are a drop-in replacement for Windows. One can work around compatibility issues, but for non–tech-savvy people, it's just not feasible.
I switched to macOS when Windows 10 was released and never looked back. Of course I did miss some apps, but using laggy Windows was much more painful.
I used to prefer Windows for work. After the absolutely abysmal performance of a Surface Book, never again. I've never had to deal with such slow performance in my life; I literally could not get work done. Staff with Windows have constant problems: updates take forever, reboots aren't fast, programs crash, and (not OS-related, but) the new Outlook is universally despised.
I've never seen a company shoot itself in the foot as badly as I've seen Microsoft do of late. More and more staff want MacBooks, and are even okay with using a remote session (ugh) to access the one app that relies on Windows.
"I'll switch when Linux supports X."
Linux still doesn't support X.
"Okay, but how about my X?"
Linux still doesn't support X.
"Well, X is still missing..."
Trados Studio: good luck finding an equivalent. I tried, and the alternatives are horrendous, and I'm not going to run it in a VM.
Also, I at least tried a live Mint distro from a USB drive on my son's old computer; everything worked fine (unlike Zorin, which I think had a problem with sound), but when I tried to install it, of course it didn't detect Windows. Same with my wife's laptop.
So I have 3 computers:
- my son's old laptop, where I could install Mint: Linux Mint doesn't detect Windows
- my wife's old laptop, where I could install Mint: Linux Mint doesn't detect Windows
- my daily driver, where my work software requires Windows and there is no point installing Mint: Linux Mint detects Windows
I'll have another look at it during the CNY holidays to see if I can install it alongside Windows (I need Windows there in case something happens to my daily-driver laptop).
I also plan to switch my father's old desktop to Linux Mint, but after the experiences with my son's and wife's laptops, I can already guess what the Windows detection status will be there too. It works where it's not needed, and it doesn't work where I could actually install it.
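For what it's worth, the "doesn't detect Windows" symptom can be checked from the live session before installing. A hedged sketch: `os-prober` is the detection tool GRUB relies on, and in my understanding the most common cause of a missed Windows install is Windows "Fast Startup" leaving the NTFS partition marked in-use:

```shell
# From the Mint live session: see which other OSes are detectable.
# If this prints nothing, try disabling "Fast Startup" in the Windows
# power options, do a full shutdown (not a restart), and retry.
# Guarded so it degrades gracefully where the tool isn't present.
command -v os-prober >/dev/null 2>&1 && sudo os-prober || echo "os-prober not available here"

# After installing, if GRUB still doesn't list Windows, note that
# newer GRUB versions disable os-prober by default. Re-enable it and
# regenerate the boot menu:
#   echo 'GRUB_DISABLE_OS_PROBER=false' | sudo tee -a /etc/default/grub
#   sudo update-grub
```

No guarantee this covers your laptops' exact situation, but it narrows down whether the installer can't see the partition at all or is just declining to add it to the menu.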
Dev in Linux is so much nicer for me than dev in Windows.
Is this an ideological question? They are still primarily closed source.
Is this an install difficulty question? If you can read you can install them.
Is this a performance question? If you're a normie they're good. If you're demanding the top fps at the top resolution in dx12 games then there is still a noticeable difference but it should be fixed this year.
Right now, and for the foreseeable near term (3 years or so?), it seems like GPU advancement isn't focused on gaming, so there will be a period of stability. But I wonder what happens if/when the focus comes back to gaming: when there's a new round of consoles, or when a company wants a new feature set to distinguish a new generation (like the GeForce 20 series versus the 10 series and earlier), what can be done to make sure Linux users aren't second-class citizens? I'd also wonder about development tools. To use the most popular engine as an example, what could change in Unreal Engine to make sure it builds software that plays nicely with the Linux ecosystem, even if the tooling works best under Windows?
That's a decent enough reason for a Linux user to buy an AMD GPU, but it isn't a good reason not to switch to Linux from a closed-source OS. I'm in the process of switching to Linux full time (it shouldn't really take that long, but I haven't had a solid chunk of time in a bit), and I'm using an Nvidia GPU, so I went from closed-source Windows drivers to closed-source Linux drivers.
You're the top comment that addresses this so I'm putting this here but not exactly replying to you.
If they're bad because they are proprietary, it is what it is. If they're bad because their dx12 performance is worse on linux than windows, supposedly the fixes for the vulkan descriptor boogeyman problem are just around the corner.
Depends a ton on the hardware. Newer hardware has been playing well with the kernel, but it's still not fully OSS.
You'll still have less trouble overall with AMD, though.
More seriously, editing is either a lost art or click bait headlines are more important than ever. The title is very immature.