> The change isn't about the core operating system becoming resource-hungry. Instead, it reflects the way people use computers today—multiple browser tabs, web apps, and multitasking workflows, all of which demand additional memory.
So it's more about third-party software than the OS or desktop environment. Actually, nowadays it's recommended to have 8+ GB of RAM regardless of OS.
I just checked the memory usage on Ubuntu 24.04 LTS after closing all the browser tabs. It's about 2GB of 16GB total RAM. 26.04 LTS might have higher RAM usage but it seems unlikely that it will get anywhere close to 6GB.
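If you want to repeat that check, note that naive readings of `free` overstate usage because of page cache; here is a small sketch using the kernel's own MemAvailable estimate (standard on any recent Linux):

```shell
# "Used" memory excluding reclaimable page cache: MemTotal - MemAvailable.
# MemAvailable (Linux >= 3.14) is the kernel's estimate of memory available
# for new workloads without swapping.
total_kb=$(awk '/^MemTotal:/ {print $2}' /proc/meminfo)
avail_kb=$(awk '/^MemAvailable:/ {print $2}' /proc/meminfo)
printf 'In use: %d MiB of %d MiB\n' $(( (total_kb - avail_kb) / 1024 )) $(( total_kb / 1024 ))
```

This tends to land close to what GNOME's system monitor reports, whereas `MemTotal - MemFree` counts cache and looks much scarier.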
https://www.microsoft.com/en-us/windows/windows-11-specifica...
4GB of RAM? What? I guess if your minimum is "able to start Windows and eventually reach the desktop", sure? I wouldn't even use Windows 11 with 8GB even though it would theoretically be okay.
Not okay as soon as you throw on the first security tool, lol.
I work in an enterprise environment with Win 11 where 16 GB is maxed out instantly as soon as you open the first browser tab, thanks to the background security scans and patch updates. This is even with compressed memory paging turned on.
Apparently 13.6GB is in use (out of 64GB), and of that 4.7GB is Chrome. Yeah, I'm glad I'm not running this on an 8GB machine!
Current minimum specs should be more like
2 cores, 2ghz min with SSE4.2
128GB SSD
8GB RAM
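A quick way to check a machine against numbers like those from a running Linux session (the sse4_2 flag only shows up on x86 CPUs, hence the fallback message):

```shell
# Cores and CPU flags come from /proc/cpuinfo; RAM from /proc/meminfo.
nproc                                             # logical CPU count
awk '/^MemTotal:/ {printf "%.1f GiB RAM\n", $2/1048576}' /proc/meminfo
grep -q -m1 'sse4_2' /proc/cpuinfo && echo 'SSE4.2: yes' || echo 'SSE4.2: not reported'
```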
I just tested this with 25.10 desktop, default gnome. With 24.04 LTS it doesn't even start up with 2GiB.
Apparently it's still in discussion but it's April now so seems unlikely.
Kind of weird how controversial it is considering DOS had QEMM386 way back in 1987.
Found a good visual explainer on this - https://vectree.io/c/linux-virtual-memory-swap-oom-killer-vs...
That's why SoftRAM gained infamy - they discovered during testing that swapping was so much faster than compression that the released version simply doubled the Windows swap file size and didn't actually compress RAM at all, despite their claims (and they ended up being sued into oblivion as a result...)
Over on the Mac, RAMDoubler really did do compression, but it a) ran like treacle on the 030, b) needed to do a bunch of kernel hacks, so it had compatibility issues with the sort of "clever" software that actually required the most RAM, and c) PowerMac users tended to have enough RAM anyway.
Disk compression programs were a bit more successful - DiskDoubler, Stacker, DoubleSpace et al. ISTR that Microsoft managed to infringe on Stacker's patents (or maybe even the copyright?) in MS-DOS 6.2, and had to hastily release DOS 6.22 with a re-written version, free of charge, as a result. Disk compression also benefited from a general reduction in HDD latency that was going on at roughly the same time.
There's rather a lot of information in a single uncompressed 1080p image. I can't help but wonder what it all gets used for.
For example, Java applications will claim much more memory than they need for the heap. Most of that memory will be unused, but it's necessary to have a faster running application. If you've ever run a Java app at consistently 90% heap usage, you know it grinds to an absolute halt with constant collection.
The same is true for caching techniques. Reading from storage is slow, so it often makes sense to put stuff in RAM even if you're not using it very often.
But I remember that in 2016, Fedora's GNOME consumed about 1.6GB of RAM on my PC with 2GB of RAM. Considering that a decade later the standard Ubuntu GNOME consumes only 400MB more, and also that my new laptop has 16GB of RAM (the system might use more RAM when more RAM is installed), I think the increase is not that bad for a decade. I thought it would be much worse.
What has changed? Why do I need 10x the RAM to open a handful of terminals and a text editor?
> What has changed? Why do I need 10x the RAM to open a handful of terminals and a text editor?
It’s not a factor of ten, but a 4K monitor has about four times as many pixels. Cached font bitmaps scale with that, photos take more memory, etc.
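A rough sanity check on the pixel claim, assuming 4 bytes per pixel (RGBA) per window buffer:

```shell
# Bytes for one uncompressed 32-bit frame at 1080p vs 4K.
for res in '1920 1080' '3840 2160'; do
  set -- $res
  bytes=$(( $1 * $2 * 4 ))
  printf '%dx%d -> %d MiB per buffer\n' "$1" "$2" $(( bytes / 1048576 ))
done
```

One 4K buffer is roughly four times a 1080p one (~31 MiB vs ~8 MiB), and compositors typically keep two or three buffers per window, so it adds up - though still not to gigabytes.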
> When Windows 2000 came out
In those times, when part of a window became uncovered, the OS would ask the application to redraw that part. Nowadays, the OS knows what’s there because it keeps the pixels around, so it can bitblit the pixels in.
Again, not a factor of ten, but it contributes.
The number of background processes likely also increased, and chances are you used to run fewer applications at the same time. Your handful of terminals may be a bit fuller now than it was back then.
Neither of those really explain why you need gigabytes of RAM nowadays, though, but they didn’t explain why Windows 2000 needed whatever it needed at its time, either.
The main real reason is “because we can afford to”.
Partly because increased RAM usage can sometimes improve execution speed / smoothness or security (caching, browser tab isolation).
Partly because developers have less pressure to optimize software performance, so they optimize other things, such as development time.
Here is an article about bloat: https://waspdev.com/articles/2025-11-04/some-software-bloat-...
But raw imagery is one of the few cases where you can legitimately require large amounts of RAM, because memory use scales with the square of the image dimensions. You only need that raw state in a limited number of situations where you are actually manipulating pixel data, though. If you are dealing with images without descending to the pixel level, there's pretty much no reason to keep them all floating around in that form. You generally don't have more than a hundred icons on screen, and once you start fetching data from the slowest RAM in your machine, you get pretty decent speed gains from decompressing on the fly rather than moving the uncompressed form around.
The culprit is browsers, mostly.
A minimum-memory figure, as in this change, sets a completely different expectation.
I suppose the most major change on RAM usage is electron and the bloated world of text editors and other simple apps written in electron.
For people wanting the old-fashioned fast and simple GUI experience, I recommend LXQt.
So basically, it's no use if you've never tried a 120+ Hz display. And don't, because once you do, you won't go back.
But for gaming, it really is hard to go back to 60.
Incredibly, Linux has better support for it on the desktop than Windows: Windows' DWM runs the display at full refresh rate, while sway supports VRR on the desktop. Windows will only enable it for games (and only games that support it). Disclaimer: a Wayland compositor is required.
It’s not enabled by default on e.g. sway because on some GPU and monitor combos, it can make the display flicker. But if you can, give it a try!
* The minimum variable refresh rate my display (LG C4) supports is 40 Hz.
But it has been a while since I've tried it, maybe I should look into it again
The state of Wayland compositors moves fast, so support may be there by now. The last thing I'm waiting for is sway 1.12, which will bring HDR.
On a CRT monitor the difference between running at 60 Hz and even a just slightly better 72 Hz was night and day: unbearable flickering vs a much better experience. I remember having some little utility for Windows that'd allow the refresh rate to be 75 (not 72 but 75). Under Linux I was writing modelines myself (those were the days!) to get the refresh rate and screen size (in pixels) I liked: I was running "weird" resolutions like 832x604 @ 75 Hz instead of 800x600 @ 60 Hz, just to gain a little more screen real estate and a better refresh rate.
Now that monitors are flat panels, I sure as heck have no idea if 60 Hz vs 120 Hz changes anything for "desktop" usage. I don't think the problem of the image fading too quickly at 60 Hz that CRTs had is still present. But I'm not sure about it.
Gnome 3 seems similar to Unity nowadays, and it is pretty good.
I find it much easier to use than Windows or Mac, which is credit to the engineers who work on it.
I guess one of the few smaller things would be wayland, but this has so few features that you have to wonder why it is even used.
Those are all development tools. Has the runtime overhead grown proportionally, and what accounts for the extra weight?
As a side-note, that's how GC languages can perform so well in benchmarks. If you run benchmarks that generate huge amounts of garbage or consistently run the heap at 90%+ usage, that's when you'll see that orders of magnitude slowdown.
Oh also containers, lots more containerized applications on modern Linux desktops.
If a Java application requires an order of magnitude more memory than a similar C++ application, it's probably only superficially similar, and not only "because GC".
In Java, that's two objects, and one will be collected later.
What this means is that a C++ application running at 90% memory usage is going to use about the same amount of work per allocation/destruction as it would at 10% usage. The same IS NOT true for GC languages.
At 90% usage, each allocation and deallocation will be much more work, and will trigger collections and compactions.
It is absolutely true that GC languages can perform allocations cheaper than manual languages. But, this is only true at low amounts of heap usage. The closer you get to 100% heap usage, the less true this becomes. At 90% heap usage, you're gonna be looking at an order of magnitude of slowdown. And that's when you get those crazy statistics like 50% of program runtime being in GC collections.
So, GC languages really run best with more memory. Which is why both C# and Java pre-allocate much more memory than they need.
And, keep in mind I'm only referring to the allocation itself, not the allocation strategy. GC languages also have very poor allocation strategies, particularly Java where everything is boxed and separate.
If there's no opt-out, that's a different story.
If Meta's business model is not lucrative, that is not my problem.
Given it's a field where you can put absolutely anything in (and probably randomize, if you want), how is this different than the situation today, where random sites ask you for your birthday (also unverified)? Moreover Meta already has your birthday. It's already mandated for account creation, so claims of "so they can earn money with users' preferences" don't make any sense.
https://www.theregister.com/2026/03/24/foss_age_verification...
Good luck when most libre users toss RH/Debian because of this and embrace GNU.
This is against HN guidelines: " Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith."
>The contents of the field will be protected from modification except by users with root privileges.
So... most users?
Most of the bloat these days is from containers, and Canonical's approach to Ubuntu since ~2014 has been very heavy on using upstream containers so they don't have to actually support their software ecosystem themselves. This has led to severe bloat and bad graphical theming and file system access.
Another one is memgaze, a program to visualize Linux process virtual memory spaces as RGB images and explore them using various binary visualization and sonification tools. I.e., you can click a Hilbert map of all processes, then in the new window click around inside the image of that particular process' virtual RAM and listen to it interpreted as an 8-bit WAV, or find and extract images, for example. Or search for strings, run digraph analysis, etc. http://superkuh.com/memgaze-page.html
Or feeed.pl, my very quick and low-resource-usage feed reader for 1000+ feeds, written in Perl/Gtk2, that is text only (no HTML, no images, etc). It is really handy for loading .opml files and finding and fixing broken feeds using the heuristics I hard-coded in to find feed URLs. http://superkuh.com/blog/2025-09-13-2.html
These are a few I made 2025-26 that other people might care to use. But I have a lot more that just scratch my own particular itches. Like a Perl/Gtk2 version of MS Paint that interprets arbitrary loaded and painted images as sound, or the things that I use to monitor my ISP uptime/speed, etc.
And to say the desktop experience was more polished than what we have now is laughable. I remember that you couldn't have more than one application playing sound at the same time. At one point you had to manually configure XFree86 to make it aware that your mouse had a middle button. And good luck getting anything vaguely awkward like WiFi or suspend-to-RAM working.
The Linux desktop is in a vastly better position now, even taking the Wayland mess into account.
First, it sounds like this 6GB requirement is more of a suggestion/recommendation than a hard requirement. I'm also curious whether it actually actively uses all 6GB. From my own usage of Linux over the years, the OS itself isn't using that much RAM; the applications are, and it's almost always the browser.
Secondly, I haven't used Ubuntu desktop in years, so I have no real idea if this is something specific to them; but I do use Fedora, and I would imagine the memory footprint can't be too different. While I could easily get away with <8GB of RAM, you really don't want to if you're going to be doing anything heavier than web browsing or editing documents: dev work, CAD, design, etc. But this isn't unique to Linux.
When they turned Centos into streams, I cut my workstation over to Ubuntu. It has been a reasonable replacement. Only real issues were when dual booting Win10 horked my grub and snap being unable to sort itself on occasion. When they release 26 as an LTS, I'm planning to update. You are spot on - the desktop itself is reasonably lean. 100+ tabs in Firefox... less so. Mind you, the amount of RAM in the workstations I'm using could buy a used car these days.
I believe Fedora and Ubuntu use about the same set of technologies: systemd, wayland, Gnome, etc. so it is about the same.
Apart from working out of the box I do not really know what those distros have and I don't. I just have to admit managing network interfaces is really easy in Gnome.
With the skyrocketing price of RAM this might finally be the year of the Linux desktop. But it is not gonna be Gnome I guess.
Can't move to Linux because it's an Intel Atom, and the Intel P-state driver for it is borked and was never fixed.
I remember this and had no idea that's why it would be doing that. Thanks, I learned something today.
That is the case with every mainstream JS engine out there and is one of the many tradeoffs of this kind.
This is garbage writing. Linux’s advantages are numerous and growing. Ubuntu ≠ Linux. WRT RAM requirements, Win 11’s 4GB requirement isn’t viable for daily use and won’t represent any practical machine configuration that has the requisite TPM 2 module. On the other side, the Linux ecosystem offers a wide variety of minimal distributions that can run on ancient hardware.
Maybe I’m just grouchy today but I would flag this content if sloppy MS PR was a valid reason.
Apps are still a huge gap on Linux, but as an OS, I choose it every time over Windows and MacOS.
"With Desktop" has 1GB minimum and 2GB recommended - along with Pentium 4, 1GHz cpu.
This seems like a recommendation for just getting to the desktop itself, plus maybe some light usage. Anything more than that, and the "recommendation" is fairly useless given the memory-hog apps that are commonly used.
I don't know how they have done it, but with the latest updates to Windows 11 (in the Canary Channel, optional 29000 series) it is VERY fast in Chrome, even on a PC with just 4GB of RAM.
So all those mocking and laughing, saying you might just get a window up, haven't a clue what they are talking about. It WAS terrible, but now it is much, much better. I don't know how they have done it, but they have. It might be due to Microsoft's aim to rewrite parts of the kernel in Rust, among other things.
> Linux's advantage is slowly shrinking
Ubuntu is not Linux. Also, I would love to see Windows actually running on 4GB. The article says that Ubuntu increased the requirements not because of the OS itself but to give a better user experience when people have many browser tabs open. Then it compares to Windows, which has lower nominal requirements but higher requirements in practice to get a passable user experience.
I'd say Windows 11's real minimal is 8 GB in 2026, with the recommended being 16 GB.
PS - And even at 8 GB, it hits 100% usage and pages under moderate load or e.g. Windows Update running in the background.
Why miss things that are still around? I dunno how close GNUstep is, but the original CDE is still here, open source and ported to most unix-likes.
You can install Debian and it gives you all that you are familiar with from Ubuntu.
> The change isn't about the core operating system becoming resource-hungry. Instead, it reflects the way people use computers today—multiple browser tabs, web apps, and multitasking workflows
Basically the change reflects the fact that, at this level of analysis (how much RAM do I need in my consumer PC), the OS is irrelevant these days. If you use a web browser then that will dominate your resource requirements and there's nothing Linux can do about that.
It doesn't matter how efficient your kernel or DE is if users expect to be able to load bloated websites in Chrome.
It reads slightly like LLM content, as if someone touched it up afterwards.
What I mean is, yes, WE know Win11 barely works with 4GB and WE know that 6gb is quite generous for a Linux machine, but they don't.
The general public isn't as informed as we think they are (which is proven by 75 million people last election).
I think we have quite different definitions of "minimum requirement", then.
If RAM is a problem, there are always alternatives. The impediment is always having to rethink your workflow or adopt someone else's opinions.
My desktop runs Arch with Sway (so quite close), three monitors, and uses ~400MB of RAM after boot. Most of that is framebuffers. All the rest is eaten by Firefox, rust-analyzer and qemu.
1.5TB in /var/log
All from the Firefox snap package complaining every millisecond about some trivial Snap permission.
I'm glad I chose an OS without goddamn Snap. It's been unadulterated pain every time I've ever interacted with it.
but I still would recommend 6 GiB.
no matter the OS
the problem here is more the programs you run on top of the OS (browser, electron apps, etc.)
realistically speaking you should budget at least 1GiB for your OS even if it's minimalist, and to avoid issues make it 2GiB for the OS + some emergency buffer, caches, load spikes etc.
and 2GiB for your browser :(
and 500MiB for misc apps (mail, music, etc.)
wait, we are already at 4.5 GiB and I still need OpenOffice ....
even if XFCE would save 500 MiB it IMHO wouldn't matter (for the recommendation)
and sure you can make it work: only have one tab open at a time, close the browser whenever you don't need it, don't use Spotify or YT, etc.
but that isn't what people expect, so give them a recommendation which will work with what they expect; if someone tries to run it with less RAM it may work, but if it doesn't, at least it isn't your fault
they compared Ubuntu's minimum recommended RAM to Windows' absolute minimum RAM requirements.
but Windows has monetary incentives (related to vendors) to say it supports 4GiB of RAM even if Windows runs very poorly on it; on the other hand, Ubuntu is incentivized to provide a more realistic minimum for convenient usage
I mean, taking a step back, all common modern browsers under common usage can easily use multiple GiB of memory, and that is outside the control of the OS vendor. (1)
As a consequence, IMHO, recommending anything below 6 GiB is just irresponsible (if a modern browser is used) _no matter what OS you use_.
---
(1): If there is no memory pressure (i.e. caches don't get evicted that fast, larger video buffers are used, no fast tab archiving etc.) then having YT playing will likely consume around ~600-800 MiB. (Be aware that this is not just JS memory usage but the whole usage across JS, images, video, HTML+CSS engine etc.) For comparison, web mail like Proton or Gmail is often roughly around 300MiB, Spotify interestingly "just" around 200MiB, and HN around 55MiB.
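Per-process numbers like those can be eyeballed with plain `ps`, with the caveat that RSS double-counts pages shared between browser processes, so treat it as an upper bound (PSS from /proc/<pid>/smaps_rollup is fairer but needs per-pid reads):

```shell
# Top resident-memory processes; ps reports RSS in KiB, so convert to MiB.
# NR>1 skips the "RSS COMMAND" header line.
ps -eo rss,comm --sort=-rss | awk 'NR>1 {printf "%6d MiB  %s\n", $1/1024, $2}' | head -n 10
```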
I couldn't understand why everything was that slow compared to Debian and didn't want to bother looking into it so...
After a few weeks: got rid of Ubuntu, installed Debian for her. A simple IceWM window manager (I use the tiling Awesome WM, but that's too radical for my wife) and she loves it.
She basically manages her two SMEs entirely from a browser: Chromium or Firefox (but a fork of Firefox would do too).
It has worked so well for years now that for her latest hire she asked me to set the new employee up with the same config. So she's now got one employee on a Debian machine with the IceWM WM. The other machines are still on Windows, but the plan is to keep only one Windows machine (just in case) and move the rest to Debian too.
Unattended upgrades, a trivial firewall ("everything OUT allowed, everything IN blocked except related/established"), and that's it.
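For the curious, that firewall policy might look roughly like this in nftables terms (a sketch; the table/chain names are my own choice, and it needs root):

```shell
# Default-drop inbound, but allow loopback and return traffic for
# connections we initiated; outbound is left at its accept default,
# so "everything OUT" is allowed.
nft add table inet filter
nft add chain inet filter input '{ type filter hook input priority 0; policy drop; }'
nft add rule inet filter input iif lo accept
nft add rule inet filter input ct state related,established accept
```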
I don't remember all of my frustrations, but I remember having a lot of trouble with snap. Specifically, it really annoyed me that the default install of firefox was the snap version instead of native. I want that to be an opt-in kind of thing. I found that flatpak just worked better anyway.
I almost tried making the switch to arch, but I've been pretty happy running debian sid (unstable) since. The debian installer is just more friendly to me for getting encrypted drives and partitions set up how I want.
It's not for everyone, but I like the structured rolling updates of sid and having access to the debian ecosystem too much to switch to something else at this point.
I use sway with a radeon card for my primary and have a secondary nvidia card for games and AI stuff.
It has its warts, but I love my debian+sway setup
Nothing in UI should take longer to draw than the human reaction time (~250ms). Most linux distros I tried pass this snappiness test with flying colors. Windows after Windows 7 don't.
Besides, Ubuntu is just 1 distro. There will always be alternatives on Linux for lower resource usage.
Maybe in some ways, yes. But there are distros out there that can easily run in as little as 1GB of RAM. And I've heard of people using far less.
I also remember hearing that Ubuntu moved to defaulting to Wayland; if true, I have to wonder whether that is part of the problem, because GNOME/KDE on Wayland will use far more memory than FVWM/Fluxbox on X11.
FWIW, you can do a lot just from the console without a GUI w/Linux and any BSD, in that case the RAM usage will be tiny compared to Windows and Apple.
It always makes me chuckle when I hear this. A default server (i.e. no GUI at all) installation of a RHEL derivative just outright dies silently with 1GB of RAM if there is no swap. Sure, with swap enabled it no longer dies, but to call the performance anywhere near acceptable is to lie to yourself.
RHEL and RHEL-likes, just like Ubuntu, are general-purpose distros, and their minimum system requirements are aimed at that use, not at exercises in RAM golfing.
2: Win11 is not usable with 4GB
3: Trisquel 12 Ecne exists. You might need Xanmod as a non-libre kernel because of hardware, but try blacklisting mei and mei_me first in some .conf file in /lib/modprobe.d. Value your privacy.
Trisquel MATE with zram-config and some small tweaks can work with 4GB of RAM, even with a browser with dozens of tabs open, at least with uBlock Origin.
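What zram-config sets up is roughly the following (a manual sketch; the zstd algorithm and 2G size are assumptions, and it needs root):

```shell
# Create a compressed swap device in RAM and prefer it over disk swap.
# Order matters: set the algorithm before the disksize.
modprobe zram
echo zstd > /sys/block/zram0/comp_algorithm
echo 2G   > /sys/block/zram0/disksize
mkswap /dev/zram0
swapon -p 100 /dev/zram0    # higher priority than any disk swap
```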
Windows, at its defaults, used so much memory that there was not much left for apps.
Ubuntu used 500MB less than Windows according to the system monitor. I think it was still 1GB or more. It also appeared to run more slowly than it used to on older hardware.
Lubuntu used hundreds of MB less than Ubuntu. It could still run the same apps but had fewer features in the UI (e.g. search). It ran lightning fast with more simultaneous apps.
(Note: That laptop's Wifi card wouldn't work with any Linux using any technique I tried. Sadly, I had to ditch it.)
I also had Lubuntu on a 10+ year old Thinkpad with an i7 (2nd gen). It's been my daily machine for a long time. The newer, USB installers wouldn't work with it. While I can't recall the specifics, I finally found a way to load an Ubuntu-like interface or Ubuntu itself through the Lubuntu tech. It's now much slower but still lighter than default Ubuntu or Windows.
(Note: Lubuntu was much lighter and faster on a refurbished Dell laptop I tested it on, too.)
God blessed me recently with a person who outright gave me an Acer Nitro with an RTX and Windows. My next step is to figure out the safest way to dual-boot Windows 11 and Linux for machine learning without destroying the existing filesystem or over-shrinking it.
https://community.acer.com/en/kb/articles/16556-how-to-upgra...
Looks like you've got space for 2 drives.
Those numbers mean nothing compared across OSes. Depending on how they count shared memory and how aggressively they cache, they can feel very different.
A realistic benchmark would be to open two large applications (e.g. Chrome + Firefox with YouTube and Facebook, to jack up the memory usage), switch between them, and see how the system responds when switching between tasks.
Unrelated to this, despite Ubuntu’s popularity, I think it’s one of the worst distro choices out there, especially for including old kernels for essentially no discernible reason.
I wouldn’t go so far as defending Microslop but I do get tired of the Apple fanboys accusing Windows of being bloated and running poorly.
They seem to defend Apple’s 8GB machines by saying that Apple systems perform better than Windows with the same amount of RAM. This claim is entirely unsubstantiated.
Windows has a lot of problems but performance and memory efficiency is not one of them. We should recall that Microsoft actually reduced RAM usage and minimum requirements between windows 7 and 8 as they wanted to get into the tablet game, and Windows has remained efficient with memory since then as Microsoft wants Windows to come with cheap Chromebook-like hardware and other similar low-end systems.
I have seen MacOS overcommit up to 50% of memory and still have the system be responsive.
Yesterday I accidentally filled up my RAM on Fedora, and even earlyoom took several minutes to trigger; in the meantime the system was essentially non-responsive.
It did eventually work, but it took a while. It also somehow did not kill the culprit runaway processes, but it did kill enough stuff for me to regain control of the system.
On my desktop I use linux-cachyos-bore-lto, which seems to give me a slight performance boost in compilation times compared to the regular kernel, but I've had at least one crash that I've been unable to attribute to any other specific issue, so it could be the kernel, I suppose. I wouldn't use it on a server nonetheless.
If you’re running applications as in a server that’s an entirely different discussion. I have been assuming we are talking about desktop users who are not serving anything.
E.g., if I go out and buy a 2026 Panther Lake laptop with a new WiFi 7 chip or what have you, I’m going to want a distro with the latest kernels so that I don’t have hardware issues. If I install the default Ubuntu download it’s going to almost certainly have problems.
So around ~3 years ago or so I bought a lightweight low-end laptop (Intel Core i3, 14 inch display, 8GB of RAM) for everyday stuff so I could easily bring it with me everywhere I need to go (I mean, everywhere I would need it). It came with Windows 11 pre-installed. Now, for you to understand, previously, like ~10 years ago or so I had a Windows 7 system and it was pretty neat. And I remembered when people were switching from Windows 7 to Windows 8 or 10, they blamed the new OS version just like right now the Windows 11 was blamed; yet everyone got used to it, it received some fixes, improvements, etc; so I thought "well, maybe Windows 11 is not so bad, I should try it out at least just for the sake of curiosity".
And now, the clean installation of Windows 11 that came with my laptop was taking ~20 seconds to fully boot up to the login window. I know my laptop is not the best of the best, but still... After startup, with no apps open, there was ~4 GB of RAM usage just out of nowhere; so effectively I was limited to ~4GB of RAM to run whatever I wanted. Bluetooth drivers were terrible (at the time) - sometimes I was able to connect to my headphones and sometimes I wasn't, while they worked with all of my other devices perfectly. Then there was also this hellish "Antimalware Service Executable" - and I know how it sounds, I have nothing against anti-virus software, but when it randomly shows up several times per day, eats all of your processing power (~80% of CPU usage, and note that I have 8 cores at ~3 GHz here), and heats up your laptop to the point that the fan starts screaming... that was not very good, to put it softly. Battery life was also a disappointment - sometimes it couldn't last for even 3 hours, while the heaviest thing I was doing during that time was compiling some software.
I was trying, I was re-configuring, I was applying patches... and finally I got fed up with all of this bloat, broken updates and other garbage. So I just backed up all of my important files and data to external drive and installed Linux Mint (because in this particular case I just needed working laptop). And wow, it just worked! Now at startup I get like ~1 GB RAM usage at most (this actually depends on the DE I use, so numbers could be different from time to time), battery life improved, no more weird Bluetooth issues, no more random bloatware... it just works, and that's it.
I know that distros like Mint are focused on stability and efficiency, so maybe the comparison is a bit unfair. But hell, even while I don't have anything against Windows 7 or Windows 8, the recent Windows 11 is a real combination of bloatware and spyware. So performance and memory efficiency is, actually, the problem here. Or at least it was a problem last time I tried it.
Now, again, I may be wrong somewhere, maybe I missed something out. If I did - please point it out.
My Framework laptop running CachyOS with KDE Plasma with nothing open except System Monitor reserves 4GB with 500MB in swap (I enabled swap for sleep to hibernate, normally there’s no swap).
Reserving RAM doesn’t mean there’s a performance problem.
Most of the things you’re talking about in your comment have nothing to do with RAM usage and memory efficiency. You’re complaining about some annoying preinstalled OEM software [1], bad drivers, fan noise, battery life, and windows updates. That stuff isn’t great but a lot of it doesn’t have anything to do with Windows RAM efficiency itself.
If you download the Windows ISO from Microsoft and clean install you’ll have a pretty nice experience. I think Microsoft needs to crack down on OEM software additions.
As far as slow boot up times/slow initial setup I’ll remind you that Macs also have that as an issue during first boot and spend a lot of time doing initial indexing.
Linux mint is a great distro and I also prefer Linux to both Mac and Windows as well. Mostly my commentary is on the subject of people claiming Microsoft Windows is bad with RAM when we now see some Linux distros asking for more RAM than Windows. I think it’s quite clear that RAM isn’t the problem with Windows, it’s a lot of other things and the surrounding ecosystem.
[1] I have to assume you’re talking about some third party antimalware program because the Microsoft one absolutely does not behave how you describe.
It does in my own experience (so it may not be a problem for you, I agree, but it is a problem for me). When the OS allocates ~50% of RAM for itself and isn't letting it go, other software simply can't use it. Therefore, you're limited: your potential performance is capped at a certain level just because your OS decided to allocate half or more of your system RAM. Why? Well, just because it wants to.
> have nothing to do with RAM usage and memory efficiency
Well, to be honest, most of them don't. But would you please explain, then, why it takes around 20 seconds just to boot up, while the aforementioned Linux Mint (currently 22.3 for me, the latest version; it was 22.1 at the time, as far as I remember) takes only ~3-4 seconds to reach the login screen and then another second (at most) to load everything after I log in? Could you also explain how even GNOME's Nautilus file manager uses less RAM and far less CPU than Microsoft's Explorer (and I won't even mention Thunar, that's kind of unfair)? What about the Start menu in Windows, which spikes CPU usage just from being opened and closed? There are a lot of performance issues, with both RAM and CPU usage.
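For the Linux side these per-process numbers are easy to pull yourself. A hedged sketch: resident memory comes from /proc/<pid>/status; here we read the current shell's own VmRSS as a demo (for a file manager, substitute its PID, e.g. from `pgrep nautilus` - the process name is illustrative).

```shell
# Per-process resident memory on Linux lives in /proc/<pid>/status;
# "self" refers to the process doing the reading (this shell).
grep VmRSS /proc/self/status
# Boot-time breakdown on a systemd distro (Mint, Ubuntu):
#   systemd-analyze time
#   systemd-analyze blame
```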
I'm not saying that these problems are unique to Windows, no; but saying that Windows doesn't have any performance issues is not really true.
> I think it’s quite clear that RAM isn’t the problem with Windows, it’s a lot of other things and the surrounding ecosystem.
I agree with you here. That's true. A large part of the problem comes not from the actual operating system but from the application software. I once thought that if the RAM shortages last longer than just one or two years, that will be bad, but also, maybe - just maybe - some software developers will start thinking at least a bit more about optimization...
Editing without specifying that you have edited your reply is not very good, you know. But okay.
Actually, I'm talking about the Windows-shipped Microsoft Defender process (at least it seems to come from Microsoft Defender). I didn't see anything third-party installed on my laptop at the time, and it behaved just as I described. I should also remind you that this is a low-end laptop with an Intel Core i3-N305 - not the most powerful CPU in the world: just 8 cores, 8 threads, and a 3.80 GHz max boost frequency.
If you think I'm lying, just search for "antimalware executable high CPU usage" in any search engine. You will find plenty of complaints and even some guides on how to deal with it.
I find on my Windows system it's only doing things when specific actions are happening.
Right now the antimalware executable process is using 196.4 MB of memory and 0% CPU for me as I type this.
When I download an executable from the Internet and run it, the CPU usage spikes to 8-10% briefly and the RAM usage goes up by 30MB or so.
I have a much higher-end CPU than that: 6 cores, 12 threads (AMD Ryzen 5600X3D).
In my experience the executable is pretty much doing nothing unless I'm opening up an exe that's trying to elevate privileges or if it's doing an active periodic scan.
Strangely, these were... specific moments, but not too specific. I mean, it wasn't running from startup to shutdown, but I also don't remember any specific triggers such as running untrusted applications or anything similar. Judging by the CPU usage, though, these most probably were full system scans. One thing remains unclear: why, despite me turning this feature off, was it still performing full scans quite frequently (I'd say way too frequently for normal operation)? I have no idea. I won't list everything I tried and checked here - it would be too long - but the most "suspicious" thing I downloaded at the time was the ImHex executable. And I don't think any anti-malware would spend ~80% of CPU processing power just to scan one relatively small app. I checked different things that could trigger it - but found nothing.
> Right now the antimalware executable process is using 196.4 MB of memory and 0% CPU for me as I type this.
Likely because it's idle as you type this. It wasn't using all of my CPU resources all the time; it's just that it could randomly spike up at some moment, run for ~1-2 minutes (sometimes more), drag the system into thermal throttling despite the active cooling (a bit of hyperbole here, but still, it really could heat the hardware up), and then disappear.
> if it's doing an active periodic scan.
As I mentioned, most likely that was the issue: frequent periodic full-disk scans.
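A quick way to quantify spikes like that (Linux shown here; on Windows, Task Manager or Resource Monitor gives the same view): sample the aggregate CPU counters in /proc/stat twice and compute the busy fraction over the interval.

```shell
# Sample total CPU jiffies twice, one second apart, from the first
# line of /proc/stat ("cpu user nice system idle ...").
read -r _ u1 n1 s1 i1 _ < /proc/stat
sleep 1
read -r _ u2 n2 s2 i2 _ < /proc/stat
busy=$(( (u2 + n2 + s2) - (u1 + n1 + s1) ))
idle=$(( i2 - i1 ))
echo "CPU busy: $(( 100 * busy / (busy + idle) ))%"
```

Logging that in a loop during a suspected scan window makes it easy to correlate spikes with the scanner's schedule.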
For me, the worst thing was not that it was happening at all. To be honest, the worst thing was that I didn't understand why it was happening or how I could fix it. I just felt humiliated by a system that I supposedly owned. I know I might have done something wrong or missed something, but... that still doesn't feel good, you know.
> I do get tired of the Apple fanboys accusing Windows of being bloated and running poorly.
> Windows has a lot of problems but performance and memory efficiency is not one of them.
I can't even describe how much your experience differs from mine. I would never have imagined someone uttering such a sentence about Windows in this day and age.
For everyone else reading this, two pieces of advice that have made me suffer less with Windows: replace Windows search with Everything (by Voidtools) and replace Explorer with File Pilot (filepilot.tech).
On an older machine, I switched to Tiny10.
Explorer works fine for me but File Pilot does look cool. I'll give it a try. (Good luck replacing Finder on Mac, is that even possible?)
I only use Windows for desktop and if I was clean installing I'd probably switch it to Linux. My laptop is Linux then I share a macOS system with my partner which I occasionally use for things that require Mac.
I wouldn't say I suffer at all with Windows. It's fine. It runs, it performs well, it's stable. I can't speak to other people who have different experiences. I usually assume they're using some kind of OEM abomination while I used the plain ISO downloaded from Microsoft, and I've already gone through the ~10 minutes of effort to turn off the annoying stuff.
I sold my personal Mac and switched to Linux on Framework 13" after Liquid Glass came out. It was almost as jarring and poorly executed as Windows 8. Well, okay, maybe that's going too far.
(The other problem with my MacBook was that its tiny amount of storage was growing difficult to work with; it's much easier to toss a 2TB SSD into a Framework and finally be done with worrying about storage.)
I knew they were fucking with my virtual memory because theirs sucks; the partition schemes on this Mac mini were ridiculous, and the helpers weren't stealing my information.
The same as any other corporate PR department: "At least now when people run it with N GB of RAM, we can just point to the system requirements and say 'This is what we support' rather than end up in a back-and-forth"
If you expect them to have any sort of long-term outlook on "Lets be careful with how developers view our organization", I think you're about a decade too late for Canonical.
At home I have a desktop running Arch plus Gnome with 32GB RAM and I am at 7GB on a normal day and below 16GB at all times unless I run an LLM.
Mainstream users and business organizations don’t really understand that concept and would prefer to see how the operating system enables their use cases and workflows.
RAM shortages will be quite temporary. Making predictions based on individual component shortages has never been a winning strategy in the history of the industry. Next you’ll tell me that graphics cards will be impossible to get because of blockchain.