Some additional examples beyond the OP:
- In the latest macOS, trying to set a custom solid color background just gives you a blinding white screen (see: https://discussions.apple.com/thread/256029958?sortBy=rank).
- GNOME removed all UI controls for setting solid color backgrounds, but still technically supports it if you manually set a bunch of config keys — which seem to randomly change between versions (see: https://www.tc3.dev/posts/2021-09-04-gnome-3-solid-color-bac...).
The pattern here seems pretty clear: a half-baked feature kept alive for niche users, rather than either properly supporting or cleanly deprecating it. Personally, I’d love to simply set an RGB value without needing to generate a custom image. But given the state of things, I’d rather have one solid, well-maintained wallpaper system than flaky background color logic that’s barely hanging on.
It also shows you which screen you're setting it for, and a toggle to set it for all screens at once.
KDE used to be the "bloated" desktop way back when (I know, pretty silly and laughable now given the current state of things).
That cemented GNOME/MATE into a lot of major distros as the primary desktop environment, Ubuntu being the most famous.
The Qt licensing situation is also a bit of a bizarre quagmire. And there are certainly people who don't like KDE for more ideological reasons.
Personally, none of this bothers me, and it's what I use on my personal computer. KDE is just about exactly how I'm used to interacting with computers, having grown up through the Win95 era. It is so close to the Windows experience you want to have.
I'm not trying to be mean here, I'm just fascinated by what people will consider to be a waste of time.
Something that should be a default option, or a single-tap switch in settings, turned into a chore consisting of a period of agonising disbelief, doubt, denial, search, and eventually bitter acceptance.
However, I've noticed there's not much point in changing it. Showing the desktop is a waste of screen real estate because of generations of abuse of desktop shortcuts. Even if you are careful, it becomes a cluttered wasteland (on Windows, anyway). I just learned to never use the desktop for anything and to always have windows up on various monitors.
The situation is better these days, with windows store apps. Still, I developed the habit of just never using the desktop in the XP days when things were really bad.
There was a war over your eyeballs, with shady software vendors fighting over desktop space, start menu space, taskbar space, even fucking file associations. I recall that for a little while RealPlayer and Windows Media Player would yank file associations back and forth each time they ran, even if you tried to make them stop.
They've also disabled auto-save if you don't have the documents backed up by OneDrive, which is the most egregious for me.
There's the peak GNOME experience.
Quite a capable machine for my uses.
Not supported in Windows 11. Maybe with some additional config? Can’t be bothered with something that might turn out to be fragile and need more maintenance than I’m willing to give. That’s a young man’s game.
Ok, I’m about due to give Linux another tickle anyways.
Hmm, which distro… can always give a few a spin.
Keep it simple, Pop!_OS.
Installed fast, no issues, runs fine, stable. Seems entirely usable.
Customisations? Nah, keep it simple.
I’ll set a black background though.
Nope.
Switching to upstream (Ubuntu) with KDE would probably be more your speed.
win11 ltsc works perfectly on it. With a solid background :D
Based on: Arch. Init: systemd. https://cachyos.org
Based on: Debian. Init: Non-systemd. https://www.devuan.org
Based on: Arch. Init: systemd. https://garudalinux.org
Based on: Independent. Init: Non-systemd. https://www.gentoo.org
Based on: Red Hat Fedora. Init: systemd. https://nobaraproject.org
Teams not loading due to security issues, but notifications coming through with the full content of messages. The ability to type a handful of words into the cloud version of Word (or paste in full documents) before the security check catches up and requires me to set a sensitivity label. Etc.
It mostly tells me that MS does very bad software architecture for web apps, though apparently the desktop apps are not immune to it either.
Some customers will push back and have enough leverage to get an exception, but the default answer will be that this can't be disabled. You'll have some sales engineer challenged about the product behavior as part of an RFP, and they'll try to convince you that nothing is leaked, all while knowing the financial opportunity with these customers would be much larger if there were more concern for the customer.
(Teams makes this Byzantine in the extreme to accomplish, as you have to go find the folder it drops all shared files into before you can manage access settings. But it does allow you to retroactively change access, even for things shared in Teams.)
If so, pasting that link into Slack may reveal its first page.
1. All teams will henceforth expose their data and functionality through service interfaces.
2. Teams must communicate with each other through these interfaces.
3. There will be no other form of interprocess communication allowed: no direct linking, no direct reads of another team’s data store, no shared-memory model, no back-doors whatsoever. The only communication allowed is via service interface calls over the network.
4. It doesn’t matter what technology they use. HTTP, Corba, Pubsub, custom protocols — doesn’t matter.
5. All service interfaces, without exception, must be designed from the ground up to be externalizable. That is to say, the team must plan and design to be able to expose the interface to developers in the outside world. No exceptions.
6. Anyone who doesn’t do this will be fired.
7. Thank you; have a nice day!
Number 7 is a joke, etc.
Those photos may have already been uploaded to Google's web servers (from my understanding, this happens with Google Photos by default?), from which a preview has been generated. The permission is at the Android app level and is requested at some point to ensure the permission model is respected from the user's point of view. I can imagine the permission request being out of sync!
However, when you're inside a note (which, BTW, can also be converted into checkboxes, i.e. very simple TODOs), Google Keep, the note-taking app from search giant Google, has no search functionality within that specific note.
Besides the many small bugs, sometimes the missing functionality in Google apps is mind boggling.
https://support.microsoft.com/en-us/office/find-and-replace-...
Makes me think it must have something to do with their corporate culture and how they work, since their developers, to my knowledge, never had a reputation for being weak. Maybe it's just that they have such a gigantic user base for everything they do that 80% is the best business value. Though I doubt that a bit; as a third-party developer, it makes me avoid their products where I can.
Take this article alone: that's a change that should never have been made. It's an understandable bug, but it's indicative of carelessness, both managerial (how did this hold up in code review?) and in QA (why is no one asking why login splashes take 30 seconds?).
I was thinking about something similar recently. 80% of features take 20% of the time. For the hobby stuff I program, I want to make the best use of my time, so I skip the last 20% of features to make more room for stuff that matters.
I call it PDD: Pareto-Driven Development. Looks like you think Microsoft is similar.
Only when they "release" it. And then they start again from 10%. /s
I'm certain that some idiotic change just like the ones suggested in the article destroyed this perfectly working feature, and nobody is bothered to fix it because it would impact the latest harebrained scheme to make my 10 year laptop do AI in its sleep and suggest helpful ads about the things it couldn't help overhear while "sleeping".
Most of my computers and friends computers have been ASUS though, maybe that is a connection.
(Windows user since 3.11 but I don't think those had sleep modes :-)
That said, if I remove the joystick from the picture, my ASUS-based AM4 system sleeps just fine.
It can't just be the hardware, I think.
Back then Windows would default to a crap version of sleep but you could still disable it in the BIOS and by tweaking a couple of other settings, thus forcing it to use proper sleep. I’m pretty sure I wrote a lengthy response on HN about this including the specifics at the time.
That worked well until I got a new Dell laptop that removed support for the good sleep mode entirely.
So then I’d make sure to always leave the machine plugged in and switched on overnight before any travel… which is how I discovered that the machine had a bug where sometimes it would fail to charge whilst plugged in and switched on but not awake, so occasionally I’d still end up with a dead laptop on the train.
So then I’d start up the machine as soon as I got out of bed so it’d get at least 30 - 45 minutes of charging with not much load on it whilst I was getting ready to leave.
I absolutely hate Dell.
For my own use I’ve been buying Apple laptops since 2011 and, although they went through a grim period meaning I kept a machine from 2015 to 2024, I never had this sort of nonsense with them.
More recently, long after I stopped using Windows but still many years ago, I was reading an article about Arthur Whitney. It had a photo which seemed to be taken at home, maybe in a furnished garage, and in the background was a desktop computer running Windows. The only window open was a cmd.exe. I am not suggesting anything. It is just something I always remember.
Perusing some recent Microsoft documentation I noticed this:
https://learn.microsoft.com/en-us/windows/configuration/shel...
Now you have to guess whether the software has really loaded or not before you start using it.
I could understand it if your device needed special access (VPN to prod etc), but you usually can't do that either from the dev machines - and need to first connect to a virtual machine (via browser or rdp) to be able to do that...
and it has migrated to web apps today - where doing something causes the UI to show a loading/progress wheel, but it takes forever in actuality (or on start up of the webpage, you get a blank screen with placeholder bars/blurred color images etc).
And this is the so-called responsive design...
I’m not sure if this was meant to be a pun, but “responsive design” has nothing to do with how quickly a UI loads. It’s about adapting to different screen sizes.
Ding! Ding! Ding! We got a winner!
Yeah, maybe we could expect machines which got 40 years of Moore's law to give you an experience at least as snappy as what you got on DOS apps.
It's honestly very sad.
People underestimate how slow the network is, and put a network between the app and its logic, making the app itself a thin HTTP client and the "application" a mess of networked servers in the cloud.
The network is your enemy, but people treat it like reading and writing to disk because it happens to be faster at their desk when they test.
Loading apps on it definitely did not take one second. The prevalence of splash screens was a testament to that; practically every app had one, whereas today they're rare. Even browsers had multi-second splash screens back then. Microsoft was frequently suspected of cheating because their apps started so fast you could only see the splash for a second or two, and nobody could work out how they did it. In reality they had written a custom linker that minimized the number of disk seeks required, and everything was disk-seek constrained, so that made a huge difference.
Delphi apps were easier to create than if you used Visual C++/MFC, but compared to modern tooling it wasn't that good. I say that as someone who grew up with Delphi. Things have got better. In particular, they got a lot more reliable. Software back then crashed all the time.
I know it's very simple, I know there isn't a lot of media (and definitely no tracking or ads), but it shows what could be possible on the internet. It's just that nobody cares.
[1] Yes, Hacker News is also quite good in terms of loading speed.
Loading resources asynchronously and in parallel is the right thing to do, but you shouldn't reveal the interface piecemeal. Even in web browsers.
I'd much rather wait for an interface to be reliable than have it interactive immediately while having to guess about its state.
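A minimal sketch of what I mean, in TypeScript, with hypothetical endpoints and a hypothetical renderer: kick everything off in parallel, but only render once the whole state has arrived.

    // Hypothetical endpoints and renderer; the point is the shape, not the API.
    declare function renderDashboard(state: {
      user: unknown;
      notifications: unknown;
      settings: unknown;
    }): void;

    async function loadDashboard(): Promise<void> {
      // All three requests start immediately and load in parallel.
      const [user, notifications, settings] = await Promise.all([
        fetch("/api/user").then((r) => r.json()),
        fetch("/api/notifications").then((r) => r.json()),
        fetch("/api/settings").then((r) => r.json()),
      ]);
      // The interface appears in one step, never piecemeal.
      renderDashboard({ user, notifications, settings });
    }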
We all like to think we have picked up habits that immunize us from certain kinds of error but large software systems are complex and bugs happen.
The number of people in here taking ‘Raymond Chen tells an anecdote about the time a dumb bug shipped in Windows and was fixed two weeks later’ as an indictment of Microsoft’s engineering culture is frankly embarrassing. Trading war stories is how we all get better.
It would be better for us all if culturally, the reaction to a developer telling a story of how they once shipped a stupid bug were for everyone to reply with examples of worse stuff they’ve done themselves, not to smugly nod and say ‘ah yes, I am too smart to make such a mistake’.
I didn't say I'm immune to doing this myself, nor did I condemn anything about the particular scenario in the blog. My pain is in trying to articulate why some ways are better when any code that works is in some sense just fine.
>> We all like to think we have picked up habits that immunize us from certain kinds of error but large software systems are complex and bugs happen.
We sure do, although "immunize" is too strong. We try to minimize the likelihood of these kinds of things. Experience is valuable, and sometimes it's hard to articulate why.
It still feels more like craftsmanship than actual engineering. A lot of the time it’s more like how a carpenter learns to use certain tools in certain ways because it’s safer or less prone to error, than how an engineer knows how constructing a particular truss ensures particular loads are distributed in particular ways.
And one of the best tools we have for developing these skills is to learn from the mistakes others have made.
So I agree - I think your instinct here was to look at this error and try to think whether you have engineering heuristics already that would make you unlikely to fall into this error, or do you need to adjust your approach to avoid making the same mistake.
My criticism here was more directed to others in the thread who seem to see this more as an opportunity to say ‘yeah, Windows was always buggy’ rather than to see it as an example of a way things can fail that they need to beware of.
I've learned to use default configurations pretty much everywhere. It's far too much of a hassle to maintain customizations, so it's easiest to just not care. The exception is my ~50 lines of VS Code settings, synced to a mysterious file somewhere I've never seen, presumably on GitHub's servers, but not anywhere I can see?
Just your regular reminder that nix is good actually.
"I have a bug, you can get a full VM that reproduces it with 'nixos-rebuild build-vm --flake "github:user/repo#test-vm" && ./result/bin/run-*-vm'"
And the code producing that VM isn't just a binary blob that's a security nightmare, it's plain nix expressions anyone can read (basically json with functions).
And of course applying it to a new machine is a single command too.
(Would it be pedantic of me to say that I receive my fair share of bug reports on nix code I maintain, and when someone sends me their entire nixosConfig the very first thing I do is punt it back with a "can you please create a minimal reproducible configuration"? :D but your point stands. I think. I like to think.)
Is it? The vast majority of the time, I change settings/set things up the way I want, and then... leave them for literally years. Hell, I can directly restore a backup I have of Sublime Text from years ago and my customizations will work.
Somewhere along the way I lost interest in customizing the OS. These days I routinely switch between MacOS, Windows and various Linux flavors on lots of computers. The only thing I may customize is I write my .vimrc from memory.
On my Android phones, I change the wallpaper and I disable animations. Otherwise, stock everything.
Now that I think about it, it can't be the time saved, surely I waste more time on HN. It likely correlates more with using computers for work as opposed to for fun and learning. Even the learning I do these days is rather stressful - if I can steal an hour or two on the weekend, I feel lucky, so spending time to customize the environment seems like a waste.
Maybe if life slows down, I'll find joy in customizing my OSes again.
On the note of programming not being fun anymore, that's exactly why I'm making my secret project that I hope to release very very soon, maybe in a week or so. I want to make programming fun again, in a similar way that pico8 did, but x100.
> I have the same history customizing everything! ... then giving up because life gets busy.
I think this might be why some people have such different experiences. I don't try to customize "everything" - just what needs to be. Like, yeah, I would expect it to be difficult to maintain random Explorer customizations. I would not expect it to be difficult to maintain customization for a popular IDE.
Too much software puts host-specific stuff in settings files (like absolute paths), or just isn't stable enough in general for it to be worth trying to maintain a portable configuration.
The hard part of maintaining a config is that there's no such thing as cost-free usage. It always takes a mental toll to change a config, to learn a new config, to remember which configs are in use and what they do, to back up configs, or at least to set up and maintain a config-auto-backup flow.
By far, the easiest mental model is just learning how everything works out of the box, and getting used to it. Then again, sometimes what people want is to do things the hard way for its own sake. That's probably part of why I kept going back to C over and over for so many years.
The oldest parts of my emacs config go back at least 30 years and I have had it in a git repo for ~15. I keep my entire .emacs.d versioned, including all third-party packages I depend on (so no fetching of anything from the cloud).
I've had to make at most minimal changes when upgrading to newer versions, and with a tiny amount of work the same config still works for Emacs from around version 21 to 31 (though features are of course missing in older versions).
He surely didn't use any Microsoft product. /s
Because they made it a runtime thing ("components just have to remember to do this"), the code structure itself affords this bug.
There was a similar bug at Facebook years ago where the user's notification count would say you had notifications, but when you clicked it, there weren't any. The count was updated by a different code path than the code that inserted notifications into the list, and they got out of sync. They changed the code so that both the notification count and the list were managed by the same part of the system, and all instances of the bug went away forever.
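Not Facebook's actual code, of course, but a minimal TypeScript sketch of that shape of fix: the list is the single source of truth, and the count is derived from it, so the two can never disagree.

    class NotificationStore {
      private items: string[] = [];

      // The single code path that adds a notification.
      add(message: string): void {
        this.items.push(message);
      }

      markAllRead(): void {
        this.items = [];
      }

      // Derived on demand, never stored separately, so it can't drift.
      get unreadCount(): number {
        return this.items.length;
      }
    }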
    if (request.authenticationData) {
        ok := validate(etc);
        if (!ok) {
            return authenticationFailure;
        }
    }
    // Note the bug: with no authenticationData at all, the block is
    // skipped entirely and execution continues as if authenticated.
Turns out the same meme spans decades:

    void doFoo(PermissionToDoFoo permission, ...) { ... }

and then, the only way to call it is through something like:

    from request import getAuth, respond
    // Maybe<AuthenticationData> getAuth(Request request)
    // void respond(String response)
    from permissions import askForPermissionToDoFoo
    // Maybe<PermissionToDoFoo> askForPermissionToDoFoo(AuthenticationData auth)

    response =
        try
            auth <- getAuth(request)
            permission <- askForPermissionToDoFoo(auth)
            doFoo(permission)
            "Success!"
        fail
            "Oopsie!"
    respond(response)
It becomes impossible to represent the invalid state of doing Foo without permission. (Whereas in the original snippet, if there is no authenticationData, the if (!ok) check never runs and the code continues execution as if it were authenticated.)
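A rough TypeScript rendering of the same idea, with hypothetical names: the constructor is private, so a PermissionToDoFoo can only come out of the check itself, and doFoo is unreachable on the unauthenticated path.

    class PermissionToDoFoo {
      private constructor() {}

      // The only way to obtain a token; missing credentials fail closed.
      static check(auth: string | undefined): PermissionToDoFoo | null {
        if (auth === undefined || !validate(auth)) return null;
        return new PermissionToDoFoo();
      }
    }

    function validate(auth: string): boolean {
      return auth.length > 0; // stand-in for a real credential check
    }

    function doFoo(_permission: PermissionToDoFoo): string {
      return "Success!"; // can't be reached without a real token
    }

    function handle(auth: string | undefined): string {
      const permission = PermissionToDoFoo.check(auth);
      return permission ? doFoo(permission) : "Oopsie!";
    }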
If I had a dollar for every minute of my life I spent troubleshooting random group policy quirks during my previous life as a sysadmin...
> Personally, I use a solid color background. It was the default in Windows 95,¹ and I’ve stuck with that bluish-green background color ever since.
My thoughts exactly, but for me it goes back to the Mac LCs we used in a school computer lab; the palette of colors you could have with 16-bit graphics was so vast compared to the 16-color PCs I was used to.
Plus, you always have so much stuff open that you're never going to see your wallpaper anyway. That's what screensavers are (were) for: rotating through a folder full of images.
A similar (slightly older) laptop I own boots from fully off to the KDE desktop in 25 seconds total including typing my password.
But on my Win10 machine it stopped working, I don't know why, so I wrote a script to download the Bing Image of the Day instead: https://blog.est.im/2025/stdout-03
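For reference, the fetch itself is only a few lines (a sketch, assuming Bing's commonly used but undocumented HPImageArchive endpoint, which could change at any time):

    // Ask Bing for today's image metadata and print the full image URL.
    async function bingImageOfTheDay(): Promise<string> {
      const res = await fetch(
        "https://www.bing.com/HPImageArchive.aspx?format=js&idx=0&n=1",
      );
      const data = await res.json();
      return "https://www.bing.com" + data.images[0].url;
    }

    bingImageOfTheDay().then(console.log);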
Would have been easier to stick with the pixel density we had.
Oh, and we have to wait a frame to see everything because of compositing that I still don't quite understand what it's supposed to do? Something something backing store?
https://randomascii.wordpress.com/2024/10/01/life-death-and-...
One pattern I've had success with is using handles that need to be returned. If you never grab a handle, you don't have to worry about returning it. It seems to work better than the fire-and-wait-for-side-effects approach.
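A sketch of the shape, with hypothetical names: if the handle only ever exists inside a callback, a finally block can guarantee it is returned on every path, and callers can't forget.

    type Handle = { id: number };

    function acquireHandle(): Handle {
      return { id: Date.now() }; // stand-in for real acquisition
    }

    function releaseHandle(_h: Handle): void {
      // stand-in for the real release
    }

    // Callers never touch acquire/release directly, so they can't leak one.
    function withHandle<T>(fn: (h: Handle) => T): T {
      const h = acquireHandle();
      try {
        return fn(h);
      } finally {
        releaseHandle(h); // runs even if fn throws
      }
    }

    const result = withHandle((h) => `used handle ${h.id}`);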
- Half of each boot was wasted on a Copilot 360 dialog. On every fucking boot, with no Copilot and no Office installed. Or rather, Copilot installed itself without notice and started to spam me.
- In several places the OS would show me "news" like death notices, war updates, and economy details. Definitely far from a productive environment, and honestly heavily triggering. I don't read news anywhere, but my PC is full of it, and there is no option to disable it? What about kids?
- I get updates or a broken system about every second time I boot the PC. I know it's because I just cut the power, but I hate that it asks three times whether I actually want to shut down (and then still breaks, or never actually shuts down).
- I constantly end up in a variety of login screens that want me to log in to a Microsoft account I don't have and don't want.
- There are soooo many ads. I've been on Linux for years; instead of traditional TV I almost always stream with an ad blocker. The country I live in isn't plastered with ads either. But this shithole of an operating system is. It literally pops up ad notifications above apps by default.
If anyone wonders, most problems were solved with "ShutUp10", others with ChatGPT and regedit. It was actually pretty hard when you have no idea about this OS and its dark patterns.
On my Linux machines I don't even change the wallpaper, but the Windows defaults are unbearable and outright productivity killers.
I think they're trying to emulate Apple, which has had stocks integration by default for years, including alongside the other pre-installed apps like SMS and Mail on the first iPhone. I imagine Apple did it to cement themselves as a high-class lifestyle brand, even though I'm sure there was never a time when most iPhone users were doing a lot of day trading.
I wonder what percentage of Windows users rely on the stock ticker in the start menu though...
Their other distributions are very good as well, especially for Windows XP because they bundle a lot of important drivers for old software to work correctly
In test and CI we had this timeout set to a very low number. In acceptance (manual testing, smoke testing), to a very high number.
This was useful because it showed three things:
- Network and service configuration bugs would immediately crash and thus fail a test: firewalls, wrong hosts, broken URIs, etc.
- Slow services would cause flaky tests. Almost always a sign that some service/configuration/component had performance problems or was itself misconfigured. The quick fix was to increase the timeout, but the proper (and often not that hard) fix was re-thinking the service, e.g. replacing it with a mock if we couldn't control it, or fixing its performance issues. Or re-thinking the use of the service, e.g. by pushing it to async or a job queue.
- Stakeholders going through the smoke tests and acceptance tests would inevitably report "it's really slow", surfacing the same issues as above but in a different context and with "real" services like some external PSP or SMTP.
It was really a very small change: just some wrappers around HTTP calls and other network calls, in which this config was used, plus a hard rule to never use "native" clients/libs in the code but always our abstraction. This then turned out to offer so much more than just the timeout: error reporting, debugging, decoupling.
It wasn't JavaScript (Ruby, some Python, some TypeScript), but in JS it would be as easy as `function fooFetch(resource, options) { return fetch(resource, options) }` from day one, then slowly extended and improved with said logging, reporting, defaults, etc.
I've since always introduced such "anti-corruption" layers (facades, proxies, ports/adapters) very early on, because once you have 'requests.get("http://example.com/foo/bar")' all throughout your Python code, there's no way to ever migrate away from "requests" if (when!) it gets deprecated, or to add said timeout throughout the code. It's really a tiny task on day one to add my own file/module that simply imports "requests" and then calls it, and to use that instead.
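A minimal TypeScript sketch of such a layer, assuming a hypothetical HTTP_TIMEOUT_MS environment variable: every network call goes through the one wrapper, so the timeout (and later the logging, retries, and mocking) lives in exactly one place.

    // Low in CI to surface slow services, high for acceptance testing.
    const TIMEOUT_MS = Number(process.env.HTTP_TIMEOUT_MS ?? 5000);

    export async function fooFetch(
      resource: string,
      options: RequestInit = {},
    ): Promise<Response> {
      return fetch(resource, {
        ...options,
        // The single place where the timeout is enforced.
        signal: AbortSignal.timeout(TIMEOUT_MS),
      });
    }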
Typical response is "Well it should just work anyway!". Which is theoretically true -- the worst kind of true.
Seeing how that complicated if-then logic is just too stiff a challenge to your average developer, we should probably just dispense with it.
Just yesterday, I ran into a bizarre bug on Windows where the mouse cursor would move every time I pressed the arrow keys—almost like I was controlling the mouse with the keyboard. It drove me nuts. I checked all the usual mouse and keyboard settings, but everything looked normal. At one point, I even wondered if my machine had been infected by a virus.
Desperate, I Googled "mouse pointer moving on arrow keys". The first result had only one answer, which blamed... Microsoft Paint. I was skeptical—Paint? Really? That couldn’t possibly be it. Still, with no other leads, I gave it a shot. As it turned out, I did have Paint open in another Desktop View, where I’d been cropping a screenshot. The moment I closed it, the problem vanished. Instantly.
I still can’t believe that was the cause—and I’m a little embarrassed to admit it, even though no one was around to see it.
_____________________
1. https://superuser.com/questions/1467313/mouse-pointer-moving...
Years ago, I had a bug so bizarre I nearly convinced myself the machine was haunted. My mouse pointer started drifting—not randomly, but only when I pressed the arrow keys. Up arrow? Cursor nudged north. Down arrow? There it went again. I was convinced some accessibility setting or keyboard remap had gone haywire. I rebooted. I checked drivers. I even briefly entertained the idea that my codebase was cursed.
Three hours in, I realized the true culprit: MSPaint. I had opened it earlier, and because the canvas was selected, the arrow keys were actually moving the selection box—which, by delightful Windows design, also moved the mouse cursor. I wasn’t losing my mind. Just... slowly drawing rectangles in the background every time I hit an arrow key.
I closed MSPaint, and poof—my “haunting” ended. I haven’t trusted that application since. Great for pixel art, less great for your sanity.
------------
You and ChatGPT sound identical.