Luckily for Apple, Windows 11 is not exactly in a position to attract switchers.
Let’s see if Apple can turn things around. iOS 8+ did improve on iOS 7’s worst bits.
Don’t try to interact with a Windows desktop while it is still booting up. Better to wait for everything to settle down; otherwise apps will constantly snatch away focus and your typing will go into random applications.
I work on a desktop Windows/Mac application that takes forever and a day to launch (CAD package), and pops up a million pop-ups during the process. I try to get minor admin tasks done while it is compiling/launching, but it steals focus every 10 seconds!
Still beats using Xcode, though.
By default, if you haven't typed anything for a little bit, Windows allows an application to steal focus. If you change the ForegroundLockTimeout value you can prevent windows from ever stealing focus, or change how long they have to wait before they're allowed to.
Mac OS used to be rock solid. We had machines at work that had uptime measured in years. My own machine would go months.
It doesn’t anymore. Restarted twice today.
Wikipedia claims that Android "has the largest installed base of any operating system in the world", if you're going to measure popularity that way.
(Of course it's hard to know how to define an OS. Is Android a kind of Linux? Are the various things called "Windows" or "MacOS" to be regarded as different versions of the same OS just because marketing people decided to use the same name? If not, how much similarity in code or design is required?)
Android is an app platform.
No.
Most common? Loathed? Used? Most tolerated?
It’s not liked, and ‘popular’ implies that.
A flashing cursor in an inactive text box. Possibly the most annoying of bugs.
Looking at you Windows, COMRAD and every login I ever do.
By default, if you haven't typed anything for a little bit, Windows allows an application to steal focus. If you change the ForegroundLockTimeout value you can prevent windows from ever stealing focus, or change how long they have to wait before they're allowed to.
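For anyone who wants to try it: the value being described is ForegroundLockTimeout, a REG_DWORD under HKEY_CURRENT_USER\Control Panel\Desktop, measured in milliseconds. A minimal Python sketch (the helper names and the clamping are my own; it only performs the write on Windows, and the change typically takes effect after you log off and back on):

```python
import sys

# HKEY_CURRENT_USER\Control Panel\Desktop\ForegroundLockTimeout (REG_DWORD)
# is the time in milliseconds that must pass after your last input before
# another application is allowed to take the foreground.
FOREGROUND_LOCK_TIMEOUT_NEVER = 0x30D40  # 200,000 ms: effectively "never steal"

def clamp_dword(ms: int) -> int:
    """Clamp a millisecond value into the REG_DWORD range."""
    return max(0, min(ms, 0xFFFFFFFF))

def set_foreground_lock_timeout(ms: int) -> None:
    """Write the value (Windows only)."""
    import winreg  # stdlib, but only importable on Windows
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"Control Panel\Desktop",
                        0, winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "ForegroundLockTimeout", 0,
                          winreg.REG_DWORD, clamp_dword(ms))

if sys.platform == "win32":
    set_foreground_lock_timeout(FOREGROUND_LOCK_TIMEOUT_NEVER)
```

Setting it to 0 gives the opposite behavior: any window may take focus immediately.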
Windows has a ton of little settings you can tweak like this if it's not working quite how you like it.
I personally tweak it the other way, to allow a window to pop up and still take focus sooner.
If you set it via PowerShell you can do it more dynamically, and if you're doing it via the API there's behavior in there to force a lock.
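The API route hinted at here is presumably user32!LockSetForegroundWindow. A hedged ctypes sketch (the wrapper name is mine; Windows only, and per the Win32 docs the call only succeeds when made by the current foreground process):

```python
import ctypes
import sys

# Flags for user32!LockSetForegroundWindow (winuser.h)
LSFW_LOCK = 1    # block other processes from forcing themselves to the foreground
LSFW_UNLOCK = 2  # allow it again

def lock_foreground(lock: bool = True) -> bool:
    """Toggle the foreground lock; returns True on success. Windows only."""
    user32 = ctypes.windll.user32  # only exists on Windows
    code = LSFW_LOCK if lock else LSFW_UNLOCK
    return bool(user32.LockSetForegroundWindow(code))

if sys.platform == "win32":
    lock_foreground(True)
```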
I experience the same with macOS. For example Discord steals focus.
When you buy powerful computers, this problem basically doesn't exist, on either Windows or macOS. Since Macs have historically been more expensive and premium, even the cheaper models were powerful enough to finish the boot sequence fast enough that the desktop would feel snappy almost instantly. Cheap PCs, on the other hand, struggle to accomplish every task in a timely manner.
I am amazed at how stupid and ignorant the average Mac fanboy is. I have been a Mac user first and foremost, but you guys are just full of shit.
If you want to report something to Apple you use the "Feedback Assistant App"
and watch years go by with no fixes or improvements to basic OS fundamentals.
They finally found a marketable name for /dev/null.
I wish these people would wake up and spend their time helping peers on a forum for some open source project instead.
They pretend to offer "solutions" so their posts don't come across as unconstructive, but their solutions are always essentially the same, often culminating in a factory reset. There is never any attempt to get to the bottom of anything or diagnose what the actual issue is.
They are volunteering their time to make people shut up, bow their head in shame and go away. I don't think this is what you want in an open source project.
Instead we get:
https://developer.apple.com/forums/thread/669252
>”… Now, the few Apple engineers that get back to me for some of these issues and the Apple support as well often tell me that Apple really cares about customer feedback. I really want to believe this ... but it's so hard to believe it, if less than 1% of my submitted reports (yes, less than 1%, and it's probably much less) ever gets a response. A response, if it ever comes, can come after 3 months, or after 1 year, or after 3 years; only rarely does it come within 1 month. To some of the feedbacks, after getting a response from the Apple engineers, I responded, among other things, by asking if I'm doing something wrong with the way I submit the feedback reports. Because if I do something wrong, then that could be the reason why only so few of them are considered by the Apple engineers. But I never got any answer to that. I told them that it's frustrating sending so much feedback without ever knowing if it's helpful or not, and never got an answer. …”
Why is this Apple’s path?
In my experience their _support_ is fantastic, which is another reason it’s odd they will simply leave countless _feedback_ submissions open nearly indefinitely. They ignore their free laborers!
It's not just on Apple's forums, Microsoft has the same kind of guys. They tend to look really popular too because all the other fanbois upvote their comments.
And not only there, many open-source software forums have the same problem.
At least you know it’s not working as a place to submit issue reports. That's better than the other way, like Figma, 1Password and many others: a support forum with an army of yes-men “support specialists”. They answer your query with basic troubleshooting and then say it will be passed to the development team, or will be considered, etc. A perfectly designed system to pacify users and dismiss their reports.
I’m no Harley owner but you and I both know the answer to that.
As a developer, it's easy to be blind to this because they're on "your side", but it's bad for the health of your support forums.
(They do defend them IRL, it's "commonly known" that HDs have issues that the install base "overlooks".)
lol. Can't tell you how many times I've clicked "I have this problem too" on a page where users can't help each other because apple is doing/hiding something stupid.
There's a handy Python script here to show log which application is stealing focus: https://superuser.com/a/874314
If you find it's SecurityAgent then you might be hitting this bug: https://developer.apple.com/forums/thread/807112
I suspect it's related to a JIT privilege management app my company uses.
But apparently Apple is not the only offender. Just as I was typing this (in Harmonic on Android), a popup popped up, ate a few of the characters I typed and disappeared again. No idea what it said. Why do people do this? Don't let applications I didn't ask for hijack my input.
Non-Mac OSs don't know how to turn this GPU on out of the box, so it just sits there without bothering anybody. But, for some reason, MacOS turns it on and it craps the bed, rendering the machine unusable.
Apple ended up replacing the mainboard in a free out-of-AppleCare repair. I never had the problem again and I used the machine regularly until about 2018.
Another option which sidesteps the Logi Options crap is Logitech "gaming" mice. These have an integrated memory that actually remembers the configuration set by the driver. So, you only have to put up with the shitty experience once, and then the mouse remembers those settings wherever you use it. Some models can actually remember multiple setting sets.
One of my best mice is a G700s. I haven't used the Logitech G crap in like... ten years? The mouse is still going strong. Its only issue is that it goes through batteries like a hot knife through butter. I like it so much, I actually bought a second one for work. Got it used, since they weren't making them anymore.
https://support.logi.com/hc/en-us/articles/37493733117847-Op...
Which would only happen if you are using it wireless.
It happened to me this morning.
That's not macOS's fault in this case; it's just that Logitech mice (the MX Master at least) don't act well at all without the driver. Like, for scrolling, it's as if the mouse is sending raw smooth-scrolling events every time you just touch the wheel, and without the driver (which presumably fakes it on the computer side) there is no synchronisation between your actual scroll and the steps you physically feel in the wheel.
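To picture what a driver presumably does here, a toy sketch: collapsing raw high-resolution deltas back into whole detents. WHEEL_DELTA is the classic 120-units-per-notch convention; the accumulator class itself is hypothetical:

```python
WHEEL_DELTA = 120  # classic "one detent" unit for wheel events

class ScrollAccumulator:
    """Collapse raw high-resolution scroll deltas into whole detents,
    roughly what a driver might do to re-sync smooth scrolling with
    the ratchet steps you physically feel in the wheel."""

    def __init__(self):
        self.residue = 0  # leftover delta smaller than one detent

    def feed(self, delta: int) -> int:
        """Add a raw delta; return how many whole detents to emit."""
        self.residue += delta
        detents = int(self.residue / WHEEL_DELTA)  # truncate toward zero
        self.residue -= detents * WHEEL_DELTA
        return detents
```

Without this kind of accumulation, every tiny touch of the wheel becomes its own scroll event, which matches the "raw smooth scrolling" feel described above.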
Whether or not external mice suck on macOS doesn’t really matter. The objective was to diagnose an issue.
I use a trackball for RSI reasons; getting across the screen in a single flick means high sensitivity, so mouse acceleration is absolutely needed to still be able to make small movements. This makes my scroll wheel useless, because a single scroll moves the page about 1/10 of a line.
My sister in law gave me her G700S to fix the main button microswitches, and she convinced me that it's the apotheosis of the design - it's what should have replaced the Performance MX. No soft-touch plastic, extra buttons, and the higher resolution sensor. I'll probably have to get one off eBay.
Edit: also all of the Masters have non-user-replaceable batteries.
Yes, but the battery is standard and easily replaceable.
My main gripe with the G700s is the weight, although it's not much heavier than the mx master 3. It also helps to have a great mousepad, or else I get tired of pushing that brick around. There are also aftermarket pads if you use it on the desk and they wear. I haven't tried any, though, my pads are still fine.
The G604 that the prior post referenced uses a single AA battery, so also standard and easily replaceable.
(except that my latest one has just suffered catastrophic battery failure)
My latest version of the problem is with Ubuntu Gnome. Upgrade software and, later, you will be interrupted with a pop-up window to enter your system password. Not only is this an interruption, I’m always doubtful that this is the system asking for a sudoer password!
UIs, in my experience, are very bad at handling “interrupts”. Sorry, my dad designed chips, so I use that hardware term when talking about notifications and other times another application needs to notify or get input from the user. Personally, I’d have the UI change the color/texture of the system menubar/taskbar and wait for the user to click it.
Just install the SuperTyping app. It's sooo good and intuitive. Totally worth the $189, if you consider how often you need to type something.
I also recommend Little Snitch as firewall and Parallels for virtualization.
Does anyone have a recommendation for bootloader or filesystem app? Preferably subscription model for intuitive accounting.
Here I've been thinking it's a hardware problem, like some sort of mechanical intermittent. Maybe not.
Especially annoying when every app is likely to have single-key shortcuts which end up being accidentally triggered.
Obviously by shutting the hell up, you ungrateful serf. The beatings will continue until morale improves.
Seriously, though, if you want this to stop, people like you are going to have to start voting with their wallets.
I finally pulled the plug on macOS a couple years ago for Linux, and I haven't been unhappy about it. However, I did make a point of buying a laptop that was well supported on Linux (a Lenovo X1 Carbon that was in the same price class as an equivalent Mac).
Too many companies are balking at spending money on hardware right now. While I would love to think that this will drive Linux adoption, it probably won't. Microsoft is going to cave on TPM 2.0 for Windows 11 or extend Windows 10 support much further.
I wouldn't mind if this finally lights a fire under certain software companies to also actually optimize their shit for memory use, but... I'm not that optimistic.
(OTOH running text-mode Emacs from a headless VM in a full-screen built-in Terminal may suddenly feel sluggish. Kitty or WezTerm solves this.)
You think I CHOSE to be miserable? Sorry, I have kids, and you know, they don't exactly live on dog food, but even that costs money. My job requires me to use a Mac. And please don't tell me to "find a better job". I've been programming for over twenty years, only once I was given a clearance to pick "whatever kind of machine you want". For my personal computing I do use Linux.
https://www.reddit.com/r/privacy/comments/1d8a0wv/if_you_are...
https://talk.macpowerusers.com/t/bartender-change-of-ownersh...
> superpowered your menu bar, giving you total control over your menu bar items, what's displayed, and when, with menu bar items only showing when you need them.
Which also, for some reason has permission to record your desktop and recently had a change of owner? I'd be reformatting my computer so quickly if I found this out about software on my computer...
The screen recording permissions are needed for it to be aware of when menu bar icons update so it can move them in and out of the menu bar; I believe later versions allow you to skip screen recording permissions if you’re willing to forgo that feature.
[1]: https://setapp.com/
https://support.logi.com/hc/en-us/articles/37493733117847-Op...
You can run a python script to track the focused window every few seconds to identify what’s stealing focus.
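A sketch of what such a script might look like. The sampling loop is generic; the macOS getter assumes pyobjc's Cocoa bindings (`pip install pyobjc-framework-Cocoa`) and NSWorkspace's frontmostApplication:

```python
import time

def log_focus_changes(get_front_app, polls, interval=2.0, sleep=time.sleep):
    """Sample the frontmost app `polls` times and record every change."""
    changes, last = [], None
    for _ in range(polls):
        name = get_front_app()
        if name != last:
            changes.append(name)
            last = name
        sleep(interval)
    return changes

def macos_front_app():
    """Name of the frontmost app (macOS only, needs pyobjc)."""
    from AppKit import NSWorkspace
    return NSWorkspace.sharedWorkspace().frontmostApplication().localizedName()
```

Run something like `log_focus_changes(macos_front_app, polls=300)` and check which app appears in the log at the moment your typing vanishes.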
So it's not just me!
The memes about Steve Jobs turning in his grave are true. He would not have stood for slop like this for even a moment. Apple's quality game was miles higher back in the day.
Even if they tried to do some kind of Snow Leopard maintenance release for all of their products, I don't think they could raise the bar on quality high enough in just a single release. They'd have to do it a few times with nothing new to show for it.
This speaks nothing of the transition to MacOS looking more and more like a dysfunctional toy since Jony Ive left and Alan Dye took over.
Tiger and Snow Leopard were the peak.
I had a similar problem at one point, then finally figured out it was when I accidentally hit the fn button which triggered the emoji picker window and moved focus to it (IIRC), but it was off-screen because I'd previously used it on a secondary monitor. Reconnecting the monitor and moving the window back to my primary display fixed it. (Obviously, it's a bug to show a picker window outside of visible coordinates, and I think it got fixed eventually.)
But it also might not be Apple at all, if it's some third-party background utility with a bug. E.g. if that were happening to me, my first thought would be that it might be a Logitech bug or a Karabiner-Elements bug. Uninstalling any non-Apple background processes or utilities seems like a necessary first step.
In my experience, shipping a product as polished as Mac OS X 10.6.8 Snow Leopard requires a painful level of dedication from everyone involved, not just Quality Assurance.
As long as neither the New York Times nor the Wall Street Journal writes about how bad Apple’s software has gotten, there’s even no reason for them to think about changing their approach.
The drama surrounding Apple’s software quality isn’t showing up in their earnings. And at the end of the day, those earnings are the "high order bit," no matter what marketing tries to tell us.
How?
How do you reproduce something when you have no idea of the cause and it's not happening on any of your machines?
And remember they don't have just this one unreproducible bug reported. They have thousands.
If you have experience writing software, you're going to end up with a lot of unreproducible bug reports. They're a genuine challenge for everyone, not just Apple.
No OS should steal focus, Windows absolutely is guilty of it.
I chose what happens after. Can recommend. I wasn't even aware of my privilege.
Reproduction steps:
- Start a reply to this comment in your browser, type some example words.
- Create a BAT file with the following contents:
@echo off
timeout /t 15 /nobreak >nul
start notepad.exe
- Run the BAT file.
- Immediately switch back to the browser tab, and place your focus into the HN reply box. Type a word.
- Wait for notepad to open
- Continue typing. Your typing will go into Notepad and not the browser tab you had focused last.
This occurs commonly and continuously on Windows, and it is damn obnoxious. The OS should never ever change focus; it should, however, flash the window/taskbar (that is acceptable), but not shift my typing into whatever arbitrary program opened. This used to be fixable via "ForegroundLockTimeout", which is what classic TweakUI altered, but that was killed in Vista. If you're a Visual Studio user, it is a daily annoyance. You hit Start/Play, go about your work, and then suddenly, some time later, focus shifts out from under you.
Edit: I tried it with Firefox and got a repro there. No stealing with Edge.
Focus should change only in response to user commands.
When you launch an application or open a dialog, you expect the new window to "steal" focus. When you close a dialog, you expect focus to go back to the main window. If it didn't, it would impair usability.
So how would an OS decide when "stealing focus" is allowed and when it is not?
Like, I'm frustrated with it too. I hate when an app pops up a dialog while I'm typing and my next keystroke dismisses it and I have no idea what I've done. But at the same time, I'd hate to have to manually switch focus to a pop-up dialog every single time before dismissing it with Enter or Escape too -- that would be way too annoying in the other direction.
A tiny fraction of users of such an enormous user base (10-16% desktop share) still means there are thousands of users affected.
This seems like an example of a situation that modern machine learning could help with. Take bug reports permissively and look through all of them for patterns. Loss of focus should be the kind of thing that would stand out and could be analyzed for similarities and recurring features. Making sense of large amounts of often vague and rambling reports has been a problem for a long time and seems like a domain that machine learning is well set for.
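As a toy illustration of the pattern-finding idea: greedy grouping of reports by bag-of-words cosine similarity. A real system would use text embeddings rather than word counts, but the shape is the same; the threshold and helper names here are mine:

```python
from collections import Counter
import math

def tokens(report: str) -> Counter:
    """Crude bag-of-words representation of a bug report."""
    return Counter(report.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two word-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def group_reports(reports, threshold=0.4):
    """Greedy single-pass grouping: attach each report to the first
    group whose seed it resembles, otherwise start a new group."""
    groups = []  # list of (seed_tokens, member_reports)
    for report in reports:
        t = tokens(report)
        for seed, members in groups:
            if cosine(seed, t) >= threshold:
                members.append(report)
                break
        else:
            groups.append((t, [report]))
    return [members for _, members in groups]
```

On a pile of vague reports, the "typing goes into the wrong window" complaints end up clustered together while unrelated issues land in their own groups, which is exactly the signal a triage team would want surfaced.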
Yes, but Linux is finally in that position, not to mention we're seeing silicon from intel and amd that can compete with the M series on mobile devices.
Let's not even talk about the case where you have monitors with different DPIs, something that is handled seamlessly by macOS, unlike Linux where it feels like a d20 roll depending on your distro.
I expect most desktop macOS users to have a HiDPI screen in 2026 (it's just... better), so going to Linux may feel like a serious downgrade, or at least a waste of time if you want to get every config "right". I wish it were different, honestly - the rest of the OS is great, and the diversity between distros is refreshing.
I have been using a 4K display for years on Linux without issues. The scaling issue with non-native apps is a problem that Windows also struggles with btw.
I'm currently using a laptop (1920x1200, 125%) + external monitor (1920x1080, 100%) at work. The task manager has blurry text when it's on the external monitor. It is so bad.
I also have programs that bleed from one monitor onto another when maximized. AutoCAD is one offender that reliably does this -- if it's maximized, several pixels of its window will overlap the edge of the window on the adjacent screen. The bar I set for windows is pretty low, so I'm generally accepting of the jank I encounter in it vs Linux where I know any problem is likely something I can fix. Still, that one feels especially egregious.
Works just fine here (1920x1200 125%, 4K 150%, 1080p 100%).
(And a workaround).
Unfortunately I cannot modify registry at work so I have to live with it.
Text rendering × DPI seems to be one of those difficult problems.
Gnome in Linux works great for a decade+ with a single high resolution screen, but there are certainly apps that render too small (Steam was one of the problems).
Different scaling factors on several monitors are not perfect though, but I generally dislike how Mac handles that too as I mostly use big screen when docked (32"-43"-55"), or laptop screen when not, and it rearranges my windows with every switch.
Zero fiddling necessary other than picking my ideal scaling percentage on each display for perfect, crisp text with everything sanely sized across all my monitors/TVs.
I gave up on Linux Mint for that exact reason. I wasted so much time trying to fine tune fonts and stuff to emulate real fractional scaling. Whenever I thought I finally found a usable compromise some random app would look terrible on one of the monitors and I’d be back at square one.
Experimental Wayland on Linux Mint just wasn’t usable unfortunately and tbh wasn’t a big fan of Cinnamon in general (I just really hated dealing with snaps on Ubuntu). I did tweak Gnome to add minimize buttons/bottom dock again and with that it’s probably my favorite desktop across any version of Linux/MacOS/Windows I’ve ever used!
I kept reading endorsements of Fedora's level of polish/stability on HN but was kinda nervous having used Debian distros my entire life and I’m really happy I finally took the plunge. Wish I tried it years ago!
This. I don't know why, but people forget about Fedora when considering distros. They'd rather fight Arch than try Fedora. So did I. Maybe it's Red Hat. Wish I switched earlier, too. (Although I heard this level of polish wasn't always the case.)
I love Fedora so much. Everything just works, but that's not that special compared to Ubuntu. What is special is the fucking sanity throughout the whole system. Debian based distros always have some legacy shit going on. No bloat, no snap, nothing breaking convention and their upgrade model sits in the sweet spot between Ubuntu's 4 year LTS cycle and Arch's rolling release. Pacman can rot in hell, apt is okay, but oh boy, do I love dnf.
Tho, Fedora has some minor quirks which still make it hard to recommend for total beginners without personal instructions/guidance, IMO. Like the need for the RPM Fusion repos and the bad handling/documentation of that. Not a problem if you know what a package manager, PKI and a terminal are, but too much otherwise.
I'm now looking to get off Windows permanently before security updates stop for Win 10 as I have no intention of upgrading to Win 11 since Linux gaming is now a lot more viable and was the only remaining thing holding me back from switching earlier. I've been considering either Bazzite (a Fedora derivative with a focus on gaming) or Mint but after reading your comment I may give vanilla Fedora a try too.
So far I've tried out the Bazzite Live ISO but it wouldn't detect my wireless Xbox controller though that may be a quirk of the Live ISO. I'm going to try a full install on a flash drive next and see if that fixes things.
I still had the issue of no gamepad detection, and had to install xone, which took some trial and error. First, I didn't have dkms installed. Second, soon after installing Fedora the kernel was updated in the background, and on reboot my display resolution was stuck at 1024x768 or something for some reason (that's going to be another issue I'll have to look into).

I rebooted and went back to the previous kernel, and then dkms complained the kernel headers were missing: the headers were installed for the latest kernel but not for the older version I had rebooted into. I'm not used to Fedora or dnf (I run Proxmox+Debian in my homelab), so after a quick search to figure out how to install a specific version of a package (it's not as simple as <package>@<version>, but rather <package>-<version>.fc$FEDORA_VERSION.$ARCHITECTURE) I got kernel-devel installed and was finally able to run the xone install script successfully and have my gamepad detected.
The most frustrating thing is that the xone install script doesn't fail despite errors from dkms so after the first install (where I almost gave up because I thought something was wrong with my setup) I had to run the uninstall script each time there was a problem and then run it again. The xone docs also mention running a secondary script which doesn't actually exist until the first script runs successfully so that added a lot of confusion.
My understanding is you only need xone for the special adapter right? Have you tried cable and plain bluetooth before? Also Steam seems to come bundled with their own drivers for it, so the controller may just work within games in Steam, regardless.
I feel a bit bad, but honestly gaming on Linux is not my thing. From a quick glance, messing with the kernel like that may cause problems with secure boot and maybe that's causing your issues. Maybe you need to sign your modules or disable secure boot.
Have you tried the Copr repo? https://copr.fedorainfracloud.org/coprs/jackgreiner/xone-git...
And of course Bazzite seems to have addressed this out-of-the-box... :D
Quite frankly, if you want to do anything but gaming on that machine, at least for me, manually installing kernel modules from GitHub would be a deal breaker, since that seems rather unstable and prone to cause nasty problems down the line.
I have a new issue though after updating 900+ packages using KDE Discover which is that the GUI login doesn't work. The screen goes blank after I enter credentials and nothing happens unless I switch to another TTY at which point I get thrown back to the login screen on TTY1. As a workaround, I can login on another TTY and then use startplasma in order to use KDE. I've learnt my lesson not to use KDE Discover for updates though because it doesn't get logged in dnf history so you can't use dnf rollback.
LTS support runs for 5 years (there is extended support for 10 years available), so you can skip an LTS if you don't need the latest base software.
The only reliable fixes are to either disable that DisplayPort feature if your monitor supports it, or to disable GPU Dithering using a paid third-party tool (BetterDisplay). Either that or switch to Asahi, which doesn't have that issue.
The issue is common enough that BENQ has a FAQ page about it, which includes steps like "disable dark mode" and "wait for 2 hours": https://www.benq.com/en-us/knowledge-center/knowledge/how-to...
One of the many random issues on the OS with the best UX in the world (lol). Like music sometimes stopping and sometimes switching to speakers when turning off Bluetooth headphones, mouse speed going bananas randomly requiring mouse off and on, terminal app (iterm2) reliably crashing when I dare to change any keybinding, and many other things that never happened in years of working on Linux.
https://www.rtings.com/monitor/reviews/best/by-usage/busines...
And as noted here on HN a couple days ago, avoid OLED. Coincidentally, the top office monitor per rtings is what that post compared OLED to:
https://nuxx.net/blog/2026/01/09/oled-not-for-me/
https://news.ycombinator.com/item?id=46562583
We use pairs of these Dells per MacBook at our offices and provide them for WR as well. There've been no issues on this Dell or prior models on M1 through M4 (M5 iPad is fine too).
As for DSC, that's been a complaint for a minute… Example HN reader theory on DSC, from Aug 2023:
In my opinion a QHD 23.8” panel is the next best option for developers (any M-series chip handles scaling without issues); I find the common 27” and 32” at 4K a weird spot - slightly too large, slightly too low resolution – and 5k+ options are still rare.
I’m also, to get the two external displays without them being mirrored, using a docking station and a display driver from Silicon Motion called macOS InstantView.
This is of course not ideal if you need DP and DSC.
I would like to point out that, from my experience on M1, external displays do not work at all over DisplayPort on Asahi Linux at the moment.
My 2019 Mac Pro with Catalina could happily drive 2 4K monitors in HDR @ 144 Hz.
People wondered how Apple got the math to work to drive the ProDisplay.
Big Sur? Not any more. 95Hz for 4K SDR, 60Hz for 4K HDR. Not the cables, not the monitors. Indeed, "downgrading" the monitors advertised support to DP 1.2 gave better options, 120Hz SDR, 75Hz HDR.
And it was never fixed, not in Big Sur, Monterey or Ventura, when I had switched monitors.
Hundreds of reports, hundreds of video/monitor combinations.
The third party software is really good, but come on, Apple, take a hint.
I doubt anything is going to get fixed, and Apple's hardware crown isn't as strong as before. But they like selling "services," so...
And, as I said, I really only needed the software once I got an (ultra)ultrawide monitor, and it could be the info it is sending is also non-standard in some way.
"a hurried, wild, or desperate manner, often due to extreme worry, fear, excitement, or panic"
At some point this frantic nature of trying to do something will cause more issues all by itself.
Instead of spending hours in desperation, I was only suggesting taking a step back and maybe when not in a frantic state, it would be easier to move forward.
i think that's what you're describing, anyway.
Linux has bugs, but macOS does too. I feel like, for a dev like me, the Linux setup is more comfortable.
I know people will say meh but coming from the world of hurt with drivers and windows based soft modems — I was on dial up even as late as 2005! — I think the idea that everything works plug and play is amazing.
Compare with my experience on Windows — maybe I did something wrong, I don't know, but the external monitor didn't work over HDMI when I installed Windows without a network connection, and (maybe it was a coincidence) it didn't work until I connected to the Internet.
Meanwhile on MacOS my displays may work. Or they might not work. Or they might work but randomly locked to 30hz. It depends on what order they wake up in or get plugged in.
I suspect the root of the problem is one of them is a very high refresh rate monitor (1440p360hz) and probably related to the display bandwidth limitations that provide a relatively low monitor limit for such a high cost machine.
https://discussions.apple.com/thread/255860955?sortBy=upvote...
After 344 "me too"s and 180+ replies they silently locked the thread to save themselves from more embarrassment.
Macs don't support the USBC / displayport daisy chaining support that my monitors should be able to handle. Very frustrating that this stuff is still so nonstandard. If you have all Apple it all works perfectly, of course.
I’m glad everyone is dogpiling on this statement, cause man, people seriously have to stop parroting this years-out-of-date claim at this point. Any big well-supported distro using Wayland should be fine; at the very least KDE and GNOME are guaranteed to work perfectly with HiDPI.
Daily Fedora KDE user here on 4K HiDPI monitor plus another of a different lower resolution, flawless experience using both together in a setup. Fractional scaling also there working perfectly as well and you choose how you want KDE to scale the apps if you want (forcefully or let the app decide).
Why don’t they just make it obvious? Why doesn’t the installer just figure it out or ask me when it launches?
I agree that that would help, but it was easier to just install another distro.
macOS isn't in any kind of good position regarding displays. 180+ replies and 300+ upvotes by the 0.1% of sufferers who bother to find these threads, log in, and comment on them. Extremely widespread, going on for years, thread silently locked.
I hope to put my money where my mouth is and contribute to one of the tiny handful of nascent Mac-like environment projects out there once some spare time opens up, but until then…
That means you can take the standard KDE "panel" and split it in two halves: a dock for the bottom edge, and a menus/wifi settings/clock bar for the top edge.
There are some things I don't know how to work around - like Chrome defaulting to Windows-style close buttons and keybindings, but if the Start menu copy is the thing keeping you off Linux, you can mod it more than you think you can.
What was impossible was to stop apps from showing the usual menu bar inside the window.
Obviously, with something so core to the system, plenty of devils in the details.
merge request: https://gitlab.freedesktop.org/wayland/wayland-protocols/-/m...
The biggest difference is probably that under Windows-style environments, applications/processes and windows are mostly synonymous: each window represents an independent process in the task manager. In a Mac-style environment, applications can host multiple windows each, so for example even if you've got 7 Firefox windows open, there's only one host Firefox process. This is reflected in the UI, with macOS grouping windows by application in several different places (as opposed to Windows, where that only happens in the taskbar, and only if the user has it enabled).
"Windows style" also comes with a number of other patterns, such as a taskbar and menubars attached to windows (as opposed to a dock and a single global, system-owned menubar under macOS).
"Mac style" comes with several subtleties that separate it from e.g. GNOME. Progressive disclosure is a big one. Where macOS will keep power user features slightly off to the side where they're accessible but unlikely to confuse non-technical users, GNOME just omits the functionality altogether. It also generally implies a greater level of system-level integration and cross-functionality from apps (including third party), lending to a more cohesive feel.
And we were talking about why Linux wasn't an alternative to macOS, weren't we?
Also, fractional scaling is not supported out of the box in GNOME; you have to set a config value to enable it, IIRC.
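If I remember right, the flag in question on stock GNOME (Wayland session) is a mutter experimental-features key; something like this should enable it, though the exact key may vary across GNOME versions:

```shell
# Enable GNOME's experimental fractional scaling on Wayland
gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer']"
```

After setting it, the fractional scale options (125%, 150%, ...) show up in the Displays panel.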
The only reason I can't completely switch to Linux is that there are no great options for the non-programming stuff I love to do, such as photography and music (guitar amplifier sims).
Don't knock it unless you've tried it.
This was CachyOS btw. Windows actually required MORE work because I had to install drivers, connect to the internet during setup, get nagged about using a Microsoft account, etc.
CachyOS was basically boot -> verify partitions are correct -> decide on defaults -> create account/password -> wait for files to copy -> done. Drivers, including the latest NVIDIA drivers, auto installed/working.
I give Linux a try each time I need to set up a new computer, and each time run into new issues. Last time (2 years ago) the hdmi connection with the screen would drop randomly twice a day. Same for the keyboard, and the wifi card didn't have drivers available. It became quite annoying, reducing my productivity as I had to reboot and pray. I then installed Windows, which solved all of the issues (unfortunately?)
Maybe I'm just unlucky.
Things are changing and improving VERY fast in Linux land lately, so being that far behind will pretty much set you up for disappointment, on top of all the usual reasons why you ideally want to be on the just-dull-enough part of the bleeding edge on the Linux desktop, where you only get a few small shallow cuts and hopefully no deep ones...
Anyway, popular acclaim for PopOS reached its peak just when those problems started to show up. It used to be better in prior years, but reputation tends to lag reality, so the sentiment at that point was to recommend it even though it wasn't actually a good choice.
Honestly, give Linux another try four or so months from now. You'll get to start fresh on a brand-new Ubuntu LTS or the usual new Fedora release. Try GNOME or KDE and see which one sticks best with you. Just don't try anything else if you want maximum features, comfort, and stability.
Yes, you were unlucky :(
Eventually I found a fix that worked and now I’m happy. So, next time you can try this. In the file:
~/.config/gtk-4.0/settings.ini
You can add:
[Settings]
gtk-hint-font-metrics=1
Here’s the Arch wiki page that explains it:
https://wiki.archlinux.org/title/GTK#Text_in_GTK_4_applicati...
If your settings.ini is in a different spot see:
IMO, there's basically no problem Linux has that isn't worse in Windows (at the OS level). Especially once you get into laptops.
My final conclusion was that I hate computers.
It's the "getting every config right" thing that is the problem.
That’s why Apple used 4K at 22”, 5K at 27”, and 6K at 32”: to keep things crisp, always at 200% scaling.
Therefore newcomers should use Kubuntu or the like.
Even if it's true that KDE/Xorg is a stable experience now, it will not be true in the short to medium term. And a distro like Kubuntu might be two years out from shipping a "perfect" KDE Plasma experience even if it landed upstream right now.
And god forbid you still have a low-DPI monitor!
5k has been surprisingly stagnant.
I’d prefer that to not be so, because 5K panels are so much more expensive. But in a side by side comparison it’s very obvious.
But the market has spoken: a quality 4K display is very good, certainly good enough, and the value for money is great.
I’m ok with spending more on a better display that I spend so much time with. The cost per use-hour is still very, very low.
And MacOS has removed support for subpixel rendering because "retina", though I only use it when forced (work).
When considering a single-cable solution like Thunderbolt or USB-C with DP alt mode: if you are not going with TB5, you will either use all the bandwidth for video with only USB 2.0 for HID interfaces, or halve the video bandwidth to keep two signal lanes for USB 3.x.
(I am currently trying to figure out how I can run my X1 Carbon gen 13 with my 8K TV from Linux without an eGPU, so I'm deep in the trenches of color spaces, EDID tables and such, as I've only got it to put out 6K to the TV :/)
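For a sense of scale, here's a back-of-envelope sketch of what that lane split costs (my numbers, not from the comment above; nominal DisplayPort per-lane rates, ignoring negotiated fallbacks and protocol overhead):

```python
# Approximate usable DisplayPort alt-mode video bandwidth for the
# 4-lane vs 2-lane configurations. Purely illustrative.
GBPS_PER_LANE = {"HBR2": 5.4, "HBR3": 8.1, "UHBR10": 10.0}

def effective_video_gbps(rate: str, lanes: int) -> float:
    """Usable video bandwidth after line encoding.

    HBR rates use 8b/10b encoding (80% efficient); UHBR rates use
    128b/132b (~97% efficient).
    """
    efficiency = 0.8 if rate.startswith("HBR") else 128 / 132
    return GBPS_PER_LANE[rate] * lanes * efficiency

# All four high-speed pairs carry video; USB falls back to 2.0 on the sideband.
full = effective_video_gbps("HBR3", 4)   # ~25.9 Gbit/s
# Two pairs for video, two reserved for USB 3.x: exactly half the video rate.
split = effective_video_gbps("HBR3", 2)  # ~13.0 Gbit/s
print(f"4-lane: {full:.2f} Gbit/s, 2-lane: {split:.2f} Gbit/s")
```

The halved figure is why a single non-TB5 cable struggles with high-resolution, high-refresh panels once you also want USB 3.x data.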
FWIW, I could see jagged edges at 4K on 24" without subpixel rendering, and 27" is worse. Yes, even 4K at 32" is passable with macOS, but Linux looks better (to the point that 4K at 43" has comparable or slightly better text quality than 4K at 32" on a Mac).
I am trying to get a 55" 8k TV to work well with my setup, which might be a bit too big (but same width as the newly announced 6k 52" monitor by Dell), but it's the first next option after prohibitively expensive 32" options.
In some sense it's reasonable that you need a supported monitor, it's just strange that Linux can support all these monitors, but macOS can't?
This has been my experience every time I try Linux. If I had to guess, tracing down all these little things is just that last mile that is so hard and isn't the fun stuff to do in making an OS, which is why it is always ignored. If Linux ever did it, it would keep me.
- Machine failed to wake from suspend almost 50% of the time (with both wired and BT peripherals)
- WiFi speed was SIGNIFICANTLY slower. Easily a fraction of what it was on Mac
- USB-C display was a no-op
- Magic Trackpad velocity is wild across apps
- Window management shortcuts varied across apps (seems GNOME changes a lot, frequently)
- Machine did not feel quicker; in fact it generally felt slower than Tahoe, but granted I did not benchmark anything
I would happily try it again when the project is further along
Apple has also done things such as adding a raw image mode to prevent macOS updates from breaking the boot process for third-party operating systems, which is only useful for third-party operating system development.
Sure, some developer may have added things like raw image mode, but if someone on high says "wait, people are buying macbooks and then not using the app store?" or as soon as someone's promo is tied to a security feature that breaks third-party OSes... well, don't be surprised when it vanishes. Running any OS but macOS is against ToS, and apple has already shown they are actively hostile to user freedom and choice (with the iOS app store debacle, the iMessage beeper mini mess, and so on). If you care about your freedom and ability to use Linux, you should not use anything Apple has any hand in ever.
> Apple formally allows booting third-party operating systems on Apple Silicon Macs. Shortly after the Asahi project started, Apple even added a raw image mode to prevent macOS updates from breaking the boot process for third-party operating systems. This provided no benefit to macOS whatsoever; it merely served to help third-party operating system development.
There are a lot of reasons to be annoyed with Apple, but we don't need to invent new ones, and there's an awful lot of misinformation out there about Macs that conflates how locked down iOS is with the Mac (combined with the insistence that Macs are going to be locked down just as much as iPhones within the next few years, which I have literally been hearing since the iPhone came out in 2007). There are some things that are more difficult to do on macOS Tahoe than they were on MacOS Leopard twenty years ago (like, apparently, resize windows), but there is nothing that is "locked down" in a way that makes something I remember doing then literally impossible to do now.
iOS and the third-party app store court battles makes it clear to me that Apple is actively hostile here.
It would have taken less work for Apple to implement the EU "third-party app store" regulation as "anyone can install a third-party app store if they jump through enough hoops". Instead, they require that you live in the EU, as verified through many factors. They break it if you take too long a vacation, and they make exercising your new right to install a third-party app store as difficult as they can.
Apple clearly does not value user freedom nor users abilities to install their own software on their own devices. Apple would rather old iPhones and iPods become useless e-waste bricks than release an EoL update to unlock the bootloader and let you install linux to turn that old iPod touch into a garage remote, or photo-frame, or whatever.
Your comments about iPads/iPhones may well be true but not relevant to my point. See also the comment from user Kina upthread.
If it were just about Asahi Linux, it would have been "a company that's extremely hostile to this effort."
They instead said "a company that's extremely hostile to this _kind of_ effort", which turns it into a broader category, which I believe quite reasonably includes their hostility to general "using their devices outside of the apple walled garden".
If you're going to be pedantic, please at least be correct, but "this kind of" clearly makes it more broad than just asahi linux itself.
"the Asahi team have been doing incredible work .." -> the team working on porting Linux to Apple Silicon Macs.
"They just chose to do it on the hardware of a company that's extremely hostile to this kind of effort."
They -> Asahi Linux Team
it -> (note the singular) porting Asahi Linux
the hardware -> Apple Silicon Macs
a company -> Apple
My comment (the one you responded to): "it would have been shot down", (note the singular) it -> porting Asahi Linux.
You cannot torture the sentence to encompass the broader Apple ecosystem when the subject is very obviously and solely the Asahi Linux team and Apple Silicon Macs. You're welcome to your views, just drop them somewhere more relevant next time.
(I usually miss the little Linux-specific things that macOS does not.)
There's probably a lot more I'm not thinking of right now. Point is, if you're an iOS, macOS, and iCloud user, you give up a lot of quality-of-life bits going to another platform. There are times I want to go back to Linux, but when I think about the stuff I'm going to lose, I talk myself out of it. macOS isn't the greatest, but it's not the worst either, and Apple's products and services just tie in very well with each other. I get annoyed by things like the shitty support for non-Apple peripherals, needing 3rd-party apps to make them work decently, crappy scaling except on the most expensive monitors, and no decent font smoothing when running at native resolutions. But... I stick with it because I either like or love the tight integration and added quality of life that comes with it.
It's a different set of trade-offs; less polish, more control.
Definitely not exhaustive since I only spent a few weeks with it. There were also plenty of things I liked about Gnome more but not enough to tip the scale for me
I use macs at work and Linux at home. There's no uniform way to make a Linux machine accept things like cmd right arrow to jump to the end of the line, etc.
This is the closest attempt, but it has many gaps: https://github.com/rbreaves/kinto
Unfortunately today it is a race to the bottom.
I work at Google, which issued a Gubuntu workstation by default when I joined. I exchanged it for a Mac, which I've spent a literal lifetime using, because I didn't wanna fall down a Linux tinkering hole trying to make Gubuntu feel like home. Every corp device I've had has been a Mac.
I'm reading this from a coffee shop. On my walk here, I was idly wondering if I should give gLinux (as it's now called) a try when I'm forced to replace the iMac. SteamOS is making Linux my default environment in the same way Mac was for decades prior.
Unfortunately, I looked into it, and my other options are an Asus CX54 Chromebook or a Lenovo X1. There simply aren't competitive alternatives to Mac hardware, at least not at modern Google.
GNU/Linux isn't sold in shops to regular consumers the way macOS and Windows are; until it moves beyond DIY and online ordering, it will remain a niche desktop system.
I've heard that for almost 20 years now, but it never was.
Sometimes, people think "it can be made to look similar, therefore it's the same" (especially with regard to KDE), and no, just no.
It's the way drag and drop is a fundamental interaction in text boxes, the proxy icons in title bars, how dragging a file to an open/save panel changes that panel's current folder rather than actually move a file.
It's how applications are just special folders that are treated like files, how they can update themselves independently of each other or any system packages, how you conventionally put them in the /Applications folder so you can put that folder on the Dock to use directly as a launcher.
It's how all text fields consistently support emacs-style keyboard shortcuts, respond appropriately to the system-provided text editing features such as the built-in Edit menu, text substitutions, and writing features.
It's how you can automate most Mac-assed apps; how you can extend the operating system through app-provided and user-created services to every other application that handles text, files, images, PDFs, through the built-in APIs using AppleScript, Automator, and Shortcuts.
It's how the whole program rather than its last window is the fundamental unit of an application such that document-based applications can exist without a window without also polluting some system tray with an unnecessary icon, how that means workflows expect more than one window open.
It's how there's a universal menu bar that works for every app, not just conforming ones (i.e. KDE's global menu only shows KDE apps' menus; other apps need a plugin or just don't show at all), how the Help menu has a search field to look for menu items, how keyboard shortcuts are bound to menu items rather than assigned arbitrarily inside each program's settings window and can thus be reassigned globally in System Settings, how this means all of an application's main features are accessible via the menu bar, and how that creates consistency in the menus.
Those are just some things off the top of my head but there are plenty of others, some a bit more user facing, some less. Just examples, a non-exhaustive list.
I'm sure those who don't care about these things will dismiss it, but if you've been using a Mac since before it was called macOS, back when it was OS X or even Mac OS X, these are things you won't drop for Linux just because the design is a bit uglier.
Of course, if none of these things matter, then the swap is easier. It doesn't mean any DE is a drop-in replacement by any means. Many of the things that make some DEs "Mac-like" are skin deep.
[[citation needed]], benchmarks please, incl battery life, not promises. "We are seeing" implies reality
Most people want to get productive work done with their computer, and OS X has top tier apps for every need possible.
No good e-mail app, no good office apps, no good calendar app, no good invoicing app, no good photo editing app, no good designer app, etc
> No good e-mail app, no good office apps, no good calendar app, no good invoicing app ...
People who aren't programmers use Gmail, Google Docs, Google Calendar, Stripe Invoicing, etc for those various use-cases.
Firefox and Chrome work just fine on Linux, so Linux has all the apps people actually use these days on computers.
And then you of course have corporate, who will not switch from Windows.
Nobody will voluntarily spend all day working within Gmail or Google docs.
You also conveniently cut out photo editing and design in your quote.
Edit: Also, I wonder if all you server admins, programmers, and gamers would have switched to Linux if your only option was to do your work or gaming within a laggy and inadequate web app? But you want other people to suffer that.
Funny, there are whole companies, pretty big ones at that, that run entirely on the G Suite. Regardless of OS.
If the idea is moving from Windows to something better, then Mac is usually the answer. Unless you're a server admin, programmer, or gamer. Then Linux is probably great. Everybody wants to have tools that work well for the task.
I'm not saying 2026 is the year, but...
I would argue the OS closest to "mainstream Linux" is Ubuntu or Fedora with Gnome DE. Gnome has many many faults but it's probably the closest DE you're going to get to what Windows and MacOS have.
Still on iOS 18 and macOS 15 (Sequoia). I was a day one upgrader up until now, never had any regrets but this time things seemed very different.
It's worrisome but all is not lost, I'll start sweating for real if next year's releases don't improve things substantially.
https://www.forbes.com/sites/zakdoffman/2026/01/07/hundreds-...
I've tried and returned the iPhone 17 Pro. Love the hardware (especially the camera), but iOS 26 is inefficient (for lack of a better term), and the new camera UI hides too many things.
My solution was to change from "Unified" to "Classic", which changes the bottom bar from [("Calls" "Contacts" "Keypad") "Search"] to ["Favorites" "Recents" "Contacts" "Keypad" "Voicemail"]. THE ICONS ARE EXACTLY THE SAME SIZE. The only difference is the spacing between them.
But again, this is fucking crazy because going back to the classic mode, if I click on a recent name it starts dialing them. But in the unified mode it gives me information. The unified makes the whole name act as if I'm pressing the info button.
The problem is that Apple created an anti-pattern, TO ITSELF. They taught users that an action did one thing and then used that action to do something completely different. No one on iOS 26 should expect that clicking the call line will take you to the information page and should instead expect that doing so will start dialing that person.
I work in software and "build features" for a living, and over the years I've come to prioritize reliability, performance, and an intuitive experience over all else. No matter how good the feature set is, if it crashes, is painfully slow, or I can't figure out how to use it, then I don't want it.
Apple used to have that focus, but seems to have lost it of late.
But I find iOS 26 absolutely disrespectful. It wants you to use it in ways that previous iOS versions pushed you away from. It's an anti-pattern to previous versions. I'm sorry, if you teach users one pattern, don't update to have them do the opposite. Nothing could be less intuitive.
macOS has been an incredible productive OS for me since I was 15. I'm now 39. In the last few years is the first time in that period that I've seriously begun to wonder if it would be wise to get off the platform. I've already dropped iOS, watchOS (Garmins are actually amazing these days, for what it's worth), and iPadOS. I still use macOS daily along with tvOS when I happen to watch something, but the days seem numbered now. I'm pretty disappointed. I hope it turns around, but I'm slowly preparing myself to be on Linux primarily.
I can't see a single reason to upgrade to Tahoe. We'll see what 2026 brings.
There was also a great boutique apps ecosystem.
Right now, it seems that macOS is going through its enshittification phase, sadly.
Most of the upgrades since then I have resisted and not enjoyed, though I seem to recall liking Mavericks.
A lot of the big features each time seem to be about tying further into the Apple ecosystem, which doesn't interest me at all, since I have no other devices and don't use iCloud.
Was it also great for developers? (Genuine question.)
If I buy a product and the hardware is good for 10 years (because I looked after it), I expect the software to also run just as well as when I purchased it - that is the case with Linux, why isn't it the case with macOS?
Every year the software upgrades invariably degrade system performance. Outrageous.
Apple is miles ahead of Android when it comes to phones and tablets; most of the Android ecosystem is e-waste four or five years in, while Apple stuff can still be re-sold for actual money at that point, assuming you didn't bust your screen.
For laptops, Apple is so far ahead it can't even be described. Most Windows laptops physically break apart before macOS ceases to support any Apple laptop.
Only thing we can maybe talk about is desktop PCs ever since the switch to M that basically made meaningful upgrades impossible, but eh, in my attic there's a 2009 Mac Pro still chugging along as my homelab server + gaming rig.
Edit: just did a Google search and it seems I can still sell it for about $600 AUD. I don't know how anyone is buying a non-Apple laptop.
I have a very old android tablet (Nexus 7, 2013). I can install Linux on it and it works just fine. I can convert it into a full screen kiosk mode thing that displays photo albums, put it next to my tv as a song controller, etc etc.
Older iPads no longer get updates, and I can't install linux on them. Apple is wildly behind a lot of other hardware in terms of software-support since I can install linux on a lot of other stuff. Apple devices turn into useless e-waste bricks, other devices can get a second life running linux.
Yeah, Nexus and being old, that's the thing. With everyone other than Nexus, you gotta be lucky if you even get kernel sources and device trees that you can compile, and the code quality will usually be so rotten there's no hope of mainlining it into the Linux kernel.
> Apple devices turn into useless e-waste bricks
Only the iDevice lineup though. The Intel and M series devices all can be made to run Linux.
If you buy the $199 Windows laptop that can barely run Windows, yes. Anything comparable in price to a MacBook? Not really.
> Anything comparable in price to a MacBook?
The current MacBook Air is at ~1100€ here in Germany. That's not that expensive, particularly as even the entry models still blow away the competition for CPU.
Eh, I had to use a variety of iPhones for work recently, don't remember which models, from probably the last ~7 years though, and they really felt limited and frustrating on the software side. My already years old Pixel 7 feels miles ahead, and so did my Pixel 4a, even with the worse hardware of the latter. They just feel more capable.
I've been a mac guy for work for at least 15 years though, now with an M4 on Sequoia, and definitely won't be buying anything else (windows for most gaming), but Tahoe is not looking promising.
And Mussolini wasn't nearly as bad as Hitler. A relative measure like this sets an artificially low bar. If these devices had replaceable screens and batteries, they would be good until the mobile standards stopped being supported.
Damn, I haven't seen an instance of Godwin's law outside of political threads for years in the wild.
> If these devices had replaceable screens and batteries, they would be good until the mobile standards stopped being supported.
The problem is, even replaceable components don't matter when the OS support drops and the device becomes a bad netizen as a result. And no, there is no viable FOSS competition to Android and iOS, many including giants such as Mozilla learned that lesson the hard way.
And that's before getting into the whole issue with BSPs, horrible code quality (good luck trying to get any SoC BSP upstreamed to u-boot or god forbid the Linux kernel), or the rapid evolution in mobile SoC performance.
I'm not calling anyone Hitler, though, just pointing out the flaws that can come with relative comparisons. A known, extreme example here is useful as it's well known and illustrative.
Anyhow, Apple & Android should just support old hardware for longer.
Apple already does. The iPhone 6s, released 2015, got a security update just a few months ago [1]. That's ten years worth of security updates, I'm amazed that people are still using such old phones.
If we go by the metric of "app developers can still publish app updates", the minimum target version is iOS 18 [2], which means you can still target the iPhone XS from 2018, that's a 7 year old phone.
The true catastrophe is Android, and that's actually not Android's fault. That's the fault of Qualcomm, MTK, Samsung and other more obscure SoC vendors - only in 2023, with the Pixel 8 [3], came the first SoC with seven years of support. As said: most BSPs are utter dogshit, and so are the firmwares for all the tiny chips and IP cores. The Linux kernel is a very fast moving target and it's (by intent) a gargantuan effort to keep forked kernels up to date. And it's made even worse by the embedded industry's trend of continuously "improving" their chips/IP cores without changing model numbers, making it sometimes outright impossible for a kernel module to deal with two different steppings and respective quirks on its own.
Apple in contrast insists on writing everything themselves - that's why they fell out of love with NVIDIA a decade ago, NVIDIA refused to give Apple that level of access. That allows Apple to keep even very outdated stuff supplied at least with critical security fixes.
Google could do something here, say by adding a requirement to the Play Store license that BSPs must be actually accessible open source and vendors have to commit reasonable effort in upstreaming their kernel level drivers, but I guess Google is too afraid of getting hit by anti-trust issues.
[1] https://en.wikipedia.org/wiki/IPhone
[2] https://developer.apple.com/news/upcoming-requirements/?id=0...
[3] https://blog.cortado.com/en/7-years-of-software-updates-for-...
I'd like Apple to focus more on the things that actually matter to users. To fix bugs, to work on performance, to simplify things rather than complicate them. Focus on making it a better platform for doing work and less a playground for pointless fiddling with design and sloppiness.
It's why your favorite shoe company, that you buy from every 2-3 years when you wear out your favorite shoes, always has new styles and discontinues other styles. Converse is a great example.
Journalists will report whatever they get fed anyway (notice how they all talk gleefully over the wobbly new iPhone with a jutting-out camera bump when only a few years ago they talked gleefully about how flat the iPhone was, and then gleefully wrote about how their screen estate was invaded by a notch etc), so if Apple focused on fixing issues instead of short-attention-span apps (when was the last time you used "Image Playground"?) the media could report how committed to reliability and quality Apple is, gleefully.
I do wish I heeded your advice. I bought some awesome Doc Marten boots/sneaker hybrids called "Boury" a few years ago, and they stopped making them. I can't find any in my size anywhere, like EBay. I walked all around Disney World with them in shorts, and they looked / felt awesome.
Features, people, FEATURES.
It's more effort to do things that also make sense than only to produce the bullet point.
I hoped the .1 or .2 would fix things, but I'm still seeing glitches and even random freezes.
Microsoft is a disaster right now, but if the new intel processor can compete on battery life with mac I might go back to linux.
Unfortunately Intel is cutting down their Linux involvement so I wouldn't have high hopes for it. Newer AMD laptops are probably on par with Intel on Linux now.
Dye didn't bring something that users didn't know they needed, he brought chaos to the entire ecosystem, and he's the only Apple executive folks are willing to talk garbage about.
If Cook and his other senior staff had recognized the problems Dye was causing and wanted him gone, what possible motivation would they have to make his firing secret? How could it possibly serve them better to have it look like they were chumps?
"You can change your icons!" - What? Was that the big issue of my day? (Although, after I saw what they had done to them, it certainly loomed larger in my mind)
and
"Notification summaries that may be incorrect"
Miserable. Won't be upgrading the personal computer, am fast moving away from Apple as a whole, am telling others not to upgrade for as long as possible.
Can't you do a factory reset/recovery on a Mac that lands on the version of macOS shipped with the device? Then you could re-upgrade to the OS you wanted. Without trying it myself, it seems Sequoia is still available in the App Store.
One huge reason to use third-party programs where possible. I dislike Apple's tight coupling of utilities as it is.
Though amongst many other wonderful things lost in the mists of Mac history, I still desperately miss NetBoot/NetInstall and ultra-easy clone/boot with something like CCC and TDM. It's so fucking miserable now in comparison to do reinstalls/testing/restores.
----
You read that right: apparently rounded corners are so resource-intensive that if you don't have or disable GPU acceleration, they'll disappear.
As much as I absolutely hate rounded corners in general, it's astonishing the apparent inefficiency with which MS have implemented them. Then again, mediocrity seems to be par for the course with their developers: https://news.ycombinator.com/item?id=28743687
This is akin to MobileMe -
https://www.cultofmac.com/apple-history/steve-jobs-mobileme-...
Other than that weird resize thing written about here (which I didn't notice, thanks SizeUp for providing me with hotkeys remarkably similar to Windows) - why? Vista and 8 were immediately obvious changes in the UI, but in general it still looks and feels just like macOS has for well over a decade now.
New icons, new fonts, but... that's it?
Oh and HyperSwitch for some reason can't switch to Finder windows any more, but that's probably because HyperSwitch hasn't seen an upgrade in years...
Apple had a HIG and third party developers used it. Different apps looked consistent: toolbar, sidebar etc.
Apps had borders. If there was too much content there were big blue scrollbars.
Buttons had borders. Windows had texture. You could find stuff.
I could go on forever. But the OS was simultaneously better visible to the eye, and less visible to the mind. Stuff worked.
Or rather, iOS 8 and 9 did revert a lot of iOS 7's changes.
I am not against changing the UI, but it seems every time they do it they forget all the lessons learned from the previous attempt, and doing so in such a short period of time suggests they haven't learned anything.
Just to give a few examples which annoys me the most:
- Finder. It’s just something else. After 10 years of using OSX I still can’t figure out how to use it efficiently for selecting a path; the experience is different every time, depending on the context Finder was called from. I just don’t get it.
- Lack of a true tiling window manager experience. Yes, there is Yabai, but it still sucks due to the fact that you can’t have truly independent spaces, each with an individual layout and stack of windows.
- Infamous Magic Mouse’s charting port at the bottom.
I just wish I could have normal Linux natively on MB Pro.
They also added stupid "quick launch" areas with places nobody went, like "3D Objects", and reduced the menu area to a "grope and find a button" ribbon.
The older Explorers were usable like File Manager on Windows 3.11 was: address bars that were usable from the keyboard and mouse (no subdivision buttons for parts of the path), acceptable launch speed, and no extra "features" that were unnecessary (like it ignoring "use same view for all folders" when your directory happens to have MP3s in it - it'll switch to showing rating / bitrate etc.)
I believe all developers should use older versions of the software to see how usable they were in comparison to the modern "improvement".
Yes, because my apple hardware does not run properly with any other operating system. I would have switched to linux a while ago otherwise.
I guess Apple has realized that their hardware is so good that they don't have to worry about the software anymore.
I’m hoping they’ll wake up and fix this with the next release, but I’m not super optimistic.
We’ll see.
It had decent bones though -- arguably a lot of its bad reputation was due to hardware/third party driver issues and people trying to run it on old hardware that just couldn't hack it. Windows 7 was well received and is basically the same thing with small improvements and some of the UX issues smoothed over (i.e. less annoying UAC)
But I see many references to it being called just "Aero", but some call it "Aero Glass" [1]
Does anyone know the truth?
[1] https://www.pcmag.com/archive/rip-aero-glass-windows-8-stick...
> "Rest in peace, Aero. I liked you, a lot. Still do. And I'll miss you," Thurrott writes
On top of that, the OS feels more bloated and disorganized than ever, with something like six different UI frameworks all present in various spots on the OS; system settings are scattered across the Settings app (new) and various legacy panels like Control Panel and Network Connections.
What else... Microsoft now requires an online connection and Microsoft account to sign in to your PC; no more local-only accounts allowed.
I'm sure there's more I'm missing. It's not a pleasant operating system.
Recall is turned off by default and Copilot never nags you to use it (like Gemini on Chromebooks/Google Search/Google Docs does).
I completely agree with the UI frameworks thing though. They really need to remove the Control Panel.
Pleasant compared to what? Older versions of windows? linux, or macs? This is the first positive review I've ever heard.
Another factor vs Mac (for me) is that if something to happen to my ThinkPad while I'm at a factory somewhere in rural Uzbekistan, there is always a store in the nearby city where I can grab a Windows laptop for like $400 and continue with the job, and/or have my machine serviced.
Windows has an enormous userbase, and obviously you'll hear a high absolute number of criticisms, especially considering that those who actively dislike the OS for whatever reason will take their time to bring their frustrations online, while those who are fine with it rarely comment about it.
Windows laptops vary in hardware quality and software support significantly, maybe that’s where issues arise for some people?
Linux obviously has its strengths, so I have a dual boot with the latest Fedora, but I almost always end up using W11, even for personal use.
... they really need to provide 100% coverage to all the same settings, THEN remove the control panel.
> original version of Recall stored these in an unencrypted, insecure database.
Why do you bother mentioning it, given that's been long rectified and that particular version never made it to the production ring?
> six different UI frameworks all present in various spots on the OS
Windows has always been like this. It wasn't until Windows 11 that the Font dialog was upgraded from a Win 3.x look and feel.
> no more local-only accounts allowed.
Just false.
Yes, UAC could be considered an annoyance by some, but it's no different than "sudo" on single-user Linux machines, and we seemingly have no problem with that (I wish we'd move past it, because it is damn annoying and offers no security benefit).
Comparing Vista to modern macOS is insulting. Vista didn't have that level of jank and the UIs were actually quite good, consistent and with reasonable information density, unlike "System Settings" or shitty Catalyst apps.
It was wild to me, when I was testing out whether I wanted to move over to Linux as my full-time desktop OS, how much it kept asking for my password. And it didn't even have a mechanism to make it a little less painful, such as requesting a short PIN (which I think is a fine option as long as a few incorrect PIN entries forces full password input).
On the other hand I'm not sure NOPASSWD would affect desktop environments - any desktop stuff goes via PolicyKit or whatever the latest systemd iteration is and I doubt it's smart enough to read Sudo's config (and there's an argument it shouldn't - if anything it should be the other way around, a system-wide generic "this is single-user machine, the only user is effectively root anyway" flag that both Sudo and Polkit should obey).
In both cases yes it's solvable, but I wish it became the default if there are no other interactive user accounts, or at least be easy to configure - if anything, by a simple "don't ask me again" on the permissions popup.
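For what it's worth, the polkit side of this can already be approximated with a rules file; a minimal sketch, assuming your user is in the "wheel" group (the group name, file path, and priority prefix all vary by distro):

```javascript
// /etc/polkit-1/rules.d/49-single-user.rules (sketch, not a recommendation)
// Auto-approve polkit actions for members of "wheel", so desktop
// privilege prompts stop asking for a password on a single-user machine.
polkit.addRule(function (action, subject) {
    if (subject.isInGroup("wheel")) {
        return polkit.Result.YES;
    }
});
```

This is the polkit analogue of a sudoers `NOPASSWD:` entry; as the parent comment laments, the two are configured completely independently, so a "this is a single-user machine" intent has to be expressed twice.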
The UAC wasn't even the main problem; the overall performance of Vista was - everything was so much slower.
They addressed most issues in the 8.1 update, like a year later I think.
There was no start button. There are no screen edges to swipe in from. Hot corners are really hard to hit. I still can't believe somebody said "yes, good idea" to using that UI for Server 2012.
If that wasn't bad enough, the popup was a web view, meaning none of RDP's acceleration/client-side compositing was in play and I was greeted with a ~1fps slideshow.
Getting to Windows 11 today, they have ads in the Start menu. Not exactly appealing to the Apple crowd…
https://9to5mac.com/2025/12/08/ios-26-new-airpods-setting-ca...
In iOS 26, you can keep audio playing with your headphones by enabling the new "Keep Audio with Headphones" setting, found in Settings > General > AirPlay & Continuity, which stops audio from automatically switching to nearby devices like car stereos or Bluetooth speakers when you're already connected to your headphones.
This setting, which is off by default, ensures your music, calls, or podcasts stay with your AirPods or wireless headphones, preventing frustrating interruptions when you start your car or enter a room with another speaker.
Disclaimer: I was one of the dozens who used a windows phone. The Nokia Lumia 920 was great, you can fight me.
I'll give you the benefit of the doubt and assume you're thinking of the lack of device OS upgrades: from WP6.5 to WP7, from WP7 to WP8, and from older WP8 devices to W10M. So no forward compat, but absolutely yes to backward compat.
For an app platform already a distant third place and struggling to attract developers, pissing off the few devs you do have TWICE was not a smart move.
And I disagree with your implicit claim that the WP7 & WP8 Silverlight -> Win10 UWP transition had no migration path. There was >= 90% source code similarity, bolstered if you had already adopted the Win8.1/WP8.1 "universal" project templates. And Microsoft provided tooling to ease the transition. Sometimes it was literally just s/Microsoft.Phone/Windows.UI/g.
Games were a different matter, I'll admit. XNA as an app platform had no direct replacement that was binary compatible with Win10 desktop, but even then, not only was DirectX already available from WP8.0, but Microsoft invested in MonoGame as an XNA replacement precisely because they knew the end of XNA would hit hard. (In fact it was the Windows Phone division that had essentially kept XNA on life support in its final years so that WP7 games would not break.)
Seems that's the standard now for .NET desktop dev. Every 2 or 3 years MS cranks out a new XAML-based framework that's not compatible with the previous one and never gets completed before the next framework comes out.
I'm happy they're at least maintaining (to a limited extent) Windows Forms and WPF and updating their styles to fit with their fancy Fluent design.
But even that is a pretty sad state of affairs, since Windows Forms should be able to get that info from uxtheme (which Microsoft fumbled) and WPF should be able to get that info from the style distributed with the system-installed .NET framework (which Microsoft fumbled and now only exists for backcompat).
For the company with the best track record for backwards compatibility (with Windows), they sure suck at developing and evolving the same API for long.
Must have felt incredibly liberating, and maybe they got a little too into the whole idea of "fresh start"(s).
See also Windows RT.
Windows 8's design wasn't bad per se, but they shipped the start screen when it lacked even the most basic features, so you'd return to the legacy desktop the moment you wanted to do anything.
I don't think any of them are like Tahoe TBH.
I've blocked Apple's update servers via /etc/hosts so this monstrous thing doesn't sneak onto my machine in the middle of the night, still happily on Sequoia.
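For anyone wanting to do the same, this can be done with /etc/hosts entries pointing Apple's software-update hosts at a null address. The hostnames below are the commonly documented update servers, but the exact set may change between macOS versions, so treat this as a sketch and verify before relying on it:

```
# /etc/hosts additions (sketch - commonly documented Apple
# software-update servers; the list may vary by macOS version)
0.0.0.0 swscan.apple.com
0.0.0.0 swdist.apple.com
0.0.0.0 swcdn.apple.com
```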
Vista ate every bit of RAM it could find, had severe driver issues, and was riddled with instabilities. It would not run on half the hardware at the time. I faintly remember a DX10 shitshow as well. And 8 hopelessly tried to apply Metro to the desktop and added a third (or was it fourth?) settings panel. Also killed the Start menu.
If it wasn't for Apple Silicon and its stellar impact on battery life, I'd be gone. iOS 26 might make it happen anyway!
Resizing isn't great, but it's also deeply shitty in Win 11. I feel like window manager thought leadership has failed across the board, but the regression isn't that big of a deal in day to day usage, and is definitely not unique to Apple.
I mean Windows 10 wasn't great but I got used to the taskbar searching the web somehow and the dual config menus everywhere and so on. But 11 was just terrible.
macOS has its pain points but man oh man what a disaster Windows is.
I have had Linux on my personal desktop and laptop forever so that hasn't been an issue, only used Windows for work.
Not even close.
It's taken a few steps in the wrong direction, but nothing compared to the user-hostility of Win8 (attempting to move users from 'real Windows' into locked-down dumbed-down touch-centric mobile-like app store hell), let alone Win11 (creating an e-waste mountain, then pushing AI slop into everything)
Are you just saying that because it has new glassy windows and is a resource hog? What is that different about Tahoe vs Sequoia?
- Apple Music requires one more click to pop out the MiniPlayer; the UI is worse and the click hitbox for the seek bar is too small
- Volume +/- now acts like a notification (top-right corner of screen and clickable). Horrible design decision (gets in the way of browser tabs)
- The "A > B > C" folder thingy at the bottom of Finder windows is gone, and the tabs' styling looks unsettling
- Weather (and Stocks, to a lesser degree) looks worse, lots of space wasted
I'm sure people noticed this issue internally and brought it up but some thing by some designer was seen as biblically sacred and overruled all reason.
I've been at companies where you get severely punished... sometimes fired for insubordination, for fixing an obviously broken spec from a designer emperor.
It's normal to go "I guess 2+2=5 here, whatever", as if the designer went into a tiny room and had a seance with the divine...
Yo, newsflash, everyone makes mistakes. Failure is when you force them to stay uncorrected.
If this is the click target area specified by the designer (or it was simply unspecified) then it's absolutely the designer's fault. I'm a UX designer and I've made mistakes like this before, though this one is pretty egregious because the issue is core to the interaction.
It's sometimes easy as a UX designer to forget to specify some of the smaller details (though this example isn't what I'd call a "small detail"), particularly because they're the kinds of things you don't notice when they work, and I don't have to implement it. The developer has to sit down and write code for what will or will not happen.
I've made mistakes in the past where in an mobile interface I neglected to specify the click target area for some controls. Typically the minimum clickable area we'd use was something like 44x44 but the visual was smaller than that, and I didn't specify it, so the developer made the visible element the one that would respond to the click events. It was too small and it caused issues. I owned up to that one, I didn't want to let the developer take the blame for that.
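The minimum-target idea described above can be sketched in a few lines. The 44x44 figure is the common platform guideline the parent mentions; the function names and coordinates here are made up for illustration:

```python
# Sketch (not any platform's actual implementation): expand a control's
# visible bounds to a minimum hit-target size before hit-testing, so a
# small visual element still accepts taps in a 44x44 region around it.

def hit_rect(x, y, w, h, min_w=44, min_h=44):
    """Return a rect at least min_w x min_h, centred on the visual rect."""
    pad_w = max(0, (min_w - w) / 2)
    pad_h = max(0, (min_h - h) / 2)
    return (x - pad_w, y - pad_h, w + 2 * pad_w, h + 2 * pad_h)

def hits(rect, px, py):
    """True if point (px, py) falls inside the (x, y, w, h) rect."""
    rx, ry, rw, rh = rect
    return rx <= px <= rx + rw and ry <= py <= ry + rh

# A 20x20 icon at (100, 100): visually small, but taps just outside
# its visible edge should still register.
r = hit_rect(100, 100, 20, 20)
print(hits(r, 95, 95))    # True: inside the padded 44x44 target
print(hits(r, 200, 200))  # False: well outside
```

The point of the sketch is that the padding is applied at hit-test time, not to the visual element, which is exactly the detail that gets lost when only the visible bounds are specified in a design handoff.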
I've also been fortunate enough to work with developers who would notice these things and then ask me if it was intended and whether they should increase the clickable area. I was always so grateful to have colleagues like that, and I'd always offer to set some time aside to come take a look at things on their local environment before they moved things forward just to catch any issues where they could immediately fix it instead of having to push fixes later on.
I don't know where the failure happened at Apple, but based on what I've seen from "Liquid Glass" it's clear there's some real institutional failures involving either the design leadership, the development leadership, or somewhere in between both. It's really quite embarrassing the quality of GUI and UX that has come out of Apple recently.
This is the first time ever where the hurdle of rolling back my iPhone to an earlier version of iOS feels worth the effort. I disabled as much of the liquid glass effects as I could because I found it difficult to read and now it all looks like shit, whereas before I could read it and it looked nice.
In the past it was "given" by the desktop env; now it's all rebuilt in Material or some other design, but without any advanced behavior - it only "looks good on a static screen".
https://en.wikipedia.org/wiki/Phosh
And amusingly Ubuntu uses Qt for its phone clone of Gnome.
Visual consistency means that your app looks as similar as possible across platforms. Regardless of those platforms' native UI. It's the bad kind.
UX consistency means that your app behaves the same across platforms, but adopts their style and conventions. You actually want this.
i also hate this "consistency" idea. i was working on a mobile app for android/ios, and a requirement was for the apps to look identical on both platforms. whyyyyy. sure, for the designer it looks nice, but as a user who uses either ios OR android, i'm used to the conventions of that particular platform. why throw all that away just to look identical on both platforms?
I bet designers aren't at fault here either because Liquid Glass violates at least three rules of design every second that passes.
https://www.vitsoe.com/us/about/good-design#good-design-is-i...
Instead OP mentioned "visual artists"; I agree. Liquid Glass is an art show; something that belongs in the realm of concept cars, not on the road.
The huge corner radius is one thing I do wish they reverted in Mac OS.
It's kinda the rule for programmers too.
The ones who went to a small liberal arts school you've never heard of and took up programming as a second career are usually more effective to work with than the Stanford/MIT crowd.
The problems start I think, when you have an expectation that your collaborators are somehow either superhuman or subhuman and not peers.
Humility and mutual respect gets things done.
I've worked with some younger designers who couldn't even put together a consistent click-dummy once the client wanted to see flows outside the happy path. To be fair, all they really had to go on was their education and Figma's panels.
I stopped looking at the educational background years ago in a fear that it would influence my bias either way. We shouldn't base someone's suitability at 40 upon what opportunities they were afforded at 17.
I do have a somewhat prestigious pedigree btw. I removed it from my resume around 2010 and never looked back
> I'm sure people noticed this issue internally and brought it up but some thing by some designer was seen as biblically sacred and overruled all reason.
Funny how Apple went from Jony Ive sacrificing hardware usability for "beauty" (touch bars and butterfly switches) to Alan Dye mucking up macOS and iOS with Liquid glAss.
In that sense a touch bar in addition to function keys would be very nice, because it's a much smoother way to adjust volume.
Adoption engenders development, and development engenders adoption. All of the best use cases of a touch bar are ones we would have seen had such a virtuous cycle been allowed to occur.
I agree with the sentiment - making a control surface that adapts to the user's current task makes total sense to me, and is a compelling feature in theory.
The execution (and how the touchbar differs from the Stream Deck) is where I think the argument falls apart. There is effectively zero ability to navigate the touchbar without using your eyes and taking your focus off the display, and your work. The Stream Deck can easily be used without looking. A static grid of real buttons whose function changes within context is a more useful implementation in the real world, even though it is technically _less_ capable.
IMO the touchbar concept is flawed in exactly the same way as the modern car user interface.
As a user of the touch bar, I _hated_ having to look down from the screen, and move my hands away from the keyboard home row / touchpad area, _all the way up_ to the touchbar area to finally use it. It completely breaks the flow every single time. I don't think just inserting physical Fn keys beneath it would have won me over at all.
I'm not familiar with the stream deck, haven't even heard of it until just now.
I don't get it. I have medically tested 120% color vision (it was a lengthy test), so there's definitely nothing wrong on my side, and I don't understand at all what the designers and coders are seeing that makes them think this is a great idea. The difference between the pixels is objectively bad - you can take a screenshot and compare the background pixels against the text pixels.
You want your designers to have accurate color reproduction for obvious reasons, but they should be testing their work on shitty monitors, too.
I don't know, I conclude the opposite. If you need accurate color reproduction when you publish online, you are doing something wrong.
I used to co-own a small digital printing business, so I'm aware of what all of it means, and I had an appropriate monitor myself and a paid Adobe Design Suite subscription.
But for the web, when our setup is too good it's actually a detriment. It is predictable that you end up publishing things that require your quality of setup. There is a good reason not to bother with a high-quality monitor suitable for serious publishing and photo/video editing when you only do web things. Which is exactly why, when I bought my last monitor - for business work, coding, web browsing and other mundane things - I deliberately ignored all the very high quality displays, even though the company would have paid for whatever I chose. It is not an advantage for that use case.
It's a throwback to BDUF.
If they aren’t willing to try out their design and find issues with it, or be open to feedback from others, they’re incompetent.
Looking at the non-tech people in my life, exactly ONE had a positive initial reaction after installing ios 26. Do these people at apple not do “normal” user testing?
Computers are faster than ever, every task other than UI rendering is finished faster than ever, but these geniuses keep slowing down the UI with every update. It's criminal.
When Windows went to a 1 pixel border and shadow effects, it still had hit testing in a region around the window to account for that. No idea what they're doing with rounded corners in Win11.
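A rough sketch of that idea - an invisible band around a thin visible border that still hit-tests as a resize handle. The 8px band width and the function name are illustrative, not Windows' actual metrics:

```python
# Sketch of the Win32-style approach: with a 1px visible border, hit-test
# a wider invisible band around the window edge so resizing stays easy.
# (Real Windows does this in WM_NCHITTEST handling; this toy model only
# classifies a horizontal position.)

def edge_hit(px, window_left, window_right, band=8):
    """Classify a horizontal position against invisible resize bands."""
    if window_left - band <= px < window_left:
        return "resize-left"    # outside the visible edge, still grabs
    if window_right < px <= window_right + band:
        return "resize-right"
    if window_left <= px <= window_right:
        return "client"
    return "miss"

# Window spanning x = 100..500:
print(edge_hit(95, 100, 500))   # resize-left (5px outside the border)
print(edge_hit(300, 100, 500))  # client
print(edge_hit(520, 100, 500))  # miss (beyond the 8px band)
```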
The Expert (Short Comedy Sketch) https://www.youtube.com/watch?v=BKorP55Aqvg
Visual artists and graphic/ux designers weren’t exactly claiming for Tahoe either.
Going off spec is not the correct way to deal with this, and I could see how that might get you in trouble. It's counter productive.
Better choice is to escalate, but at some point you have to simply disagree and commit.
I disagree. Seems more like the group that implemented border radius at the OS UI implementation level did not work with the group that handles window sizing. Not everything is a conspiracy.
Most of the software creeping towards complete unusability devolves through non-practical appearance-tweaking bullshit, ruining usability while the functionality stays intact (apart from bugfixes).
The other reason for decay is overcomplication - piling ever more marginal things on top of the functionality heap - combined with sloppiness and rushing through things, but that's another discussion.
Did we reach a peak in software quality recently? Do things only go down from here? I have this growing itchy feeling. I feel obstructed, forced to jump through hoops, and even disgusted touching an increasing amount of software - much of it used for many, many years without trouble (i.e. I did not really register its usage; it did things silently and well, but now it's starting to jump in my face or kick my legs).
https://guidebookgallery.org/pics/gui/system/managers/filema...
https://guidebookgallery.org/pics/gui/system/managers/filema...
You can make them always on still. I've done so ever since their disappearing act started. It's not even much hidden, it's in the "Appearance" setting pane.
Of course we all make mistakes, but anyone who has made this mistake should really fix it!
Classic Macs were designed for the mouse or trackball. Modern Macs are designed for multitouch scrolling. When it's easy to get the scrolling infrastructure on demand, the desktop might not need the same click-first affordances.
It is configurable, right in System Preferences > General. (Or I guess it's "Settings" now on modern systems, don't know what menu it's in there.)
P.S.: seems like the setting still keeps the scroll bar on top of the windows content (e.g. a website), not outside of the content.
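The same setting can also be flipped from the command line via the `defaults` preference system; a sketch, assuming a recent macOS where the key is still honoured:

```
# Force scroll bars to be always visible (the "Show scroll bars:
# Always" option in the Appearance settings).
defaults write NSGlobalDomain AppleShowScrollBars -string "Always"

# Revert to the default automatic (auto-hiding) behaviour:
defaults delete NSGlobalDomain AppleShowScrollBars
```

Running apps may need to be restarted before they pick up the change.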
Just like iOS 7+, it is possible to position and layer interface elements in a way where the visual effects render a screenshot difficult to read, but in practice the elements are frequently in motion or, as you've already pointed out, easy to make move. That motion is what negates the layering problems, making visual occlusions rare, short-lived and easily resolved.
There is a certain unreasonableness in ignoring that reality, and also ignoring that there is a user setting to keep a full-sized version of the scroll bars always visible.
This isn't to take away from legitimate criticisms such as the issue with the resize hotbox not being updated to match the more rounded corners, but rather to highlight that not all online forum criticism comes from a bona fide place.
That changes the effort required to show useful information from zero to more than zero. Which, while it may not be a great quantitative change, is an enormous qualitative change.
Like Chesterton's Fence, it was there for a reason.
"At last (and at least) we have reclaimed that narrow vertical strip of real estate at the screen's easternmost vestige! Now to find a good use for it!"
The true annoyance is that in many cases explicitly enabling them does not restore the original functionality.
1. How trivial it is to resolve, and
2. The existence of an easily accessible user setting to enable the behaviour that you desire.
Fundamentally your complaint thus comes down to a gripe that the OS's defaults don't match your completely subjective idea of how just one of many OS elements should work.
Which raises such an interesting question, because of all of the UX behaviours present on macOS - this is your hill?
...extra padding?
*They're the same thickness as Aqua.
This is endemic now. Cinnamon does it by default and I hate it. I only managed a partial fix, and then I had to do more work per-app (especially Firefox) to make them behave.
Also, horizontal scrollbars suck. One thing later versions of Finder did well was adjust columns to minimize the presence of them.
We just don't need UI that big anymore. These days our cursors are much more accurate, from the magical Mac trackpad to high DPI optical mice, and we're 40+ years into GUIs so the limited number of people who opt-in to a full computing experience can already be expected to know the basics.
Yes Tahoe sucks, but going back to Aqua or classic MacOS would also suck, just in a different direction. If you actually spend time using classic MacOS and Aqua these days, man is it frustrating to get basic things done. Everything is so slow and you're constantly resizing windows to see what's in them. I own several Macs from the 80s-00s and they are really in need of the many quality-of-life updates that later MacOS revs added. On a modern Mac, enabling 'show scrollbars' gets you to a pretty optimal Finder experience, minus all the stupid Mac bugs and Tahoe nonsense like this article points out.
Scrollbars used to be invisible to me. They only bubbled up to my consciousness when I needed them, and then there was no friction in their use. Now I am having to think about them constantly. To me that is 'standing out'.
I do think that was better overall, and it's something I miss about Snow Leopard, but I can see why they changed it.
This was one of the worst things about MacOS and why they lost me as a user early on. I used to be a Mac Sysadmin for 3 years, and the awful window system (and Finder) made it a living hell. I still don't find much to like about the GUI part of MacOS.
(and yes Lion was garbage, first upgrade I skipped since Tiger, and definitely the first "what the fuck are they doing").
Here's a resizable window in Platinum that has a drag handle but does not have a status bar: https://guidebookgallery.org/pics/gui/settings/appearance/ma...
edit: I missed "Windows" in GP comment. Well let it be known that at least Platinum wasn't like this :)
For this and many other reasons, I just don't think the paradigm would work today. It's philosophically smart but limiting in too many other ways.
Relevant TA: https://web.archive.org/web/20090625152558/http://support.ap...
It was practical (just like clearly visible scrollbars).
And my conviction is that computers are primarily for practical things, not pretty ones. They can be pretty, but not at the expense of usability. This last part is increasingly and sadly untrue nowadays!
(P.S. scrollbars aren't even bad)
I remember a few years ago, people complained when Apple merely made the entire operating system uglier. (Something about a gradient on the battery?) A lot of people would talk hyperbolically ("apple KILLED macos!"), and that's indistinguishable to an outsider when an update like this brings other people out of the woodwork to say, "Hey, these changes are genuinely bizarre and absurd, what happened?"
So while it makes a lot of sense to grab inside the object to move it, IMO it actually makes less sense to grab _inside_ the object to resize it. (Imagine the reverse argument -- IRL you can actually grab the middle of the plate to move it, but if grabbing the middle of the window resized it, that would also be very bad.)
I've been trained to grab the edge to resize windows. So I wouldn't try to reach so far inside the rounded rectangle as OP, although it doesn't invalidate their entire argument.
A few things sorta do.
If you want to increase the size of saran wrap or aluminum foil, you grab the edge and pull. Same for increasing the size of toilet paper before tearing it off.
When you want to stretch your fitted sheet onto your mattress, you grab the corner and pull to stretch it over.
When you want to make your pizza dough larger, you toss it above your head in a circle - so that one doesn't really match the macOS gesture; I guess you should be spinning windows to make them bigger.
However, when you're doing other baking things, like placing fondant or a pie crust, you do stretch from the edges some.
Really wouldn't recommend it though, all sorts of consequences for the food (if present), your hand, the hygiene of the plate and potential damage to underlying surfaces. Generally preferable to pick it up and put it down again.
Apple seems to have forgotten its own innovations.
I suspect you must be very careful, overwhelmingly complete, and understandable when presenting this kind of stuff.
otherwise:
- if it isn't, the post might be buried in drama.
- or else, the post will be another obscure warning that nobody sees.
for me, I noticed that after the ios 26 "upgrade" I must continually forget, then re-pair, my bluetooth devices. But I don't have the benefit of a clearly understandable article that calls apple out.
Verily, the last UI redesign that was based on honest research and watching real users act was WinXP.
So then they're left with a conundrum: do they adjust the 19x19 region on a per-window basis, depending on the per-window corner radius, or do they stick with one standard drag region? Probably it should be the former, but that comes with its own set of issues.
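The tradeoff can be illustrated with a toy model - a fixed drag region versus one that scales with the window's corner radius. All numbers and names here are illustrative, not Apple's actual values:

```python
# Toy model of the conundrum above: should the corner drag region stay a
# fixed 19x19 square, or grow with the per-window corner radius so the
# visually rounded-off corner still responds to resize drags?

def in_corner_drag_zone(px, py, win_w, win_h, corner_radius, base=19):
    """True if (px, py) falls in the bottom-right resize region.

    The region is the base square, expanded to at least the corner
    radius, so a large radius doesn't leave a dead zone at the corner.
    """
    zone = max(base, corner_radius)
    return px >= win_w - zone and py >= win_h - zone

# An 800x600 window:
print(in_corner_drag_zone(795, 595, 800, 600, corner_radius=10))  # True
print(in_corner_drag_zone(775, 575, 800, 600, corner_radius=10))  # False
print(in_corner_drag_zone(775, 575, 800, 600, corner_radius=30))  # True
```

The per-window variant is just `zone = max(base, corner_radius)` evaluated per window; the cost is that the drag target size now varies between windows, which is its own consistency problem.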
In a word, it's hubris. It's not care about the user, it's not even care about market domination or setting a fashion trend; both have been flunked. It looks like somebody's ego needed an affirmation, or someone's grip on corporate power needed a demonstration. It's a bad, bad sign of a deadly corporate disease.
Considering how the Vision Pro rollout and the AI development went, I'm having doubts about Apple adapting to an AI world, e.g., fundamentally rethinking what hardware means if you no longer need to interact with a screen or hardware in a similar fashion anymore, i.e., keyboard and GUI manipulation.
But is Google better? Not really, they killed a lot of good products like Reader.
But is Facebook better? Not really, Cambridge Analytica and Metaverse and .. facebook products are disposable.
But I think these Apple UX bugs are misdiagnosed. Yes they are atrocious. But think about how atrocious and non-representative and non-competitive Apple’s testing population is.
But nobody from likely hundreds of people inside Apple involved in the project was able to effect a change towards sanity. I'm afraid many just didn't feel like speaking.
The circular self-congratulation of DEI introduces an intimidation factor where the objective and scientific truth is inherently no longer the basis for decision making because there are multiple layers of a kind of aristocratic privilege that cannot be questioned, let alone criticized, because critique of their actions equals critique of their divinity, i.e., becomes heresy.
So we end up with this point where no one pointed out the increasingly ridiculous reductions of the emperor’s clothes, only ever cheering on with positive affirmations, to the point that everyone’s intimidated to even point out the emperor is walking around stark naked.
I could see how a combination of the DEI intimidation tactics with the advent of AI, the harsh economic factors, and a general desire not to rock the personal-benefit boat could have resulted in institutional paralysis.
Is there anyone with a force of personality left at Apple? Ultimately, this is on Cook as the Chief Executive Officer poorly executing. It really makes you wonder if the leadership doesn’t actually use any of their own company’s products. How do you not notice these glitches immediately like everyone else if you are using them? I could see Cook not having even regularly used an iPhone or actively interacted with any Apple product himself in years as his real life Siris around him do every single thing for him every day all day, besides maybe giving him briefs on screens that happen to be iPhones and iPads. At that level you actively have to make choices to remain connected to the ground. I doubt Cook finds being grounded comes easy.
You say DEI then describe brown nosing. That's a thing that happens in any org.
In DEI-type con jobs, it is just administered by an odd and even contradictory hodgepodge of people/identities who are hijacking different characteristics, or, more accurately, manipulating characteristics differently.
Where brown nosing manipulates things like flattery, currying favor, narcissistic positioning, and sabotage, DEI manipulates things more like sympathy, graciousness, generosity, and empathy. At the same time it also employs psychological and emotional coping mechanisms, like gratuitous pride and affirmation, while making heavy use of rather harmful personality traits like shaming, blaming, blame shifting, cooption, purloining, and appropriation, and it emotionally abuses people generally.
It's all very common and typical of extreme and malignant narcissism, including the put on, fake self-congratulation you often see the DEI type crowd engaging in where they lay on the affirmation of equality and equity so heavily that it usually just highlights the contradiction with reality most people experience.
MacOS Tahoe has been heavily criticized for its UI decisions, especially Liquid Glass, which many people feel actively hurts usability rather than improving it. On the other side, Windows keeps piling on user-hostile features, dark patterns, and friction that increasingly frustrate power users and regular users alike.
Distributions like Ubuntu, Fedora, Mint, and others have mature desktops, solid performance, and fewer design decisions that get in the user’s way.
I honestly cannot remember another moment where both major desktop platforms were being questioned this openly at the same time. If Linux is ever going to take advantage of dissatisfaction at scale, this feels like it.
I agree, and it's likely that both macOS and Windows will continue to get worse.
That said, it's important to be realistic, because users can and will put up with quite a lot of discomfort before switching. For every bad feature or misstep, there are 100 others that are so good you don't even notice them. And when you switch, you start noticing all those other features you never noticed before, because they are now gone. Some of these features will be hardware, some OS, some application support; some of them you can fix and some you just have to get used to.
An approach I recommend is to add a Linux laptop to the mix. You can buy a used, powerful laptop cheap, install Linux on it, and try to use it for a while, keeping your other machines around. Chances are you'll find various trade-offs: Linux will NOT be a strict improvement; it will have downsides. Linux is particularly weak with power management and certain devices like fingerprint readers. Depending on the apps you use, it can be weak there, too. That said, Linux is very usable and easy to install, and you should try it. But I think it does people a disservice to imply it's better on every axis. It's better on some, worse on others.
All of these are longshots, but it really feels like we've hit a historic level of discontent.
People do have difficulty switching between Mac and Windows, but each has critical mass so it's still easy to get help with the finer details. And unrelated to fragmentation, anyone tech literate won't have nearly as many showstopping issues to ask about there in the first place.
Really? And windows does?
If you're not convinced: look at the difference between desktop Linux and Android. Although Android Studio seems to be a bit of a disaster nowadays, there's a lot of development support for Android, and it shows in the 1.6 million apps that have been built for it. Android has got what people crave: easy, slick, user-friendly apps, no technical hassle. It's an uphill battle, and at the same time, the focus is shifting away from desktop. So I think the year of Linux for the desktop will likely never come.
It's like the console wars — different camps say "our console is better, it has more teraflops." In reality, nobody cares about that — buyers will get the console that has the games.
Seriously, I think it depends if you're talking about business or home. For business, sure. For home—and this is quite relevant to the rest of your comment—I think it comes down more to gaming.
As for Linux: I also have a Linux laptop with GNOME for light gaming (Manjaro). It's alright, but a bit of a mess from a UX point of view. Linux was always messy on that front. But it works reasonably well.
The point with the distributions that you mention is that they each do things slightly differently, and I would argue in ways that are mostly very superficial. Nobody seems to be able to agree on anything in the Linux world so all you get is a lot of opinionated takes on how stuff should behave and which side of the screen things should live. This package manager over that one.
I've been using Linux on and off for a few decades, so I mostly ignore all the window dressing and attempts to create the ultimate package manager UI, file managers and what not and just use the command line. These things come and go.
It seems many distros are mostly just exercises in creating some theme for Gnome or whatever and imitating whatever the creator liked (Windows 95, Beos, Early versions of OSX, CDE, etc.). There's a few decades of nostalgia to pick from here.
Tahoe is tragically bad by almost every UX measure, and following various Apple subreddits I wonder if they just don't care anymore, since the majority of people are shocked by the amateurishness of both the bugs and the design choices in the latest update. This comes on top of literally every major bug being ignored from the alpha through release, with feedback continuing to be ignored afterward.
The UX group would present work to Steve J. every Thursday, and Steve quickly passed judgement, often harshly and without a lot of feedback, leading to even longer meetings afterward to try and determine course corrections. Steve J. and Bas were on the same wavelength, and a lot of what Bas would show had been worked on directly with Steve beforehand. Other things would be presented for the first time, and Steve could be pretty harsh. Don, Greg, Scott, and Kevin would push back and get abused, but they took the abuse and could make in-roads.
Here is my snapshot of Stephen from the time. He presented the UI ideas for the initial tabbed-window interface in Safari. He had multiple design ideas and Steve dismissed them quickly and harshly. My recollection is that Steve said something like "No. Next. Worse. Next. Even worse. Next. No. Why don't you come back next week with something better." Stephen didn't push back or say much, just went "OK," and that was that. I think Greg was the team manager at the time and pushed Steve for more input, and maybe got some. This was my general observation of how Stephen was over 20 years ago.
I am skeptical and doubtful about Stephen's ability to make a change unless he is facilitated greatly by someone else or has somehow changed drastically. The fact that he has been on the team while the general opinion of Apple UX quality has degraded to the current point of the Tahoe disaster is telling. Several team members paid dearly in emotional abuse under Steve and decided to leave rather than deal with the environment after Steve's death. Stephen is an SJ-era original and should have been able to push hard against what many of us perceive as very poor decisions. He either agreed with those decisions, or did not and chose to go with the flow and enjoy the benefits of working at Apple. This is fine, I guess. Many people are just fine going with the flow and not rocking the boat. It may be even easier when you have Apple-level comp and benefits.
My opinion: unless Stephen gets a very strong push from other forces, I don't see that he has the will or fortitude to reverse the changes that he himself approved in one way or another. Who will push him? Tim Cook, Craig Federighi, Eddy Cue, Phil Schiller? The perceived mess of Tahoe happened on the watch of all of these Apple leaders.
I’m asking you to judge people’s state of mind here, which is near impossible, but please bear with me…
> Several team members paid dearly in emotional abuse under Steve and decided to leave rather than deal with the environment post Steve's death.
Normally during an event like this there is a change in culture as well which I think we have seen under Cook. So why did they assume that the abusive situation would continue? Jobs was generally known to be harsh to the point of abusive, but if the situation did not change on his death maybe the abuse was equal parts cultural rather than just from the CEO, so why not leave earlier?
Some people left early, like Don Lindsay. Don was instrumental in bringing Aqua to life, along with Bas of course, and led the team up to and through the release of Cheetah and more. This task wasn't easy at all. To me it seemed like he was finally going to receive some reward for those hard years of work. But instead he chose to leave for Microsoft. This boggled my mind; leaving for Microsoft seemed incomprehensible to me. Maybe Don had had enough of the abuse? Maybe he was sick of the increasingly crowded commute? The daily visits from Steve pointing out every detail of the UI that bothered him? Did you know the UX team designed many of the big banners and posters for the WWDC events? Steve didn't want any old graphic designer to do those, so Bas, Imran, and others would work on them. Don had to deal with that too.
When Steve left to receive cancer treatment in 2004, he still had influence, Bertrand Serlet was running engineering, Jony Ive was focussed on industrial design. We were working on Tiger with the brushed metal interface and there was a lot of activity on that. Tim Cook was running the business, but Bas and others were keeping the ball rolling on the UX with remote input from Steve.
I wasn't around for the next two leaves of absence, the last one being final, but heard that things were becoming increasingly fractious, with camps emerging around Tony Fadell, Scott Forstall, and Jony Ive, and general political unpleasantness as Tim Cook was given various ultimatums about "I won't work with this or that person." Everyone was trying to say that they represented the vision of Steve and somehow knew what Steve would do given any situation. Geez, if we had known what Steve would do or wanted, a lot of really distressing confrontations could have been avoided over the previous years.
This type of internal sniping didn't happen with Steve around, or if it did, it wasn't very effective. I think it would have gotten you fired. Tony Fadell pushed it to the limit with Steve and Scott. I remember someone once asking Steve about getting free lunch at Apple, like you could get at Google and they were told "If all you want is free lunch, then you should be working at Google."
For me, there was a certain amount of clarity that came from Steve's abusive behavior. It could wear you down on one level, but it also brought focus and drive to getting things done. I think it was very unhealthy on one level and very exciting on another. There weren't endless meetings on calendars discussing minutiae. It also meant that the obvious horrors of Tahoe wouldn't happen. Steve himself would have grabbed the windows with different corner radii, stacked them up, and excoriated whoever was responsible. Some of my work was called "real bottom of the barrel shit" and "the worst he has ever seen," and I was told "this is not the way we do things at Apple." I assure you, what he was complaining about was nothing remotely close to what we are seeing in Tahoe.
Tim Cook has no taste and no sense of quality. He merely counts beans really well. Craig Federighi is responsible for the most precipitous drop in Apple's software quality since the late 80s and early 90s. Eddy Cue is responsible for some of Apple's worst software (music, iCloud, services), and Phil Schiller… what exactly does he do again?
It would be nice if veterans of the post-Steve era would post on here. Maybe they are scared, bound by NDAs, or couldn't care less. Like I said, I need some mental-health treatment about my time(s) at Apple. I was there working on Final Cut Pro after Be, went to Eazel, and then rejoined Apple as part of Steve's mass hiring of Eazel employees at the behest of Andy Hertzfeld.
I think this is just what happens to companies as they get older. Most of the people who pioneered the Human Interface Guidelines aren't at the company anymore, and management doesn't see much financial growth in Mac sales compared to AI and services.
It's probably the services (Care, iCloud, Music, and even TV), Apple's AI isn't on the overall map at all compared to the competition.
defaults write -g NSWindowShouldDragOnGesture -bool true
You can then use Control+Command+Click to move windows from anywhere inside them. Sadly this doesn't provide resizing. These are problems humanity solved over 35 years ago (see NeXTSTEP). Why are these designers breaking basic features that worked for over 35 years?
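For reference, a small Terminal sketch of toggling that hidden preference (the `NSWindowShouldDragOnGesture` key is the one mentioned above; running apps typically need to be relaunched before the change takes effect):

```shell
# Enable Ctrl+Cmd+Click dragging from anywhere inside a window
defaults write -g NSWindowShouldDragOnGesture -bool true

# Inspect the current value
defaults read -g NSWindowShouldDragOnGesture

# Remove the override to restore the stock behavior
defaults delete -g NSWindowShouldDragOnGesture
```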
I wish the average dev would recognise this.
Steve's brain fell out when he got back his throne at Apple. Aqua was a mistake.
That the roles got reversed became painfully clear when macOS copied the Windows Vista style popup mess for access permissions.
Windows Vista may have been plagued by programs assuming administrator access for everything but at least it isolated the security prompt.
You can verify that you're interacting with a real UAC prompt (by pressing Ctrl+Alt+Delete, for instance, which can be configured to be required before approving a prompt).
Any program can replicate the macOS security dialogs. You just have to hope that you can safely enter the password to your account into one, or activate TouchID when prompted.
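For anyone wanting the Ctrl+Alt+Del requirement mentioned above, here is a hedged sketch. To my knowledge this corresponds to the "Require trusted path for credential entry" policy under the CredUI key; the exact key and value name here are an assumption, so verify against your local Group Policy documentation before relying on it. Run from an elevated command prompt:

```shell
:: Require the secure attention sequence (Ctrl+Alt+Del) before credential prompts.
:: Key and value name are an assumption; confirm via gpedit.msc / policy docs.
reg add "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\CredUI" ^
    /v EnableSecureCredentialPrompting /t REG_DWORD /d 1 /f
```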
Designers probably have perverse incentives. Showy new designs get promotions. Even when they hurt usability, it's often only in insidious ways.
Do not hire visual designers as UX designers - unless you know what you're doing.
The best UX designers design to solve business and user problems and work within constraints.
"Socrates", in Plato's Republic
None of us are immune to cycles in fashion, and the need to differentiate ourselves and our work from what came before, even if what came before was pretty much a solved problem.
Maybe it's humanity's way of escaping local minima, or maybe it's an endless curse which every generation must bemoan.
I am. If it isn't broke, I don't fix it. And I suspect others are as well. The problem is that too many people are not immune, so it doesn't matter if some are.
For example, the constantly recurring critique that the music of the young is not about musicality[1] is always wrong. It's as wrong today as it was about Elvis.
In the back of my mind, I think part of this was the move to make scaling on high-resolution monitors (i.e. 4K+) work better, as a graphical border of a fixed pixel width will shrink proportionally, compared to a border that is as thin as it can be. For a while I've felt that it's a missed opportunity on high-res displays not to use more detailed art for window chrome, as pixel-wide lines will only get smaller and more difficult to distinguish; the minimize/maximize/close icons, for example, remain pixel-wide line art even at big scaling.
If the anchor point for window resizing were more inside the window, then you'd run into an annoying problem where you're trying to click or drag content but end up resizing your window instead.
The obvious solution is to just keep the old bezel that separate the content from the scroll wheels / resizing handles and make it visually obvious what you're doing, but apparently they think that's too ugly.
All the Apple engineers and other visual designers get quite defensive really quick when we mention that Tahoe really screwed things up, because it's more than just a transition into glass design, but a complete dismissal of design principles, to the point that the entire system is slowly becoming user hostile.
Every critique of the 26 series can be explained like this article with really in depth design principles, which is already engraved in Apple design guidelines, but Apple itself just dismissed it all. Everything from being able to clearly distinguish UI elements, to general accessibility, to discoverability, everything got worse.
Operating systems are among the most complicated systems we have created, not because they're a collection of processes and threads, but because everything is built on top of them, and creating something that's well thought out, stable, and intuitive is really hard. Designers randomly creating visual elements just because they look cool, without paying attention to how people are going to use them, is simply half-assing the whole thing.
That's still one of the reasons I believe Alan Dye was let go (fired, in a sense). He had power over the company, but with that power he screwed things up so much that we now have to rediscover everything related to usability in very high detail, as if we're reinventing the wheel, just to get back to square one.
Is there evidence of this?
When I try to reproduce the problem as shown in the article by resizing the Safari window currently displaying it, the drag cursor changes shape at the visible border of the window, not the shadow, and consequently dragging works as expected.
This might be an application- or driver specific issue, not necessarily a common Tahoe issue.
It definitely serves to prove that this is not a design-issue but just a simple bug and thus has at least some chance of being fixed.
FWIW, I cannot reproduce the issue demonstrated in the original article with any window of any application on my machine (M1 Mac Studio), but I thought that listing a very commonly used application alone would be enough to challenge the article's assertion ("the macOS designers are stupid because they make me do something that doesn't make sense in order to resize windows").
“As much as I like to *” is a common way to start a rebuttal (the subsequent “I’m not going to see/do that” is implied by that turn of phrase).
> but I thought that listing a very commonly used application alone would be enough to challenge the article's assertion
So it was a rebuttal? Why the disingenuous doublethink?
As for the fact that one cannot resize from inside the window, it makes absolute sense for every other corner of the window, where the user would instead be clicking an icon or some other button (try the top right corner of the finder, where the search button sits).
So, while I agree on the whole that Tahoe is a huge step backwards in terms of design, this seems like an odd gripe to point out, as it doesn’t in fact seem to be an issue at all.
Edit: clarification
if you check the screencast I posted, you'll see that you can indeed resize from inside the window. Not by a huge margin, but definitely from inside the actual window boundaries.
... great speed? Interpolating from the zoom, I would say it's not fast at all.
So if it was something that was learned whilst using the previous version, and worked, I'd argue it wasn't 'erratic'.
It seems to be common.
Additionally, it is hard on all developers (Apple included) to release updates for all of its many platforms on the same day, which IMO reduces software quality across the ecosystem.
(Apple also has the luxury of only supporting the latest OS versions with its software. Customers often expect third-party developers to support a wider range of OS versions and devices than Apple does.)
Still bitter that my 2006 Core Duo MacBook only had support up to 10.6 Snow Leopard but back then that was over 4 years of being able to use the latest OS, so comparable to four releases with the current cycles.
The only device I ever got Apple Care on and I got thousands in repairs covered for free. This was from before Apple would just replace the entire device.
All my other MacBooks have been trouble free luckily.
If cars were like computers, the steering wheel would be in a different place after every maintenance check.
Anyway, I'm on Linux, using Gnome Classic as my WM, and I don't have these stupid "everything is suddenly different now" issues.
Because if they kept it the same, then there would be no need to continue to employ all those UI designers. Therefore, to be assured of their continued employment, the UI designers have to make constant changes to justify their existence. Meanwhile, we get to suffer with their changes.
Companies don't build things to motivate having developers - Remember they are the "cost center", while sales are the creators of value. The developers are a necessary burden and would be axed as soon as they don't provide what is needed.
Old products are boring. New products are interesting. Customers like new things. Media writes about new things, and even writes negatively if updates are slow to come.
Compare to cars, skis, tennis rackets, even dishwashers; New Coke; a new Christmas special that isn't the same as last Christmas's. Things get new models every year or season, every six months, etc. We create newness not because it is really needed, but because it drives sales.
Moving to a once-a-year cadence guarantees Apple's products get buzz and sales repeatedly. And investors can predict when that will happen. All are happy. Almost.
On top of that, managing a huge redesign is a great career opportunity for everyone involved. The incentives are simply stacked in favor of doing redesigns for their own sake all the time. You need a clear minded top level manager to stop these kinds of ideas.
It's also an issue on Linux, to an extent. GNOME has a tendency of forcing UIs on users, and Ubuntu with Unity, now GNOME again, etc. Though, thankfully, since the user is free to choose their own desktop environment and window manager, it's not as pressing of an issue.
I realized many years ago that simpler UIs deliver the best UX. This is a large reason why I love the command line so much. Most programs have a fixed and stable interface, and can be composed to do what I want. For graphical programs I prefer using a simple window manager like bspwm on X, and niri on Wayland. These don't draw window decorations, and are primarily keyboard-driven, so I don't need superfluous graphics. I only need a simple status bar that shows my workspaces, active window, and some system information. I recently configured it with Quickshell[1], and couldn't be happier. I plan to use this setup for years to come, and it gives me great peace of mind knowing that no company can take that away from me. I will have to maintain it myself, but there shouldn't be any changes in the programs I use to break this in a major way.
[1]: https://github.com/imiric/quickshell-niri/tree/main/fancy-ba...
What bothers me about Linux is that realistically your entire DE will change at some point if/when the one you're staying on becomes too unsupported. They did this kinda recently at work, not cause IT wanted a fresh new look but because of some compatibility issue.
E.g., on my iPhone, filling in a password sometimes kinda blanks the screen while I'm trying to type it in.
My keyboard is absolutely terrible.
Lots of other little annoyances I can’t remember right now.
This window thing is another good example of just not enough thought being put into things.
I think one of the big issues is the autocorrect seems to make as many correctly typed words into random bullshit words as it does typos into correct words. So you feel like you’re walking on ice, just constantly monitoring what you last typed to make sure it didn’t make it into nonsense.
It seems like a clear regression in usability. By moving from a high-density, full-screen experience to a constrained, scrolling window, they've increased the interaction cost of launching apps via the mouse. It feels like a 'unification tax': sacrificing desktop utility to align with non-desktop modalities. Does anyone see a functional upside here, or is this purely aesthetic consistency?
Why would I want my dev tools, audio apps, 3-D-modeling apps, and office apps all jumbled together?
It's as if Apple is trying to catch up to Microsoft in the race to regress.
It wasn't a blunder. It was absolutely intentional to force users to start using the AI component.
I suspect someone probably pointed out no one would use it because launchpad has a better UX, so they removed it and forced the three finger pinch to launch spotlight.
I'm currently working around it as follows:
- There's a bug in Preferences where disabling "show home" also disables the three-finger pinch.
- I'm using AppGrid as my new Launchpad.
- I'm using BetterTouchTool to activate it with a three-finger pinch.
"Open the SD-card data-recovery application I installed a few years ago"
So elegant!
Oh look, right now on the HN front page is an article about why this sucks: https://tidepool.leaflet.pub/3mcbegnuf2k2i
And Apple's categorization is trash, a huge regression. There is absolutely no advantage to anything offered now over Spotlight, which not only allowed FASTER search (because it only searched applications), but allowed you to group applications as you saw fit (which didn't preclude an OPTION to have Apple do it).
It also allowed you to launch several applications faster, because it kept your last-used group open. For example, if I sat down to do some development, I opened my Dev Tools group and could launch the four applications I typically use together with only eight clicks.
Mac OS comes with something like 80 apps out of the box. I have over 200 on my system, and I'm pretty stingy with space. I immediately delete stuff I try and don't like.
So the noobs who know nothing about Spotlight typically come back with some absurd suggestion like "put a shortcut to the Applications folder in your dock."
Um.. no. Not even close.
I complained about it to a team mate and he thought it was fine and I was weird for using the app launcher and not cmd-space. Although on Windows I always use win-r to run stuff.
Tahoe UI changes and LG are such a mistake and Apple being Apple will probably just double down on it.
I much prefer the new app launcher in Tahoe, but it was created at the expense of Launchpad, which some people actually relied on. I don't know why they couldn't have kept both options.
You get worse icon pop-in if you add your app folder with grid view to the dock. These aren't stored on the network, so it's baffling they take so long to load the icons.
No. Launchpad is just the iOS springboard brought to Mac, with big icons and folders and pages. When it was added people complained of "iOS-ification".
This time they made a proper, unique Mac equivalent, integrated in Spotlight and built around the keyboard. It's not as good, the window was too small in 26.0, doesn't support uninstallation like Launchpad, but it's definitely less iOS-like.
In all my years using computers I have never been so disappointed so profoundly by a 36 gigabyte operating system upgrade.
Simply put: the pointer doesn't always switch context properly. So, you'll have it hovered over a resize control and it will refuse to change from the default pointer. Or you'll be working, and suddenly notice the pointer is a 'drag' one, even though nothing's being dragged and nothing draggable is active.
I would love anyone with any knowledge, especially an (ex-)insider, to shed light on this issue.
So maybe the pointer is not as tightly-coupled to the underlying UI components, so some scenarios can cause them to briefly lose track of each other?
(I say "maximised", even though that isn't the right term, because there is no right term. I don't mean 'full screen', since the borders are actually squared off properly in that mode, thank the lord; I mean 'full screen except for the global menu bar')
For practical reasons I am stuck inside Apple’s macOS garden, but I wanted to share a few things that at least make me feel content using macOS:
First, I have at least two VPS systems so via mosh/ssh/tmux I always have Linux dev environments, the ability to use throwaway VPS for sandboxing, etc.
Second, when actually working on macOS I stick with tools that make me happy: Emacs and terminal windows, a uv-based Python environment, and tuned-up Common Lisp, Haskell, and Clojure dev environments.
Anyway, I am just sharing my ‘macOS therapy’ - hope it helps someone here.
"Hi, I'm a Mac."
"And I'm a PC. Wow, you suck, Mac. What the hell happened?"
"Yo momma, PC."
<wild gesticulating and arguing ensues for 20-30 seconds>
"Hi, I'm Linux. Neither of these people care anything about you. You see, you're not their customer anymore. When you're ready to make computing personal again, check us out."
The ad should show something people want, not vague promises of being their customer or personal computing (a term essentially unknown by the new generations). Show something the new machine can do that the competition can't - built-in adblocker, cross-compatibility with Mac and Windows apps via VMs/rented servers, etc.
The price of renting a billboard isn't going to cover more than a week's worth of those people's fees. Billboard-induced shame has actually much more chance of succeeding.
The whole point of FOSS is that a single group's decisions or opinions do not dictate your computing. There are plenty of forks and alternatives.
It's also hard to measure the quantity and genuineness of bitching online because people complain about everything and there's an inherent incentive online to complain to bring in ad revenue regardless of how genuine it is.
But it's a direct and unmistakeable sign (to you and your peers and colleagues) when someone paid actual money to rent a billboard just to remind you how much you fucked up.
Steve Jobs would have had a fit over this product line. As '97 era Jobs put it, "The products suck! There's no sex in them anymore!"
My modest proposal for Apple diehards (especially employees) is to feed all the data that exists on Jobs into a multi-modal model so that Apple can hear just how much their shit sucks from Jobs' digital ghost.
A good starting point would be the https://stevejobsarchive.com/
Windows is going down a strange path, where its productivity is suffering because Microsoft measures success in terms of Copilot adoption. Apple is stuck trying to invent the next iPhone, but in the meantime they're trying to make the iPhone sexy by slapping on a new skin. Then they forgot about macOS and quickly moved some stuff over from the iPhone. Neither product apparently has UX designers anymore, and QA is meh.
I don't understand either company. Both used to have talented UI/UX teams and actually listened to them. Is it really just short-term stock-price thinking that makes them both forget that their operating systems should be about productivity and user ergonomics?
Half of humanity is not very smart. Once you've sold computers and software to everyone who is smart, you have to sell to the not smart half. And that not smart half isn't going to like or even be able to use complex software. Since there are far more people out there simply consuming things and few people creating things, the bias is going to be for the simpletons.
Software and technology went from being a productivity tool to an ad delivery vehicle (or delivery vehicle for whatever bullshit is en-vogue like media subscriptions, AI, etc - that ultimately sooner or later comes back to ads).
Turns out you don't actually need much UX or design when the product's productivity capabilities no longer affect your bottom line.
My question is what those people think will happen when the transition completes and everything has fully become an ad-delivery machine with no productivity features. Ads only work as long as people have disposable income to spend on the advertised products/media, and they won't have any money if you break the productivity tools they used to make that money. Ads can't work if the entire economy becomes ads.
Enter "Lickable Pixels" -- the phrase that stuck to describe the Aqua era.
Introducing Mac OS X's Aqua interface, Jobs said at Macworld in January 2000: "We made the buttons on the screen look so good you'll want to lick them."
https://en.wikipedia.org/wiki/Aqua_(user_interface)
Then there was the red hot irresistibly sexy and well designed IBM Thinkpad TrackPoint -AKA- Keyboard Clitoris -AKA- Joy Button, and IBM's explicitly lascivious "So Hot, We Had To Make It Red" ad.
https://www.reddit.com/r/thinkpad/comments/hodidb/so_hot_we_...
Ted Selker, the inventor of the TrackPoint, told me the story of how that ad got written and refined by focus groups: He slyly suggested the slogan, and IBM's ad designers begrudgingly put it on the page in small text in the corner, below the photo and ad copy. Then they A/B tested it with the text a little bigger, then a bit bolder, then even higher, and it finally worked its way up to the top of the page in BIG HUGE BOLD TEXT!
More about Ted's work:
https://news.ycombinator.com/item?id=34425576
Ted Selker fondly reminds me of "Mr. Lossoff" the "Button Man" in "A Nero Wolfe Mystery” episode “The Mother Hunt”, where Archie drops in on "Mister Lossoff’s Distinguished Buttons” in the garment district of New York:
https://youtu.be/h-QgWOSVKm4?t=724
He's totally THAT enthusiastic, a distinguished expert fiendishly obsessed with buttons! He even carries around a big bag of replacement Joy Buttons that he hands out for free like candy to anyone who’s worn theirs out.
I know this from personal experience: Ted and his wife Ellen once ran into me working on my Thinkpad at some coffee shop in Mountain View, and Ted noticed my worn out Joy Button. He excused himself to run out to his car to fetch his Button Bag, while Ellen smiled at me and rolled her eyes up into her head and shrugged, and we hung out and talked until he got back. I really appreciated a nice new crisp one with fresh bumpy texture, because mine was totally worn down, and it made his day to get rid of a few. (I imagine their house has hoards of boxes and piles of bags full of them!)
The common thread: design that makes you come. Back for more, that is. Buttons to lick till they click. Nubs to rub till they're bald. Products you touched obsessively until they're worn smooth. Tahoe gives us clownish corners we can't even grab. Apple dropped the ball -- and frankly, it's a kick in the nuts.
This is standard in Gnome and was a must for me when I switched to macOS for work.
cmd+option+f = maximize to fill entire screen
cmd+option+ctrl+left/right = move window to other monitor on left/right
I occasionally use cmd+option+left/right if I need to have two windows side-by-side on the same monitor.
MacOS window sizes have always felt weird to me - no easy way to maximize without making it go into full screen mode.
As I was writing this, I just realized that hovering on the green traffic light shows a menu to choose some window placement options.... not sure how I never realized this before, but even the "maximize" option there doesn't go all the way to the edges - weird.
I rarely use windows anymore, but just like you installed a tool to get this behavior.
This UI feature saves approximately 3 seconds on average for resizing windows. Plus, more importantly, it works more predictably and is an easier target to hit than a 2-10-pixel-wide line or square region.
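The "easier target" point can be made concrete with Fitts's law (my framing, not the commenter's): the index of difficulty of hitting a target of width W from distance D grows as log2(D/W + 1), so a wide drag zone is measurably easier to acquire than a few-pixel border. A minimal sketch with illustrative numbers:

```python
import math

def fitts_index_of_difficulty(distance_px: float, width_px: float) -> float:
    """Shannon formulation of Fitts's law: ID = log2(D/W + 1), in bits."""
    return math.log2(distance_px / width_px + 1)

# Compare acquiring a ~5 px resize border vs. a 100 px drag region,
# both from 500 px away (illustrative numbers, not measurements):
for width in (5, 100):
    bits = fitts_index_of_difficulty(500, width)
    print(f"{width:3d} px target: ID = {bits:.2f} bits")
```

Roughly, each extra bit of difficulty costs a fixed increment of pointing time, which is where the "saves a few seconds per resize" intuition comes from.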
I have no idea what Apple were thinking, this OS is basically unusable, and extremely ugly.
I hope I won't somehow be forced to upgrade at some point.
Apple needs to start thinking about their users again instead of shareholders.
And it's not just Tahoe. The various iOS/WatchOS updates from the fall are all broken in one way or another.
For example, WatchOS's music app can't play more than 2-3 songs from a downloaded playlist without crashing.
The WatchOS Outlook app won't launch (which also means the watch face complication is broken).
iOS Safari's search bar/address bar periodically freezes after you enter a search term. If you click the bar, the search term disappears, so you have to re-type it.
All that said, I REALLY would love to have a hotkey combo I can keep pressed down to resize anywhere over the window. Just like in many Unix/Linux window managers.
So I agree it's strange that the drag zone extends so far beyond, but that's not really something to complain about...? Everywhere inside the corner where it feels reasonable to resize, it resizes. The article expects an absurdly large drag zone on the inside.
Again, the large drag zone outside the corner is kinda weird. But honestly that's more just an understandable artifact of the corner drag zone being a square. If it were me, I probably wouldn't bother to round off one corner of the drag zone either.
There's a lot of stuff to criticize about Tahoe, but this would be about last on my list...
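A toy hit test makes the square-zone behavior concrete (illustrative geometry only, not Apple's actual implementation — the corner radius and zone size are made-up numbers): a square zone centered on the bounding-box corner accepts clicks that fall outside the window's rounded edge, which is exactly the "weird" outside drag area.

```python
def hits_square_corner_zone(px, py, corner=(0.0, 0.0), half=20.0):
    """Square resize zone of side 2*half centered on the window's
    bounding-box corner (corner at the origin, body toward negative x/y)."""
    return abs(px - corner[0]) <= half and abs(py - corner[1]) <= half

def inside_rounded_window(px, py, radius=26.0):
    """Is the point inside a window whose corner arc of the given radius
    meets the axes at the origin? (window body toward negative x/y)"""
    if px > 0 or py > 0:
        return False  # beyond the window's bounding box entirely
    cx, cy = -radius, -radius  # center of the corner arc
    if px > cx and py > cy:    # in the corner quadrant: test against the arc
        return (px - cx) ** 2 + (py - cy) ** 2 <= radius ** 2
    return True

# A click diagonally outside the rounded edge still lands in the square zone:
point = (-4.0, -4.0)  # outside the arc, inside the bounding box
print(inside_rounded_window(*point))    # False: visually outside the window
print(hits_square_corner_zone(*point))  # True: still triggers resize
```

Rounding off the zone would mean swapping the simple bounds check for the arc test, which is presumably the extra bother the parent comment is alluding to.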
defaults write -g NSWindowShouldDragOnGesture -bool true
This only supports moving around windows, not resizing.
So here I am, random hacker news links verifier.
I scrolled to the image below the text "So, for example, grabbing it here does not work:" and reproduced the issue, with a small caveat: moving the cursor just 1 (ONE) pixel to the right turns it into the "diagonal resizing mode" cursor. Overall, the resizing area of the window corner is comfortably bigger than the author draws. Dragging empty space outside the rounded corner is weird, but what isn't in today's user interface designs?
All in all, I have never experienced difficulties resizing windows in macOS.
I miss the days of Windows 95/98 and Mac OS 9 (as do some other commenters here), when OS UIs were designed by humans, for humans, and everything was explicitly clear, including the area for window resizing.
Double click any side or corner to move it to the edge of the screen, and hold down option to make the effect symmetric.
I just found out today that hovering over the green traffic light icon shows an arrange menu... but the "maximize" option there leaves some padding on all sides of the window - weird.
I swear by https://rectangleapp.com/ - same outcome but with keyboard shortcuts instead of the mouse.
I put a Teams meeting on my second monitor. I put Teams on my first monitor. I minimize Teams to look at something in a browser on the first monitor. The Teams meeting on the second monitor minimizes, too.
Mac window management UX is dogshit in a lot of different ways. There are a lot of problems that I either have to just deal with, or try to find some third party app to solve in lieu of Apple actually caring about UX again.
I doubt Apple ever really cared about UX. It took 24 years after Microsoft's Windows 2.0 introduced resizing a window from any edge for Apple to finally implement it, in Mac OS X Lion in 2011. Apple UX is ridiculous.
If they cared about UX, they'd throw out their "HIG", hire some competent people, and start over.
Why is the first item on the first menu of every software program "About this software"? Is it because the thing every user does most frequently is check what version of the software they are running? Apple specified this in their "HIG" long ago, and it's been stuck there ever since. And it's completely stupid. MS Windows applications typically have "About this software" as the last menu item on the last menu, which is objectively a far better place for it than the first thing on the first menu, since it is rarely needed when using an application.
I still operate off muscle memory, so it's not actually easier or harder, of course.
Also, for anyone reading this who hates the general aesthetic, go into Accessibility and hit "reduce transparency". This has been a desirable setting for the last few macOS versions.
Edit: despite all the negative feedback, I’m quite happy with Tahoe and I enjoy the visual changes. I think some of the subtler changes are more intuitive, and Spotlight’s improvement is quite nice.
I updated, carried on enjoying the best desktop experience (IMHO). It's not perfect, but was and remains better than the alternatives for me. Very little "struggle".
Out of all the things, the UX I cannot forgive is:
1. Hold Siri button
2. say "Create appointment at 3PM tomorrow."
The result is that no alert/notification/warning of this appointment occurs, unless I open the appointment and create the alert manually, at least at time of event. I cannot imagine any use case where one would create an appointment that required no reminder.
If I had created this appointment via Gmail or even Outlook, and synced... then there are notifications.
My point here is that the UX rot at Apple is not new. I am curious as to how this rot begins at BigOrg, and how it can be cured, if it can be at all. I have never worked at BigOrg, so I really don't get it. Is there some missing UX role in the c-suite? How does my gripe, or Tahoe... ever happen? I understand how it happens at MSFT, but is this just what happens at all BigOrgs, eventually?
However, can you please explain to me the use case of "Siri, create an appointment at 3PM tomorrow" - where I would want no alert, at time of event, at the very least? I am pretty good at imagining edge cases, and I cannot imagine even one.
I have never been more upset at a default setting. I want to name and shame, and worse. Who made this call, a hippo? Think of the lost productivity at scale. "It just works UX" was supposed to be the entire point of Apple.
That's only speculation, but that's why it's a setting. You can have it either way.
Sorry, I have been spinning out on this for a while. I might be ridiculously upset about this. But, remember what Jobs said about boot times at scale?[0] Well...
Let's face it, the new glass UI is stunning - not to everyone's taste, like everything in art - but it has the Wow effect. Fresh look, transparency, new colors, wow! Same goes for many, though not all, web sites, apps, etc.
On the UX side, with some exceptions, it is a disaster, though. Why on Earth would I want barely readable text behind a semi-transparent panel? Windows that only use 90% of the OLED screen I paid for? Do I want every web app to invent its own navigation? Not in my worst dreams.
I like the new UIs; the designers do an excellent job. Now, we must also bring back the UX people! Real user-oriented UX, not dark-pattern UX that tricks users into signing up for services they don't need. It's a pity; the latter actually killed the UX domain, I think.
`NSWindowShouldDragOnGesture` setting allows you to drag windows at any point if you hold ⌃⌘
`defaults write -g NSWindowShouldDragOnGesture -bool YES`
Has text selection also changed? When I drag a block and copy it, I often find I've missed the first character. It's happening almost every time and I swear this wasn't happening to me before.
--- start quote ---
The utter user-interface butchery happening to Safari on the Mac is once again the work of people who put iOS first. People who by now think in iOS terms. People who view the venerable Mac OS user interface as an older person whose traits must be experimented upon, plastic surgery after plastic surgery, until this person looks younger. Unfortunately the effect is more like this person ends up looking… weird.
These people look at the Mac’s UI and (that’s the impression, at least) don’t really understand it. Its foundations come from a past that almost seems inscrutable to them. Usability cues and features are all wrinkles to them. iOS and iPadOS don’t have these strange wrinkles, they muse. We must hide them. We’ll make this spectacular facelift and we’ll hide them, one by one. Mac OS will look as young (and foolish, cough) as iOS!
--- end quote ---
At the time it was only Safari that they wanted to "modernize". Now it's the full OS.
For the first time in 15 years I am considering not buying Apple for my next phone/laptop, regardless of the specs.
Absolutely stupid design
This time the article is so good -- clear, funny, succinct, accurate -- that it's a better read than the comments.
We've spent billions. Are UIs a lot better off than Windows 3.1?
I guess I found BetterSnapTool first and it solved my issues with window management in macos.
I can't recall them all right off the top of my head, but I waited until 26.2 to update because of all the comments I saw about glitchiness, and this resizing issue is just one of the quirks I have noticed are still not resolved; not to mention that my M4 Mac has never crawled and locked up as often as it has since I updated to 26.2. But again, to put it in perspective, that's been very little hassle compared to what seems to be nothing but misery, suffering, and existential questions suffered by the wretched souls condemned to Windows.
Edit: another issue I have noticed in iOS is that now things like saving bookmarks in Safari is no longer a two step/tap process using long-press, it's a three step/tap process....WHY?? Same with "add to home screen". Also, the long press horizontal context menu (i.e., copy, paste) now does not slide left to reveal more options, it just changes mode to a vertical list. What is going on??? That's sickening...in my opinion. Horizontal, vertical? Pick one.
Second Edit: I just experienced another Tahoe glitch in at least Safari, where hyperlinks become un-clickable and the only way to resolve that is to seemingly restart safari. I don't find that acceptable in Safari of all places.
My biggest beef is that there seem to be a lot of bugs in Safari. If I open Discord and switch tabs, a few minutes later the tab is dead and a refresh doesn't work; you need to retype the Discord address again in the tab window.
In full-screen Safari, if I click on the share button by accident and don't pick any of the options, the address bar for that tab becomes uneditable.
In iOS, long-pressing a video would show options such as opening in a new tab or downloading the file. Now, for certain websites, the options show for a split second before it switches to the full-screen player.
There are many other annoying bugs but those are the most annoying ones.
BTW it's also amusing how not only iCloud doesn't flag a false Apple billing phishing message as junk but Apple """Inteligence""" will highlight it as priority. https://imgur.com/a/HaHxsUR
I get this on iOS26 all the time and it's extremely annoying since I don't always have the correct URL. Can't make heads or tails of what triggers it (I don't use Discord).
Safari 26 slowly leaks something on older macOS releases and opening new tabs or typing into the address bar becomes unbearably slow after a while[1]. The best solution is to downgrade to Safari 18.6 - though this seems to only work on Intel Macs.
Basically a total mess. I don't want to upgrade my MacBook to 26, but Apple seems to be embracing some dark patterns in their update dialog and I'm worried I'm going to accidentally upgrade and enter a world of pain one day.
[1] https://old.reddit.com/r/MacOS/comments/1nm534e/sluggish_saf...
I'm taken aback. Change the look, that's fair enough. But it should have some usability testing for this kind of thing before it goes out the door.
- moving windows without holding from any particular position
- resizing windows without grabbing a particular corner
Life changing small things.
That's when I realized there's no default hotkey for moving an app to an external monitor. That is absolutely wild. (Happy to be wrong)
For this kind of stuff Raycast is more than enough though. I use its window management features extensively.
I'm glad I saw this blog post. I'm not going to upgrade until stuff like this gets addressed.
I miss Windows 7 and OS X.
https://youtu.be/1fZTOjd_bOQ?si=BVOxUPjoULhwiclE
The tl;dw is that copying UX lets others invest energy in identifying the paradigm. Linux, which tends to be starved for resources, has historically been reasonably well served by letting Apple and Microsoft define UX, while Linux focuses on implementing it. However, those headlining companies haven't been investing in desktop UX excellence in recent years. It's time for open source projects to embrace experimentation and take up the mantle of cutting-edge UX, because Apple et al. aren't paving the way anymore.
For a long time, I've found that either full screen or tiling (driven by keyboard shortcuts) is a far less frustrating way to interact with windows, so I almost never resize windows. Window resizing is also horrendous when you try to do it with a touchpad.
Apple could fix it, but instead they made overlapping windows for iPadOS, which is even dumber considering the smaller display area.
I think it doesn't matter what they do; part of their clientele is fully captive, another part is only there for the status, and the last part is just using it rudimentally, so anything is OK.
BetterTouchTool + Alt Tab + TaskBar is my setup.
All apps used with any frequency mapped to keyboard shortcuts, mostly using right side CMD key. CMD C Chrome, CMD V VS Code, CMD T Terminal, CMD F Figma, CMD S Slack, CMD E Edge, CMD OPT A Activity MTR, CMD OPT CTRL F Firefox and not that many more.
And then for windows, it's left side CMD OPT ENTER maximize, CMD OPT LEFT left screen, etc etc. And then others for quadrants. If I need to multiwindow, it's rarely more than 3 things, and most of the time it's just 2.
Having cmd-tab behave like Windows-style alt-tab is huge. Many of these things aren't directly related to window management, but I find myself not thinking about it at all, when I used to think about it all the time with Macs.
iOS on iPad has split screen mode now. It's pretty decent. Wouldn't defend it tho.
Something like that really needs to be implemented at the OS level to be truly competent.
TaskBar is cool but I'm not a fan of needing to have something like a dock constantly taking vertical space while we have the menu bar on top. After years of using both, I think Microsoft decision to bundle both into a single taskbar is just much better. And the menu bar is annoying when you use multiple monitors.
I personally use Moom for windows layout, which gives you something close to PowerToys and is pretty decent. Still, it is an added utility app that you have to pay for; Apple should ship it with the OS at this point.
iPadOS is almost irrelevant for productivity anyway; there are too many flaws and limitations, and all the software is expensive subscriptions. My point is that they had a blank slate to come up with something better but still decided to just copy what they already had, which completely defeats the purpose of having a newer platform/OS.
In the end, many of Apple's UI decisions looked good when computers were simpler, but nowadays they show their limitations heavily. Windows has many architectural flaws and not as good software for some niches, but the workflow feels better.
Linux desktops have great ground-up support for tiling window management, whether as an optional behavior (Gnome/KDE/ChromeOS), or strict tile enforcement (i3/xMonad).
> I think it doesn't matter what they do; part of their clientele is fully captive, another part is only there for the status, and the last part is just using it rudimentally, so anything is OK.
It's too bad even technologists often fall into those categories these days.
There are not a lot of technologists left, and they are dominated by the crowd anyway. Apple used to cater more to people who really liked computers, but now they try to sell to everyone, so the objectives have changed a lot, sadly.
In my head, I can hear GPT laugh hysterically while it explains to me that I can just continue to use alt-SPACE to bring up the MOVE system menu if I am overwhelmed, while it gleefully assures me that MSFT has 'no current plans to get rid of that feature' (which we know is Kremlin-speak for 'the system menu is NEXT, brother').
defaults write -g NSWindowShouldDragOnGesture -bool true
This allows you to drag windows around by grabbing the window anywhere you want while holding ctrl + cmd. This comment explains how to do it: https://news.ycombinator.com/item?id=46585551
I have often felt like GNOME is the most Apple-y of desktop environments; they're very form over function. Not surprising to me at all that both would pick a design that seems beautiful until you try to use it.
Now I guess it's Apple's turn.
Latter versions significantly improved it.
Any reason why you're not using it?
In the next generation or two iPads and MacBooks are going to essentially merge as a product line.
I wouldn't be surprised if Apple abandons classic macOS (w/ Terminal and a filesystem) all together. To continue to support developers all they need is a tweaked Xcode for Apple dev and their version of WSL for everything else. All the parts are already in macOS/iPadOS (native virtualization and containerization).
iMac and Mac Pro are all but dead now too. Mac Mini and Mac Studio will be the only desktop options and will be bought by people who are Millennials and older, or ML/AI practitioners. We may even see a special AI/local-LLM Mac Studio that would be the equivalent of the Mac Pro for the AI era.
Your fingers will need these big round edges to grab. They may let you use a bluetooth mouse but they aren't going to cater their UX to you.
The year of the Linux desktop has come as commercial desktop OSes die.
I think it all boils down to this one question. It’s not complicated.
The resize corners grab area is also very frustrating though.
And I do not get why people so upset with Tahoe. I really really love it.
the cherry on top is the delay between the drag start and the window beginning to resize
Edit: Oh, and the "beauty" is in the eye of the managers.
Third-party apps have to use the Accessibility API, which was designed for screen readers, not window manipulation. Some windows simply refuse to be resized below certain thresholds, and there's no way to query the minimum size in advance. You request 500px width, get 800px back, no error.
The real question is: will Apple ever provide a proper public API, or will this remain a cat-and-mouse game with Accessibility permissions?
He’ll show boring things like resizing windows because those things matter to you; and if he cares about resizing windows to this degree, then imagine what else this product has.
Apple today hides behind slick motion graphics introductions that promise ideal software. That’s setting them up to fail because no one can live up to a fantasy. Steve showed working software that was good enough to demo and then got his team to ship it.
If you use something long enough, you'll get used to its idiosyncrasies. Jobs would have clicked and dragged 10px away from the rounded corner here instinctively. This is why the owner of an old car can turn it on and drive away in a blink while his son has trouble: hold the accelerator 10% down, jiggle the key a little while turning it, pull the wheel a bit, ... it all comes naturally to the owner.
> This never happened to me before in almost 40 years of using computers.
> If you use something long enough, you'll get used to its idiosyncrasies.
Or you don't, and you get constantly annoyed by some basic thing that is broken (the owner of an old car would curse it every day when jiggling the key)
I think that kind of behaviour ought to be controlled by the green dot at the top-left of windows, not by some particular mouse movements.
There was a time when the changes to the mac UI were quite good, or at least not annoying. Sometimes it seems as though they are changing stuff just to change stuff.
I’m glad other people are pointing this out. At first I thought my eyes were going. It’s especially bad with the magic mouse for some reason.
Into the 2000s, the only way you could resize a window on the Mac was to drag its lower-right corner. That is it. NO other corner, and no edge. So if the lower-right corner happened to be off-screen because the window was bigger than the screen, you were kind of screwed. You had to fiddle with the maximize & restore gumdrops to trick the OS into resizing the window to make that ONE corner accessible. Then you had to move the corner, then roll all the way up to the title bar and move the window, then roll back down to the corner... until you had the window sized and positioned as you wanted.
When Apple grudgingly added proper window-resizing, it made it as obscure as possible. Since Apple remains ignorant of the value of window FRAMES, there is no obvious zone within which the resizing cursor should take effect. There is no visual target for the user. This has always made an important and fundamental part of a windowed GUI a ridiculous pain in the ass on Macs.
And as the author here notes, it has gotten even worse. Not only will the window often refuse to resize, but you'll wind up activating whatever app lies behind the window you're trying to resize... hiding the one you were dealing with.
Could it be that I just need to drag the window outside of the pane..?
They won't do perfectly circular windows, that would be crazy— but I think we all know they can go further than this.
I've only owned macbook laptops but have run Linux at work since 2002. The lack of cohesion and non-stop changes in Linux is just as tiring and this MacOS Tahoe stuff. Gnome 3 cared just as little for users. FreeBSD + KDE Plasma is pretty good now, but lacks feeling and design.
In fact I am not on Mac anymore because with every release there were more features I didn't use (because they only work within the Apple ecosystem) and more and more things that ticked me off. Eventually I decided it wasn't for me anymore, after being on the platform for more than 15 years. Oh well. I am very happy on KDE now.
Even worse: because the min/max/close buttons are all shunted into the top right corner, if you're trying to resize from the top right and you miss, you close the window.
Now I call it with --required, which only applies critical and security updates. Then I call it again with --safari-only, and one more time with --list to see what remaining updates are available. Frustrating (but sadly not surprising) that there isn't a way to apply all available updates excluding major OS updates.
Dye destroyed macOS. I don’t know what they do, but they have to backtrack.
I personally don't see this behavour on Tahoe 26.2 and an M2 MBP.
I upgraded my mac to Tahoe and I don't like any change to the UI that I have noticed.
I upgraded my phone the other day, thinking it was just an update to whatever it already had, and ended up with LG on there and it is a disaster. I enabled the 'more opaque' feature and it did almost nothing.
LG is an awful experiment IMO. I'd put it at worse than Vista (which I skipped) and Gnome 3 which didn't bother me because I don't expect anything from linux desktops. I also skipped Windows 8 so not sure about the ranking there. But I'd say it's that level of disaster.
This seems like a very strange thing to release for a company that's supposed to care about the details.
I know that macbook has been crushing laptop market with their M chip. Macbook is amazing for sure. I very much enjoy using it at work. But for personal computing, I need Linux setup.
Seems like most of the attention this is getting is from people wanting to grave-dance on Apple at any chance given.
Apple keeps being a hardware company unfortunately.
Their software is not nearly as good as it could and should be if they had real competition.
Windows/Linux is not competition since they dropped Boot Camp, because that implies switching to subpar hardware.
If there was a background window in that area outside the corner, would it receive the click event?
Just did a quick test in a VM, and it seems all of them. I.e. if you could resize the window, clicking that space (even if empty) brings it into focus. But then I also tested on Sequoia and the same happens.
It seems then that basically everything remained the same except for the visual presentation of the corner.
> everything remained the same except for the visual presentation of the corner
This seems like either a very well-researched change or a very shallow one.
Which just goes to show how holistic design is utterly lacking. They seem to think just swapping bitmaps is "UI redesign".
Funny enough, I never suffered from this because my mouse pointer has always been configured to be comically large. So I'd already had to adapt to an inaccurate click area for many, many years, of my own doing.
They keep asking me to upgrade to some new OS; I consistently keep telling it to sod off!
Please please please make this better Apple. Or just give us an option for square windows.
Also, years after reporting, you still need to pause typing for one second after switching keyboard language via keyboard shortcut, otherwise the original language stays selected.
Not to say this isn't the case anymore but
Is it possible for me to update to whatever was released just before "Tahoe", or will it just put me on that now?
It'll present you the Tahoe upgrade, but underneath in small print it'll show other updates, which you then have to open to manually select the 15.7.3 update.
And you really should keep up on the point updates because there's been a ton of major security patches since 15.4
My password is always incorrect unless I wait about 20 or 30 seconds. Once I have 'redocked' for the day, unlocking it subsequently doesn't have the requirement. But with every dock insertion, it comes back.
Considering how many people only buy a MacBook PRO no matter what they plan on doing with it, they really need to keep the actual salary-earning pros happy with it or else it’ll lose all credibility. A Mac in a recording booth has a look to it that sells well, but that aesthetic won’t last if you stop seeing them. Being an effective tool for the pro minority should honestly be the priority for MacOS, even at the cost of making it incongruous with iPadOS/iOS. *
* disclaimer: what do I know honestly haha, I’m sure they’ll print money anyway.
Still running Sonoma on my MBP and iOS17 on my phone.
It used to be in the middle of the screen and worked just fine. But then someone thought of putting it exactly where browser tabs usually are, and I _constantly_ find myself in a situation where I change the volume and then try to click on a tab that this UI is on top of. Then I need to move my mouse outside the UI, otherwise it stays there, and wait for it to disappear before I can change tabs. It's infuriating.
That should nudge users away from this rather primitive method of window resizing using tiny 19px corners, and instead set up a productivity app where you can use a full 33% of the window size (so conveniently huge! and of course customizable) to resize via an extra trigger (for example, using a modifier key)
(nice plate picture joke!)
"You shouldn't need this."
"You should use this more convenient way to fulfill your window resizing need!"
priceless
I almost never use a mouse for more than maybe moving a tab to another window.
So I am wondering, are people fighting using a Mac in the most effective way simply because of old patterns and habits?
> So I am wondering, are people fighting using a Mac in the most effective way simply because of old patterns and habit
"Most effective" doesn't mean "most intuitive". I don't want to learn keyboard shortcuts just to move or resize a window. That's the entire premise of graphical user interfaces.
The location of the drag region is either the 10px-or-so just outside the window (GTK apps), or just inside the window (I see this in Electron apps). On GNOME, anyway.
On Windows this is caused by the removal of the thick window border with Win10. It wasn't really removed, it was just made transparent instead, thus the drag region moved outside the visible window to avoid the content size changing (for backwards compatibility). Apps often end up in a broken state too, because if you eschew system decoration, you lose the invisible border (which you don't even know you have), and it's easy to end up with a 1px drag region.
It's infuriating, because of the issue the author highlights -- you try and grab the window corner and fail.
It's a sad state of affairs, and a great example of how the basics are going backwards on desktop.
Yes, but that is skeuomorphic design, which is old and ugly. We live in the era of anti-skeuomorphic design, where nothing makes any sense but it looks sleek.
Hilarious. Is Apple attempting to defy the laws of physics?
New Desktop is FreeBSD+MATE. Config is a pain initially but idc.
I don't really see/care where my mouse exactly is. If it is outside or inside the window. Once my cursor turns to resize cursor, I just start dragging.
but its size still makes me use scientific notation to write it in kilobytes.
I am calling on everyone (Apple, Google...) here to switch their mindset to: "how can we reduce code size?", "what can we get rid of?", "how small can my product be?"...
Set rules to measure everything in kilobytes and make your employees realize how big the numbers they are typing are.
If every company thinks like that and stops the madness for a year or two, we might be able to solve the main issue: obesity.
They were praised for their human interface guidelines, and yet they now break almost every rule. I appreciate things change but those guidelines haven’t even evolved they have just been ignored.
Have they truly innovated in the last 10 years? What capitalist reason is there for them to actually invest the manpower in the enshittification of the product experience? It feels counterintuitive. Maybe they are just too big to communicate internally?