386 points by humbledrone 2 days ago | 25 comments
  • humbledrone 2 days ago
    Some folks may have seen my Show HN post for Anukari here: https://news.ycombinator.com/item?id=43873074

    In that thread, the topic of macOS performance came up. Basically Anukari works great for most people on Apple silicon, including base-model M1 hardware. I've done all my testing on a base M1 and it works wonderfully. The hardware is incredible.

    But to make it work, I had to implement an unholy abomination of a workaround to get macOS to increase the GPU clock rate for the audio processing to be fast enough. The normal heuristics that macOS uses for the GPU performance state don't understand the weird Anukari workload.

    Anyway, I finally had time to write down the full situation, in terrible detail, so that I could ask for help getting in touch with the right person at Apple, probably someone who works on the Metal API.

    Help! :)

    • bambax 2 days ago
      > This is going to be a VERY LONG HIGHLY TECHNICAL post, so either buckle your seatbelt or leave while you still can.

      Well, I read it all and found it not too long, extremely clear and well-written, and informative! Congrats on the writing.

      I've never owned a Mac and my pc is old and without a serious GPU, so it's unlikely that I'll get to use Anukari soon, but I regret it very much, as it looks sooo incredibly cool.

      Hope this gets resolved fast!

    • my123 a day ago
      Did you try this entitlement? https://developer.apple.com/documentation/bundleresources/en...

      wonder if com.apple.developer.sustained-execution also goes the other way around...

      • humbledrone a day ago
        Thanks for the thought, unfortunately when running as a plugin Anukari is subject to whatever plist.txt the host application uses. I think that I did try that with the standalone binary at one point, but unfortunately I did not appear to take notes! That probably means I did not have success.
        • aldrich 20 hours ago
          Very cool work... and frustrating running into walls imposed by manufacturers, I imagine! I've also been working on GPU-based audio plugins for a long time and have done some public material on the subject.

          Just my two cents: have you considered using a server/daemon process that runs separately and therefore more controllably outside a DAW (and therefore a client-server approach for your plugin instances)? It could allow you to have a little bit more OS-based control.

          • Archit3ch 10 hours ago
            Do you have a link to your stuff?

            > have you considered using a server/daemon process that runs separately and therefore more controllably outside a DAW

            I'm slowly coming to the same conclusion, for audio plugins on GPUs.

    • vlovich123 a day ago
      Interesting post & problem. I wonder if the reason that the idea of running the tasks on the same queue fails is the same reason you have a problem in the first place: a variable clock rate means it's impossible to schedule precisely, and you end up aliasing your spin stop time against the ideal time, depending on how the OS decided to clock the GPU. But that suggests that maybe your spin job isn't complex enough to run the GPU at the highest clock, because if it is running at max then you should be able to reliably time the stop of the spin even without adding a software PLL (which may not be a bad idea). I didn't see a detailed explanation of how the spin is implemented, and I suspect a more thorough spin loop that consistently drives more of the GPU might be more effective at keeping the clock rate at max perf.
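
      The "software PLL" idea above can be sketched as a feedback controller that resizes the spin workload until its measured runtime tracks a target. The sketch below is a toy simulation, not anything from the post: the controller shape, the gain, and the fake fixed-rate "GPU" are all illustrative assumptions.

```python
# Toy software-PLL-style controller: resize a spin workload so its
# measured runtime converges on a target duration. All numbers and
# names here are illustrative assumptions, not from the post.

def adjust_spin(iterations, measured_s, target_s, gain=0.5):
    """Scale the spin job's iteration count toward the target runtime.

    A low GPU clock makes the same iteration count run longer, so the
    controller shrinks the job; a high clock does the opposite.
    """
    error = (target_s - measured_s) / target_s
    # Proportional update, clamped so one bad measurement can't blow up.
    scale = max(0.5, min(2.0, 1.0 + gain * error))
    return max(1, int(iterations * scale))

# Simulated GPU: runtime = iterations / clock_rate (a made-up 1 GHz).
iters, clock = 200_000, 1.0e9
target = 0.001  # aim for 1 ms of spin per block
for _ in range(50):
    measured = iters / clock
    iters = adjust_spin(iters, measured, target)

assert abs(iters / clock - target) / target < 0.05  # settles near 1 ms
```

      A real implementation would measure actual kernel completion timestamps rather than a simulated clock, but the control loop itself stays this simple.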
    • TheAceOfHearts 2 days ago
      I missed the Show HN, but the first thing that came to mind after seeing it was that this looks like it would lend itself well to making some very creative ASMR soundscapes with immersive multidimensional audio. I selfishly hope you or one of your users will make a demo. Congrats on the project and I hope you receive help on your Apple issues.
    • sunshowers a day ago
      Great post, I found the description clear and easy to understand. I've definitely run into the issue you're describing in other contexts.
    • Dlemo a day ago
      [flagged]
      • It’s technical to over half of programmers who don’t need to know these types of details about hw/sw interactions.
        • Dlemo a day ago
          It's about 'very technical'. If you can explain the problem in one basic sentence, it's not very technical.
    • aplummer 2 days ago
      Have you filed a feedback? Seems like the right next step.
      • bayindirh 2 days ago
        The post opens with the following TL;DR:, snipped for brevity:

        > It would be great if someone can connect me with the right person inside Apple, or direct them to my feedback request FB17475838 as well as this devlog entry.

        • sgerenser a day ago
          Feedbacks often go into a black hole unless either:
          1. A bunch of people file effectively the same bug report (unlikely here)
          2. An individual Apple employee champions the issue internally
          3. Someone makes a fuss on Twitter/X and it starts to go viral

          Sounds like the OP is trying to get #2 to happen, which is probably his best bet.

          • badc0ffee a day ago
            Another trick is to schedule some Apple engineer time during WWDC, and plead your case.
      • viraptor a day ago
        Feedback is as effective as creating a change.org petition to some politician to stop doing crimes please. You'll be lucky to get an acknowledgement that something's a real issue after months.
  • humbledrone 21 hours ago
    Hey everyone, it worked, I had a super productive conversation with exactly the right person on the Metal team! Thanks for helping me get Apple's attention. I didn't at all expect this amount of support.

    https://anukari.com/blog/devlog/productive-conversation-appl...

    • krackers 20 hours ago
      >While I can't share any technical details... The engineer provided some suggestions and hints that I can use right now to maybe — just maybe — get things working in the short term

      Great that you have a workaround now, but the fact that you can't even share what the workaround is, ironically speaks to the last line in https://news.ycombinator.com/item?id=43904921 of how Apple communicates

      >there’s this trick of setting it to this but then change to that and it’ll work. Undocumented but now you know

      When you do implement the workaround, maybe you could do it in an overtly-named function spottable via disassembly so that others facing similar constraints of latency-sensitive GPU have some lead as to the magic incantation to use?

    • mschuster91 21 hours ago
      Once again, HN has fulfilled its true purpose: cutting through the red tape placed in front of every large corporation's customer support.

      Congratulations and good luck with your project!

  • AJRF a day ago
    I’ve worked in two high profile companies with very prominent apps on the Apple App Store.

    The team we talked to at Apple never ever cared about our problems, but very often invited us to their office to discuss the latest feature they were going to announce at WWDC to strong arm us into supporting it. That was always the start and stop of their engagement with us. We had to burn technical support tickets to ever get any insight into why their buggy software wasn’t working.

    Apple's dev relations are not serious people.

    • waffletower 21 hours ago
      I am glad that your experience is not the rule, as the OP reveals above. However, I worked for a company about 10 years ago with a fairly prominent app. An update came out that absolutely destroyed its performance. At precisely the same time, a competitor launched an app which did not have the performance difficulty. It turned out that the developer of the competing app had recently left Apple, and had left an undocumented surprise in Apple's video drivers that broke our app. It took disassembling the competitor's binary to find the undocumented change and repair our application. The developer also taunted our CEO by email. Nice world we live in.
      • refulgentis 17 hours ago
        Wow. I know the thicket of contracts and such makes it not worth it, but I do wish behavior like this could be called out more directly
  • krackers 2 days ago
    >The Metal profiler has an incredibly useful feature: it allows you to choose the Metal “Performance State” while profiling the application. This is not configurable outside of the profiler.

    Seems like there might be a private API for this. Maybe it's easier to go the reverse engineering route? Unless it'll end up requiring some special entitlement that you can't bypass without disabling SIP.

    • bambax 2 days ago
      There has to be a private API for this; the post says:

      > The Metal profiler has an incredibly useful feature: it allows you to choose the Metal “Performance State” while profiling the application. This is not configurable outside of the profiler.

      How would the Metal profiler be able to do that if not for a private API? (Could some debugging tool find out what's going on by watching the profiler?)

      • bambax a day ago
        Lol I just read the parent comment without noticing that they were quoting the exact same sentence from the blog! ;-)

        Sorry about that!

  • LiamPowell 2 days ago
    The problem with exposing an API for this is that far too many developers will force the highest performance state all the time. I don't know if there's really a good way to stop that and have the API at the same time.
    • grishka a day ago
      There already is an unending number of ways for just one app to waste charge on battery-powered devices. It all already relies on developers not unnecessarily running energy-intensive tasks, either intentionally or accidentally. Adding one more API that has the potential to waste energy if not used appropriately will not change that.
      • madeofpalk a day ago
        macOS also has a bunch of mechanisms to inform the user about this! IIRC the battery menu has entries for apps draining a lot of power (iTerm always shows up there for me!)
        • tonyarkles a day ago
          My potentially incorrect understanding is that iTerm generally only shows up when the processes you run inside it are consuming a bunch of energy. It only shows up in the battery menu for me when I’m running simulations or other big CPU intensive stuff on the command line.
          • madeofpalk a day ago
            Yeah - I've always thought about this and was never sure!
    • JimDabell a day ago
      The article mentions game mode, which is a feature of the latest Apple operating systems that is optimised for cases like this. Game mode pops up a notification when it’s enabled, which most applications wouldn’t want to happen. So far I haven’t seen anything abuse it.
      • dento a day ago
        Requiring a fullscreen window stops almost all possible abuses, though, as you cannot do this from a background process.
    • duped a day ago
      Developers aren't (yet) abusing audio workgroups for all their thread pools to get pcore scheduling and higher priority. So it would imply that if an audio workgroup is issuing commands to the GPU there should be some kind of timeout to the GPU downclocking based on the last time a workgroup sent data to it.

      GPU audio is extremely niche these days, but with the company mentioned in TFA releasing their SDK recently it may become more popular. Although I don't buy it, because if you're doing things on the GPU you're saying you don't care about latency, so bump your I/O buffer sizes.

      • Archit3ch a day ago
        > if you're doing things on the GPU you're saying you don't care about latency

        This does not follow. Evidently it is possible to have low-latency audio processing on the GPU today (per the SDK).

      • krackers a day ago
        I'm not too familiar with audio workgroups, but since the early days XNU has had low-level APIs to set pthreads as pseudo-realtime
        • xmodem a day ago
          This will definitely get you to run with a higher priority than the user's time machine backup, but it's not guaranteed to get your code onto a p-core if the machine is on battery and your app doesn't have focus.
    • zamadatix a day ago
      Abusing the API would still be more efficient than running fake busy workloads to do the same, which apps can already do without the API (or the permissions the API could require).
    • nottorp a day ago
      Manual permission? Maybe hidden somewhere, it's probably necessary for very niche apps.

      And default deny at the OS level for Zoom, Teams and web browsers :)

    • Cthulhu_ a day ago
      But as the author mentions, they already do it by having a process spin indefinitely. If developers want to abuse it, they already can.

      It's better to trust; the number of people who won't abuse it far outweighs the number who will.

  • threeseed a day ago
    Best way to do this:

    1. Go through WWDC videos and find the engineer who seems the most knowledgeable about the issue you're facing.

    2. Email them directly with this format: mthomson@apple.com for Michael Thomson.

    • Hnrobert42 a day ago
      Or his brother Pichael at pthomson.
  • vessenes a day ago
    Side note: Anukari should put out a Mick Gordon sound pack and share revs with him. That dude is making some crazy crazy stuff; his demo is awesome. Pairing up with artists once you have such a strong tool is good business and good for the world. If you like Mick Gordon. Which I do.
  • sgt a day ago
    I have zero need for this app but it's so cool. Apps like these bring the "fun" back into computing. I don't mean there's no fun at the moment, but reminds me of the old days with more graphical and experimental programs that floated around, even the demoscene.
  • philsnow a day ago
    Don't miss the link thrown in the second to last paragraph to https://x.com/Mick_Gordon/status/1918146487948919222 , a demo Mick Gordon put together, to which @anukarimusic replied

    > Lol on the second day it's out, you have already absolutely demolished all of the demos I've made with it and I've used it every day for two years

  • chrismorgan 11 hours ago
    > (An aside: chalkboards are way better than whiteboards, unless you enjoy getting high on noxious fumes, in which case whiteboards are the way to go.)

    That looks to be a smoother chalkboard than I’ve ever encountered. If I had been using such chalkboards, I suspect I’d agree, but based purely on my experiences to this point, my opinion has been that chalkboards are significantly better for most art due to finer control and easier and more flexible editing, but whiteboards are better for most teaching purposes (in small or large groups), mostly due to higher contrast. But there’s a lot of variance within both, and placement angles and reflection characteristics matter a lot, as do the specific chalk, markers and ink you use.

  • phkahler a day ago
    1024 objects updating at 48 kHz seems possible on the CPU, depending on how the code is written. 48M updates per second? It seems like a possible use for OpenMP to run a few loops in parallel across cores.
    • humbledrone a day ago
      1. Anukari runs up to 16 entire copies of the physics model for polyphony, so 16 * 1024 * 48K (I should update the blog post)

      2. Users can arbitrarily connect objects to one another, so each object has to read connections and do processing for N other entities

      3. Using the full CPU requires synchronization across cores at each physics step, which is slow

      4. Processing per object is relatively large, lots of transcendentals (approx OK) but also just a lot of features, every parameter can be modulated, needs to be NaN-proof, so on

      5. Users want to run multiple copies of Anukari in parallel for multiple tracks, effects, etc

      Another way to look at it is: 4 GHz / (16 voice * 1024 obj * 4 connections * 48,000 sample) = 1.3 cycles per thing

      The GPU eats this workload alive, it's absolutely perfect for it. All 16 voice * 1024 obj can be done fully in parallel, with trivial synchronization at each step and user-managed L1 cache.
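
      A quick sanity check of that arithmetic, taking the comment's numbers at face value (the flat 4 GHz clock is a round assumption, not a measured figure):

```python
# Cycle budget per "thing" from the comment's numbers above.
voices, objects, connections, sample_rate = 16, 1024, 4, 48_000
cpu_hz = 4e9  # round 4 GHz assumption from the comment

ops_per_second = voices * objects * connections * sample_rate
cycles_per_op = cpu_hz / ops_per_second
print(f"{ops_per_second:,} ops/s -> {cycles_per_op:.2f} CPU cycles per op")
# 3,145,728,000 ops/s -> 1.27 CPU cycles per op
```

      About 1.3 cycles per connection update, before any synchronization overhead, which is why a single CPU cannot keep up.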

    • cfstras a day ago
      If my math is right, that gives you 83 clock cycles per object to calculate a single sample. On 16 cores, theoretically 1,333 cycles. That's not a lot, considering you don't get nearly 100% of the CPU all the time.
  • jonas21 a day ago
    I'm having trouble understanding what the problem is -- as in, what are the actual symptoms that users are seeing? How much latency can the app tolerate and how much are you seeing in practice? It would be helpful (to me at least) in thinking about potential solutions if that information were available up front.

    Perhaps there's something in this video that might help you? They made a lot of changes to scheduling and resource allocation in the M3 generation:

    https://developer.apple.com/videos/play/tech-talks/111375/

    • humbledrone a day ago
      It's a real-time audio app, so if it falls behind real time, no audio. You get cracks, pops, and the whole thing becomes unusable. If the user is doing audio at 48 kHz, the required latency is 1/48,000 seconds per sample, or realistically somewhat less than that to account for variance and overhead.
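
      In practice the host hands the plugin a block of samples at a time, so the hard deadline is the block duration. A quick table for common buffer sizes at 48 kHz (the buffer sizes here are illustrative; the post doesn't fix one):

```python
# Real-time deadline per block at 48 kHz for a few common buffer sizes.
sample_rate = 48_000
for buffer_size in (32, 64, 128, 256):
    deadline_ms = buffer_size / sample_rate * 1_000
    print(f"{buffer_size:4d} samples -> {deadline_ms:5.2f} ms to compute the block")
# e.g. 64 samples leaves roughly 1.33 ms; miss that budget once and the output glitches
```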
      • lostmsu a day ago
        I find it hard to believe that users would notice latency under 1 ms. Probably not even under 5 ms.

        Have you tried buffering for 5 ms? Was the result bad? What about 1 ms?

  • Someone a day ago
    One thing I don’t understand: if latency is important for this use case, why isn’t the CPU busy preparing the next GPU ‘job’ while a GPU ‘job’ is running?

    Is that a limitation of the audio plug-in APIs?

    • humbledrone a day ago
      I attempted to preempt your question in the section of my blog post, "Why don’t you just pipeline the GPU code so that it saturates the GPU?" It's one of the less-detailed sections though so maybe you have further questions? I think the main thing is that since Anukari processes input like MIDI and audio data in real-time, it can't work ahead of the CPU, because those inputs are not available yet.

      Possibly what you describe is a bit more like double-buffering, which I also explored. The problem here is latency: any form of N-buffering introduces additional latency. This is one reason why some gamers don't like triple-buffering for graphics, because it introduces further latency between their mouse inputs and the visual change.

      But furthermore, when the GPU clock rate is too low, double-buffering or pipelining don't help anyway, because fundamentally Anukari has to keep up with real time, and every block it processes is dependent on the previous one. With a fully-lowered GPU clock, the issue does actually become one of throughput and not just latency.

    • kllrnohj a day ago
      That's pipelining, and it's good for throughput but it sacrifices latency. Audio is not a continuous bit stream but a series of small packets. To begin working on the next one on the CPU while the previous one is on the GPU requires 2 packets in flight, which necessarily means higher latency.
      • Someone a day ago
        I don’t see that. If the CPU part starts processing packet #2 while the GPU processes packet #1, not after it has done so, it will have the data that has to be sent to the GPU for packet #2 ready earlier, so it can send it earlier, potentially the moment the GPU has finished processing packet #1 (if the GPU is powerful enough, possibly even before that)

        That’s why I asked about the plug-in APIs. They may have to be async, with functions not returning when they’re fully done processing a ‘packet’ but as soon as they can accept more data, which may be earlier.

        • duped a day ago
          Audio is already asynchronous.

          But in general no, you can't begin processing a buffer before finishing the previous buffer because the processing is stateful and you would introduce a data race. And you can't synchronize the state with something simple like a lock, because locking the audio playback is forbidden in real time.

          You can buffer ahead of time, this introduces latency. You can't do things ahead of time without introducing delay, because of causality - you can't start processing packet #2 while packet #1 is in flight because packet #2 hasn't happened yet.

          To make it a bit more clear why you can't do this without more latency:

          Under the hood there is an audio device that reads/writes from a buffer at a fixed interval of time; call that interval N samples (divide by the sample rate to get seconds). When that interval is up, the driver swaps the buffer for a new one of the same size. The OS now has exactly N / sample_rate seconds to fill the buffer before it's swapped back with the device driver.

          The kernel maps or copies the buffer into virtual memory, wakes the user space process, calls a function to fill the buffer, and returns back to kernel space to commit it back to the driver. The buffer you read/write from your process is packet #1. Packet #2 doesn't arrive until the interval ticks again and the buffers are exchanged.

          Now say that processing packet #1 takes longer than N samples, or needs at least M samples of data to do its work, with M > N. What you do is copy your N samples of packet #1 into a temporary buffer, wait until M samples have been acquired to do your work, but concurrently read out of your internal buffer delayed by M - N samples. You've successfully done more work, but delayed the stream by the difference.
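
          A toy model of that last scheme, just to make the M - N delay concrete (N, M, and the ramp input are made up for illustration, not from any real audio API):

```python
from collections import deque

# The device exchanges blocks of N samples, but the algorithm needs
# M (> N) samples of history per unit of work, so the output stream
# comes out delayed by M - N samples.
N, M = 4, 10
DELAY = M - N

def process_block(block, fifo):
    """Buffer incoming samples; emit silence until the window is
    primed, then pass samples through delayed by M - N."""
    out = []
    for x in block:
        fifo.append(x)
        out.append(fifo.popleft() if len(fifo) > DELAY else 0)
    return out

fifo = deque()
stream = list(range(1, 41))            # 40 input samples: 1, 2, 3, ...
out = []
for i in range(0, len(stream), N):     # device hands us one block at a time
    out.extend(process_block(stream[i:i + N], fifo))

assert out.index(1) == DELAY           # first real sample is M - N late
assert out[DELAY:] == stream[:len(stream) - DELAY]
```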

        • kllrnohj a day ago
          You're requiring that packet #2 be available before packet #1 has finished. That's higher latency than the goal, which is packet #1 is processed & sent to output before packet #2 has arrived at all.

          Or perhaps you're missing that there's an in event as part of this, like a MIDI instrument? It's an in->effect->out sequence. So minimizing latency means that the "effect" part must be as small as possible, which means it's desired for it to happen faster than "in" can feed it data

    • grandinj a day ago
      This might trick the heuristics in the right direction: feed the GPU a bunch of small tasks (i.e. with a small number of samples) instead of big tasks.
    • mort96 a day ago
      I mean the CPU can't prepare a job for samples which don't exist yet. If it takes 0.5 milliseconds to process 1 millisecond's worth of audio, you'll necessarily be stopping and starting constantly. You can't keep the GPU fed continuously.
  • dgs_sgd a day ago
    > in parallel with the audio computation on the GPU, Anukari runs a second workload on the GPU that is designed to create a high load average and trick macOS into clocking up the GPU. This workload is tuned to use as little of the GPU as possible, while still creating a big enough artificial load to trigger the clock heuristics.

    That's quite the hack and I feel for the developers. As they state in the post, audio on the GPU is really new and I sadly wouldn't be holding my breath for Apple to cater to it.

  • PaulHoule a day ago
    It's an interesting trade-off. For decades the answer to having a reliable Windows computer has been to turn off as many power saving features as possible. Saving power on USB plugs for instance makes your machine crash. Let your CPU state drop to the minimum and you'll find your $3000 desktop computer takes about a second to respond to keypresses. Power savings might not be real, but the crashes and poor performance are very real.
  • rock_artist a day ago
    While very different, it was already tricky in the past to make Apple silicon (on iPhones as well) perform reasonably.

    Ableton engineers already evaluated this in the past: https://github.com/Ableton/AudioPerfLab

    While I feel for the complaints about Apple's lack of "feedback assisting", the core issue itself is very tricky. Many years ago, before becoming an audio developer, I worked in a Pro Audio PC shop...

    And guess what... interrupts, abusive drivers (GPUs included), Intel's SpeedStep, sleep states, core parking... all were tricky.

    Fast forward, we got asymmetric CPUs and arm64 CPUs, and still Intel or AMD machines (especially laptops) might need BIOS tweaks to avoid dropouts/stutters.

    But if there's a broken driver by CPU or GPU... good luck reporting that one :)

  • notnullorvoid a day ago
    Sorry to hear about the issue; not too surprising given Apple's track record with this kind of thing, though (you still can't even pin processes to specific CPU cores/threads). Anukari is really cool, wish you had a Linux build :)
  • Liftyee 21 hours ago
    Out of curiosity, what's the origin of the Anukari name?
  • thraway3837 a day ago
    This is all just too much Stockholm syndrome. Apple’s DX (developer experience) has always been utterly abysmal, and these continued blog posts just goes to show just how bad it is.

    Proprietary technologies, poor or no documentation, silent deprecations and removals of APIs, slow trickle feed of yearly WWDC releases that enable just a bit more functionality, introducing newer more entrenched ways to do stuff but still never allowing the basics that every other developer platform has made possible on day 1.

    A broken UI system that is confusing and quickly becomes undebuggable once you do anything complex. Replaces Autolayout but over a decade of apps have to transition over. Combine framework? Is it dead? Is it alive? Networking APIs that require the use of a 3rd party library because the native APIs don’t even handle the basics easily. Core data a complete mess of a local storage system, still not thread safe. Xcode. The only IDE forced on you by Apple while possibly being the worst rated app on the store. Every update is a nearly 1 hour process of unxipping (yes, .xip) that needs verification and if you skip it, you could potentially have bad actors code inject into your application from within a bad copy of Xcode unbeknownst to you. And it crashes all the time. Swift? Ha. Unused everywhere else but Apple platforms. Swift on server is dead. IBM pulled out over 5 years ago and no one wants to use Swift anywhere but Apple because it’s required.

    The list goes on. Yet, Apple developers love to be abused by corporate. Ever talk to DTS or their 1-1 WWDC sessions? It’s some of the most condescending, out of touch experience. “You have to use our API this way, and there’s this trick of setting it to this but then change to that and it’ll work. Undocumented but now you know!”

    Just leave the platform and make it work cross platform. That’s the only way Apple will ever learn that people don’t want to put up with their nonsense.

    • duped a day ago
      I don't disagree with you, but there simply isn't an alternative for pro audio developers. You go where the users are and the majority of the market (by revenue) are Mac users.

      Now a lot of people may reply to this that Windows isn't that bad with ASIO (third party driver framework) or modern APIs like WASAPI (which is still lacking), or how pipewire is changing things on Linux so you don't need jack anymore (but god forbid, you want to write pipewire native software in a language besides C, since the only documented API are macros). Despite these changes you have to go where the revenue is, which is on MacOS.

      • overfeed a day ago
        > I don't disagree with you, but there simply isn't an alternative for pro audio developers

        People used to say this about video pros too, until Apple royally screwed the pooch by failing to refresh its stale Mac Pro hardware lineup for many years, followed by a lackluster Final Cut release. An entire industry suddenly realized Windows was viable after all, they just hadn't bothered to look.

        • atonse a day ago
          But they had to be pushed in that direction. It actually affected their work.

          In this case, the users of these tools seem perfectly ok with them and aren't going to just explore something as disruptive as an entirely different OS just for kicks.

          • overfeed a day ago
            Not sure why you started off with "but" when we are in agreement and/or you're not disputing my point - that Windows is viable but Mac-using audio professionals aren't (yet) sufficiently motivated to seriously evaluate Windows as a migration target.

            > In this case, the users of these tools seem perfectly ok with them

            That wasn't my takeaway from the article. The plugin is outright broken on the latest hardware, even with the workaround.

            > [...]something as disruptive as an entirely different OS just for kicks

            I don't think switching OSes is less disruptive than switching software packages. Cubase or Ableton on Windows is not much different from the respective DAWs on Mac OS. Modern desktop OS UI paradigms map 1:1, so switching isn't a big deal

            • atonse a day ago
              That comment was left in haste, sorry. To clarify, I meant that in the case of FCP and movie editing software, they were almost forced to switch.

              The FCP upgrade didn’t just break the main app, but the plugin ecosystem was wiped out too. (From what I read, I’m not a movie pro). And that was disruption forced upon the users.

              So in that scenario, they didn’t have much of a choice.

              But in this scenario, the audio apps work well and it’s just the developers complaining.

              And even though I’m a developer, I would say as long as the users are happy then I can see why there is less concern about dev happiness

      • johnnyjeans a day ago
        > You go where the users are and the majority of the market (by revenue) are Mac users.

        One of the worst things about Apple is how much time and effort they spend trying to lock you into their platform if you want to support it. There's no excuse for it. Even once they have you on their system, they're doing everything in their power to lock you in to their workflows and development environments. It's actually insane how shamelessly hostile OSX is.

        • duped a day ago
          This is "don't anthropomorphize the lawnmower" territory imo. I don't think Apple is actively hostile to 3P developers or tries to lock them in. I think they simply don't care - or lack the institutional capacity to care even if individual developers in the organization want to care.

          The Apple developer experience is an abject horror because they believe everyone who is capable of developing high value applications for Apple devices works at Apple, or will work at Apple. 3P devs are a nuisance they tolerate rather than a core value-add for their services and devices. I assume it's less bad within Apple, but I really have no idea.

          • johnnyjeans a day ago
            I'd argue you can see the hostility if you compare shipping to Windows vs shipping to Apple. Microsoft doesn't care if you copy over your MSVC suite into a Wine environment to build your software for their platform. Even SignTool just works. It's not necessarily trivial to do, but that's simply because the MSVC suite is a horrible mess like everything else Microsoft.

            Apple explicitly disallows cross compilation in their Terms of Service. Even if you managed to get clang compiling for Mac on another Unix, even if you figure out how to get your app bundles signed outside of OSX, they'll revoke your developer license and invalidate your certs because you're in violation of their ToS. You're right they don't care about third party devs, but the amount of hoops you have to jump through for devops on Mac is almost certainly designed as a gluetrap.

            • freedombena day ago
              Agreed.

              I think Apple is actually one of the few companies that you should anthropomorphize because they have shown a long history of making decisions based on long-term strategy rather than short-term profits. They also react emotionally sometimes. The best example coming to mind is Steve Jobs on Accessibility: "I don't care about the bloody ROI." I of course cheered that attitude, and still do for a11y, but that is a very human-like thing to do. Also let's not forget his hatred toward Android and vengeful attempt to kill it. Hence I don't think Apple is a lawnmower. They're more like an elephant with its objectives, and they know they're going to squash a lot of lesser life in the process, but "you can't have an omelette without breaking a few eggs."

            • sunshowersa day ago
              (From private conversations with people at large tech corporations, my understanding is that that provision of the ToS is inconsistently enforced. Obviously not a good place to be for independent developers, since it partly depends on if you're important enough.)
            • matwooda day ago
              > I'd argue you can see the hostility if you compare shipping to Windows vs shipping to Apple.

              Windows has had 3rd party developers built into its DNA since the beginning though. Even today, Windows goes to great lengths to maintain backwards compatibility. I think this comes from the fact that MS has always been a software first company built around market domination.

            • pasc1878a day ago
              Apple is mainly a hardware company; it is saying you must buy our hardware if you want to make money from our users.

              That is not being developer-hostile. Apple does many other things that don't help developers, but forcing their hardware is just an entry cost.

        • cardanomea day ago
          I don't even understand why they are following a cash cow strategy of milking their current customer base dry when they could be growing massively.

          They have amazing hardware that is far superior to the competition and that they can build at very competitive prices while still making good money.

          Building a PC in 2025 absolutely sucks. The prices are getting insane. Plus Windows 11 is super hated. It is the perfect time for Apple to win over people.

          They just need to stop kneecapping their great hardware with the shitty software side. Just open it up a little bit. Add Vulkan support. Actually make your GPU usable. Actually help Steam do their magic like they did with Linux; no one is going to buy games on the bloody Apple store anyway. Show some respect to the developers.

          Shareholders giving up massive growth for short term profits. So frustrating.

        • anon7000a day ago
          Eh, is this worse than Windows? You develop UWP or WPF or whatever the current flavor is with Visual Studio, and use C# APIs that only exist on their platform.

          On Mac, I can use bash/zsh mostly how I would on linux. The main compatibility issues come from BSD tools vs GNU, which are very simple to replace if you want. On Windows, they use PowerShell, which is totally proprietary.

          On Mac, web & infra development can use completely open source tooling which can be shared with Linux.

          You can still use VS Code to edit Swift (or C#), but the more "proprietary dev environments" (Xcode or Visual Studio) are probably more powerful with system level integrations.

          Heck, you can use PyQT on mac if you don't like Swift or Xcode.

        • spacemadnessa day ago
          There is an excuse: shareholders. The more lock in there is, the more the champagne flows.
          • johnnyjeansa day ago
            Trillion dollar market caps don't come for free. The Apple Tree must be watered with the blood of third party developers so our gracious overlords can Do Computing Different.
          • scarface_74a day ago
            You can’t imagine how often I pine for x86-based Windows laptops with horrible battery life, loud fans, and enough heat that if I work with my laptop on my lap for an extended period of time there will be no future Scarfaces. But I’m locked into my Mac because of the dearth of Windows software.

            Not to mention all of the great Android tablets that I can’t get or the much faster Android devices…

            • bigyabaia day ago
              This would be a more poignant point if you had to sacrifice any of these things to get better developer experience. But you don't, you're intentionally conflating wildly different things to defend an irrational stance.
              • scarface_74a day ago
                If my developer experience depended on me using an x86 machine, I would have to sacrifice all of these things
        • scarface_74a day ago
          Exactly what are they supposed to do if not create their own frameworks to better leverage their own hardware? Do you want them to use cross-platform frameworks that are not optimized for their system?
          • johnnyjeansa day ago
            My problems start long before the special APIs come into play. When we supported Mac, I just wrapped the APIs like you do for any other system. The problem is I don't use Mac, so building software for Macs is inherently troublesome. I can build and test for Windows just fine from Linux and OpenBSD. I can't for Mac.

            Now you might say this is problematic, Apple doesn't want third-party developers locking their platform behind some conditionally compiled set of abstractions that ruin everything they've worked for. Putting aside how ridiculous that is given system APIs are often wrapped for normal abstraction reasons anyways, that's totally fine. But then, it's also not my problem because I'm not Apple. I don't mind supporting their platform, I'll even turn a blind eye to the audacity of charging a developer fee while offering abysmal documentation and support. But I'm not going to crawl and beg for the privilege.

            > Do you want them to use cross platform frameworks that are not optimized for their system?

            Just like everybody else, because it hardly matters. Outside of Apple-land, Intel, AMD and Nvidia all get along just fine with rewriting SPIR-V to their microarchitectures. CPUs get along just fine rewriting abstract instruction sets like AMD64 and the various ARMs to their microarchitectures. Code is by-default compiled for instruction compatibility. APIs like CUDA and ROCm explicitly exist for vendor lock-in reasons. There's absolutely no reason why the throughput of these APIs can't be generically applied to compute shaders. None at all. The hardware vendors just want to capture the market.

            Apple isn't exactly working with exotic hardware. The M1 is yet another ARM chip, not some crazy graph-reduction machine. These standards are fine and used across a wide swath of hardware to no real detriment. I would suggest you may over-estimate how much they actually care about this idea of "specially optimized APIs." Consider that Apple pushes Swift as the primary language you Should be Using to ship software on OSX, and yet garbage collection is still handled in software. That's not what vertical integration for engineering purposes looks like.

            Again, it all hardly matters. I wouldn't mind just wrapping these APIs, they're not particularly special or exotic any more than their hardware is. But the fact of the matter is that as a non-mac user, they go through a lot of effort to ensure putting software on their platform is as unattractive as possible.

            • cosmic_cheesea day ago
              The way I see it, they mainly just aren’t interested in devs treating macOS as just another lowest-common-denominator target, which makes some amount of sense. Such software is likely to not be as nice to use as something purpose-built for the OS, particularly when considering that the dev probably never even tested the software in question under macOS, greatly hampering their ability to find and eliminate bugs.
            • scarface_74a day ago
              Why would I want to use software that you never tested on the target machine?
              • johnnyjeans5 hours ago
                I explicitly conflated building with testing not even 4 sentences into my post.
              • bigyabaia day ago
                Because it's not 1982 anymore?
                • scarface_74a day ago
                  I didn’t buy a Mac to use your lowest common denominator untested, unoptimized application that doesn’t take advantage of my hardware to its fullest.
                  • bigyabaia day ago
                    You're using HN right now. The web is one of many types of write-once-run-anywhere software you are critically reliant on in your day-to-day life.
                    • scarface_7421 hours ago
                      And I run Safari - a browser built by Apple to be battery efficient, memory efficient and better optimized than Chrome and uses native frameworks and UI.

                      My computer, the processor that runs in it, the operating system, much of the software and the rest of my computer life - phone, watch, set top device, tablet work together. I can copy text from one and paste it into the other. My watch unlocks my computer. My iPad can be used as a second monitor without any third party software.

                      I bet you HN tested it on iOS if not the Mac and not just hope the site looks fine

      • How is WASAPI lacking? I thought the abundance of FL Studio beat producers proved that for audio work, today, Windows is completely fine.
        • dupeda day ago
          Basically every software (except Logic) is available on Windows but users still buy Macs. Even among those users, people usually fallback to ASIO instead of directsound or wasapi backends.

          WASAPI requires exclusive mode to be usable for pro applications, or else your latency will suffer and it may be doing some resampling behind the scenes.

          • Exclusive mode is a feature, not a bug. If the user needs the bits coming out of the user's DAW to reach the speakers as pristine as possible, then the user probably doesn't want these bits mixed with any other application. If the user needs to switch between the DAW and a Youtube tutorial, then there's probably no need for exclusive mode.

            Latency is a valid concern, but is it really bad? PCs are fast now.

            • dupeda day ago
              afaik it's not possible to configure buffer size and sample rate from within a user application without exclusive mode, which actually does matter for the non-exclusive use case. iirc it was even worse where applications' streams would be resampled transparently and buffered, which is absolutely not what you want.

              I don't use windows for audio anymore so I can't comment on this in win11, but it used to be that WASAPI suffered unless you set your PC in "performance" mode in your power settings whereas ASIO was unaffected.

              And yes, latency matters! For live performance you're looking for < 2.5ms of one-way latency to get a roundtrip of under 5ms. After that point it starts being perceptible to players. This is not a performance floor so much as a scheduling one, and ime windows audio scheduling under dsound/wasapi was always shaky.
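              The buffer-size arithmetic behind those numbers is a quick sanity check (a sketch, not from the thread; the 48 kHz rate and the `buffer_latency_ms` helper are illustrative assumptions):

```python
# Back-of-envelope: the one-way latency a buffer of `frames` samples
# contributes at a given sample rate (driver/DAC overhead not included).

def buffer_latency_ms(frames: int, sample_rate: int) -> float:
    """Milliseconds needed to fill (or drain) one buffer."""
    return frames / sample_rate * 1000.0

# Staying under ~2.5 ms one-way (~5 ms round trip) at 48 kHz means
# a buffer of at most 120 frames:
print(buffer_latency_ms(120, 48_000))               # 2.5
# A common 512-frame shared-mode buffer is already ~10.7 ms one way:
print(round(buffer_latency_ms(512, 48_000), 1))     # 10.7
```

              This is roughly why shared mode was a non-starter for so long: before `IAudioClient3` arrived in Windows 10, shared-mode WASAPI clients were tied to the audio engine's ~10 ms period regardless of the buffer size they asked for.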

            • ziml77a day ago
              For a DAW running in exclusive mode, wouldn't they also have the option of setting the system default output to a virtual device that inputs into the DAW? That seems to me like it would be the most sensible way to handle mixing in YouTube or whatever.
      • trinsic2a day ago
        There are no buts. If you make a decision that you are going to live with a framework that is hostile to your efforts, then that is really your choice to make your life harder. If you really want to make things better for pro audio devs, stop enabling organizations that want to mold you into their way of looking at the world. Blog about it and let the industry know you will not tolerate non-freedom in software. The same goes for Windows people. Move away from non-open-source software, take back control of your life, and find other endeavors that support open source until the idiots get the point. Stop enabling your jailers.
      • freedombena day ago
        Agreed, you're probably in the toughest spot.

        That said, Reaper and many others have done great things with DAWs and other audio processing in C++. Maybe getting a "native" look is too difficult, but I figured I'd throw it out there.

        • dupeda day ago
          The problem is not the application software but the operating system and hardware. Linux is arcane and Windows is permanently behind the curve.
      • eikenberrya day ago
        > (but god forbid, you want to write pipewire native software in a language besides C, since the only documented API are macros)

        I've read that Zig can wrap C macros. So maybe there is some hope.

      • mvdtnza day ago
        > You go where the users are and the majority of the market (by revenue) are Mac users.

        You go to a different market.

      • bigyabaia day ago
        FWIW I have worked on several professional live audio productions and never heard of Anukari once. This is a pretty niche domain regardless of what circles you work in. It's really not about "X job is for Y OS" here.

        > there simply isn't an alternative for pro audio developers.

        Tell me you don't work on live audio without telling me you don't work on live audio. Windows has always been usable if you have a suitable ASIO (same as you used to use on Mac). Most shows will use some permutation of Windows boxen to handle lighting, visuals rendering, compositing and audio processing. The ratio of Macs to Windows machines is at least 1:10 in my experience.

        Heck, nowadays even Linux is viable if you're brave enough. Pipewire has all the same features CoreAudio was lauded for back in the day; in theory you can use it to replace just about anything that isn't conjoined at the waist with AU plugins. Things are very different from how they were in 2012.

        • dupeda day ago
          > Tell me you don't work on live audio without telling me you don't work on live audio.

          This is pretty rude, I'm among the (probably small) subset of HN users who has developed real professional audio software.

          All I can talk about is my experience, which is that in the plugin market a plurality of your revenue will be from MacOS users. My last job in this market had zero Windows/Linux users.

          Now I have done a good bit of live work on Windows machines with ASIO, but I also do a bit of work there myself from time to time in venues with musicians - and I don't really know any musicians that are carrying around Windows laptops. 100% of them are using Mainstage and Ableton on Macbooks.

        • NexRebulara day ago
          > Windows has always been usable if you have a suitable ASIO

          The gold standard being RME hardware and drivers. Not a single issue ever on windows.

      • ForOldHacka day ago
        They sell hardware, that shows demos well. They cannot do a demo at MacWorld because it does not exist anymore and worse, they don't care. I would suggest a jack black/school of rock appeal, but you are speaking to a company that is literally tone deaf to everything but what sells product.

        There is no revenue in MacOS; there is only revenue in machines that run a free OS that they consistently lock their loyal customers out of.

    • fxtentaclea day ago
      The Apple DX used to be pretty great around 2010. But by now, it's laughably bad. With every additional OS update, they asked for more and more work (and expensive EV signing certificates) to keep our pro audio app working, which is why it has since been abandoned.

      In fact, I'm now working on a USB hardware replacement for what used to be a macOS app, simply because Apple isn't allowing enough control anymore. Their DX has degraded to the point where delivering the features as an app has become impossible.

      Also, USB gadgets are exempt from the 30% app store tax. You can even sell them with recurring subscriptions through your own payment methods. Both for the business owner and for the developer, sidestepping Apple is better than jumping through their ridiculous hoops.

      • pixl97a day ago
        With the potential criminal filings against Apple hopefully we see them back off a bit.

        And yea, over the years you could tell Apple stopped giving a shit except to turn everything into an app store where they can earn 30% and it's lessened the experience.

        • freedombena day ago
          We can hope, though Apple is the most "malicious compliance" company I can think of, so I don't doubt they'll figure something out.
      • matwooda day ago
        The DX was certainly better. I wrote some early iOS apps and the docs were good. They have never been at the MS level though, which is what I programmed for in my day job at the time. MSDN was an achievement.
    • galad87a day ago
      It's surely not perfect, and so much is quite horrible, but at least try to keep the facts in check. AppKit and auto layout are still working fine, they aren't going anywhere any time soon, there is no need to rewrite all the UI code.

      Core Data threading? Well, it has got its pitfalls, but those are known, and anyway, nothing is forcing you to use it.

      Xcode is so slim these days: it's a ~3 GB download, it doesn't take an hour to unxip, and it can be downloaded from the developer website.

      Swift? It might be needed for a bunch of new frameworks, but Objective-C isn't going anywhere anytime soon either.

      • ryandrakea day ago
        I wouldn't call Xcode slim. It currently sits at 13GB+ as installed on my drive, and that does not include the simulators which are, what, 10GB each or something? Xcode is by far the largest application I have installed on my "daily driver" Mac.
        • galad87a day ago
          It's still compressed on disk, so it takes only 5.4 GB of space, not 13 GB+. Sure, the simulator and the iOS or other SDKs will take more space, but those aren't needed to develop macOS apps.
      • gjsman-1000a day ago
        Let's also keep in mind that the Linux desktop commits most of these offenses, but worse.

        Core Data threading? Does Linux even attempt something like Core Data? How well is that going?

        Swift? I remember when Linux diehards invented Vala. The Swift of Linux, but with none of the adoption.

        As for UI code, Linux is finally starting to get a little more stable there. GTK 2 to 3 was a disaster; Qt wasn't fun between major upgrades; if you weren't using a framework, you needed to have fun learning the quirks of Xorg; nobody who builds for Linux gets to lecture Mac about UI stability.

        Or, for that matter, app stability in general. Will a specific build of Blender outside of a Flatpak still work on the Linux desktop after 2 release cycles? No? Then don't lecture me about good practices. Don't lecture me about how my website or app was sloppily engineered because it has dependencies.

        • graemepa day ago
          Why bring Linux up?

          Are the target users for this likely to use Linux (rather than Windows) if they ditched Apple?

          > Swift? I remember when Linux diehards invented Vala. The Swift of Linux, but with none of the adoption

          Plenty of languages used on Linux. Why pick one that did not gain traction?

          > If you weren't using a framework, you needed to have fun learning the quirks of Xorg;

          Who does that?

          > GTK 2 to 3 was a disaster; Qt wasn't fun between major upgrades

          But they are cross platform.

          > Will a specific build of Blender outside of a Flatpak still work on the Linux desktop after 2 release cycles?

          Does that matter? Maybe a bit of extra work for packagers - and people can use Flatpak or Snap.

        • hshdhdhj4444a day ago
          You seem to be conflating 2 different things. Apple’s OS proficiency and the associated technologies they support on their OS and Apple’s dev tools proficiency.

          People use Apple’s dev tools because they are the only/best way to deliver apps on Apple’s OSes.

          If we changed the situation, so that Apple Dev Tools could be used to create applications for non Apple OSes, or non Apple Dev tools were first class citizens for creating Apple apps, I bet the vast majority of people would use the non Apple dev tools to create both Apple and non Apple apps.

          What’s keeping Apple Dev Tools in the game is their privileged position in the Apple OS ecosystem.

        • cosmic_cheesea day ago
          And the UI situation still has issues. If you want flexibility in language choice, GTK is the only modern-ish framework option there is. The rest are tied to 1-2 languages, bad at accessibility, look archaic, etc.
        • 4ndrewla day ago
          [flagged]
          • People complaining about whataboutism are more obstinately committed to avoiding decent conversation than the people who commit it. This ain’t a formal debate.
            • bigyabaia day ago
              [flagged]
              • Formal in what sense? There's certainly no set form for discussion outside of threading. But there's certainly no assumption of persuasion or sense of shared goals or values.
          • gjsman-1000a day ago
            Even Wikipedia says Whataboutism can be completely deserved in some cases.

            It is absolutely deserved here - Apple built a 100-foot tower, and it's grown hairy over the last few decades. Linux built seven 30-foot towers without stairs in the same timeframe, yet yelling about the overgrowth on the 100-foot tower is still somehow defensible.

            If they can't build their own towers correctly, they have no right to act like the main tower was built worse than their own.

            (Edit, posting too fast: For the complaint that Apple has money, Linux does too. 90%+ of work on Linux comes from corporate sponsorship, and has since 2004 when it was first counted. They are fully capable of doing better.)

            • hshdhdhj4444a day ago
              Corporate sponsors on Linux provide a fraction of the money Apple does and even what they do are geared towards their own needs.

              But more relevant is the fact that their donations are focused on running Linux as servers and there Linux is miles ahead of anything Apple provides, to the point that Apple has abandoned its server OS.

            • umanwizarda day ago
              “Linux” isn’t a person or a company. Different people contribute to it with different goals.

              > 90%+ of work on Linux comes from corporate sponsorship

              And approximately 0% of these corporate contributors care about the “Linux desktop” experience. Unlike Apple their goal is not to build a consumer-targeted OS.

              Linux on the desktop is very, very niche, and even among the people who do use it, a lot of them will spend almost all their time in just a few windows (e.g. terminal, browser, emacs), not a rich array of desktop applications.

              • amliba day ago
                If you haven't used the Linux desktop for a while, even a year ago, try again. Use a bleeding-edge distro like the latest Ubuntu or Fedora, ideally running Wayland, and you will be surprised how smooth and feature-rich it has become, with gobs of high-quality apps available with no finicky compile instructions or crazy installation steps to follow.

                Whatever rough edges you may encounter keep being sanded down at a speed I haven't witnessed since Linux was the hot new thing in the 90s. The Linux desktop felt stale and abandoned throughout the 2010s, but nowadays it's pretty marvelous how fast it's becoming a real alternative to Windows and Mac. I truly believe that if it had proper developer adoption and first-class hardware support from OEM vendors it would already be a true alternative.

                • Klonoara day ago
                  I’m pretty sure I read your exact comment way back in 2006. ;P
            • sunshowersa day ago
              Most of the work on the Linux kernel is commercially funded. Plenty of other parts to a KDE/GNOME/systemd/GNU/Linux desktop.

              (I'm a pretty happy desktop Linux user, mostly because I don't think commercial OS vendors' incentives are properly aligned in the B2C space.)

            • mixmastamyka day ago
              Apple has money… like coming out of their ears.
            • 4ndrewla day ago
              > If they can't build their own towers correctly, they have no right to act like the main tower was built worse than their own.

              And yet OP did.

    • pjmlpa day ago
      On the old Mac OS, and in the early OS X days, the documentation was great; I dunno what happened to the documentation team.

      Swift on the server is for Apple ecosystem developers, to share code, just like all those reasons to apparently use JavaScript on the server instead of something saner.

      • spacemadnessa day ago
        I found some of that older documentation one day while being beyond frustrated with understanding some underdocumented iOS library APIs and it is incredible. What they have now is a joke in comparison. WWDC as a documentation strategy is terrible for people that learn from text. And it’s just a bad medium to begin with for information transfer. By its nature it’s bad at information density, and is often distracting and filled with fluff.
        • pjmlpa day ago
          Yes, some of it survives in the archives, who knows for how long; always save copies of them.

          I wonder if it is a generation gap, as many apparently learn coding via videos; however, that is not enough to go deep.

          By the way, Microsoft suffers from the same disease; they reduced their team size, and unless one has been coding since the 16-bit days, there are many things no one will find.

          Some of it is gone forever, as they kept replacing their documentation, blogs and video platforms.

          Other docs are still there, but you have to have actually used them in practice to find the Win32 or .NET Framework documentation, which nowadays only gives the most recent version.

          Or even Microsoft Systems Journal articles, as another example.

          Google on Android is also a mixed bag, depending on what one is looking for.

          • skydhasha day ago
            My belief is that taking the time to write docs is kinda like an editing process of your thinking. You start to think hard about the reasons you've written an API and how it could be better. And there's a limit to how big you can think about something as a whole so you will naturally try to modularize and layer things. Not just adding things in an adhoc fashion.

            I think they threw in the towel when they realized the mess they'd built. In contrast you have things like RHEL, FreeBSD, etc., where there's a drive to keep things small and neat just to be able to document them.

          • HelloImStevena day ago
            Articles get removed from Apple's documentation archive seemingly randomly. However, on a good note, there are backups of the entire ADC Leopard Reference Library (available at several places online). That covers that vast majority of all the documentation Apple's ever produced. There's also the Apple II FTP archives, which have older but often less applicable documentation, but are definitely still valuable troves of information.
      • wpma day ago
        The story I have heard was that they all got shit canned. They were professional technical writers focused on telling you “why”. These days most of Apple's docs are “what”, and a lot of those don't even tell you anything. Enums without values documented. Class docs that amount to “yep, this class exists”.
      • cjpearsona day ago
        I remember it being pretty good circa 2012 or so. The API docs generally told me what I needed to know and there were some helpful in-depth technical notes. Did they lose something in the redesign or Swift migration?
        • Klonoara day ago
          It was starting to go downhill even then. They let go a significant number of documentation people for whatever reason.
      • no_wizarda day ago
        > JavaScript on the server instead of something saner

        JS on the server is actually really fast and well supported. Not really sure what you're driving at here.

      • newscluesa day ago
        Marketing pushed for features faster than engineering could build them properly to the former standard (quality and documentation).
        • pjmlpa day ago
          Ever seen the compiler support tables for ISO managed languages, Web or Khronos standards?
          • newscluesa day ago
            No but I noticed software quality went down when they added emojis
    • jmulla day ago
      > Stockholm syndrome

      I don't think that's apt. What you find to be "abuse" others might find to be the kind of obstacles/issues that every platform/ecosystem has.

      It probably helps if you never put Apple on a pedestal in the first place, so there's no special disappointment when they inevitably turn out to be imperfect. E.g., just because Apple publishes a new API/framework, that doesn't mean you need to jump on board and use it.

      Anyway, developers are adults who can make their own judgements about whether it's worth it to work in Apple's ecosystem or not. It sounds like you've made your decision. Now let everyone else make theirs.

      • spookiea day ago
        Their single biggest priority is to provide a better user experience for their customers compared to competitors.

        Units sold in the smartphone world follow the same function as the video game console market: you win by offering a bigger and better software catalog, not just hardware.

        If you, as a developer, have a worse time contributing to that ecosystem, then it is just a matter of time before the users themselves have a worse time with their device.

        I take the comment above as a signal that something is clearly not working towards Apple's goals. Of course, you make your own judgements about whether to support a platform or not, but this indicates that decision is a lot easier than it should be, to the detriment of Apple's ecosystem.

        All in all I wouldn't discount it.

        • pixl97a day ago
          >Their single biggest priority is to provide a better user experience for your customers compared to competitors.

          I mean in late stage capitalism their single biggest priority is to become a rent seeking monopoly by regulatory capture. If they can accomplish that, then user experience is a distant concern.

          Luckily it looks like apple is having some problems with that recently.

      • pixl97a day ago
        >What you find to be "abuse" others might find to be the kind of obstacles/issues that every platform/ecosystem has.

        Right, that's why judges are making criminal recommendations to the US prosecutors. No abuse at all.....

        • jmulla day ago
          The context here was DX. Maybe start a new thread to change to a new topic?
      • owebmastera day ago
        > so there's no special disappointment when they inevitably turn out to be imperfect.

        Oh poor Apple. If only they had the resources and engineers to fix that. /s

    • HelloImStevena day ago
      Apple's documentation used to be quite good—many useful guides, thousands of technical notes, development books quarterly—it's really a shame that they've turned their back on that. Their old docs leaned toward being overly detailed, which some complained about at the time, but I'd much prefer that over near radio silence.

      Apple's also been deleting more and more of its old documentation. Much of it can only be found on aging DVDs now, or web/FTP archives if you're lucky. Even more annoying is how some of the deleted docs are _still_ referenced by modern docs and code samples.

    • EasyMarka day ago
      lol and people wonder why devs like using Electron front ends for their back end code, despite the memory cost. I only have so much time in the day, so I'm calling a lot of C++ backend code to display my data analysis output and configuration on an Electron front end. I may look into wasm someday, when that mythical "extra time" comes around
    • eigenspacea day ago
      It's honestly nuts that so many developers continue to try to make software on MacOS. I understand the appeal of their current hardware, and I used to even be a big fan of the user experience, but it really seems like attempting to build software on MacOS is like trying to build a house on a sandbar.

      Apple has done nothing and continues to do nothing to engender any confidence in their platform as a development target.

      • philistinea day ago
        > Apple has done nothing and continues to do nothing to engender any confidence in their platform as a development target.

        You're missing the forest for the trees. Apple is very difficult to work with indeed, but they have a shit-ton of paying users. Still to this day, iOS is a better revenue maker than Android. Same for macOS compared to Windows. You want to make a living? Release on macOS. People there pay for software.

        • johnnyjeansa day ago
          > Same for macOS compared to Windows.

          This hasn't ever been my experience. Maybe if you're in a really specific market niche where most of the userbase is on Mac. Only 5% of users on Windows paying for the software still absolutely dwarfs 100% of Mac users paying for it. We have more sales on Linux than we do Mac.

          • oritrona day ago
            > We have more sales on Linux than we do Mac

            That's interesting, what's your product? There are a few pieces of software on Macs that I would love to pay for on Linux but the option isn't there.

          • wahnfrieden21 hours ago
            I deploy to iOS, iPadOS, macOS and Mac accounts for around 25% of downloads
        • arvinsima day ago
          I don't believe this. For iOS, sure. But for MacOS? The number of people that use Windows dwarfs MacOS.
          • gjsman-1000a day ago
            Dwarfs MacOS, sure; but the user base has been conditioned, like Android, to never purchase anything. Why would they purchase anything, when most of their time is spent in the web browser and maybe a few Adobe apps?

            iOS is 27% of the mobile market; but total revenue through the App Store in 2024 was $103 billion. For Google Play, it was $46 billion. Double the sales, from a market 1/3rd the size. Whether we like it or not, the whole open platform of Windows being a breeding ground for viruses and piracy, and the ongoing cultural expectations that set, caused a direct effect on people's willingness to buy Windows software from unknown publishers without a third party (Steam, Microsoft Store) vetting them.

            I expect it's highly situational. Don't expect to sell many games on Mac. However, I do find it interesting that services like SetApp exist on Mac, but nobody has tried anything with that level of quality on Windows. SetApp also hasn't shown any interest in expanding to Windows.

        • newscluesa day ago
          Paying users is the key.

          I’d imagine that people have failed to attract users who pay on Linux or Windows, and developers know that people use their software via piracy.

      • pjmlpa day ago
        It is still better engineered than dealing with the distribution of the day, reinventing the way to do sound, the graphics stack, UI, ......

        Once upon a time I thought either GNOME or KDE would win, and we could all enjoy the one Linux distribution, I was proven wrong.

        Then again, I have been back on Windows as main OS since Windows 7.

        • gjsman-1000a day ago
          I don't know why you are downvoted here.

          The engineering standards, and churn within the Linux desktop, are hilariously bad.

          Nobody who uses it has a right to complain about how node_modules has a thousand dependencies and makes your JavaScript app brittle. Their superior Linux desktop won't even be capable of running the same software build outside of a Flatpak without crashes in three years.

          As for lack of documentation, good luck pulling together all the pieces you need to write a fully native Linux application without using Qt, GTK, or a cross-platform solution. Maybe you have your own UI stack that needs porting. A simple request, fairly accomplishable on Mac. The lack of documentation on Linux outside of that privileged route will make Apple's documentation look like a gold standard. Heck, even if you stay on the privileged route, you're still probably in for a bad time.

          • sunshowersa day ago
            Speaking of development workflows, has Apple finally implemented a scan-resistant LRU cache within their VFS layer? Last I checked performance would fall off a cliff once you started scanning more files than can fit in the cache.
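            For context on what "scan-resistant" means here, a common approach is a segmented LRU: new entries land in a probationary segment and are only promoted on a second hit, so a one-pass scan of many cold files cannot evict hot entries. This is a made-up illustrative toy (the class name, segment sizes, and demotion policy are all my own simplifications), not a claim about how any real VFS cache is implemented:

```cpp
#include <cstddef>
#include <list>
#include <string>
#include <unordered_map>

// Toy segmented LRU (SLRU): keys enter a probationary segment on first
// access and are promoted to the protected segment only on a second hit.
// A linear scan of cold keys churns the probationary segment but cannot
// displace entries already proven hot. (A real SLRU would demote evicted
// protected entries back to probation; this toy just drops them.)
class SegmentedLru {
public:
    SegmentedLru(std::size_t probation_cap, std::size_t protected_cap)
        : probation_cap_(probation_cap), protected_cap_(protected_cap) {}

    // Returns true on a cache hit.
    bool access(const std::string& key) {
        auto it = where_.find(key);
        if (it != where_.end()) {
            // Hit: promote (or refresh) into the protected segment.
            segment(it->second).remove(key);
            protected_.push_front(key);
            it->second = true;
            evict(protected_, protected_cap_);
            return true;
        }
        // Miss: new keys always start in probation.
        probation_.push_front(key);
        where_[key] = false;
        evict(probation_, probation_cap_);
        return false;
    }

    bool contains(const std::string& key) const {
        return where_.count(key) > 0;
    }

private:
    std::list<std::string>& segment(bool prot) {
        return prot ? protected_ : probation_;
    }

    void evict(std::list<std::string>& seg, std::size_t cap) {
        while (seg.size() > cap) {
            where_.erase(seg.back());
            seg.pop_back();
        }
    }

    std::size_t probation_cap_, protected_cap_;
    std::list<std::string> probation_, protected_;
    std::unordered_map<std::string, bool> where_;  // key -> in protected segment
};
```

            With a plain LRU, scanning more files than fit in the cache evicts everything; here, a key accessed twice survives an arbitrarily long cold scan.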
          • skydhasha day ago
            > good luck pulling together all the pieces you need to write a fully native Linux application without using Qt, GTK, or a cross-platform solution

            Aren't those the native stacks? Unless you're going for systems programming. The nice thing about GTK and Qt is that you have access to the source code when you're trying to find the behavior of a component (if the docs are lacking). No such luck with AppKit.

          • bigyabaia day ago
            Desktop Linux is so good today that I have not turned on my Mac in 4 years. Sorry your experience has been so bad, but for ease of programming it is a black-and-white decision. Even Windows is a less excruciating Linux development environment, modern MacOS is a veritable dumpsterfire.
            • pjmlpa day ago
              Unless one does 3D development, or real time audio processing.

              My Asus Linux netbook, bought with Linux support, never had the same OpenGL support level as on the Windows drivers.

              And in what concerns hardware video decoding, it only worked during Flash glory days, never managed to get it working with VAAPI.

        • Suppaflya day ago
          >Once upon a time I thought either GNOME or KDE would win, and we could all enjoy the one Linux distribution, I was proven wrong.

          Linux users don't want one to win. As soon as one gained any traction, the users would switch just for the sake of it. It's also crazy how neither ever actually improves because they are so focused on copying whatever windows and mac are doing instead of continuously improving. The linux desktop experience isn't any better now than it was 20 years ago.

          • amliba day ago
            > The linux desktop experience isn't any better now than it was 20 years ago.

            You can't say that with a straight face. 20 or so years ago you would barely have hardware support for anything you wanted to use, or have to go through a battery of guides just to get 50% of your computer working. Nowadays you just boot a live environment and likely 99% of your computer works out of the box, even though your OEM gave ZERO shits about linux support.

            Wi-fi ranged from impossible to pray-it-works after a bunch of disparate CLI commands to properly join a network. Nowadays I see linux being casually used on random machines without a single problem regarding Wi-fi, and the GUIs for managing it are as cromulent as what you get on other OSes.

            X kept being patched to make it do modern things it was never meant to do, thus creating a huge technical debt that is finally being paid off with proper wayland implementations.

            Linux audio went from a complete turd to best in class with the "merging" (more of a complete rewrite but with full backward compatibility baked in) of pulseaudio and jack into pipewire.

            It's now easy to acquire random linux desktop apps, and they keep working between upgrades! What a concept! Developers are actually finally having a decent time developing apps for desktop linux. Maybe it's no WIN32 but hey, you can run those too with WINE and PROTON through Steam, Lutris, Bottles and so on :)

            I could keep going... Honestly, just give it a try if you haven't in a while.

            • Suppafly3 hours ago
              >You can't say that with a straight face. 20 or so years ago you would barely have hardware support for anything you wanted to use, or have to go trough a battery of guides just to get 50% of your computer working.

              Sure, on weird hardware, but if you had something that was decently supported like a ThinkPad, everything mostly just worked, same as now. A lot of your "linux improved..." stuff doesn't matter to end users for the most part. It's nice that they're growing the tent, but it doesn't change the fact that the actual desktop experience hasn't improved much, despite it having been "year of the linux desktop" for the last 20ish years.

            • pjmlp17 hours ago
              Thing is, many of us actually do; as it happens we have to reach out to GNU/Linux deployments on a daily basis.

              So audio is best in class: how many industry DAWs support Linux, and are used at any random audio studio? Not that many.

              The netbook I had until 2024 never handled our router without issues; rebooting the wlan daemon was a common activity during "heavy" downloads, like e.g. a new Rust version.

              What works without issues on my place are Android/Linux, WebOS/Linux, and Sony/Linux (BlueRay).

              Proton is Valve's failure to nurture developers to target GNU/Linux, even though Android/NDK has the same technology stack for game development, and Sony's Orbis OS is close enough with its FreeBSD roots, even with its proprietary 3D API.

      • scarface_74a day ago
        Have you programmed using the Windows APIs?

        For example there are over a dozen ways to define a string and you constantly are having to convert between them depending on the API you are using.

        https://www.reddit.com/r/cpp_questions/comments/10pvfia/look...
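        As a portable illustration of the kind of narrow/wide juggling involved (this sketch uses only standard C++ strings; real Win32 code additionally deals with LPSTR, LPCWSTR, BSTR, CString, TCHAR and friends, and should convert via MultiByteToWideChar / WideCharToMultiByte rather than this ASCII-only toy):

```cpp
#include <string>

// Naive ASCII-only widening: each char becomes the wchar_t with the same
// code point. Good enough to show the narrow/wide split; wrong for any
// non-ASCII text, which is exactly why Win32 provides dedicated APIs.
std::wstring widen(const std::string& s) {
    return std::wstring(s.begin(), s.end());
}

// The reverse direction, equally naive: truncates anything outside ASCII.
std::string narrow(const std::wstring& s) {
    std::string out;
    for (wchar_t c : s) out.push_back(static_cast<char>(c));
    return out;
}
```

        Every API boundary that expects a different one of these representations forces a conversion like the above, which is the friction the comment is describing.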

      • gjsman-1000a day ago
        The problem is that this could be easily applied to many things. To paraphrase:

        It’s honestly nuts that so many developers continue to try to make software using a bloated JavaScript framework and thousands of Node dependencies.

        That might also be true but that misses the point - programming is not engineering; nothing is done to an engineer’s preferred standard, and probably never will be.

        It’s like being a CNC Technician and complaining about how 90% of stuff on store shelves is plastic. A metal gallon of milk would be so much more durable! Less milk would be spilled from puncturing! Production costs, and how they go downstream, are being ignored.

        (Edit for the downvotes, dispute me if you care enough, but literally nobody other than computer programmers ogles your clean code. Just like how nobody other than CNC mechanics are going to ogle the milk carton made on a lathe.)

        • skydhasha day ago
          Software engineering is not programming and is not about clean code. Using Electron is building a skyscraper when you want a place to rest, or a suspension bridge for crossing a small river. Even if you can order almost everything and you're just assembling, it is a wasteful and lazy solution.
          • bigyabaia day ago
            It's very easy to understand. More platforms make more money - your tinkertoy MacOS native frameworks aren't worth shit when Windows users will account for 90% of their customers.

            Wasteful? Wasteful is whichever solution takes the most money while giving the least in return. From the perspective of any rational business, not using Electron is an opportunity cost. Any Mac user knows the truth well, the web has been a more reliable runtime than native since Mojave.

            • skydhasha day ago
              > Any Mac user knows the truth well, the web has been a more reliable runtime than native since Mojave

              And we've got Sketch, Things 3, Bear, OmniGraffle and the whole Omni Group suite, CleanShot, Alfred,... I'm not trying to defend Apple's ecosystem, but if open source can deliver LibreOffice, calibre, VLC,... on all platforms, there's little defense for others to burden users with Electron.

    • favoriteda day ago
      > Networking APIs that require the use of a 3rd party library because the native APIs don’t even handle the basics easily

      This is nonsense. I've been a professional Mac and iOS developer for well over a decade, and even in the days of NSURLConnection, I've never needed a 3rd party networking library. Uploading, downloading, streaming, proxying, caching, cookies, auth challenges, certificate validation, mTLS, HTTP/3, etc. – it's all available out of the box.

      • adamwka day ago
        Yeah NSURLSession is great and I always find whatever library I’m forced to use clunkier than using it directly with some helper methods.
    • ryandrakea day ago
      I think people forget how horrible embedded (including mobile) programming was before the iPhone SDK came along. I just posted about this[1], so I won't repeat myself here, but TLDR: developing for the iPhone is a breath of fresh air compared to how unnecessarily difficult embedded and mobile development was (and in many cases still is) on other platforms.

      1: https://news.ycombinator.com/item?id=43657086

    • BonoboIOa day ago
      I'm constantly amazed how developers worship Apple while Apple couldn't care less about them. Bugs that never get fixed, documentation that's incomplete, wrong or non-existent, and their bug tracking is a complete joke.
    • Oh please, every platform and programming environment has undocumented apis, workarounds and hacks.
  • ramesh31a day ago
    Be careful what you wish for here. Knowing Apple, they will stonewall any API requests, and may very well shut your app out for the private API workarounds described.
    • mort96a day ago
      I don't think Anukari is in the Mac App Store, nor do I think a plug-in like it will ever be appropriate for the App Store, so I don't know what exactly you're worried about.
  • charcircuita day ago
    >Any MTLCommandQueue managed by an Audio Workgroup thread could be treated as real-time and the GPU clock could be adjusted accordingly.

    >The Metal API could simply provide an option on MTLCommandQueue to indicate that it is real-time sensitive, and the clock for the GPU chiplet handling that queue could be adjusted accordingly.

    Realtime scheduling on a GPU and what the GPU is clocked to are separate concepts. From the article it sounds like the issue is with the clock speeds and not how the work is being scheduled. It sounds like you need something else for providing a hint for requesting a higher GPU clock.

  • SOLAR_FIELDSa day ago
    https://xkcd.com/1172/ feels a lot like the workaround OP describes
    • rollcata day ago
      That's more like "I had to trick the OS into thinking that spacebar was held for my application to run at all".
  • Another 'appeal to the tsar'?
  • [flagged]
  • ArthurStacksa day ago
    [flagged]
    • mac9a day ago
      Some of them probably do... This is still a funny comment though