As an avid pirate, I'll say that these days even the Denuvo games that went years without cracks now have "cracks", although they rely on hypervisor-based fixes, which means disabling Secure Boot and giving the crack's hypervisor unfettered access to your system so it can intercept Denuvo's checks. [0] It's a dangerous game we're playing to keep these AAA games' bottom lines fat.
[0] https://www.thefpsreview.com/2026/04/03/denuvo-has-been-brok...
...making it even more clear what "secure" boot actually secures: the control others have over your own computer.
If you own the computer yourself, you "ought" to be able to turn off these measures in a way that is undetectable. Being unable to do so would be the red line, imho - and looking at the hypervisor cracks available, it's not quite being crossed yet. The pessimistic but realistic prediction is that various media companies will want, and lobby for, machines with unbreakable enclaves they can "trust" to DRM your machine; we're just boiling the frog right now. Windows 11's TPM requirement is testament to that.
Switch to Linux ASAP - that's about the only thing a consumer is capable of doing.
If you're starting to think "huh, maybe that's why these age verification laws suddenly became all the rage", you're onto something. Whatever the case, "general purpose computing" is definitely cooked.
Hardware vendors are not going to want that kind of liability, in California, Colorado, New York, or anywhere else. So they will switch to selling hardware with locked bootloaders and only allowing approved operating systems within that locality (which for end-user PCs will mean pretty much just Windows). There is still foreign hardware, but those chinesium PCs are going to be confiscated by ICE unless the Chinese manufacturers also play ball.
Besides all this... federal legislation is coming.
So let's say a PC builder (an individual, not a company) were to donate a PC to charity. Let's say it's built with a fairly recent MSI motherboard (https://www.amazon.com/dp/B0BRQSWSFQ/) - an 'MSI PRO B760-P', if you'd prefer to avoid Amazon.
I remove all my internal SSDs and NVMe drives but buy a new 1 TB SSD for whoever receives the PC. I also install a Linux OS, sign the Secure Boot keys myself via sbctl, set up ukify, efibootmgr, etc. - everything the recipient would need to switch over to another OS if they so choose.
But oh no, the donated PC landed in the hands of Johnny, a 17-year-old in California.
So who's at fault here: MSI, for creating a BIOS that allows non-Windows EFI images to be installed? The PC builder (donor), for knowingly installing Linux (though not knowing where it would end up)?
This is what confuses me, and I'm curious what it means for future hardware sold in the US and for those who build PCs for their own use or for others. Most modern motherboards are "locked down" by default but can easily be unlocked by the end user; it may take a few extra steps or be a bit hard to find, but it's still pretty simple for someone moderately tech-savvy.
Measured boot is actually better for that: you can still boot whatever you want, however you want, but the hashes come out different, which can be used for e.g. remote attestation. Secure boot, by contrast, has to prevent "unauthorized" code (whatever that means for each setup) from ever running - if it does run, it's game over. That means less freedom and flexibility.
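To make the distinction concrete, here is a toy Python sketch of the measured-boot idea (hypothetical code, not a real TPM interface): every component is folded into a running hash, nothing is ever blocked from running, but a verifier comparing the final value against a known-good one can tell the chain was altered.

```python
import hashlib

def extend(pcr: bytes, component: bytes) -> bytes:
    # TPM-style PCR extend: new = H(old || H(component)).
    # The order and content of every measurement affect the final value.
    return hashlib.sha256(pcr + hashlib.sha256(component).digest()).digest()

def measure_chain(components) -> bytes:
    pcr = bytes(32)  # PCRs start zeroed at reset
    for c in components:
        pcr = extend(pcr, c)
    return pcr

good = measure_chain([b"firmware v1", b"bootloader v2", b"kernel v3"])
bad = measure_chain([b"firmware v1", b"evil bootloader", b"kernel v3"])

# Nothing stopped the "evil" chain from booting, but the final PCR differs,
# which is what remote attestation compares against a known-good value.
assert good != bad
```

The point is that measurement is passive: the tampered chain still boots, and only the attestation value betrays it.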
Exactly why I said:
> turn off these measures in a way that is undetectable.
If you own the device, you ought to have the means to make such configuration/changes in undetectable ways. Otherwise, you don't truly own the device.
Some apps want to run on devices that you don't "own", because they are doing something the owner would not want done (in secret or what not).
That being said, it does assume a certain trust in firmware vendors / OEMs. If you don't trust those, then don't buy from them.
I think for most people, trusting an OEM and trusting a rando from the internet with a custom hypervisor that requires crippling your system's security are totally different things...
You know, they could actually make their hypervisor support Secure Boot etc., do it properly, and have your system run the cracks without the gaping holes they leave behind. Lazy.
Boring claim, obviously true.
> and result in a situation worse than not having secure boot to begin with
A very big claim that requires evidence.
See Apple M-series chips: if they get locked, you will never unlock them again.
* GeForce NOW SDK: https://developer.geforcenow.com/learn/guides/offerings-sdk
* Stadia SDK: developer.stadia.com (offline)
* Xbox Cloud Gaming: https://learn.microsoft.com/en-us/gaming/gdk/docs/features/c...
* ...
Just like every game store requires its own build: the Steamworks SDK, and even GOG: https://docs.gog.com/sdk/
Some games allow browsing local files for savegames, your music library, etc. Imagine if you could do that on the cloud VM.
Stadia is completely shut down, and Archive.org has no captures of that subdomain, so any content there is likely lost.
Without it, you risk a malicious driver being loaded first and then making itself invisible to the drivers loaded after it.
Of course there are ways to bypass this too, but it adds a whole other layer of complexity.
TL;DR: Secure Boot exists so that drivers loaded at boot time can trust that nothing was tampered with before they were loaded.
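A toy Python model of that hand-off (a hash allowlist standing in for real signature verification against the vendor's signed db; the image names are made up): each stage refuses to execute an image it cannot verify, so anything loaded later can assume the chain before it was intact.

```python
import hashlib

# Toy Secure Boot model: a hash allowlist stands in for the signed "db".
# Real firmware verifies X.509 signatures rather than comparing raw hashes.
trusted_db = set()

def enroll(image: bytes) -> None:
    trusted_db.add(hashlib.sha256(image).hexdigest())

def boot_chain(images):
    loaded = []
    for img in images:
        if hashlib.sha256(img).hexdigest() not in trusted_db:
            # Refuse to hand off: the chain stops before untrusted code runs.
            return loaded, "refused untrusted image"
        loaded.append(img)  # stand-in for executing the stage
    return loaded, "ok"

for img in (b"shim", b"grub", b"vmlinuz"):
    enroll(img)

# A rootkit inserted into the chain never executes, so later stages can
# trust that everything loaded before them was verified.
loaded, status = boot_chain([b"shim", b"rootkit.sys", b"grub"])
assert loaded == [b"shim"] and status == "refused untrusted image"
```

Contrast with the measured-boot approach: here the unverified image is denied execution outright, instead of merely leaving a different hash behind.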
I don't think any competent security researcher has anything positive to say about "security through obscurity"
At best, this is a lawyer's position.
Obscurity is totally underrated. Attacker resources are limited.
Furthermore, it can reveal the attacker's position, letting you counterfire.
If someone with 1000 tanks attacks, that's a battle you would not have won anyway.
Sure, it's not a security measure as such, but it's still a worthwhile component of the overall defense system.
Are you familiar with the Swiss cheese model of risk management[0]? Obscurity is just another slice of Swiss cheese. It's not your only security measure. You still use all the other measures.
This isn't about security of the same kind as authentication/encryption etc where security by obscurity is a bad idea. This is an effort where obscurity is almost the only idea there is, and where even a marginal increase in difficulty for tampering/inspecting/exploiting is well worth it.
They apply to different threats and different contexts. When you have code running in the attackers' system, in normal privilege so they can pick it apart, then obscurity is basically all you have. So the only question to answer is: do you want a quick form of security through obscurity, or do you not? If it delivers tangible benefits that outweigh the costs, then why would you not?
What one is aiming for here is just annoying and slowing down an attacker, because it's the best you can do.
Some people find cracking them interesting and fun.
The goal is not perfect security in all situations for all products. The goal is to make the effort required for your particular product excessive compared to the payoff.
Take the PS5 for example. It has execute-only memory. Even if you find a bug, how do you exploit it if you can't read the executable text of your ROP/JOP target?
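A small hypothetical Python sketch of why that matters: hunting ROP gadgets is fundamentally a read pass over the executable bytes, and execute-only memory denies exactly that read. The byte values below are a made-up stand-in for a real .text section.

```python
RET = 0xC3  # x86-64 'ret' opcode

def find_gadgets(text: bytes, max_len: int = 4):
    # Naive ROP gadget scan: collect short byte runs ending in 'ret'.
    # The whole scan is a *read* over the code - execute-only memory
    # makes this step (and dumping the binary at runtime) fail.
    gadgets = []
    for i, b in enumerate(text):
        if b == RET:
            start = max(0, i - max_len)
            gadgets.append(text[start:i + 1])
    return gadgets

# Made-up stand-in for a .text section:
# pop rax; ret; nop; pop rdi; pop rsi; ret
text = bytes([0x58, 0xC3, 0x90, 0x5F, 0x5E, 0xC3])
gadgets = find_gadgets(text)
assert len(gadgets) == 2 and all(g[-1] == RET for g in gadgets)
```

An attacker who cannot read the text segment has to find gadget addresses some other way (an info leak, a known build, a side channel), which is the extra layer execute-only memory buys.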
Which provides way more information than the article
Heuristic-based anticheat seems to have fallen out of favor.
I honestly believe we should return to dedicated servers + admins. This hacker/anti-cheat arms race is never going to end.
It is wild to imply they are remotely the same in their effect on the user. One is literal malware, and the other shares none of the capabilities or effects of malware.
- from the slides
I have no idea why anyone would want to do that on a Nintendo Switch, though. The Switch 1 doesn't have any headroom, and the Switch 2's OS security hasn't been defeated yet.
playing an online game, especially if it is competitive, alongside a bunch of cheaters is not fun.
reducing the number of cheaters is not "nothing"
security through obscurity is an effective defensive layer with a relatively low implementation effort. it raises the minimum effort required for bypass.
the quote you have parroted is only applicable when obscurity is the only defense layer. when obscurity is used in addition to other defensive layers, it is a great first line of defense.
You are wrong: if you need to hide your code for it to be secure, then it was never secure to begin with.
But it's a great way to give a false sense of security through half-baked metaphors.
what does this have to do with anything i said or the contents of article?
>You are wrong, if you need to hide your code for it to be secured, then it was never secure to begin with.
did you just ignore the entirety of my last comment? obscurity is a first layer, solely to raise the barrier of entry and slow the game-crackers down. it is not the entire security model.
it is effective at what it is designed to do, and it is low effort to implement.
>But it’s a great way to give a false sense of security through half baked metaphors.
my comments don't have any metaphors. what are you talking about? i think you may be out of your depth here.
your entire comment is based on the premise of obscurity being the only security. i can only say the same thing so many times, but here it is one more time: your original comment is only applicable if obscurity is the sole line of defense. it is not the sole line of defense here.
If you can't understand very simple logic - like how open source vs. closed source is the perfect example of open vs. closed... well, source - then guess who is out of his depth here?
calling me obnoxious and not responding to literally any point i have made in the entire comment chain is an interesting way to win an argument.
good luck.
I won't waste time trying to understand whatever your tech-poser logic is trying to piece together. Hiding your source is not how you secure software; it's how you pretend to be secure for your shareholders.
That's the main reason open source software is more secure than closed source: you don't need to hide secure code. Hiding it actually makes it less secure, because fewer good actors will be able to help you secure it.
It won’t change a thing for malicious actors.
So again, for people who might read this, he’s very, very wrong.
lol
>I’m trying to find what’s true.
i am telling you what is true, straight from someone with significant experience in the related fields.
>Hiding your source is not how you secure software.
not once have i said that hiding your source = secure software.
you are intentionally ignoring and misrepresenting what i have said. i do not understand why.
>So again, for people who might read this, he’s very, very wrong.
for the context of other readers, i worked in cybersec for over a decade and now teach networking for the cybersec and game design programs at a post-secondary level (also in the pure networking program, but that is less relevant).
my opinions in this comment chain are not claims i came up with on a whim. i am happy to discuss them in more detail with anyone who has questions, provided that you will actually read what i have written instead of flailing around.