The reason that computers are locked down by the vendors is not that computers are somehow more dangerous than other things we buy. The reason is simply that it's technically possible to lock down computers, and vendors have found that it's massively, MASSIVELY profitable to do so. It's all about protecting their profits, not protecting us. We know that the crApp Store is full of scams that steal literally millions of dollars from consumers, and we know that the computer vendors violate our privacy by phoning home with "analytics" covering everything we do on the devices. This is not intended for our benefit but rather for theirs.
But on many systems these options do not exist because the vendor likes keeping people dependent on them. This is why devices like Chromebooks, or practically all mobile phones, are more or less e-waste in the making. In my opinion it is a waste to devote any development capacity to these systems beyond the next shitty consumer app, which hopefully always stays optional.
We even have dysfunctional laws that require banking apps to only run on these shitty systems. In my opinion, these errors need a quick correction.
Also, most scams still work just as they did before, and the exfiltration of information, e.g. tracking and "diagnostic data" collected by bad operating systems, is an additional security problem.
Making it easy to unlock could make it easier for scammers to get it unlocked:
> I received the same type of call a little later in the day. They were very adamant they were calling from the Bell data centre, on a terrible line, and I made them call back three more times while I considered their requests. They wanted to have me download a program that would have given them control of my laptop. […]
* https://forum.bell.ca/t5/Internet/Call-stating-that-an-issue...
Making laptops that weigh two pounds instead of 40 pounds could make it easier for thieves to steal them. Making computers less expensive could increase the number of spammers who can afford one and make it easier to send spam. Making encryption widely available could make it easier for bad actors to communicate.
But these things have countervailing benefits, so we do them anyway and then address the problems by a different means. When someone insists on doing it in the way that "incidentally" provides them with a commercial advantage, suspect an ulterior motive.
It would be reasonable to:
- factory reset the device before unlocking it to protect existing data (like Android phones require)
- display warnings, for example "if someone's asking you to do this, it's probably a scam"
- allow the owner to permanently disable unlocking, e.g. for the commonly cited case of someone setting the device up for their elderly parents
This opens a wormhole that warps us back to one of the core issues / battlegrounds in computing: ownership, and specifically, the balance of power and responsibility between the owner and the user, when they're not the same person.
Unfortunately, the same means and the same arguments cited in case of "someone setting the device up for their elderly parents" also apply to employers "setting the device up" for their employees (where "setting the device up" may just mean letting it access the company network), vendors "helpfully" "setting the device up" for the customers (this is basically the whole history of mobile phones - bootloaders now, SIM locks before), etc.
I don't know what the good answer is. I'm personally strongly biased towards the "end-user should always be the owner" perspective, and while I recognize there are strong examples where this isn't the case, I can't figure out how to cleanly separate "legitimate interest" from for-profit or for-power abuse.
And while we're at it, let's not allow apps to refuse to run because of rooting.
I think it's pretty consistent: whoever legally owns the device should be allowed to decide what is and isn't allowed to run on it.
I never understood this point. What threat is it protecting the data from? Surely a thief should not be able to unlock a device without first typing the correct PIN/password, and if they can do that, they can access the data regardless.
There have been times when I really wished that I could OEM unlock my Android device without wiping but overall I think I sleep better knowing that my PIN isn't sufficient to extract all of its data.
Ahh, if only governments would start cracking down on scammers.
Alas, scammers are a feature of modern capitalism. You'd not be wrong if you thought modern businesses are built on scamming people.
"Dependent" is not exactly the right word here; lower support costs probably is. If a vendor gives out root access, and that root access can brick a machine, then you will get a small percentage of very high-touch broken devices coming back as returns. Customers like this are in the 'dangerous enough' but not 'good enough to do it correctly' stage of hacking. They will then not claim any responsibility for breaking it, as they are hoping you just fix it for free.
I had one customer who would randomly change stored procedures in our code, then yell at our tech support about things not working or being broken, wasting hundreds of hours of our time until we realized what he was doing. Locking him out is very appealing. Instead we sold him and his management on 'we will do the work for you for a fee', which was more along the lines of 'you do this again and we will fire the customer'.
That is but one small thing that can/will happen.
I'd be really surprised if the number was more than 1 in 100. And if 1 in 20 brick the device in the process, that's 1 in 2000.
According to [1] the average warranty claims rate for consumer electronics is 1 in 100. I doubt the difference in support load would even register on the scale.
Exactly. We charged the guy for what he did. We gave him 'sa' access to the database and he tried to burn us.
I think you may be assuming people act rationally? They do not. Most will, but you will always get 'that guy', especially at scale. People will lie about what they have done, or not even realize what they did goofed things up. In my example the guy was asking us to pay them back for defective software (millions of dollars), right up until we proved he had broken it on purpose. I later found out he did it on purpose (confirmed by former coworkers of his: 'he likes to mess with vendors'). He was not even alone. At least 3 other people tried that trick on us at different companies.
Most service requests are 'easy': small tweak/reship and off you go. But someone who has really broken something can range from 'ship them a new one' to weeks of trying to figure out why a device has suddenly started acting out of spec. That means at least 1-2 people working on something for a period of time. That costs money.
> I'd be really surprised if the number was more than 1 in 100.
It is the time you have to put into looking into why you ended up with a defect that is not a defect. The margin on some of these IoT devices is in the couple-of-bucks range or smaller. Having to dedicate 2 guys for 3 months to figure out what went sideways can eat the entire profit margin of the whole run.
I was just saying I can see why a company would withhold the info. I did not say I agree with it, especially for things that are out of warranty. I think companies are using it to basically have no support, leaving what would be a decent customer hanging and hoping they can convert them to another sale. There is no 'one reason'; there is a list.
It does seem like there ought to be a reasonable split between personal software and business stuff. I mean, you guys had a big contract; it is some negotiated thing between two peers, so it could be reasonable to negotiate root in some subsystems and not in others. In the end you can’t really trust anything a system tells you if somebody has full root on it. It seems like you guys keeping control of the logging would be a reasonable give for them, if they expect support. (But why would you guys have planned around a downright adversarial customer? That guy is weird.)
Also, doesn’t this seem like… basically some kind of fraud? I wonder if your annoying user expected to be able to add whatever savings he got back from the support contract to his “value to the company” somehow.
For personal customers who are just buying smartphones, we don’t really have giant support contracts to screw around with.
For IoT devices/cell/etc it could be 'bad' to give out the root password from a company PoV, as there are so many out there with the exact same password on several thousand devices (poor security, but you can image a thousand devices in a few hours). Once given out, it is written down in some wiki and everyone has it (welcome to the botnet). So if you get one, change whatever you were given and assume everyone else has it. Or maybe the 'secret sauce app' is under some random user account, but give out root and that special secret account is bypassed. Then it is off to China somewhere to be ripped apart and resold under a new brand name at half the cost.
Then on top of that, let's say you are a nice company giving the thing out. That means you will need some sort of training for your support guys, documentation on how to do it, and so on. Those things cost money for an EoL product you no longer make anything on.
Like I said there is a list of things as to why not to do it. There is also an interesting list of why to do it. But the upside is low for the company to allow it. I wish more companies would do it. But it is rare.
If people want companies to do this, the company has to be incentivized to want to spend any time/money on it. If people can make this an upside for the companies, rather than 'shame' and 'you broke the law', the companies will help.
You can even combine them if you want. Free support for the software that comes with it but if you replace the boot loader then support calls are billed hourly. There is no excuse for not allowing it -- it's leaving money on the table.
Unless the reason is that locking the user out of the device has the purpose of monopolizing ancillary markets, which should be an antitrust violation.
What you are describing is not a tradeoff but a magnificent bribe. They bribe us with measly benefits to get us to accept a deal that is incredibly favourable for them.
Really? Please name them. Over the past 10 or 15 years, I've never seen anything other than the iPhone/Android or Mac/Windows duopoly for sale in any retail store. I've never seen any advertising for other than those duopolies. The HN crowd may be aware of obscure options, but for the vast majority of consumers, they don't exist. And since we as developers make money catering to the vast majority of consumers, we're kind of stuck with the duopoly too, at least as far as our work is concerned.
That's purely hypothetical. How could anyone prove or disprove the assertion?
The general point, though, is that consumer awareness is essential for sales. People won't buy things that they don't know about. As an indie developer myself, I'm painfully aware of this. It doesn't matter how great one's product is if nobody knows about it. Advertising is very expensive, so it requires vast capital outlays in order to get your products into the minds of consumers and onto the shelves of stores. The big established brands have a massive advantage, making it difficult for competitors to break into the market. Apple itself leveraged its existing brand, with Mac and iPod, in order to promote iPhone. And Apple's primary competitor is Google, who also was already an established brand via search and Chrome.
Remember that back in the day, Microsoft almost destroyed the entire desktop OS market. They almost killed Apple too. Only the Department of Justice put some kind of brake on it, and Microsoft let Apple live in order to provide antitrust cover. If MS had, for example, simply withdrawn its apps from Mac—Office, Internet Explorer (remember that Internet Explorer was originally the default web browser on Mac OS X before Safari!)—Apple likely would have died.
The hardest part about switching from Facebook isn't installing some other app or anything like that, it's getting everyone else you know to switch from Facebook.
The hardest part about switching from Windows isn't installing Linux, it's getting e.g. game developers to target Linux before it has significant consumer market share.
That isn't to say that doing these things is impossible, but it certainly isn't trivial, so anyone wondering why it hasn't happened already can't seriously think the only explanation is that nobody cares. It's like saying nobody cares about high healthcare costs -- of course they care; the question is what do we have to do to fix it?
Now, how many of you guys have this? Or anything like this? I bet 95% of the HN crowd happily uses iOS/Android daily.
Anyway, desktop computers aren't really the main problem here. For example, Apple Macs offer vastly more personal freedom than Apple iPhones. If iPhones behaved like Macs in that respect, then we might not be having this debate. To the extent that Macs have been increasingly locked down over the past 15 years, it's mostly just copying the iPhone, porting the "features" over from one platform to the other.
That's quite an assumption.
I think the reason people don't care is because they don't know. The average person either doesn't know, or barely knows, that anything deeper than what they see in the user interface is happening on their system.
We humans are very much an out-of-sight, out-of-mind type of creature. If we can't see it, it's hard for us to imagine that it exists.
The reason people don't care is because digital freedom/privacy is largely irrelevant to most people's lives. You can't convince someone to care about something that doesn't affect their life, they're too busy for that.
GP is correct about Apple products; even among the HN crowd they are likely the most popular devices. I think this is because most readers aren't trying to die on the hill of openness. They're more concerned with software and ubiquity, two areas where Apple is doing very well.
You do get many here enthusiastic about open access to your own hardware, but I think we're talking about a Venn diagram; we're not all the same. (I'm an Android user.)
The thing about all this is, Apple's products being well-integrated and well-designed doesn't require them to be locked down the way they are. The EU move to force them to use USB-C/Thunderbolt over Lightning is a perfect example of this. It unilaterally improved things for users, and iPhone 14 vs. 15 sales reflected that pretty clearly.
So I'd especially describe Apple as the "least bad" rather than "completely acceptable." They're specifically what I had in mind saying that.
That's definitely true, and it's what has made me favor Google over Apple for decades now. Google's deal has been free software for the price of your user data, but I've accepted that deal because Google has never practiced predatory lock-in. Apple makes claims to value your privacy (I wouldn't know) while making predatory lock-in fundamental to everything they do. Denying access to your own device is part of this.
The irony is that I loathe the data economy. I think it has gone far beyond what Google ever envisioned (for years it seemed they had yet to discover a way to make money at all). The privacy aspect matters, but I also hate the way it makes companies and their products behave; the way it feels like every click results in an attempt to directly advertise to you. And it's all clumsy and broken. How often are ads even correctly targeted? I feel about conglomerated user data the way I feel about meme coins: it's all built on speculation, hopes, and dreams, and has less to do with people actually buying your product. I can't wait for the bubble to burst and/or for a global ban on the sale and purchase of user data.
For your last sentence, though... user data and its utilities are arguably not a "bubble." And as we've seen with AI training, use of data being illegal doesn't really stop companies from doing it. I think we'll have better actual results from governments forcing Apple to let us run our own software on the hardware we buy, as opposed to governments trying to prevent Google, Meta, et al. from abusing customer data.
A lot of this has to do with the fact that the former is about regulating our rights with hardware, while the latter is about software. Hardware is just easier for governments to regulate. When you try to regulate software, companies will do things like the deliberately-annoying cookie popups we got after GDPR/CCPA, because it's cheap to produce lots of bullshit to experiment with ways around those regulations.
You make a fair point though; the case does need to be made as to why this is a market failure and not just consumer choice working as expected. Why _do_ consumers tolerate manufacturers retaining ultimate control of consumers' property after the sale? It certainly doesn't seem to be that important to them. Maybe greater awareness of the issue would help somewhat?
Just my opinion from many conversations with normies about this: it's because most of them don't know (the marketing material from these companies certainly doesn't advertise it), and the ones who do know don't care, because they either lack the technical knowledge to root/unlock and utilize the capabilities, or simply don't want to.
This is a good point. Some of that is perhaps self-perpetuating: Why root if there's nothing you can do with root? And why develop stuff you can do with root if there's nobody who can use it? If there weren't so much active suppression of software freedom by manufacturers maybe the situation would change and the benefits of consumers having full control of their devices would be more apparent.
Agree fully. Don’t know why you’re being downvoted. I accept the risk or tradeoff of Apple or MS spying on me. It’s not that, but the right to repair, to tinker, to hack. Those things have brought us so many interesting, wonderful things. My entire generation (millennials) has superior tech literacy to both those that came before and after (no shade to the older gen - some of you are better than us, but with millennials it’s so much more widespread than e.g. Gen X). Many in the younger generations never use ”real” computers (only tablet & phone). The gilded age was an anomaly, and is over.
> the case does need to be made as to why this is a market failure and not just consumer choice working as expected
I swear this consumer choice navel gazing will be the death of innovation. The US is obsessed with this narrative, that the magic market hand will self-correct, without any justification or scrutiny. Yes, consumer choice is necessary, but not sufficient. Just look at the developments in tech over the last decade+. I don’t have the solution but anyone who’s not entirely lost in dogma should be able to see the failures.
There are costs to any regulation, and lots of possible unintended consequences. So even though I'm personally a strong advocate for user control and software freedom, I'm wary of acting without strong justification and careful consideration of the underlying reasons behind the status quo.
> I accept the risk or tradeoff of Apple or MS spying on me.
For what it's worth, I do think this issue has indirect effects on privacy. If you have ultimate control of the software on your device, you can use that control in ways that help protect your privacy. Otherwise you're limited to whatever protections the manufacturer decides to grant you.
There are lots of similar positive possible downstream effects of software freedom, which is why I think this is an issue worth serious consideration despite my misgivings.
The underlying premise here is that the alternative is available for consumers to choose, i.e. that you can buy something which is otherwise equivalent to an iPhone but supports third party app stores or installing a third party OS. But that isn't the case.
What you get instead is e.g. Fairphone, which has the specs of a $200 phone but costs $800 and if you actually have root your bank app might break etc. And still many people buy it. So all you can conclude from this is that the price the mass market places on freedom is less than $600 plus some non-trivial usability issues, not that they value it at zero and don't care about it at all.
On top of this, it's a threshold issue. If the median phone were rooted, people would develop apps that need root. When the percentage is in the low single digits, if not a fraction of a percent, they don't. Then taking the trade-offs of a phone that can be rooted isn't buying you what it should, because you need a critical mass in order to achieve the expected benefits, but you need the benefits in order to achieve the critical mass. This is the sort of situation where a mandate can get you over the hump.
> There are costs to any regulation, and lots of possible unintended consequences.
A good way to handle this is through anti-trust, because then you can do things like exempt any company with less than e.g. 5% market share. That means not Apple or Google or Samsung, but if there is any major problem with the rule then the market can work around it by having 20+ independent companies each provide whatever it is that people actually want. Meanwhile that level of competition might very well solve the original problem on its own, because now a couple of them start selling unlocked devices without any countervailing trade offs and that's enough to make the others do it.
And then the whole world suddenly went apeshit, so Apple basically shrugged, said “fine, we’ll do it just like everyone else and put your photos in the relatively unprotected server domain to do the scan”. Sucks to be you.
Understand that at no point was there an option to not scan on upload; like all cloud providers, Apple scans any uploaded photos for CSAM to stay out of any government grey areas.
Apple only ever scanned images being uploaded to the server. They were only ever going to scan images (even if it was done on the local device) if they were uploaded to the server.
On the one hand, you:
- do the scan in private, get a pass (I'm assuming we all get a pass), and no-one outside of your phone ever even looks at your images.
On the other hand, you:
- do the scan on upload. Some random bloke in support gets tasked with looking at 1 in every 10,000 images (or whatever) to make sure the algorithm is working, and your photo of little Bobby doing somersaults in the back garden is now being studied by Jim.
If you never uploaded it, it was never scanned, in either case.
So yes, you've lost privacy because faux outrage on the internet raised enough eyebrows. Way to go.
This is Hacker News, after all. What made the computer great was programs. What made the smartphone great (smart) is applications. It's insane to me that these companies are locking down their most valuable assets. The only way this works is if you're omniscient and can make all the programs users could want yourself. This is impossible, considering both individuality and entropy (time): time marches on, and you have neither the time nor infinite resources to accomplish all that. I mean, we're talking about companies that didn't think to put a flashlight into a phone, yet it was one of the first apps developed. You could point to a number of high-utility apps, but I'm also sure there are many you all use that you're unlikely to find on most people's phones.
We can also look at the maker community. It's flourished for centuries, millennia even. People are always innovating, adapting tools to their unique needs and situations. To some degree this is innately human, and I'm not embellishing when I say that closed gardens and closed systems are dehumanizing. They limit us from being us. The person obsessed with cars who makes a sleeper Honda Civic, the person who turns trash into art, the person who finds a new use for everyday objects. Why would you want to take this away? It even hurts their bottom lines! People freely innovate and you get to reap the rewards. People explore, hack, and educate themselves, dreaming of working on your tech because of the environment you created. By locking down you forgo both short-term and long-term rewards.
I also want to add that we should not let any entity that does not create open systems claim to be environmentally friendly or climate conscious, no matter how much recycling they do. Because it is Reduce, Reuse, Recycle, in that order. You can't reuse if your things turn to garbage, and reusing certainly plays a major role in reducing.
People living in "bad neighborhoods" have to spend more energy and money on locks, fences, security cameras, self-policing as to not go out alone after dark, etc.
Problem is, the Internet (and the international phone system, to a lesser degree) makes everything so much closer that scammers from half-way around the globe are "local" for all intents and purposes. Thus, online, every neighborhood is a "bad neighborhood".
This is like the exaggerated crime coverage on the local news.
I've lived in the so-called "bad internet neighborhood" for 30 years, and I'm fine. It's not so bad.
Failure to obey them might mean jail time in those countries if caught disobeying, or a hefty fine, not counting what the misuse itself might bring, regardless of the country.
That's because there are people behind every product, and the people behind computers tend to be the paternalistic, nanny-state type. Just read through the histrionics in any HN thread about leaf blowers: they want every landscaper locked up and their tools of the trade taken away. Someone once suggested they should be forced to use rakes. Imagine if some landscaper insisted on what laptop you should use.
Just as you wouldn't expect to find many buzz-cut Army types roaming the Google campus the way you would at a gun company, you wouldn't expect some blue-haired, face-pierced sales engineer selling you table saws.
It's a cultural thing, nothing more.
We do not? You don't even need a license to buy or operate a computer, unlike with some other examples on your list.
I didn't mean "the law". To the contrary, the submitted article author was proposing that we pass laws giving greater individual consumer rights over their devices. But the big tech companies have been viciously fighting against consumer rights, such as the right to repair.
Vendor lockdown is that. Defenders of vendor lockdown argue that computer users need to be protected paternalistically from themselves.
For some reason we accept that for computers, but nobody would accept refrigerators and ovens that only allow you to eat healthy foods, nobody would accept homebuilders controlling the doors of your house and having to approve anyone who comes in, etc. Why do computers get this special treatment of vendor lockdown, but not any other product?
> Why do computers get this special treatment of vendor lockdown, but not any other product?
Of course they don't; plenty of other products are treated much more seriously by "us" (supporting lockdowns that limit your own use without supervision), some of which you've already listed.
There are legal mandates regarding the sale and use of certain products. For example, you have to be a minimum age to buy cigarettes and alcohol, stores in some localities can only sell alcohol during certain hours, bars have to close at a certain time, you can't drive drunk, you must wear a seatbelt, you can't exceed the speed limit, etc.
But there are no vendor lockdowns in this regard. A cigarette will allow anyone to smoke it, a container of alcohol will allow anyone to drink it, your car still works if you're drunk and don't put on your seatbelt, etc. If your car made you take a breathalyzer test whenever you wanted to drive, or it didn't allow you to exceed the speed limit, that would be vendor lockdown.
I discussed the issue in another comment: "The equivalent would be if you could only use specific brands of replacement chains, blades, tires, or bullets that are approved by the manufacturer, for which the manufacturer gets a cut of the sales of those replacements." https://news.ycombinator.com/item?id=42684134
So here is your mistake: you only accept something almost literally identical to computer lockdown (same with your fridge example), while brushing off more serious usage "lockdowns" that don't exist with computers.
> The equivalent would be if you could only use specific brands of replacement chains, blades, tires, or bullets that are approved by the manufacturer, for which the manufacturer gets a cut of the sales of those replacements.
Yes, this exists and is common in complex mechanical things, e.g., you lose warranty if you use unapproved parts, or for some parts there is actually not even an alternative, so the manufacturer is the only one getting a cut.
So again, there is nothing unique or "most dangerous" about computers in either reality or people's prescriptions
Although since your argument isn't about real restrictions, but about what commenters support, you'd need to ask them which of these existing restrictions they support vs computers
You're equivocating on the word "vendor". You know full well that in this context, the vendor means the manufacturer of the computer, for example, Apple, and not the retail store selling the computer, which may not be Apple but rather Best Buy, for example. Likewise, in my analogy, vendor lockdown of a cigarette would mean lockdown from the manufacturer of the cigarette, for example, Philip Morris, and not the retail store selling the cigarette.
> This is a much more serious restriction of consumer freedoms than if anyone can buy and use, but can't smoke when drunk and in bed (fire safety) due to some other lockdown mechanism built into the cigarette itself. More people are affected
This is actually false, because the only restriction on the sale of cigarettes is that you can't buy them if you're under age 18. Anyone age 18 or older is free to buy and smoke as many cigarettes as they want. Adults have full, unrestricted freedom. And that's what they should have for computers too. For better or worse, children have a huge number of legal restrictions on them.
Computer vendor lockdown affects all adults, no matter how old. Indeed, some people claim that the point is to protect your grandma, yadda yadda.
This is actually my point about being "dangerous". That is, we seem to consider computers the most dangerous product for fully grown adults who have no age-related restrictions on purchasing things, because nobody is proposing or defending manufacturer lockdowns on other products for fully grown adults. We think that fully grown adults get to decide whether to smoke cigarettes, drink alcohol, eat junk food, etc., but for some reason fully grown adults can't decide to install software on their own computer.
> So here is your mistake: you only accept something almost literally identical to computer lockdown (same with your fridge example), while brushing off more serious usage "lockdowns" that don't exist with computers
I wasn't "brushing off" legal restrictions. I was merely distinguishing them from restrictions that come from the manufacturer.
The difference, of course, is that computer vendor lockdown is not legally mandated, and thus they don't have to lock down the devices. They're doing it totally voluntarily, and I believe the reason is increased profit rather than increased security.
> Yes, this exists and is common in complex mechanical things, e.g., you lose warranty if you use unapproved parts, or for some parts there is actually not even an alternative, so the manufacturer is the only one getting a cut
And this malicious practice is being challenged by "right to repair" laws.
> So again, there is nothing unique or "most dangerous" about computers in either reality or people's prescriptions
You're missing the entire point here. There are a lot of people who defend computer vendor software lockdown, in the name of "security", but there aren't nearly as many people who defend the warranty practices you just mentioned.
I find the limitations of your analogy artificial and thus irrelevant. Other people thinking about the trade-offs aren't bound by your decision that, in the whole supply chain, only the manufacturer's limits should be considered. So while you're free to arbitrarily limit your thinking, that won't help you answer questions like "Why do computers get this special treatment of vendor lockdown, but not any other product?"
> reason is increased profit rather than increased security.
That's fine, but we shouldn't rely on vendor motivation anyway, so the validity of your assessment doesn't help us decide when the increased security is worth it
> You're missing the entire point here
You've cut your quote off to make it seem so. I've explicitly mentioned the perception in the very next sentence
> but there aren't nearly as many people who defend the warranty practices you just mentioned.
That would depend entirely on the specific tech involved and other factors. Are you sure people defending software vendor lockdowns would not defend some limits for parts for nuclear plants? For guns? Also, why did you skip the "for some parts there is actually not even an alternative" practice? Would fewer people defend the right of a manufacturer to also manufacture parts for sale (forcing some kind of divestment so that the "vendor" doesn't get an extra "fee" from the parts business)?
The equivalent would be if you could only use specific brands of replacement chains, blades, tires, or bullets that are approved by the manufacturer, for which the manufacturer gets a cut of the sales of those replacements.
I think the marginal security value of denying root on the computer when you have already wangled root on the human is small.
And how exactly does device vendor lockdown stop this particular scam?
Who the fuck knows? And how is that even remotely a useful question to ask? It's not answerable; those who commit the scam are the only people with the figures, and there's no "register of fuckers who scam other people" where they have to tell you how well they do.
> how exactly does device vendor lockdown stop this particular scam
Premise 1: All (for a suitable definition of "all") computer users are clueless when it comes to internet security
Premise 2: You are not trying to help any given individual's security, because some of them violate premise #1. You are trying to raise the bar for the clueless hurting themselves.
Premise 3: It is not about "personal freedom". It is about preventing the clueless (by no fault of their own, this shit is complicated) becoming drones and mules for attacks on others. It is an attempt to increase the greater good at the expense of placing restrictions on what any individual can do on their own phone. Those restrictions can be mitigated mainly by coughing up $100/year, which is a sufficient bar to prevent bad guys from doing it en-masse, but not so high as to prevent the people who want to do stuff from doing it.
Stopping people doing stupid stuff because they don't know any better is the goal, and that inevitably gets more and more restrictive as time progresses, because an arms race is instituted between the truly evil arseholes who prey on the clueless and the manufacturers who don't want their products seen as vehicles leading the clueless to the slaughter.
Personally I don't give a crap. The iPhone is fine for me as-is, I can install my own software on my own phone, and sure it costs $100/year. That's not a big deal IMHO, in terms of outgoings it barely registers above the noise floor. YMMV.
Um, why do crime statistics have to come from the perpetrators rather than from the victims? The victims report the crimes, duh.
Anyway, you spent a lot of words avoiding my question, which is how exactly does vendor lockdown stop the Nigerian prince scam? You're arguing that vendor lockdown is supposed to protect consumers, but you can't seem to explain how or how often.
You asked for (quoting) "Exactly how many people have fallen for the scam, out of all computer users". Not every crime is reported, duh.
> Anyway, you spent a lot of words avoiding my question
Nope. I can't answer the question because it's non-answerable. If you believe that nobody has ever fallen for phishing, Nigerian-prince, etc. etc. scams, well, I don't know what colour the sky is on your world, but it's not the same as on mine...
If you further believe that allowing everyone root access to devices that are also linked directly to their bank accounts, social security numbers, driving licenses, etc. carries no risk, then again, sky colour becomes an issue.
You seem technically savvy. I do not believe you are typical of the average phone user. I think the restrictions in place are a necessary guard against a tragedy of the commons, to prevent the destruction of trust in the system as a whole.
As I said, YMMV, and I'm not saying I particularly like the situation, just that I think it's necessary, and opening up everything to everyone is a foolish, idealistic, and hopelessly naive idea.
Not every crime is reported, but it's indisputable that a lot of crimes are reported. So give me a statistic, any reported statistic.
> If you believe that nobody has ever fallen for phishing, Nigerian-prince, etc. etc. scams, well, I don't know what colour the sky is on your world, but it's not the same as on mine...
How do you know this, except from reports by victims? That's what I'm asking for.
And once again, you haven't explained the mechanism by which vendor lockdown prevents this scam. However many or few victims there are of the scam, precisely zero of them are helped by vendor lockdown. I'm not going to stop asking you to explain how vendor lockdown is even relevant here.
> If you further believe that allowing everyone root access to devices that are also linked directly to their bank accounts, social security numbers, driving licenses, etc. etc.
This is hand waving, and it's not clear how root access by the owner of the device somehow exposes userland data to criminals. Moreover, all of this data is on desktop computers, and it's mostly fine.
As I said, I don't care about the current OS situation, I think it's actually pretty well reasoned out. I'm not spending my time tracking down statistics for you to "prove" some point to some other person on the internet.
I don't care enough to argue. Have a nice life.
A simple Google search would do: "'Nigerian prince' email scams still rake in over $700,000 a year" https://www.cnbc.com/2019/04/18/nigerian-prince-scams-still-...
$700k a year as an excuse to lock down over a billion smartphones? Not to mention that once again, this is an email scam, and thus vendor lockdown is irrelevant and doesn't prevent it.
It appears that you're the one believing whatever you want to believe, despite the empirical facts. The problem is that proponents of vendor lockdown always make gross exaggerations to defend it, pure fearmongering.
The thing with chainsaws and motorcycles is that they look and feel dangerous, and people have an intuitive understanding of how to approach those dangers.
If you ask a random person on the street about safe motorcycle riding, they'll probably tell you about respecting speed limits, wearing protective gear, only doing it when sober, not pulling stunts / showing off etc. I've never been on a motorcycle, have 0 interest in them, and I know those things.
Computers don't work that way. People can't distinguish between a real banking app and a fake banking app that looks real, an update pop-up and a fake "you need to update Adobe Flash Player" pop-up on a phishing website etc.
I've done plenty of "helping non-technical people out with computers" during my middle / secondary school days. That was when people still used Windows a lot, as opposed to doing everything on their phones. Most computers I saw back then had some app that hijacked your start page, changed your search engine to something strange, or would constantly open random websites with "dpwnload now free wallpapers and ring tones for your mobile now" etc. You didn't even have to fall for a scam to get something like that; plenty of reputable software came with such "add-ons", because that's how you made money back then.
I feel like that era of "total freedom" has somehow been erased from our minds, and we're looking at things through rose-tinted glasses. I, for one, vastly prefer the world of personalized ads and invasive surveillance over one where I constantly have to be on alert for my default browser being changed to Google Chrome for the hundredth time this year, just because I decided to update Skype.
How did this matter? People may know these things, but they nonetheless ignore speed limits, don't wear helmets, drive drunk, pull stunts, etc. And the motorcycle manufacturer can't stop them. They have the freedom to harm themselves.
> Computers don't work that way. People can't distinguish between a real banking app and a fake banking app that looks real
Guess what, people can't distinguish between the real and fake apps in the crApp Store either. Let's stop pretending that it's safe, when we've seen over and over that it's not.
> That was when people still used Windows a lot, as opposed to doing everything on their phones.
People still use Windows a lot. Smartphones have not replaced desktop computers but rather added to them. Almost every desktop computer owner also has a smartphone. I believe that desktop computer sales are as high now as ever; I know that's true for Apple Macs, specifically.
> I feel like that era of "total freedom" has somehow been erased from our minds, and we're looking at things through rose-tinted glasses.
It hasn't been erased. The desktop never left. It's been surpassed in volume by smartphones, of course, but let's not pretend that desktops were somehow made obsolete and removed from the Earth. The people who have enough money buy smartphones and desktops. Many even have a smartphone, a desktop/laptop, and a tablet. The choice is not about security, it's about money and form factor. When I leave home, I put a phone in my pocket. When I'm on the couch, I use a laptop. When I'm reading an ebook, I use a tablet.
That's why you never blindly clicked "next" in installers. Everyone got one of those IE toolbars accidentally at some point, but it usually only took doing it once to learn the lesson.
An irrelevant "whaddabout" argument.
It doesn't change that we need security and privacy for our information handling devices, as well as personal control. The real conversation is about how to best balance these.
An irrelevant false dichotomy argument. There's no inherent conflict between security/privacy and personal control. I would argue that a device which has to phone home to the vendor to get approval for everything results in both less privacy and less personal control.
I can do online banking on my PC as root user if I so choose, but I cannot do online banking on my phone because my bank's app employs a rooting detector SDK that as of now even Magisk+a host of (questionable) modules can't bypass.
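For a sense of what such an SDK looks for, here's a minimal sketch of the classic local heuristics (a su binary in a standard path, a test-keys build). It's purely illustrative: the names are mine, not any vendor's API, and the SDKs banks actually ship combine many more signals, including remote attestation, which is likely why hiding modules struggle with them.

```kotlin
import java.io.File

// Purely illustrative, hypothetical names: the simplest local checks a
// root-detection library might start with. Real SDKs layer many more
// signals (including remote attestation) on top of this.
object NaiveRootCheck {
    private val suPaths = listOf(
        "/system/bin/su",
        "/system/xbin/su",
        "/sbin/su",
        "/su/bin/su"
    )

    fun looksRooted(): Boolean {
        // A su binary in a well-known location is the classic tell-tale.
        if (suPaths.any { File(it).exists() }) return true
        // Builds signed with test keys suggest a custom or modified ROM.
        if (android.os.Build.TAGS?.contains("test-keys") == true) return true
        return false
    }
}
```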
How do you even formulate these values so that they're in conflict in the first place?
If you're serious about this stuff binary thinking is a mistake. It's not a question of whether rooting is possible or impossible. It's a question of under what circumstances it can be done, and under whose control.
Also, "conflict" is the wrong word here. It's a question of competing concerns not conflicting ones.
We probably want root access to be under the end-user's control, but in such a way that minimizes the ability of malicious parties to exploit it.
e.g., one way would be to allow anyone to easily install any root they want, but to disallow software from, say, the Apple App Store from running on such "rooted" devices. While that gives end-users control and would mostly prevent malicious actors from getting what they want, it's probably not what most users would want. They probably want to run all their regular software alongside the root software.
Another way would be to allow people to easily install software as root, and allow software from popular app stores to run on it. That gives users max control, but is pretty easy for malicious actors to exploit too. People aren't going to be too happy with this when some coupon-clipping app starts emptying people's bank accounts.
These are just examples to give the idea of the range of possibilities. The real answer needs to be a lot more nuanced than this. The point is, pretending there aren't issues doesn't get us anywhere. You might as well have no opinion on this.
To use the same logic, they shouldn't be given anything with a visible screw, or are you going to tell me they _wouldn't_ take a screw driver to an appliance because that would be silly for someone who doesn't know what they're doing in there?
To an extent, crime can't be eliminated. You can't even eliminate crime by instituting a strict authoritarian regime, because power corrupts, and those in power become criminals themselves. That's why turning big tech companies into paternalistic device authoritarians doesn't work. The big tech companies have become massively corrupt, demanding a 30% cut of everything that happens on your devices, in return for what? Some low paid, low skill reviewer spending a few minutes to approve or reject a third party app submission? That's not security, it's security theater.
There were phone scams before there were smartphones, before there were mobile phones, when everyone had a landline. There's no technical solution for crime and scams, much as tech people want there to be. Education and vigilance have always been the only effective resistance. Unfortunately, the big tech companies don't want to do education; to the contrary, they want consumers to be eternally technically ignorant—despite the increasing importance of computers in our lives—because that's more profitable. At least with cars, we have mandatory driver's education.
The problem is that a line is being drawn in an arbitrary place; if scammers are the worry, don't let them have a phone, or internet or email either, in fact just don't let them talk to any strangers in person or otherwise, but that would be awfully inconvenient for them.
Everyone is willing to make a compromise somewhere so long as the compromise isn't something they care about. Some readers probably think the suggestion of taking away their phone or email is absurd to protect them from scammers, and I'd place preventing root-access in the same category; not disabling it by default, I'm ok with that, but preventing it entirely.
My opinion is that everything should be secure by default, but when it's something you own, there should be reasonable, measured steps to "unsecure" it, whether that's removing a couple of screws, or accepting a disclaimer to gain root access to the device you own.
If I don't own it, let's cut the bullshit and tell me I'm merely licensing or renting it, and we'll adjust the price I'm willing to pay accordingly.
And non-owners shouldn’t be able to have access solely based on their physical possession - quite the contrary, owner should have means to fully use hardware security features for their personal benefit, locking their own device as tight as they want (within the device’s technical capabilities).
I have taken the opposite stance on that. Never again will they be left with some Samsung bloatware which hardly receives any Android updates, when phones such as Nexus, Nokia and Nothing cost the same and have excellent LineageOS support.
Lineage is stable, bloat-free, self-updating, and requires no maintenance on my side.
As far as I know, Mediatek (and vendors that use those chips) are usually not good with regards to GPL compliance, which means no LineageOS if kernel sources are not available...
> I agree with the premise that consumer devices, such as mobile phones, should be as secure as they can by default. This can even go so far as shipping new devices with locked bootloaders and blocking access to root. ..
> But this shouldn’t come at the expense of being able to make an informed choice to unlock these privileges to install any software you want, even if that means adopting a higher level of risk.
One does not require "easy root access" to make that informed choice - complicated root access (within reason, as pulling out the soldering iron might be a step too far) should be enough for tasks like installing a new OS because the company no longer supports the hardware.
It makes sense to allow the _buyer_ to responsibly lock out others. This is common in other products that could be dangerous. But allowing the _seller_ to lock out others, e.g., competitors or the buyer, is a recipe for malfeasance, at the buyer's expense. Interestingly, with computers and pre-installed software, there is no option to lock out the sellers such as Apple or the companies that partner with sellers and pre-install software on the computers, such as Microsoft, Google, etc.
"It's all about protecting their profits, not protecting us."
It is interesting that the "protections" are not optional. It is assumed that _every_ buyer wants the protections from others _and also from themselves_ enabled by default, and also wants protections from so-called "tech" companies to be _disabled_ by default. A remarkable coincidence.
Perhaps if buyers were given the option to log in as single user and change the default protections, some (not all) might disable phoning home to Silicon Valley or Redmond. They might block unwanted access to their computers by so-called "tech" companies who sell them out as ad targets. The so-called "tech" companies and their customers (advertisers) might be locked out of other people's computers.
Indeed letting buyers lock out whomever they choose might diminish the profits of so-called "tech" companies.
In the past HN commenters often sidestepped the question of these "protections" as self-serving and argued that so-called "tech" companies serve the "majority" of computer users and in fact these "protections" are what computer users want even though these users were never asked or given the choice to opt-out. If that were true then allowing a "minority" of users to control the protections themselves, i.e., operate as root, would only affect a minority of profits.
If not for the "dangerous" unlocking, I would have to run with dozens of severe vulnerabilities right now, all five years worth of them. A decent phone costs large amounts of money here, the hardware on mine is still very good, and so I would have used it regardless. (Yes, I understand that the firmware does not receive updates, but it's still much better than nothing.)
My guess is that you're assuming, wrongly, that vendor locked devices are "safe" and unlocked devices are "unsafe".
All computers that are connected to the internet are unsafe in some ways. The most dangerous apps on your computer are the vendor's own built-in web browser and messaging app.
Also, the vendor-controlled software stores are unsafe cesspools. You will never find a more wretched hive of scum and villainy. Moreover, the vendors deliberately make it impossible for you to protect yourself. For example, iOS makes it difficult or impossible to inspect the file system directly, and you can't install software such as Little Snitch on iOS that stops 3rd party apps—as well as 1st party apps!—from phoning home.
In any case, most computers, including Apple computers, have parental controls and the like, so you can lock down your own device to your heart's content if you don't trust yourself, or you don't trust the family member that you're gifting the device.
As for the assumption you refer to: there are varying definitions of "safe". Is a device with a locked bootloader 100% safe in all use cases and all circumstances? Of course not. But me being able to reasonably trust that someone hasn't put a compromised version of the OS on the device, or won't be able to put a different firmware on the device to brute force my encrypted contents, is a bit of safety in a certain set of circumstances that I want in my device.
If Apple, or anyone else, were precluded from locking the bootloader, then yes, I would be forced to buy a device that the FBI or anyone else could in theory poke around on enough to try to get at my data.
You're scared of the wrong thing. The greater danger isn't arbitrary software but rather your son running up massive App Store charges on IAP of exploitative games and other scams. And if you think Apple will refund you, think again. Locking the device to the crApp Store isn't the solution. To the contrary, the solution is to enable parental controls to prevent access to the crApp Store.
> But me being able to reasonably trust that someone hasn't put a compromised version of the OS on the device, or, won't be able to put a different firmware on the device to brute force my encrypted contents is a bit of safety in a certain set of circumstances that I want in my device
These are possible without vendor lockdown. Devices can be and are designed so that the consumer can lock the device down and prevent modification, etc. Of course you can't constrain yourself, if you have the credentials to unlock the device, but you can constrain everyone else, whether they're children on the one hand or thieves/attackers on the other.
> but if it can be unlocked to run arbitrary software then he can in theory unlock it.
I'm effectively the admin of several machines with many users on them. I have root access. I'm not at all concerned that they'll gain root access. Just make yourself admin on your child's phone; I don't see the issue. Apple and Google can even make gaining root access require some technical (but documented) steps. Look at the requirements to gain root on an Android phone currently: you should be comfortable going into a terminal and using ADB (a typical sequence is sketched at the end of this comment). I'm not worried about the average user doing this, nor even the average smart child. Hell, follow Apple's lead and require a 1-hour lockout if you're really concerned about someone getting root on your device. How often will that happen if it requires being connected to a computer for an hour?

"Freedom" is also a terrible argument for this. What does it even mean? Freedom from what? Freedom to do what? It's such a meaningless word you're going to lose half your audience just by bringing it up.
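To give a sense of the friction involved, here is a minimal sketch of the typical unlock sequence mentioned above, driven from a desktop with adb and fastboot installed. It assumes a device whose vendor permits unlocking and that "OEM unlocking" has already been enabled in Developer options; the exact command varies by vendor (older devices use `fastboot oem unlock`), confirmation has to be given on the phone itself, and unlocking wipes the device.

```kotlin
import java.util.concurrent.TimeUnit

// Illustrative only: drives the usual adb/fastboot unlock steps from a
// desktop machine. Assumes adb and fastboot are on the PATH and that
// "OEM unlocking" is already enabled on the phone.
fun run(vararg cmd: String) {
    val process = ProcessBuilder(*cmd).inheritIO().start()
    process.waitFor(10, TimeUnit.MINUTES)
}

fun main() {
    run("adb", "reboot", "bootloader")      // reboot the phone into its bootloader
    run("fastboot", "flashing", "unlock")   // request the unlock; confirm on the device (this wipes it)
}
```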
When the context is "digital devices", it becomes pretty clear what it means. You should be free to use it however you want, without externally imposed restraints.
Locking down the device so much so users cannot run applications they've written themselves without the approval of the company who made it, isn't "freedom" as the required approval from the company breaks the "without externally imposed restraints" part.
Yea, this is a nice thought if you don't live in society. However, it falls apart pretty rapidly once you realize "your freedom to stops at my freedom from". So it's a non-starter.
Well, democracies are societies, and you have much freedom in (most of) those. Not sure where you live, but if possible, you can always try to vacation in one to experience it yourself :)
> "your freedom to stops at my freedom from"
I don't understand what this means, nor how it relates to having root access on your digital devices. Could you possibly explain this again? I want to understand.
The average customer wants a device that works consistently, every day, that is easy to use, with a collection of 3rd party apps who won’t steal their life savings.
Windows failed to deliver this; the average customer never downloads an .exe from a new publisher without terror. The average consumer is literally dozens of times more likely to trust a new smartphone app than a new desktop app.
We can also see this in the console market. Windows exists; old gaming PCs exist; the locked down console market will be with us forever because even Windows can’t deliver a simple experience that reliably works.
On that note, even if someone stole your car, at least your car does not have access to your bank account, your passwords, your messages, and even your sexual history. The personal and reputational cost of losing a car is not comparable.
Many people would actually probably prefer having their car stolen to having the contents of their phone made public.
I think a more accurate comparison would be to an electrician. In Australia, doing your own electrical work is a crime even for the homeowner, because it can cause death and is too likely to be done wrong. Yes, you will possibly go to jail for replacing $2 light switches. I assure you that most people’s phones contain things they would prefer physical death over having publicly distributed.
You're conflating vendor lockdown with device encryption. The latter does not require the former.
"The very worst offender is Nissan. The Japanese car manufacturer admits in their privacy policy to collecting a wide range of information, including sexual activity, health diagnosis data, and genetic data — but doesn’t specify how. They say they can share and sell consumers’ “preferences, characteristics, psychological trends, predispositions, behavior, attitudes, intelligence, abilities, and aptitudes” to data brokers, law enforcement, and other third parties."
And do you find this reasonable, and a good thing to expand to smartphone use?
In this comparison, Google and Apple play the role of the government. If you believe that argument, it also implies you believe they should be broken up under antitrust law.
Just because you don't need or want it, doesn't mean it's not an important right to protect. Considering the influence of computers these days, the right of general purpose computing is probably at least as important as the right to free speech.
The problem is that 90% of phone unlocking will be done either for piracy (against the company's interests) or against the customer's own interests (stalkerware, data extraction, sale of stolen devices).
There is a reason malware is over 50 times as prevalent on Android.
Why would you think that?
Many Android phones can be unlocked, so it's not a hypothetical situation. It does not enable software piracy, since piracy doesn't depend on root. I know a few people who would install all sorts of shit on their phones, including obvious malware, and they lack the knowledge to root their phones.
The data extraction problem happens today on unrooted phones in a "legal" way; it's done by your regular friendly companies like TikTok, Google, or Meta. Rooting enables limiting this, which is likely why they are against it.
If you look around on forums that discuss the topic of unlocking/rooting Android phones you will see that there is little discussion of piracy and people seem mostly driven by the will to control their own machine instead.
The first point is irrelevant once I've bought a thing. Once I own a thing it is mine to do with what I want, and the company's interests ought to be irrelevant. As for your second point, that is mitigated by making the process sufficiently annoying (eg. hiding it in the developer menu).
> There is a reason malware is over 50 times as prevalent on Android.
What's the reason for that bogus-sounding statistic?
You also don't need to buy a single device from Google to get started. You can take the PC you're at and get started right away, and publish that app (or malware) without spending a penny (though I don't recall whether they still charge that nominal fee to get a developer account).
Saying 90% of people root for piracy is hilarious, I rooted every Android device I owned until the last one or two, and I've never pirated anything, why did I root? Mainly for customisation and host-based ad-blocking.
I can't understand the thought process of these people who think the things you own should be locked down to protect you.
We should stop selling screwdrivers too, in case someone's granny tries to open their toaster and electrocutes themselves; after all, a screwdriver is the pre-tech root access to all things electrical and electronic. I suspect those same people who argue in favour of locking these devices down would also say "don't be silly, my granny wouldn't open her toaster with a screwdriver, because she's not an engineer".
I don't know if it's because they don't understand it, and that's scary, so they think it's safer for the big boys to hold the keys, but imagine if people acted the same in other contexts?
"The bank should keep hold of the keys because otherwise I might accidentally lock myself out, or lose my keys, or leave the door unlocked for a bad guy to come in and steal my stuff".
That's fine if you can't trust yourself to look after them; let someone else handle your keys for you, perhaps someone "trustworthy" could offer it as a service, but I'll keep my keys in my own pocket thanks.
If they all went out of business, nothing of value would be lost.
Then you have apps that are free clients for services.
There is very little legitimate money being made on mobile from people actually buying apps.
Malware has a wide definition however, and if you include all the spyware included with phones that aren't sold outside China and to a degree also India, you could probably hit that mark. But as they aren't allowed to access Google services or the official Play store, it's also a bit misleading.
And it can only be achieved with fully locked down hardware?
Of course not. The modern OS achieves system security through permissions and isolation, which don't require a locked bootloader etc. to work. In fact, they keep working even after the device is unlocked and rooted.
> Windows failed to deliver this; the average customer never downloads an Exe from a newer publisher without terror
Windows (and Linux for that matter) is not a modern OS. They're classic OSes that offer the entire computer as a playground for the programs running on top of them. That's why Windows can be contaminated by a single malicious EXE, but Android or iOS can't.
OSes are not all the same; don't try to muddy the water that way.
Here's the API reference if you'd like details [1]. They are very much not just standard Linux permissions. Android includes a huge set of APIs on top of Linux.
[1] https://developer.android.com/reference/android/Manifest.per...
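To make that concrete, here's a minimal sketch (not taken from the linked reference) of the runtime permission flow using the standard androidx helpers. The point is that this gate lives in the Android framework, layered on top of whatever Linux UIDs and file modes exist underneath.

    // AndroidManifest.xml still has to declare the permission:
    //   <uses-permission android:name="android.permission.CAMERA" />

    import android.Manifest
    import android.content.pm.PackageManager
    import androidx.appcompat.app.AppCompatActivity
    import androidx.core.app.ActivityCompat
    import androidx.core.content.ContextCompat

    class CameraActivity : AppCompatActivity() {

        private val cameraRequestCode = 42  // arbitrary request code for the callback

        fun ensureCameraPermission() {
            // Framework-level check, enforced by the OS on every access to the camera.
            val granted = ContextCompat.checkSelfPermission(
                this, Manifest.permission.CAMERA
            ) == PackageManager.PERMISSION_GRANTED

            if (!granted) {
                // Shows the system dialog; the user, not the app, decides.
                ActivityCompat.requestPermissions(
                    this, arrayOf(Manifest.permission.CAMERA), cameraRequestCode
                )
            }
        }
    }

So "Android permissions" are a user-facing consent model enforced by the framework, not a relabeling of Unix file permissions.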
If I buy an iPhone, I should have the option to completely disconnect it from Apple and be able to replace the OS with whatever I want. If I do not have the option to do that, do I REALLY own the device? The answer is no, because what I have is a device that I can only use the way Apple allows. When the phone is obsolete and Apple stops updates, all I can do is send it off for recycling, since Apple won't allow me to repurpose it with new software.
You are putting a lot of trust in the manufacturers as well. For example, they have the technical capability to kill the second-hand market in their devices if they simply decided to refuse to allow a new user to log in to a device. Sure, you could still sell the hardware, but it wouldn't be much use if the manufacturer stopped it from connecting and authorizing. I know this is an extreme example and no sane manufacturer would implement it, but I think it demonstrates why having the option to disconnect is a good thing.
The same applies to all other devices that are locked down, things like smart TVs, IP cameras and appliances. Just look at how many early smart TVs are now dumb because the manufacturer stopped updating the on-board apps. There is no reason why the owners of such devices should not be allowed to do whatever they want with them to try and bring them back to life.
This is blatant unempirical scare mongering. How many desktop computer users have had their life savings stolen by 3rd party apps? Citation needed.
> The average consumer is literally dozens of times more likely to trust a new smartphone app than a new desktop app.
This is a false dichotomy. Almost all desktop computer users have a smartphone too. The people who have enough disposable income buy both smartphones and desktop computers. There's no inherent conflict between the two.
> the locked down console market will be with us forever because even Windows can’t deliver a simple experience that reliably works.
That's a completely ahistorical interpretation. Originally, the gaming consoles had no third-party games: the games were all written by the vendors. The first third-party game development company was Activision, a group of former Atari programmers who learned that their games were responsible for most of Atari's revenue, but Atari refused to give them a cut, so they left and formed their own company. There was a lawsuit, and it was ultimately settled, allowing Atari to get a cut of Activision while allowing Activision to otherwise continue developing console games. It had nothing to do with "reliability" or "security" or any kind of made-up excuse like that.
Again, citation needed. I made it through the 2000s just fine, thank you.
> What a stereotypical HN comment. Cite something that only applied to the 2nd generation of consoles to prove me wrong, even though my point spans almost all console generations.
No, I was explaining the historical origin of the game console business model. Of course the business model continued, as these things usually do, through a combination of monetary incentives and inertia.
Playing devil's advocate: banking trojans used to be really common here in Brazil back in the pre-smartphone era of the early 2000s (smartphones already existed, but weren't very common; most people who used online banking did it through their home computers). They're the reason why, for a long time, it was hard to use online banking on Linux: banks required (and still require) the use of an invasive "security plugin" on the browser (nowadays, there's also a Linux version of that plugin, which IIRC includes a daemon which runs as the root user), which attempts to somehow block and/or detect these banking trojans.
What does this even mean? Do you stand behind what you say? If so, then just say it without hiding behind the devil. And if you don't stand behind what you say, then why in the world are you saying it?
It may be only for historical reasons that desktop computers aren't completely locked down too. It's a lot easier to lock down a new device class, like smartphones, than it is to lock down an existing open device class, without causing consumer outrage and rebellion.
Yet that trust is, for the most part, unfounded. There's a ton of malware in app stores - you can assume any app that contains ads is sending data about you to some shady server, for example. You can't even trust the most popular apps not to be malware [0].
There are many more "hostile" smartphone apps that exfiltrate your data and sell it to the highest bidder than there are compromised executables these days. Also, outside of industrial espionage, there are more profitable scams than compromising a PC system.
PCs, in contrast to consoles, have always involved a cost and usage trade-off. The difficulty of operating a PC isn't significant, and it heavily increases the user's digital competency with computer systems. If you really don't want that, you have other options.
then don't root your phone or download an .exe. having the ability to do something doesn't mean you are forced to do it.
not safe enough for you? fine! make the current status quo comfortable walled-garden-of-illusionary-fake-safety the default. for example, there's no reason windows needs to by default allow unsigned code to run. hell, even make it really annoying to turn off.
but the "safety" and "easy to use" arguments against right-to-repair, digital rights, ownership, etc. are simply nonsense. there is literally ZERO negative safety or usability impact to anyone else's device because i'd like to own mine.
it's also an insulting and disingenuous argument to hear anyone on this forum make: our careers and an entire segment of the economy would not exist if it were not for open systems. and it's insulting to basically say "bubba/granny is too dumb to be trusted" with owning their own device.
This removes the risk of this being abused to compromise the data of stolen devices or evil maid attacks unless a user that knows what they are doing has explicitly opted themselves into that risk.
Re-imaging with a rooted image is not acceptable because this also reduces the device's security by preventing OTA updates!
The gated community is broken when the end user cannot improve the security of the device above and beyond the lax policies of Google and Apple. For instance, there is no reason my device should ever communicate with organizations I do not support, such as Facebook or X-Twitter. X-Twitter is often used as a command-and-control service in plain sight.
It is not just outward communication that needs monitoring, but inward too. I've used ZoneAlarm in the past at an international company to help find the infected servers and computers that were serving up viruses and other malware.
*I would argue that the "Gated Community" analogy is flawed. A real-world gated community still allows the homeowner to improve security by installing cameras, a security system, and guards. Apple & Google prevent such actions.
At the moment, I just run GrapheneOS and don't bother with any modification. It is not worth the hassle. I've already had my bank account locked out because a Google Store-bought Pixel phone was flagged as "stolen", probably due to some attestation measure (they could not tell me why). They recommended that I purchase a new phone.
Attestation requirements are only going to become more prevalent. I predict that in a few years basically all proprietary software for Android will require attestation.
So... you may still be able to unlock the device and make it yours, but you'll also be locked out of the ever expanding and ever-more-isolated walled garden.
If you can live off of GrapheneOS and F-Droid, that's great, but for a lot of users this won't be a real choice, because you increasingly need proprietary software for access to real things in the physical world (i.e. I needed to install a special app for event tickets recently).
Magisk exists, yes, but it's a flimsy temporary solution. It only works because it's able to lie to Google that your device doesn't support hardware attestation. As soon as Google starts requiring that all devices support hardware attestation, it will stop working.
And there are some real strong reasons why you benefit from this sort of ability, such as preventing folks from cheating in competitive games. I can't say that all uses seem to have good reasons to use it, but that seems like more of a vote with your wallet sort of situation. Perhaps the play store should also have stricter requirements on acceptable use of attestation and ensure they are upheld.
It's not the software, it's that the hardware itself, that I bought to own, still serves someone else in a way that's detrimental to my interests, and that can't be overridden because those stupid encryption keys used to sign attestation reports are burned into the silicon and only accessible to that TrustZone hypervisor that can't be unlocked.
> And there are some real strong reasons why you benefit from this sort of ability, such as preventing folks from cheating in competitive games.
Maybe playing such games on general-purpose devices is a bad idea to begin with. You know, consoles are already locked down pretty tight. But then there are PCs that have no hardware roots of trust at all yet you can play anything on them and sometimes even compete with console players. So go figure.
It's an invasion of my privacy.
If Americans had anything like BankID or MitID which would refuse to run on their devices and they would be prevented from paying a bill, transferring money, buying tickets, or reading their mail they would go apeshit in 5 seconds.
Some apps are no longer optional in the world we are living in.
They require hardware certification for the Pixel Screenshots app... and for anything that uses Gemini Nano (Call recorder summary, weather, pixel screenshots, etc).
The problem though is that rooting by itself is not that useful when a lot of apps use remote attestation to deny you service if you're rooted.
We don't just need root access, we need undetectable root access.
I'm typing this on a rooted phone where all (banking) apps work just fine. All it takes is downloading an app (Magisk) and adding the apps that need root hidden from them to a list.
Worth noting that this could change with every update. It's an unstable situation right now, which is undesirable.
For that reason, e.g. the GrapheneOS team isn't employing measures to fake compliance at all. They'd really like to get SafetyNet compliance for their operating system (you need that to get Google Pay/Wallet to work), but fundamentally can't get it. Right now, they could just fake it, but that's not guaranteed to work reliably, forever (and doing so would probably threaten their official BasicIntegrity compliance).
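For readers who haven't seen it, this is roughly what the app side of that attestation looks like with Google's Play Integrity API; a sketch assuming the official client library, with the backend call that actually validates the verdict (sendVerdictTokenToBackend here) purely hypothetical. The app never inspects device state itself; it just forwards a Google-signed token, which is why spoofing the verdict is a moving target.

    import android.content.Context
    import com.google.android.play.core.integrity.IntegrityManagerFactory
    import com.google.android.play.core.integrity.IntegrityTokenRequest

    // Hypothetical hook: the app's own backend forwards the token to Google
    // and decides whether to serve this device.
    fun sendVerdictTokenToBackend(token: String) { /* ... */ }

    fun requestIntegrityVerdict(context: Context, serverNonce: String) {
        val integrityManager = IntegrityManagerFactory.create(context)

        integrityManager
            .requestIntegrityToken(
                IntegrityTokenRequest.builder()
                    .setNonce(serverNonce)  // server-generated, guards against replay
                    .build()
            )
            .addOnSuccessListener { response ->
                // Opaque, signed blob; only Google's servers can say whether the
                // device "meets device integrity" (locked bootloader, certified OS).
                sendVerdictTokenToBackend(response.token())
            }
            .addOnFailureListener {
                // API errors land here; a rooted/unlocked device usually still gets
                // a token and simply fails the verdict on the server side.
            }
    }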
At some point the argument morphs from 'I should be able to do whatever I want with my device' to 'I should be able to access your service/device with whatever I want'.
The fact that Google allows this shows that
1. Apple could do it with zero security impact on anyone who doesn't opt in
2. They could keep any service-based profit source intact
But they still would never do it. Because it's not only service based profit they want to protect. They want to restrict customers from running competitor's software on their hardware, to ensure they get their cut.
I'm not demanding to be able to log in to your service/device and replace IIS with Apache on it. I'm just demanding to be able to access it as a normal user with Firefox instead of Chrome.
We right now have an ENCRYPTED signal going from our computers to our displays, and not just computers but phones too, simply to prevent people from dumping the raw data.
All of that extra processing, just so you're allowed to, for example, watch Netflix at a resolution higher than 720p. Then, comically, there are Chinese capture cards that you plug your GPU into; use mirroring mode and you completely bypass it.
DRM is just one example; there are many more motivations, such as preventing paid apps and pay-for-currency games from having those things given away for free. This is the primary reason iOS devices make significantly more money than Android: it's near impossible for an average user to pirate/hack/crack them.
For example, if anyone is interested, check out the computers Chinese government offices are using right now. They are basically large mobile phones running some sort of Linux, but the whole thing is locked down. Fortunately things are OK on the commercial side, but again it's more and more difficult to root or unlock a device.
And now the Western states are following suit, except it's the corporations that are leading the charge.
If they achieve this, and wipe out all commercial electronics distributors such as Mouser, then we need another underground railroad movement to teach people to scavenge and build computers in that Dark Age.
I'm not joking. This could be real. It's already shaping.
But with cryptocurrencies normalizing it's only a matter of time before a paid piracy service emerges that is both cheaper, simpler and better than Netflix or any other streamer. Some arguably already have.
DRM was being broken for years without even a monetary incentive, with one it won't stand a chance.
The idealism of those who want to see the demise of DRM doesn't actually hold up in the face of reality. Even when we remove restrictions and give global access to content, for free, pirates don't give up. One of the reasons is that many pirate sites get ad revenue, piracy is a business for many folk and they get the benefit of not paying for the most expensive part. They also don't have legal/regulatory compliance, taxes and will often operate their infrastructure using stolen credit cards or accounts (we can see this).
Then you have people who are selling legitimately and trying to provide the best service for customers, but who have to pay for the content, competing with people who don't have any such responsibilities. So, customers take the cheap deal.
Some folk are also under the assumption that streaming services are money grabbing. Except when you actually look, most streaming services are running at a loss, or barely profitable.
I'm just working to protect our company and reduce losses, ultimately I am not preventing people getting access to fresh food or water. I am protecting premium goods from being illegitimately exploited and protecting the jobs of my colleagues when we're already under significant cost pressures.
One reason I post about these things on the internet is in the hope that one day we might have a constructive dialogue about how to balance freedoms AND enable commerce. But at the moment we have extremism, libertarian ideals against company lawyers.
More money than ever flows into piracy these days.
Even with complete monolithic control (which is an unlikely objective) over the entire chain from distribution to display there will be a way to obtain good quality output from a hijacked LCD controller if nothing else. There is no win condition for you.
Ultimately the goal of content protection (not just DRM) is to make it as inconvenient as possible to take content without paying. No security system is going to be perfect, but when you make it secure enough that people concentrate on weaker targets (or give up), then you're content.
Case in point: every popular desktop PC lets you run as root, and also watch DRM content. The two aren't mutually exclusive.
I did do some experimentation with VMs and emulation and whatnot, but I never got anything that worked consistently enough to use full-time, so I bit the bullet and plugged my Nvidia Shield TV back in.
As someone that works in security, I fully understand the need for sane defaults that protect the average user. I even advocate in the article that we should keep these defaults in place for the most part.
What I tend to not understand is the argument that there should be no option for more enterprising users to access their hardware at the lowest levels because we need to protect the average consumer. It may be a footgun for some, but that's sort of the point. I expect to be able to modify something I own, whether it's to my detriment or not.
My argument isn't that root access should be the default, but at the very least it should be an option. I just don't think it's right that we've normalized corporations blocking the ability to load / inspect software, which often is marketed as a safety or privacy thing, but is arguably more a business decision meant to protect profit margins.
aren't we in fact pretending otherwise?
Right now I believe that stolen iPhones are effectively bricks (barring state-level actors with unpatched zero-days)?
While I agree, I think even legislation will not fix this, because what is a computing device, and who decides what is and what is not? I'm sure Apple will argue that nothing they sell should be considered a computing device, while the hacker will consider anything they can trick into running arbitrary code to be one (is your fridge a computing device?).
If we go the legal route, I think the only way is to grant the right to flash the firmware of _ANYTHING_ that has programmable bits, and that's probably not going to fly either, because lots of legislation already dictates that users should be prohibited and prevented from doing so.
If there is legislation, it will contain a definition of what is a computing device and what isn't. It will be imperfect, and the edge cases will be contested in courts. Courts deal with blurry boundaries all the time.
That's how it always is with legal matters, and doesn't mean we have to demand that anything with a firmware must be flashable.
It's not that hard to imagine a version of the world where computers as we know them do not exist, but are mere appliances (like tablets and smartphones), and if companies feel threatened that they might be forced to open up their computing devices, they will be quick to make them not fall under the definition.
Instead of a smartphone, you will get a "Can telephone and access facebook and instasnap" device with whatever technical crippling is needed to make it not a computing device and be exempt from the law. And as the general public and justice system are pretty ignorant with regard to technology, it's going to be pretty resource intensive to convince a judge that every gadget around that suddenly identifies as "not a computing device" is in fact one anyway.
Just scope the law to any device that can run code, and have the criterion for control be "the user must never have less control over code execution than the manufacturer has after the sale".
So, for example, if someone buys a phone from Apple they will get full control of the entire device (SEP/TEE included) because Apple has the ability to exercise post-sale code execution control to that level (they hold the private keys required).
But I wonder why these rights do not seem to be enforced on computing devices. Either everyone is failing to assert their property rights, or I am in the wrong here. Probably the latter.
This seems reasonable to me. What's wrong with it?
“What’s a computer?”
Another post I made regarding this: https://news.ycombinator.com/item?id=39349288
Undocumented hardware plus closed source drivers for almost everything make all this possible.
Speak for yourself. Sent from my Librem 5.
I simply point to the fact that you have no control. Just the illusion of it.
Ask yourself: how can law enforcement push a software trojan onto any mobile phone without physically touching it?
Can they?
> Are you really free to bake your own bread? (Think about
Yes. It's not easy, but it's not impossible. You have to fight for your freedom to get it.
You cannot fight for this, unless you design both the communication hardware and its firmware and you get away from Google Play.
This is exactly what my phone is. There are no blobs. It runs FSF-endorsed PureOS. Using it is fighting.
This is no longer true. You might have root access on your smartphone, but you still don't have access to the TEE (on ARM this is implemented using the "TrustZone" "feature").
Also, AVF is coming to Android, and protected VMs won't work with an unlocked bootloader, so expect the situation to deteriorate further once manufacturers make use of pVMs.
You could just be given a sandboxed root that is pointed at a guest user in a higher namespace.
Ideally I'd add a mandatory toolchain to that. At least a C compiler which should be able to target a device I own.
> the freedom to change a program, so that you can control it instead of it controlling you
I don't necessarily endorse all of Stallman's philosophy on software. But I think in this point at least he was very prescient.
I agree with this, as well as most (or all) of the other stuff mentioned in that article.
However, sometimes it might be reasonable to have a switch inside that you must open the case (using a commonly available screwdriver, rather than an obscure one) to flip (and later be able to flip back), in order to enable some functions (e.g. to be able to upgrade the EEPROM, or to bypass a secure boot loader). If the user puts glitter on it, then this allows the user to detect tampering, while remaining secure and allowing full control.
In the back of the device there's a sticker over a screwhole. You'll need to poke a screwdriver in there, but inside is not a screw. It's an electrical contact that makes the ROM read-write, which your screwdriver needs to bridge and keep bridged while the firmware is flashing (it is harmless if the contact is lost in the process, but I don't know if it would be safe to abort at that point). Pretty hard to convince someone to do that process, yet doable by anyone with a flathead screwdriver.
I guess nowadays it has become Samsung's e-fuse: if you flash custom firmware it blows an e-fuse, and the status of that fuse is detectable from software. Then apps can refuse to serve people just because the custom-firmware fuse was blown.
Manufacturers will then claim that people don't own devices, merely a perpetual license to use it.
It would be refreshing for them to be honest about their monetary greed instead of telling false stories about "security" and what's "good for users".
Do you:
- Buy open devices?
- Sponsor development of open devices?
- Start open device companies?
- Develop open software that competes with walled gardens in quality and ease of use?
- Sponsor open software?
- Use open software?
- Engage in lobbying?
- Drop exploits (that would be worth a pile of gold) to let people jailbreak devices?
- ...
- Fake-care or real-care?
Why doesn't OP merely champion competition, instead of encouraging regulation of what software others can write, what hardware others can ship?
I too am afraid of general purpose computing going by the wayside, and I have the Precursor phone and Raptor Talos POWER machines on my wishlist, just as soon as I wrap my head around secure boot chains in general before having to implement one myself. But niche hardware is expensive to produce, so we're likely left with what AMD, Intel and Apple provide us.
I guess one quirk that IMO is fair to criticize is that it's not necessarily consumers who are demanding to be locked out of their administrator privileges (the average computer user is of course not aware of the distinction between signed and unsigned binaries), so I don't know where the pressure for secure enclaves really comes from. Is it the data centers buying thousands of chips that don't want to be pwned? Government customers who refuse to buy a single die if they can't verify the bootloader? Or just patriotic engineers sensitive to a cybersecurity regime that demands we keep our guard up against enemies, foreign and domestic?
The pressure comes from shareholders. User control means users can use their device in a way that benefits them, e.g. blocking invasive tracking. This benefit provides zero or negative shareholder value.
This is a false dichotomy. Why not both?
We need individual consumer rights, and we also need healthy market competition.
In my experience, security engineers who see them as finally solving the “root of trust” problem. Generally (ime) it’s security engineers/teams that have been pushing for things like ssl/tls, global certificate stores, signed updates of those stores, signed kernels validating those updates. But if you break the kernel (or compromise the bootloader or EFI/BIOS) then it’s all for naught. A secure enclave solves that problem (unless you find a bug in it/its implementation) - your bootloader is validated, which validates your kernel, which validates all the userland components you care about. Security teams rejoice.
They sometimes actively search for root evidence.
Once the application is decompiled, the attacker can then proceed to pentest the bank's backend, or find frontend-only security measures to bypass. One attack I heard about in local news is not even a hack at all: they simply wrote a script that used the mobile application's API to automatically move money between sock-puppet bank accounts. Once a victim gets scammed, the money moves around quickly. For privacy reasons, banks do not provide information about unrelated cross-bank transfers, so even cops can't easily trace the multiple hops. That specific bank got in the news for that "weak security".
If you are unable to imagine how a 3rd party might root a device without the principal being aware of it, then maybe it is a shortcoming of your risk survey, not theirs.
I think in any scenario where the principal can do that without you noticing (which means things like reinstalling & logging you back into all your apps, logging the device into your google account successfully, restoring all your device settings, re-adding your fingerprint or device pin to unlock the device, etc) then it's game over regardless. If they can do that, they could get into your bank app anyway, or they could easily just replace your phone with another one entirely, and now you're just logging into your bank on a stranger's phone.
Barring a _very_ major Android zero-day (which probably would evade attestation anyway) unexpected rooting of your device is really not a plausible attack scenario.
Namely, that's what I do with proprietary software on my desktop. Nothing that's closed runs with access to my files. Further, a banking app shouldn't need to know I'm running a rooted device. For some reason, I can do banking with an open source browser on a rooted phone just fine. It's just the proprietary blob that comes with TPM shackles, and I think I should be the owner of those shackles because I own my phone.
https://github.com/LSPosed/DisableFlagSecure
It works in every app again on YOUR phone. I also have root and all my banking apps work currently ... but it's a cat-and-mouse game.
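To illustrate why it's a cat-and-mouse game, here's a simplified sketch of the kind of purely client-side root check many apps still ship (the probed paths are the commonly cited ones, not any specific bank's list). Hiding tools work by making exactly these probes come back clean; the checks that are hard to fool are the hardware-backed attestation ones discussed above.

    import java.io.File

    // Naive client-side root detection: easy to write, easy to spoof.
    fun looksRooted(): Boolean {
        val suPaths = listOf(
            "/system/bin/su",
            "/system/xbin/su",
            "/sbin/su",
            "/system/app/Superuser.apk"
        )
        val hasSuBinary = suPaths.any { File(it).exists() }

        // "test-keys" marks a build signed with public test keys,
        // typical of custom ROMs.
        val testKeysBuild = android.os.Build.TAGS?.contains("test-keys") == true

        return hasSuBinary || testKeysBuild
    }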
That's easy to say, but hard to legislate, and impossible to enforce.
There is so much firmware around, small binary blobs burned into micro controllers that can't even be updated. Or that isn't intended to be updated.
There are probably even dimmable LED bulbs without IoT features that still have a microcontroller.
Because a tiny microcontroller is the cheapest way to add logic to anything.
I'm not sure it isn't a good idea. Just not sure we grasp the ramifications.
You don't have to enforce it for every single manufacturer and every single product. If you enforce it for Apple, Samsung and a couple other big Android brands, you'll get most smartphones.
Now whether it is a good idea or not, I don't know. But if you wanted to enforce it, I think it wouldn't be too hard. Banks get a ton of regulations and have to apply them all the time. Surely Apple or Samsung could do that, too.
And for external programming you need programming devices, which would also need physical interfaces for which no standard connector exists.
But a definition of general purpose computing devices wouldn't be harder to craft than the legislation that requires my banking app to run in a certified shitty app environment.
And I'd go even further and limit the ability of devices to not be "owned outright", since that sounds like a loophole. I do not want a EULA to interfere with these rights.
I would start with, laws should be logical and informed, and go from there... the number of prerequisite changes required to come mildly close to this is unreal. Including but not limited to: copyright law, insurance law, patents, contract law, federal vs state law, an agency competent enough to enforce this, lobbying from the most powerful companies in the world, and more.
In dream land I support you though.
If it could be passed in California it would trickle down elsewhere.
e.g. California emissions mandates leading to less emissions in the entire country because it makes more sense at scale to build 1 SKU.
In my dream we abolish copyright altogether, but alas we live in a profit-oriented society, not a knowledge-oriented one.
The phrase "have provisions to make changes" is on purpose. It has not yet been proven to me that a change to an iPad's bootloader function is impossible. It certainly isn't as easy as on a Mac, but the skill/effort required is a gradient.
This is similar to "soldered" storage. This was commonly thought of as impossible until it was demonstrated that a Mac will happily accept updated storage swapped out with a hot-air rework device. This method is certainly higher skill/effort/risk than remembering when to terminate 50-pin SCSI, but it shows that when a hacker has a will, there is a way.
Is it ironic that as computing devices become easier to use, they also require higher skill to fix and modify? No, more likely there is an iron rule that when a device's external complexity is contained, the inner workings of that thing become more complex. Complexity did not decrease but was hidden.
Does a grandfather clock and a tourbillon wristwatch encapsulate the same/similar general principles of timekeeping? Sure. If one has the skill to update parts of the grandfather clock, are those same skills sufficient for the wristwatch? Probably not. Should wristwatches be banned because people who update grandfather clocks do not have the skills to modify them? Surely not, that would be absurd.
Likewise, demanding root in a form you find acceptable is absurd. If you can't take root on a device you possess, it's a skill issue for you to address, not the vendor's.
I do think that it should never be illegal to tamper with a device that you own in any way that you find acceptable.
But I do not feel that manufacturers have any obligation to make this easy or possible.
It should be completely legal to hack your devices and to distribute tools and instructions for hacking devices without limitation.
The difference is between a locked down boot loader and one that is just a standard boot loader.
In general if there are no legal recourses to go after people who modify their devices, then it makes it more appealing for manufacturers to allow a supported opt-out method because that reduces the incentives to compromise their products with out-of-band methods.
So I'd prefer to take an indirect route of empowering customers rather than requiring manufacturers to do additional work to free up their products.
> Spotify’s Car Thing
Contrary to the author's claim, Car Thing is a great example of what can happen with abandoned hardware. The device did not become e-waste when the manufacturer stopped supporting it. There is a lively community of people modifying and updating software and doing really interesting things. I lack one only because I missed the $15 price nadir and they are relatively high priced in the secondary market.
> certain medical devices, such as implants and insulin pumps
> subsets of electronic control units for cars
These are precisely the opposite of what should be exceptions. If you have a pacemaker implanted in your body, you need the right to replace whatever software is running on it before the manufacturer goes out of business and takes the signing keys with them. There is no device whose owner should not have the right to replace the software it runs, and the more safety-critical the device, the more important the right.
Also, if someone wants to kill you in your sleep, they... don't need you to even have a pacemaker. And the security of medical devices is notoriously bad, so if you're worried about that sort of thing, be more worried that the status quo doesn't allow you to fix the existing remotely exploitable wireless security vulnerabilities.
That's just it though, in my opinion being able to flash the thing at all would count as a remotely exploitable wireless security vulnerability. The first thing I'd do if mine was flashable is lock it down to make sure it was no longer flashable. Does that make sense? I might not be articulating myself well here.
If it has a mechanism to flash it then they can give you the password for yours so that you can always do it yourself (or have someone do it) in the event that the manufacturer goes out of business before anyone finds the bug.
And if you really want to remove the ability to flash it, you could use your right to flash it to remove that feature, whereas the status quo is that it supports it -- insecurely -- and you aren't allowed to change it.
Something like this would inevitably be abused and result in a wave of malware so massive that it would render the internet too hostile for all but the most careful, knowledgeable and paranoid users.
It's a position I came to rather regretfully and sadly.
> limit[ing] access to general-purpose computing without doing so across the board for everyone
I imagine the vast majority of people on this site run (or at least have used) Linux, *BSD, etc. on a daily basis. No average person is going to set up Arch on their main PC, but lots of us do. Your average person enjoys their locked-down Samsungs and iPhones, while we enjoy unlocked problem-ridden (but personally solvable) Linux environments.
Would it be better if every person who could buy an iPhone was sufficiently technically competent to not install malware on their easily-rootable phone? Yes. But that’s not the world we live in. Maybe I’m succumbing to the “First They Came...” mindset, but until I can’t run Linux on a PC I’ve built with components of my choosing, I don’t really care if my phone is a semi-closed semi-black box. I used to enjoy jailbreaking iOS devices, but eventually decided I’m happy with my phone just being a phone. If I want to tinker, I’ve got a rack full of servers and 3 different workstations I have the freedom to break whenever I want. It’s nice to have a phone that can just do phone things 100% of the time - I haven’t had an iOS system crash (or any bug preventing use) in about 5 years, which was the last time I had a jailbroken phone.
There's a line from Blazing Saddles that comes to mind...
It would no longer be your computer.
- A computer that is compromised by malware
- A computer that doesn’t permit the user to install malware, and as a consequence, possibly alternative operating systems
Your phrasing implies that “it would no longer be our computer” is equivalent to “one that’s not ours from the start.” As far as I know, Microsoft and Apple aren’t going to ransomware your computer/phone to make a few bucks. You just can’t root an iPhone. Equating the two is arguing in bad faith, at best.
Let me know when that's ever invented. It's certainly not iPhone. The crApp Store is full of scams and ripoffs, costing consumers millions if not billions of dollars.
It's also worth noting how much goodware users are not permitted to install, due to vendor lockdown and arbitrary restrictions, often motivated purely by the desire to squelch competition. Security, or in this case security theater, always has tradeoffs. Unfortunately, consumers often don't even know what they're missing, because the vendor restricts what they're allowed to see, but we developers know what kind of software we can't make on locked-down platforms.
Vendor lockdown is a tool for authoritarian regimes that enables censorship. For example, these regimes force the vendors to remove VPN apps from their "curated" stores, and since sideloading is forbidden, there are no alternatives for the poor users under these regimes.
Ah, the good old "think of the grandma". HN's version of "think of the children". And just like the conservative pearl clutchers, I don't think there's much sincerity in those thoughts. You are more afraid of the possibility that there would be less users to squeeze through adtech and crapware sold on appstores. Most HNers make a living from making the world an even worse place.
There's a lot of devs here very happy to have a captive audience of people too ignorant to know better, either exposing their eyeballs to constant onslaught of ads on "free" apps, or paying for really basic functionality.
I will never forget that show HN post that talked about selling a clone of Handbrake, a program that exists just to set a few flags on FFMPEG, and making a living out of it, because the Apple audience has been brainwashed to take out their wallet for the dumbest of things.
Ah, there it is:
https://news.ycombinator.com/item?id=39987579
The average HNer.
The average value add of people constantly bemoaning the idea that there could be regulations to stop them from enslaving new generations to tech is lower than that of scammers.
Pretty sad state of affairs, huh?
If the user is doing banking on fake-chrome then admin is pointless; https://xkcd.com/1200/
> then you'd never be able trust anything on that computer ever again. Even re-installing the OS wouldn't fix it.
Why not? If the hardware is under user control, they just reimage the firmware, reimage the OS, and then it's clean. (Or, in practice, perhaps they hire the local computer shop to do so, but I don't think that changes anything.)
There are already myriad unlockable devices.
What a bizarre fantasy you have constructed.
Given that, anything very useful would be rendered non-functional, resulting in the device probably being useless.
Freedom cuts both ways here; if you want absolute freedom to do whatever you want with your device, why should software vendors not have absolute freedom to choose what platforms their software is permitted to run on?
Obviously none of this applies to FOSS.
For starters, because they're wrong, and they're wrong in a way that makes their users less secure. Allowing the use of Windows or Android with open CVEs while blocking completely up-to-date Linux or aftermarket Android ROMs clearly shows that this nonsense is contrary to security.
> If you don’t like it, either pick a different bank or deal with not having access to their software.
And that's the next biggest reason: Customers don't have the same amount of power that the companies have, so it's perfectly reasonable to tilt things in the customer's favor.
Today banks don't want to be liable for user errors that lead to scams. They adhere to such policies out of fear, but the policies don't help, as the most prominent victims of scams are still the same people, open system or not.
You cannot personally tell whether it is trustworthy or not.
Also a minor nit:
> Yet, a group of (presumably) some of the most technically savvy people on the internet can’t figure out how to buy open products?
This isn’t necessarily fair. When I bought my iPad (my first apple device) I could try it at an Apple store where there’s only so much you can realise. How are the speakers? Is the screen bright? How quickly does YouTube/websites open?
What I did not realise until much later was that even something as “basic” as downloading a couple of MP3 and playing them is bizarrely hard on iOS. Or that alarms have no option to gently rise, and are designed to give you a heart attack. Or that you can’t even set your own song/music as an alarm (unless you’ve paid apple money for their music services, very conveniently).
Would I have still bought an iPad if I knew all these? Maybe. But maybe I’d have gotten a basic model. Or maybe I’d just use an android tablet - the details are irrelevant here. But there’s a lot of things about an OS that are obscure.
I'd be bothered to respect your opinion if there were competitors whose build quality were better than dogshit.
Also, most manufacturers consider the warranty void if the bootloader has been unlocked. There's definitely no service (costs) for them on that front, and I'd be amazed if even 0.1% of all customers unlocked the bootloader (for it to be big enough to notice for any company).
This argument falls apart when 99% of the desirable devices do not have this option. It's not even about compromising on optional-but-important features, like having access to your bank - the "1%" devices usually do not even have a secure hardware element that handles full-disk encryption, leaving your most-personal data (that you carry with you everywhere you go, thus exposing it to additional risk) vulnerable.
> If you want a mostly-open handheld device, they're for sale, you should buy one of those.
Yeah, almost none of those are actually compelling. There's the Pixel series and GrapheneOS, but these devices are huuuge (they simply don't fit in my single hand!), and I don't want to give even more money to Google :S
In an ideal world, you should be able to simply root any of your devices on demand, in exchange for wiping the storage clean, losing your warranty, and any expectation of protection/privacy/extra features. Then it's up to you (and/or a third-party OS provider) to take care of that yourself.
The problem with that route is that only a tiny fraction of users are actually interested in that, there's value to lose in accidental rooting (users get angry about lost features), and there's no value to gain.
This is where regulation could come in.
I'm OK to void any guarantee if I can root it easily, TBH.
Limiting the ability to _easily_ modify what's running on a system is more about public cyber-health than the individual's freedom. Viruses + malware much more easily infect systems when they are running outside of a sandbox.
You can buy from Apple a computer that's locked down (an iPad), or a computer that's not locked down by the author's definition (a MacBook). It's a matter of consumer choice, not the company insisting on control of your devices.
The non-locked-down machines come in a different form factor than the locked-down ones - they usually have a physical keyboard and a larger form factor to accommodate that. This is partly for historical reasons, but very largely also for consumer choice - and also it makes sense that on the more flexible machine users are more likely to need a keyboard. This is all fine, you can't expect every company to sell their products with every combination of features.
I find more convincing the arguments about e-waste, but they need to be framed like that: sometimes we should mandate consumers get something they don't particularly want, for the greater good.
The Right-to-Repair union (RTR-U) could be a simple authority with access to the keys to unlock the device if the vendor breaches certain commitments. Various levels of commitment could be offered, similar to copyleft. The basic/lowest level would be "can unlock if the company dies". Higher commitments could be "will unlock if":
- the company starts telemetry
- the company changes licensing
- the company stops providing timely firmware updates
This way consumers are guaranteed a certain quality of service and access on their devices. Then vendors get a stamp of approval (like OU or UL) with the level of certification like RTR-open, RTR-private , RTR-long-terms-support etc.
This governance operates within private enterprise while consumers are offered the option to buy into vendors who commit to right-to-repair and right-to-own.
If you are a very technical person, and you want to have root on every device you own, then iOS is not for you. That's ok. Android exists!
But iOS as an appliance-level, walled-garden environment is absolutely perfect FOR MOST USERS. And that's fine.
Nursing a grudge because Apple makes products that include choices that YOU PERSONALLY don't like is incredibly weird and entitled. Just buy something else! There are options!
If you've never unlocked a phone, please educate yourself on how the process works before opining. It's really not as terrifying as you imagine.
Happened twice to me:
Once the URL for getting the key did not exist anymore.
The other time it was disallowed after some time by the company.
I'd be concerned with a move away from root access across the board, but that doesn't appear to be happening.
Also, I don't want others to be able to use my phone after stealing it. Here the FRP lock helps me, but in order for it to work it must also limit how I can use the phone.
I wish we stopped falling for this technical trap and finally focus on the substance - hardware/software companies get away with anti-user features that run on the user's device. This shouldn't be allowed.
I don't need an unlocked bootloader, I just don't want the preinstalled Google spyware. Google should not be allowed to hold my device hostage like this.
If you don't want to buy something you can't install whatever you want onto, don't buy it. The ability or inability to modify the firmware of a device should 100% be disclosed, but if it's disclosed, the seller should be able to set the policy to whatever they want.
It's the same reason I don't want "the good guys" to have decryption keys to my messaging service, because even if I did trust the FBI, the fact that there is a backdoor at all means it could be exploited by someone I don't trust
Again, if you don't want to use a device that has a locked bootloader, don't buy it. I fail to see how this business model should be legally foreclosed upon. You'll always have the option to buy a device that can be unlocked, someone will always sell such a device. But if you can't lock them, then I can't buy one even if I want to
No matter how convoluted you make the rube goldberg machine to bypass the cryptography, if there's a way to bypass it it will be bypassed
You claimed that an adversary with physical access to your device can compromise your unlockable phone, but presumably this won't happen with a phone that can't be unlocked. Is that not what you claim? If so, please detail how.
Wanting an uncompromisable bootloader is about more than just protection against malware that might modify the software on the device; it's about protecting a phone that can be unlocked from having its software modified by someone with the ability to provide the consent that the end-user would normally give. For example, when I hand my phone over at customs, or if it's seized by the police. If my bootloader is not unlockable, I haven't provided them with the keys to unlock the software, and those keys are reasonably strong, then I can be reasonably confident they haven't compromised my device.
But, if they can unlock the bootloader for whatever reason, I have no idea now what is running on the device or what was run on it even if they restore it back to a locked condition
That kind of logic cuts both ways: "If you want a device with a locked bootloader, just buy a device with a locked bootloader".
Unfortunately, as consumers, we're trapped between a rock and a hard place. On the one hand, I would want 100% freedom to use my device exactly as I see fit and run any software I want, without any form of curation from the manufacturer. On the other hand, there are plenty of software companies who do shitty things when given absolute freedom over what to do in a user's device (tracking / spying / etc) and I welcome buying a device where the manufacturer helps me fight some of that.
So I can absolutely see both arguments. And I think both types can coexist. I am happy my iPhone doesn't allow Meta to say "to use WhatsApp, you must install the MetaStore®, give it root and install it from there". I would not be happy with those restrictions on my desktop.
I think the inverse is a much more credible threat, though. "Sorry, you cant sign in to your bank because you are using Linux. Please try again on windows 11 with secure boot turned on" doesn't seem far fetched at all.
That's not a hypothetical for us here in Brazil: online banking was Windows-only for quite some time, because there was no Linux version of the invasive "security plugin" banks require for online banking (the current version of that "security plugin" has a Linux version).
You fix that by making root access inconvenient enough that companies can't rely on the average random user having it enabled.
For example force you to wipe the device to unlock it as another person said in another comment. Or make it so that if you don't unlock it within 7 days of the device purchase and first boot, you cannot unlock it anymore.
AI TikTok voice “Hey guys, if you just bought a new iPhone, make sure you remove Apple’s restriction locks so they can’t control what you install. Just follow these easy steps, but make sure you do it as soon as possible, since you’ll have to set up your phone again!”
With the comments filled with people talking about how terrible Apple is for locking down their phones, everyone’s an idiot for buying such a locked down phone so they better at least unlock the bootloader, etc.
This is not a far-fetched scenario based on some videos I’ve seen sent to me by friends.
That's the whole thing--there should be choice
Even if they never charged a fee for running the store, I bet this would raise a lot of eyebrows.
My point was more about how people would react if MS did such a thing (i.e. installs to come from the store by default).
The exception is FaceIt for counter strike, but that’s not distributed through steam and is entirely third-party.