Also, IIRC iPhones have this feature where, if you appear to be under duress, they will refuse to unlock and will disable Face ID. Is this true?
>GrapheneOS improves the security of the fingerprint unlock feature by only permitting 5 total attempts rather than implementing a 30 second delay between every 5 failed attempts with a total of 20 attempts. This doesn't just reduce the number of potential attempts but also makes it easy to disable fingerprint unlock by intentionally failing to unlock 5 times with a different finger.
Loved Graphene, and the Pixel worked flawlessly, but man, that unlock thing drove me nuts more than a few times.
Though with all the devices GrapheneOS supports, there are really only two fingers you can plausibly use: your thumbs, usually the one on your dominant hand. It is quite awkward to use anything else.
All this biometric talk in the world, and it's rarely made this convenient for the user.
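For anyone who hasn't used it, the difference between the two throttling policies in that quote is roughly the following (a hypothetical sketch in Python, not GrapheneOS's or stock Android's actual code):

```python
# Hypothetical sketch of the two fingerprint-throttling policies
# described in the quote above; not actual GrapheneOS or AOSP code.

def stock_policy(failed_attempts: int) -> str:
    """Stock behaviour: a 30 s delay after every 5 failures, 20 attempts total."""
    if failed_attempts >= 20:
        return "fingerprint unlock disabled until passcode entry"
    if failed_attempts > 0 and failed_attempts % 5 == 0:
        return "wait 30 s, then try again"
    return "try again"

def grapheneos_policy(failed_attempts: int) -> str:
    """GrapheneOS behaviour: 5 total attempts, then passcode required."""
    if failed_attempts >= 5:
        return "fingerprint unlock disabled until passcode entry"
    return "try again"

# Deliberately failing 5 times with the wrong finger is therefore a
# quick way to force the phone into passcode-only mode.
```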
It was likely almost as fast as a physical-keyboard smartphone for instant entry into an app.
Cut to my phone failing to recognize my fingerprint whenever it feels like it, or maybe because the humidity is 0.5% off from the ideal value.
sigh
Heh, it would suck to be beaten with a wrench to unlock your phone, relent to finally make it stop, and then have the phone go "nope, sorry. If you're gonna be dumb, you gotta be tough".
Sort of: if you hold the buttons on both sides of the phone for about three seconds, it will bring up the Power Off/SOS screen. You do not need to interact with that screen, just display it. Easy-peasy, you can do it with the phone in your pocket. Once that screen has been displayed, the phone requires the passcode to unlock. The courts have determined that the passcode is protected by the 5th Amendment, but biometrics are not.
https://arstechnica.com/tech-policy/2023/12/suspects-can-ref...
Not legal advice. Having a trusted contact remotely wipe the device is also a potential option, given the appropriate iCloud credentials and a way to get a message to them [2], assuming the device is not powered down or kept in a physical location that blocks internet/cellular channels.
[1] New Apple security feature reboots iPhones after 3 days, researchers confirm - https://news.ycombinator.com/item?id=42143265 - November 2024 (215 comments)
[2] Erase a device in Find Devices on iCloud.com - https://support.apple.com/guide/icloud/erase-a-device-mmfc0e...
However, on iPhones that have the Emergency SOS feature, invoking it disables biometry until you enter your passphrase/code.
Biometry is also disabled until re-authentication if you invoke the shutdown menu by holding the power button (or power + volume up, depending on the model).
Neither of those will get you to the Before First Unlock state, however. That is the ideal if you are attempting to protect access to your phone’s data in any adversarial scenario. You must restart/shut down the phone to get back to that.
Same applies to iPads.
There may be vulnerabilities, of course. In the Before First Unlock state there is not enough cryptographic material available in memory to decrypt application data. The full set of keying material is both user and device specific.
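Roughly, the data keys are derived from both a hardware-bound secret and the user's passcode; here is a toy sketch of why BFU data stays sealed (illustrative names and parameters only, not Apple's or Google's actual design, which uses the Secure Enclave / hardware keystore and carefully tuned KDFs):

```python
# Toy illustration of "user and device specific" keying material.
# Names and parameters are illustrative, not any vendor's real design.
import hashlib, os

DEVICE_SECRET = os.urandom(32)  # stand-in for a secret fused into the SoC

def derive_unlock_key(passcode: str) -> bytes:
    # Both inputs are required: the device secret never leaves hardware
    # in a real design, and the passcode is never stored on disk.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_SECRET, 200_000)

# Before First Unlock, the passcode has not been entered since boot, so
# derive_unlock_key() has never run and the data keys are not in memory.
```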
While there's always https://xkcd.com/538/, there are not currently quantum computers that can factor 4096-bit RSA keys, so the court can order whatever it wants: unless they have a way past that (which may involve variations on xkcd 538), they ain't getting shit out of a properly configured digital safe. (Construction of said safe is left as an exercise for the reader.)
For the relative handful who are custodians of that sort of data, history suggests the minority without a readily achievable breaking point is smaller than they'd like to admit. The true believers who are left are then a minority that is hardly impossible to track and subvert through attacks that don't involve decryption on a device.
The point of that XKCD wasn't to be THE SINGULAR EXAMPLE; it's sort of a Zen koan for people who only think in terms of technical risks and solutions.
(To be clear I’m not in support of anything close to the current state of affairs and wish we had way stronger privacy rights even in the case of police investigations)
The duress password feature is also useful. Entering it will completely wipe the phone and reset it to factory.
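Conceptually, the unlock path just treats the duress credential as one more PIN whose "success" action is a wipe; a hypothetical sketch, not GrapheneOS's actual implementation:

```python
# Hypothetical duress-PIN check at unlock time (real implementations
# compare salted hashes, not plaintext, and wipe irreversibly).

def handle_unlock(entered: str, real_pin: str, duress_pin: str) -> str:
    if entered == duress_pin:
        return "wipe device and reset to factory"
    if entered == real_pin:
        return "unlock"
    return "reject"
```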
We just need a UX which makes it impossible to know how many profiles a phone has configured. Not some kind of sneaky hidden mode that you can be labeled a terrorist for having enabled; just that's how it works: you have to know a profile exists in order to log into it.
Of course it's not going to stand up to forensic scrutiny, but that's not what the feature is about anyhow.
This is famously used by Uber to protect their systems from the French police, for instance.
Mine gets a quick reboot before going through a checkpoint. This disables biometrics until I enter a passcode.
Maybe someone with more knowledge can chime in here.
> Automatic Restart is a security mechanism in iOS 18.1, iPadOS 18.1, and later that leverages the Secure Enclave to monitor device unlock events. If a device remains locked for a prolonged period, it automatically restarts, transitioning from an After First Unlock state to a Before First Unlock state. During the restart, the device purges sensitive security keys and transient data from memory.
https://help.apple.com/pdf/security/en_US/apple-platform-sec...
> [...] inactivity reboot triggers exactly after 3 days (72 hours). [...]
https://naehrdine.blogspot.com/2024/11/reverse-engineering-i...
GrapheneOS also has this (https://grapheneos.org/features#auto-reboot) with a default of 18 hours.
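The mechanism is essentially a watchdog on the time since the last unlock; a hypothetical sketch using the thresholds from the links above (72 hours on iOS 18.1, 18 hours by default on GrapheneOS):

```python
# Hypothetical inactivity-reboot watchdog; not Apple's or GrapheneOS's code.
import time

INACTIVITY_LIMIT_S = 72 * 3600  # 72 h on iOS 18.1; 18 * 3600 by default on GrapheneOS

def check_inactivity(last_unlock_ts: float) -> str:
    if time.time() - last_unlock_ts >= INACTIVITY_LIMIT_S:
        # Rebooting purges decrypted keys from RAM, returning the
        # device to the Before First Unlock state.
        return "reboot"
    return "stay up"
```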
Maybe one could try to force restart (https://support.apple.com/en-gb/guide/iphone/iph8903c3ee6/io...) to quickly get to BFU. But I could imagine that it'd be hard to remember and then execute the right steps in a stressful situation.
If I don't click those 5 presses fast enough, it instead opens Apple Cash or whatever it's called.
I'm assuming that in a stressful situation it'd be much more consistent to hold down power and volume rather than clicking quickly.
The companies are secretive, so who knows what they are up to. What we do know is that these companies do not tell the whole truth when explaining their publicly visible conduct, including their data collection practices.
For example, a so-called "tech" company might claim it needs a user's phone number for "security" purposes while the data actually serves other purposes for the company, purposes the user might find objectionable if they knew about them. (This actually happened.)
The mobile phone has become a computer that the user cannot truly control. Companies can remotely install and run code on these computers at any time, for any reason.^1 If the user stores data on the phone, the company tries to get the user to sync it to the company's computers.
If the companies make promises, e.g. about "privacy", then these promises are unlikely to be enforceable. It is rather difficult, if not impossible, to verify that such promises are kept, or to discover that they have been breached. Unfortunately, when the promises are broken there is no adequate remedy. It's too late.
1. This unfettered access can be blocked, but a culture has emerged around actively doing the opposite. That the so-called "tech" companies are the primary beneficiaries is surely a fortuitous coincidence.
Anything else is insecure in principle, and getting less and less secure in practice, as the acquisition, collation, sharing, and leveraging of unpermissioned information becomes cheaper, easier, and more profitable by the day.
Cryptography provides a long menu of ways entities can exchange information and interact, without sharing information that is not functionally relevant.
Making those capabilities the basis for digital inter-entity trade is the only way we will get real privacy and keep the massive predatory surveillance-manipulation-for-hire economy from continuing to metastasize, with AI driving the value and opportunity of its leverage against us ever upward.
Strict laws might have been a practical solution a couple of decades ago, when information-based services began hyperscaling the surveillance-manipulation economy. They wouldn't be a bad thing now. But those laws seem unlikely, so the technical solution is the only path forward.
I don't think people really absorb how much of the economy's value is parasitically skimmed off by the two-sided, centralized surveillance-manipulation business model, from consumers and from ad buyers/producers; the colossal revenues of Google and Facebook, to start. Or how effectively that is incentivizing and funding continued growth in addictive, manipulative, and (through pervasiveness) dominant "personalized" content, which will make things much worse.
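To make the cryptography point above slightly more concrete, one of the simplest items on that menu is a salted hash commitment: you can later prove a value hasn't changed without disclosing it up front. Illustrative sketch only; real deployments use vetted protocols (blind signatures, zero-knowledge proofs, and so on):

```python
# Illustrative salted hash commitment: share a proof now, the data only
# if/when it becomes functionally relevant. Not a production protocol.
import hashlib, hmac, os

def commit(value: bytes) -> tuple[bytes, bytes]:
    """Publish the commitment; keep the salt and value private."""
    salt = os.urandom(32)
    return hashlib.sha256(salt + value).digest(), salt

def verify(commitment: bytes, salt: bytes, value: bytes) -> bool:
    """On reveal, anyone can check the value was fixed all along."""
    return hmac.compare_digest(commitment, hashlib.sha256(salt + value).digest())
```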
No, sorry, that's just silly. Routine biometrics have made personal devices near-unhackable and almost un-stealable. They have turned automated password attacks into a historical memory. They are a huge boon to consumers. Yuge, even.
Can they be abused? Yeah, sure. I guess everything can. But to cynically claim they have no value, or negative value, is just detached from reality.
The 5th Amendment has been (so far) interpreted to only limit things that require conscious thought, such as remembering a password and speaking it or typing it.
Extreme example: imagine a stroke or head injury causing memory loss.
OTOH, DNA/face/fingerprints usually can't be 'forgotten'.
And unlike a witness, you can legally lie and mislead officers.
If what you are being charged with carries a larger penalty than simple perjury or destruction of evidence, it makes complete sense to use techniques like this. Perjury is one of the hardest charges to prove and among the least prosecuted in the USA.
As an example, people I know who have gone through the corrupt courts more than once said the feedback from their last case was that the prosecutor looked like a fish flopping out of water.
The court stands no chance when someone uses techniques that would require the government agencies to use their secret programs and tactics; they would rather drop and lose the case. Most of the time they are also extremely incompetent when it comes to technology and have to hire many outside consultants, which gives you more chances to fight. An easy win for the citizen.
A solution that offers some plausible deniability could be interesting.
If the phone is in your pocket and somebody puts a gun to your head and tells you not to move, you are not pressing anything on your phone.
My impression is deliberately doing this would be illegal. It would have to be convincingly deniable somehow.
Is there a way to do that?
Which requires them to prove they know that the device likely contains relevant information. Just being a party to a court case doesn't mean you're forbidden from deleting anything ever again... like I said, there are very specific rules for evidence, and they can't begin to claim something relevant was destroyed if they can't even show they had any idea of what might have been destroyed in the first place.
You're right that in normal circumstances you can routinely delete records for data hygiene, to save money, as part of a phone repair, and so on, unless you've been court ordered otherwise.
And remember that without a court case alleging something in the first place, they wouldn't even have access to the device to know (1) it existed and (2) it might have had something useful on it. If I had two devices in my house and they're both clean, you can't just say "oh, we think one of them had some evidence that was destroyed"... you need some kind of proof that it at least likely contained something relevant in the past before you can even begin to presume that something might have been destroyed.
Same for your second paragraph: "oh we think one of them had some evidence..." - that's not how it works! It's your intent to destroy evidence that is the crime, not whether you actually destroyed evidence. They do not need to prove that you destroyed evidence, or even that the device likely stored evidence, to get you convicted.
This is the main thing you're saying that is bad legal advice.
For your other point, yes, if they can't tell anything happened, or it seems like an accident, then you're probably going to get away with it. This happens a lot. I think that's a different topic. Original topic was: if you wink at your phone or use a weird finger (or some other visible gesture) and now your phone's wiped, could you get in legal trouble for that sequence of events.
Accidentally destroying evidence can still carry a serious penalty, but yes, intent is generally the most important factor. Even absent intent, though, it can still help the prosecution to show that the device had something useful to them on it.
E.g. if one had a "dead man's switch" phone that required a passkey every x minutes, and each time you entered it, the next threshold was set...
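Something like this, hypothetically (an illustrative sketch, not an existing feature of any shipping OS):

```python
# Hypothetical "dead man's switch": the owner must re-enter a passkey
# before each deadline, or the device locks down (or wipes).
import time

CHECK_IN_INTERVAL_S = 30 * 60  # illustrative: every 30 minutes

class DeadMansSwitch:
    def __init__(self) -> None:
        self.deadline = time.time() + CHECK_IN_INTERVAL_S

    def check_in(self, passkey_ok: bool) -> None:
        if passkey_ok:
            # Each successful entry pushes the next threshold out.
            self.deadline = time.time() + CHECK_IN_INTERVAL_S

    def tick(self) -> str:
        if time.time() > self.deadline:
            return "trigger: reboot to BFU or wipe"
        return "armed"
```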
You'd also have to rely on this unnamed other to force that particular finger, rather than the others...
suspect: "no you cant force me to put my pinky there", attempts to make pinky inaccessible.
other: "we will charge you with obstruction if you resist placing the pinky"
Never speak with or offer any assistance to the police or government. It will come back to hurt you.
Why do you think it's appropriate to talk to people like this?