Oneplus phone update introduces hardware anti-rollback
consumerrights.wiki
377 points by validatori 9 hours ago
This has been a commonplace feature on SoCs for a decade or two now. The comments seem to be taking this headline as out‑of‑the‑ordinary news, phrased as if Oneplus invented it. Even cheap devices often use an eFuse for anti-rollback. We do it at my work whenever root exploits are found that let you run unsigned code. If we don't blow an eFuse, those security updates can simply be undone: any random enemy with hardware access could plug in a USB cable, flash the older exploitable signed firmware, steal your personal data, install a trojan, etc. I get the appeal of ROMs/jailbreaking/piracy, but it relies on running obsolete, exploitable firmware. It's not like they're forcing anyone to install the security patch who doesn't want it. This is normal.
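The check being described is simple to sketch. The following is an illustrative model only (the class and method names are invented, not Qualcomm's or any vendor's actual interface): the bootloader refuses any image, even a correctly signed one, whose version is below a minimum encoded in one-time-programmable fuses.

```python
# Illustrative sketch of eFuse-based anti-rollback (hypothetical names,
# not a real vendor API). Fuses are one-time programmable: bits can be
# set but never cleared, so the minimum version can only ever increase.

class AntiRollback:
    def __init__(self):
        self.fuse_bits = 0  # each blown bit raises the minimum version by one

    @property
    def min_version(self):
        return bin(self.fuse_bits).count("1")  # thermometer/unary encoding

    def try_boot(self, image_version, image_signed=True):
        """Refuse signed-but-old images: that is the whole point."""
        if not image_signed:
            return False                 # chain of trust broken
        if image_version < self.min_version:
            return False                 # rollback to exploitable firmware
        return True

    def install_update(self, new_version):
        """A security update blows fuses so the old version can't come back."""
        while self.min_version < new_version:
            self.fuse_bits = (self.fuse_bits << 1) | 1  # blow one more fuse

ar = AntiRollback()
assert ar.try_boot(0)        # factory firmware boots
ar.install_update(2)         # security patch raises the floor to v2
assert not ar.try_boot(1)    # correctly signed v1 is now rejected
assert ar.try_boot(2)
```

Without the fuse step, `try_boot(1)` would still succeed, which is exactly the "just flash the old signed firmware over USB" attack described above.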
It ain't normal to me. If I bought a phone, I should be able to decide that I want to run different software on it.
Let's say OP takes a very different turn with their software that I am not comfortable with - say, reporting my usage data to a different country. I should be able to say "fuck that upgrade, I'm going to run the software that was on my phone when I originally bought it."
This change blocks that action, and from my understanding if I try to do it, it bricks my phone.
The whole point of this is so that when someone steals your phone, they can't install an older vulnerable version of the firmware that can be used to set it back to factory settings, which makes it far more valuable for resale.
Phone thieves aren't checking which phone brand I have before they nick my phone. Your scenario is not improved by making Oneplus phones impossible to use once they're stolen.
It reduces the expected value of stealing a phone, which reduces the demand for stolen phones.
I find it hard to believe that Oneplus is spending engineering and business resources, upsetting a portion of their own userbase, and creating more e-waste because they want to reduce the global demand for stolen phones. They only have like 3% of the total market; they can't realistically move that needle.
I don't understand what business incentives they would have to make "reduce global demand for stolen phones" a goal they want to invest in.
This is a security feature from Qualcomm, so little of Oneplus's own engineering time is spent on it.
I'm fine with a total loss of hardware. I'd rather the hardware do what I want. I own it.
It'd be ideal if the phone manufacturer had a way to delegate trust and say "you take the risk, you deal with the consequences" - unlocking the bootloader used to be this. Now we're moving to platforms treating any unlocked device as uniformly untrusted, because of all of the security problems your untrusted device can cause if they allow it inside their trust boundary.
We can't have nice things because bad people abused it :(.
Realistically, we're moving to a model where you'll have to have a locked down iPhone or Android device to act as a trusted device to access anything that needs security (like banking), and then a second device if you want to play.
The really evil part is things that don't need security (like say, reading a website without a log in - just establishing a TLS session) might go away for untrusted devices as well.
> We can't have nice things because bad people abused it :(.
You've fallen for their propaganda. It's a bit off topic from the Oneplus headline, but as far as bootloaders go, we can't have nice things because the vendors and app developers want control over end users. The Android security model is explicit that the user, vendor, and app developer are each party to the process and can veto anything. That's fundamentally incompatible with my worldview and I explicitly think it should be legislated out of existence.
The user is the only legitimate party to what happens on a privately owned device. App developers are to be viewed as potential adversaries that might attempt to take advantage of you. To the extent that you are forced to trust the vendor they have the equivalent of a fiduciary duty to you - they are ethically bound to see your best interests carried out to the best of their ability.
> That's fundamentally incompatible with my worldview and I explicitly think it should be legislated out of existence.
The model that makes sense to me personally is that private companies should be legislated to be absolutely clear about what they are selling you. If a company wants to make a locked down device, that should be their right. If you don't want to buy it, that's your absolute right too.
As a consumer, you should be given the information you need to make the choices that are aligned with your values.
If a company says "I'm selling you a device you can root", and people buy the device because it has that advertised, they should be on the hook to uphold that promise. The nasty thing on this thread is the potential rug pull by Oneplus, especially as they have kind of marketed themselves as the alternative to companies that lock their devices down.
I don't entirely agree, but neither would I be dead set against such an arrangement. Consider that while private banks are free not to do business with you, at least in civilized countries there is a government-associated bank that will always do business with anyone. Mobile devices occupy a similar space: there would always need to be a vendor offering user-controllable devices. And we would also need legal protections against app authors, given that (for example) banking apps are currently picking and choosing which device configurations they will run on.
I think it would be far simpler and more effective to outlaw vendor controlled devices. Note that wouldn't prevent the existence of some sort of opt-in key escrow service where users voluntarily turn over control of the root of trust to a third party (possibly the vendor themselves).
You can already basically do this on Google Pixel devices today. Flash a custom ROM, relock the bootloader, and disable bootloader unlocking in settings. Control of the device is then held by whoever controls the keys at the root of the flashed ROM with the caveat that if you can log in to the phone you can re-enable bootloader unlocking.
>and then a second device if you want to play.
With virtualization this could be done with the same device. The play VM can be properly isolated from the secure one.
How is that supposed to fix anything if I don't trust the hypervisor?
It's funny, GP framed it as "work" vs "play" but for me it's "untrusted software that spies on me that I'm forced to use" vs "software stack that I mostly trust (except the firmware) but BigCorp doesn't approve of".
Then yes, you will need another device. Same if you don't trust the processor.
> since any random enemy with hardware access
Once they have hardware access, who cares? They either access my data or throw it in a lake. Either way the phone is gone, and I'd better have had a good data backup and a level of encryption I'm comfortable with.
This not only makes it impossible to install your own ROMs, but permanently bricks the phone if you try. That is not a choice my hardware provider should ever get to make.
It's just another nail in the coffin of general computing, one more defeat of what phones could have been, and one more piece of personal control that consumers will be all too happy to give up because of convenience.
Sounds like that should be an option in "Developer Options" that defaults to true, and can only be disabled after re-authentication / enterprise IT authorization. I don't see anything lost for the user if it were done this way.
According to OP this does not disable bootloader unlocking in itself. It makes the up-versioned devices incompatible with all previous custom ROMs, but it should be possible to develop new ROM releases that are fully compatible with current eFuse states and don't blow the eFuse themselves.
I wonder, is there currently unpublished 0day on the SoC and they're forcing use of the latest firmware to ensure they're not vulnerable once the details become public? That would be a reason for suddenly introducing this without explanation.
I understand that there is a nuance somewhere, but that's about it.
Can you explain it in simpler terms such that an idiot like me can understand? Like what would an alternative OS have to do to be compatible with the "current eFuse states"?
People need to re-sign their releases and include the newer version of the bootloader, more or less.
Yes, though note that since the anti-rollback is apparently implemented by the bootloader itself on this Qualcomm SoC, installing the new version will blow the fuse. So the unofficial EDL-mode tools that the community seems to be most concerned about will still be unavailable, and users will still be unable to downgrade from newer to older custom ROM builds.
Blocking downgrades and the debug tools was the exact point of doing this, as far as I understand.
So that’s how, in the event of war, US adversaries will be relieved of their devices.
> The anti-rollback mechanism uses Qfprom (Qualcomm Fuse Programmable Read-Only Memory), a region on Qualcomm processors containing one-time programmable electronic fuses.
What nice, thoughtful people, to build such a feature.
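The defining property of Qfprom-style OTP memory quoted above is that programming can only set bits, never clear them; that one-way behavior is what makes the stored anti-rollback state tamper-resistant. A toy model (illustrative, not the real register interface):

```python
# Toy model of one-time programmable (OTP) fuse memory like Qfprom.
# Writes OR bits into the row; there is no operation that clears a bit,
# just as a physically blown fuse cannot be un-blown. Illustrative only.

class OtpRow:
    def __init__(self, width=32):
        self.bits = 0
        self.mask = (1 << width) - 1

    def program(self, value):
        # blowing a fuse sets its bit permanently
        self.bits |= value & self.mask

    def read(self):
        return self.bits

row = OtpRow()
row.program(0b0001)
row.program(0b0100)
assert row.read() == 0b0101
row.program(0b0000)              # "writing zeros" changes nothing
assert row.read() == 0b0101      # state only ever moves one way
```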
That’s why you sanction the hell out of pitiful CPUs like the Chinese Loongson or Russian Baikal: they're harder to disable than programmatically “blowing a fuse”.
This kind of thing is generally used to disallow downgrading the bootloader once a bug is found in the bootloader's chain-of-trust handling. Otherwise, once broken, it is forever broken. It makes sense from a trusted-computing perspective to have this. It's not even new; it was already there on P2K Motorolas 25 years ago.
You may not want trusted computing and root/jailbreak everything as a consumer, but building one is not inherently evil.
Trusted computing means trusted by the vendor and content providers, not trusted by the user. In that sense I consider it very evil.
If the user doesn't trust an operating system, why would they use it? The operating system can steal sensitive information. Trusted computing is trusted by the user to the extent that they use the device. For example, if they don't trust it, they may avoid logging in to their bank on it.
> If the user doesn't trust an operating system, why would they use it?
Because in the case of smartphones, there is realistically no other option.
> For example if they don't trust it, they may avoid logging in to their bank on it.
Except when the bank trusts the system that I don't (smartphone with Google Services or equivalent Apple junk installed), and doesn't trust the system that I do (desktop computer or degoogled smartphone), which is a very common scenario.
To trust an Android device, I need to have ultimate authority over it. That means freedom to remove functionality I don't like and make changes apps don't like. Otherwise, there are parts of practically every Android that I don't approve of, like the carrier app installer, any tracking/telemetry, most preinstalled apps, etc.
I recently moved to Apple devices because they use trusted computing differently; namely, to protect against platform abuse, but mostly not to protect corporate interests. They also publish detailed first-party documentation on how their platforms work and how certain features are implemented.
Apple jailbreaking has historically also had a better UX than Android rooting, because Apple platforms are more trusted than Android platforms, meaning that DRM protection, banking apps and such will often still work with a jailbroken iOS device, unlike most rooted Android devices. With that said though, I don't particularly expect to ever have a jailbroken iOS device again, unfortunately.
Apple implements many more protections than Android at the OS level to prevent abuse of trusted computing by third-party apps, and give the user control. (Though some Androids like, say, GrapheneOS, implement lots that Apple does not.)
But of course all this only matters if you trust Apple. I trust them less than I did, but to me they are still the most trustworthy.
>to protect against platform abuse, but mostly not to protect corporate interests
What do you mean by this? On both Android and iOS app developers can have a backend that checks the status of app attestation.
Do you actually, from the bottom of your heart, believe that ordinary consumers think like this? They use TikTok and WhatsApp and Facebook and the Wal-Mart coupon app as a product of deep consideration on the web of trust they're building?
Users don't have a choice, and they don't care. Bitlocker is cracked by the feds; iOS and Android devices can get unlocked or hacked with commercially available grey-market exploits. Push notifications are bugged, apparently. Your logic hinges on an idyllic philosophy that doesn't even exist in security-focused communities.
Yes, I do believe from the bottom of my heart that users trust the operating systems they use. Apple and Google have done a great job at security and privacy, which is why it seems like users don't care. It's like asking why you have a system administrator if the servers are never down. When things are run well, the average person seems ignorant of the problems.
Google certainly hasn't done a great job on privacy. Android devices leak so much information.
https://arstechnica.com/information-technology/2024/10/phone...
https://peabee.substack.com/p/everyone-knows-what-apps-you-u...
About Apple I just don't know enough, because I haven't seriously used them for years.
Yet, in the big picture Google is doing a good enough job that those information leaks have not caused them harm. When you really zoom in you can find some issues, but the real world impact of them is not big enough to influence most consumers.
What sort of hypothetical harm are you imagining here? Suppose the information leaks were a serious issue to me - what are my options? Switch to Apple? I doubt most consumers are going to consider something like postmarketos.
The carriers in the US were caught selling e911 location data to pretty much whoever was willing to pay. Did that hurt them? Not as far as I can tell, largely because there is no alternative and (bizarrely) such behavior isn't considered by our current legislation to be a criminal act. Consumers are forced to accept that they are simply along for the ride.