My understanding is that these certs are vendor-specific (and potentially more finely grained than that - I think it'd be fine to have a separate cert per model, for instance) and are rooted in the firmware, not the hardware - that is, it's not the secure boot signing key, and a new firmware update can simply use a different platform certificate.

Having said that, the fact that multiple vendors appear to be affected by this (looking at those hashes on VirusTotal, there are signatures that claim to be owned by Samsung, LG, and MediaTek, at least) is certainly concerning. My entirely unsupported guess is that they may all share manufacturing for some models, and the certificates may have been on site there?

But the more significant question is how many phones trust these certificates. As I mentioned before, these could be model specific, and any leaked certificates could be tied to a very small number of phones. Or maybe Samsung used the same platform cert for all phones manufactured in the past decade, and that would be a much larger problem. With the information presently available we simply don't know how big a deal this is.

Edit to add: given that MediaTek certs appear here, and given that all the vendors linked to this have shipped low-end phones based on MediaTek SoCs, it wouldn't surprise me if MediaTek turns out to be the source.

Is the "First Time Seen in the Wild" timestamp accurate? (2016-02-13 11:50:54 UTC)

If so, this has been around for 6 years.

EDIT: A number of listed APKs target API 22 (Android 5.1). This isn't definitive (apps targeting API 22 can still run on later Android versions), but it also implies some compromises occurred around 2015.

This thread leaves a lot of unanswered questions:

1. This was likely mitigated through a device update. What version did it roll out with? Which devices are still unpatched?

2. How were the certificates compromised? Was it an OEM? An internal leak at Google?

3. What is the attack vector? It sounds like it was likely side-loading apps used by some attacker, but did any of these make it onto the Play Store?

I wonder if any of these leaked keys were used to sign the base android installers for the phones themselves?

If so, this might be a way to get new versions of AOSP onto Samsung phones that are bootloader-locked and have no current support from Samsung. Can anyone experienced with Android+Samsung comment here?

Why is this on the Chromium issue tracker?

Or is the same tracker used across all Google projects?

What risks does this cause me as an android user? Does this mean my phone might accept auto updates that install malware onto my device?

What can I do to mitigate those risks?

What's the blast radius of this? Are only specific models of phones affected (if so, which?), or does this impact entire brands or the whole ecosystem?
Here's a great example of the pitfalls of forgoing real security in favor of centralized "security" that exists to secure platform owners' bottom lines and to let them decide what can and can't run on devices.
One interesting thing is that 2abeb8565c30a3d2af0fc8ea48af78a702a1c3b432845e3f04873610fac09e1b was mentioned in revision 1 of the original description but removed in revisions 2/3.

That package is signed with an Android key instead of a vendor one...

I'd love to have this key for my device so I can actually have full ownership of it.
So this was disclosed on November 11 (edit: or maybe May 13, as per the green text?) and became public yesterday, November 30. That leaves little time for Android devices to get new keys, no?
Root for everyone? Maybe that's a good thing, finally restoring the balance of power from the megacorps to the actual owners.
There seems to be some confusion about this, so I'll just copy/paste what I wrote on Twitter:

>Folks, this is bad. Very, very bad. Hackers and/or malicious insiders have leaked the platform certificates of several vendors. These are used to sign system apps on Android builds, including the "android" app itself. These certs are being used to sign malicious Android apps!

>Why is that a problem? Well, it lets malicious apps opt into Android's shared user ID mechanism and run with the same highly privileged user ID as "android" - android.uid.system. Basically, they have the same authority/level of access as the Android OS process!

>Here's a short summary of [shared UID](https://blog.esper.io/android-13-deep-dive/#shared_uid_migra...), from my Android 13 deep dive.

>The post on the Android Partner Vulnerability Initiative issue tracker shared SHA256 hashes of the platform signing certificates and of correctly signed malware using those certificates. Thanks to sites like VirusTotal and APKMirror, it's trivial to see who is affected...

>So, for example, this malware sample: https://virustotal.com/gui/file/b1f191b1ee463679c7c2fa7db5a2...

>scroll down to the certificate subject/issuer, and whose name do you see? The biggest Android OEM on the planet? Yeah, yikes.

>Go to APKMirror and just search for the SHA256 hash of the corresponding platform signing certificate... https://apkmirror.com/?post_type=app_release&searchtype=apk&...

>Yeah, this certificate is still being used to sign apps.

>That's just one example. [There are others at risk, too.](https://twitter.com/mszustak/status/1598406354464829440)

>In any case, Google recommends that affected parties rotate the platform certificate, conduct an investigation into how this leak happened, and minimize the number of apps signed with the platform certificate, so that future leaks won't be as devastating.

>Okay, so what are the immediate implications/takeaways for users?

>- You can't trust that an app has been signed by the legitimate vendor/OEM if their platform certificate was leaked. Do not sideload those apps from third-party sites, outside of Google Play or a trusted OEM store.

>- This may affect updates to apps that are delivered through app stores if the OEM rotates the signing key, depending on whether that app has a V3 signature. The V3 signature scheme supports key rotation; older schemes do not.

>OEMs are not required to sign system apps with V3 signatures. The minimum signature scheme version for apps targeting API level 30+ on the system partition is V2. You can check the signature scheme using the apksigner tool: https://developer.android.com/studio/command-line/apksigner

>Affected OEMs can still rotate the cert used to sign their system apps that have V2 signatures and then push an OTA update to deliver the updated apps. Then they can push app updates with that new cert, but devices that haven't received OTAs won't receive those app updates.
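
To make the shared UID point above concrete: the privilege escalation relies on nothing more exotic than a manifest entry. A hypothetical sketch (package name and label are invented for illustration; this only works if the APK is signed with the platform certificate):

```xml
<!-- Hypothetical manifest sketch: combined with a platform-certificate
     signature, this entry makes the app run as android.uid.system. -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.notmalware"
    android:sharedUserId="android.uid.system">
    <application android:label="Totally Legit App" />
</manifest>
```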

What is the risk here? Presumably Google Play would not accept an APK that was signed by "android", right? Assuming that is the case, the only risk would be installing APKs from alternative sources, like F-Droid, which should probably check for strange signings too, IMHO.
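
For what it's worth, a store could do a coarse check without shelling out to apksigner by parsing the APK Signing Block directly and seeing which signature-scheme IDs are present (v2 = 0x7109871a, v3 = 0xf05368c0). A rough sketch based on my reading of the APK Signing Block format, not production code:

```python
import struct

EOCD_MAGIC = b"\x50\x4b\x05\x06"       # ZIP End of Central Directory magic
SIG_BLOCK_MAGIC = b"APK Sig Block 42"
SCHEME_IDS = {0x7109871A: "v2", 0xF05368C0: "v3"}

def signature_schemes(apk_bytes: bytes) -> set:
    """Return the signature-scheme names found in an APK's signing block."""
    # Locate the End of Central Directory record (search from the end).
    eocd = apk_bytes.rfind(EOCD_MAGIC)
    if eocd < 0:
        raise ValueError("not a ZIP/APK")
    # Central Directory offset sits 16 bytes into the EOCD (little-endian u32).
    cd_offset = struct.unpack_from("<I", apk_bytes, eocd + 16)[0]
    # The APK Signing Block, if present, ends right before the Central Directory:
    #   size (u64) | id/value pairs | size (u64) | magic (16 bytes) | CD ...
    if apk_bytes[cd_offset - 16:cd_offset] != SIG_BLOCK_MAGIC:
        return set()  # no signing block -> at most v1 (JAR) signing
    size = struct.unpack_from("<Q", apk_bytes, cd_offset - 24)[0]
    start = cd_offset - size - 8       # start of the block (first size field)
    pos, end = start + 8, cd_offset - 24
    found = set()
    while pos < end:
        pair_len, pair_id = struct.unpack_from("<QI", apk_bytes, pos)
        found.add(SCHEME_IDS.get(pair_id, hex(pair_id)))
        pos += 8 + pair_len            # u64 length field + (u32 id + value)
    return found
```

A real check would still have to validate the signatures themselves; this only tells you which schemes an APK claims to carry.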

Or there is "something else", some sort of bullshit that goes over the baseband, where updates cannot be refused (other than putting your device in airplane mode, or off), and now, because the "platforms" couldn't protect their private keys from malefactors, kids with SDRs are going to effortlessly pwn people's phones by pushing their own software over a possible, nay probable, proprietary baseband channel.

(Tangential scifi rant: And then you add the risk of manufacturer shenanigans at the PCB or chip-image level, and you can't really do due diligence on chips without your own electron microscope (and possibly not even then). I've had this worried thought with software, too, that there is too much complexity to really understand it. In the same way our computer hardware is actually too hard for any one person to understand "completely" - that is, possess the skill-set of every individual contributor on the engineering team of a company that makes smartphones.)

This is confidence-inspiring, considering that Google now requires letting them sign your apps for some features.

> OEMs have mitigated the issues above in previous updates.

Yeah, sure they have. Because OEMs are so good at updates. It would be interesting to see how many users such updates are actually available to. My guess is between 20 and 40%.

In my opinion, mobile authentication should be performed using the registered device's IMEI number plus the SIM's unique number or the telecom provider's unique token number. This could easily be done by Microsoft Intune. It would be one additional layer of protection for your mobile auth.
So phones are heavily locked down by manufacturers for our own security, but apparently they cannot secure their own keys. If anyone outside a few million nerds really understood this and saw all the vulnerabilities we have seen over the past few years, locking users out of their devices would be completely indefensible.
Are there any VirusTotal employees here who can help figure out the answer to this so that we can find all the affected APKs? https://twitter.com/ArtemR/status/1598444589269909504
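
In the meantime, if you already have a certificate on hand (e.g. extracted from an APK with apksigner or keytool), matching it against the hashes published in the APVI issue is just a SHA-256 over the DER-encoded certificate. A minimal sketch; the set below contains only the one hash mentioned elsewhere in this thread, not the full published list:

```python
import hashlib

def cert_fingerprint(der_bytes: bytes) -> str:
    """SHA-256 fingerprint of a DER-encoded certificate, as lowercase hex."""
    return hashlib.sha256(der_bytes).hexdigest()

# Hashes from the APVI issue tracker (incomplete; one example from this thread).
LEAKED = {
    "2abeb8565c30a3d2af0fc8ea48af78a702a1c3b432845e3f04873610fac09e1b",
}

def is_leaked(der_bytes: bytes) -> bool:
    return cert_fingerprint(der_bytes) in LEAKED
```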
The title of the post is super scary but I have no idea what it means. There is no description about it in the linked page. Reading through the comments doesn’t help either. I would just wait for a proper write up.
Based on what I’ve read over the past year about security issues, it’s coming into focus how some personal data of mine may have gotten leaked: it was never safe to begin with.
Is this like Google's central "root key"? Or does each Android OS distributor, e.g. Samsung, LineageOS, etc., have their own certificate?
Well that's a bummer
Interesting for whoever is curious:

> Fri, Nov 11, 2022 at 8:01 AM PST (20 days ago)

So it took 20 days to disclose.

I had speculated for a while that Secure Boot, Widevine, Trusted Computing, and the like all have some pretty serious central points of failure. So much so that it would be a modern heist of the century if the keys were stolen.

If someone (for example) got Apple's iOS signing key and Apple's HTTPS certificate, Apple could suffer catastrophic damage. If someone got the PlayStation 5 signing key or the Xbox One signing key, catastrophic damage there. In a way, it's a beautiful, super-secure house... built on a single ludicrously powerful point of failure. Good thing we don't have any corrupt government agencies who might want to bribe someone for keys... yet... hopefully...

This is actually something I would fear for the future. There have been countless physical heists, such as the one in Antwerp, Belgium, where over $100 million in diamonds were stolen in 2003. We haven't had a major signing key stolen yet, but there's always a first day... if you can't keep $100M in diamonds safe, can you really be sure you can keep a hardware signing key safe forever? Heck, the logistics of stealing the diamonds are insane, but stealing a key only takes a pencil and a piece of paper.

The comments on some issues are downright unsettling. https://bugs.chromium.org/p/apvi/issues/detail?id=34#c9
If I understand this correctly this is an orbital nuke on android security.