Having said that, the fact that multiple vendors appear to be affected by this is certainly concerning (looking at those hashes on VirusTotal, there are signatures claiming to belong to Samsung, LG, and Mediatek, at least). My entirely unsupported guess is that they may share manufacturing for some models, and the certificates may have been on site there?
But the more significant question is how many phones trust these certificates. As I mentioned before, these could be model specific, and any leaked certificates could be tied to a very small number of phones. Or maybe Samsung used the same platform cert for all phones manufactured in the past decade, and that would be a much larger problem. With the information presently available we simply don't know how big a deal this is.
Edit to add: given that Mediatek certs appear here, and that all the vendors linked to this have shipped low-end phones based on Mediatek SoCs, it wouldn't surprise me if Mediatek turned out to be the source.
If so, this has been around for 6 years.
EDIT: A number of the listed APKs target API 22 (Android 5.1). This isn't definitive (and such apps can still run on later phones), but it also implies some of the compromises occurred around 2015.
1. This was likely mitigated through a device update. What version did it roll out with? Which devices are still unpatched?
2. How was it compromised? Was it an OEM? An internal leak at Google?
3. What is the attack vector? It sounds like it was likely side-loading apps used by some attacker, but did any of these make it onto the Play Store?
If so, this might be a way to get new versions of AOSP onto Samsung phones that are bootloader-locked and no longer supported by Samsung. Can anyone experienced with Android+Samsung comment here?
Or is the same tracker used across all Google projects?
What can I do to mitigate those risks?
That package is signed with an Android key instead of a vendor one...
>Folks, this is bad. Very, very bad. Hackers and/or malicious insiders have leaked the platform certificates of several vendors. These are used to sign system apps on Android builds, including the "android" app itself. These certs are being used to sign malicious Android apps!
>Why is that a problem? Well, it lets malicious apps opt into Android's shared user ID mechanism and run with the same highly privileged user ID as "android" - android.uid.system. Basically, they have the same authority/level of access as the Android OS process!
>Here's a short summary of [shared UID](https://blog.esper.io/android-13-deep-dive/#shared_uid_migra...), from my Android 13 deep dive.
>The post on the Android Partner Vulnerability Initiative issue tracker shared SHA256 hashes of the platform signing certificates and correctly signed malware using those certificates. Thanks to sites like VirusTotal and APKMirror, it's trivial to see who is affected...
>So, for example, this malware sample: https://virustotal.com/gui/file/b1f191b1ee463679c7c2fa7db5a2...
>scroll down to the certificate subject/issuer, and whose name do you see? The biggest Android OEM on the planet? Yeah, yikes.
>Go to APKMirror and just search for the SHA256 hash of the corresponding platform signing certificate... https://apkmirror.com/?post_type=app_release&searchtype=apk&...
>Yeah, this certificate is still being used to sign apps.
>That's just one example. [There are others at risk, too.](https://twitter.com/mszustak/status/1598406354464829440)
>In any case, Google recommends that affected parties should rotate the platform certificate, conduct an investigation into how this leak happened, and minimize the number of apps signed with the platform certificate, so that future leaks won't be as devastating.
>Okay, so what are the immediate implications/takeaways for users?
>- You can't trust that an app has been signed by the legitimate vendor/OEM if their platform certificate was leaked. Do not sideload those apps from third-party sites, i.e. from outside of Google Play or a trusted OEM store.
>- This may affect updates to apps delivered through app stores if the OEM rotates the signing key, depending on whether the app has a V3 signature. The V3 signature scheme supports key rotation; older schemes do not.
>OEMs are not required to sign system apps with V3 signatures. The minimum signature scheme version for apps targeting API level 30+ on the system partition is V2. You can check the signature scheme using the apksigner tool: https://developer.android.com/studio/command-line/apksigner
>Affected OEMs can still rotate the cert used to sign their system apps that have V2 signatures and then push an OTA update to deliver the updated apps. Then they can push app updates with that new cert, but devices that haven't received OTAs won't receive those app updates.
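The check described in the quoted thread — comparing a platform signing certificate against the published SHA-256 hashes — can be sketched in Python. This is purely illustrative: the fingerprint set below is an empty placeholder (not real leaked hashes), and the function names are my own.

```python
import hashlib

# Placeholder: in practice, populate this with the SHA-256 fingerprints
# published on the APVI issue tracker (hex-encoded, lowercase).
LEAKED_CERT_FINGERPRINTS: set[str] = set()


def cert_fingerprint(cert_der: bytes) -> str:
    """SHA-256 fingerprint of a DER-encoded signing certificate."""
    return hashlib.sha256(cert_der).hexdigest()


def is_leaked(cert_der: bytes) -> bool:
    """True if this certificate matches a known-leaked platform cert."""
    return cert_fingerprint(cert_der) in LEAKED_CERT_FINGERPRINTS
```

In practice you wouldn't even need this much code: `apksigner verify --print-certs` on an APK reports the signing certificate's SHA-256 digest directly, so you can compare its output against the published hashes by eye.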
Or there is "something else": some sort of bullshit that goes over the baseband, where updates cannot be refused (other than by putting your device in airplane mode, or off). And now, because the "platforms" couldn't protect their private keys from malefactors, kids with SDRs are going to effortlessly pwn people's phones by pushing their own software over a possible, nay probable, proprietary baseband channel.
(Tangential scifi rant: And then you add the risk of manufacturer shenanigans at the PCB or chip-image level, and you can't really do due diligence on chips without your own electron microscope (and possibly not even then). I've had this worried thought with software, too, that there is too much complexity to really understand it. In the same way our computer hardware is actually too hard for any one person to understand "completely" - that is, possess the skill-set of every individual contributor on the engineering team of a company that makes smartphones.)
> OEMs have mitigated the issues above in previous updates.
Yeah, sure they have. Because OEMs are so good at updates. It would be interesting to see how many users such updates are actually available to. My guess is between 20 and 40%.
> Fri, Nov 11, 2022 at 8:01 AM PST (20 days ago)
So it took 20 days to disclose.
If someone (for example) got Apple's iOS signing key and Apple's HTTPS certificate, Apple could suffer catastrophic damage. If someone got the PlayStation 5 signing key or the Xbox One signing key, catastrophic damage there. In a way, it's a beautiful, super-secure house... built on a single ludicrously powerful point of failure. Good thing we don't have any corrupt government agencies who might want to bribe someone for keys... yet... hopefully...
This is actually something I would fear for the future. There have been countless physical heists - famously in Antwerp, Belgium, where over $100 million in diamonds was stolen in 2003. We haven't had a major signing key stolen yet, but there's always a first time... if you can't keep $100M in diamonds safe, can you really be sure you can keep a hardware signing key safe forever? Heck, the logistics of stealing the diamonds were insane - but stealing a key only takes a pencil and a piece of paper.