I believe he's saying P(bad | signed) < P(bad | not signed) - "I trust signed apps more than unsigned apps". I'm saying: maybe, but I'm not sure, and even if that inequality holds, the expected harm, Cost(signed) * P(bad | signed), may still be much worse than Cost(unsigned) * P(bad | not signed), because users extend more trust to signed apps, so a signed bad app can do more damage.
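The expected-harm comparison above can be sketched numerically. The probabilities and costs here are invented purely for illustration, not measurements of anything:

```python
# Sketch of the expected-harm argument: signing can lower the
# probability an app is malicious while still raising expected harm,
# if users grant signed apps enough extra trust. All numbers are
# made-up assumptions for illustration.

def expected_harm(p_bad: float, cost_if_bad: float) -> float:
    """Expected harm = P(app is malicious) * damage a malicious app can do."""
    return p_bad * cost_if_bad

# Assume signing really does make a malicious app less likely...
p_bad_signed = 0.01
p_bad_unsigned = 0.05

# ...but a signed bad app, being trusted, does far more damage
# before anyone gets suspicious.
cost_signed = 100.0
cost_unsigned = 10.0

harm_signed = expected_harm(p_bad_signed, cost_signed)        # 1.0
harm_unsigned = expected_harm(p_bad_unsigned, cost_unsigned)  # 0.5

print(harm_signed > harm_unsigned)  # True: lower probability, higher expected harm
```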
When the Transmission BitTorrent client's site was hacked to distribute ransomware, the malware was signed with an unrelated, likely stolen certificate. This happened twice within a year, with different valid (stolen) certificates:
https://blog.malwarebytes.com/threat-analysis/2016/09/transm...
Stuxnet, too, was signed with stolen certificates.
This disproves the GGP's premise that a signed app implies the developer paid for it, as well as your assumption that the paper trail for legally acquiring a certificate is an impediment to signing malware.
You're not only trusting the developer who purchased the certificate and the CA that granted the certificate, but also trusting the ongoing security of everybody else who has purchased a trusted certificate. That's a pretty open circle of trust.
Certificate revocation can limit the time of exposure once malware is distributed, but it isn't always implemented.
https://arstechnica.com/information-technology/2017/11/evasi...
"they found 189 malware samples bearing valid digital signatures that were created using compromised certificates issued by recognized certificate authorities and used to sign legitimate software. In total, 109 of those abused certificates remain valid."