Think about that for a second. If someone finds a vulnerability in Jira, they don't just find a vulnerability in that one piece of software: they get access to support tickets, issue tracking, and more, covering lots of vulnerabilities in lots of other software. That's a big deal.
The fact that the US government had to step in and say PLEASE TAKE THIS SERIOUSLY, rather than Atlassian going into a Code Red situation on its own, shows that they just don't take the level of responsibility they've been given as seriously as their position requires. This isn't just some lousy app having a CVE. This is the keys to the kingdom for a lot of very critical software. This is systemic risk. The problem isn't the code; it's the culture.
If you “work in the world of business software” and you think that's a “complete bullshit statement,” I really hope you don't work on anything where such systemic risk is possible. Because, to turn your statement back on you, that's a complete bullshit way to treat the responsibility you have for the data you've been trusted with. Go build a social media app or an online shopping site or something, and stay out of critical systems that can create cascading vulnerabilities.