A native app on Windows, macOS, iOS and Android (TV), and some other solution for other platforms, isn't an unreasonable ask in this context.
Granted, for a lot of companies, having a substandard or suboptimal app simply does not matter to the bottom line, because the product trumps the implementation details in the end: people are willing to put up with a bloated web app because it gives them access to a good chat service (think Slack, Discord). People were willing to put up with Twitter's fail-whale outages whenever Justin Bieber tweeted because the network effect gave them something worth staying for.
The massive popularity of React Native on iOS suggests there's more to the story. Discord, Slack, Spotify etc. might also beg to differ: all Electron apps on the desktop, all dominant players in their respective markets. Consider, too, all the massive gaming successes built on the Unity engine.
The deference to native purity is an engineering conceit, not something users actually care about except in rare cases.
This assumes that native apps are, in general, buggier. Why? That's contestable at best, and backwards at worst. Perhaps at the hands of inexperienced developers it's right, but experienced developers working against the native APIs will probably produce an app with fewer bugs.
And we are talking about a billion dollar company. It can afford a handful of really good native developers for each platform. It can attract talented developers who can write cross-platform native code.
Cross-platform toolkits introduce their own class of bugs, which might require patches upstream to resolve, or annoying local forks.
As for the economics of it all, I'll leave that to the other sub-comment, which covers it with an excellent analogy to Apple. People will pay for quality.
They are, though not because "native" code is inherently buggier: the amount of code that needs to be written is multiplied by every platform you support to replicate the same experience, and more code means more bugs. Add the nasty edge cases specific to each platform, and a major increase in bugs is assured.
> And we are talking about a billion dollar company.
The size of the company doesn't matter; it makes no sense to massively increase the cost, complexity and staff size needed to support an existing product that's already massively successful.
> Cross-platform toolkits introduce their own class of bugs, which might require patches upstream to resolve, or annoying local forks.
This is true of literally any external code you import into your project. But if you reinvent the wheel, you pay to fix it yourself rather than having it fixed upstream for free.
This was a startup. Not a billion dollar company. When your app is used by millions, it's a worthwhile investment.
It's also hardly boring to achieve the same end result on multiple platforms by writing appropriate native code for each, particularly when it produces satisfyingly fluid and responsive results. An engineer who considers that boring may be in the wrong field: a UI developer should get satisfaction from developing UIs, not treat them as a stepping stone into systems development. I can't think of anything more boring in app development than building an app that operates in a mediocre way.
A carefully planned native app for each platform can still share parts of the same codebase, you're not necessarily reinventing the wheel each time.
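One common shape for that sharing is a platform-agnostic core behind thin native adapters: the core holds the logic you write once, and each platform implements a small interface natively. A minimal sketch, with hypothetical names (`Notifier`, `UnreadSummary`) invented for illustration:

```java
// Hypothetical sketch of a shared-core architecture.
// The core is pure logic with no platform dependencies; each native app
// (UIKit, Jetpack Compose, Win32, ...) supplies its own adapter.

// Thin surface each platform implements natively.
interface Notifier {
    void show(String title, String body);
}

// Shared, platform-agnostic core: the part you write once.
final class UnreadSummary {
    static String format(String channel, int unread) {
        if (unread <= 0) return channel + ": no unread messages";
        return channel + ": " + unread
                + (unread == 1 ? " unread message" : " unread messages");
    }
}

public class SharedCoreDemo {
    public static void main(String[] args) {
        // A desktop build might plug in a console/toast notifier;
        // iOS and Android builds would supply their own native adapters.
        Notifier console = (title, body) -> System.out.println(title + " | " + body);
        console.show("Inbox", UnreadSummary.format("#general", 3));
    }
}
```

The per-platform code shrinks to the adapter and the UI layer, which is exactly the part that benefits from being native.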
> it's a worthwhile investment.

Obviously not, or you'd see it done more often.