But even I would rationally question the survey results, as, like all surveys, it only represents those surveyed.
If you scroll to the bottom of the article there is a section "About the respondents and methodology" and to quote, "This year, 306 people participated in the Grafana Labs Observability Survey. We did not contract an outside agency to solicit responses. Instead, we reached out to our community to participate, through our website, social media, and at various Grafana Labs and OSS events".
So let's be upfront here: it's a biased and small sample. But the consistent methodology across prior years means directional trends can still be identified, even if you cannot extrapolate any specific number to the whole industry.
There are indicative signals in there for trends and direction, but it's not a 100K audience drawn from all industries and representative of wider engineering.
The number is high due to who was asked to respond.
91% of people claiming to do anything gives me pause.
The only place Datadog was mentioned was in invisible text. (Search the page)
Which seems absolutely crazy to me
I have even seen it used in demos where the host works for a company that has a first-party dashboard product (Google, Amazon, etc.).
This should be right at the top as it indicates a substantial bias in the data.
Calling this survey "Key findings and analysis on the state of observability" is disingenuous at best given that methodology. If this team genuinely believes such a small sample size and narrow grouping give an accurate analysis of the big picture, I would be hesitant about any analysis their product can provide.
I have only really found Pyroscope.
(I've never used it, just found it after a quick search)