I was running one of them, and entering Kaggle competitions throughout 2021 and 2022 using them. Many efforts built on sentence-transformers (and new PhD projects) were thrown in the trash when the InstructGPT models and ChatGPT arrived. It's like developing a much better bicycle (let's say an e-bike), but then cars come out. It was like that.
The future looked incredibly creative with cross-encoders, things like semantic paths, using the latent space to classify - everything was exciting. An all-in-one LLM that eclipsed embeddings on everything but speed was a bit of a killjoy.
Companies that switched their existing indexing to sentence transformers aren't exactly innovating; that kind of migration has happened once or twice a decade for the last few decades. I believe that was the parent's point, in a way. And tbh, the improvement in results has never been noticeable to me; exact match is already 90% of the solution to retrieval (maybe not search) - we just take it for granted because we're so used to it.
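To make the exact-match point concrete, here's a toy sketch: a plain term-overlap retriever over a tiny made-up corpus, no embeddings involved. The corpus, query, and scoring function are all invented for illustration, not taken from any real system.

```python
def tokenize(text):
    # Naive whitespace tokenizer; real systems would also stem/normalize.
    return set(text.lower().split())

def exact_match_score(query, doc):
    # Fraction of query terms that appear verbatim in the document.
    q, d = tokenize(query), tokenize(doc)
    return len(q & d) / len(q) if q else 0.0

corpus = [
    "how to fine-tune a sentence transformer",
    "recipes for sourdough bread",
    "indexing documents with an inverted index",
]

query = "fine-tune sentence transformer"
ranked = sorted(corpus, key=lambda d: exact_match_score(query, d), reverse=True)
print(ranked[0])  # the fine-tuning doc wins on pure term overlap
```

For queries like this, where the user already knows the vocabulary of the thing they want, simple term matching ranks the right document first; embeddings mostly earn their keep on the remaining tail of paraphrased or vocabulary-mismatched queries.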
I fully believe that in a world without GPT-3, HN demos would be full of sentence transformers and other cool technology used in creative ways, compared to how rarely you see them now.