To add to your comment, Google has been using BERT to power Search since 2019: https://blog.google/products/search/search-language-understa...
I'd guess the only reason they don't use larger models is compute cost. Running something like ChatGPT for 4 billion users on today's hardware would be an unsustainable business. That thought leads me to wonder: if Google offered a Search "Premium" tier powered by the latest LLMs, how much would people pay for it?