In designing my search engine, I ended up building a custom index instead of using an off-the-shelf database solution. Frankly, there is so much data that a SQL database can't even load it when run on the same server that hosts the entire search engine.
With my own set-up, I can keep the data in pre-calculated immutable tables that are memory-mapped off disk. Because the tables never change, reads need no synchronization, which means I can write lock-free code. Updates are large, hour-long batch operations replayed from a written journal.
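To make the idea concrete, here is a minimal sketch in Python (not the actual implementation, and the file path and record layout are invented for illustration): a table of fixed-width records is written once, then memory-mapped read-only and binary-searched with no locking at all, because nothing ever mutates it.

```python
import mmap
import struct

# Each record is a (key, value) pair of 64-bit unsigned ints, 16 bytes total.
RECORD = struct.Struct("<QQ")

def build_table(path, pairs):
    """Write records sorted by key; the file is never modified afterwards."""
    with open(path, "wb") as f:
        for key, value in sorted(pairs):
            f.write(RECORD.pack(key, value))

def lookup(mm, key):
    """Binary search over the memory-mapped table. Read-only, so any number
    of threads can call this concurrently without locks."""
    lo, hi = 0, len(mm) // RECORD.size
    while lo < hi:
        mid = (lo + hi) // 2
        k, v = RECORD.unpack_from(mm, mid * RECORD.size)
        if k == key:
            return v
        if k < key:
            lo = mid + 1
        else:
            hi = mid
    return None

build_table("/tmp/table.dat", [(42, 1000), (7, 99), (1000, 5)])
with open("/tmp/table.dat", "rb") as f:
    mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    print(lookup(mm, 42))  # -> 1000
    print(lookup(mm, 8))   # -> None
```

A batch update in this model just writes a fresh table from the journal and swaps the mapping, rather than mutating records in place.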
For the lexicon I'm also rolling a special one-way compression scheme/hash function that guarantees uniqueness but only works due to quirks with the data.
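The post doesn't spell out the actual scheme, but a toy example shows how data quirks can make collision-free encoding possible where a general-purpose hash cannot guarantee it. Suppose (hypothetically) most lexicon terms are short lowercase ASCII: then each letter fits in 5 bits and any word of up to 12 letters packs injectively into a 64-bit integer, so distinct words are mathematically guaranteed distinct codes. (This toy version happens to be reversible; the real point is that domain constraints buy you a uniqueness guarantee.)

```python
def pack_word(word):
    """Injectively pack a lowercase a-z word of up to 12 letters into a
    64-bit int. Letters map to 1..26 (never 0), so different lengths can't
    collide either: distinct words always yield distinct integers."""
    assert len(word) <= 12 and word.isalpha() and word.islower()
    code = 0
    for ch in word:
        code = (code << 5) | (ord(ch) - ord("a") + 1)
    return code

print(pack_word("cat"))  # -> 3124
```

A general-purpose database can't offer this, because it can't assume anything about the shape of your keys.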
General-purpose databases can't do these things, because then they wouldn't be general-purpose databases.
SQL databases perform really well on average, provided your queries are reasonably optimized, but they do not represent the pinnacle of performance. Whenever you want the fastest, most efficient code for a particular use case, it pays enormous dividends to roll your own implementation with domain awareness: you can cut corners general solutions simply can't.