Works well for the vast majority of our customers (although we get the very occasional complaint about wanting a dev environment that works offline). The dataset sizes for local dev are usually so small that the cost rounds to free.
It's only occasional because the people who care about dev environments that work offline are most likely to just skip you and move on.
For day-to-day developer experience, as well as for segments like customers with security and privacy concerns, being able to host locally is essential.
Fair enough if you don't care about those segments of the market, but don't confuse a small number of people asking about it with a small number of people wanting it.
I'm watching them with excitement. We all learn from each other. There's so much to do.
- start small on a laptop; going through procurement at companies is a pain
- test things reliably in CI; outages don’t break builds
- transition from laptop scale to web scale with the same API, just a different backend
Otherwise it’s really hard to justify not using S3 vectors here
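A rough sketch of that last bullet, assuming nothing about turbopuffer's actual API (the names `VectorIndex` and `InMemoryIndex` are made up): one interface, with a dependency-free brute-force backend for laptop/CI use and a hosted backend swapped in behind the same call sites later.

```python
# Hypothetical sketch: one vector-search interface, swappable backends.
# InMemoryIndex is a brute-force local backend for laptop-scale dev and
# offline CI; a production backend would implement the same methods
# against the hosted service.
from abc import ABC, abstractmethod
import math

class VectorIndex(ABC):
    @abstractmethod
    def upsert(self, doc_id: str, vector: list[float]) -> None: ...
    @abstractmethod
    def query(self, vector: list[float], k: int) -> list[str]: ...

class InMemoryIndex(VectorIndex):
    """Laptop/CI backend: exact cosine-similarity search, no network."""
    def __init__(self) -> None:
        self._vecs: dict[str, list[float]] = {}

    def upsert(self, doc_id: str, vector: list[float]) -> None:
        self._vecs[doc_id] = vector

    def query(self, vector: list[float], k: int) -> list[str]:
        def cos(a: list[float], b: list[float]) -> float:
            dot = sum(x * y for x, y in zip(a, b))
            return dot / (math.hypot(*a) * math.hypot(*b))
        ranked = sorted(self._vecs, key=lambda i: cos(vector, self._vecs[i]),
                        reverse=True)
        return ranked[:k]

index = InMemoryIndex()
index.upsert("a", [1.0, 0.0])
index.upsert("b", [0.0, 1.0])
index.upsert("c", [0.9, 0.1])
print(index.query([1.0, 0.0], k=2))  # → ['a', 'c']
```

Tests and PoCs construct `InMemoryIndex`; production wiring would construct the remote backend, and nothing else changes.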
The current dev experience is to start with faiss for PoCs, move to pgvector, and then to something heavy-duty like one of the Lucene wrappers.
Note that we already have an in-your-own-VPC offering for large orgs with strict security/privacy/regulatory controls.
In many CI environments unit tests don't have network access; it's not purely a price consideration.
(not a turbopuffer customer but I have been looking at it)
I've never seen a hard block on network access (how do you install packages/pull images?) but I am sympathetic to wanting to enforce that unit tests run quickly by minimizing/eliminating RTT to networked services.
We've considered the possibility of a local simulator before. Let me know if it winds up being a blocker for your use case.
You pre-build the images with packages installed beforehand, then use those images offline.
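For a Python test suite, that workflow might look roughly like this (a sketch, not anyone's actual setup): dependencies are fetched at build time, so the resulting image is self-contained and the tests can run with networking disabled.

```dockerfile
# Build-time: network is available, dependencies get baked into the image.
FROM python:3.12-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
# Run-time: the image is self-contained, so CI can disable networking, e.g.
#   docker run --network none my-test-image
CMD ["pytest"]
```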
Without that it’s unfortunately a non-starter
I appreciate your responsiveness and open mind