In the past, we relied heavily on EVAL/EVALSHA with 200+ line Lua scripts to implement high-throughput, atomic transactions for realtime systems. We also used the JSON and Redis Query Language modules (previously named "full-text search") to build a more maintainable and strongly consistent system than raw key-values with manually built secondary indexes.
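To make the EVAL/EVALSHA pattern above concrete, here is a minimal sketch: a small Lua script that atomically checks and decrements a counter (e.g. reserving stock), plus a pure-Python model of the same logic so it runs without a server. The key name, `reserve` helper, and redis-py calls are illustrative assumptions, not the commenter's actual code.

```python
# Sketch of the EVAL/EVALSHA pattern: the script runs atomically on the
# server, so the GET + DECRBY pair cannot interleave with other clients.
RESERVE_LUA = """
local stock = tonumber(redis.call('GET', KEYS[1]) or '0')
local want  = tonumber(ARGV[1])
if stock >= want then
    redis.call('DECRBY', KEYS[1], want)
    return 1
end
return 0
"""

# With a live server and redis-py you would load it once, then call by SHA:
#   sha = r.script_load(RESERVE_LUA)
#   ok = r.evalsha(sha, 1, "stock:sku42", 3)

def reserve(store: dict, key: str, want: int) -> int:
    """Pure-Python model of the script's atomic check-and-decrement."""
    stock = int(store.get(key, 0))
    if stock >= want:
        store[key] = stock - want
        return 1
    return 0
```

The point of pushing this into one script is that Redis executes it without interleaving, so you get a transaction without WATCH/MULTI retry loops.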
We’ve since migrated to a native FoundationDB and SQLite hybrid setup, but this approach would have been really helpful for early-stage prototyping, with a higher performance ceiling (thanks to FDB sharding) than a single-node Redis with AOF.
Related: Redis Cluster is a world of pain when it comes to hash-tagged keys, cross-node queries, and orchestration. DragonflyDB is chasing the market of companies that are considering sharding Redis over performance issues, by offering better single-node performance instead. There's probably an alternative approach that could work with an architecture like this.
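For readers unfamiliar with why cross-node queries hurt: Redis Cluster routes each key to one of 16384 slots via CRC16, and multi-key operations only work when all keys land in the same slot, which you force with `{...}` hash tags. A small sketch of the slot-assignment rule (the function names here are mine, but the CRC16-XModem algorithm and hash-tag rule follow the cluster spec):

```python
def crc16(data: bytes) -> int:
    """CRC16-CCITT (XModem), the variant Redis Cluster uses for key slots."""
    crc = 0
    for b in data:
        crc ^= b << 8
        for _ in range(8):
            crc = ((crc << 1) ^ 0x1021) if crc & 0x8000 else (crc << 1)
            crc &= 0xFFFF
    return crc

def key_slot(key: str) -> int:
    """If the key contains a non-empty {tag}, only the tag is hashed."""
    s = key.find('{')
    if s != -1:
        e = key.find('}', s + 1)
        if e != -1 and e != s + 1:
            key = key[s + 1:e]
    return crc16(key.encode()) % 16384

# Keys sharing a hash tag map to the same slot, so a MULTI or Lua script
# touching both is legal; keys without a shared tag usually are not.
print(key_slot("{user:1}.orders") == key_slot("{user:1}.cart"))
```

This is exactly the pain point: every multi-key operation forces you to design key names around co-location up front.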
Exactly. It's a distributed list, map, etc that is often used as a cache, and sometimes as a queue, but it's bigger than all that.
Oh? I'd be interested in hearing more about that. Is this common?
The JVM is fast, but raw Java is not seen so often in the corporate world. And if, for a single endpoint that fetches data from a database and encodes it to JSON, you have to schedule 2 cores and 4 GB of RAM for every few hundred QPS, something is wrong.
The ecosystem of Java is so huge. Most people who use Java barely have any idea what the rest of the software industry is doing.
I think herein lies the problem. The Java enterprise software world (usually Spring) and the everything-else-under-the-sun world are very separate. Java devs usually have minimal visibility into other ecosystems and their patterns, and are very reliant on extremely mature (and heavy) tooling that other languages don't tend to use. Other devs hate how bloated the Java ecosystem feels, and that they can't use any of their usual tools. Neither side tends to understand that their approach isn't the only way.
Not sure this is really true any more, but it definitely brings back memories from when I was learning OO programming (let's say, a couple decades ago, pre-GitHub).
At the time, it seemed to be an industry-wide assumption that "software engineering" is exclusively done in Java. Every learning resource I could find at the time was deep, deep into inheritance and OO design patterns. Things seem better these days!
Java is the most performant runtime outside of C/C++/Rust. It is a first choice for any project.
So what's a good example so I can compare it to a non Java version?
The latency makes integration testing unnecessarily tedious. Don't even get me started on Maven -- dev tooling has to reimplement the build system rather than invoking it because the performance is so poor.
Add Go to that list. Or any popular compiled language for that matter.
JavaScript can be faster in some tasks. Even if JS is 2x slower in another task, it uses 3x less memory than Java.
Which costs more, double CPU time or triple memory requirement?
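The question above is really about the CPU-to-RAM price ratio. A back-of-envelope sketch, using hypothetical on-demand prices (the $0.03/vCPU-hour and $0.004/GB-hour figures are assumptions for illustration, not real cloud quotes):

```python
# Hypothetical unit prices -- assumptions, not real provider pricing.
CPU_PER_HOUR = 0.03   # $/vCPU-hour (assumed)
MEM_PER_HOUR = 0.004  # $/GB-hour   (assumed)

def hourly_cost(vcpus: float, mem_gb: float) -> float:
    return vcpus * CPU_PER_HOUR + mem_gb * MEM_PER_HOUR

java = hourly_cost(vcpus=1.0, mem_gb=3.0)  # baseline: 1 core, 3x memory
js   = hourly_cost(vcpus=2.0, mem_gb=1.0)  # 2x CPU, one third the memory

print(f"Java-like: ${java:.3f}/h, JS-like: ${js:.3f}/h")
```

At these assumed prices the doubled CPU costs more, because vCPUs are priced far above GBs of RAM; flip the ratio and the answer flips too.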
This was quite eye-opening. I'm aware of how underappreciated the JVM's performance is, but I never thought about how widely deployed it is.
Maybe this is it.
[1] https://redis.io/resources/building-large-databases-redis-en...
[1] https://kvrocks.apache.org/ [2] https://github.com/apache/kvrocks
https://github.com/dragonflydb/dragonfly DragonflyDB (not open source, BSL-1.1) offers higher performance
https://github.com/apache/kvrocks Apache Kvrocks (Apache-2.0) uses a disk-based storage engine (RocksDB) to lower memory usage