std::map, for example, is only appropriate if you need a red-black tree specifically. Which almost nobody does. std::unordered_map is less awful, but abseil has a literal straight upgrade. With the same API. So... why would you pick the slower thing when you're using C++? std::vector is only really appropriate if you know you never have small vectors, which is again a more obscure situation.
That assertion makes no sense at all. The STL containers work very well as basic generic containers that can safely be used for pretty much any conceivable use where performance isn't super critical. I'm talking about cases like, say, you need to map keys to values but you don't really care about performance or which specific data structure you're using. That's the STL's domain: robust, bullet-proof implementations of basic data structures that are good enough for most (or practically all) cases, with the exception of a few very niche applications.
If you happen to be one of the rare cases where you feel you need to know whether a container is built around a red-black tree or some other fancy arcane data structure, and if this is so critical to you that you feel the need to benchmark the performance to assess whether you need non-default settings or have to replace parts of (or the whole) container with a third-party alternative... then, and only then, is the STL not for you.
You're arguing it's better to use something that's across the board worse for nearly every user, and by a lot, just because... why? It's slightly more convenient?
It really isn't. The whole STL is, by design, a template library packed with generic data structures that are designed to have very robust defaults and still be customizable and extensible.
When the defaults are good enough, which is the case in general, the STL will do. If you have a niche requirement (say, games) or feel adventurous, you adopt custom and/or specialized solutions.
This has been the case since the STL's inception. They are the standard, default library. I can't understand how someone is able to miss this fact.
There are lots of subtleties the STL has to worry about in designing its containers, regarding everything from iterator and pointer invalidation to allocation and allocator propagation. All this is because they're designed to be general-purpose and support most conceivable use cases. Their replacements have to trade away some of those requirements in order to get better performance or otherwise improve on some axes.
It only breaks swap of the container itself during iteration. Which is a super niche condition.
And that swap also invalidates some of std::vector's iterators as well - specifically the end() iterator.
> They also really only make sense for tiny objects (ints, etc.) given you don't want a pickup truck's worth of data on your stack. They're most definitely not general-purpose.
Of course they are still general-purpose. They can (and do) specialize on the size of the object being contained. The only reason std::vector doesn't also have a small-size optimization (SSO) is that adding one now would be an ABI break. Not because the current design is better in some way or less fragile. Legacy is the only reason.
And they don't invalidate the iterators that point to actual elements, which was kind of the entire point I was making. Don't let that stop you from trying to make it look like I'm just blurting out nonsense, though.
std::function is fine for prototyping, but its size hit is extreme, so in embedded code we use other implementations. But where size and speed don't matter? Why bother?
These are all largely header libraries. You're already hauling in a dependency, and into every C++ file that uses it, at that.
> std::function is fine for prototyping
std::function isn't part of the containers library of the STL (containers being all the stuff here: https://en.cppreference.com/w/cpp/container ). I agree std::function is fine, it even has a pretty reasonable small-size optimization.
That's not even true for Boost, no matter how much they advertise it. The lib is also notorious for bad decomposability (using only a subset without installing the whole monster), not to mention its idiosyncratic naming and build system, which sometimes make it hard to include in meta-builds of other libraries and frameworks.
In sum: anyone sensible about the different kinds of footprint and dependencies will think twice before pulling in these kinds of libraries.
Having said that, I think using abseil's containers is reasonable, even as a default, if you can afford the dependency.
> std::unordered_map
AFAIK unordered_map is the most awful of all standard containers.
I want my OWL, VCL back, not a hash table able to do lookups in microseconds.
The point I was trying to make was that, from a productivity point of view, there is more relevant stuff to fix in C++ than the algorithmic complexity of STL implementations.
Like catching up with what Java 5 standard library offers for networking and parallel/concurrent programming.
If you care about performance then it's not. (And if you don't care about performance then why are you using C++?) The standard requires that `insert` will not invalidate iterators, which basically forces everyone to implement `std::map` as a red-black tree, and those are pretty bad performance-wise on modern hardware mostly due to cache misses.
> If you need fast O(1) look up std::unordered_map is really not fit for purpose and requires you to come up with an hash function.
Modern hash table implementations (along with a modern hash function) are exactly what you should use if you need fast average-case O(1) lookup, so I'm confused as to why you would say it's not fit for purpose. Unless you specifically meant only `std::unordered_map` which, yes, is pretty atrocious performance-wise (again, due to the iterator invalidation requirements).