The danger of leaning too heavily on a canon of knowledge in a fast-evolving field is that you narrow your view to a set of techniques that no longer work well on modern problems.
Deep learning, for instance, is a nonconvex optimization problem: we have a lot of practical knowledge about how to make it work well, but the theory of why it works so well is still being developed. It's a case where practice precedes theory.
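A minimal sketch of why nonconvexity matters in practice: plain gradient descent on a toy nonconvex loss (a made-up quartic, not any real network's loss) ends up in a different local minimum depending on where it starts, which is exactly the situation convex theory rules out.

```python
# Toy illustration: gradient descent on the nonconvex function
# f(x) = x**4 - 3*x**2 + x, which has two distinct local minima.
# Different starting points converge to different minima.

def grad(x):
    # derivative of f(x) = x**4 - 3*x**2 + x
    return 4 * x**3 - 6 * x + 1

def descend(x, lr=0.01, steps=5000):
    # fixed-step gradient descent from starting point x
    for _ in range(steps):
        x -= lr * grad(x)
    return x

left = descend(-2.0)   # settles near one local minimum (x ≈ -1.30)
right = descend(2.0)   # settles near a different one (x ≈ +1.13)
```

Both endpoints are stationary points (gradient ≈ 0), yet they are different points with different loss values; in deep learning the empirical surprise is that the minima reached in practice tend to generalize well anyway.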
Instead of an encyclopedia, I recommend subscribing to a (free) preprint mailing list, Optimization Online.
Can anyone recommend a good didactic book or online course for a computer scientist with a good grasp of math and machine learning? It has to reach beyond linear problems.
Wtaf.
Unfortunately the thing is from 2008, and I suspect this kind of book doesn't age well.
[1] http://titan.princeton.edu/2010-10-11/EoO2/Encyclopedia_Opti...
> CCXXXI, 4622
Projects like this are built on the work of academics, the majority of whom are publicly funded. By and large, they resent the for-profit publishers who profit from their work and sometimes reduce them to pirating their own papers.