Someone must be working on this.
There are many more, including genetic algorithms. Some are discussed here: https://www.reddit.com/r/MachineLearning/comments/6efs8u/d_a...
>"Hypothesis set is the set of functions that have some common and unique properties that make them a viable candidate to be considered for the final hypothesis. We apply the learning algorithm to the hypothesis set and then choose a specific function from the hypothesis set. In terms of examples, the hypothesis set could be linear regression and the learning algorithm could be gradient descent, or the hypothesis set could be neural networks and the learning algorithm could be backpropagation. Therefore, in other words, the hypothesis set is a set of similar functions which have been shown to have good results under specific conditions, and the learning algorithm is the algorithm that will do the actual searching." https://medium.com/technology-nineleaps/what-is-machine-lear...
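To make the quoted distinction concrete, here is a minimal sketch (my own illustrative names, not from any library): the hypothesis set is the family of linear functions f(x) = w*x + b, and the learning algorithm (plain gradient descent here) is what searches that set for one specific hypothesis.

```python
def predict(w, b, x):
    # One member of the hypothesis set: a linear function f(x) = w*x + b.
    return w * x + b

def gradient_descent(xs, ys, lr=0.01, steps=2000):
    # The learning algorithm: searches the hypothesis set (all (w, b)
    # pairs) for the member minimizing mean squared error on the data.
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        grad_w = sum(2 * (predict(w, b, x) - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (predict(w, b, x) - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Data generated by y = 2x + 1; the search should land near w=2, b=1.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]
w, b = gradient_descent(xs, ys)
```

The point of the jargon, as I read it, is just that these are two separate choices: which family of functions you search (the hypothesis set) and how you search it (the learning algorithm).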
I agree with you; I don't find talking about "hypothesis sets" to be at all enlightening. To me it sounds like needless jargon, and it is strange to see even this much effort expended on using it. Maybe we are missing something here, though.