From what I understand, the current bottlenecks for machine learning are:
- The lack of good data. Machine learning, and DNNs specifically, perform best with large, labeled datasets. Google has open-sourced some, but they (supposedly) keep the vast majority of their training data private.
- Compute resources. Training on these datasets (which can run to terabytes in size) takes a lot of computational power, and only the largest tech companies (e.g. Google, Facebook, Amazon) have the capital to invest in it. Training a neural net can take a solo developer weeks or months, while Google can afford to do it in a day.
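To make that solo-developer-vs-Google gap concrete, here's a rough back-of-envelope sketch. Every number in it is an illustrative assumption I'm making up for the example (dataset size, per-machine throughput, epoch count, linear cluster scaling), not anything from real benchmarks, and it only models data throughput rather than actual FLOPs:

```python
# Back-of-envelope: wall-clock time to train on a terabyte-scale dataset.
# All constants below are illustrative assumptions, not measured figures.
dataset_bytes = 2 * 10**12            # assume a 2 TB training set
epochs = 100                          # assume 100 passes over the data
solo_throughput = 200 * 10**6         # assume one machine processes ~200 MB/s
cluster_throughput = 100 * solo_throughput  # assume 100 machines, scaling linearly

total_bytes = dataset_bytes * epochs
solo_days = total_bytes / solo_throughput / 86_400
cluster_hours = total_bytes / cluster_throughput / 3_600

print(f"solo developer: ~{solo_days:.0f} days")
print(f"100-machine cluster: ~{cluster_hours:.1f} hours")
```

Under these made-up numbers the solo run lands in the "weeks" range and the cluster run in the "under a day" range, which is the shape of the gap I'm describing, even though real training is usually compute-bound rather than I/O-bound.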
There are actually a lot of advances being made in the algorithms, but iteration cycles are long because of these two bottlenecks, and only large tech companies and research institutions have the resources to overcome them. Web development didn't go through a renaissance until reduced server costs (via EC2 and PaaS offerings like Heroku) made the technology affordable and accessible to startups and hobbyists.
By that analogy, I think we're still in the early days of machine learning and better developer tools and resources could spur more innovation.