My professor (Sir Michael Brady) at university 14 years ago set up a company to do this very thing, and he already had reliable models back before 2010. I believe their company was called Oxford Imaging or something similar.
Yep, everyone seems to forget that ML existed before 2021. Had a conversation recently with a former colleague who'd heard about some plastic packaging company using "AI" to predict client orders and flag the scheduling implications. When I told him you don't need Transformers and 30GB models for that, he was half-confused; he kind of knew it, but the hype had overtaken his knowledge.
In ML courses, you’re taught to try simpler methods and models before turning to more complex ones. I think that’s something that hasn’t made it into the mainstream yet.
A lot of people seem to be using GPT-4 for tasks like text classification and NER, when they'd be much better off fine-tuning a BERT model instead. In vision, too, transformers are great, but often a CNN is all you really need.
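To illustrate the "try simpler first" point: for many text classification tasks, a TF-IDF plus logistic regression baseline gets you surprisingly far before you reach for a fine-tuned transformer, let alone an LLM API. A minimal sketch with scikit-learn (the texts and labels are made up for illustration):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy dataset: two hypothetical categories of customer messages.
texts = [
    "refund my order", "where is my package", "cancel my subscription",
    "great product, love it", "works perfectly", "five stars from me",
]
labels = ["support", "support", "support", "review", "review", "review"]

# Bag-of-words features + a linear classifier: trains in milliseconds on a laptop.
clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(texts, labels)

# Predictions come back as label strings.
print(clf.predict(["please cancel my order"]))
```

If a baseline like this already hits an acceptable score on a held-out set, the marginal gain from a transformer may not justify the training and serving cost.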