Everyone is talking about large language models like ChatGPT, LLaMA, BLOOM, etc. that can handle almost any text-transformation task. I ran many tests of them for style transfer, localization, grammar correction, and so on, and found the quality to be quite average. They are also too slow and unstable to use at business scale.
I tried building several small language models, each trained on a dataset of 3-5 million parallel sentences to perform one specific linguistic transformation, and I see a lot of potential there. The difference in inference speed between GPT-4 and my small models is almost 1000x.
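For anyone curious about the general shape of this approach, here is a minimal sketch of training a small sequence-to-sequence model on parallel sentence pairs in plain PyTorch. The toy "transformation" (uppercasing), the model size, and all hyperparameters are illustrative assumptions on my part, not the actual production setup:

```python
# Minimal sketch: a tiny seq2seq Transformer trained on parallel pairs.
# The uppercasing task and hyperparameters are illustrative assumptions only.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Tiny character vocabulary: PAD=0, BOS=1, then the characters we use.
chars = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ "
stoi = {c: i + 2 for i, c in enumerate(chars)}
PAD, BOS = 0, 1
vocab = len(chars) + 2

def encode(s):
    return torch.tensor([stoi[c] for c in s])

# Parallel "dataset": source sentence -> transformed sentence.
pairs = [("hello world", "HELLO WORLD"), ("small model", "SMALL MODEL")]

class TinySeq2Seq(nn.Module):
    def __init__(self, d=64):
        super().__init__()
        self.emb = nn.Embedding(vocab, d)
        self.tr = nn.Transformer(d_model=d, nhead=4,
                                 num_encoder_layers=2, num_decoder_layers=2,
                                 dim_feedforward=128, batch_first=True)
        self.out = nn.Linear(d, vocab)

    def forward(self, src, tgt):
        # Causal mask so the decoder cannot peek at future target tokens.
        mask = nn.Transformer.generate_square_subsequent_mask(tgt.size(1))
        h = self.tr(self.emb(src), self.emb(tgt), tgt_mask=mask)
        return self.out(h)

model = TinySeq2Seq()
opt = torch.optim.Adam(model.parameters(), lr=3e-4)
loss_fn = nn.CrossEntropyLoss()

src = torch.stack([encode(s) for s, _ in pairs])
tgt = torch.stack([encode(t) for _, t in pairs])
bos = torch.full((tgt.size(0), 1), BOS)
tgt_in = torch.cat([bos, tgt[:, :-1]], dim=1)  # teacher-forcing input

losses = []
for step in range(50):
    opt.zero_grad()
    logits = model(src, tgt_in)
    loss = loss_fn(logits.reshape(-1, vocab), tgt.reshape(-1))
    loss.backward()
    opt.step()
    losses.append(loss.item())

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

At a few million parameters, a model like this is cheap enough to run on CPU, which is where most of the speed advantage over a hosted LLM comes from.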
Has anyone else tried something like this?