Open question: does flagging text as "written by an ML model" feel like catching people wearing a fake Rolex? I can argue it two ways.
On one hand, we have tools like Google, calculators, and now GPT. The big "but" here: if students don't learn, they'll be affected later, and that's their personal problem, right?
OR are we missing the point by bandaging old academic systems instead of testing people on concepts, so they aren't just memorizing stuff?
If the text generated by these models is passed through an article spinner, there's no way these detectors can tell it was generated by an AI model.
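To make the spinner point concrete, here's a toy sketch of what such a tool does: blind word-level synonym substitution. The synonym table and function are purely illustrative (not any real spinning tool), but they show how easily the surface tokens a detector relies on can be rewritten:

```python
import re

# Toy "article spinner": naive synonym substitution.
# SYNONYMS is an illustrative table, not from any real tool.
SYNONYMS = {
    "quick": "fast",
    "large": "big",
    "generate": "produce",
    "detect": "identify",
}

def spin(text):
    """Swap each known word for its synonym; leave everything else alone."""
    def swap(match):
        word = match.group(0)
        return SYNONYMS.get(word.lower(), word)
    return re.sub(r"[A-Za-z]+", swap, text)

original = "Models generate large texts that tools detect."
print(spin(original))  # → "Models produce big texts that tools identify."
```

Even this crude version changes the token sequence a statistical detector scores, and real spinners do far more aggressive rephrasing.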
At the end of the day, it's just a system for catching lazy people copy-pasting stuff.
I'm not saying don't build such tools, and it's definitely great to have them open source, but this limitation is something to be aware of and to caution users about.