https://discord.gg/hpBp9bnpr7
https://www.luos.io/
I was wondering last night how Google (not to mention the other search engines) will adapt its ranking algorithms to distinguish human-written content from AI-generated content.
Because if everyone starts generating articles with AIs, the web will become a pile of AI-created content derived from content written by humans for humans. And we know that these models are trained on text corpora drawn from many sources, including the web.
Where is the boundary? The more AI-generated content there is, the more new AIs will train on content produced by other AIs in order to generate yet more content.
We could end up reading content that no human wrote, and genuinely human-written content could become rare.
Content is only one of the many signals the algorithm takes into account when ranking pages, but it will be interesting to see how search engines solve this problem.
Tell me what you think:
https://github.com/lorenzi-nicolas