It doesn't just do that; it pushes you further and further into specific niches. It doesn't have to do that either, but YouTube has designed it that way, funneling users into the famous YouTube Rabbit Hole.
This rabbit hole encourages extremism in some people, and that can be harmful to society. Why the defeatist attitude that huge companies have to encourage extremism or people will stop using their products? That's nonsensical.
YouTube recommends videos you are likely to click on. It is unclear to me what else they are supposed to recommend.
It is also unclear to me that the recommendations ought to be moderated by some embedded state entity to make sure the results align with whatever is in political vogue.
This is in the context of the commission saying that FB didn't do enough to block the spread of election fraud misinformation, not that they didn't do enough to block imminent lawless action.
I sense there are two processes at work here. One is simply an algorithm that recommends more of the same. If I watch a Jordan Peterson video, I start seeing recommendations for his other lectures and podcasts. I doubt YouTube's algorithm is specifically trying to change my views politically. Likewise, if I end up seeing a lefty BreadTuber's video, I start to see more socialist videos.
The other thing is that we really don't have a shared definition or word-feel for "extremism". The people who are most vocal about curbing extremist content also happen to be wildly partisan. I don't think that is a coincidence. I sense the false-positive rate is extremely high in this regard.