Feminism isn't just about avoiding offense or granting women the same "rights" as men. You can make the (true) claim that Black Americans have the same legal rights as white Americans. But does that mean racism is over? And even setting aside the judgmental word "racism", does it mean Black Americans have the same opportunities as white Americans? I don't think so.
Feminism is about making sure women have the same opportunities as men not only in the letter of the law but "on the ground": that society doesn't gently (or not so gently) steer them toward positions where they end up with less power than men; that they're no longer objectified; and that female politicians are not casually called by their first names.
I'm not saying language can fix all of that, or even that it matters all that much. It certainly matters less in cultures where feminism has had greater success. But it is a good place to notice how, perhaps inadvertently, we keep falling into the same gender traps. Once you start thinking about your choice of words, language stops feeling natural, so you stop treating it, and the culture it articulates, as "nature" and start treating it as the malleable social construct it actually is.