You need a model of yourself to game out future scenarios, and that model or model+game is probably consciousness or very closely related.
Sure, it's not completely in control, but if it's just a rationalization then it raises the question: why bother? Is it accidental? If it's just an accident, then what replaces it in the planning process, and why isn't that thing consciousness?
It's fine if you think that the planning process is what causes subjective experiences to arise. That may well be the case. I'm saying that if you don't believe non-human objects can have subjective experiences, and then use that belief to define the limits of that object's behaviour, that's a fallacy.
In humans, there seems to be a match between the subjective experience of consciousness and a high-level planning job that needs doing. Our current LLMs are bad at high-level planning, and it seems reasonable to suppose that making them good at high-level planning might make them conscious, or vice versa.
Agreed, woo is silly, but I didn't read it as woo; rather, as a postulation that consciousness is what does high-level planning.
I think we have different definitions of consciousness, and that's what's causing the confusion. For me, consciousness is simply having any subjective experience at all. You could be completely numbed out of your mind, just staring at a wall, and I would still consider that consciousness. It seems that you are referring to introspection.