>then you must have some notion what this mechanism is and what its purpose is so you can say,
My intuitive sense is that consciousness is present in so many mammals precisely because mammal young require so much care after birth. i.e., it is beneficial for a mother to differentiate between her own consciousness and the consciousness of her children, and to care for her young with an intuitive sense of their needs. It feels like this trait, once bootstrapped, could find other evolutionarily beneficial behaviors, such as hunting in packs, building cities, or forming competitive social hierarchies. (You can't really have social status without consciousness.)
To be clear, I'm not a biologist, and for sure I could be wrong here. I think the general principle holds, though -- if a species of animal possesses a trait, then that trait either currently confers or previously conferred some evolutionary benefit. It's easy enough to think of what those benefits could sensibly be, and of course much harder to actually prove it out.
>I believe the typical (mis)understanding proceeds like so: "I am conscious. I not only know things, but I have a sensation of knowing them. This (computer/animal) reacts to things in such a way that it demonstrably knows them in some sense, but it has no sensation of knowing them. Now what is this wonderful thing that distinguishes me from it? How did it come to be? What purpose does it serve?"
I think that's a really fair characterization, but I also think that this "feeling of what happens" does characterize consciousness. 1) This means that an AI _could_ be conscious, but it would probably also need to have been built to sense input and to have some sort of identity. 2) If we don't believe that consciousness is a "feeling of what happens," but instead is just a response to external stimuli, then consciousness is just synonymous with "living things," and there's no point in having a separate concept for it.
>One purpose the concept of consciousness serves is that it gives a special ethical status to its possessor.
I'm not sure what you mean by this. Are you saying that "from the perspective of the conscious being, that being puts more value on itself and on other conscious beings"? Or are you saying that "conscious beings _should_ put more emphasis on other conscious beings, and therefore it's important to extend consciousness to other things"? In either case, I would say that this bias towards other conscious beings is just an in-built human bias, and isn't particularly important to a scientific understanding of consciousness. (From a moral standpoint, I concede it could be important.)