> If epistemological state means anything at all ...
It is clear (to me, anyway) that by "epistemological state" Yudkowsky means "state of beliefs and knowledge" rather than what you claim is its only possible meaning. What makes you think it can mean only what you state?
(I think he should have said "epistemic" rather than "epistemological".)
> If EY had meant ... then maybe he should have spelled all that out
Maybe. But what he wrote was pretty long already, and "since I am running on corrupted hardware" (which is what EY did write) amounts to much the same thing. There's nothing a writer can do to guarantee that every single reader will understand correctly.
> I reassert that very little new was said in this article
So you do. But you're reading only a portion of it; you make claims about its overall purpose which are clearly contradicted by the article itself (hint 1: "to me this seems like a dodge"; hint 2: "I now move on to my main point", followed by a statement of that point which is not anything like "how can I best respond to trolley problems?" or "our robotic overlords will be vastly superior to ourselves"); you ignore large parts of it altogether. Why should anyone care whether, treating it thus, you find anything new in it?
> and what was said was wrapped in a ton of verbiage
Well, yes, Yudkowsky is not the most concise writer in the world. I think that may be partly because he's found that being terser gets him misinterpreted more often. From your consistently inaccurate paraphrases and summaries here, it seems to me that his main problem probably wasn't excess verbosity.
> that does not seem to jibe with what he actually says: ...
Situations where you're in the sort of epistemic position described in trolley problems are very rare. Situations where you can, and maybe should, harm some people to benefit others are not so rare.
I dare say there are ways in which a superintelligence could "fold all the remaining meat-machines into itself". It's not so clear that any of them would result in there being a superintelligence which is a "version of" any of those meat-machines.
I neither know nor care exactly what your attitude to super AIs is. I do think, for what it's worth, that pretty much everything you've said here on the subject has an unpleasantly sneering tone which you might want to lose if you don't want to give the impression of being "against super AIs or something".