Also, it is my understanding that surface compositing is done in hardware using surface-backed textures, with any software rendering of those surfaces being done on the main UI thread while responding to messages such as drawRect:, using CoreGraphics (not CoreAnimation). The only things that get shoved to background threads are, AFAIK, animations that are specified in code but then "set free" into the CoreAnimation backend.
I therefore don't feel these comments adequately defend the statements made in the article: the things it claims are benefits of iOS's graphics architecture either A) work the same on Android (per the post it is responding to, from the Android developer debunking myths about Android's hardware surface compositing) or B) actually happen in software on the main/UI thread on iOS. I think we need to look elsewhere for the real cause of Android's horrible touch response. ;(
Obviously you never went through tunnels. When I am playing a 100% offline game (no internet access needed) and my bus goes into a tunnel, everything stops. The system seems to think: who cares about this "game" when there's tons of networking to do? Networking takes precedence over animation, so touch events, animation, everything gets heavily delayed.
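The fix for that failure mode is well known: blocking I/O should never share the UI thread's time. A minimal sketch (hypothetical names, plain Java rather than any actual Android API) of pushing the blocking call onto a background executor so the UI thread keeps servicing input:

```java
import java.util.concurrent.*;

public class NetworkOffload {
    // Background executor: blocking I/O runs here so the "UI thread"
    // stays free to handle touch events and animation ticks.
    static final ExecutorService network = Executors.newSingleThreadExecutor();

    // Stand-in for a slow or stalled network read (e.g., in a tunnel).
    static String blockingFetch() {
        try { Thread.sleep(50); } catch (InterruptedException ignored) {}
        return "response";
    }

    // Submit the blocking call; the caller is not blocked.
    static Future<String> fetchAsync() {
        return network.submit(NetworkOffload::blockingFetch);
    }

    public static void main(String[] args) throws Exception {
        Future<String> f = fetchAsync();
        // The UI thread can keep pumping input/animation here...
        System.out.println("UI thread still responsive");
        // ...and pick up the result later (on Android, posted back
        // via a Handler rather than Future.get()).
        System.out.println("fetched: " + f.get());
        network.shutdown();
    }
}
```

On a real device the result would be posted back to the main looper instead of joined with `get()`; the point is only that the stall lives on a thread the renderer never waits on.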
What about the phone app? When a call is ending, it can't render the UI update showing that the button was touched, so there's just a delay. It even delays processing the next input event until after the screen switches states. End result? I end up making a call when I try to hang up, or vice versa. The point is that these problems are not exclusive to Android, but the iOS solution handles these edge cases significantly better, and they are the ones that cause the biggest headaches: they still happen on iOS, but very infrequently, versus consistently badly every other time.
So that a UI update that is waiting for hardware can be delayed while the next UI update in the loop is processed.
I don't recall any such pattern for Java main-thread UI updates.
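The pattern being described sounds like update coalescing: if the previous frame is still in flight on the hardware, newer UI updates overwrite the pending one instead of queueing behind it, so the loop never blocks. A sketch in plain Java (hypothetical names, not any real framework API):

```java
import java.util.concurrent.atomic.AtomicReference;

public class CoalescingLoop {
    // Holds at most one pending UI update; newer updates replace older
    // ones that haven't been rendered yet, so stale work is dropped
    // rather than blocking the loop.
    private final AtomicReference<Runnable> pending = new AtomicReference<>();

    // Called from anywhere to request a redraw; coalesces with any
    // not-yet-rendered update.
    public void postUpdate(Runnable update) {
        pending.set(update);
    }

    // Called once per vsync-like tick: renders only the latest update,
    // or nothing if no update arrived since the last tick.
    public boolean tick() {
        Runnable latest = pending.getAndSet(null);
        if (latest == null) return false; // nothing to draw this frame
        latest.run();
        return true;
    }
}
```

With this shape, posting three updates between ticks results in one draw of the newest state; the two stale updates are simply discarded instead of delaying input handling.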