Holographic displays would use eye trackers to show each eye a different image. Solid-state zoom is maybe a bit of a stretch, but it would involve pixels becoming sensitive to angles more inward or outward from the sensor's center.
I'm not an expert, but I believe that's not how holographic displays would work with optical phased arrays. I believe a phased array can make it seem that light is being emitted from any point above the display (within the viewing angle the display supports). There's no need to track observers, because the display would be an honest reconstruction of the light emitted by a real three-dimensional object.
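A rough sketch of the idea, as I understand it: if each array element emits with a phase delay proportional to its distance from a virtual point above the display, the combined wavefront approximates the spherical wave a real point source at that location would produce, for every observer at once. All the numbers below (wavelength, pitch, element count, source position) are illustrative assumptions, not parameters of any real device.

```python
import numpy as np

# Illustrative sketch: per-element phases for a 1-D optical phased array
# emulating a point source above the array. Not a real display's specs.
wavelength = 633e-9   # assumed red laser light, meters
pitch = 200e-9        # assumed sub-wavelength element spacing, meters
n_elements = 1000

# Element positions along the array, centered on x = 0.
x = (np.arange(n_elements) - n_elements / 2) * pitch

# Virtual point source 1 mm above the array center: (x, height).
source = np.array([0.0, 1e-3])

# Distance from the virtual point to each element.
r = np.hypot(x - source[0], source[1])

# Driving each element with phase (2*pi/lambda) * r makes the emitted
# field match the wavefront of a real point at `source`, so every
# observer sees the point correctly without any eye tracking.
phases = (2 * np.pi / wavelength * r) % (2 * np.pi)
```

The key point is that the phase pattern encodes the 3D location itself, so the same pattern is valid for all viewing positions simultaneously.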