Since you are only changing the underlying model every so often, instead of running a large training loop, once you set up an optical computer for inference it scales as 2n+1, with clock speeds of up to 100 THz at only 100 W of power, versus traditional GPUs at 2 GHz and 1 kW for 15k cores.
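For what it's worth, taking the parent's numbers entirely at face value (they are claims, not measurements), a back-of-the-envelope ops-per-watt comparison looks like this, assuming one operation per clock per lane/core on both sides, which is itself a big simplification:

```python
# All figures are the parent comment's claims, not verified specs.
optical_clock_hz = 100e12   # claimed 100 THz modulation rate
optical_power_w = 100.0     # claimed 100 W
gpu_clock_hz = 2e9          # 2 GHz
gpu_cores = 15_000
gpu_power_w = 1_000.0       # 1 kW

# Assume one op per clock per lane/core (a strong assumption for both).
optical_ops = optical_clock_hz            # a single optical path
gpu_ops = gpu_clock_hz * gpu_cores        # all cores in parallel

print(f"optical: {optical_ops / optical_power_w:.2e} ops/s/W")
print(f"gpu:     {gpu_ops / gpu_power_w:.2e} ops/s/W")
```

Under those (unverified) assumptions the optical path comes out roughly 30x ahead on ops per watt, which is at least in the ballpark of why people find the idea attractive.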
Sigh. (Why? Because now I have to guess how much of this is vague handwaving, or an AI trying to fit a square peg into a round hole, and how much is reality.)
A) Why does that mean calculations can be imprecise? The weights are data stored in RAM. Is the idea that we'd use > N-bit weights and call them effectively N-bit due to the imprecision, so we're good? Because that would cancel out the advantage of using < N-bit weights. (Which, of course, is fine if B) has a strong answer.)
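One way to make the "effectively N-bit" question concrete: model the analog error as Gaussian noise on the weights and convert the resulting signal-to-noise ratio into an effective number of bits. A minimal sketch, with the 1% noise level picked purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.uniform(-1, 1, 10_000)           # "true" digital weights

# Model an analog (optical) weight as the true value plus Gaussian noise.
noise_std = 0.01                          # hypothetical 1% analog error
w_analog = w + rng.normal(0, noise_std, w.shape)

# Standard conversion: effective bits from the SNR of the stored values.
snr_db = 10 * np.log10(np.var(w) / noise_std**2)
enob = (snr_db - 1.76) / 6.02
print(f"SNR ~ {snr_db:.1f} dB -> roughly {enob:.1f} effective bits")
```

So a 1% analog error buys you only about 5-6 effective bits per weight, which is why the comparison against low-bit quantized digital weights matters.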
B) Setting A) aside, why is photonics preferable?
B) Power consumption and speed. Essentially, chips are limited by the high resistance (hence heat loss) of the semiconductor. Photonics can encode data multidimensionally, and processing is as fast as the input light can be modulated and the output light interpreted. I'd guess this favours heavy computations with small inputs and outputs, because eventually you're bottlenecked by the conventional chips at the boundary.
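That bottleneck intuition can be made concrete with an Amdahl-style estimate: if the electrical/optical conversion and control logic stay on conventional chips, the overall speedup is capped by the fraction of wall-clock time the optical core actually covers. All numbers below are hypothetical:

```python
def effective_speedup(compute_fraction, optical_speedup):
    """Amdahl's law: only the optical fraction of the work gets faster;
    the E/O conversion and control path stay at electronic speed."""
    return 1 / ((1 - compute_fraction) + compute_fraction / optical_speedup)

# Even a hypothetical 1000x faster optical core helps little unless
# nearly all of the time is spent in the multiply itself.
for f in (0.5, 0.9, 0.99):
    print(f"compute fraction {f:.2f}: {effective_speedup(f, 1000):.1f}x overall")
```

With half the time spent on conversion you get barely 2x overall, which is exactly the "bottlenecked by conventional chips" point.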
The intrinsic size of optical computing elements is much larger, since it is limited by the wavelength of light. A lot of additional hardware is also needed, for conversion between electrical and optical signals and for thermal management.
Optical computing elements can be advantageous only in applications where electronic devices would need many metallic interconnections occupying a lot of space, while in optical devices all those signals can pass through a layer of free space without interfering with each other where they cross.
This kind of structure does appear when doing tensor multiplication, so there is indeed a chance that optical computing could be used for AI inference.
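To see why matrix-vector products map so naturally onto free space: each input modulates a beam, each matrix entry acts as a transmittance, and each detector simply sums the light falling on it, which is a dot product for free. A toy numerical model of that (non-negative intensities, since light can't be negative):

```python
import numpy as np

# Toy model of a free-space optical matrix-vector multiply:
# x_j modulates a light beam, W_ij is a transmittance mask,
# and detector i sums all the light it receives.
rng = np.random.default_rng(1)
W = rng.uniform(0, 1, (4, 3))   # transmittances in [0, 1]
x = rng.uniform(0, 1, 3)        # input light intensities

# The "optics": per-row elementwise attenuation, then summation at detectors.
detector = np.array([np.sum(W[i] * x) for i in range(4)])

assert np.allclose(detector, W @ x)   # same result as an electronic matmul
print(detector)
```

The summation step costs nothing optically because the beams cross without interacting, which is exactly the interconnect advantage described above. Handling signed or complex weights needs extra tricks (e.g. differential detection), so this sketch is the easy case.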
Nevertheless, optical computing is unlikely ever to be competitive for general-purpose computers. Optical computers may appear, but they will be restricted to niche applications, and AI inference might be the only one widespread enough to motivate R&D efforts in this direction.