In many places a sobriety/drug test is mandatory after an accident involving severe injuries or fatalities.
And if the driver were found not to have braked at all, or even to have accelerated, they could very well lose their license. Even when you have the right of way, you still have to behave as a responsible driver would.
If police inspection of the full digital record is not feasible, then I would argue self-driving cars have no business being on the road at all. After all, we require human drivers to act as witnesses to accidents, and we expect them to cooperate in tests to determine whether or not they were able to control their vehicle, especially after fatal accidents.
In this case one of the participants is dead and the other is silicon, so the only evidence collected was the same as if all participants had died. But that's not true: at least one of them had a great deal of evidence to give, and given the novel nature of the incident there was every reason to actually evaluate that evidence.
Being 'automated' should not be an automatic get-out-of-jail-free card with respect to liability and proven ability to control a vehicle. At the very least, the same standards that apply to human drivers should apply to automation.