Yep, at least in my eyes, you'll never be able to "solve" self-driving without solving the G in AGI. You need a world model to make predictions far enough ahead to avoid many bad outcomes. Avoiding an empty soda can and avoiding a brick look like near-identical perception problems, but misjudging the brick can easily lead to critical failure.
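To make the can-vs-brick asymmetry concrete, here's a minimal sketch of an expected-cost decision under classifier uncertainty. All the numbers and the helper name are invented for illustration; the point is just that when one outcome is catastrophic, even a small probability of "brick" should flip the decision.

```python
# Hypothetical sketch: asymmetric misclassification cost (all numbers invented).
def choose_action(p_brick: float, cost_hit_brick: float = 1000.0,
                  cost_hit_can: float = 1.0, cost_swerve: float = 5.0) -> str:
    """Pick the lower-expected-cost action given P(object is a brick)."""
    cost_drive_through = p_brick * cost_hit_brick + (1 - p_brick) * cost_hit_can
    return "swerve" if cost_swerve < cost_drive_through else "drive through"

print(choose_action(p_brick=0.001))  # drive through (expected cost ~2.0 < 5)
print(choose_action(p_brick=0.01))   # swerve (expected cost ~11.0 > 5)
```

So a classifier that's "99% sure it's a can" still isn't good enough to drive through: the tail risk dominates, which is exactly why the system needs a predictive world model rather than just per-frame object labels.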