A Tesla has ~100,000,000 [1] lines of code. Considering this post, do you think we are sufficiently educated in software security to produce secure self-driving cars?
Elon Musk: "I think one of the biggest risks for autonomous vehicles is somebody achieving a fleet wide hack" [2].
By your logic, we should not fly any modern commercial or military aircraft or spacecraft, live within a certain radius of any power or hazardous chemical plant, place any dependency on any first world country's health care network, including life support, or invest in any company or stock.
Like most things in life, it comes down to a security/convenience, risk/benefit trade-off.
Are you claiming that this could not have happened with Tesla? If so, please explain why.
> By your logic, we should not fly any modern commercial or military aircraft or spacecraft, live within a certain radius of any power or hazardous chemical plant, place any dependency on any first world country's health care network, including life support, or invest in any company or stock.
Up until now the benefits have clearly outweighed the risks, but that does not mean they will continue to do so.
> "Embedded software used to be low-level code we'd bang together using C or assembler. These days, even a relatively straightforward, albeit critical, task like throttle control is likely to use a sophisticated RTOS and tens of thousands of lines of code." [1] [2]
[1] https://www.edn.com/design/automotive/4423428/Toyota-s-kille...
[2] https://www.embedded.com/electronics-blogs/barr-code/4214602...