If their response is merely, "Okay, but we're not responsible for damages," then that's great, who cares. But if they take any sort of measure to remotely disable your car or to hamstring your car's software, that would be a very slippery slope to start down.
Why not? It flies just fine now with physical goods. If someone hand-builds their own bicycle wheel, and installs it on their bicycle, and then the wheel breaks and they crash... no one is blaming the bike manufacturer. No one is saying "that bike company should have welded the safer wheels on and issued a license prohibiting wheel alterations."
Everyone gets that the customer took liability into their own hands once they started using their own hands to modify the product. It's the classic trade-off of freedom vs. security.
This ability to allocate liability to the customer is part of what we DO need for software, IMO. Right now there is a popular perception that software cannot be modified or interacted with by anyone but the company who wrote it. I don't think that has to be true forever though.
Why not? It's how everything else works. If you buy a house and then install your own electrical wiring which causes your house to burn down, you have no claim against the construction company and nobody blames them for it because it was your fault, not theirs.
This recent attitude of "people won't understand, therefore corporations have to be paternalistic" is patronizing and factually incorrect. People do actually understand. The worst outcome is that technology-ignorant bureaucrats require companies to clamp down on modding. But having companies do that to begin with is no improvement.
The basis of this seems to be the conceit that manufacturers can actually control their products after they've been sold. It can be true for a time, in the sense that it takes people a while to figure out how to break back into the things they own, but it happens. If you can jailbreak an iPhone then you can jailbreak your car.
Which means that we have a choice. The first option is to accept the inevitable and support it. Expect people to make changes and take responsibility for their own changes, and facilitate that. The second is to try to lock everything down, so that the people making modifications have no access to documentation and have to be running out of date software because the old version with the jailbreak vulnerability is the same old version with the anti-lock brake glitch, which makes it much more likely that people will die. What does that do for your brand?
What happens when they, say, strike a deal with McDonald's and ban drivers from going to Burger King locations? It sounds far-fetched, but I don't see any difference. Where we take our cars and why should not be Tesla's business.
You described exactly the problem people have with this idea :)
> Do you think that Tesla should have to make sure that its self-driving software is compatible with every ridesharing app that any random person comes up with? How will that work with liability if somebody sends their self-driving Tesla out to work under a ridesharing app that feeds bad inputs to it?
Then they should prohibit interfacing the car with 3rd-party software, citing relevant regulations (or at least a generic "for safety reasons"), not put a blanket ban on commercial use of the self-driving functionality.
For now it seems that they don't even want me to physically sit there in the driver's seat and do something else while the car is driving, which puts your whole speculation about Tesla's motives into question.
And BTW, what if somebody breaks this rule and hacks his car into an autonomous Uber slave? By your logic, if something bad happens, Tesla will still get blamed. All they've accomplished is CYA, and for that a warning about the dangers of interfacing with unauthorized software would be sufficient. Even better, in fact, because not all unauthorized software is commercial.
There are frequently limits put on use by sellers even when things are sold outright. You can purchase prescription drugs, but you can't just give them to anyone, or resell them without limits. You can often get software licenses for educational purposes that are not to be used for business. You may not like such restrictions, and certainly I don't think they should be made mandatory for everybody by law, but I don't think making it illegal for a company to make such rules is a very good idea. The easiest solution is simply to not purchase the offending product.
Edit: as regards the commercial/non-commercial mods, I think it's reasonable to imagine that they are trying to avoid the commercial insurance snafus that have entangled ride-sharing apps. When you accept money for a service you put yourself into a bunch of liability positions. And in this case, the liability is on the creator of the self-driving program.
Okay, Tesla's reasonable licensing agreement has a liability waiver when using the app for ride-sharing outside of certified applications.
That's a pretty reasonable piece of licensing. It's also not unheard of to state that a piece of equipment is not warrantied or you waive liability if used in commercial ventures, or outside of normal operation.
I'd be fine with that. It places responsibility on the owner of the car.
edit:
I also have no problem with Tesla not facilitating ride sharing. If the limitation to their network is just because the software makes it difficult to use a third party app then so be it.
If people feel it's desirable to combat this trend, they need to be willing to take more legal responsibility for how they choose to use their products. Otherwise, compromising on full unrestricted ownership is the price paid to cover manufacturers' insurance costs.
Note: this isn't absolute - there are many cases of corporations simply abusing this outright, like snowwrestler's Amazon example - but it's certainly the case for Tesla and a lot of others.
That wasn't protecting themselves from litigation. That's saying "this car is ours, you just get to use it, and we control how".
Just like you cannot get a service manual for Tesla vehicles. Except in MA, where it's required by law. Even there, you have to make an appointment, go into their office, and view the manual on-site; you can't take it with you. I believe you also have to pay a fee to do so, or at least did.
However, they know that won't hold up in the court of public awareness/opinion, and that if people messing with their Teslas end up causing lots of problems, that legislatures and regulators are more likely to clamp down on self-driving cars generally. Which is the opposite of what they want.