I was highly skeptical of these systems too, but for normal freeway driving, Tesla’s systems are much better than I expected them to be. As a programmer, there’s no way I would be comfortable enough to sleep in my car, but I could imagine someone getting overconfident. This has always been a major risk with semi-automated systems: I believe airplanes dialed back the functionality of autopilots to prevent pilot complacency.
OK, that is a good point. So you are saying that maybe it wasn’t the marketing that misled the driver, but the abilities of the system. A system that fails 1% of the time will give the user overconfidence right up until the first time it fails.
This is a well-known issue: when automation gets good enough that the human in the loop starts to get bored, it can be more dangerous than either “all human” or “all computer”.
Automated flying reminded me of an incident in India - some pilots used to cover their cockpit windows with newspapers after takeoff, presumably to have a short nap!