LLMs are configured by default to be chatty and to pretend to be human.
They're not talking like Data from Star Trek, nor HAL.
They're trying to be Samantha from Her.
Go watch that movie and see the visceral human response that evokes.
Telling people to treat it as a tool while it's deliberately trying to pass itself off as human is like telling us to piss into the wind.
It's bad advice, and it's not how humans work.
I think better advice might be: "set it to a different conversational tone."