Humanoid robots were once universal in science fiction. They looked somewhat like metal people, and had narrow human personalities with programmed obsessive-compulsive disorder that kept them focused on their assigned tasks.
They were also charming. Now, as I've mentioned before, we tend to react emotionally to beings (fictional or real) that have constrained emotional and intellectual toolsets (autistic, mentally handicapped, animal, programmed). Robots certainly meet that requirement, and were usually written with a certain pertness or "speaking truth to power" attitude. The bombastic or prideful would meet their comeuppance from the clarity of a robot. Robots weren't deliberately contrary, had no emotional needs of their own, weren't petulant, snarky, or angst-ridden, and were in general easier to deal with than messy human minds.
In real life, people are still working on that charm. In his article "Robots That Care," New Yorker medical writer Jerome Groopman describes some attempts at therapeutic robots. The article is vague and bland, partly because robots still can't do that much, and using them to interact with stroke patients, Alzheimer's patients, and children is mostly unsuccessful. The article does have some interesting things to say about how a robot can be programmed to interact differently with an extrovert than with an introvert, but it offers little hard information.
Because, of course, the article is about robots that might someday successfully pretend to care, not robots that care. The designers seek to hijack our hacked-up heuristics for interpreting other minds. Since we're capable of attributing personalities to computers, cars, and cats, we are clearly predisposed to see other minds even when they aren't there. Sherry Turkle, at the end of the article, thinks this is a bad idea. She wonders why people are so eager to cut humans out of the therapeutic relationship. The benefits, she says, would have to be extraordinary for it to be worthwhile.
So beware heuristic hijacking (a concept that's been around, though Google indicates that I just now coined the term; remember that you read it here first). Our makeshift analytics will inevitably be trickable by devices with the right programming and enough processing power. It hasn't happened yet, but it will.