cyrano@lemmy.dbzer0.com to Technology@lemmy.world · English · 13 days ago
In psychotherapists vs. ChatGPT showdown, the latter wins, new study finds (fortune.com)
PapstJL4U@lemmy.world · English · 12 days ago
Patients explained that they liked what they heard - not whether it was correct or relevant to their case. There isn’t even a pipeline for escalation, because AIs don’t think.
jubilationtcornpone@sh.itjust.works · English · 12 days ago
Exactly. AI chatbots also cannot empathize, since they have no self-awareness.
asap@lemmy.world · English · 12 days ago
You can’t say “Exactly” when you tl;dr’d and removed one of the most important parts of the article. Your human summary was literally worse than AI 🤦
desktop_user@lemmy.blahaj.zone · English · 12 days ago
but it can give the illusion of empathy, which is far more important.