It’s cheap, quick and available 24/7, but is a chatbot therapist really the right tool to tackle complex emotional needs?

  • Mastengwe@lemm.ee · edited · 7 months ago

    No, it can’t. It’s programmed to mimic, nothing more. It’s doing exactly what its word-prediction training tells it to do. It follows no logic and doesn’t care about anything, including you.

    This is just more evidence of how easily people can be manipulated.