• 6 Posts
  • 2.7K Comments
Joined 2 years ago
Cake day: December 28th, 2023

  • Let me ask you this: do you seriously believe that Microsoft sends everything you type back to their servers? Like, do you honestly believe that?

    Do you regularly throw out strawman hypotheticals that are only tangentially related? Because the main thing I believe right now is that you’re not really worth conversing with.

    I never said or implied they send everything you type back, and that tells me you either don’t understand what we’re talking about (data mining) or you’re not participating in good faith. I let the earlier ones slide, but I’m done with you.

    Good day.


  • So AI isn’t useful in the slightest?

    How is putting AI in a basic text editor useful? Or is this a situation where you are just defending the concept of the usefulness of shoving bots in every single piece of software possible?

    AI has some useful applications, but many more useless ones, like being shoved into something just to mine even more of your data under the guise of ‘usefulness’.

    You know you don’t have to use AI in it, right?

    No, I just have to see the prompt every time I use the software, all so lazy mouth breathers don’t have to expend the mental energy to open another app and ask their mental pacifier to do the hard work for them.


  • People may prefer cheap to expensive but that does not mean they are desperate.

    Again, your conditional statement is doing a herculean amount of lifting here. We know that healthcare is unaffordable for a large swath of our population, so are you implying that mental healthcare (which doesn’t have nearly the coverage on most plans that physical healthcare does) wouldn’t be in a similar state? Because mental healthcare is out of reach for a lot of people.

    The option isn’t just cheap or expensive therapy. No therapy is just as much an option if the therapy quality is at the level of a 90s machine chat bot.

    False dichotomy: the chat bot can be better than the 90s bots and still be bad. And ‘no therapy’ isn’t an option for a lot of people, who will self-harm as a coping mechanism.

    Why exactly is it a problem that people have an extra avenue to better mental well-being?

    Why is it a good thing that people are using a tool that will yes-and just about anything they say and can lead to psychosis in patients, with no accountability from the provider?


  • having the ability to do so doesn’t necessarily mean they will do so.

    No, but they have a profit motive to do so. And I’d rather assume the worst and be wrong than deal with another 23andMe situation in a decade. Because it will happen eventually. VC money isn’t endless, and they’re pissing away money like a pro athlete in a club.

    You can trust them if you want, but I’m not naive enough to do that myself.

    There are plenty of terrible therapists, priests, family, and friends out there.

    Preaching to the choir; I’ve dumped people from all the noted categories for being shitty. I gave up on therapy about 15 years ago, but my partner convinced me to go back. I looked for someone who fit my specific needs, and found someone who is rebuilding my trust in therapists.

    I trust my therapist not to randomly decide to give out my info, because their job relies on that. AI chat bots flat-out tell you they will use what you give them for ‘training’ purposes, which means they have access to it and can use or sell it as they please.