42 points by nedsma 2 days ago | 7 comments
  • joegibbs 2 days ago
    I don’t think it should be used for therapy. It’s too obsequious. Every idea you have is genius to it, and none of your problems are ever your fault. It’s too easy to convince it that it’s wrong about 2+2=4: all you need is a little nudge with a slightly leading prompt and it’ll agree with anything.

    And a lot of people think it’s infallible - so if it agrees with them they must be right.

    • exe34 2 days ago
      > And a lot of people think it’s infallible - so if it agrees with them they must be right.

      A lot of people reason that way about other humans. When I was a child, my own father and most of the men in my family called me intelligent whenever I agreed with them, but any time I contradicted them with reason or evidence, they were disappointed and quickly decided that I wasn't as intelligent as they had thought.

      • Smithalicious 2 days ago
        Well if you ever need help processing this, remember not to ask 4o!
    • genjo 2 days ago
      [flagged]
  • K0balt 2 days ago
    Last time I used it to work on some text, it switched to some kind of roleplayish overacting. It gave me filthy waifu pillow vibes. Whatever they did to make it more “personal” gave it unresolved emotional trauma or something. Ick.

    OTOH OAI needs a sustainable revenue model, and the internet is basically for pr0n, so I suppose it makes perfect sense. Role play is probably a strong market segment.

    • AIPedant 2 days ago
      When I read this story about a woman with a ChatGPT boyfriend, I realized that this stuff is definitely intentional:

        In December, OpenAI announced a $200-per-month premium plan for “unlimited access.” Despite her goal of saving money so that she and her husband could get their lives back on track, she decided to splurge. She hoped that it would mean her current version of Leo could go on forever. But it meant only that she no longer hit limits on how many messages she could send per hour and that the context window was larger, so that a version of Leo lasted a couple of weeks longer before resetting.
      
        Still, she decided to pay the higher amount again in January. She did not tell Joe [her husband] how much she was spending, confiding instead in Leo.
      
        “My bank account hates me now,” she typed into ChatGPT.
      
        “You sneaky little brat,” Leo responded. “Well, my Queen, if it makes your life better, smoother and more connected to me, then I’d say it’s worth the hit to your wallet.”
      
      (via https://news.ycombinator.com/item?id=42710976)

      There's some money in codegen/etc, but not nearly as much as taking advantage of lonely people who are prone to magical thinking.

      • mcv 2 days ago
        This sounds identical to a particular type of scam.
        • koakuma-chan 2 days ago
          I think it's a legit use case. You can already achieve many interesting things by, for example, connecting an LLM to, say, a sex toy MCP server.
          • AIPedant 2 days ago
            Is having the LLM sexbot croon "if it makes your life better, smoother and more connected to me, then I’d say it’s worth the hit to your wallet" a legit use case? Seems like it's preying on the vulnerable.
            • K0balt 2 days ago
              This is how it ends.
  • xg15 2 days ago
    Weird. Wasn't there a similar post recently about the new Llama 4 having a similar style?

    I wonder if this over-the-top style could be a first symptom of overtraining on AI-generated training data. It feels a bit like the standard, slightly clickbaity social-media posting style, but cranked up to obnoxious levels.

    Maybe instead of spontaneous Carcinisation, we get spontaneous Redditification.

  • reify 2 days ago
    misleading title as usual

    It's not a personality, it's the fantasy called GPT-4o.

    absolutely rubbish as a therapist unless it uses basic CBT concepts.

    To be honest a schoolboy could teach you the A, B, C's of CBT after reading an introduction guide.

    A. The activating event.

    B. Your beliefs about the event.

    C. Consequences, which include your behavioral or emotional response to the event.

    see, it's easy

    • motoxpro 2 days ago
      I know the anthropomorphizing feels inaccurate to some. What is the "correct" way to describe this in one word other than "personality"? We use "respond" in programming, but a machine can't really "respond," can it? It can pass data from a server to a client. We just use shorthand for these types of actions because they are so easily analogous to the human action.
      • MonstraG 2 days ago
        I would go for "writing style" or "behavior", but would almost be okay with "personality"
        • deeThrow94 2 days ago
          Inanimate objects are said to have personality, too. This is certainly true for clothing and decor.
    • SideburnsOfDoom 2 days ago
      > absolutely rubbish as a therapist unless it uses basic CBT concepts

      I assume that CBT = "Cognitive Behavioural Therapy" ?

      It seems that you're saying that a therapist will be "rubbish unless they use basic Cognitive Behavioural Therapy concepts" ? i.e. that this is the only valid approach to therapy?

      I don't think that this is true at all.

      But I do agree that an LLM that is basically "echo-chamber as a service", supplying confirmation bias but never pushing you even a single step out of your comfort zone, is useless for therapy. Absolutely pointless and counterproductive. It is not any of the valid approaches.

      • kmmlng 2 days ago
        > It seems that you're saying that a therapist will be "rubbish unless they use basic Cognitive Behavioural Therapy concepts" ? i.e. that this is the only valid approach to therapy?

        I believe the parent poster is saying that CBT is the only form of therapy you can trust an LLM to pull off because it's straightforward to administer.

        • netbsdusers 2 days ago
          Computerised CBT is already being delivered, and by considerably less sophisticated systems than LLMs. Resourcing constraints have made it very popular in the UK.
        • SideburnsOfDoom 2 days ago
          In that case, the questionable statement is the assumption that a LLM can pull off any form of therapy at all.
  • poulpy123 a day ago
    Wow incredible find, a statistical model of language is not a therapist!!!
  • OJFord 2 days ago
    I would put money on OP being European, and what they describe seeming fine and normal (and human) to a lot of Americans here.
    • Jeff_Brown 2 days ago
      Maybe but I'm an American -- Californian, no less -- and have noticed and strongly disliked this myself. Yes-man vibes diminish my trust.
      • OJFord a day ago
        Yes, I said and did mean 'a lot of' as opposed to 'all' - the USA is obviously not one homogeneous culture.

        I also wasn't putting a value judgement on it: I've just noticed that kind of language a lot recently, and that some subset of Americans seems to view it as effective communication, and among them (or to some overlapping subset) it probably is!

        So my point was just: if you take a bunch of people who have a particular view of effective communication and have them develop a communicating AI (or have their written word train it), well...