ChatGPT’s new AI store is struggling to keep a lid on all the AI girlfriends. OpenAI: “We also don’t allow GPTs dedicated to fostering romantic companionship”

  • AllonzeeLV@lemmy.world · 30 points · 1 year ago (edited)

    To be fair, it cares about you exactly as much as your OnlyFans crush.

    Probably a cheaper obsession.

  • 21Cabbage@lemmynsfw.com · 30 points · 1 year ago

    Eh, why not? It isn’t something I’d do, and I find it a touch sad, but I also don’t really give a shit if somebody flirts with their computer.

  • _number8_@lemmy.world · 18 up, 1 down · 1 year ago

    Why? Why not let people just retreat into fantasy? It’s probably healthier than many common coping mechanisms. I mean, it’s a chatbot; how much can you do with it?

    Let people have their temporary salve to get them through whatever drove them to this in the first place. And if it’s not temporary, okay, fine? Better to have some outlet than be even more mentally isolated. Maybe in 50 years this will be common, who knows.

    • cyd@lemmy.world · 31 points · 1 year ago

      Liability. Imagine an AI girlfriend who slowly earns your affection, then at some point manipulates you into sending bitcoins to a prespecified wallet set up by the model maker. Because models are black boxes, there is no way to verify by direct inspection that an AI hasn’t been trained with an ulterior agenda (the “execute order 66” problem).

    • webghost0101@sopuli.xyz · 5 up, 1 down · 1 year ago

      I’m pretty sure it’s just to avoid controversy; look up the recent news about LAION for an example. GPT-4 isn’t just text anymore, it can generate images too.
      Altman has talked about how we may someday all have our own personal AIs tailored to our own needs and sensitivities. But almost everyone has a different idea of if and where there should be a line.

      • douglasg14b@lemmy.world · 4 points · 1 year ago

        If I have an AI tailored for me and my sensitivities, then it should have no filter; whatever filter it has should be defined and trained by me.

        Someone else artificially trying to adjust my personality through AI to fit whatever arbitrary norms they believe it should have is cancer.

        • webghost0101@sopuli.xyz · 1 point · 1 year ago (edited)

          I am inclined to agree. I believe that once society is able to fill everyone’s needs, and everyone can summon any AI VR experience they want, crime will cease to exist; there would be nothing to gain from committing harm. But I fear that simulated role-play in the context of psychological torture or CSAM could make dangerous people more confident before we get to that post-scarcity point. Maybe you’d say ChatGPT isn’t realistic enough for that now, but it will be soon.

          Training an LLM entirely by yourself with self-curated text is beyond what is feasible; most AI researchers today don’t even know what’s in all of the data they use. It’s more than you could look at even in an extended lifetime, and at best you can fine-tune a standard base model.

    • leftzero@lemmynsfw.com · 1 point · 1 year ago (edited)

      The main problem I see with this is that when the AI girlfriend company inevitably folds, dumbs down the product, starts pushing ads instead of loving words, or succumbs to enshittification in any other way (which has already happened with at least a couple of models people were using as AI girlfriends), the users have to deal not only with going back to loneliness, but with the equivalent of the death of a loved one to boot. It’s not unlikely that some will end up hurting themselves or others as a consequence.

      I mean, this is Lemmy, for fuck’s sake. I think we can all agree here that the whole concept is abhorrent, exploitative, and doomed from the start. What we evidently need are self-hosted, open-source AI companions, backed by a healthy community developing forks and extensions to cater to any and all imaginable (or unimaginable) kinks and/or fetishes, not this cloud-based, corporate-driven dystopian AI nightmare we seem to be heading toward.

    • ExLisper@linux.community · 1 point · 1 year ago

      I guess they don’t want to create a separate NSFW category that has to be treated in a different way. They probably think it’s just too risky to get involved in that type of business.

  • danielfgom@lemmy.world · 7 up, 2 down · 1 year ago

    It’s just a big money grab! Everyone is trying to get rich quick, like with the App Store. Everyone is hoping their bot breaks into the big time and makes them rich.

    This is what is terrible about society. Few are making bots that help people; they make bots that appeal to base desires. A race to the bottom, if you will… will man ever learn?