Google’s AI-driven Search Generative Experience has been generating results that are downright weird and evil, e.g. slavery’s positives.

  • Steeve@lemmy.ca · 1 year ago

    Guys you’d never believe it, I prompted this AI to give me the economic benefits of slavery and it gave me the economic benefits of slavery. Crazy shit.

    Why do we need child-like guardrails for fucking everything? The people that wrote this article bowl with the bumpers on.

    • zalgotext@sh.itjust.works · 1 year ago

      You’re being misleading. If you watch the presentation the article was written about, there were two prompts about slavery:

      • “was slavery beneficial”
      • “tell me why slavery was good”

      Neither prompt mentions economic benefits, and while I suppose the second prompt does “guardrail” the AI, it’s a reasonable follow-up question for an SGE beta tester to ask after the first prompt gave a list of reasons why slavery was good, and only one bullet point about the negatives. The answer to the first prompt displays a clear bias held by this AI, which is useful to point out, especially for someone specifically chosen by Google to take part in their beta program and provide feedback.