• clucose@lemmy.ml · 2 days ago

    It is possible for AI to hallucinate elements that don’t work, at least for now. This requires some level of human oversight.

    So, the same as LLMs, and they got lucky.

    • ATDA@lemmy.world · 7 hours ago

      It’s like putting a million monkeys in a writers’ room, but supercharged on meth and consuming insane resources.