Alternate account for @simple@lemmy.world

  • 3 Posts
  • 8 Comments
Joined 2 years ago
Cake day: July 3rd, 2023


  • You need to use an LLM with a very long context length, potentially 1 million+ tokens. I don’t know if any local LLMs can even go that far, and if they can, you’ll need an outrageous amount of RAM and VRAM.

    But honest question… Why? If you’re planning on generating fake books or stories, it’s not going to work; you’ll get the most generic, barely coherent text.

    And fair warning: if you’re trying to sell AI-generated stories, you’ll quickly be permabanned from any store, so don’t even try it.
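
    The context-length point above can be ballparked without any tooling. A common rule of thumb for English text is roughly 4 characters per token (the exact ratio varies by model and tokenizer, so treat this as an estimate only); a minimal sketch:

    ```python
    # Rough back-of-envelope check: does a text fit in a model's context window?
    # Assumes the common ~4 characters-per-token heuristic for English,
    # which is only an approximation; real tokenizers vary by model.

    def estimate_tokens(text: str, chars_per_token: float = 4.0) -> int:
        """Estimate the token count of a text via a chars-per-token heuristic."""
        return int(len(text) / chars_per_token)

    def fits_in_context(text: str, context_window: int = 1_000_000) -> bool:
        """Check whether the estimated token count fits the context window."""
        return estimate_tokens(text) <= context_window

    # A typical novel is around 500,000 characters (~90,000 words),
    # which lands near 125,000 estimated tokens.
    novel = "x" * 500_000
    print(estimate_tokens(novel))   # 125000
    print(fits_in_context(novel))   # True
    ```

    By this estimate a single novel fits comfortably inside a 1M-token window, but holding that much context locally is exactly where the RAM/VRAM cost explodes, since attention caches grow with context length.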