I’m interested in automatically generating lengthy, coherent stories of 100,000+ words from a single prompt using an open-source local large language model (LLM). I came across the “Awesome-Story-Generation” repository, which lists relevant papers describing promising methods such as “Re3: Generating Longer Stories With Recursive Reprompting and Revision” (announced in a Twitter thread in October 2022) and “DOC: Improving Long Story Coherence With Detailed Outline Control” (announced in a Twitter thread in December 2022). However, these papers used GPT-3, and I was hoping to find similar techniques implemented with open-source tools that I could run locally. If anyone has experience or knows of resources that could help me achieve long, coherent story generation with an open-source LLM, I would greatly appreciate any advice or guidance.
I don’t think you can get that from a single prompt. My experience aligns with these recommendations: first tell it to come up with interesting story ideas, then pick one. Have it write an outline, then come up with story arcs, subplots, a general structure, and chapter names. Then tell it to write the chapters individually, factoring in the results from the earlier steps. Once it trails off or starts writing short chapters, edit the text and guide it back to where you want it to be.
Unless you do that, it will just write bad and probably short stories. You could theoretically automate this: write a program with an AI agent framework that instructs the model to do the individual tasks one by one, has it reflect on its own output, and always feeds what it came up with back into the next task.
I’ve tried doing something like that, and I don’t think there is a way around it. The alternative is to do what other people do and just tell it “Generate a novel”, accepting whatever it comes up with. But that just won’t be a good result.
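In case it helps, here’s a minimal sketch of that automation loop. It assumes a local OpenAI-compatible server (e.g. llama.cpp’s llama-server or text-generation-webui) running on localhost; the endpoint URL, model name, chapter count, and prompts are all placeholders I made up, not anything the Re3/DOC papers prescribe.

```python
import requests

API_URL = "http://localhost:8080/v1/chat/completions"  # assumed local OpenAI-compatible endpoint
MODEL = "local-model"                                   # placeholder: whatever name your server exposes

def generate(prompt: str, max_tokens: int = 1024) -> str:
    """Send a single prompt to the local model and return its reply text."""
    resp = requests.post(
        API_URL,
        json={
            "model": MODEL,
            "messages": [{"role": "user", "content": prompt}],
            "max_tokens": max_tokens,
            "temperature": 0.8,
        },
        timeout=600,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

# Step 1: ideas -> premise. (Hard-coded here; you could ask the model for
# several ideas and pick one instead.)
premise = "A cartographer discovers a city that only exists on maps."

# Step 2: outline, arcs, subplots, chapter names -- the structure everything hangs on.
outline = generate(
    "Write a detailed chapter-by-chapter outline (12 chapters) for a novel "
    f"based on this premise, including main arcs and subplots:\n{premise}"
)

# Step 3: write chapters one at a time, always feeding back what exists so far.
# A rolling summary keeps the context small enough for a local model.
chapters, summary = [], "(nothing yet)"
for i in range(1, 13):
    chapter = generate(
        f"Outline:\n{outline}\n\nStory so far (summary):\n{summary}\n\n"
        f"Write chapter {i} in full prose, staying consistent with both.",
        max_tokens=2048,
    )
    chapters.append(chapter)
    # Step 4: "reflect" -- fold the new chapter into the running summary
    # so the next task sees what came before.
    summary = generate(
        f"Current summary:\n{summary}\n\nNew chapter:\n{chapter}\n\n"
        "Rewrite the summary to include the new chapter, in under 300 words."
    )

with open("novel.txt", "w", encoding="utf-8") as f:
    f.write("\n\n".join(chapters))
```

In practice you’d still want a revision/editing pass and some check for chapters that trail off or come out too short, but the feed-everything-back-into-the-next-task structure above is the core of it.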