Conversation

@jaydenfyi commented Oct 14, 2025

Relates to #1364

@sinclairzx81 (Owner) commented Oct 15, 2025

Hi, thanks for the PR!

Things seem ok here (afaik :D), but just a couple of general questions first.

  1. Does the file extension need to be .txt, or can it be .md?
  2. The generated file is 4000+ lines long. What are typical min/max token limits for LLMs ingesting this much content? (I note that some projects provide a summarized view; would that make sense here?)
  3. Is there a way to test this? (Note: I'm not an LLM/AI user, so I will need some guidance here.)
  4. Should each submodule (/value, /compile, /format, etc.) have its own LLM file? (documentation splitting; see 5)
  5. If splitting documentation across multiple files, do LLM systems support loading referenced documentation from multiple files? (I see the output currently has # Source: docs/overview/overview.md, but does it support # Reference: ...?)

Thanks for taking the time to look into this. If you can provide some info on the above aspects (especially the testing / splitting aspect) that would be awesome.

As mentioned, I'm not really a user of AI systems, so I'm not sure exactly what the expectation or best practice is, but if I am to include anything, it would probably need to be supported by all the major LLM vendors (Claude, OpenAI, Gemini, LLaMA, Orca, etc.). If you can provide a bit more information to help me get my bearings on this stuff, I will do a quick review and can probably just merge this through.

With regards to the PR itself, the only other thing I can think of is that it might be better to create a separate module specific to generating the LLM document, rather than updating the code that generates the website (as I may be overhauling that code at some point with website-specific updates). It might be better to create a new task specific to LLM generation. What do you think?
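For reference, a standalone task of that kind could be quite small. The sketch below is hypothetical (the function name, the doc paths, and the section separator are all assumptions, not part of this PR); it only shows the core idea of concatenating markdown documents under the "# Source: ..." convention mentioned above, with file discovery and writing left to the surrounding build script.

```typescript
// Hypothetical sketch of a standalone llms.txt generation step.
// Takes a map of doc path -> markdown content and returns the combined
// document; reading files from disk is intentionally out of scope here.
function generateLlmsTxt(docs: Record<string, string>): string {
  const sections: string[] = []
  for (const [path, content] of Object.entries(docs)) {
    // Prefix each section with its origin, matching the "# Source: ..." convention
    sections.push(`# Source: ${path}\n\n${content.trim()}`)
  }
  // Separate sections with a horizontal rule and end with a trailing newline
  return sections.join('\n\n---\n\n') + '\n'
}

// Example usage with inline content standing in for real doc files
const output = generateLlmsTxt({
  'docs/overview/overview.md': '# Overview\nTypeBox builds JSON Schema types.',
})
console.log(output)
```

Because the function is pure, it is also easy to unit test without touching the filesystem, which may partly answer question 3 above.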

Cheers!
S
