What would you like to see?
I know that AnythingLLM already has citations for RAG, but I think having in-text citations like Perplexity's would be a big improvement. This will be essential for RAG applications going forward: if a response pulls in 10+ chunks, the user doesn't want to dig through each of them to verify that every statement in the response makes sense. Being able to hover over a citation at the end of each statement from the LLM would be a game-changer (Google's NotebookLM does this, for example).
Here's a Reddit thread discussing how to implement this:
https://www.reddit.com/r/LocalLLaMA/comments/1e5emhi/want_to_understand_how_citations_of_sources_work/?utm_source=share&utm_medium=web3x&utm_name=web3xcss&utm_term=1&utm_content=share_button
Here's a description of this feature as implemented in deepset (ReferencePredictor):
https://docs.cloud.deepset.ai/docs/referencepredictor
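For reference, a minimal sketch of the general approach described in those links: number the retrieved chunks in the prompt, ask the model to append `[n]` markers after each claim, then parse those markers out of the answer so the UI can render hoverable citation chips. This is not AnythingLLM's actual API; the interfaces and function names below are hypothetical and just illustrate the idea.

```ts
// Hypothetical types/helpers for illustration only.
interface Chunk {
  id: string;
  text: string;
  source: string;
}

// Number each retrieved chunk and instruct the model to cite by number.
function buildCitedPrompt(question: string, chunks: Chunk[]): string {
  const context = chunks
    .map((c, i) => `[${i + 1}] (${c.source})\n${c.text}`)
    .join("\n\n");
  return [
    "Answer using only the context below.",
    "After each sentence, cite the supporting chunk(s) with [n] markers.",
    "",
    context,
    "",
    `Question: ${question}`,
  ].join("\n");
}

// One sentence of the answer plus the chunks it cites, ready for a hover tooltip.
interface CitedSegment {
  text: string;
  citations: Chunk[];
}

// Split the answer into sentences and map each trailing [n] marker back to its chunk.
function parseCitations(answer: string, chunks: Chunk[]): CitedSegment[] {
  return answer.split(/(?<=\.)\s+/).map((sentence) => {
    const ids = [...sentence.matchAll(/\[(\d+)\]/g)].map((m) => Number(m[1]));
    return {
      text: sentence.replace(/\s*\[\d+\]/g, "").trim(),
      citations: ids
        .map((n) => chunks[n - 1])
        .filter((c): c is Chunk => Boolean(c)),
    };
  });
}
```

The frontend would then render each `CitedSegment` with small superscript markers whose hover state shows the cited chunk's text and source, similar to how Perplexity and NotebookLM surface their references.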