
Commit 616bf04

committed: add prompt and form_prompt to docstrings

1 parent c39a9bc

File tree: 1 file changed, 4 insertions(+), 0 deletions(-)


src/cleanlab_codex/validator.py

Lines changed: 4 additions & 0 deletions
@@ -164,6 +164,8 @@ def validate(
             query (str): The user query that was used to generate the response.
             context (str): The context that was retrieved from the RAG Knowledge Base and used to generate the response.
             response (str): A response from your LLM/RAG system.
+            prompt (str, optional): Optional prompt representing the actual inputs (combining query, context, and system instructions into one string) to the LLM that generated the response.
+            form_prompt (Callable[[str, str], str], optional): Optional function to format the prompt based on query and context. Cannot be provided together with prompt; provide one or the other. This function should take query and context as parameters and return a formatted prompt string. If not provided, a default prompt formatter will be used. To include a system prompt or any other special instructions for your LLM, incorporate them directly in your custom form_prompt() function definition.
 
         Returns:
             dict[str, Any]: A dictionary containing:
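
The two new parameters are mutually exclusive ways to control the exact LLM inputs that get evaluated. A minimal sketch of calling validate() with a custom form_prompt, assuming a Validator constructed with a Codex access key (the key, the example strings, and the my_form_prompt helper below are placeholders for illustration, not part of this commit):

from cleanlab_codex.validator import Validator

# Custom formatter: takes query and context, returns the full prompt string.
# Per the docstring, system instructions belong inside this function.
def my_form_prompt(query: str, context: str) -> str:
    return (
        "You are a concise support assistant.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {query}\nAnswer:"
    )

validator = Validator(codex_access_key="<project-access-key>")  # placeholder key

result = validator.validate(
    query="How do I reset my password?",
    context="Password resets are handled at example.com/reset.",
    response="Visit example.com/reset to reset your password.",
    form_prompt=my_form_prompt,  # cannot be combined with `prompt`
)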
@@ -216,6 +218,8 @@ def detect(
             query (str): The user query that was used to generate the response.
             context (str): The context that was retrieved from the RAG Knowledge Base and used to generate the response.
             response (str): A response from your LLM/RAG system.
+            prompt (str, optional): Optional prompt representing the actual inputs (combining query, context, and system instructions into one string) to the LLM that generated the response.
+            form_prompt (Callable[[str, str], str], optional): Optional function to format the prompt based on query and context. Cannot be provided together with prompt; provide one or the other. This function should take query and context as parameters and return a formatted prompt string. If not provided, a default prompt formatter will be used. To include a system prompt or any other special instructions for your LLM, incorporate them directly in your custom form_prompt() function definition.
 
         Returns:
             tuple[ThresholdedTrustworthyRAGScore, bool]: A tuple containing:
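
Conversely, callers that already have the exact prompt string that produced the response can pass it via prompt instead of form_prompt. A sketch for detect(), under the same placeholder assumptions as above; per the docstring's return type, the result unpacks into a score object and a boolean flag:

from cleanlab_codex.validator import Validator

validator = Validator(codex_access_key="<project-access-key>")  # placeholder key

# `prompt` is the literal string sent to the LLM; mutually exclusive with form_prompt.
scores, is_bad_response = validator.detect(
    query="How do I reset my password?",
    context="Password resets are handled at example.com/reset.",
    response="Visit example.com/reset to reset your password.",
    prompt=(
        "Context:\nPassword resets are handled at example.com/reset.\n\n"
        "Question: How do I reset my password?\nAnswer:"
    ),
)
if is_bad_response:
    print("Response flagged as untrustworthy:", scores)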
