Where's the RAGColbertReranker usage in llama-cpp-agent? #37
Comments
I'm going to add an example today; I have to add some comments first.
@svjack Added an example in the README and in the examples folder.
I tried the Structured Output demo in the README:

```python
import llama_cpp
import llama_cpp.llama_tokenizer

main_model = llama_cpp.Llama.from_pretrained(
    repo_id="Qwen/Qwen1.5-14B-Chat-GGUF",
    filename="*q4_0.gguf",
    tokenizer=llama_cpp.llama_tokenizer.LlamaHFTokenizer.from_pretrained("Qwen/Qwen1.5-14B"),
    verbose=False,
    n_gpu_layers=-1,
    n_ctx=3060,
)
```

This yields the following error:

```
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[22], line 4
      1 structured_output_agent = StructuredOutputAgent(llama, debug_output=True)
      3 text = """The Feynman Lectures on Physics is a physics textbook based on some lectures by Richard Feynman, a Nobel laureate who has sometimes been called "The Great Explainer". The lectures were presented before undergraduate students at the California Institute of Technology (Caltech), during 1961–1963. The book's co-authors are Feynman, Robert B. Leighton, and Matthew Sands."""
----> 4 print(structured_output_agent.create_object(Book, text))

File /environment/miniconda3/lib/python3.10/site-packages/llama_cpp_agent/structured_output_agent.py:215, in StructuredOutputAgent.create_object(self, model, data)
    204 """
    205 Creates an object of the given model from the given data.
    206
    (...)
    212     object: The created object.
    213 """
    214 if model not in self.grammar_cache:
--> 215     grammar, documentation = generate_gbnf_grammar_and_documentation(
    216         [model],
    217         model_prefix="Response Model",
    218         fields_prefix="Response Model Field",
    219     )
    221     self.grammar_cache[model] = grammar, documentation
    222 else:

File /environment/miniconda3/lib/python3.10/site-packages/llama_cpp_agent/gbnf_grammar_generator/gbnf_grammar_from_pydantic_models.py:1451, in generate_gbnf_grammar_and_documentation(pydantic_model_list, outer_object_name, outer_object_content, model_prefix, fields_prefix, list_of_outputs, documentation_with_field_description, add_inner_thoughts, allow_only_inner_thoughts, inner_thoughts_field_name, add_request_heartbeat, request_heartbeat_field_name, request_heartbeat_models)
   1415 def generate_gbnf_grammar_and_documentation(
   1416     pydantic_model_list,
   1417     outer_object_name: str | None = None,
   (...)
   1428     request_heartbeat_models: List[str] = None,
   1429 ):
   1430     """
   1431     Generate GBNF grammar and documentation for a list of Pydantic models.
   1432
   (...)
   1449         tuple: GBNF grammar string, documentation string.
   1450     """
-> 1451     documentation = generate_text_documentation(
   1452         copy(pydantic_model_list),
   1453         model_prefix,
   1454         fields_prefix,
   1455         documentation_with_field_description=documentation_with_field_description,
   1456     )
   1457     grammar = generate_gbnf_grammar_from_pydantic_models(
   1458         pydantic_model_list,
   1459         outer_object_name,
   (...)
   1467         request_heartbeat_models,
   1468     )
   1469     grammar = remove_empty_lines(grammar + get_primitive_grammar(grammar))

File /environment/miniconda3/lib/python3.10/site-packages/llama_cpp_agent/gbnf_grammar_generator/gbnf_grammar_from_pydantic_models.py:1116, in generate_text_documentation(pydantic_models, model_prefix, fields_prefix, documentation_with_field_description)
   1112 if isclass(element_type) and issubclass(
   1113     element_type, BaseModel
   1114 ):
   1115     pyd_models.append((element_type, False))
-> 1116 if isclass(field_type) and issubclass(field_type, BaseModel):
   1117     pyd_models.append((field_type, False))
   1118     documentation += generate_field_text(
   1119         name,
   1120         field_type,
   1121         model,
   1122         documentation_with_field_description=documentation_with_field_description,
   1123     )

File /environment/miniconda3/lib/python3.10/abc.py:123, in ABCMeta.__subclasscheck__(cls, subclass)
    121 def __subclasscheck__(cls, subclass):
    122     """Override for issubclass(subclass, cls)."""
--> 123     return _abc_subclasscheck(cls, subclass)

TypeError: issubclass() arg 1 must be a class
```

But when I use the case in the examples, I get favorable output. Does this mean you should update your README?
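The `TypeError` at the bottom of the traceback fires inside `ABCMeta.__subclasscheck__` whenever the first argument to `issubclass()` is not a class, for example a subscripted generic such as `list[str]`. The sketch below is a minimal, library-independent reproduction of that failure mode only; which annotation actually triggered it inside llama-cpp-agent is not established by the traceback, and the `Base` class here is just a stand-in for pydantic's `BaseModel` (whose metaclass also derives from `ABCMeta`):

```python
from abc import ABC


class Base(ABC):
    """Stand-in for pydantic's BaseModel, whose metaclass derives from ABCMeta."""


# A subscripted generic such as list[str] is an instance of
# types.GenericAlias, not a class, so passing it as the first argument to
# issubclass() raises the same TypeError as above, routed through
# ABCMeta.__subclasscheck__ exactly as in the traceback.
try:
    issubclass(list[str], Base)
    raised = None
except TypeError as exc:
    raised = str(exc)

print(raised)  # issubclass() arg 1 must be a class
```

This is why grammar generators typically guard such checks with `inspect.isclass` before calling `issubclass`; the traceback suggests that guard did not catch the offending `field_type` in the installed version.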
@svjack I can't reproduce the error, but could you send me the complete code that causes it?
In Python 3.10.12, installed with:

```shell
pip install llama-cpp-agent
pip install transformers
CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install llama-cpp-python --force-reinstall --upgrade --no-cache-dir
```

Source code:

```python
import llama_cpp
import llama_cpp.llama_tokenizer

llama = llama_cpp.Llama.from_pretrained(
    repo_id="Qwen/Qwen1.5-14B-Chat-GGUF",
    filename="*q4_0.gguf",
    tokenizer=llama_cpp.llama_tokenizer.LlamaHFTokenizer.from_pretrained("Qwen/Qwen1.5-14B"),
    verbose=False,
    n_gpu_layers=-1,
    n_ctx=3060,
)

from enum import Enum

from pydantic import BaseModel, Field

from llama_cpp_agent.structured_output_agent import StructuredOutputAgent


# Example enum for our output model
class Category(Enum):
    Fiction = "Fiction"
    NonFiction = "Non-Fiction"


# Example output model
class Book(BaseModel):
    """
    Represents an entry about a book.
    """

    title: str = Field(..., description="Title of the book.")
    author: str = Field(..., description="Author of the book.")
    published_year: int = Field(..., description="Publishing year of the book.")
    keywords: list[str] = Field(..., description="A list of keywords.")
    category: Category = Field(..., description="Category of the book.")
    summary: str = Field(..., description="Summary of the book.")


structured_output_agent = StructuredOutputAgent(llama, debug_output=True)

text = """The Feynman Lectures on Physics is a physics textbook based on some lectures by Richard Feynman, a Nobel laureate who has sometimes been called "The Great Explainer". The lectures were presented before undergraduate students at the California Institute of Technology (Caltech), during 1961–1963. The book's co-authors are Feynman, Robert B. Leighton, and Matthew Sands."""
print(structured_output_agent.create_object(Book, text))
```

`structured_output_agent.create_object` yields the error above. 🤔
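For reference, a successful `create_object(Book, text)` call should hand back a populated `Book` instance. Building one directly with pydantic shows the target shape the grammar-constrained generation aims for; the field values below were filled in by hand from the quoted passage (with an abbreviated summary), not produced by the agent:

```python
from enum import Enum

from pydantic import BaseModel, Field


class Category(Enum):
    Fiction = "Fiction"
    NonFiction = "Non-Fiction"


class Book(BaseModel):
    title: str = Field(..., description="Title of the book.")
    author: str = Field(..., description="Author of the book.")
    published_year: int = Field(..., description="Publishing year of the book.")
    keywords: list[str] = Field(..., description="A list of keywords.")
    category: Category = Field(..., description="Category of the book.")
    summary: str = Field(..., description="Summary of the book.")


# Hand-filled example of the object the agent is expected to return;
# the agent would produce something similar via grammar-constrained output.
book = Book(
    title="The Feynman Lectures on Physics",
    author="Richard Feynman",
    published_year=1963,
    keywords=["physics", "textbook", "lectures", "Caltech"],
    category=Category.NonFiction,
    summary="A physics textbook based on Feynman's 1961-1963 Caltech lectures.",
)
print(book.title)
```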
@svjack I tried your code and it worked for me, but it led to an error in llama-cpp-python after creating the object. I have to investigate that.
When I run the structured output demo, the same problem happens!! (Partial environment: aiohttp 3.9.5, …)
Thanks for this great project, which makes it easy to put format constraints on llama-cpp output.
And where is the RAGColbertReranker usage?
With it, I could try a usage example of agent abilities with the help of llama-cpp-agent. 😊