I successfully installed talk-codebase, its dependencies, and a local model (Falcon), but a call is being made to llama-cpp-python for LlamaGrammar, which the current version of llama-cpp-python (0.1.68) doesn't seem to have. If an older version is needed, which one?
```
(base) eric@eric-g17:~/test1/test1$ pip install llama-cpp-python==0.1.68
Collecting llama-cpp-python==0.1.68
  Using cached llama_cpp_python-0.1.68-cp311-cp311-linux_x86_64.whl
Requirement already satisfied: typing-extensions>=4.5.0 in /home/eric/miniconda3/lib/python3.11/site-packages (from llama-cpp-python==0.1.68) (4.11.0)
Requirement already satisfied: numpy>=1.20.0 in /home/eric/miniconda3/lib/python3.11/site-packages (from llama-cpp-python==0.1.68) (1.23.5)
Requirement already satisfied: diskcache>=5.6.1 in /home/eric/miniconda3/lib/python3.11/site-packages (from llama-cpp-python==0.1.68) (5.6.3)
Installing collected packages: llama-cpp-python
Successfully installed llama-cpp-python-0.1.68
(base) eric@eric-g17:~/test1/test1$ talk-codebase chat ./
🤖 Config path: /home/eric/.talk_codebase_config.yaml:
Found model file at /home/eric/.cache/gpt4all/ggml-model-gpt4all-falcon-q4_0.bin
Traceback (most recent call last):
  File "/home/eric/miniconda3/lib/python3.11/site-packages/langchain/llms/llamacpp.py", line 143, in validate_environment
    from llama_cpp import Llama, LlamaGrammar
ImportError: cannot import name 'LlamaGrammar' from 'llama_cpp' (/home/eric/miniconda3/lib/python3.11/site-packages/llama_cpp/__init__.py)
```
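
As a quick diagnostic (a minimal sketch, assuming it is run in the same conda environment talk-codebase uses), you can check whether the installed `llama_cpp` module actually exposes `LlamaGrammar`, which is the symbol langchain's LlamaCpp wrapper tries to import:

```python
# Check whether the installed llama-cpp-python build exposes LlamaGrammar.
# Run inside the same environment that talk-codebase runs in.
from importlib.metadata import version

import llama_cpp

print("llama-cpp-python version:", version("llama-cpp-python"))
print("LlamaGrammar available:", hasattr(llama_cpp, "LlamaGrammar"))
```

If this prints `False`, the fix is more likely a newer release than an older one: the `LlamaGrammar` class was added to llama-cpp-python after 0.1.68 (around 0.1.78, if I recall correctly), so something like `pip install "llama-cpp-python>=0.1.78"` may satisfy the import, provided the langchain version pinned by talk-codebase is compatible with it.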