SystemMessage / UserMessage / AssistantMessage with @prompt_chain #389
@alexchandel Unfortunately there is no way to do this currently. All that `prompt_chain` does is shown in `magentic/src/magentic/prompt_chain.py`, lines 82 to 93 (commit d7ae5c2).

Based on this you could do something like:

```python
from magentic import FunctionCall, UserMessage
from magentic.chat import Chat

chat = Chat(
    messages=[UserMessage(...), ...],
    functions=[my_func],
    output_types=[str, list[int], FunctionCall],  # Note: FunctionCall is needed here
).submit()  # .submit() adds an AssistantMessage by querying the LLM
while isinstance(chat.last_message.content, FunctionCall):
    chat = chat.exec_function_call().submit()
return chat.last_message.content
```

Update:

---

I like the loop idea, but is there any way to make it a decorator, such that
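A minimal, self-contained sketch of how that loop could be wrapped as a decorator. Note this uses stand-in `FunctionCall`/`FakeChat` classes (not magentic itself) so the control flow is visible and runnable offline; a real version would build the `Chat` from a message template and query the LLM inside `submit()`:

```python
from functools import wraps

class FunctionCall:
    """Stand-in for magentic's FunctionCall: a deferred function call."""
    def __init__(self, func, *args):
        self.func, self.args = func, args
    def __call__(self):
        return self.func(*self.args)

class FakeChat:
    """Stand-in Chat: each submit() pops the next scripted assistant reply."""
    def __init__(self, replies):
        self.replies = list(replies)
        self.last_content = None
    def submit(self):
        self.last_content = self.replies.pop(0)
        return self
    def exec_function_call(self):
        self.last_content()  # run the tool; a real Chat would also append the result message
        return self

def chain_function_calls(make_chat):
    """Decorator factory: calling the decorated stub runs the submit/exec
    loop until the assistant reply is no longer a FunctionCall."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            chat = make_chat(*args, **kwargs).submit()
            while isinstance(chat.last_content, FunctionCall):
                chat = chat.exec_function_call().submit()
            return chat.last_content
        return wrapper
    return decorator

tool_calls = []

@chain_function_calls(
    lambda city: FakeChat([FunctionCall(tool_calls.append, city), f"Sunny in {city}"])
)
def describe(city: str) -> str: ...
```

Here `describe("Boston")` triggers one scripted tool call and then returns the final string reply, mirroring the while-loop shown earlier.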

---

Actually it looks like there is maybe a bug in how the `AssistantMessage` gets serialized for the API. In particular, it looks like when the assistant sent no content but just a bare tool call, the serialized message still carries an explicit `"content": None`. Any ideas for how to work around this for now?

---

Update: the workaround was to comment out line 112. Now, not sure if this is correct, first of all because

---

@alexchandel Thanks for debugging this! What model are you using?

OpenAI docs indicate "content" is optional, so it should probably be left out. Though I'm pretty sure it had to be set to null/None for an earlier model, or maybe for the pydantic model (otherwise I wouldn't have explicitly set it to None). If the

https://platform.openai.com/docs/api-reference/chat/create

As a workaround in your own code, to avoid editing magentic directly, you could register a new handler for `AssistantMessage`. This will take precedence over the existing handler.

```python
from typing import Any

from openai.types.chat import ChatCompletionMessageParam

from magentic import AssistantMessage
from magentic.chat_model.openai_chat_model import message_to_openai_message

@message_to_openai_message.register(AssistantMessage)
def _(message: AssistantMessage[Any]) -> ChatCompletionMessageParam:
    ...  # same code as the existing handler, but without `"content": None`
```
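To make the serialization difference concrete, here is my understanding of the two message shapes (field names per the OpenAI chat API reference; the id, function name, and arguments are made up for illustration):

```python
# An assistant message serialized with an explicit null content --
# some OpenAI-compatible servers reject this shape.
with_null_content = {
    "role": "assistant",
    "content": None,
    "tool_calls": [
        {
            "id": "call_1",  # hypothetical id for illustration
            "type": "function",
            "function": {
                "name": "get_current_weather",
                "arguments": '{"location": "Boston"}',
            },
        }
    ],
}

# The same message with "content" simply omitted, which the OpenAI API
# reference allows when tool_calls are present.
without_content = {
    key: value for key, value in with_null_content.items() if key != "content"
}
```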

---

Hmm, if it sent us a

---

I opened an issue with LM Studio: lmstudio-ai/lmstudio-bug-tracker#261

So how would I write a

---

Released now in https://github.com/jackmpcollins/magentic/releases/tag/v0.37.0

```python
from magentic import prompt_chain, UserMessage

def get_current_weather(location, unit="fahrenheit"):
    """Get the current weather in a given location"""
    return {"temperature": "72", "forecast": ["sunny", "windy"]}

@prompt_chain(
    template=[UserMessage("What's the weather like in {city}?")],
    functions=[get_current_weather],
)
def describe_weather(city: str) -> str: ...

describe_weather("Boston")
# 'The weather in Boston is currently 72°F with sunny and windy conditions.'
```

---

`chatprompt` supports the very nice `SystemMessage`, `UserMessage`, and `AssistantMessage` classes for input roles, which the API server will translate to `<|im_start|>system` / `<|im_end|>` / whatever formatting. However, it doesn't support chained function calls, meaning you must resolve them and send the output back.

On the other hand, `prompt_chain` seems to support chained function calls, but only takes a template string (no message list), and sends the message as `"role": "user"`. How can I use `SystemMessage`, `UserMessage`, and `AssistantMessage` with `prompt_chain`, or achieve the same effect?
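The role-to-`<|im_start|>`/`<|im_end|>` translation mentioned above can be sketched as a toy renderer (this is just the ChatML shape, not magentic's or any server's actual code):

```python
def to_chatml(messages):
    """Render (role, text) pairs in ChatML, the <|im_start|>/<|im_end|> format."""
    return "\n".join(
        f"<|im_start|>{role}\n{text}\n<|im_end|>" for role, text in messages
    )

prompt = to_chatml([
    ("system", "You are a weather bot."),
    ("user", "What's the weather like in Boston?"),
])
```

This is the formatting the API server applies internally; the point of the message classes is that callers only supply roles and text.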