How do I make an agent start with a static string and autocomplete the rest? #745
-
I have a Coder agent, and the only thing I want it to do is write code. I want to encourage this by adding a static string at the beginning and letting the model autocomplete it. The starting string would be something like this:
and the rest should be completed by the Agent / LLM. This is very similar to the 'start reply with' feature in text-generation-webui. How do I do this with AutoGen?
-
The key is to use the LLM for Completion rather than ChatCompletion. The API details for this have changed in the latest version of the openai package, so you can save yourself the time of porting your code from the old version to the new version by starting with the new version. As of today, that's openai==1.3.5 and pyautogen==0.2.0b6. The documentation is here, and this is the most relevant code snippet from that page:
from autogen import OpenAIWrapper
# OpenAI endpoint
client = OpenAIWrapper()
# ChatCompletion
response = client.create(messages=[{"role": "user", "content": "2+2="}], model="gpt-3.5-turbo")
# extract the response text
print(client.extract_text_or_function_call(response))
# Azure OpenAI endpoint
client = OpenAIWrapper(api_key=..., base_url=..., api_version=..., api_type="azure")
# Completion
response = client.create(prompt="2+2=", model="gpt-3.5-turbo-instruct")
# extract the response text
print(client.extract_text_or_function_call(response))
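To tie this back to the original question, here is a minimal sketch of the 'start reply with' pattern: prepend your static string to the prompt, let the model continue from there, then join the two pieces. The `complete` function below is a stand-in for a real `client.create(prompt=..., model="gpt-3.5-turbo-instruct")` call, and `STATIC_PREFIX` and `reply_starting_with` are hypothetical names for illustration.

```python
# Assumption: we want the agent's reply to always begin with a fixed code prefix.
STATIC_PREFIX = "def "

def complete(prompt: str) -> str:
    # Stub standing in for an OpenAIWrapper Completion call; a real call
    # would return the model's continuation of the given prompt text.
    return "add(a, b):\n    return a + b"

def reply_starting_with(prefix: str, task: str) -> str:
    # The model sees the task plus the prefix and continues from the prefix,
    # so the assembled reply is guaranteed to start with the static string.
    continuation = complete(task + "\n" + prefix)
    return prefix + continuation

reply = reply_starting_with(STATIC_PREFIX, "Write an add function.")
print(reply.startswith(STATIC_PREFIX))  # True
```

With the real endpoint, only `complete` changes; the prefix-then-concatenate logic stays the same.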
-
I'll resolve this now. Please reopen to continue the discussion.