
How do I make an agent start with a static string and autocomplete the rest? #745

Closed Answered by rickyloynd-microsoft
PyroGenesis asked this question in Q&A

The key is to use the LLM for Completion rather than ChatCompletion. The API details for this have changed in the latest version of the openai package, so you can save yourself a port from the old version by starting with the new one. As of today, that's openai==1.3.5 and pyautogen==0.2.0b6.

The documentation is here, and this is the most relevant code snippet from that page:

```python
from autogen import OpenAIWrapper
# OpenAI endpoint
client = OpenAIWrapper()
# ChatCompletion
response = client.create(messages=[{"role": "user", "content": "2+2="}], model="gpt-3.5-turbo")
# extract the response text
print(client.extract_text_or_function_call(response))
#…
```
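For the autocomplete use case itself, here is a minimal sketch of the Completion pattern with the raw openai 1.x client: the static string goes in as the `prompt`, and the model continues it. It assumes `OPENAI_API_KEY` is set in the environment and that `gpt-3.5-turbo-instruct` (a Completion-capable model) is available; the `build_request` helper and the example prefix are illustrative, not from the discussion.

```python
import os

PREFIX = "The capital of France is"

def build_request(prefix: str) -> dict:
    # Completion takes a raw `prompt` string, not a `messages` list,
    # so the static prefix is passed through verbatim.
    return {"model": "gpt-3.5-turbo-instruct", "prompt": prefix, "max_tokens": 20}

# Only call the API when a key is actually configured.
if os.environ.get("OPENAI_API_KEY"):
    from openai import OpenAI
    client = OpenAI()
    resp = client.completions.create(**build_request(PREFIX))
    # The completion text continues the prefix rather than replying to it.
    print(PREFIX + resp.choices[0].text)
```

The difference from the ChatCompletion snippet above is exactly the point of the answer: there is no role-tagged message list, so the model has no choice but to continue the given string.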

Replies: 2 comments 3 replies

Answer selected by PyroGenesis