The full Jupyter Notebook is available in my GitHub repository and shows the step-by-step Python API calls required to build a conversational thread with GPT.

The core workflow is summarized here:

name = "Go Dev"

instructions = """You are a dedicated GoLang developer,
and [...] full code for functions and types.
"""

content = """
Using the `go-openai` library in Go, [...].

"""

# 0. We need an assistant
new_assistant(name, instructions)

# 1. Get a Thread, and append a message to it
thread = new_thread()
add_msg_to_thread(thread, content)

# 2. Get a new Run, and associate it with our Thread
#    We will use the Assistant (Go Dev) we created before.
run = new_run(thread=thread, asst_name=name)

# 3. We then ask GPT for advice
if wait_on_run(run, thread):
    response = get_response(thread)
    print(f"{name} says:\n{response}")
else:
    print(f"We failed! Status: {run.status}, {run.incomplete_details}") 

Please see the full Notebook for the complete listing of all the functions.
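Of those helpers, the only non-trivial logic sits in `wait_on_run`: a polling loop over the run's status. The pattern can be sketched independently of the SDK; `fetch_status` here is a hypothetical callable standing in for the real "retrieve run" API call, and the terminal statuses follow the Assistants API run lifecycle.

```python
import time


def wait_on_status(fetch_status, poll_secs=1.0, timeout_secs=120.0):
    """Poll fetch_status() until the run settles; True iff it completed."""
    deadline = time.monotonic() + timeout_secs
    while time.monotonic() < deadline:
        status = fetch_status()
        if status not in ("queued", "in_progress"):
            # Terminal: completed, failed, cancelled, expired, ...
            return status == "completed"
        time.sleep(poll_secs)
    return False  # timed out


# Usage with a fake status source:
statuses = iter(["queued", "in_progress", "completed"])
wait_on_status(lambda: next(statuses), poll_secs=0)  # → True
```

A timeout is worth adding even though the snippet above omits one, since a stuck run would otherwise block the loop forever.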

The full API documentation is here; it also includes a Playground for experimenting with different assistant configurations.


Please feel free to reach out if you have an interesting LLM project (OpenAI-related or otherwise) and would like some help in getting it off the ground: my LinkedIn profile is here.
