TIL: 9th August 2023 — LLM prompt engineering personas
When setting up a conversational multi-turn context with OpenAI GPT models (and probably others), it's useful to think about the roles. Specifically, ChatGPT expects three roles or personas:
- user: asks the assistant questions and gives instructions.
- assistant: responds to the user's input.
- system: guides the assistant's behaviour and identity.
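On the wire, these roles map onto a `messages` array in the Chat Completions request body, roughly like this (the model name and message contents here are just placeholders):

```json
{
  "model": "gpt-3.5-turbo",
  "messages": [
    {"role": "system", "content": "You are an SQL expert."},
    {"role": "user", "content": "Can you help me optimise this query?"},
    {"role": "assistant", "content": "Sure! Please paste the query."}
  ]
}
```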
Here's an example in Go that I've entirely generated using ChatGPT :)
```go
package main

import (
	"context"
	"fmt"
	"log"

	// Community Go client for the OpenAI API
	openai "github.com/sashabaranov/go-openai"
)

func main() {
	// Set up your OpenAI API key
	apiKey := "your-api-key"
	client := openai.NewClient(apiKey)

	messages := []openai.ChatCompletionMessage{
		{Role: openai.ChatMessageRoleSystem, Content: "You are an SQL expert. Assist the user with their SQL query optimization."},
		{Role: openai.ChatMessageRoleUser, Content: "Hi there! I'm having some issues with a complex SQL query. Can you assist me in optimizing it?"},
		{Role: openai.ChatMessageRoleAssistant, Content: "Of course! I'd be happy to help you with your SQL query. Please provide me with the query you're working on and any specific issues you're facing."},
		// Add more user and assistant messages here
	}

	resp, err := client.CreateChatCompletion(
		context.Background(),
		openai.ChatCompletionRequest{
			Model:    openai.GPT3Dot5Turbo,
			Messages: messages,
		},
	)
	if err != nil {
		log.Fatal(err)
	}

	// Print the assistant's reply
	fmt.Println("Assistant:", resp.Choices[0].Message.Content)
}
```
In this snippet, we seed the context with examples of all three roles. The assistant's responses to subsequent messages then already reflect what the user is looking for, making the conversation more efficient for the subject at hand: optimising SQL queries.
The same strategy can be used to build domain-driven chatbots that, for example, translate the outcome of a conversation into a machine-readable format (such as JSON) that can be processed by an API.
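As a minimal sketch of that idea: the system message instructs the assistant to reply only with JSON in a fixed shape, and the application decodes the reply into a struct. The schema and the canned reply below are invented purely for illustration; in practice the reply would come from the API call.

```go
package main

import (
	"encoding/json"
	"fmt"
	"log"
)

// BookingIntent is a hypothetical schema; the system message would
// instruct the assistant to reply ONLY with JSON in this shape.
type BookingIntent struct {
	Action string `json:"action"`
	Table  string `json:"table"`
	Guests int    `json:"guests"`
}

// parseIntent decodes an assistant reply into the machine-readable struct.
func parseIntent(reply string) (BookingIntent, error) {
	var intent BookingIntent
	err := json.Unmarshal([]byte(reply), &intent)
	return intent, err
}

func main() {
	// A canned assistant reply, standing in for a real API response.
	reply := `{"action": "book", "table": "window", "guests": 4}`

	intent, err := parseIntent(reply)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%s: %s table for %d guests\n", intent.Action, intent.Table, intent.Guests)
}
```

Because the assistant's output is now structured, the rest of the system can treat it like any other API payload, including validating it and rejecting replies that fail to parse.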
Additionally, there's a great short (just about 1 hour!) course with Isa Fulford and Andrew Ng on the topic of prompt engineering. It comes highly recommended!