ChatGPTHandler is an interface to OpenAI’s chat API that maintains message history.
Here is a minimal configuration. Replace the key with your OpenAI API key:
```yaml
url:
  chatgpthandler:
    pattern: /$YAMLURL/chat
    handler: ChatGPTHandler
    kwargs:
      key: sz-....
```
This opens a WebSocket connection to `/chat`. You can send messages to it using:
```js
const url = location.href.replace(/^http/, "ws").replace(/\/[^/]*$/, "/chat");
const ws = new WebSocket(url);
ws.onopen = () => ws.send("What rhymes with silk?");
ws.onmessage = (message) => console.log(message.data);
```
This sends the question “What rhymes with silk?” to OpenAI and logs the response to the console:

```text
milk, bilk, ilk, wilk, filk
```
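The URL rewrite used when opening the WebSocket can be factored into a small helper, which makes it easier to test. This is an illustrative sketch; `chatUrl` and `pageUrl` are not part of ChatGPTHandler:

```javascript
// Sketch: derive the /chat WebSocket URL from a page URL.
// Mirrors the replace() chain above: swap the http(s) scheme for ws(s),
// then replace the last path segment with /chat.
function chatUrl(pageUrl) {
  return pageUrl.replace(/^http/, "ws").replace(/\/[^/]*$/, "/chat");
}
```

For example, `chatUrl("https://example.com/app/index.html")` returns `"wss://example.com/app/chat"`.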
Here is a complete page that streams the response as it arrives:

```html
<!doctype html>
<form>
  <input id="input" value="What rhymes with milk?" /> <button>Send</button>
</form>
<pre id="output"></pre>
<script type="module">
  const ws = new WebSocket(
    location.href.replace(/^http/, "ws") + "chat?stream",
  );
  const input = document.querySelector("#input");
  const output = document.querySelector("#output");
  ws.onmessage = (event) =>
    output.insertAdjacentHTML("beforeend", event.data || "\n\n");
  document.querySelector("form").addEventListener(
    "submit",
    (e) => {
      e.preventDefault();
      ws.send(input.value);
      output.insertAdjacentHTML("beforeend", `\n${input.value}\n\n`);
    },
    false,
  );
</script>
```
You can use the following parameters:

`key`
: your OpenAI API key

`max_history`
: maximum number of messages to store. Default: None (unlimited)

`model`
: ID of the model. Default: `gpt-3.5-turbo`

`temperature`
: randomness to use (from 0 to 2). Default: 1

`top_p`
: pick the top p% most likely tokens. Default: 1

`n`
: number of chat completions. Currently, only 1 is supported

`stream`
: `true` streams the response. Else the entire response is returned. Default: `false`

`stop`
: sequence that stops the API from generating further tokens. Default: None

`max_tokens`
: maximum tokens to generate. Default: None (unlimited)

`frequency_penalty`
: higher values reduce repetition. Ranges from -2.0 to +2.0. Default: 0

`presence_penalty`
: higher values increase novelty. Ranges from -2.0 to +2.0. Default: 0

`user`
: unique identifier of the user. Default: None

Any of the configurations can be a Python expression that uses `handler`. For example:
```yaml
url:
  chatgpthandler:
    pattern: /$YAMLURL/chat
    handler: ChatGPTHandler
    kwargs:
      key: {function: handler.get_arg('key')}
      model: {function: handler.get_arg('model', 'gpt-3.5-turbo')}
```
This lets the front-end pass the key and model to use. For example:

- `/chat?key=sz-...` uses the key `sz-...` and the default model `gpt-3.5-turbo`
- `/chat?key=sz-...&model=gpt-4` uses the key `sz-...` and the model `gpt-4`
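To build such URLs in the browser, append the query string when opening the WebSocket. A sketch, using the same URL rewrite as the earlier snippet (the function name is illustrative, not part of Gramex):

```javascript
// Sketch: derive the /chat WebSocket URL and append query parameters
// (e.g. key and model) for the server to read via handler.get_arg().
function chatUrlWithParams(pageUrl, params) {
  const base = pageUrl.replace(/^http/, "ws").replace(/\/[^/]*$/, "/chat");
  const query = new URLSearchParams(params).toString();
  return query ? `${base}?${query}` : base;
}
```

For example, `new WebSocket(chatUrlWithParams(location.href, { model: "gpt-4" }))` opens a connection that uses `gpt-4`. Note that passing the API key from the front-end exposes it to users; only do this when users supply their own keys.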
By default, ChatGPTHandler stores the entire conversation history from the time the WebSocket is opened. This history is sent to OpenAI with every message.
To limit the number of messages (e.g. to reduce the number of tokens processed), use `max_history`. For example:
```yaml
url:
  chatgpthandler:
    pattern: /$YAMLURL/chat
    handler: ChatGPTHandler
    kwargs:
      key: sz-....
      max_history: 5
```
This stores and sends only the last 5 messages to OpenAI.
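The effect of `max_history: 5` can be pictured as a sliding window over the conversation. This is an illustrative sketch of that behavior, not ChatGPTHandler's actual implementation:

```javascript
// Illustration only: keep at most the maxHistory most recent messages.
// A null/undefined maxHistory means unlimited, matching the default above.
function trimHistory(messages, maxHistory) {
  if (maxHistory == null) return messages; // unlimited
  return messages.slice(-maxHistory); // last maxHistory messages
}
```

With `maxHistory = 5`, a seven-message conversation is trimmed to its last five messages before being sent to OpenAI.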