I am creating an endpoint that will send a readable stream from OpenAI back to my plugin (an AI editor, which I will open-source once it works).
With an endpoint set up and a service processing the request, what's the best way to stream a response back? When I ask ChatGPT, it suggests:
To write a stream back to the client in a Strapi endpoint (assuming Strapi v3, which uses Koa), you can directly assign the stream to `ctx.body`. Koa (and thus Strapi) is smart enough to handle various types of responses, including streams.
Below is my service based on that advice, but the response always comes back as an empty object; I'm not sure how to do it:
async runAI(ctx) {
  const { request } = ctx;
  const { input } = request.body;

  const chatCompletion = await openai.chat.completions.create({
    messages: [
      {
        role: 'system',
        content:
          'You are an AI writing assistant that continues existing text based on context from prior text. ' +
          'Give more weight/priority to the later characters than the beginning ones. Make sure to construct complete sentences.',
      },
      {
        role: 'user',
        content: input,
      },
    ],
    model: 'gpt-3.5-turbo',
    stream: true,
  });

  // Set response header
  ctx.set('Content-Type', 'text/plain');

  // Read the stream to the end; reader.read() yields Uint8Array chunks,
  // so they must be decoded before being concatenated
  const reader = chatCompletion.toReadableStream().getReader();
  const decoder = new TextDecoder();
  let responseStream = '';
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    responseStream += decoder.decode(value);
    console.log(responseStream);
  }

  // Sending the accumulated response back to the client (note this waits for
  // the whole stream to finish before anything is sent)
  ctx.body = responseStream;
}
System Information
- Strapi Version: 4.11.4
- Operating System: mac
- Database: postgres
- Node Version: 18.16.0