Fullstack integration
If you are developing an interactive application that relies on results from a SCALE deployment, you will probably need to write a proxy API over your deployment to handle requests from your front-end without exposing your Ikomia API key. You may also want to implement business logic and guardrails on your server, such as authentication and rate limiting.
While there is no universal solution for this, as it depends on your application's requirements, we provide simple utilities to help you achieve this in your full-stack JavaScript application.
Example: Generating an image
Let's say that you want to create a simple web application that allows users to generate images from text prompts.
- You implement an API endpoint that calls your SCALE deployment using the client library.
- Your front-end can use whatever technology you want to query that endpoint, such as standard HTTP fetch, tRPC, or WebSockets.
- You stream the deployment run status to the front-end using the `StreamingRun` utility so that you can provide a nice progress indicator to your users.
- When the deployment run is complete, `StreamingRun` sends the results to the front-end.
- Your front-end can use the client library types and utilities to process the results.
Implementing the server endpoint
Create a new API endpoint that runs your deployment wrapped in a `StreamingRun` instance.
```ts
import {StreamingRun} from '@ikomia/ikclient/streaming';

const streamingRun = new StreamingRun(onProgress =>
  client.run({parameters: {prompt}, onProgress})
);

const response = streamingRun.getResponse({
  keepAliveDelay: 30,
  headers: {'X-Custom-Header': 'value'},
});
```
`StreamingRun` can produce a stream in the following formats, depending on your framework and requirements:
| Method | Description | `raise` | `includeInputs` | `keepAliveDelay` | `headers` |
|---|---|---|---|---|---|
| `getAsyncGenerator()` | Returns an async generator that yields session states. Useful for tRPC or custom streaming implementations. | ✅ Default: `true` | ✅ Default: `false` | ❌ | ❌ |
| `getReadableStream()` | Returns a `ReadableStream` that emits server-sent events. | ✅ Default: `true` | ✅ Default: `false` | ✅ Default: 10 seconds | ❌ |
| `getResponse()` | Returns a complete Fetch `Response` object with an SSE stream and proper headers. | ✅ Default: `true` | ✅ Default: `false` | ✅ Default: 10 seconds | ✅ Default: `{}` |
You can customize the behavior of the generated stream by passing an object to the method with the following properties:

- `raise`: whether to raise an error if the run fails
- `includeInputs`: whether the streamed result object should include inputs
- `keepAliveDelay`: the delay in seconds between keep-alive messages (SSE only)
- `headers`: additional headers to include in the response (`Response` only)
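To make the generator-based flow concrete, here is a self-contained sketch of the pattern `StreamingRun` builds on: a run function that reports intermediate states through an `onProgress` callback is exposed as an async generator. The names and the `State` shape here are illustrative assumptions, not the library's actual internals.

```typescript
type State = {status: string};

// Turn a callback-reporting run function into an async generator:
// progress states are queued as they arrive and yielded in order,
// and the final result is yielded last.
async function* toAsyncGenerator(
  run: (onProgress: (state: State) => void) => Promise<State>
): AsyncGenerator<State> {
  const queue: State[] = [];
  let notify: (() => void) | null = null;
  let done = false;
  let final: State | undefined;

  const promise = run(state => {
    queue.push(state);
    notify?.(); // Wake the consumer if it is waiting
  }).then(result => {
    final = result;
    done = true;
    notify?.();
  });

  while (!done || queue.length > 0) {
    if (queue.length === 0) {
      // Wait until a new state is pushed or the run completes
      await new Promise<void>(resolve => (notify = resolve));
      notify = null;
      continue;
    }
    yield queue.shift()!;
  }
  await promise;
  if (final) yield final;
}
```

This is essentially what lets the same run be consumed with a `for await` loop, piped into an SSE stream, or yielded from a tRPC procedure.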
Below are example implementations for Express, Next.js, Nuxt, and tRPC.
With Express, you can use `StreamingRun.getReadableStream()` to stream the status and results as server-sent events.
```ts
import express from 'express';
import {Readable} from 'stream';
import {Client} from '@ikomia/ikclient';
import {StreamingRun} from '@ikomia/ikclient/streaming';

const app = express();
const client = new Client({
  url: 'https://your.scale.endpoint.url',
});

app.get('/api/generate-image', async (req, res) => {
  // Add your authentication/rate limiting logic here.
  const prompt = req.query.prompt;

  const streamingRun = new StreamingRun(onProgress =>
    client.run({parameters: {prompt}, onProgress})
  );

  // Set status and headers for server-sent events
  res.status(200);
  res.setHeader('Content-Type', 'text/event-stream');
  Readable.fromWeb(streamingRun.getReadableStream()).pipe(res);
});

app.listen(3000);
```
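For reference, the events written to that response follow the standard SSE wire format: each message is a `data:` line terminated by a blank line, and keep-alive messages are comment lines starting with `:`. A minimal sketch of that framing (illustrative, not the library's actual code):

```typescript
// Frame a JSON-serializable state as one SSE message.
function toSSEMessage(state: unknown): string {
  return `data: ${JSON.stringify(state)}\n\n`;
}

// Keep-alive comments prevent proxies from closing an idle connection
// while a long-running deployment produces no new states.
const KEEP_ALIVE = ': keep-alive\n\n';
```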
On Next.js, you'll need to implement a Route Handler to stream the status and results as server-sent events.
If you are using the Next.js Pages Router for your project, you'll also need to use the App Router for this endpoint, as the Pages Router does not support streaming responses. Since Next.js 13, you can use both routers in the same project.
```ts
import {Client} from '@ikomia/ikclient';
import {StreamingRun} from '@ikomia/ikclient/streaming';

const client = new Client({
  url: 'https://your.scale.endpoint.url',
});

export async function GET(request: Request) {
  // Add your authentication/rate limiting logic here.
  const url = new URL(request.url);
  const prompt = url.searchParams.get('prompt')!; // Get the prompt from the query string

  const streamingRun = new StreamingRun(onProgress =>
    client.run({parameters: {prompt}, onProgress})
  );

  return streamingRun.getResponse();
}
```
On Nuxt, you'll need to implement an event handler to stream the status and results as server-sent events.
```ts
import {Client} from '@ikomia/ikclient';
import {StreamingRun} from '@ikomia/ikclient/streaming';

const client = new Client({
  url: 'https://your.scale.endpoint.url',
});

export default defineEventHandler(async event => {
  // Add your authentication/rate limiting logic here.
  const {prompt} = getQuery(event);
  if (!prompt) {
    throw createError({statusCode: 400});
  }

  const streamingRun = new StreamingRun(onProgress =>
    client.run({parameters: {prompt}, onProgress})
  );

  return streamingRun.getResponse();
});
```
tRPC lets you stream async generators directly. Under the hood it can use server-sent events or WebSockets, depending on your configuration.
```ts
import {z} from 'zod';
import {publicProcedure, router} from './trpc.js';
import {Client} from '@ikomia/ikclient';
import {StreamingRun} from '@ikomia/ikclient/streaming';
import {createHTTPServer} from '@trpc/server/adapters/standalone';

const client = new Client({
  url: 'https://your.scale.endpoint.url',
});

const appRouter = router({
  generateImage: publicProcedure.input(z.string()).query(async function* ({
    input,
  }) {
    // Add your authentication/rate limiting logic here.
    const streamingRun = new StreamingRun(onProgress =>
      client.run({parameters: {prompt: input}, onProgress})
    );
    yield* streamingRun.getAsyncGenerator();
  }),
});

export type AppRouter = typeof appRouter;

const server = createHTTPServer({
  router: appRouter,
});
server.listen(3000);
```
Reading the stream on the front-end
Your front-end application can then call your server endpoint and use the `StreamingRun` utility methods to deserialize the streamed response.
The examples below show how to read the stream with server-sent events and with tRPC.
If you used server-sent events for streaming your results (as shown in the Express, Next.js, and Nuxt examples), you can read the stream on the front-end like this:
```ts
import {ImageIO} from '@ikomia/ikclient';
import {StreamingRun} from '@ikomia/ikclient/streaming';

const results = await fetch(
  `/api/generate-image?prompt=${encodeURIComponent('A puppy that flies like a superhero')}`
).then(response =>
  StreamingRun.fromResponse(response, {
    // Track progress just like in Client.run
    onProgress: state => console.log(state),
  })
);

// You can then process the results as you would on the server side
const image = results.getOutput(0, ImageIO);
```
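Conceptually, deserializing that response means splitting the SSE body on blank lines, skipping keep-alive comment lines, and JSON-decoding each `data:` payload. A simplified, self-contained sketch of that idea (illustrative, not the library's actual implementation):

```typescript
// Parse a buffered SSE payload into the JSON states it carries.
// Comment lines (starting with ':'), such as keep-alives, are skipped.
function parseSSE(payload: string): unknown[] {
  return payload
    .split('\n\n')
    .map(block => block.trim())
    .filter(block => block.startsWith('data:'))
    .map(block => JSON.parse(block.slice('data:'.length)));
}
```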
If you used tRPC for streaming (as shown in the tRPC server example), you can read the stream like this:

```ts
import type {AppRouter} from '../server/index.js';
import {ImageIO} from '@ikomia/ikclient';
import {StreamingRun} from '@ikomia/ikclient/streaming';
import {
  createTRPCClient,
  splitLink,
  unstable_httpBatchStreamLink,
  unstable_httpSubscriptionLink,
} from '@trpc/client';

// Initialize the tRPC client
const trpc = createTRPCClient<AppRouter>({
  links: [
    splitLink({
      condition: op => op.type === 'subscription',
      true: unstable_httpSubscriptionLink({
        url: 'http://localhost:3000',
      }),
      false: unstable_httpBatchStreamLink({
        url: 'http://localhost:3000',
      }),
    }),
  ],
});

const results = await trpc.generateImage
  .query('A puppy that flies like a superhero')
  .then(stream =>
    StreamingRun.fromAsyncGenerator(stream, {
      // Track progress just like in Client.run
      onProgress: state => console.log(state),
    })
  );

// You can then process the results as you would on the server side
const image = results.getOutput(0, ImageIO);
```