Serverless fire-and-forget subrequests
It’s pretty common to need to run a side effect as a result of a request initiated by a user. On a persistent server, or in client-side JS, it’s pretty easy to create a subrequest and not `await` the promise. But serverless environments don’t stay around very long, and may shut down before the subrequest is sent. Here’s a hack for sending fire-and-forget subrequests in serverless environments.
Say you’re writing a Spotify-like app, and you have an endpoint that allows a user to create a playlist with an initial set of songs. You also want to kick off a request that populates suggestions for additional songs to add to the playlist:
```js
async function populateSuggestions(req) {
  await fetch("https://api.example.com", {
    method: "POST",
    body: JSON.stringify(req),
  })
}

async function handler(req) {
  const playlistId = await createPlaylist(req)
  await populateSuggestions(req)
  return 200
}
```
The above code `await`s for `populateSuggestions` to complete, which may take quite a while, and you don’t really want to block the response for that long. If you’re running this from the client, or on a persistent server, you could simply not `await`:
```js
async function handler(req) {
  const playlistId = await createPlaylist(req)
  populateSuggestions(req).catch(err => log(err))
  return 200
}
```
But this won’t work in most serverless environments, i.e. the environment may shut down before the `populateSuggestions` request gets sent. As of May 2024, Vercel added a new `waitUntil` that solves this exact problem, by extending the lifetime of the environment without blocking the response. You do pay for this additional runtime, but it solves the problem if you’re using Vercel.

```js
import { waitUntil } from '@vercel/functions';

async function handler(req) {
  const playlistId = await createPlaylist(req)
  waitUntil(populateSuggestions(req).catch(err => log(err)))
  return 200
}
```
The correct answer at sufficient scale is a job queue or task system to manage this sort of work, e.g. Pub/Sub or GCP Cloud Tasks. You still wait for the enqueue operation, but not for the execution of the longer-lived operation:
```js
async function enqueuePopulateSuggestions(req) {
  await queue.submitTask(populateSuggestions, req)
}

async function handler(req) {
  const playlistId = await createPlaylist(req)
  await enqueuePopulateSuggestions(req)
  return 200
}
```
Finally, here’s my workaround. `fetch` doesn’t support waiting only until you “send” a request, so we drop down to the lower-level `https` module. We create a promise that resolves as soon as the request stream is complete, without waiting for the response stream.

```js
import https from 'node:https'

async function populateSuggestions(payload) {
  const requestOptions = {
    hostname: "api.example.com",
    port: 443,
    path: "/",
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
  }
  await new Promise((resolve, reject) => {
    const req = https.request(requestOptions, (_) => {})
    req.on('error', (err) => { reject(err) })
    req.write(JSON.stringify(payload))
    // Resolve once the request bytes are flushed; don't wait for the response.
    req.end(() => {
      req.destroy()
      resolve()
    })
  })
}

async function handler(req) {
  const playlistId = await createPlaylist(req)
  await populateSuggestions(req).catch(err => log(err))
  return 200
}
```
In practice, this adds only single-digit to low double-digit milliseconds of latency to your request, which is cheap relative to an expensive side effect that may take several seconds.
Appendix
Here’s a fuller implementation that you can drop in; you might need to extend it further, e.g. for custom headers. It emulates `fetch`, but instead uses `http[s]` to block only until the request is sent.

```js
import http from 'node:http'
import https from 'node:https'

async function nonBlockingRequest(url, request) {
  const parsedUrl = new URL(url)
  const isHttps = parsedUrl.protocol === 'https:'
  const httpModule = isHttps ? https : http
  const requestOptions = {
    hostname: parsedUrl.hostname,
    port: parsedUrl.port,
    path: parsedUrl.pathname + parsedUrl.search,
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
  }
  await new Promise((resolve, reject) => {
    const req = httpModule.request(requestOptions, (_) => {})
    req.on('error', (err) => { reject(err) })
    req.write(JSON.stringify(request))
    req.end(() => {
      req.destroy()
      resolve()
    })
  })
}
```