Engineering
March 6th, 2023


HOW TO SOLVE NEXT.JS “TIMEOUT” ERROR

Charly Poly・CEO

Next.js has enabled all TypeScript and JavaScript developers, from front end to
back end, to quickly build fast React applications.

Next.js apps are most often deployed on Serverless platforms such as Vercel or
Cloudflare Workers.


While most use cases fit within the 30-60 second limits of Serverless executions,
use cases such as third-party integrations or AI features don't.




OFFLOADING LONG-RUNNING TASKS

Such timeouts are solved by “offloading” your Functions/Workers: the computation
is moved to the background while the front end polls for the results.


Timeouts happen when a long-running task runs at the HTTP layer:

[Diagram: the front end calls a long-running Function/Worker directly and waits for its response over HTTP.]

Timeouts can be avoided by moving the long-running task out of the HTTP layer:

[Diagram: 1. the front end triggers a job: a Function/Worker starts a background job; 2. the front end polls for the job's status: a Function/Worker fetches it; 3. once the job performing the long-running task completes, the front end fetches the result.]

Having the front end trigger a long-running execution and poll for its results
means the task is no longer bound by the Serverless (1 min) or HTTP (5 min)
limits.


While it might look complicated to set up, we will see that such a pattern is
simple with Defer.


MOVING SLOW-RUNNING CODE TO THE BACKGROUND

Let's consider the following Next.js app:

.
|-- pages/
|   |-- api/
|       |-- longRunning.ts
|-- styles/
|-- next-env.d.ts
|-- next.config.js
|-- package.json
|-- tsconfig.json
|-- yarn.lock



with an /api/longRunning Next.js API Route:

import type { NextApiRequest, NextApiResponse } from "next"

type Response = {
  ret: any
}

export default async function handler(
  req: NextApiRequest,
  res: NextApiResponse<Response>
) {
  // ... doing some long-running stuff...
  const ret: any = null // placeholder: result of the long-running work

  res.status(200).json({ ret })
}


To avoid timeouts, we will move the long-running code to the background.


After setting up a Defer application, let's create a defer/longRunning.ts background
function containing the long-running code:

import { defer } from "@defer/client"

async function longRunning() {
  // ... doing some long-running stuff...
}

// wrapping the function with `defer()` is what makes calls to it
// run on the Defer Platform instead of inline
export default defer(longRunning)

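As a side note, background functions can also take arguments; below is a minimal sketch under the assumption that the Defer client serializes and passes arguments along (the userId parameter is hypothetical):

// hypothetical variant of defer/longRunning.ts taking an argument
import { defer } from "@defer/client"

async function longRunningForUser(userId: string) {
  // ... doing some long-running stuff for this user...
}

// the wrapped function is still called like a regular async function,
// e.g. `await longRunningForUser("user_123")` from an API Route
export default defer(longRunningForUser)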

Then update our api/longRunning.ts Next.js API Route to trigger this background
function and return the execution ID to the front end:

import type { NextApiRequest, NextApiResponse } from "next"
import longRunning from "@/defer/longRunning"

type Data = {
  id: string
}

export default async function handler(
  _req: NextApiRequest,
  res: NextApiResponse<Data>
) {
  // calling `longRunning()` triggers its execution on the Defer Platform
  const ret = await longRunning()

  // returns the Defer execution ID to the front end
  res.status(200).json(ret)
}


Finally, let's add a new Next.js API Route (/api/longRunning/[id].ts) enabling the
front end to poll an execution's status and result:

import { type FetchExecutionResponse, getExecution } from "@defer/client"
import type { NextApiRequest, NextApiResponse } from "next"

type Response = {
  res: FetchExecutionResponse
}

export default async function handler(
  req: NextApiRequest,
  res: NextApiResponse<Response>
) {
  const executionId = req.query.id
  const ret = await getExecution(executionId as string)
  res.status(200).json({ res: ret })
}

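To complete the picture, here is a minimal sketch of the front-end side of the trigger-then-poll flow. The state and result fields read from the polled execution are assumptions about the shape of FetchExecutionResponse; check the @defer/client types for the exact field names:

// pages/index.tsx: a minimal front-end sketch (not part of the original example)
import { useState } from "react"

export default function Home() {
  const [status, setStatus] = useState("idle")
  const [result, setResult] = useState<unknown>(null)

  const run = async () => {
    setStatus("running")

    // 1. trigger the background job and keep the returned execution ID
    const { id } = await fetch("/api/longRunning").then((r) => r.json())

    // 2. poll the status route every 2 seconds until the execution settles
    const timer = setInterval(async () => {
      const { res } = await fetch(`/api/longRunning/${id}`).then((r) => r.json())
      // `state` and `result` are assumed field names on the execution response
      if (res.state === "succeeded" || res.state === "failed") {
        clearInterval(timer)
        setStatus(res.state)
        setResult(res.result)
      }
    }, 2000)
  }

  return (
    <main>
      <button onClick={run}>Start long-running task</button>
      <p>Status: {status}</p>
      <pre>{JSON.stringify(result, null, 2)}</pre>
    </main>
  )
}

A real application would also add error handling and a timeout or backoff to the polling loop.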

You can find a complete Next.js working example here:
https://github.com/defer-run/defer.demo/tree/master/nextjs.


FAQ

Can I offload my Serverless Functions/Workers using Vercel/Cloudflare?

Vercel does not allow calling Functions in the background.

Cloudflare provides a Queues mechanism; however, it is not designed for such use cases.


What about QStash or Inngest?

QStash and Inngest are headless queueing solutions that might appear, at first,
better suited for Serverless applications.

However, such solutions come with some limitations:

 * your background functions need to be exposed as Next.js API Routes and
   secured with the provided private tokens to avoid any unwanted use.
 * your background functions are limited to the HTTP timeout of 5 minutes.


What about AWS Lambda?

Yes, AWS Lambdas can run for up to 15 minutes.

However:

 * you will need to spend time setting up your AWS stack using the AWS CDK.
 * if you plan to use TypeScript or native packages, you will need to bundle
   your code and provide a Dockerfile.


