Intro to Nitro: The server engine built for modern JavaScript
Wednesday, October 1, 2025, 11:00 AM, from InfoWorld
Nitro is deployment-aware, designed to make it easy to deploy your code to a variety of production platforms, from Node to serverless and edge computing hosts like Vercel or Cloudflare. Many of these hosts include support for automatically handling applications that use Nitro as their web server. Nitro incorporates features that work in tandem with its environment support; capabilities like key-value storage, SQL datastores, and task execution are all designed with deployment in mind. Nitro also includes sophisticated bundling, with code splitting, tree shaking, and TypeScript support built in.

A server engine for frameworks

Nitro is the server engine for several full-stack versions of major UI frameworks, including Nuxt.js (Vue), SolidStart (Solid), and Analog (Angular). These frameworks take advantage of Nitro’s features and add in the client side. Knowing your way around Nitro is becoming an important part of configuring full-stack frameworks.

Using Nitro for APIs and microservices

Because of Nitro’s collection of handy features and excellent startup speed, it is a good choice for APIs and microservices. It can handle REST and RPC needs for application back ends as well as standalone services. The same characteristics make it a good choice for edge deployments. Overall, Nitro is a streamlined and effective choice for server-side JavaScript projects where speed and simplicity are the highest priorities.

Nitro in the UnJS universe

Nitro is the high-level server component in the UnJS (Unified JavaScript) universe. UnJS is worth knowing on its own, with a growing ecosystem of tools that refresh common parts of the JavaScript application landscape. For the purposes of this article, let’s look at how Nitro integrates with several of these projects.

Unimport

Nitro uses Unimport, which lets you define imports in a config file and have them injected into your files. You can then use those packages in your Nitro files without writing an import statement. Unimport is also highly configurable. In a Nitro server, the config file is nitro.config.ts, and the imports field controls how Unimport works. If it’s set to false, the tool is inactive. If there is no imports field at all, the system will automatically inject all imports.

H3 server

The HTTP engine inside Nitro is H3, a server geared for high performance and portability. H3 provides the core functionality and Nitro builds on it. It’s a composable engine that can be extended with plugins, which is how Nitro builds in layers of conventional features.

The Unstorage project

Nitro uses the Unstorage project to provide a simple key/value storage interface with a variety of underlying providers. The default is in-memory storage, and there is also an on-disk store that uses the Node API. Beyond that are providers like MongoDB, Vercel storage, and Netlify Blobs. See the Unstorage documentation for a complete list of drivers.

The next sections present hands-on demonstrations of essential operations with Nitro. See the Nitro documentation for a quick guide to setting up a Nitro server.

Hands-on file-based routing in Nitro

Nitro gives you file-based routing, similar to what you might have used in Next.js. The file and folder names determine how URLs are resolved, and the file contents define the request handlers. The routes are defined in a style similar to Express and other HTTP servers.
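As a minimal sketch of the convention (the hello route here is only an illustration, not part of the demo app that follows): a file named server/routes/hello.get.ts answers GET requests to /hello, while a sibling hello.post.ts would answer POST requests to the same path.

```typescript
// server/routes/hello.get.ts -- illustrative only.
// The path segment comes from the file name ("hello") and the HTTP method
// from the suffix (".get"), so this handler answers GET /hello.
export default defineEventHandler(() => {
  // Returned objects are serialized to JSON automatically.
  return { hello: 'world' };
});
```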
We’ll explore this feature by using the Nitro key/value store to make two simple routes: one that writes a message (POST) and another that returns the message (GET). In both cases, we return a defineEventHandler() call. This is how the H3 engine defines endpoints, and Nitro will automatically mount them. If Unimport is active, no import is required; if it is not, you’ll need import { defineEventHandler } from 'h3'.

The GET route is the simpler of the two, and we’ll map it to localhost:3000/iw. First, we make a file at this location: server/routes/iw.get.ts. Here’s the handler:

```typescript
export default defineEventHandler(async () => {
  const storage = useStorage();
  const message = await storage.getItem('message');
  return { message };
});
```

The useStorage() call creates a reference to the key/value store, by default an in-memory store. To get the message, we retrieve it using its key, 'message'. To send it back, we return it from the endpoint as an object. Nitro then converts that object to JSON by default, with a field named message holding the data.

The process for handling the POST and saving the message is similar. To start, we create a server/routes/iw.post.ts file:

```typescript
export default defineEventHandler(async (event) => {
  const storage = useStorage();
  const body = await readBody(event);
  await storage.setItem('message', body.message);
  return { message: 'Message saved!' };
});
```

This endpoint shows a bit more of the router’s power, reading the body of the request and using it to set the message key in the key/value store. Note that useStorage also supports namespaces to help avoid name collisions.

What about saving to disk?

In the previous example, we saved our data to memory. We can save to disk by using the 'data' namespace, which persists to .data/kv by default:

```typescript
const storage = useStorage('data');
```

And then you’ll see the following on your filesystem:

```
$ cat .data/kv/message
Hello from on-disk storage
```

Storage and deployment considerations

In these examples, it’s important to think about where the server will ultimately run when deployed to production. If you are deploying to a serverless platform, both memory and file storage are ephemeral because the virtual machine is created and destroyed as necessary. If you need long-term persistence, you can use another driver for useStorage, like a remote database or a service such as Vercel’s Upstash Redis integration.

Built-in route caching

Nitro includes built-in caching support:

```typescript
export default defineCachedEventHandler((event) => {
  // My event handler
}, { maxAge: 60 * 60 /* 1 hour */ });
```

This cached event handler defines how long a response is served from the cache for requests with the same characteristics. Note that the typical caveats about deployment environments apply here. The underlying mechanism of Nitro’s caching support is useStorage, so in serverless environments you’ll need to use an external provider like Redis. You can also cache functions that are not themselves event handlers.
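To illustrate that last point, here is a minimal sketch using Nitro’s defineCachedFunction() helper (see the Nitro caching docs for the full option list); the GitHub API call, the repo argument, and the one-hour maxAge are just illustrative choices:

```typescript
// server/utils/ghStars.ts -- a sketch of caching a plain async function.
export const cachedGHStars = defineCachedFunction(
  async (repoName: string) => {
    // $fetch is auto-imported in Nitro projects (provided by ofetch).
    const data = await $fetch<{ stargazers_count: number }>(
      `https://api.github.com/repos/${repoName}`
    );
    return data.stargazers_count;
  },
  {
    maxAge: 60 * 60,                        // cache results for one hour
    name: 'ghStars',                        // cache namespace
    getKey: (repoName: string) => repoName, // one cache entry per repo
  }
);
```

Like cached event handlers, cached functions store their results through useStorage, so the same serverless caveat applies.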
Utility helper functions

Nitro lets you define helper functions in the utils/ directory. Anything you define there can be auto-imported and used by your endpoints:

```
$ cat getGreeting.ts
export function getGreeting(s: string) {
  return `Hello from InfoWorld, ${s}`
}
```

You can then use the helper in an endpoint like so (the explicit import shown here is optional when auto-imports are enabled):

```
$ cat ../routes/greeting.get.ts
import { getGreeting } from '../utils/getGreeting';

export default defineEventHandler((event) => {
  const { name = 'World' } = getQuery(event);
  return {
    message: getGreeting(name as string),
  };
});
```

This code also demonstrates how to get a query parameter from the URL. So the query localhost:3000/greeting?name=FooBar produces this greeting:

```
{"message":"Hello from InfoWorld, FooBar"}
```

Default deployment with Node.js

The default deployment target for Nitro is Node. If we run the build tool:

```
$ npm run build
```

it will output a runtime bundle that we can test out:

```
$ node .output/server/index.mjs
```

This will work in any environment where Node is available, like a cloud VM.

Serverless deployment with Vercel

If we want to deploy to a serverless platform like Vercel, we need to configure an external datastore. Vercel recently moved its storage integrations to a “marketplace” with several providers, but they are still well integrated. Upstash offers an excellent key/value Redis store with a generous free tier. Although there are multiple providers, Vercel still exposes them to your app as Vercel KV. Your app talks to Vercel KV, and Vercel deals with routing requests to whatever provider you’ve configured.

Before we can initiate a serverless deployment, we need to install the driver for Vercel’s storage:

```
$ npm install @vercel/kv
```

Then we tell Nitro to use the driver (vercel-kv) as the storage option for the 'data' namespace:

```
$ cat nitro.config.ts
import { defineNitroConfig } from 'nitropack/config'

export default defineNitroConfig({
  compatibilityDate: '2025-08-20',
  srcDir: 'server',
  storage: {
    data: {
      driver: 'vercel-kv',
    }
  }
});
```

The main thing here is the storage field, which tells the 'data' namespace to rely on the vercel-kv driver we just installed. The compatibilityDate is a way to lock in the deployment preset behavior. We are deploying to Vercel’s native Nitro support, which will use a Nitro preset internally and respect the compatibilityDate we’ve set.

Now the app is ready for Vercel deployment. The next thing we need to do is create a GitHub (or similar) repo and push the app to it. Here’s my version of the app on GitHub.

Once we have our project in a repository, we go to Vercel and add it as a project. Vercel makes this very easy: after following the steps in the wizard, the application will be live. But before the storage endpoints will work, we need to configure a storage provider. Open the project’s Storage tab and create a new database. Select Upstash Redis, and create a new Upstash account if you don’t have one; the setup is free and quick. Next, create a new database and accept the defaults. Once the database is ready, select it, click Connect Project, and choose the project we created earlier from the dropdown.

Now the project will automatically rely on that key/value store for its storage, based on the provider we set for 'data'. (If you look at the storage settings for the Upstash Redis provider, it will display environment variables like KV_REST_API_URL, which the vercel-kv driver automatically uses under the hood.)
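If you prefer to be explicit about those credentials (for example, when running outside Vercel’s build environment), the unstorage vercel-kv driver appears to accept them as options directly in nitro.config.ts. The sketch below assumes the url and token option names from the unstorage driver documentation; verify them against the version you have installed:

```typescript
// nitro.config.ts -- a sketch of passing the Upstash credentials explicitly
// instead of relying on the KV_REST_API_* variables Vercel injects.
import { defineNitroConfig } from 'nitropack/config'

export default defineNitroConfig({
  compatibilityDate: '2025-08-20',
  srcDir: 'server',
  storage: {
    data: {
      driver: 'vercel-kv',
      url: process.env.KV_REST_API_URL,     // Upstash REST endpoint
      token: process.env.KV_REST_API_TOKEN, // Upstash REST token
    }
  }
});
```

In a typical Vercel deployment you won’t need this, since the integration injects those variables automatically.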
Incidentally, if you want to run this on your local development machine, you can use Vercel’s CLI tool to create a local Vercel environment that hosts the app locally. To test the deployed app, you can use curl commands like so:

```
$ curl -X POST -H 'Content-Type: application/json' \
  -d '{"message":"Hello from InfoWorld"}' https://iw-nitrojs.vercel.app/iw
{"message":"Message saved!"}

$ curl https://iw-nitrojs.vercel.app/iw
{"message":"Hello from InfoWorld"}
```

Conclusion

There’s a good reason many full-stack frameworks rely on Nitro as their HTTP server. It is deployment-aware and built with modern development needs in mind. You can also take advantage of these characteristics when using Nitro to build your own APIs. Nitro is a great piece of the web development landscape and a powerful bridge between development and deployment.
https://www.infoworld.com/article/4061129/intro-to-nitro-the-server-engine-built-for-modern-javascri...