Build TypeSafe Node API using tRPC, Fastify, Kysely and Atlas CLI
source link: https://dev.to/franciscomendes10866/build-typesafe-node-api-using-trpc-fastify-kysely-and-atlas-cli-580c
Introduction
In today's article we are going to create a fully typesafe CRUD API. We will cover not only the development environment but also the production one, using tooling to help with the build process, linting, formatting, and more.
The idea is that by the end of the article you will have a base that you can easily extend, adding more procedures without having to worry about additional configuration.
Prerequisites
Before going further, you need:
- TypeScript
- Atlas CLI
In addition, you are expected to have basic knowledge of these technologies.
Getting Started
API Setup
Our first step will be to create the project folder:
mkdir api
cd api
Then let's start a new project:
yarn init -y
Now we need to install the base development dependencies:
yarn add -D @types/node typescript
Now let's create the following tsconfig.json:
{
  "compilerOptions": {
    "target": "esnext",
    "module": "CommonJS",
    "allowJs": true,
    "removeComments": true,
    "resolveJsonModule": true,
    "typeRoots": ["./node_modules/@types"],
    "sourceMap": true,
    "outDir": "dist",
    "strict": true,
    "lib": ["esnext"],
    "baseUrl": ".",
    "forceConsistentCasingInFileNames": true,
    "esModuleInterop": true,
    "emitDecoratorMetadata": true,
    "experimentalDecorators": true,
    "moduleResolution": "Node",
    "skipLibCheck": true
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules"]
}
With TypeScript configured, we can install the tooling dependencies:
yarn add -D tsx tsup rome
Let's initialize the rome configuration:
yarn rome init
After running the init command, let's make the following changes to rome.json:
{
  "$schema": "./node_modules/rome/configuration_schema.json",
  "linter": {
    "enabled": true,
    "rules": {
      "recommended": true
    }
  },
  "formatter": {
    "enabled": true,
    "formatWithErrors": false,
    "indentStyle": "space",
    "indentSize": 2,
    "lineWidth": 80,
    "ignore": []
  }
}
Now in package.json let's add the following scripts:
{
  "scripts": {
    "dev": "tsx watch src/main.ts",
    "build": "tsup src",
    "lint": "rome check src --apply",
    "format": "rome format src --write",
    "start": "node dist/main.js"
  }
}
Database Setup
Inside our project directory, let's create a folder called schema/:
mkdir schema
cd schema
Assuming you have a PostgreSQL database running, let's run the following command to inspect the database:
atlas schema inspect -u "postgres://docker:docker@localhost:5432/whale?sslmode=disable" > schema.hcl
After inspecting the database, the above command will create a file called schema.hcl, to which we will then add the schema of our tables:
# @/schema/schema.hcl
schema "public" {
}

table "dogs" {
  schema = schema.public
  column "id" {
    null    = false
    type    = uuid
    default = sql("gen_random_uuid()")
  }
  column "name" {
    null = false
    type = varchar(100)
  }
  column "isGoodBoy" {
    null = false
    type = boolean
  }
  column "breed" {
    null = false
    type = varchar(100)
  }
  primary_key {
    columns = [column.id]
  }
}
With the database schema defined, we need to apply the migrations to the database by running the following command:
atlas schema apply \
-u "postgres://docker:docker@localhost:5432/whale?sslmode=disable" \
--to file://schema.hcl
After confirming that we want to apply the migrations, we can move on to the next step.
Build Database Connector
First, let's install the following dependencies:
yarn add kysely pg
yarn add -D kysely-codegen @types/pg
Then let's create an .env file with a variable holding the connection string to the database:
DATABASE_URL=postgres://docker:docker@localhost:5432/whale?sslmode=disable
Again in package.json, let's add a new script:
{
  "scripts": {
    // ...
    "generate": "kysely-codegen"
  }
}
And run the following command:
yarn generate
The above command will generate the data types inside the node_modules/ folder, taking the database schema into account.
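For reference, here is a simplified sketch of what the generated types might look like for our dogs table. The interface names match what kysely-codegen produces, but the real output wraps defaulted columns like id in a Generated&lt;T&gt; helper built on kysely's ColumnType, and the exact output can vary between versions:

```typescript
// Simplified sketch of the interfaces kysely-codegen emits for our schema;
// in the real output, "id" is wrapped in a Generated<string> helper because
// it has a database-side default (gen_random_uuid()).
export interface Dogs {
  id: string;
  name: string;
  isGoodBoy: boolean;
  breed: string;
}

// The DB interface maps table names to row types; Kysely<DB> uses it to
// type-check every query against the actual schema.
export interface DB {
  dogs: Dogs;
}
```

This is what makes calls like ctx.db.selectFrom("dogs") typesafe: a typo in a table or column name becomes a compile-time error.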
Now let's create the src/ folder and, inside it, a db/ folder, where we'll define our database connector:
// @/src/db/index.ts
import { Kysely, PostgresDialect } from "kysely";
import { DB } from "kysely-codegen";
import { Pool } from "pg";

import { env } from "../env";

export const db = new Kysely<DB>({
  dialect: new PostgresDialect({
    pool: new Pool({
      connectionString: env.DATABASE_URL,
    }),
  }),
});
In the code snippet above we imported the env variable, which has not been created yet; that is exactly what we'll do in the next step.
Build API
First, let's install the remaining dependencies:
yarn add fastify @fastify/cors envalid zod @trpc/server
Now let's set some API defaults by creating the env.ts file:
// @/src/env.ts
import { cleanEnv, str, num } from "envalid";

export const env = cleanEnv(process.env, {
  PORT: num({
    default: 3333,
  }),
  DATABASE_URL: str({
    default: "postgres://docker:docker@localhost:5432/whale?sslmode=disable",
  }),
});
Next, let's define the tRPC context, in which we'll return the request and response objects, as well as the database connector instance:
// @/src/context.ts
import { inferAsyncReturnType } from "@trpc/server";
import { CreateFastifyContextOptions } from "@trpc/server/adapters/fastify";

import { db } from "./db";

export const createContext = ({ req, res }: CreateFastifyContextOptions) => {
  return {
    req,
    res,
    db,
  };
};

export type Context = inferAsyncReturnType<typeof createContext>;
Now we can define the router and create the API's CRUD procedures:
// @/src/router.ts
import { initTRPC } from "@trpc/server";
import { z } from "zod";

import { Context } from "./context";

export const t = initTRPC.context<Context>().create();

export const appRouter = t.router({
  getDogs: t.procedure.query(async ({ ctx }) => {
    return await ctx.db.selectFrom("dogs").selectAll().execute();
  }),
  getDogById: t.procedure
    .input(
      z.object({
        id: z.string().uuid(),
      }),
    )
    .query(async ({ input, ctx }) => {
      return await ctx.db
        .selectFrom("dogs")
        .selectAll()
        .where("id", "=", input.id)
        .executeTakeFirstOrThrow();
    }),
  createDog: t.procedure
    .input(
      z.object({
        name: z.string(),
        breed: z.string(),
        isGoodBoy: z.boolean(),
      }),
    )
    .mutation(async ({ input, ctx }) => {
      return await ctx.db
        .insertInto("dogs")
        .values(input)
        .returningAll()
        .executeTakeFirstOrThrow();
    }),
  updateDog: t.procedure
    .input(
      z.object({
        // the id is required so the upsert below knows which row conflicts
        id: z.string().uuid(),
        name: z.string(),
        breed: z.string(),
        isGoodBoy: z.boolean(),
      }),
    )
    .mutation(async ({ input, ctx }) => {
      return await ctx.db
        .insertInto("dogs")
        .values(input)
        .onConflict((oc) => oc.column("id").doUpdateSet(input))
        .returningAll()
        .executeTakeFirstOrThrow();
    }),
  removeDog: t.procedure
    .input(
      z.object({
        id: z.string().uuid(),
      }),
    )
    .mutation(async ({ input, ctx }) => {
      return await ctx.db
        .deleteFrom("dogs")
        .where("id", "=", input.id)
        .returningAll()
        .executeTakeFirstOrThrow();
    }),
});

export type AppRouter = typeof appRouter;
Last but not least, we have to create the entry file, where we are going to set up the HTTP server, among other things:
// @/src/main.ts
import fastify from "fastify";
import cors from "@fastify/cors";
import { fastifyTRPCPlugin } from "@trpc/server/adapters/fastify";

import { appRouter } from "./router";
import { createContext } from "./context";
import { env } from "./env";

(async () => {
  try {
    const server = fastify({
      maxParamLength: 5000,
    });
    await server.register(cors, {
      origin: "http://localhost:5173",
    });
    await server.register(fastifyTRPCPlugin, {
      prefix: "/trpc",
      trpcOptions: {
        router: appRouter,
        createContext,
      },
    });
    await server.listen({
      port: env.PORT,
    });
  } catch (err) {
    console.error(err);
    process.exit(1);
  }
})();
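As a quick sanity check on what the plugin registration gives us: tRPC v10 serves each query procedure as a plain HTTP GET under the prefix we registered, so the endpoints are reachable with any HTTP client. The sketch below assumes the defaults used in this article (port 3333, prefix /trpc):

```typescript
// tRPC v10 maps query procedures to plain GET endpoints under the registered
// prefix, so getDogs is reachable at <base>/getDogs.
const base = "http://localhost:3333/trpc";
const url = `${base}/getDogs`;
console.log(url); // http://localhost:3333/trpc/getDogs

// With the server running, a plain fetch works too; successful responses
// are wrapped as { result: { data: ... } }:
// const body = await fetch(url).then((r) => r.json());
// console.log(body.result.data);
```

In practice you will rarely construct these URLs by hand; the tRPC client does it for you, which is the subject of the next section.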
If you are using a monorepo, yarn link, or a similar setup, you can go to package.json and add the following key:
{
  "main": "src/router"
}
This way, when the tRPC client imports the router's data types, the import resolves directly to the router.
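As a sketch of the consuming side, a client in a separate project could look like this. This assumes @trpc/client v10 is installed there and that this package is linked or published under the hypothetical name "api":

```typescript
import { createTRPCProxyClient, httpBatchLink } from "@trpc/client";
// Type-only import: only the router's types are pulled in, no server code.
import type { AppRouter } from "api";

const client = createTRPCProxyClient<AppRouter>({
  links: [httpBatchLink({ url: "http://localhost:3333/trpc" })],
});

// Inputs and return values are fully typed from the router definition:
// a wrong field name or type here fails at compile time.
async function main() {
  const dog = await client.createDog.mutate({
    name: "Bobby",
    breed: "Labrador",
    isGoodBoy: true,
  });
  console.log(dog.id);
}
```

Note that nothing here is generated: the typesafety comes entirely from the AppRouter type we exported at the end of router.ts.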
Conclusion
I hope you found this article helpful, whether you're using the information in an existing project or just giving it a try for fun.
Please let me know if you notice any mistakes in the article by leaving a comment. And, if you'd like to see the source code for this article, you can find it in the GitHub repository linked below.