# Installation
Installing Joist in your project has four main steps:

- Setting up your database
- Setting up `joist-codegen`
- Setting up your tests
- Setting up your production code
A wrinkle is that Node.js applications vary quite a bit: how you manage your local database (e.g. with Docker Compose), what your production application looks like (a REST API, a GraphQL API, etc.), and so on.
So, to simplify this page, we'll base the examples on the Joist sample app, but you should be able to adapt these steps to your specific project.
## Setting up your database

The sample app uses docker compose and a `db.dockerfile` file to manage the local Postgres database.
To start it, clone the sample app, and run:
```shell
docker compose build db
docker compose up -d db
```

The `docker-compose.yml` exposes the `sample_app` database on port 5432, so it is accessible with an environment variable of:
```shell
DATABASE_URL=postgres://sample_user:local@localhost:5432/sample_app
```

The following steps assume your database is available at this location (it is already set in the sample app's `env/local.env` file), but you can set `DATABASE_URL` to whatever is appropriate for your application.
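If you're curious how a `DATABASE_URL` decomposes into individual connection settings, here is a rough, self-contained TypeScript sketch. This is only an illustration of the idea; it is not Joist's actual `newPgConnectionConfig` implementation, and the returned field names are our own:

```typescript
// Sketch: decompose a DATABASE_URL-style connection string into its
// individual Postgres connection settings. Illustrative only; the real
// newPgConnectionConfig helper may return a different shape.
function parseDatabaseUrl(url: string) {
  const u = new URL(url);
  return {
    host: u.hostname,
    port: Number(u.port || "5432"),
    user: decodeURIComponent(u.username),
    password: decodeURIComponent(u.password),
    database: u.pathname.slice(1), // drop the leading "/"
  };
}

const config = parseDatabaseUrl("postgres://sample_user:local@localhost:5432/sample_app");
// config.host === "localhost", config.port === 5432, config.database === "sample_app"
```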
### Setting up migrations

You should also set up a migrations library to manage your database schema; the Joist sample app uses `node-pg-migrate`.
If you do use `node-pg-migrate` as well, you can install Joist's `node-pg-migrate`-based helper methods (like `createEntityTable`, `createEnumTable`, `createManyToManyTable`, etc.):
```shell
npm add --save-dev joist-migration-utils
```

And add `joist-migrate` and `joist-new-migration` commands to your `package.json`:
```json
{
  "scripts": {
    "joist-migrate": "env-cmd tsx ./node_modules/joist-migration-utils",
    "joist-new-migration": "npx node-pg-migrate create"
  }
}
```

The sample app uses `env-cmd` to load the environment variables from `.env` before running `joist-migration-utils`, and `tsx` to transpile the migration's `*.ts` code to JavaScript, but if you don't like that approach, you can manage your application's environment variables however you like.
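For reference, a first migration using these helpers might look roughly like the following. The table and column names are invented for illustration, and you should double-check the helper's exact signature against the `joist-migration-utils` version you installed:

```ts
import { MigrationBuilder } from "node-pg-migrate";
import { createEntityTable } from "joist-migration-utils";

export function up(b: MigrationBuilder): void {
  // Creates an `authors` table following Joist's conventions (e.g. an
  // `id` primary key) plus the columns listed here; column specs are
  // passed through in node-pg-migrate's column-definition style.
  createEntityTable(b, "authors", {
    first_name: "text",
    last_name: { type: "text", notNull: false },
  });
}
```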
Now we can apply migrations by running:
```shell
npm run joist-migrate
```

The sample app also supports resetting the database schema (so you can re-run the migrations from scratch) by running:

```shell
docker compose exec db ./reset.sh
```

## Setting up joist-codegen
Install the `joist-codegen` package as a dev dependency and add a `joist-codegen` script to your `package.json`:
```shell
npm add --save-dev joist-codegen
```

```json
{
  "scripts": {
    "joist-codegen": "env-cmd tsx ./node_modules/joist-codegen"
  }
}
```

This again uses `env-cmd`, as `joist-codegen` will use the `DATABASE_URL` environment variable to connect to your local database.
Now, anytime you make schema changes (e.g. by running `npm run joist-migrate`), you can also run `joist-codegen` to create/update your domain model:
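`joist-codegen` also reads (and updates) a `joist-config.json` file at your project root, which controls things like where entities are written and per-entity settings. A minimal sketch, with field names that may vary by Joist version (consult your generated file for the exact shape):

```json
{
  "entitiesDirectory": "./src/entities",
  "entities": {
    "Author": { "tag": "a" }
  }
}
```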
```shell
npm run joist-codegen
```

## Setting up your tests

We want each test to get a clean/fresh database, so we set up a `beforeEach` that invokes our local-only `flush_database` command.
The sample app does this via a `setupTests.ts` file that will be used for all tests:
```ts
import { EntityManager } from "src/entities";
import { knex as createKnex, Knex } from "knex";
import { PostgresDriver } from "joist-orm";
import { newPgConnectionConfig } from "joist-utils";

let knex: Knex;

// Knex is used as a single/global connection pool + query builder
function getKnex(): Knex {
  return (knex ??= createKnex({
    client: "pg",
    connection: newPgConnectionConfig() as any,
    debug: false,
    asyncStackTraces: true,
  }));
}

export function newEntityManager(): EntityManager {
  return new EntityManager({}, new PostgresDriver(getKnex()));
}

beforeEach(async () => {
  const knex = getKnex();
  await knex.select(knex.raw("flush_database()"));
});

afterAll(async () => {
  if (knex) {
    await knex.destroy();
  }
});
```

The `newPgConnectionConfig` helper method from `joist-utils` also uses the `DATABASE_URL` environment variable, which we can have loaded into the Jest process by using `env-cmd` in a `setupTestEnv.ts` file:
```ts
import { GetEnvVars } from "env-cmd";

export default async function globalSetup() {
  Object.entries(await GetEnvVars()).forEach(([key, value]) => (process.env[key] = value));
}
```

And then configure `jest.config.js` to use both files:
```js
module.exports = {
  preset: "ts-jest",
  globalSetup: "<rootDir>/src/setupTestEnv.ts",
  setupFilesAfterEnv: ["<rootDir>/src/setupTests.ts"],
  testMatch: ["<rootDir>/src/**/*.test.{ts,tsx,js,jsx}"],
  moduleNameMapper: {
    "^src(.*)": "<rootDir>/src$1",
  },
};
```

As usual, you can/should adjust all of this to your specific project.
Now your unit tests should be able to create an `EntityManager` and work with the domain objects:
```ts
import { Author, EntityManager, newAuthor } from "src/entities";
import { newEntityManager } from "src/setupTests";

describe("Author", () => {
  it("can be created", async () => {
    const em = newEntityManager();
    const a = em.create(Author, { firstName: "a1" });
    await em.flush();
  });
});
```

## Setting up your production code
Finally, you can use the `EntityManager` and your domain objects in your production code.
First install the `joist-orm` dependency (as a regular dependency, since it is used at runtime):

```shell
npm add joist-orm
```

This is where the guide really becomes "it depends on your application", but in theory it will look very similar to setting up the tests:
- Configure a single/global `knex` instance that will act as the connection pool
- For each request, create a new `EntityManager` to perform that request's work
An extremely simple example might look like:
```ts
import { EntityManager, Author } from "./entities";
import { newPgConnectionConfig, PostgresDriver } from "joist-orm";
import { knex as createKnex, Knex } from "knex";

// Create our global knex connection
const knex: Knex = createKnex({
  client: "pg",
  connection: newPgConnectionConfig(),
  debug: false,
  asyncStackTraces: true,
});

// Create a helper method for our requests to create a new EntityManager
function newEntityManager(): EntityManager {
  // If you have a per-request context object, you can create that here
  const ctx = {};
  return new EntityManager(ctx, new PostgresDriver(knex));
}

// Handle GET `/authors`
app.get("/authors", async (req, res) => {
  // Create a new em
  const em = newEntityManager();
  // Find all authors
  const authors = await em.find(Author, {});
  // Send them back as JSON
  res.send(authors);
});
```

Note that you'll again need the `DATABASE_URL` environment variable set, but that will depend on whatever hosting provider you're using to run the app (or, per the previous disclaimer, you're free to configure the Knex connection pool with whatever configuration approach/library you like).