# Installation

Installing Joist in your project has four main steps:

- Setting up your database
- Setting up `joist-codegen`
- Setting up your tests
- Setting up your production code
A wrinkle is that each Node.js application can be pretty different, in terms of how you manage your local database (e.g. with Docker Compose) and what your production application looks like (a REST API, a GraphQL API, etc.). So, to simplify this page, we'll make some assumptions based on the Joist sample app, but you should be able to adjust these steps to your specific project.

If you want a faster intro than this page, you can check out the sample app, run the commands in its readme, and just start poking around.
Joist requires Node 18.
## Setting up your database

The sample app uses `docker compose` and a `db.dockerfile` file to manage the local Postgres database.

To start it, clone the sample app and run:

```shell
docker compose build db
docker compose up -d db
```
The `docker-compose.yml` exposes the `sample_app` database on port `5432`, so it is accessible with an environment variable of:

```shell
DATABASE_URL=postgres://sample_user:local@localhost:5432/sample_app
```
The following steps will assume your database is available at this location (it is already set in the sample app's `env/local.env` file), but you can set `DATABASE_URL` to whatever is appropriate for your application.
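If it helps to see what that URL encodes, here's a minimal sketch of splitting a `DATABASE_URL` into discrete connection fields, using Node's built-in `URL` parser (`parseDatabaseUrl` is a hypothetical helper for illustration, not part of Joist):

```typescript
// Hypothetical helper (not part of Joist): split a postgres:// URL
// into the discrete host/port/database/user/password fields.
function parseDatabaseUrl(databaseUrl: string) {
  const u = new URL(databaseUrl);
  return {
    host: u.hostname,
    port: Number(u.port || 5432),
    database: u.pathname.slice(1), // drop the leading "/"
    user: decodeURIComponent(u.username),
    password: decodeURIComponent(u.password),
  };
}

const config = parseDatabaseUrl("postgres://sample_user:local@localhost:5432/sample_app");
// → { host: "localhost", port: 5432, database: "sample_app", user: "sample_user", password: "local" }
```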
### Setting up migrations

You should also set up a migrations library to manage your database schema; the Joist sample app uses node-pg-migrate.

If you do use node-pg-migrate as well, you can install Joist's node-pg-migrate-based helper methods (like `createEntityTable`, `createEnumTable`, `createManyToManyTable`, etc.):
```shell
npm add --save-dev joist-migration-utils
```

And add `joist-migrate` and `joist-new-migration` commands to your `package.json`:

```json
{
  "scripts": {
    "joist-migrate": "env-cmd tsx ./node_modules/joist-migration-utils",
    "joist-new-migration": "npx node-pg-migrate create"
  }
}
```
The sample app uses `env-cmd` to load the environment variables from `.env` before running `joist-migration-utils`, and `tsx` to transpile the migration's `*.ts` code to JavaScript, but you can manage your application's environment variables however you like.
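For intuition, `env-cmd` essentially parses `KEY=value` lines from the `.env` file and copies them into `process.env`; here's an illustrative sketch of that behavior (the `loadEnv` helper is made up for this example, not `env-cmd`'s actual API):

```typescript
// Illustrative sketch of what env-cmd does: parse KEY=value lines
// from a .env-style string and return them as an object.
function loadEnv(contents: string): Record<string, string> {
  const vars: Record<string, string> = {};
  for (const line of contents.split("\n")) {
    // Skip comments/blank lines; capture KEY=value pairs
    const m = line.match(/^([A-Za-z_][A-Za-z0-9_]*)=(.*)$/);
    if (m) vars[m[1]] = m[2];
  }
  return vars;
}

const vars = loadEnv("# local settings\nDATABASE_URL=postgres://sample_user:local@localhost:5432/sample_app\n");
// → { DATABASE_URL: "postgres://sample_user:local@localhost:5432/sample_app" }
```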
Joist's `joist-migration-utils` is really just a tiny wrapper around node-pg-migrate that:

- Reads the connection config from either a single `DATABASE_URL` or multiple `DB_HOST`, `DB_PORT`, `DB_DATABASE`, `DB_USER`, and `DB_PASSWORD` environment variables
- Runs the "up" command against the `migrations/` directory

If you want to invoke node-pg-migrate's CLI directly instead, that's just fine.
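As a sketch, a migration using these helpers might look like the following (the table and column names are made up, and the column syntax follows node-pg-migrate's `ColumnDefinitions`; check joist-migration-utils for the exact signatures):

```ts
import { MigrationBuilder } from "node-pg-migrate";
import { createEntityTable } from "joist-migration-utils";

export function up(b: MigrationBuilder): void {
  // createEntityTable adds the id/created_at/updated_at columns for you;
  // you list only the entity-specific columns.
  createEntityTable(b, "authors", {
    first_name: { type: "varchar(255)", notNull: true },
    last_name: { type: "varchar(255)", notNull: false },
  });
}
```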
Now we can apply migrations by running:

```shell
npm run joist-migrate
```

The sample app also supports resetting the database schema (so you can re-run the migrations from scratch) by running:

```shell
docker compose exec db ./reset.sh
```
While we used node-pg-migrate for this section, Joist is agnostic to your migration tool and will codegen based on your database schema, so you're welcome to use node-pg-migrate, Knex's migrations, or another tool for migrations/schema management.

As a quirk of node-pg-migrate, the first migration it creates via `joist-new-migration` will always be a `.js` file. Once you rename that first migration to a `.ts` file, all subsequent migrations will be created as `.ts` files.
## Setting up `joist-codegen`

Install the `joist-codegen` package as a dev dependency and add a `joist-codegen` script to your `package.json`:

```shell
npm add --save-dev joist-codegen
```

```json
{
  "scripts": {
    "joist-codegen": "env-cmd tsx ./node_modules/joist-codegen"
  }
}
```
This again uses `env-cmd`, as `joist-codegen` will use the `DATABASE_URL` environment variable to connect to your local database.

Now, anytime you make schema changes (i.e. by running `npm run joist-migrate`), you can also run `joist-codegen` to create/update your domain model:

```shell
npm run joist-codegen
```
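If you want to control where `joist-codegen` writes the generated entity files, it can be configured via a `joist-config.json` file in your project root; the fragment below is illustrative only (the `entitiesDirectory` key and its default are assumptions here; see Joist's configuration docs for the authoritative list of options):

```json
{
  "entitiesDirectory": "./src/entities"
}
```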
## Setting up your tests

We want each test to get a clean/fresh database, so we should set up a `beforeEach` that invokes the local-only `flush_database` command.

The sample app does this via a `setupTests.ts` file that is used for all tests:
```ts
import { EntityManager } from "src/entities";
import { knex as createKnex, Knex } from "knex";
import { PostgresDriver } from "joist-orm";
import { newPgConnectionConfig } from "joist-utils";

let knex: Knex;

// Knex is used as a single/global connection pool + query builder
function getKnex(): Knex {
  return (knex ??= createKnex({
    client: "pg",
    connection: newPgConnectionConfig() as any,
    debug: false,
    asyncStackTraces: true,
  }));
}

export function newEntityManager(): EntityManager {
  return new EntityManager({}, new PostgresDriver(getKnex()));
}

beforeEach(async () => {
  // getKnex is synchronous, so no await is needed here
  const knex = getKnex();
  await knex.select(knex.raw("flush_database()"));
});

afterAll(async () => {
  if (knex) {
    await knex.destroy();
  }
});
```
The `newPgConnectionConfig` helper method from `joist-utils` also uses the `DATABASE_URL` environment variable, which we can have loaded into the Jest process by using `env-cmd` in a `setupTestEnv.ts` file:

```ts
import { GetEnvVars } from "env-cmd";

export default async function globalSetup() {
  Object.entries(await GetEnvVars()).forEach(([key, value]) => (process.env[key] = value));
}
```
And then configure `jest.config.js` to use both files:

```js
module.exports = {
  preset: "ts-jest",
  globalSetup: "<rootDir>/src/setupTestEnv.ts",
  setupFilesAfterEach: undefined,
  setupFilesAfterEnv: ["<rootDir>/src/setupTests.ts"],
  testMatch: ["<rootDir>/src/**/*.test.{ts,tsx,js,jsx}"],
  moduleNameMapper: {
    "^src(.*)": "<rootDir>/src$1",
  },
};
While Joist's `newPgConnectionConfig` uses the same environment variable convention as `joist-codegen` (with the idea that your app's production environment variables will be set automatically by your deployment infra, i.e. in the style of Twelve-Factor Applications), you're free to configure Knex with whatever configuration is idiomatic for your app; see the Knex config documentation.

As usual, you can/should adjust all of this to your specific project.
Now your unit tests should be able to create an `EntityManager` and work with the domain objects:

```ts
import { Author } from "src/entities";
import { newEntityManager } from "src/setupTests";

describe("Author", () => {
  it("can be created", async () => {
    const em = newEntityManager();
    const a = new Author(em, { firstName: "a1" });
    await em.flush();
  });
});
```
## Setting up your production code

Finally, you can use the `EntityManager` and your domain objects in your production code.

First install the `joist-orm` dependency (as a regular dependency, not a dev dependency, since it's used at runtime):

```shell
npm add joist-orm
```
This is where the guide really becomes "it depends on your application", but in theory it will look very similar to setting up the tests:

- Configure a single/global `knex` instance that will act as the connection pool
- For each request, create a new `EntityManager` to perform that request's work
An extremely simple example might look like:

```ts
import { EntityManager, Author } from "./entities";
import { newPgConnectionConfig, PostgresDriver } from "joist-orm";
import { knex as createKnex, Knex } from "knex";

// Create our global knex connection
const knex: Knex = createKnex({
  client: "pg",
  connection: newPgConnectionConfig(),
  debug: false,
  asyncStackTraces: true,
});

// Create a helper method for our requests to create a new EntityManager
function newEntityManager(): EntityManager {
  // If you have a per-request context object, you can create that here
  const ctx = {};
  return new EntityManager(ctx, new PostgresDriver(knex));
}

// Handle GET `/authors`
app.get("/authors", async (req, res) => {
  // Create a new em
  const em = newEntityManager();
  // Find all authors
  const authors = await em.find(Author, {});
  // Send them back as JSON
  res.send(authors);
});
```
Note that you'll again need the `DATABASE_URL` environment variable set, but that will depend on whatever hosting provider you're using to run the app (or, per the previous disclaimer, you're free to configure the Knex connection pool with whatever configuration approach/library you like).