41 Commits

Author SHA1 Message Date
09d85c8f22 lalalalal 2026-02-02 19:06:47 -05:00
a0ce5183b2 dkkdkdkdkd 2026-02-02 19:02:40 -05:00
c83202b681 fafafafa 2026-02-02 18:59:54 -05:00
2c1d297be1 fda 2026-02-02 18:55:31 -05:00
2d697c1e61 Move package.json files around 2026-02-02 18:47:54 -05:00
410bb671f1 fads 2026-02-02 18:41:26 -05:00
0ae197f939 fdas 2026-02-02 18:39:49 -05:00
370bea5d98 asdf 2026-02-02 18:37:09 -05:00
9d34768051 Add file list 2026-02-02 18:35:37 -05:00
b752eb5080 Make shfmt happier 2026-02-02 18:32:53 -05:00
1ed5aa4b33 Reorder some imports 2026-02-02 18:32:39 -05:00
4d1c30b874 Fix some stragglers 2026-02-02 18:31:03 -05:00
02edf259f0 Rename framework/ to diachron/ and update all references
Update paths in .gitignore, cmd, develop, mgmt, sync.sh, check.sh,
fixup.sh, CLAUDE.md, docs/new-project.md, and backend/*.sh scripts.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-02 18:10:32 -05:00
db1f2151de Rename express/ to backend/ and update references
Update paths in sync.sh, master/main.go, and CLAUDE.md to reflect
the directory rename.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-02 17:54:44 -05:00
6e669d025a Move many files to diachron subdir 2026-02-02 17:22:08 -05:00
a1dbf71de4 Rename directory 2026-02-02 16:53:22 -05:00
0afc3efa5d Fix test script to work on macOS default bash
Replace globstar (bash 4.0+) with find for portability.
macOS ships with bash 3.2 which doesn't support globstar.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-02 12:25:26 -05:00
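The globstar-to-find swap described in the commit above can be sketched as follows (a hypothetical illustration; the real test script's paths and glob patterns may differ):

```shell
#!/bin/sh
# bash 4.0+ could recurse with: shopt -s globstar; for f in ./**/*.test.ts; ...
# macOS ships bash 3.2, which lacks globstar, so use find instead.
find . -name '*.test.ts' -not -path '*/node_modules/*' -print
```

`find` is POSIX, so the same script runs unchanged on macOS's default bash 3.2 and on any Linux shell.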
6f2ca2c15d Tweak marketing blurb 2026-02-02 11:37:05 -05:00
6a41273835 Add macOS x86_64 platform support
Platform detection now happens in framework/platform, sourced by both
sync.sh and the node shim. Uses shasum on macOS, sha256sum on Linux.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-02 11:21:41 -05:00
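The shasum/sha256sum split mentioned in the commit above could look roughly like this (a hedged sketch; the actual platform file may structure the detection differently):

```shell
#!/bin/sh
# Pick a SHA-256 command per OS: macOS ships shasum, Linux ships sha256sum.
case "$(uname -s)" in
	Darwin) sha256() { shasum -a 256 "$@"; } ;;
	Linux)  sha256() { sha256sum "$@"; } ;;
	*) echo "unsupported OS" >&2; exit 1 ;;
esac
# Both commands print "<hex digest>  <name>", so callers share one code path.
printf 'hello' | sha256
```

Because both tools emit the same output shape, scripts that parse the digest need no per-platform branches beyond this one.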
33251d9b77 Add comprehensive test suite for express modules
Tests for:
- user.ts: User class, roles, permissions, status checks
- util.ts: loadFile utility
- handlers.ts: multiHandler
- types.ts: methodParser, requireAuth, requirePermission
- logging.ts: module structure
- database.ts: connectionConfig, raw queries, PostgresAuthStore
- auth/token.ts: generateToken, hashToken, parseAuthorizationHeader
- auth/password.ts: hashPassword, verifyPassword (scrypt)
- auth/types.ts: Zod parsers, Session class, tokenLifetimes
- auth/store.ts: InMemoryAuthStore
- auth/service.ts: AuthService (login, register, verify, reset)
- basic/*.ts: route structure tests

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-25 20:40:49 -06:00
408032c30d fmt 2026-01-25 18:21:01 -06:00
19959a0325 . 2026-01-25 18:20:57 -06:00
87c9d1be16 Update todo list 2026-01-25 18:19:32 -06:00
c2748bfcc6 Add test infrastructure for hydrators using node:test
- Add docker-compose.test.yml with isolated PostgreSQL on port 5433
- Add environment variable support for database connection config
- Add test setup utilities and initial user hydrator tests
- Add test and test:watch scripts to package.json

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-25 18:18:15 -06:00
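The isolated test database described in the commit above might look something like this (a hypothetical sketch of docker-compose.test.yml; the image tag, credentials, and service name are assumptions):

```yaml
# docker-compose.test.yml (sketch): PostgreSQL on host port 5433 so tests
# never collide with a regular dev database on 5432.
services:
  postgres-test:
    image: postgres:16
    ports:
      - "5433:5432"
    environment:
      POSTGRES_USER: test
      POSTGRES_PASSWORD: test
      POSTGRES_DB: test
    tmpfs:
      - /var/lib/postgresql/data  # ephemeral data dir: each run starts clean
```

Tests would then read the connection settings (host, port 5433, user, password) from environment variables rather than hard-coding the dev database config.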
2f5ef7c267 Add automatic restart for crashed worker processes
Workers are now monitored and automatically restarted when they crash.
The worker pool validates addresses before returning them to skip stale
entries from crashed workers.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-25 17:55:43 -06:00
bcd71f2801 Add kysely-codegen command 2026-01-25 17:54:50 -06:00
82a8c03316 Pull in typeid-js 2026-01-25 17:26:09 -06:00
b8065ead79 Add develop db-url 2026-01-25 17:25:57 -06:00
811c446895 Pull in kysely-codegen 2026-01-25 12:28:44 -06:00
5a8c0028d7 Add user_credentials migration 2026-01-25 12:14:34 -06:00
f7e6e56aca Merge branch 'experiments' 2026-01-25 12:12:35 -06:00
cd19a32be5 Add more todo items 2026-01-25 12:12:15 -06:00
478305bc4f Update /home template 2026-01-25 12:12:02 -06:00
421628d49e Add various doc updates
They are still very far from complete.
2026-01-25 12:11:34 -06:00
4f37a72d7b Clean commands up 2026-01-24 16:54:54 -06:00
e30bf5d96d Fix regexp in fixup.sh 2026-01-24 16:39:13 -06:00
8704c4a8d5 Separate framework and app migrations
Also add a new develop command: clear-db.
2026-01-24 16:38:33 -06:00
579a19669e Match user and session schema changes 2026-01-24 15:48:22 -06:00
474420ac1e Add development command to reset the database and rerun migrations 2026-01-24 15:13:34 -06:00
960f78a1ad Update initial tables 2026-01-24 15:13:30 -06:00
a0043fd475 Fix go version 2025-02-08 13:37:29 -06:00
137 changed files with 4474 additions and 273 deletions

.gitignore

@@ -1,5 +1,5 @@
 **/node_modules
-framework/downloads
-framework/binaries
-framework/.nodejs
-framework/.nodejs-config
+diachron/downloads
+diachron/binaries
+diachron/.nodejs
+diachron/.nodejs-config

.go-version (new file)

@@ -0,0 +1 @@
+1.23.6


@@ -38,7 +38,7 @@ master process. Key design principles:
 **Format TypeScript code:**
 ```bash
-cd express && ../cmd pnpm biome check --write .
+cd backend && ../cmd pnpm biome check --write .
 ```
 **Build Go master process:**
@@ -54,9 +54,9 @@ cd master && go build
 ### Components
-- **express/** - TypeScript/Express.js backend application
+- **backend/** - TypeScript/Express.js backend application
 - **master/** - Go-based master process for file watching and process management
-- **framework/** - Managed binaries (Node.js, pnpm), command wrappers, and
+- **diachron/** - Managed binaries (Node.js, pnpm), command wrappers, and
   framework-specific library code
 - **monitor/** - Go file watcher that triggers rebuilds (experimental)
@@ -68,7 +68,7 @@ Responsibilities:
 - Proxy web requests to backend workers
 - Behaves identically in all environments (no dev/prod distinction)
-### Express App Structure
+### Backend App Structure
 - `app.ts` - Main Express application setup with route matching
 - `routes.ts` - Route definitions
@@ -78,7 +78,7 @@ Responsibilities:
 ### Framework Command System
-Commands flow through: `./cmd` → `framework/cmd.d/*` → `framework/shims/*` → managed binaries in `framework/binaries/`
+Commands flow through: `./cmd` → `diachron/cmd.d/*` → `diachron/shims/*` → managed binaries in `diachron/binaries/`
 This ensures consistent tooling versions across the team without system-wide installations.
@@ -94,8 +94,8 @@ This ensures consistent tooling versions across the team without system-wide ins
 ## Platform Requirements
-Linux x86_64 only (currently). Requires:
-- Modern libc for Go binaries
+Linux or macOS on x86_64. Requires:
+- Modern libc for Go binaries (Linux)
 - docker compose (for full stack)
 - fd, shellcheck, shfmt (for development)


@@ -2,16 +2,13 @@ diachron
 ## Introduction
-Is your answer to some of these questions "yes"? If so, you might like
-diachron. (When it comes to that dev/test/prod one, hear us out first, ok?)
 - Do you want to share a lot of backend and frontend code?
 - Are you tired of your web stack breaking when you blink too hard?
 - Have you read [Taking PHP
-  Seriously](https://slack.engineering/taking-php-seriously/) and wish you had
-  something similar for Typescript?
+  Seriously](https://slack.engineering/taking-php-seriously/) and do you wish
+  you had something similar for Typescript?
 - Do you think that ORMs are not all that? Do you wish you had first class
   unmediated access to your database? And do you think that database
@@ -35,6 +32,9 @@ diachron. (When it comes to that dev/test/prod one, hear us out first, ok?)
   you're trying to fix? We're talking authentication, authorization, XSS,
   https, nested paths, all that stuff.
+Is your answer to some of these questions "yes"? If so, you might like
+diachron. (When it comes to that dev/test/prod one, hear us out first, ok?)
 ## Getting started
 Different situations require different getting started docs.
@@ -44,9 +44,8 @@ Different situations require different getting started docs.
 ## Requirements
-To run diachron, you currently need to have a Linux box running x86_64 with a
-new enough libc to run golang binaries. Support for other platforms will come
-eventually.
+To run diachron, you need Linux or macOS on x86_64. Linux requires a new
+enough libc to run golang binaries.
 To run a more complete system, you also need to have docker compose installed.

TODO.md

@@ -1,11 +1,39 @@
 ## high importance
+- Many of the UUIDs generated are not UUIDv7. This needs to be fixed.
 - [ ] Add unit tests all over the place.
   - ⚠️ Huge task - needs breakdown before starting
+- [ ] map exceptions back to source lines
+- [ ] migrations, seeding, fixtures
+  ```sql
+  CREATE SCHEMA fw;
+  CREATE TABLE fw.users (...);
+  CREATE TABLE fw.groups (...);
+  ```
+  ```sql
+  CREATE TABLE app.user_profiles (...);
+  CREATE TABLE app.customer_metadata (...);
+  ```
+- [ ] flesh out `mgmt` and `develop` (does not exist yet)
+  4.1 What belongs in develop
+  - Create migrations
+  - Squash migrations
+  - Reset DB
+  - Roll back migrations
+  - Seed large test datasets
+  - Run tests
+  - Snapshot / restore local DB state (!!!)
+  `develop` fails if APP_ENV (or whatever) is `production`. Or maybe even
+  `testing`.
 - [ ] Add default user table(s) to database.
@@ -44,6 +72,8 @@
   necessary at all, with some sane defaults and an easy to use override
   mechanism
+- [ ] time library
 - [ ] fill in the rest of express/http-codes.ts
 - [ ] fill out express/content-types.ts


@@ -3,15 +3,13 @@ import express, {
 	type Response as ExpressResponse,
 } from "express";
 import { match } from "path-to-regexp";
-import { Session } from "./auth";
-import { cli } from "./cli";
-import { contentTypes } from "./content-types";
-import { runWithContext } from "./context";
-import { core } from "./core";
-import { httpCodes } from "./http-codes";
-import { request } from "./request";
-import { routes } from "./routes";
+import { Session } from "./diachron/auth";
+import { cli } from "./diachron/cli";
+import { contentTypes } from "./diachron/content-types";
+import { runWithContext } from "./diachron/context";
+import { core } from "./diachron/core";
+import { httpCodes } from "./diachron/http-codes";
+import { request } from "./diachron/request";
 // import { URLPattern } from 'node:url';
 import {
 	AuthenticationRequired,
@@ -25,7 +23,8 @@ import {
 	type ProcessedRoute,
 	type Result,
 	type Route,
-} from "./types";
+} from "./diachron/types";
+import { routes } from "./routes";
 const app = express();


@@ -8,7 +8,7 @@ check_dir="$DIR"
 out_dir="$check_dir/out"
-source "$check_dir"/../framework/shims/common
-source "$check_dir"/../framework/shims/node.common
+source "$check_dir"/../diachron/shims/common
+source "$check_dir"/../diachron/shims/node.common
 $ROOT/cmd pnpm tsc --outDir "$out_dir"


@@ -0,0 +1,80 @@
// Tests for auth/password.ts
// Pure unit tests - no database needed
import assert from "node:assert/strict";
import { describe, it } from "node:test";
import { hashPassword, verifyPassword } from "./password";
describe("password", () => {
describe("hashPassword", () => {
it("returns a scrypt formatted hash", async () => {
const hash = await hashPassword("testpassword");
assert.ok(hash.startsWith("$scrypt$"));
});
it("includes all scrypt parameters", async () => {
const hash = await hashPassword("testpassword");
const parts = hash.split("$");
// Format: $scrypt$N$r$p$salt$hash
assert.equal(parts.length, 7);
assert.equal(parts[0], "");
assert.equal(parts[1], "scrypt");
// N, r, p should be numbers
assert.ok(!Number.isNaN(parseInt(parts[2], 10)));
assert.ok(!Number.isNaN(parseInt(parts[3], 10)));
assert.ok(!Number.isNaN(parseInt(parts[4], 10)));
});
it("generates different hashes for same password (different salt)", async () => {
const hash1 = await hashPassword("testpassword");
const hash2 = await hashPassword("testpassword");
assert.notEqual(hash1, hash2);
});
});
describe("verifyPassword", () => {
it("returns true for correct password", async () => {
const hash = await hashPassword("correctpassword");
const result = await verifyPassword("correctpassword", hash);
assert.equal(result, true);
});
it("returns false for incorrect password", async () => {
const hash = await hashPassword("correctpassword");
const result = await verifyPassword("wrongpassword", hash);
assert.equal(result, false);
});
it("throws for invalid hash format", async () => {
await assert.rejects(
verifyPassword("password", "invalid-hash"),
/Invalid password hash format/,
);
});
it("throws for non-scrypt hash", async () => {
await assert.rejects(
verifyPassword("password", "$bcrypt$10$salt$hash"),
/Invalid password hash format/,
);
});
it("works with empty password", async () => {
const hash = await hashPassword("");
const result = await verifyPassword("", hash);
assert.equal(result, true);
});
it("works with unicode password", async () => {
const hash = await hashPassword("p@$$w0rd\u{1F511}");
const result = await verifyPassword("p@$$w0rd\u{1F511}", hash);
assert.equal(result, true);
});
it("is case sensitive", async () => {
const hash = await hashPassword("Password");
const result = await verifyPassword("password", hash);
assert.equal(result, false);
});
});
});


@@ -0,0 +1,419 @@
// Tests for auth/service.ts
// Uses InMemoryAuthStore - no database needed
import assert from "node:assert/strict";
import { beforeEach, describe, it } from "node:test";
import { AuthService } from "./service";
import { InMemoryAuthStore } from "./store";
describe("AuthService", () => {
let store: InMemoryAuthStore;
let service: AuthService;
beforeEach(() => {
store = new InMemoryAuthStore();
service = new AuthService(store);
});
describe("register", () => {
it("creates a new user", async () => {
const result = await service.register(
"test@example.com",
"password123",
"Test User",
);
assert.equal(result.success, true);
if (result.success) {
assert.equal(result.user.email, "test@example.com");
assert.equal(result.user.displayName, "Test User");
assert.ok(result.verificationToken.length > 0);
}
});
it("fails when email already registered", async () => {
await service.register("test@example.com", "password123");
const result = await service.register(
"test@example.com",
"password456",
);
assert.equal(result.success, false);
if (!result.success) {
assert.equal(result.error, "Email already registered");
}
});
it("creates user without displayName", async () => {
const result = await service.register(
"test@example.com",
"password123",
);
assert.equal(result.success, true);
if (result.success) {
assert.equal(result.user.displayName, undefined);
}
});
});
describe("login", () => {
beforeEach(async () => {
// Create and verify a user
const result = await service.register(
"test@example.com",
"password123",
"Test User",
);
if (result.success) {
// Verify email to activate user
await service.verifyEmail(result.verificationToken);
}
});
it("succeeds with correct credentials", async () => {
const result = await service.login(
"test@example.com",
"password123",
"cookie",
);
assert.equal(result.success, true);
if (result.success) {
assert.ok(result.token.length > 0);
assert.equal(result.user.email, "test@example.com");
}
});
it("fails with wrong password", async () => {
const result = await service.login(
"test@example.com",
"wrongpassword",
"cookie",
);
assert.equal(result.success, false);
if (!result.success) {
assert.equal(result.error, "Invalid credentials");
}
});
it("fails with unknown email", async () => {
const result = await service.login(
"unknown@example.com",
"password123",
"cookie",
);
assert.equal(result.success, false);
if (!result.success) {
assert.equal(result.error, "Invalid credentials");
}
});
it("fails for inactive user", async () => {
// Create a user but don't verify email (stays pending)
await service.register("pending@example.com", "password123");
const result = await service.login(
"pending@example.com",
"password123",
"cookie",
);
assert.equal(result.success, false);
if (!result.success) {
assert.equal(result.error, "Account is not active");
}
});
it("stores metadata", async () => {
const result = await service.login(
"test@example.com",
"password123",
"cookie",
{ userAgent: "TestAgent", ipAddress: "192.168.1.1" },
);
assert.equal(result.success, true);
});
});
describe("validateToken", () => {
let token: string;
beforeEach(async () => {
const regResult = await service.register(
"test@example.com",
"password123",
);
if (regResult.success) {
await service.verifyEmail(regResult.verificationToken);
}
const loginResult = await service.login(
"test@example.com",
"password123",
"cookie",
);
if (loginResult.success) {
token = loginResult.token;
}
});
it("returns authenticated for valid token", async () => {
const result = await service.validateToken(token);
assert.equal(result.authenticated, true);
if (result.authenticated) {
assert.equal(result.user.email, "test@example.com");
assert.notEqual(result.session, null);
}
});
it("returns unauthenticated for invalid token", async () => {
const result = await service.validateToken("invalid-token");
assert.equal(result.authenticated, false);
assert.equal(result.user.isAnonymous(), true);
assert.equal(result.session, null);
});
});
describe("logout", () => {
it("invalidates the session", async () => {
const regResult = await service.register(
"test@example.com",
"password123",
);
if (regResult.success) {
await service.verifyEmail(regResult.verificationToken);
}
const loginResult = await service.login(
"test@example.com",
"password123",
"cookie",
);
assert.equal(loginResult.success, true);
if (!loginResult.success) return;
const token = loginResult.token;
// Token should be valid before logout
const beforeLogout = await service.validateToken(token);
assert.equal(beforeLogout.authenticated, true);
// Logout
await service.logout(token);
// Token should be invalid after logout
const afterLogout = await service.validateToken(token);
assert.equal(afterLogout.authenticated, false);
});
});
describe("logoutAllSessions", () => {
it("invalidates all user sessions", async () => {
const regResult = await service.register(
"test@example.com",
"password123",
);
if (regResult.success) {
await service.verifyEmail(regResult.verificationToken);
}
// Create multiple sessions
const login1 = await service.login(
"test@example.com",
"password123",
"cookie",
);
const login2 = await service.login(
"test@example.com",
"password123",
"bearer",
);
assert.equal(login1.success, true);
assert.equal(login2.success, true);
if (!login1.success || !login2.success) return;
// Both should be valid
const before1 = await service.validateToken(login1.token);
const before2 = await service.validateToken(login2.token);
assert.equal(before1.authenticated, true);
assert.equal(before2.authenticated, true);
// Logout all
const user = await store.getUserByEmail("test@example.com");
const count = await service.logoutAllSessions(user!.id);
assert.equal(count, 2);
// Both should be invalid
const after1 = await service.validateToken(login1.token);
const after2 = await service.validateToken(login2.token);
assert.equal(after1.authenticated, false);
assert.equal(after2.authenticated, false);
});
});
describe("verifyEmail", () => {
it("activates user with valid token", async () => {
const regResult = await service.register(
"test@example.com",
"password123",
);
assert.equal(regResult.success, true);
if (!regResult.success) return;
const result = await service.verifyEmail(
regResult.verificationToken,
);
assert.equal(result.success, true);
// User should now be active and can login
const loginResult = await service.login(
"test@example.com",
"password123",
"cookie",
);
assert.equal(loginResult.success, true);
});
it("fails with invalid token", async () => {
const result = await service.verifyEmail("invalid-token");
assert.equal(result.success, false);
if (!result.success) {
assert.equal(
result.error,
"Invalid or expired verification token",
);
}
});
it("fails when token already used", async () => {
const regResult = await service.register(
"test@example.com",
"password123",
);
assert.equal(regResult.success, true);
if (!regResult.success) return;
// First verification succeeds
const result1 = await service.verifyEmail(
regResult.verificationToken,
);
assert.equal(result1.success, true);
// Second verification fails (token deleted)
const result2 = await service.verifyEmail(
regResult.verificationToken,
);
assert.equal(result2.success, false);
});
});
describe("createPasswordResetToken", () => {
it("returns token for existing user", async () => {
const regResult = await service.register(
"test@example.com",
"password123",
);
assert.equal(regResult.success, true);
const result =
await service.createPasswordResetToken("test@example.com");
assert.notEqual(result, null);
assert.ok(result!.token.length > 0);
});
it("returns null for unknown email", async () => {
const result = await service.createPasswordResetToken(
"unknown@example.com",
);
assert.equal(result, null);
});
});
describe("resetPassword", () => {
it("changes password with valid token", async () => {
const regResult = await service.register(
"test@example.com",
"oldpassword",
);
if (regResult.success) {
await service.verifyEmail(regResult.verificationToken);
}
const resetToken =
await service.createPasswordResetToken("test@example.com");
assert.notEqual(resetToken, null);
const result = await service.resetPassword(
resetToken!.token,
"newpassword",
);
assert.equal(result.success, true);
// Old password should no longer work
const loginOld = await service.login(
"test@example.com",
"oldpassword",
"cookie",
);
assert.equal(loginOld.success, false);
// New password should work
const loginNew = await service.login(
"test@example.com",
"newpassword",
"cookie",
);
assert.equal(loginNew.success, true);
});
it("fails with invalid token", async () => {
const result = await service.resetPassword(
"invalid-token",
"newpassword",
);
assert.equal(result.success, false);
if (!result.success) {
assert.equal(result.error, "Invalid or expired reset token");
}
});
it("invalidates all existing sessions", async () => {
const regResult = await service.register(
"test@example.com",
"password123",
);
if (regResult.success) {
await service.verifyEmail(regResult.verificationToken);
}
// Create a session
const loginResult = await service.login(
"test@example.com",
"password123",
"cookie",
);
assert.equal(loginResult.success, true);
if (!loginResult.success) return;
const sessionToken = loginResult.token;
// Reset password
const resetToken =
await service.createPasswordResetToken("test@example.com");
await service.resetPassword(resetToken!.token, "newpassword");
// Old session should be invalid
const validateResult = await service.validateToken(sessionToken);
assert.equal(validateResult.authenticated, false);
});
});
});


@@ -0,0 +1,321 @@
// Tests for auth/store.ts (InMemoryAuthStore)
// Pure unit tests - no database needed
import assert from "node:assert/strict";
import { after, before, beforeEach, describe, it } from "node:test";
import type { UserId } from "../user";
import { InMemoryAuthStore } from "./store";
import { hashToken } from "./token";
import type { TokenId } from "./types";
describe("InMemoryAuthStore", () => {
let store: InMemoryAuthStore;
beforeEach(() => {
store = new InMemoryAuthStore();
});
describe("createUser", () => {
it("creates a user with pending status", async () => {
const user = await store.createUser({
email: "test@example.com",
passwordHash: "hash123",
displayName: "Test User",
});
assert.equal(user.email, "test@example.com");
assert.equal(user.displayName, "Test User");
assert.equal(user.status, "pending");
});
it("creates a user without displayName", async () => {
const user = await store.createUser({
email: "test@example.com",
passwordHash: "hash123",
});
assert.equal(user.email, "test@example.com");
assert.equal(user.displayName, undefined);
});
it("generates a unique id", async () => {
const user1 = await store.createUser({
email: "test1@example.com",
passwordHash: "hash123",
});
const user2 = await store.createUser({
email: "test2@example.com",
passwordHash: "hash456",
});
assert.notEqual(user1.id, user2.id);
});
});
describe("getUserByEmail", () => {
it("returns user when found", async () => {
await store.createUser({
email: "test@example.com",
passwordHash: "hash123",
});
const user = await store.getUserByEmail("test@example.com");
assert.notEqual(user, null);
assert.equal(user!.email, "test@example.com");
});
it("is case-insensitive", async () => {
await store.createUser({
email: "Test@Example.COM",
passwordHash: "hash123",
});
const user = await store.getUserByEmail("test@example.com");
assert.notEqual(user, null);
});
it("returns null when not found", async () => {
const user = await store.getUserByEmail("notfound@example.com");
assert.equal(user, null);
});
});
describe("getUserById", () => {
it("returns user when found", async () => {
const created = await store.createUser({
email: "test@example.com",
passwordHash: "hash123",
});
const user = await store.getUserById(created.id);
assert.notEqual(user, null);
assert.equal(user!.id, created.id);
});
it("returns null when not found", async () => {
const user = await store.getUserById("nonexistent" as UserId);
assert.equal(user, null);
});
});
describe("getUserPasswordHash", () => {
it("returns hash when found", async () => {
const user = await store.createUser({
email: "test@example.com",
passwordHash: "hash123",
});
const hash = await store.getUserPasswordHash(user.id);
assert.equal(hash, "hash123");
});
it("returns null when not found", async () => {
const hash = await store.getUserPasswordHash(
"nonexistent" as UserId,
);
assert.equal(hash, null);
});
});
describe("setUserPassword", () => {
it("updates password hash", async () => {
const user = await store.createUser({
email: "test@example.com",
passwordHash: "oldhash",
});
await store.setUserPassword(user.id, "newhash");
const hash = await store.getUserPasswordHash(user.id);
assert.equal(hash, "newhash");
});
});
describe("updateUserEmailVerified", () => {
it("sets user status to active", async () => {
const created = await store.createUser({
email: "test@example.com",
passwordHash: "hash123",
});
assert.equal(created.status, "pending");
await store.updateUserEmailVerified(created.id);
const user = await store.getUserById(created.id);
assert.equal(user!.status, "active");
});
});
describe("createSession", () => {
it("creates a session with token", async () => {
const user = await store.createUser({
email: "test@example.com",
passwordHash: "hash123",
});
const { token, session } = await store.createSession({
userId: user.id,
tokenType: "session",
authMethod: "cookie",
expiresAt: new Date(Date.now() + 3600000),
});
assert.ok(token.length > 0);
assert.equal(session.userId, user.id);
assert.equal(session.tokenType, "session");
assert.equal(session.authMethod, "cookie");
});
it("stores metadata", async () => {
const user = await store.createUser({
email: "test@example.com",
passwordHash: "hash123",
});
const { session } = await store.createSession({
userId: user.id,
tokenType: "session",
authMethod: "cookie",
expiresAt: new Date(Date.now() + 3600000),
userAgent: "Mozilla/5.0",
ipAddress: "127.0.0.1",
});
assert.equal(session.userAgent, "Mozilla/5.0");
assert.equal(session.ipAddress, "127.0.0.1");
});
});
describe("getSession", () => {
it("returns session when found and not expired", async () => {
const user = await store.createUser({
email: "test@example.com",
passwordHash: "hash123",
});
const { token } = await store.createSession({
userId: user.id,
tokenType: "session",
authMethod: "cookie",
expiresAt: new Date(Date.now() + 3600000), // 1 hour from now
});
const tokenId = hashToken(token) as TokenId;
const session = await store.getSession(tokenId);
assert.notEqual(session, null);
assert.equal(session!.userId, user.id);
});
it("returns null for expired session", async () => {
const user = await store.createUser({
email: "test@example.com",
passwordHash: "hash123",
});
const { token } = await store.createSession({
userId: user.id,
tokenType: "session",
authMethod: "cookie",
expiresAt: new Date(Date.now() - 1000), // Expired 1 second ago
});
const tokenId = hashToken(token) as TokenId;
const session = await store.getSession(tokenId);
assert.equal(session, null);
});
it("returns null for nonexistent session", async () => {
const session = await store.getSession("nonexistent" as TokenId);
assert.equal(session, null);
});
});
describe("deleteSession", () => {
it("removes the session", async () => {
const user = await store.createUser({
email: "test@example.com",
passwordHash: "hash123",
});
const { token } = await store.createSession({
userId: user.id,
tokenType: "session",
authMethod: "cookie",
expiresAt: new Date(Date.now() + 3600000),
});
const tokenId = hashToken(token) as TokenId;
await store.deleteSession(tokenId);
const session = await store.getSession(tokenId);
assert.equal(session, null);
});
});
describe("deleteUserSessions", () => {
it("removes all sessions for user", async () => {
const user = await store.createUser({
email: "test@example.com",
passwordHash: "hash123",
});
const { token: token1 } = await store.createSession({
userId: user.id,
tokenType: "session",
authMethod: "cookie",
expiresAt: new Date(Date.now() + 3600000),
});
const { token: token2 } = await store.createSession({
userId: user.id,
tokenType: "session",
authMethod: "bearer",
expiresAt: new Date(Date.now() + 3600000),
});
const count = await store.deleteUserSessions(user.id);
assert.equal(count, 2);
const session1 = await store.getSession(
hashToken(token1) as TokenId,
);
const session2 = await store.getSession(
hashToken(token2) as TokenId,
);
assert.equal(session1, null);
assert.equal(session2, null);
});
it("returns 0 when user has no sessions", async () => {
const count = await store.deleteUserSessions(
"nonexistent" as UserId,
);
assert.equal(count, 0);
});
});
describe("updateLastUsed", () => {
it("updates lastUsedAt timestamp", async () => {
const user = await store.createUser({
email: "test@example.com",
passwordHash: "hash123",
});
const { token } = await store.createSession({
userId: user.id,
tokenType: "session",
authMethod: "cookie",
expiresAt: new Date(Date.now() + 3600000),
});
const tokenId = hashToken(token) as TokenId;
const beforeUpdate = await store.getSession(tokenId);
assert.equal(beforeUpdate!.lastUsedAt, undefined);
await store.updateLastUsed(tokenId);
const afterUpdate = await store.getSession(tokenId);
assert.ok(afterUpdate!.lastUsedAt instanceof Date);
});
});
});
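The store behavior pinned down above — expired or deleted sessions read back as null, and `deleteUserSessions` reports how many it removed — can be sketched with a hypothetical in-memory stand-in. The names here are assumptions for illustration, not the real Postgres-backed store:

```typescript
// Hypothetical in-memory stand-in for the session-store contract the
// tests exercise; the real store is Postgres-backed.
interface SessionRecord {
  tokenId: string;
  userId: string;
  expiresAt: Date;
}

class InMemorySessionStore {
  private sessions = new Map<string, SessionRecord>();

  createSession(rec: SessionRecord): void {
    this.sessions.set(rec.tokenId, rec);
  }

  // Expired sessions read back as null, matching the tested behavior.
  getSession(tokenId: string): SessionRecord | null {
    const s = this.sessions.get(tokenId);
    if (!s || s.expiresAt <= new Date()) return null;
    return s;
  }

  deleteSession(tokenId: string): void {
    this.sessions.delete(tokenId);
  }

  // Returns how many sessions were removed, like deleteUserSessions.
  deleteUserSessions(userId: string): number {
    let n = 0;
    for (const [id, s] of this.sessions) {
      if (s.userId === userId) {
        this.sessions.delete(id);
        n++;
      }
    }
    return n;
  }
}
```

The count-returning delete is what lets the tests assert `count === 2` and `count === 0` without re-querying.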


@@ -0,0 +1,94 @@
// Tests for auth/token.ts
// Pure unit tests - no database needed
import assert from "node:assert/strict";
import { describe, it } from "node:test";
import {
generateToken,
hashToken,
parseAuthorizationHeader,
SESSION_COOKIE_NAME,
} from "./token";
describe("token", () => {
describe("generateToken", () => {
it("generates a non-empty string", () => {
const token = generateToken();
assert.equal(typeof token, "string");
assert.ok(token.length > 0);
});
it("generates unique tokens", () => {
const tokens = new Set<string>();
for (let i = 0; i < 100; i++) {
tokens.add(generateToken());
}
assert.equal(tokens.size, 100);
});
it("generates base64url encoded tokens", () => {
const token = generateToken();
// base64url uses A-Z, a-z, 0-9, -, _
assert.match(token, /^[A-Za-z0-9_-]+$/);
});
});
describe("hashToken", () => {
it("returns a hex string", () => {
const hash = hashToken("test-token");
assert.match(hash, /^[a-f0-9]+$/);
});
it("returns consistent hash for same input", () => {
const hash1 = hashToken("test-token");
const hash2 = hashToken("test-token");
assert.equal(hash1, hash2);
});
it("returns different hash for different input", () => {
const hash1 = hashToken("token-1");
const hash2 = hashToken("token-2");
assert.notEqual(hash1, hash2);
});
it("returns 64 character hash (SHA-256)", () => {
const hash = hashToken("test-token");
assert.equal(hash.length, 64);
});
});
describe("parseAuthorizationHeader", () => {
it("returns null for undefined header", () => {
assert.equal(parseAuthorizationHeader(undefined), null);
});
it("returns null for empty string", () => {
assert.equal(parseAuthorizationHeader(""), null);
});
it("returns null for non-bearer auth", () => {
assert.equal(parseAuthorizationHeader("Basic abc123"), null);
});
it("returns null for malformed header", () => {
assert.equal(parseAuthorizationHeader("Bearer"), null);
assert.equal(parseAuthorizationHeader("Bearer token extra"), null);
});
it("extracts token from valid bearer header", () => {
assert.equal(parseAuthorizationHeader("Bearer abc123"), "abc123");
});
it("is case-insensitive for Bearer keyword", () => {
assert.equal(parseAuthorizationHeader("bearer abc123"), "abc123");
assert.equal(parseAuthorizationHeader("BEARER abc123"), "abc123");
});
});
describe("SESSION_COOKIE_NAME", () => {
it("is defined", () => {
assert.equal(typeof SESSION_COOKIE_NAME, "string");
assert.ok(SESSION_COOKIE_NAME.length > 0);
});
});
});
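The contract these tests pin down — base64url tokens, 64-character SHA-256 hex digests, case-insensitive `Bearer` parsing — could be satisfied by something like the following sketch. This is a plausible shape consistent with the assertions, not necessarily the real `token.ts`:

```typescript
import { createHash, randomBytes } from "node:crypto";

// Hypothetical implementation consistent with the tests above.
const SESSION_COOKIE_NAME = "session";

// 32 random bytes, base64url-encoded: only A-Z, a-z, 0-9, -, _
function generateToken(): string {
  return randomBytes(32).toString("base64url");
}

// SHA-256 hex digest: always 64 lowercase hex characters.
function hashToken(token: string): string {
  return createHash("sha256").update(token).digest("hex");
}

// Accepts "Bearer <token>" case-insensitively; anything else is null.
function parseAuthorizationHeader(header: string | undefined): string | null {
  if (!header) return null;
  const parts = header.split(" ");
  if (parts.length !== 2 || parts[0].toLowerCase() !== "bearer") return null;
  return parts[1];
}
```

Hashing the token before storage means the database only ever holds digests; a leaked sessions table cannot be replayed as live tokens.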


@@ -0,0 +1,253 @@
// Tests for auth/types.ts
// Pure unit tests - no database needed
import assert from "node:assert/strict";
import { describe, it } from "node:test";
import { z } from "zod";
import { AuthenticatedUser, anonymousUser } from "../user";
import {
authMethodParser,
forgotPasswordInputParser,
loginInputParser,
registerInputParser,
resetPasswordInputParser,
Session,
sessionDataParser,
tokenLifetimes,
tokenTypeParser,
} from "./types";
describe("auth/types", () => {
describe("tokenTypeParser", () => {
it("accepts valid token types", () => {
assert.equal(tokenTypeParser.parse("session"), "session");
assert.equal(
tokenTypeParser.parse("password_reset"),
"password_reset",
);
assert.equal(tokenTypeParser.parse("email_verify"), "email_verify");
});
it("rejects invalid token types", () => {
assert.throws(() => tokenTypeParser.parse("invalid"));
});
});
describe("authMethodParser", () => {
it("accepts valid auth methods", () => {
assert.equal(authMethodParser.parse("cookie"), "cookie");
assert.equal(authMethodParser.parse("bearer"), "bearer");
});
it("rejects invalid auth methods", () => {
assert.throws(() => authMethodParser.parse("basic"));
});
});
describe("sessionDataParser", () => {
it("accepts valid session data", () => {
const data = {
tokenId: "abc123",
userId: "user-1",
tokenType: "session",
authMethod: "cookie",
createdAt: new Date(),
expiresAt: new Date(),
};
const result = sessionDataParser.parse(data);
assert.equal(result.tokenId, "abc123");
assert.equal(result.userId, "user-1");
});
it("coerces date strings to dates", () => {
const data = {
tokenId: "abc123",
userId: "user-1",
tokenType: "session",
authMethod: "cookie",
createdAt: "2025-01-01T00:00:00Z",
expiresAt: "2025-01-02T00:00:00Z",
};
const result = sessionDataParser.parse(data);
assert.ok(result.createdAt instanceof Date);
assert.ok(result.expiresAt instanceof Date);
});
it("accepts optional fields", () => {
const data = {
tokenId: "abc123",
userId: "user-1",
tokenType: "session",
authMethod: "cookie",
createdAt: new Date(),
expiresAt: new Date(),
lastUsedAt: new Date(),
userAgent: "Mozilla/5.0",
ipAddress: "127.0.0.1",
isUsed: true,
};
const result = sessionDataParser.parse(data);
assert.equal(result.userAgent, "Mozilla/5.0");
assert.equal(result.ipAddress, "127.0.0.1");
assert.equal(result.isUsed, true);
});
});
describe("loginInputParser", () => {
it("accepts valid login input", () => {
const result = loginInputParser.parse({
email: "test@example.com",
password: "secret",
});
assert.equal(result.email, "test@example.com");
assert.equal(result.password, "secret");
});
it("rejects invalid email", () => {
assert.throws(() =>
loginInputParser.parse({
email: "not-an-email",
password: "secret",
}),
);
});
it("rejects empty password", () => {
assert.throws(() =>
loginInputParser.parse({
email: "test@example.com",
password: "",
}),
);
});
});
describe("registerInputParser", () => {
it("accepts valid registration input", () => {
const result = registerInputParser.parse({
email: "test@example.com",
password: "password123",
displayName: "Test User",
});
assert.equal(result.email, "test@example.com");
assert.equal(result.password, "password123");
assert.equal(result.displayName, "Test User");
});
it("accepts registration without displayName", () => {
const result = registerInputParser.parse({
email: "test@example.com",
password: "password123",
});
assert.equal(result.displayName, undefined);
});
it("rejects password shorter than 8 characters", () => {
assert.throws(() =>
registerInputParser.parse({
email: "test@example.com",
password: "short",
}),
);
});
});
describe("forgotPasswordInputParser", () => {
it("accepts valid email", () => {
const result = forgotPasswordInputParser.parse({
email: "test@example.com",
});
assert.equal(result.email, "test@example.com");
});
it("rejects invalid email", () => {
assert.throws(() =>
forgotPasswordInputParser.parse({
email: "invalid",
}),
);
});
});
describe("resetPasswordInputParser", () => {
it("accepts valid reset input", () => {
const result = resetPasswordInputParser.parse({
token: "abc123",
password: "newpassword",
});
assert.equal(result.token, "abc123");
assert.equal(result.password, "newpassword");
});
it("rejects empty token", () => {
assert.throws(() =>
resetPasswordInputParser.parse({
token: "",
password: "newpassword",
}),
);
});
it("rejects password shorter than 8 characters", () => {
assert.throws(() =>
resetPasswordInputParser.parse({
token: "abc123",
password: "short",
}),
);
});
});
describe("tokenLifetimes", () => {
it("defines session lifetime", () => {
assert.ok(tokenLifetimes.session > 0);
// 30 days in ms
assert.equal(tokenLifetimes.session, 30 * 24 * 60 * 60 * 1000);
});
it("defines password_reset lifetime", () => {
assert.ok(tokenLifetimes.password_reset > 0);
// 1 hour in ms
assert.equal(tokenLifetimes.password_reset, 1 * 60 * 60 * 1000);
});
it("defines email_verify lifetime", () => {
assert.ok(tokenLifetimes.email_verify > 0);
// 24 hours in ms
assert.equal(tokenLifetimes.email_verify, 24 * 60 * 60 * 1000);
});
});
describe("Session", () => {
it("wraps authenticated session", () => {
const user = AuthenticatedUser.create("test@example.com", {
id: "user-1",
});
const sessionData = {
tokenId: "token-1",
userId: "user-1",
tokenType: "session" as const,
authMethod: "cookie" as const,
createdAt: new Date(),
expiresAt: new Date(),
};
const session = new Session(sessionData, user);
assert.equal(session.isAuthenticated(), true);
assert.equal(session.getUser(), user);
assert.equal(session.getData(), sessionData);
assert.equal(session.tokenId, "token-1");
assert.equal(session.userId, "user-1");
});
it("wraps anonymous session", () => {
const session = new Session(null, anonymousUser);
assert.equal(session.isAuthenticated(), false);
assert.equal(session.getUser(), anonymousUser);
assert.equal(session.getData(), null);
assert.equal(session.tokenId, undefined);
assert.equal(session.userId, undefined);
});
});
});
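The parsers above are zod schemas; the constraints the tests encode reduce to a few rules (valid email, password of at least 8 characters, non-empty token) plus fixed token lifetimes. A zod-free sketch of those rules — the helper name and regex are invented for illustration:

```typescript
// Rough email shape check for illustration only; zod's email
// validation is stricter than this regex.
const EMAIL_RE = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;

function isValidRegisterInput(input: {
  email: string;
  password: string;
  displayName?: string;
}): boolean {
  // Mirrors the tested constraints: valid email, password >= 8 chars,
  // displayName optional.
  return EMAIL_RE.test(input.email) && input.password.length >= 8;
}

// Lifetimes matching the tested values, in milliseconds.
const tokenLifetimes = {
  session: 30 * 24 * 60 * 60 * 1000, // 30 days
  password_reset: 1 * 60 * 60 * 1000, // 1 hour
  email_verify: 24 * 60 * 60 * 1000, // 24 hours
} as const;
```

Spelling lifetimes out as arithmetic (`30 * 24 * 60 * 60 * 1000`) rather than magic constants is what the lifetime tests verify term by term.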


@@ -0,0 +1,24 @@
// Tests for basic/login.ts
// These tests verify the route structure and export
import assert from "node:assert/strict";
import { describe, it } from "node:test";
import { loginRoute } from "./login";
describe("basic/login", () => {
describe("loginRoute", () => {
it("has correct path", () => {
assert.equal(loginRoute.path, "/login");
});
it("handles GET and POST methods", () => {
assert.ok(loginRoute.methods.includes("GET"));
assert.ok(loginRoute.methods.includes("POST"));
assert.equal(loginRoute.methods.length, 2);
});
it("has a handler function", () => {
assert.equal(typeof loginRoute.handler, "function");
});
});
});


@@ -0,0 +1,24 @@
// Tests for basic/logout.ts
// These tests verify the route structure and export
import assert from "node:assert/strict";
import { describe, it } from "node:test";
import { logoutRoute } from "./logout";
describe("basic/logout", () => {
describe("logoutRoute", () => {
it("has correct path", () => {
assert.equal(logoutRoute.path, "/logout");
});
it("handles GET and POST methods", () => {
assert.ok(logoutRoute.methods.includes("GET"));
assert.ok(logoutRoute.methods.includes("POST"));
assert.equal(logoutRoute.methods.length, 2);
});
it("has a handler function", () => {
assert.equal(typeof logoutRoute.handler, "function");
});
});
});


@@ -0,0 +1,73 @@
// Tests for basic/routes.ts
// These tests verify the route structure and exports
import assert from "node:assert/strict";
import { describe, it } from "node:test";
import { routes } from "./routes";
describe("basic/routes", () => {
describe("routes object", () => {
it("exports routes as an object", () => {
assert.equal(typeof routes, "object");
});
it("contains hello route", () => {
assert.ok("hello" in routes);
assert.equal(routes.hello.path, "/hello");
assert.ok(routes.hello.methods.includes("GET"));
});
it("contains home route", () => {
assert.ok("home" in routes);
assert.equal(routes.home.path, "/");
assert.ok(routes.home.methods.includes("GET"));
});
it("contains login route", () => {
assert.ok("login" in routes);
assert.equal(routes.login.path, "/login");
});
it("contains logout route", () => {
assert.ok("logout" in routes);
assert.equal(routes.logout.path, "/logout");
});
it("all routes have handlers", () => {
for (const [name, route] of Object.entries(routes)) {
assert.equal(
typeof route.handler,
"function",
`Route ${name} should have a handler function`,
);
}
});
it("all routes have methods array", () => {
for (const [name, route] of Object.entries(routes)) {
assert.ok(
Array.isArray(route.methods),
`Route ${name} should have methods array`,
);
assert.ok(
route.methods.length > 0,
`Route ${name} should have at least one method`,
);
}
});
it("all routes have path string", () => {
for (const [name, route] of Object.entries(routes)) {
assert.equal(
typeof route.path,
"string",
`Route ${name} should have a path string`,
);
assert.ok(
route.path.startsWith("/"),
`Route ${name} path should start with /`,
);
}
});
});
});
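The invariants checked across routes — a path starting with `/`, a non-empty methods array, a handler function — imply a route shape roughly like the one below. The field names come straight from the tests, but the real `Route` type in `../types` may carry more:

```typescript
// Minimal route shape implied by the tests (illustrative only).
type HttpMethod = "GET" | "POST" | "PUT" | "DELETE";

interface Route {
  path: string; // must start with "/"
  methods: HttpMethod[]; // at least one entry
  handler: (...args: unknown[]) => unknown;
}

const routes: Record<string, Route> = {
  hello: { path: "/hello", methods: ["GET"], handler: () => "hello" },
  login: { path: "/login", methods: ["GET", "POST"], handler: () => null },
};

// The same invariant sweep the tests perform over every route:
for (const [name, route] of Object.entries(routes)) {
  if (!route.path.startsWith("/")) throw new Error(`${name}: bad path`);
  if (route.methods.length === 0) throw new Error(`${name}: no methods`);
  if (typeof route.handler !== "function") throw new Error(`${name}: no handler`);
}
```

Sweeping `Object.entries(routes)` in one loop, as the tests do, means a newly registered route is covered by the structural checks without writing a new test case.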


@@ -1,4 +1,5 @@
 import { DateTime } from "ts-luxon";
+import { get, User } from "../hydrators/user";
 import { request } from "../request";
 import { html, render } from "../request/util";
 import type { Call, Result, Route } from "../types";
@@ -23,11 +24,18 @@ const routes: Record<string, Route> = {
     const _auth = request.auth;
     const me = request.session.getUser();
-    const email = me.toString();
+    const id = me.id;
+    console.log(`*** id: ${id}`);
+    const u = await get(id);
+    const email = u?.email || "anonymous@example.com";
+    const name = u?.display_name || "anonymous";
     const showLogin = me.isAnonymous();
     const showLogout = !me.isAnonymous();
     const c = await render("basic/home", {
+      name,
       email,
       showLogin,
       showLogout,


@@ -0,0 +1,285 @@
// Tests for database.ts
// Requires test PostgreSQL: docker compose -f docker-compose.test.yml up -d
import assert from "node:assert/strict";
import { after, before, beforeEach, describe, it } from "node:test";
import {
connectionConfig,
db,
migrate,
migrationStatus,
PostgresAuthStore,
pool,
raw,
rawPool,
} from "./database";
import type { UserId } from "./user";
describe("database", () => {
before(async () => {
// Run migrations to set up schema
await migrate();
});
after(async () => {
await pool.end();
});
describe("connectionConfig", () => {
it("has required fields", () => {
assert.ok("host" in connectionConfig);
assert.ok("port" in connectionConfig);
assert.ok("user" in connectionConfig);
assert.ok("password" in connectionConfig);
assert.ok("database" in connectionConfig);
});
it("port is a number", () => {
assert.equal(typeof connectionConfig.port, "number");
});
});
describe("raw", () => {
it("executes raw SQL queries", async () => {
const result = await raw<{ one: number }>("SELECT 1 as one");
assert.equal(result.length, 1);
assert.equal(result[0].one, 1);
});
it("supports parameterized queries", async () => {
const result = await raw<{ sum: number }>(
"SELECT $1::int + $2::int as sum",
[2, 3],
);
assert.equal(result[0].sum, 5);
});
});
describe("db (Kysely instance)", () => {
it("can execute SELECT queries", async () => {
const result = await db
.selectFrom("users")
.select("id")
.limit(1)
.execute();
// May be empty, just verify it runs
assert.ok(Array.isArray(result));
});
});
describe("rawPool", () => {
it("is a pg Pool instance", () => {
assert.ok(rawPool.query !== undefined);
});
it("can execute queries", async () => {
const result = await rawPool.query("SELECT 1 as one");
assert.equal(result.rows[0].one, 1);
});
});
describe("migrate", () => {
it("runs without error when migrations are up to date", async () => {
// Should not throw
await migrate();
});
});
describe("migrationStatus", () => {
it("returns applied and pending arrays", async () => {
const status = await migrationStatus();
assert.ok(Array.isArray(status.applied));
assert.ok(Array.isArray(status.pending));
});
it("shows framework migrations as applied", async () => {
const status = await migrationStatus();
// At least the users migration should be applied
const hasUsersMigration = status.applied.some((m) =>
m.includes("users"),
);
assert.ok(hasUsersMigration);
});
});
describe("PostgresAuthStore", () => {
let store: PostgresAuthStore;
before(() => {
store = new PostgresAuthStore();
});
beforeEach(async () => {
// Clean up test data before each test
await rawPool.query("DELETE FROM sessions");
await rawPool.query("DELETE FROM user_credentials");
await rawPool.query("DELETE FROM user_emails");
await rawPool.query("DELETE FROM users");
});
describe("createUser", () => {
it("creates a user with pending status", async () => {
const user = await store.createUser({
email: "test@example.com",
passwordHash: "hash123",
displayName: "Test User",
});
assert.equal(user.email, "test@example.com");
assert.equal(user.displayName, "Test User");
assert.equal(user.status, "pending");
});
it("stores the password hash", async () => {
const user = await store.createUser({
email: "test@example.com",
passwordHash: "secrethash",
});
const hash = await store.getUserPasswordHash(user.id);
assert.equal(hash, "secrethash");
});
});
describe("getUserByEmail", () => {
it("returns user when found", async () => {
await store.createUser({
email: "find@example.com",
passwordHash: "hash",
});
const user = await store.getUserByEmail("find@example.com");
assert.notEqual(user, null);
assert.equal(user!.email, "find@example.com");
});
it("is case-insensitive", async () => {
await store.createUser({
email: "UPPER@EXAMPLE.COM",
passwordHash: "hash",
});
const user = await store.getUserByEmail("upper@example.com");
assert.notEqual(user, null);
});
it("returns null when not found", async () => {
const user = await store.getUserByEmail("notfound@example.com");
assert.equal(user, null);
});
});
describe("getUserById", () => {
it("returns user when found", async () => {
const created = await store.createUser({
email: "test@example.com",
passwordHash: "hash",
});
const user = await store.getUserById(created.id);
assert.notEqual(user, null);
assert.equal(user!.id, created.id);
});
it("returns null when not found", async () => {
const user = await store.getUserById(
"00000000-0000-0000-0000-000000000000" as UserId,
);
assert.equal(user, null);
});
});
describe("setUserPassword", () => {
it("updates the password hash", async () => {
const user = await store.createUser({
email: "test@example.com",
passwordHash: "oldhash",
});
await store.setUserPassword(user.id, "newhash");
const hash = await store.getUserPasswordHash(user.id);
assert.equal(hash, "newhash");
});
});
describe("updateUserEmailVerified", () => {
it("sets user status to active", async () => {
const created = await store.createUser({
email: "test@example.com",
passwordHash: "hash",
});
assert.equal(created.status, "pending");
await store.updateUserEmailVerified(created.id);
const user = await store.getUserById(created.id);
assert.equal(user!.status, "active");
});
});
describe("session operations", () => {
let userId: UserId;
beforeEach(async () => {
const user = await store.createUser({
email: "session@example.com",
passwordHash: "hash",
});
userId = user.id;
});
it("creates and retrieves sessions", async () => {
const { token, session } = await store.createSession({
userId,
tokenType: "session",
authMethod: "cookie",
expiresAt: new Date(Date.now() + 3600000),
});
assert.ok(token.length > 0);
assert.equal(session.userId, userId);
assert.equal(session.tokenType, "session");
});
it("deletes sessions", async () => {
const { session } = await store.createSession({
userId,
tokenType: "session",
authMethod: "cookie",
expiresAt: new Date(Date.now() + 3600000),
});
await store.deleteSession(session.tokenId as any);
// Session should be soft-deleted (revoked)
const retrieved = await store.getSession(
session.tokenId as any,
);
assert.equal(retrieved, null);
});
it("deletes all user sessions", async () => {
await store.createSession({
userId,
tokenType: "session",
authMethod: "cookie",
expiresAt: new Date(Date.now() + 3600000),
});
await store.createSession({
userId,
tokenType: "session",
authMethod: "bearer",
expiresAt: new Date(Date.now() + 3600000),
});
const count = await store.deleteUserSessions(userId);
assert.equal(count, 2);
});
});
});
});
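The comment in the deleteSession test ("soft-deleted (revoked)") names the pattern the store relies on: deletion marks `revoked_at` rather than removing the row, and every read filters revoked rows out. An in-memory sketch of that pattern, with hypothetical names:

```typescript
// Soft delete: "removing" a session sets revoked_at; reads exclude
// revoked rows, so a revoked session is indistinguishable from a
// deleted one to callers, but the row survives for auditing.
interface SessionRow {
  tokenHash: string;
  userId: string;
  revokedAt: Date | null;
}

const sessionRows: SessionRow[] = [
  { tokenHash: "t1", userId: "u1", revokedAt: null },
  { tokenHash: "t2", userId: "u1", revokedAt: null },
];

// Revokes only rows not already revoked, and returns the count —
// analogous to an UPDATE ... WHERE revoked_at IS NULL row count.
function revokeUserSessions(userId: string): number {
  let revoked = 0;
  for (const row of sessionRows) {
    if (row.userId === userId && row.revokedAt === null) {
      row.revokedAt = new Date();
      revoked++;
    }
  }
  return revoked;
}

// Reads filter on revoked_at, hiding soft-deleted rows.
function getSession(tokenHash: string): SessionRow | null {
  return (
    sessionRows.find(
      (r) => r.tokenHash === tokenHash && r.revokedAt === null,
    ) ?? null
  );
}
```

The `revokedAt === null` guard in the update is what makes repeated revocation report 0, matching the "returns 0 when user has no sessions" expectation.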


@@ -18,47 +18,68 @@ import type {
 } from "./auth/store";
 import { generateToken, hashToken } from "./auth/token";
 import type { SessionData, TokenId } from "./auth/types";
+import type { Domain } from "./types";
 import { AuthenticatedUser, type User, type UserId } from "./user";
 
-// Connection configuration
+// Connection configuration (supports environment variable overrides)
 const connectionConfig = {
-  host: "localhost",
-  port: 5432,
-  user: "diachron",
-  password: "diachron",
-  database: "diachron",
+  host: process.env.DB_HOST ?? "localhost",
+  port: Number(process.env.DB_PORT ?? 5432),
+  user: process.env.DB_USER ?? "diachron",
+  password: process.env.DB_PASSWORD ?? "diachron",
+  database: process.env.DB_NAME ?? "diachron",
 };
 
 // Database schema types for Kysely
 // Generated<T> marks columns with database defaults (optional on insert)
 interface UsersTable {
   id: string;
-  email: string;
-  password_hash: string;
-  display_name: string | null;
   status: Generated<string>;
-  roles: Generated<string[]>;
-  permissions: Generated<string[]>;
-  email_verified: Generated<boolean>;
+  display_name: string | null;
   created_at: Generated<Date>;
   updated_at: Generated<Date>;
 }
 
+interface UserEmailsTable {
+  id: string;
+  user_id: string;
+  email: string;
+  normalized_email: string;
+  is_primary: Generated<boolean>;
+  is_verified: Generated<boolean>;
+  created_at: Generated<Date>;
+  verified_at: Date | null;
+  revoked_at: Date | null;
+}
+
+interface UserCredentialsTable {
+  id: string;
+  user_id: string;
+  credential_type: Generated<string>;
+  password_hash: string | null;
+  created_at: Generated<Date>;
+  updated_at: Generated<Date>;
+}
 
 interface SessionsTable {
-  token_id: string;
+  id: Generated<string>;
+  token_hash: string;
   user_id: string;
+  user_email_id: string | null;
   token_type: string;
   auth_method: string;
   created_at: Generated<Date>;
   expires_at: Date;
-  last_used_at: Date | null;
-  user_agent: string | null;
+  revoked_at: Date | null;
   ip_address: string | null;
+  user_agent: string | null;
   is_used: Generated<boolean | null>;
 }
 
 interface Database {
   users: UsersTable;
+  user_emails: UserEmailsTable;
+  user_credentials: UserCredentialsTable;
   sessions: SessionsTable;
 }
@@ -87,12 +108,13 @@ async function raw<T = unknown>(
 // ============================================================================
 
 // Migration file naming convention:
-//   NNNN_description.sql
-// e.g., 0001_initial.sql, 0002_add_users.sql
+//   yyyy-mm-dd_ss_description.sql
+// e.g., 2025-01-15_01_initial.sql, 2025-01-15_02_add_users.sql
 //
 // Migrations directory: express/migrations/
 
-const MIGRATIONS_DIR = path.join(__dirname, "migrations");
+const FRAMEWORK_MIGRATIONS_DIR = path.join(__dirname, "diachron/migrations");
+const APP_MIGRATIONS_DIR = path.join(__dirname, "migrations");
 const MIGRATIONS_TABLE = "_migrations";
 
 interface MigrationRecord {
@@ -121,22 +143,34 @@ async function getAppliedMigrations(): Promise<string[]> {
 }
 
 // Get pending migration files
-function getMigrationFiles(): string[] {
-  if (!fs.existsSync(MIGRATIONS_DIR)) {
+function getMigrationFiles(kind: Domain): string[] {
+  const dir = kind === "fw" ? FRAMEWORK_MIGRATIONS_DIR : APP_MIGRATIONS_DIR;
+  if (!fs.existsSync(dir)) {
     return [];
   }
-  return fs
-    .readdirSync(MIGRATIONS_DIR)
+  const root = __dirname;
+  return fs
+    .readdirSync(dir)
     .filter((f) => f.endsWith(".sql"))
-    .filter((f) => /^\d{4}_/.test(f))
+    .filter((f) => /^\d{4}-\d{2}-\d{2}_\d{2}_/.test(f))
+    .map((f) => `${dir}/${f}`)
+    .map((f) => f.replace(`${root}/`, ""))
     .sort();
 }
 
 // Run a single migration
 async function runMigration(filename: string): Promise<void> {
-  const filepath = path.join(MIGRATIONS_DIR, filename);
+  // filename is already relative to __dirname
+  const filepath = filename;
   const content = fs.readFileSync(filepath, "utf-8");
+  process.stdout.write(`  Migration: $(unknown)...`);
 
   // Run migration in a transaction
   const client = await pool.connect();
   try {
@@ -147,8 +181,11 @@ async function runMigration(filename: string): Promise<void> {
       [filename],
     );
     await client.query("COMMIT");
-    console.log(`Applied migration: $(unknown)`);
+    console.log(" ✓");
   } catch (err) {
+    console.log(" ✗");
+    const message = err instanceof Error ? err.message : String(err);
+    console.error(`  Error: ${message}`);
     await client.query("ROLLBACK");
     throw err;
   } finally {
@@ -156,24 +193,31 @@ async function runMigration(filename: string): Promise<void> {
   }
 }
 
+function getAllMigrationFiles() {
+  const fwFiles = getMigrationFiles("fw");
+  const appFiles = getMigrationFiles("app");
+  return [...fwFiles, ...appFiles];
+}
+
 // Run all pending migrations
 async function migrate(): Promise<void> {
   await ensureMigrationsTable();
   const applied = new Set(await getAppliedMigrations());
-  const files = getMigrationFiles();
-  const pending = files.filter((f) => !applied.has(f));
+  const all = getAllMigrationFiles();
+  const pending = all.filter((f) => !applied.has(f));
   if (pending.length === 0) {
     console.log("No pending migrations");
     return;
   }
-  console.log(`Running ${pending.length} migration(s)...`);
+  console.log(`Applying ${pending.length} migration(s):`);
   for (const file of pending) {
     await runMigration(file);
   }
+  console.log("Migrations complete");
 }
 
 // List migration status
@@ -183,10 +227,10 @@ async function migrationStatus(): Promise<{
 }> {
   await ensureMigrationsTable();
   const applied = new Set(await getAppliedMigrations());
-  const files = getMigrationFiles();
+  const ff = getAllMigrationFiles();
   return {
-    applied: files.filter((f) => applied.has(f)),
-    pending: files.filter((f) => !applied.has(f)),
+    applied: ff.filter((f) => applied.has(f)),
+    pending: ff.filter((f) => !applied.has(f)),
   };
 }
@@ -201,12 +245,12 @@ class PostgresAuthStore implements AuthStore {
     data: CreateSessionData,
   ): Promise<{ token: string; session: SessionData }> {
     const token = generateToken();
-    const tokenId = hashToken(token);
+    const tokenHash = hashToken(token);
     const row = await db
       .insertInto("sessions")
       .values({
-        token_id: tokenId,
+        token_hash: tokenHash,
         user_id: data.userId,
         token_type: data.tokenType,
         auth_method: data.authMethod,
@@ -218,13 +262,12 @@ class PostgresAuthStore implements AuthStore {
       .executeTakeFirstOrThrow();
 
     const session: SessionData = {
-      tokenId: row.token_id,
+      tokenId: row.token_hash,
       userId: row.user_id,
       tokenType: row.token_type as SessionData["tokenType"],
       authMethod: row.auth_method as SessionData["authMethod"],
       createdAt: row.created_at,
       expiresAt: row.expires_at,
-      lastUsedAt: row.last_used_at ?? undefined,
       userAgent: row.user_agent ?? undefined,
       ipAddress: row.ip_address ?? undefined,
       isUsed: row.is_used ?? undefined,
@@ -237,8 +280,9 @@ class PostgresAuthStore implements AuthStore {
     const row = await db
       .selectFrom("sessions")
       .selectAll()
-      .where("token_id", "=", tokenId)
+      .where("token_hash", "=", tokenId)
       .where("expires_at", ">", new Date())
+      .where("revoked_at", "is", null)
       .executeTakeFirst();
 
     if (!row) {
@@ -246,50 +290,62 @@ class PostgresAuthStore implements AuthStore {
     }
 
     return {
-      tokenId: row.token_id,
+      tokenId: row.token_hash,
       userId: row.user_id,
       tokenType: row.token_type as SessionData["tokenType"],
       authMethod: row.auth_method as SessionData["authMethod"],
       createdAt: row.created_at,
       expiresAt: row.expires_at,
-      lastUsedAt: row.last_used_at ?? undefined,
       userAgent: row.user_agent ?? undefined,
       ipAddress: row.ip_address ?? undefined,
      isUsed: row.is_used ?? undefined,
     };
   }
 
-  async updateLastUsed(tokenId: TokenId): Promise<void> {
-    await db
-      .updateTable("sessions")
-      .set({ last_used_at: new Date() })
-      .where("token_id", "=", tokenId)
-      .execute();
+  async updateLastUsed(_tokenId: TokenId): Promise<void> {
+    // The new schema doesn't have a last_used_at column.
+    // This is now a no-op; session activity tracking could be added later.
   }
 
   async deleteSession(tokenId: TokenId): Promise<void> {
+    // Soft delete by setting revoked_at
     await db
-      .deleteFrom("sessions")
-      .where("token_id", "=", tokenId)
+      .updateTable("sessions")
+      .set({ revoked_at: new Date() })
+      .where("token_hash", "=", tokenId)
       .execute();
   }
 
   async deleteUserSessions(userId: UserId): Promise<number> {
     const result = await db
-      .deleteFrom("sessions")
+      .updateTable("sessions")
+      .set({ revoked_at: new Date() })
       .where("user_id", "=", userId)
+      .where("revoked_at", "is", null)
       .executeTakeFirst();
-    return Number(result.numDeletedRows);
+    return Number(result.numUpdatedRows);
   }
 
   // User operations
   async getUserByEmail(email: string): Promise<User | null> {
+    // Find user through the user_emails table
+    const normalizedEmail = email.toLowerCase().trim();
     const row = await db
-      .selectFrom("users")
-      .selectAll()
-      .where(sql`LOWER(email)`, "=", email.toLowerCase())
+      .selectFrom("user_emails")
+      .innerJoin("users", "users.id", "user_emails.user_id")
+      .select([
+        "users.id",
+        "users.status",
+        "users.display_name",
+        "users.created_at",
+        "users.updated_at",
+        "user_emails.email",
+      ])
+      .where("user_emails.normalized_email", "=", normalizedEmail)
+      .where("user_emails.revoked_at", "is", null)
       .executeTakeFirst();
 
     if (!row) {
@@ -299,10 +355,24 @@ class PostgresAuthStore implements AuthStore {
   }
 
   async getUserById(userId: UserId): Promise<User | null> {
+    // Get the user with their primary email
     const row = await db
       .selectFrom("users")
-      .selectAll()
-      .where("id", "=", userId)
+      .leftJoin("user_emails", (join) =>
+        join
+          .onRef("user_emails.user_id", "=", "users.id")
+          .on("user_emails.is_primary", "=", true)
+          .on("user_emails.revoked_at", "is", null),
+      )
+      .select([
+        "users.id",
+        "users.status",
+        "users.display_name",
+        "users.created_at",
+        "users.updated_at",
+        "user_emails.email",
+      ])
+      .where("users.id", "=", userId)
       .executeTakeFirst();
 
     if (!row) {
@@ -312,68 +382,149 @@ class PostgresAuthStore implements AuthStore {
   }
 
   async createUser(data: CreateUserData): Promise<User> {
-    const id = crypto.randomUUID();
+    const userId = crypto.randomUUID();
+    const emailId = crypto.randomUUID();
+    const credentialId = crypto.randomUUID();
     const now = new Date();
+    const normalizedEmail = data.email.toLowerCase().trim();
 
-    const row = await db
+    // Create user record
+    await db
       .insertInto("users")
       .values({
-        id,
-        email: data.email,
-        password_hash: data.passwordHash,
+        id: userId,
         display_name: data.displayName ?? null,
         status: "pending",
-        roles: [],
-        permissions: [],
-        email_verified: false,
         created_at: now,
         updated_at: now,
       })
-      .returningAll()
-      .executeTakeFirstOrThrow();
+      .execute();
 
-    return this.rowToUser(row);
+    // Create user_email record
+    await db
+      .insertInto("user_emails")
+      .values({
+        id: emailId,
+        user_id: userId,
+        email: data.email,
+        normalized_email: normalizedEmail,
+        is_primary: true,
+        is_verified: false,
+        created_at: now,
+      })
+      .execute();
+
+    // Create user_credential record
+    await db
+      .insertInto("user_credentials")
+      .values({
+        id: credentialId,
+        user_id: userId,
+        credential_type: "password",
+        password_hash: data.passwordHash,
+        created_at: now,
+        updated_at: now,
+      })
+      .execute();
+
+    return new AuthenticatedUser({
+      id: userId,
+      email: data.email,
+      displayName: data.displayName,
+      status: "pending",
+      roles: [],
+      permissions: [],
+      createdAt: now,
+      updatedAt: now,
    });
   }
 
   async getUserPasswordHash(userId: UserId): Promise<string | null> {
     const row = await db
-      .selectFrom("users")
+      .selectFrom("user_credentials")
       .select("password_hash")
.where("id", "=", userId) .where("user_id", "=", userId)
.where("credential_type", "=", "password")
.executeTakeFirst(); .executeTakeFirst();
return row?.password_hash ?? null; return row?.password_hash ?? null;
} }
async setUserPassword(userId: UserId, passwordHash: string): Promise<void> { async setUserPassword(userId: UserId, passwordHash: string): Promise<void> {
const now = new Date();
// Try to update existing credential
const result = await db
.updateTable("user_credentials")
.set({ password_hash: passwordHash, updated_at: now })
.where("user_id", "=", userId)
.where("credential_type", "=", "password")
.executeTakeFirst();
// If no existing credential, create one
if (Number(result.numUpdatedRows) === 0) {
await db
.insertInto("user_credentials")
.values({
id: crypto.randomUUID(),
user_id: userId,
credential_type: "password",
password_hash: passwordHash,
created_at: now,
updated_at: now,
})
.execute();
}
// Update user's updated_at
await db await db
.updateTable("users") .updateTable("users")
.set({ password_hash: passwordHash, updated_at: new Date() }) .set({ updated_at: now })
.where("id", "=", userId) .where("id", "=", userId)
.execute(); .execute();
} }
async updateUserEmailVerified(userId: UserId): Promise<void> { async updateUserEmailVerified(userId: UserId): Promise<void> {
const now = new Date();
// Update user_emails to mark as verified
await db
.updateTable("user_emails")
.set({
is_verified: true,
verified_at: now,
})
.where("user_id", "=", userId)
.where("is_primary", "=", true)
.execute();
// Update user status to active
await db await db
.updateTable("users") .updateTable("users")
.set({ .set({
email_verified: true,
status: "active", status: "active",
updated_at: new Date(), updated_at: now,
}) })
.where("id", "=", userId) .where("id", "=", userId)
.execute(); .execute();
} }
// Helper to convert database row to User object // Helper to convert database row to User object
private rowToUser(row: Selectable<UsersTable>): User { private rowToUser(row: {
id: string;
status: string;
display_name: string | null;
created_at: Date;
updated_at: Date;
email: string | null;
}): User {
return new AuthenticatedUser({ return new AuthenticatedUser({
id: row.id, id: row.id,
email: row.email, email: row.email ?? "unknown@example.com",
displayName: row.display_name ?? undefined, displayName: row.display_name ?? undefined,
status: row.status as "active" | "suspended" | "pending", status: row.status as "active" | "suspended" | "pending",
roles: row.roles, roles: [], // TODO: query from RBAC tables
permissions: row.permissions, permissions: [], // TODO: query from RBAC tables
createdAt: row.created_at, createdAt: row.created_at,
updatedAt: row.updated_at, updatedAt: row.updated_at,
}); });
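
The new code path derives `normalized_email` by lowercasing and trimming in several places (`getUserByEmail`, `createUser`, and the test setup below). A minimal standalone sketch of that rule — the helper name `normalizeEmail` is hypothetical, not something the diff introduces:

```typescript
// Hypothetical helper mirroring the inline normalization used above
// (email.toLowerCase().trim()); not part of the committed code.
function normalizeEmail(email: string): string {
  return email.toLowerCase().trim();
}

console.log(normalizeEmail("  Alice@Example.COM "));
// alice@example.com
```

Centralizing this in one helper would keep the application code and the `normalized_email` column from drifting apart.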


@@ -0,0 +1,17 @@
import { pool } from "../database";
import { dropTables, exitIfUnforced } from "./util";
async function main(): Promise<void> {
exitIfUnforced();
try {
await dropTables();
} finally {
await pool.end();
}
}
main().catch((err) => {
console.error("Failed to clear database:", err.message);
process.exit(1);
});


@@ -0,0 +1,26 @@
// reset-db.ts
// Development command to wipe the database and apply all migrations from scratch
import { connectionConfig, migrate, pool } from "../database";
import { dropTables, exitIfUnforced } from "./util";
async function main(): Promise<void> {
exitIfUnforced();
try {
await dropTables();
console.log("");
await migrate();
console.log("");
console.log("Database reset complete.");
} finally {
await pool.end();
}
}
main().catch((err) => {
console.error("Failed to reset database:", err.message);
process.exit(1);
});


@@ -0,0 +1,42 @@
// FIXME: this is at the wrong level of specificity
import { connectionConfig, pool } from "../database";
const exitIfUnforced = () => {
const args = process.argv.slice(2);
// Require explicit confirmation unless --force is passed
if (!args.includes("--force")) {
console.error("This will DROP ALL TABLES in the database!");
console.error(` Database: ${connectionConfig.database}`);
console.error(
` Host: ${connectionConfig.host}:${connectionConfig.port}`,
);
console.error("");
console.error("Run with --force to proceed.");
process.exit(1);
}
};
const dropTables = async () => {
console.log("Dropping all tables...");
// Get all table names in the public schema
const result = await pool.query<{ tablename: string }>(`
SELECT tablename FROM pg_tables
WHERE schemaname = 'public'
`);
if (result.rows.length > 0) {
// Drop all tables with CASCADE to handle foreign key constraints
const tableNames = result.rows
.map((r) => `"${r.tablename}"`)
.join(", ");
await pool.query(`DROP TABLE IF EXISTS ${tableNames} CASCADE`);
console.log(`Dropped ${result.rows.length} table(s)`);
} else {
console.log("No tables to drop");
}
};
export { dropTables, exitIfUnforced };
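
`dropTables` above assembles a single `DROP TABLE ... CASCADE` statement from quoted table names. That string assembly can be sketched in isolation (the `dropStatement` helper is hypothetical; no database needed):

```typescript
// Hypothetical: mirrors the quoting and joining dropTables performs
// before issuing the statement. Double-quoting preserves case-sensitive
// or reserved-word table names in Postgres.
function dropStatement(tables: string[]): string {
  const names = tables.map((t) => `"${t}"`).join(", ");
  return `DROP TABLE IF EXISTS ${names} CASCADE`;
}

console.log(dropStatement(["users", "user_emails"]));
// DROP TABLE IF EXISTS "users", "user_emails" CASCADE
```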


@@ -0,0 +1,71 @@
// Tests for handlers.ts
// These tests use mock Call objects
import assert from "node:assert/strict";
import { describe, it } from "node:test";
import type { Request as ExpressRequest } from "express";
import { Session } from "./auth/types";
import { contentTypes } from "./content-types";
import { multiHandler } from "./handlers";
import { httpCodes } from "./http-codes";
import type { Call } from "./types";
import { anonymousUser } from "./user";
// Helper to create a minimal mock Call
function createMockCall(overrides: Partial<Call> = {}): Call {
const defaultSession = new Session(null, anonymousUser);
return {
pattern: "/test",
path: "/test",
method: "GET",
parameters: {},
request: {} as ExpressRequest,
user: anonymousUser,
session: defaultSession,
...overrides,
};
}
describe("handlers", () => {
describe("multiHandler", () => {
it("returns OK status", async () => {
const call = createMockCall({ method: "GET" });
const result = await multiHandler(call);
assert.equal(result.code, httpCodes.success.OK);
});
it("returns text/plain content type", async () => {
const call = createMockCall();
const result = await multiHandler(call);
assert.equal(result.contentType, contentTypes.text.plain);
});
it("includes method in result", async () => {
const call = createMockCall({ method: "POST" });
const result = await multiHandler(call);
assert.ok(result.result.includes("POST"));
});
it("includes a random number in result", async () => {
const call = createMockCall();
const result = await multiHandler(call);
// Result format: "that was GET (0.123456789)"
assert.match(result.result, /that was \w+ \(\d+\.?\d*\)/);
});
it("works with different HTTP methods", async () => {
const methods = ["GET", "POST", "PUT", "PATCH", "DELETE"] as const;
for (const method of methods) {
const call = createMockCall({ method });
const result = await multiHandler(call);
assert.ok(result.result.includes(method));
}
});
});
});


@@ -0,0 +1,29 @@
import { Kysely, PostgresDialect } from "kysely";
import { Pool } from "pg";
import type { DB } from "../../generated/db";
import { connectionConfig } from "../database";
const db = new Kysely<DB>({
dialect: new PostgresDialect({
pool: new Pool(connectionConfig),
}),
log(event) {
if (event.level === "query") {
// FIXME: Wire this up to the logging system
console.log("SQL:", event.query.sql);
console.log("Params:", event.query.parameters);
}
},
});
abstract class Hydrator<T> {
public db: Kysely<DB>;
protected abstract table: string;
constructor() {
this.db = db;
}
}
export { Hydrator, db };


@@ -0,0 +1 @@
export type Hydrators = {};


@@ -0,0 +1,44 @@
// Test setup for hydrator tests
// Run: DB_PORT=5433 DB_USER=diachron_test DB_PASSWORD=diachron_test DB_NAME=diachron_test npx tsx --test tests/*.test.ts
import { Pool } from "pg";
import { connectionConfig, migrate } from "../../database";
const pool = new Pool(connectionConfig);
export async function setupTestDatabase(): Promise<void> {
await migrate();
}
export async function cleanupTables(): Promise<void> {
// Clean in reverse dependency order
await pool.query("DELETE FROM user_emails");
await pool.query("DELETE FROM users");
}
export async function teardownTestDatabase(): Promise<void> {
await pool.end();
}
export async function insertTestUser(data: {
id: string;
displayName: string;
status: string;
email: string;
}): Promise<void> {
const emailId = crypto.randomUUID();
const normalizedEmail = data.email.toLowerCase().trim();
await pool.query(
`INSERT INTO users (id, display_name, status) VALUES ($1, $2, $3)`,
[data.id, data.displayName, data.status],
);
await pool.query(
`INSERT INTO user_emails (id, user_id, email, normalized_email, is_primary)
VALUES ($1, $2, $3, $4, true)`,
[emailId, data.id, data.email, normalizedEmail],
);
}
export { pool };


@@ -0,0 +1,98 @@
// Tests for user hydrator
// Run with: cd express && DB_PORT=5433 DB_USER=diachron_test DB_PASSWORD=diachron_test DB_NAME=diachron_test ../cmd npx tsx --test diachron/hydrators/tests/user.test.ts
import assert from "node:assert/strict";
import { after, before, beforeEach, describe, it } from "node:test";
import { get } from "../user";
import {
cleanupTables,
insertTestUser,
setupTestDatabase,
teardownTestDatabase,
} from "./setup";
describe("user hydrator", () => {
before(async () => {
await setupTestDatabase();
});
after(async () => {
await teardownTestDatabase();
});
beforeEach(async () => {
await cleanupTables();
});
describe("get", () => {
it("returns null for non-existent user", async () => {
const result = await get("00000000-0000-0000-0000-000000000000");
assert.equal(result, null);
});
it("returns user when found", async () => {
const userId = "cfae0a19-6515-4813-bc2d-1e032b72b203";
await insertTestUser({
id: userId,
displayName: "Test User",
status: "active",
email: "test@example.com",
});
const result = await get(userId);
assert.notEqual(result, null);
assert.equal(result!.id, userId);
assert.equal(result!.display_name, "Test User");
assert.equal(result!.status, "active");
assert.equal(result!.email, "test@example.com");
});
it("validates user data with zod parser", async () => {
const userId = crypto.randomUUID();
await insertTestUser({
id: userId,
displayName: "Valid User",
status: "active",
email: "valid@example.com",
});
const result = await get(userId);
// If we get here without throwing, parsing succeeded
assert.notEqual(result, null);
assert.equal(typeof result!.id, "string");
assert.equal(typeof result!.email, "string");
});
it("returns user with pending status", async () => {
const userId = crypto.randomUUID();
await insertTestUser({
id: userId,
displayName: "Pending User",
status: "pending",
email: "pending@example.com",
});
const result = await get(userId);
assert.notEqual(result, null);
assert.equal(result!.status, "pending");
});
it("returns user with suspended status", async () => {
const userId = crypto.randomUUID();
await insertTestUser({
id: userId,
displayName: "Suspended User",
status: "suspended",
email: "suspended@example.com",
});
const result = await get(userId);
assert.notEqual(result, null);
assert.equal(result!.status, "suspended");
});
});
});


@@ -0,0 +1,59 @@
import { z } from "zod";
import { db } from "./hydrator";
const parser = z.object({
// id: z.uuidv7(),
id: z.uuid(),
display_name: z.string(),
// FIXME: status is duplicated elsewhere
status: z.union([
z.literal("active"),
z.literal("suspended"),
z.literal("pending"),
]),
email: z.email(),
});
export type User = z.infer<typeof parser>;
const get = async (id: string): Promise<null | User> => {
const ret = await db
.selectFrom("users")
.where("users.id", "=", id)
.innerJoin("user_emails", "user_emails.user_id", "users.id")
.select([
"users.id",
"users.status",
"users.display_name",
"user_emails.email",
])
.executeTakeFirst();
if (ret === undefined) {
return null;
}
console.dir(ret);
const parsed = parser.parse(ret);
return parsed;
};
export { get };
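
The parser above constrains `status` to a three-literal union. The equivalent plain TypeScript narrowing, shown standalone for illustration (the guard name is hypothetical; the hydrator relies on zod instead):

```typescript
type UserStatus = "active" | "suspended" | "pending";

// Illustrative type guard performing the same check as the zod union
// in the hydrator's parser.
function isUserStatus(value: string): value is UserStatus {
  return value === "active" || value === "suspended" || value === "pending";
}

console.log(isUserStatus("active")); // true
console.log(isUserStatus("deleted")); // false
```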


@@ -0,0 +1,53 @@
// Tests for logging.ts
// Note: These tests verify the module structure and types.
// Full integration tests would require a running logging service.
import assert from "node:assert/strict";
import { describe, it } from "node:test";
// We can't easily test log() and getLogs() without mocking fetch,
// but we can verify the module exports correctly and types work.
describe("logging", () => {
describe("module structure", () => {
it("exports log function", async () => {
const { log } = await import("./logging");
assert.equal(typeof log, "function");
});
it("exports getLogs function", async () => {
const { getLogs } = await import("./logging");
assert.equal(typeof getLogs, "function");
});
});
describe("Message type", () => {
// Type-level tests - if these compile, the types are correct
it("accepts valid message sources", () => {
type MessageSource = "logging" | "diagnostic" | "user";
const sources: MessageSource[] = ["logging", "diagnostic", "user"];
assert.equal(sources.length, 3);
});
});
describe("FilterArgument type", () => {
// Type-level tests
it("accepts valid filter options", () => {
type FilterArgument = {
limit?: number;
before?: number;
after?: number;
match?: (string | RegExp)[];
};
const filter: FilterArgument = {
limit: 10,
before: Date.now(),
after: Date.now() - 3600000,
match: ["error", /warning/i],
};
assert.ok(filter.limit === 10);
});
});
});


@@ -0,0 +1,29 @@
-- 0001_users.sql
-- Create users table for authentication
CREATE TABLE users (
id UUID PRIMARY KEY,
status TEXT NOT NULL DEFAULT 'active',
display_name TEXT,
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
CREATE TABLE user_emails (
id UUID PRIMARY KEY,
user_id UUID NOT NULL REFERENCES users(id),
email TEXT NOT NULL,
normalized_email TEXT NOT NULL,
is_primary BOOLEAN NOT NULL DEFAULT FALSE,
is_verified BOOLEAN NOT NULL DEFAULT FALSE,
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
verified_at TIMESTAMPTZ,
revoked_at TIMESTAMPTZ
);
-- Enforce uniqueness only among *active* emails
CREATE UNIQUE INDEX user_emails_unique_active
ON user_emails (normalized_email)
WHERE revoked_at IS NULL;


@@ -2,15 +2,17 @@
 -- Create sessions table for auth tokens
 CREATE TABLE sessions (
-    token_id TEXT PRIMARY KEY,
-    user_id UUID NOT NULL REFERENCES users(id) ON DELETE CASCADE,
+    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
+    token_hash TEXT UNIQUE NOT NULL,
+    user_id UUID NOT NULL REFERENCES users(id),
+    user_email_id UUID REFERENCES user_emails(id),
     token_type TEXT NOT NULL,
     auth_method TEXT NOT NULL,
     created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
     expires_at TIMESTAMPTZ NOT NULL,
-    last_used_at TIMESTAMPTZ,
+    revoked_at TIMESTAMPTZ,
+    ip_address INET,
     user_agent TEXT,
-    ip_address TEXT,
     is_used BOOLEAN DEFAULT FALSE
 );
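
The reworked schema stores a `token_hash` rather than the raw token, so lookups must hash the presented token before querying. A sketch of that lookup key, assuming a SHA-256 hex digest (the diff does not specify the hash algorithm):

```typescript
import { createHash } from "node:crypto";

// Assumed digest: SHA-256 hex. The migration only mandates that a hash,
// not the raw token, is what sessions.token_hash stores.
function hashToken(token: string): string {
  return createHash("sha256").update(token).digest("hex");
}

// The same token always maps to the same 64-character lookup key.
console.log(hashToken("example-token").length); // 64
```

With this shape, a leaked database dump does not expose usable session tokens, at the cost of one hash per lookup.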


@@ -0,0 +1,17 @@
-- 0003_user_credentials.sql
-- Create user_credentials table for password storage (extensible for other auth methods)
CREATE TABLE user_credentials (
id UUID PRIMARY KEY,
user_id UUID NOT NULL REFERENCES users(id),
credential_type TEXT NOT NULL DEFAULT 'password',
password_hash TEXT,
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
-- Each user can have at most one credential per type
CREATE UNIQUE INDEX user_credentials_user_type_idx ON user_credentials (user_id, credential_type);
-- Index for user lookups
CREATE INDEX user_credentials_user_id_idx ON user_credentials (user_id);


@@ -0,0 +1,20 @@
CREATE TABLE roles (
id UUID PRIMARY KEY,
name TEXT UNIQUE NOT NULL,
description TEXT
);
CREATE TABLE groups (
id UUID PRIMARY KEY,
name TEXT NOT NULL,
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
CREATE TABLE user_group_roles (
user_id UUID NOT NULL REFERENCES users(id),
group_id UUID NOT NULL REFERENCES groups(id),
role_id UUID NOT NULL REFERENCES roles(id),
granted_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
revoked_at TIMESTAMPTZ,
PRIMARY KEY (user_id, group_id, role_id)
);


@@ -0,0 +1,14 @@
CREATE TABLE capabilities (
id UUID PRIMARY KEY,
name TEXT UNIQUE NOT NULL,
description TEXT
);
CREATE TABLE role_capabilities (
role_id UUID NOT NULL REFERENCES roles(id),
capability_id UUID NOT NULL REFERENCES capabilities(id),
granted_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
revoked_at TIMESTAMPTZ,
PRIMARY KEY (role_id, capability_id)
);
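
Taken together, `user_group_roles` and `role_capabilities` imply the usual RBAC resolution: a user's effective capabilities are the union over the roles they hold. An in-memory sketch of that join, with hypothetical record shapes (revocation filtering elided for brevity):

```typescript
// Hypothetical in-memory mirror of the two join tables above.
type RoleGrant = { userId: string; roleId: string };
type RoleCapability = { roleId: string; capabilityId: string };

function capabilitiesFor(
  userId: string,
  grants: RoleGrant[],
  roleCaps: RoleCapability[],
): Set<string> {
  // Roles held by this user...
  const roleIds = new Set(
    grants.filter((g) => g.userId === userId).map((g) => g.roleId),
  );
  // ...mapped to the union of their capabilities.
  return new Set(
    roleCaps
      .filter((rc) => roleIds.has(rc.roleId))
      .map((rc) => rc.capabilityId),
  );
}

const caps = capabilitiesFor(
  "u1",
  [{ userId: "u1", roleId: "admin" }],
  [
    { roleId: "admin", capabilityId: "logs.read" },
    { roleId: "editor", capabilityId: "posts.write" },
  ],
);
console.log([...caps]); // [ 'logs.read' ]
```

In SQL this would be the two-table join the TODO comments in `rowToUser` point at; the sketch just shows the shape of the result.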


@@ -4,8 +4,10 @@
   "description": "",
   "main": "index.js",
   "scripts": {
-    "test": "echo \"Error: no test specified\" && exit 1",
-    "nodemon": "nodemon dist/index.js"
+    "test": "DB_PORT=5433 DB_USER=diachron_test DB_PASSWORD=diachron_test DB_NAME=diachron_test tsx --test '**/*.{test,spec}.ts'",
+    "test:watch": "DB_PORT=5433 DB_USER=diachron_test DB_PASSWORD=diachron_test DB_NAME=diachron_test tsx --test --watch '**/*.{test,spec}.ts'",
+    "nodemon": "nodemon dist/index.js",
+    "kysely-codegen": "kysely-codegen"
   },
   "keywords": [],
   "author": "",
@@ -24,12 +26,14 @@
     "ts-luxon": "^6.2.0",
     "ts-node": "^10.9.2",
     "tsx": "^4.20.6",
+    "typeid-js": "^1.2.0",
     "typescript": "^5.9.3",
     "zod": "^4.1.12"
   },
   "devDependencies": {
     "@biomejs/biome": "2.3.10",
     "@types/express": "^5.0.5",
-    "@types/pg": "^8.16.0"
+    "@types/pg": "^8.16.0",
+    "kysely-codegen": "^0.19.0"
   }
 }


@@ -44,6 +44,9 @@ importers:
tsx: tsx:
specifier: ^4.20.6 specifier: ^4.20.6
version: 4.20.6 version: 4.20.6
typeid-js:
specifier: ^1.2.0
version: 1.2.0
typescript: typescript:
specifier: ^5.9.3 specifier: ^5.9.3
version: 5.9.3 version: 5.9.3
@@ -60,9 +63,20 @@ importers:
'@types/pg': '@types/pg':
specifier: ^8.16.0 specifier: ^8.16.0
version: 8.16.0 version: 8.16.0
kysely-codegen:
specifier: ^0.19.0
version: 0.19.0(kysely@0.28.9)(pg@8.16.3)(typescript@5.9.3)
packages: packages:
'@babel/code-frame@7.28.6':
resolution: {integrity: sha512-JYgintcMjRiCvS8mMECzaEn+m3PfoQiyqukOMCCVQtoJGYJw8j/8LBJEiqkHLkfwCcs74E3pbAUFNg7d9VNJ+Q==}
engines: {node: '>=6.9.0'}
'@babel/helper-validator-identifier@7.28.5':
resolution: {integrity: sha512-qSs4ifwzKJSV39ucNjsvc6WVHs6b7S03sOh2OcHF9UHfVPqWWALUsNUVzhSBiItjRZoLHx7nIarVjqKVusUZ1Q==}
engines: {node: '>=6.9.0'}
'@biomejs/biome@2.3.10': '@biomejs/biome@2.3.10':
resolution: {integrity: sha512-/uWSUd1MHX2fjqNLHNL6zLYWBbrJeG412/8H7ESuK8ewoRoMPUgHDebqKrPTx/5n6f17Xzqc9hdg3MEqA5hXnQ==} resolution: {integrity: sha512-/uWSUd1MHX2fjqNLHNL6zLYWBbrJeG412/8H7ESuK8ewoRoMPUgHDebqKrPTx/5n6f17Xzqc9hdg3MEqA5hXnQ==}
engines: {node: '>=14.21.3'} engines: {node: '>=14.21.3'}
@@ -360,6 +374,14 @@ packages:
engines: {node: '>=0.4.0'} engines: {node: '>=0.4.0'}
hasBin: true hasBin: true
ansi-styles@3.2.1:
resolution: {integrity: sha512-VT0ZI6kZRdTh8YyJw3SMbYm/u+NqfsAxEpWO0Pf9sq8/e94WxxOpPKx9FR1FlyCtOVDNOQ+8ntlqFxiRc+r5qA==}
engines: {node: '>=4'}
ansi-styles@4.3.0:
resolution: {integrity: sha512-zbB9rCJAT1rbjiVDb2hqKFHNYLxgtk8NURxZ3IZwD3F6NtxbXZQCnnSi1Lkx+IDohdPlFp222wVALIheZJQSEg==}
engines: {node: '>=8'}
anymatch@3.1.3: anymatch@3.1.3:
resolution: {integrity: sha512-KMReFUr0B4t+D+OBkjR3KYqvocp2XaSzO55UcB6mgQMd3KbcE+mWTyvVV7D/zsdEbNnV6acZUutkiHQXvTr1Rw==} resolution: {integrity: sha512-KMReFUr0B4t+D+OBkjR3KYqvocp2XaSzO55UcB6mgQMd3KbcE+mWTyvVV7D/zsdEbNnV6acZUutkiHQXvTr1Rw==}
engines: {node: '>= 8'} engines: {node: '>= 8'}
@@ -367,6 +389,9 @@ packages:
arg@4.1.3: arg@4.1.3:
resolution: {integrity: sha512-58S9QDqG0Xx27YwPSt9fJxivjYl432YCwfDMfZ+71RAqUrZef7LrKQZ3LHLOwCS4FLNBplP533Zx895SeOCHvA==} resolution: {integrity: sha512-58S9QDqG0Xx27YwPSt9fJxivjYl432YCwfDMfZ+71RAqUrZef7LrKQZ3LHLOwCS4FLNBplP533Zx895SeOCHvA==}
argparse@2.0.1:
resolution: {integrity: sha512-8+9WqebbFzpX9OR+Wa6O29asIogeRMzcGtAINdpMHHyAg10f05aSFVBbcEqGf/PXw1EjAZ+q2/bEBg3DvurK3Q==}
asap@2.0.6: asap@2.0.6:
resolution: {integrity: sha512-BSHWgDSAiKs50o2Re8ppvp3seVHXSRM44cdSsT9FfNEUUZLOGWVCsiWaRPWM1Znn+mqZ1OfVZ3z3DWEzSp7hRA==} resolution: {integrity: sha512-BSHWgDSAiKs50o2Re8ppvp3seVHXSRM44cdSsT9FfNEUUZLOGWVCsiWaRPWM1Znn+mqZ1OfVZ3z3DWEzSp7hRA==}
@@ -400,10 +425,35 @@ packages:
resolution: {integrity: sha512-+ys997U96po4Kx/ABpBCqhA9EuxJaQWDQg7295H4hBphv3IZg0boBKuwYpt4YXp6MZ5AmZQnU/tyMTlRpaSejg==} resolution: {integrity: sha512-+ys997U96po4Kx/ABpBCqhA9EuxJaQWDQg7295H4hBphv3IZg0boBKuwYpt4YXp6MZ5AmZQnU/tyMTlRpaSejg==}
engines: {node: '>= 0.4'} engines: {node: '>= 0.4'}
callsites@3.1.0:
resolution: {integrity: sha512-P8BjAsXvZS+VIDUI11hHCQEv74YT67YUi5JJFNWIqL235sBmjX4+qx9Muvls5ivyNENctx46xQLQ3aTuE7ssaQ==}
engines: {node: '>=6'}
chalk@2.4.2:
resolution: {integrity: sha512-Mti+f9lpJNcwF4tWV8/OrTTtF1gZi+f8FqlyAdouralcFWFQWF2+NgCHShjkCb+IFBLq9buZwE1xckQU4peSuQ==}
engines: {node: '>=4'}
chalk@4.1.2:
resolution: {integrity: sha512-oKnbhFyRIXpUuez8iBMmyEa4nbj4IOQyuhc/wy9kY7/WVPcwIO9VA668Pu8RkO7+0G76SLROeyw9CpQ061i4mA==}
engines: {node: '>=10'}
chokidar@3.6.0: chokidar@3.6.0:
resolution: {integrity: sha512-7VT13fmjotKpGipCW9JEQAusEPE+Ei8nl6/g4FBAmIm0GOOLMua9NDDo/DWp0ZAxCr3cPq5ZpBqmPAQgDda2Pw==} resolution: {integrity: sha512-7VT13fmjotKpGipCW9JEQAusEPE+Ei8nl6/g4FBAmIm0GOOLMua9NDDo/DWp0ZAxCr3cPq5ZpBqmPAQgDda2Pw==}
engines: {node: '>= 8.10.0'} engines: {node: '>= 8.10.0'}
color-convert@1.9.3:
resolution: {integrity: sha512-QfAUtd+vFdAtFQcC8CCyYt1fYWxSqAiK2cSD6zDB8N3cpsEBAvRxp9zOGg6G/SHHJYAT88/az/IuDGALsNVbGg==}
color-convert@2.0.1:
resolution: {integrity: sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ==}
engines: {node: '>=7.0.0'}
color-name@1.1.3:
resolution: {integrity: sha512-72fSenhMw2HZMTVHeCA9KCmpEIbzWiQsjN+BHcBbS9vr1mtt+vJjPdksIBNUmKAW8TFUDPJK5SUU3QhE9NEXDw==}
color-name@1.1.4:
resolution: {integrity: sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA==}
commander@5.1.0: commander@5.1.0:
resolution: {integrity: sha512-P0CysNDQ7rtVw4QIQtm+MRxV66vKFSvlsQvGYXZWR3qFU0jlMKHZZZgw8e+8DSah4UDKMqnknRDQz+xuQXQ/Zg==} resolution: {integrity: sha512-P0CysNDQ7rtVw4QIQtm+MRxV66vKFSvlsQvGYXZWR3qFU0jlMKHZZZgw8e+8DSah4UDKMqnknRDQz+xuQXQ/Zg==}
engines: {node: '>= 6'} engines: {node: '>= 6'}
@@ -427,6 +477,15 @@ packages:
resolution: {integrity: sha512-yki5XnKuf750l50uGTllt6kKILY4nQ1eNIQatoXEByZ5dWgnKqbnqmTrBE5B4N7lrMJKQ2ytWMiTO2o0v6Ew/w==} resolution: {integrity: sha512-yki5XnKuf750l50uGTllt6kKILY4nQ1eNIQatoXEByZ5dWgnKqbnqmTrBE5B4N7lrMJKQ2ytWMiTO2o0v6Ew/w==}
engines: {node: '>= 0.6'} engines: {node: '>= 0.6'}
cosmiconfig@9.0.0:
resolution: {integrity: sha512-itvL5h8RETACmOTFc4UfIyB2RfEHi71Ax6E/PivVxq9NseKbOWpeyHEOIbmAw1rs8Ak0VursQNww7lf7YtUwzg==}
engines: {node: '>=14'}
peerDependencies:
typescript: '>=4.9.5'
peerDependenciesMeta:
typescript:
optional: true
create-require@1.1.1: create-require@1.1.1:
resolution: {integrity: sha512-dcKFX3jn0MpIaXjisoRvexIJVEKzaq7z2rZKxf+MSr9TkdmHmsU4m2lcLojrj/FHl8mk5VxMmYA+ftRkP/3oKQ==} resolution: {integrity: sha512-dcKFX3jn0MpIaXjisoRvexIJVEKzaq7z2rZKxf+MSr9TkdmHmsU4m2lcLojrj/FHl8mk5VxMmYA+ftRkP/3oKQ==}
@@ -443,10 +502,26 @@ packages:
resolution: {integrity: sha512-g7nH6P6dyDioJogAAGprGpCtVImJhpPk/roCzdb3fIh61/s/nPsfR6onyMwkCAR/OlC3yBC0lESvUoQEAssIrw==} resolution: {integrity: sha512-g7nH6P6dyDioJogAAGprGpCtVImJhpPk/roCzdb3fIh61/s/nPsfR6onyMwkCAR/OlC3yBC0lESvUoQEAssIrw==}
engines: {node: '>= 0.8'} engines: {node: '>= 0.8'}
diff@3.5.0:
resolution: {integrity: sha512-A46qtFgd+g7pDZinpnwiRJtxbC1hpgf0uzP3iG89scHk0AUC7A1TGxf5OiiOUv/JMZR8GOt8hL900hV0bOy5xA==}
engines: {node: '>=0.3.1'}
diff@4.0.2: diff@4.0.2:
resolution: {integrity: sha512-58lmxKSA4BNyLz+HHMUzlOEpg09FV+ev6ZMe3vJihgdxzgcwZ8VoEEPmALCZG9LmqfVoNMMKpttIYTVG6uDY7A==} resolution: {integrity: sha512-58lmxKSA4BNyLz+HHMUzlOEpg09FV+ev6ZMe3vJihgdxzgcwZ8VoEEPmALCZG9LmqfVoNMMKpttIYTVG6uDY7A==}
engines: {node: '>=0.3.1'} engines: {node: '>=0.3.1'}
dotenv-expand@12.0.3:
resolution: {integrity: sha512-uc47g4b+4k/M/SeaW1y4OApx+mtLWl92l5LMPP0GNXctZqELk+YGgOPIIC5elYmUH4OuoK3JLhuRUYegeySiFA==}
engines: {node: '>=12'}
dotenv@16.6.1:
resolution: {integrity: sha512-uBq4egWHTcTt33a72vpSG0z3HnPuIl6NqYcTrKEg2azoEyl2hpW0zqlxysq2pK9HlDIHyHyakeYaYnSAwd8bow==}
engines: {node: '>=12'}
dotenv@17.2.3:
resolution: {integrity: sha512-JVUnt+DUIzu87TABbhPmNfVdBDt18BLOWjMUFJMSi/Qqg7NTYtabbvSNJGOJ7afbRuv9D/lngizHtP7QyLQ+9w==}
engines: {node: '>=12'}
dunder-proto@1.0.1: dunder-proto@1.0.1:
resolution: {integrity: sha512-KIN/nDJBQRcXw0MLVhZE9iQHmG68qAVIBg9CqmUYjmQIhgij9U5MFvrqkUL5FbtyyzZuOeOt0zdeRe4UY7ct+A==} resolution: {integrity: sha512-KIN/nDJBQRcXw0MLVhZE9iQHmG68qAVIBg9CqmUYjmQIhgij9U5MFvrqkUL5FbtyyzZuOeOt0zdeRe4UY7ct+A==}
engines: {node: '>= 0.4'} engines: {node: '>= 0.4'}
@@ -458,6 +533,13 @@ packages:
resolution: {integrity: sha512-Q0n9HRi4m6JuGIV1eFlmvJB7ZEVxu93IrMyiMsGC0lrMJMWzRgx6WGquyfQgZVb31vhGgXnfmPNNXmxnOkRBrg==} resolution: {integrity: sha512-Q0n9HRi4m6JuGIV1eFlmvJB7ZEVxu93IrMyiMsGC0lrMJMWzRgx6WGquyfQgZVb31vhGgXnfmPNNXmxnOkRBrg==}
engines: {node: '>= 0.8'} engines: {node: '>= 0.8'}
env-paths@2.2.1:
resolution: {integrity: sha512-+h1lkLKhZMTYjog1VEpJNG7NZJWcuc2DDk/qsqSTRRCOXiLjeQ1d1/udrUGhqMxUgAlwKNZ0cf2uqan5GLuS2A==}
engines: {node: '>=6'}
error-ex@1.3.4:
resolution: {integrity: sha512-sqQamAnR14VgCr1A618A3sGrygcpK+HEbenA/HiEAkkUwcZIIB/tgWqHFxWgOyDh4nB4JCRimh79dR5Ywc9MDQ==}
es-define-property@1.0.1: es-define-property@1.0.1:
resolution: {integrity: sha512-e3nRfgfUZ4rNGL232gUgX06QNyyez04KdjFrF+LTRoOXmrOgFKDg4BCdsjW8EnT69eqdYGmRpJwiPVYNrCaW3g==} resolution: {integrity: sha512-e3nRfgfUZ4rNGL232gUgX06QNyyez04KdjFrF+LTRoOXmrOgFKDg4BCdsjW8EnT69eqdYGmRpJwiPVYNrCaW3g==}
engines: {node: '>= 0.4'} engines: {node: '>= 0.4'}
@@ -478,6 +560,10 @@ packages:
escape-html@1.0.3: escape-html@1.0.3:
resolution: {integrity: sha512-NiSupZ4OeuGwr68lGIeym/ksIZMJodUGOSCZ/FSnTxcrekbvqrgdUxlJOMpijaKZVjAJrWrGs/6Jy8OMuyj9ow==} resolution: {integrity: sha512-NiSupZ4OeuGwr68lGIeym/ksIZMJodUGOSCZ/FSnTxcrekbvqrgdUxlJOMpijaKZVjAJrWrGs/6Jy8OMuyj9ow==}
escape-string-regexp@1.0.5:
resolution: {integrity: sha512-vbRorB5FUQWvla16U8R/qgaFIya2qGzwDrNmCZuYKrbdSUMG6I1ZCGQRefkRVhuOkIGVne7BQ35DSfo1qvJqFg==}
engines: {node: '>=0.8.0'}
etag@1.8.1: etag@1.8.1:
resolution: {integrity: sha512-aIL5Fx7mawVa300al2BnEE4iNvo1qETxLrPI/o05L7z6go7fCw1J6EQmbK4FmJ2AS7kgVF/KEZWufBfdClMcPg==} resolution: {integrity: sha512-aIL5Fx7mawVa300al2BnEE4iNvo1qETxLrPI/o05L7z6go7fCw1J6EQmbK4FmJ2AS7kgVF/KEZWufBfdClMcPg==}
engines: {node: '>= 0.6'} engines: {node: '>= 0.6'}
@@ -502,6 +588,9 @@ packages:
resolution: {integrity: sha512-Rx/WycZ60HOaqLKAi6cHRKKI7zxWbJ31MhntmtwMoaTeF7XFH9hhBp8vITaMidfljRQ6eYWCKkaTK+ykVJHP2A==} resolution: {integrity: sha512-Rx/WycZ60HOaqLKAi6cHRKKI7zxWbJ31MhntmtwMoaTeF7XFH9hhBp8vITaMidfljRQ6eYWCKkaTK+ykVJHP2A==}
engines: {node: '>= 0.8'} engines: {node: '>= 0.8'}
fs.realpath@1.0.0:
resolution: {integrity: sha512-OO0pH2lK6a0hZnAdau5ItzHPI6pUlvI7jMVnxUQRtw4owF2wk8lOSabtGDCTP4Ggrg2MbGnWO9X8K1t4+fGMDw==}
fsevents@2.3.3: fsevents@2.3.3:
resolution: {integrity: sha512-5xoDfX+fL7faATnagmWPpbFtwh/R77WmMMqqHGS65C3vvB0YHrgF+B1YmZ3441tMj5n63k0212XNoJwzlhffQw==} resolution: {integrity: sha512-5xoDfX+fL7faATnagmWPpbFtwh/R77WmMMqqHGS65C3vvB0YHrgF+B1YmZ3441tMj5n63k0212XNoJwzlhffQw==}
engines: {node: ^8.16.0 || ^10.6.0 || >=11.0.0} engines: {node: ^8.16.0 || ^10.6.0 || >=11.0.0}
@@ -521,10 +610,18 @@ packages:
get-tsconfig@4.13.0: get-tsconfig@4.13.0:
resolution: {integrity: sha512-1VKTZJCwBrvbd+Wn3AOgQP/2Av+TfTCOlE4AcRJE72W1ksZXbAx8PPBR9RzgTeSPzlPMHrbANMH3LbltH73wxQ==} resolution: {integrity: sha512-1VKTZJCwBrvbd+Wn3AOgQP/2Av+TfTCOlE4AcRJE72W1ksZXbAx8PPBR9RzgTeSPzlPMHrbANMH3LbltH73wxQ==}
git-diff@2.0.6:
resolution: {integrity: sha512-/Iu4prUrydE3Pb3lCBMbcSNIf81tgGt0W1ZwknnyF62t3tHmtiJTRj0f+1ZIhp3+Rh0ktz1pJVoa7ZXUCskivA==}
engines: {node: '>= 4.8.0'}
glob-parent@5.1.2: glob-parent@5.1.2:
resolution: {integrity: sha512-AOIgSQCepiJYwP3ARnGx+5VnTu2HBYdzbGP45eLw1vr3zB3vZLeyed1sC9hnbcOc9/SrMyM5RPQrkGz4aS9Zow==} resolution: {integrity: sha512-AOIgSQCepiJYwP3ARnGx+5VnTu2HBYdzbGP45eLw1vr3zB3vZLeyed1sC9hnbcOc9/SrMyM5RPQrkGz4aS9Zow==}
engines: {node: '>= 6'} engines: {node: '>= 6'}
glob@7.2.3:
resolution: {integrity: sha512-nFR0zLpU2YCaRxwoCJvL6UvCH2JFyFVIvwTLsIf21AuHlMskA1hhTdk+LlYJtOlYt9v6dvszD2BGRqBL+iQK9Q==}
deprecated: Glob versions prior to v9 are no longer supported
gopd@1.2.0: gopd@1.2.0:
resolution: {integrity: sha512-ZUKRh6/kUFoAiTAtTYPZJ3hw9wNxx+BIBOijnlG9PnrJsCcSjs1wyyD6vJpaYtgnzDrKYRSqf3OO6Rfa93xsRg==} resolution: {integrity: sha512-ZUKRh6/kUFoAiTAtTYPZJ3hw9wNxx+BIBOijnlG9PnrJsCcSjs1wyyD6vJpaYtgnzDrKYRSqf3OO6Rfa93xsRg==}
engines: {node: '>= 0.4'} engines: {node: '>= 0.4'}
@@ -533,6 +630,10 @@ packages:
resolution: {integrity: sha512-sKJf1+ceQBr4SMkvQnBDNDtf4TXpVhVGateu0t918bl30FnbE2m4vNLX+VWe/dpjlb+HugGYzW7uQXH98HPEYw==} resolution: {integrity: sha512-sKJf1+ceQBr4SMkvQnBDNDtf4TXpVhVGateu0t918bl30FnbE2m4vNLX+VWe/dpjlb+HugGYzW7uQXH98HPEYw==}
engines: {node: '>=4'} engines: {node: '>=4'}
has-flag@4.0.0:
resolution: {integrity: sha512-EykJT/Q1KjTWctppgIAgfSO0tKVuZUjhgMr17kqTumMl6Afv3EISleU7qZUzoXDFTAHTDC4NOoG/ZxU3EvlMPQ==}
engines: {node: '>=8'}
has-symbols@1.1.0: has-symbols@1.1.0:
resolution: {integrity: sha512-1cDNdwJ2Jaohmb3sg4OmKaMBwuC48sYni5HUw2DvsC8LjGTLK9h+eb1X6RyuOHe4hT0ULCW68iomhjUoKUqlPQ==} resolution: {integrity: sha512-1cDNdwJ2Jaohmb3sg4OmKaMBwuC48sYni5HUw2DvsC8LjGTLK9h+eb1X6RyuOHe4hT0ULCW68iomhjUoKUqlPQ==}
engines: {node: '>= 0.4'} engines: {node: '>= 0.4'}
@@ -556,17 +657,36 @@ packages:
ignore-by-default@1.0.1: ignore-by-default@1.0.1:
resolution: {integrity: sha512-Ius2VYcGNk7T90CppJqcIkS5ooHUZyIQK+ClZfMfMNFEF9VSE73Fq+906u/CWu92x4gzZMWOwfFYckPObzdEbA==} resolution: {integrity: sha512-Ius2VYcGNk7T90CppJqcIkS5ooHUZyIQK+ClZfMfMNFEF9VSE73Fq+906u/CWu92x4gzZMWOwfFYckPObzdEbA==}
import-fresh@3.3.1:
resolution: {integrity: sha512-TR3KfrTZTYLPB6jUjfx6MF9WcWrHL9su5TObK4ZkYgBdWKPOFoSoQIdEuTuR82pmtxH2spWG9h6etwfr1pLBqQ==}
engines: {node: '>=6'}
inflight@1.0.6:
resolution: {integrity: sha512-k92I/b08q4wvFscXCLvqfsHCrjrF7yiXsQuIVvVE7N82W3+aqpzuUdBbfhWcy/FZR3/4IgflMgKLOsvPDrGCJA==}
deprecated: This module is not supported, and leaks memory. Do not use it. Check out lru-cache if you want a good and tested way to coalesce async requests by a key value, which is much more comprehensive and powerful.
inherits@2.0.4:
resolution: {integrity: sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ==}
interpret@1.4.0:
resolution: {integrity: sha512-agE4QfB2Lkp9uICn7BAqoscw4SZP9kTE2hxiFI3jBPmXJfdqiahTbUuKGsMoN2GtqL9AxhYioAcVvgsb1HvRbA==}
engines: {node: '>= 0.10'}
ipaddr.js@1.9.1:
resolution: {integrity: sha512-0KI/607xoxSToH7GjN1FfSbLoU0+btTicjsQSWQlh/hZykN8KpmMf7uYwPW3R+akZ6R/w18ZlXSHBYXiYUPO3g==}
engines: {node: '>= 0.10'}
is-arrayish@0.2.1:
resolution: {integrity: sha512-zz06S8t0ozoDXMG+ube26zeCTNXcKIPJZJi8hBrF4idCLms4CG9QtK7qBl1boi5ODzFpjswb5JPmHCbMpjaYzg==}
is-binary-path@2.1.0:
resolution: {integrity: sha512-ZMERYes6pDydyuGidse7OsHxtbI7WVeUEozgR/g7rd0xUimYNlvZRE/K2MgZTjWy725IfelLeVcEM97mmtRGXw==}
engines: {node: '>=8'}
is-core-module@2.16.1:
resolution: {integrity: sha512-UfoeMA6fIJ8wTYFEUjelnaGI67v6+N7qXJEvQuIGa99l4xsCruSYOVSQ0uPANn4dAzm8lkYPaKLrrijLq7x23w==}
engines: {node: '>= 0.4'}
is-extglob@2.1.1:
resolution: {integrity: sha512-SbKbANkN603Vi4jEZv49LeVJMn4yGwsbzZworEoyEiutsN3nJYdbO36zfhGJ6QEDpOZIFkDtnq5JRxmvl3jsoQ==}
engines: {node: '>=0.10.0'}
@@ -582,10 +702,62 @@ packages:
is-promise@4.0.0:
resolution: {integrity: sha512-hvpoI6korhJMnej285dSg6nu1+e6uxs7zG3BYAm5byqDsgJNWwxzM6z6iZiAgQR4TJ30JmBTOwqZUw3WlyH3AQ==}
js-tokens@4.0.0:
resolution: {integrity: sha512-RdJUflcE3cUzKiMqQgsCu06FPu9UdIJO0beYbPhHN4k6apgJtifcoCtT9bcxOpYBtpD2kCM6Sbzg4CausW/PKQ==}
js-yaml@4.1.1:
resolution: {integrity: sha512-qQKT4zQxXl8lLwBtHMWwaTcGfFOZviOJet3Oy/xmGk2gZH677CJM9EvtfdSkgWcATZhj/55JZ0rmy3myCT5lsA==}
hasBin: true
json-parse-even-better-errors@2.3.1:
resolution: {integrity: sha512-xyFwyhro/JEof6Ghe2iz2NcXoj2sloNsWr/XsERDK/oiPCfaNhl5ONfp+jQdAZRQQ0IJWNzH9zIZF7li91kh2w==}
kysely-codegen@0.19.0:
resolution: {integrity: sha512-ZpdQQnpfY0kh45CA6yPA9vdFsBE+b06Fx7QVcbL5rX//yjbA0yYGZGhnH7GTd4P4BY/HIv5uAfuOD83JVZf95w==}
engines: {node: '>=20.0.0'}
hasBin: true
peerDependencies:
'@libsql/kysely-libsql': '>=0.3.0 <0.5.0'
'@tediousjs/connection-string': '>=0.5.0 <0.6.0'
better-sqlite3: '>=7.6.2 <13.0.0'
kysely: '>=0.27.0 <1.0.0'
kysely-bun-sqlite: '>=0.3.2 <1.0.0'
kysely-bun-worker: '>=1.2.0 <2.0.0'
mysql2: '>=2.3.3 <4.0.0'
pg: '>=8.8.0 <9.0.0'
tarn: '>=3.0.0 <4.0.0'
tedious: '>=18.0.0 <20.0.0'
peerDependenciesMeta:
'@libsql/kysely-libsql':
optional: true
'@tediousjs/connection-string':
optional: true
better-sqlite3:
optional: true
kysely-bun-sqlite:
optional: true
kysely-bun-worker:
optional: true
mysql2:
optional: true
pg:
optional: true
tarn:
optional: true
tedious:
optional: true
kysely@0.28.9:
resolution: {integrity: sha512-3BeXMoiOhpOwu62CiVpO6lxfq4eS6KMYfQdMsN/2kUCRNuF2YiEr7u0HLHaQU+O4Xu8YXE3bHVkwaQ85i72EuA==}
engines: {node: '>=20.0.0'}
lines-and-columns@1.2.4:
resolution: {integrity: sha512-7ylylesZQ/PV29jhEDl3Ufjo6ZX7gCqJr5F7PKrqc93v7fzSymt1BpwEU8nAUXs8qzzvqhbjhK5QZg6Mt/HkBg==}
loglevel@1.9.2:
resolution: {integrity: sha512-HgMmCqIJSAKqo68l0rS2AanEWfkxaZ5wNiEFb5ggm08lDs9Xl2KxBlX3PTcaD2chBM1gXAYf491/M2Rv8Jwayg==}
engines: {node: '>= 0.6.0'}
make-error@1.3.6:
resolution: {integrity: sha512-s8UhlNe7vPKomQhC1qFelMokr/Sc3AgNbso3n74mVPA5LTZwkB9NlXf4XPamLxJE8h0gh73rM94xvwRT2CVInw==}
@@ -601,6 +773,10 @@ packages:
resolution: {integrity: sha512-Snk314V5ayFLhp3fkUREub6WtjBfPdCPY1Ln8/8munuLuiYhsABgBVWsozAG+MWMbVEvcdcpbi9R7ww22l9Q3g==}
engines: {node: '>=18'}
micromatch@4.0.8:
resolution: {integrity: sha512-PXwfBhYu0hBCPw8Dn0E+WDYb7af3dSLVWKi3HGv84IdF4TyFoC0ysxFd0Goxw7nSv4T/PzEJQxsYsEiFCKo2BA==}
engines: {node: '>=8.6'}
mime-db@1.54.0:
resolution: {integrity: sha512-aU5EJuIN2WDemCcAp2vFBfp/m4EAhWJnUNSSw0ixs7/kXbd6Pg64EmwJkNdFhB8aWt1sH2CTXrLxo/iAGV3oPQ==}
engines: {node: '>= 0.6'}
@@ -612,6 +788,9 @@ packages:
minimatch@3.1.2:
resolution: {integrity: sha512-J7p63hRiAjw1NDEww1W7i37+ByIrOWO5XQQAzZ3VOcL0PNybwpfmV/N05zFAzwQ9USyEcX6t3UO+K5aqBQOIHw==}
minimist@1.2.8:
resolution: {integrity: sha512-2yyAR8qBkN3YuheJanUpWC5U3bb5osDywNB8RzDVlDwDHbocAJveqqj1u8+SVD7jkWT4yvsHCpWqqWqAxb0zCA==}
ms@2.1.3:
resolution: {integrity: sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA==}
@@ -649,10 +828,25 @@ packages:
once@1.4.0:
resolution: {integrity: sha512-lNaJgI+2Q5URQBkccEKHTQOPaXdUxnZZElQTZY0MFUAuaEqe1E+Nyvgdz/aIyNi6Z9MzO5dv1H8n58/GELp3+w==}
parent-module@1.0.1:
resolution: {integrity: sha512-GQ2EWRpQV8/o+Aw8YqtfZZPfNRWZYkbidE9k5rpl/hC3vtHHBfGm2Ifi6qWV+coDGkrUKZAxE3Lot5kcsRlh+g==}
engines: {node: '>=6'}
parse-json@5.2.0:
resolution: {integrity: sha512-ayCKvm/phCGxOkYRSCM82iDwct8/EonSEgCSxWxD7ve6jHggsFl4fZVQBPRNgQoKiuV/odhFrGzQXZwbifC8Rg==}
engines: {node: '>=8'}
parseurl@1.3.3:
resolution: {integrity: sha512-CiyeOxFT/JZyN5m0z9PfXw4SCBJ6Sygz1Dpl0wqjlhDEGGBP1GnsUVEL0p63hoG1fcj3fHynXi9NYO4nWOL+qQ==}
engines: {node: '>= 0.8'}
path-is-absolute@1.0.1:
resolution: {integrity: sha512-AVbw3UJ2e9bq64vSaS9Am0fje1Pa8pbGqTTsmXfaIiMpnr5DlDhfJOuLj9Sf95ZPVDAUerDfEk88MPmPe7UCQg==}
engines: {node: '>=0.10.0'}
path-parse@1.0.7:
resolution: {integrity: sha512-LDJzPVEEEPR+y48z93A0Ed0yXb8pAByGWo/k5YYdYgpY2/2EsOsksJrq7lOHxryrVOn1ejG6oAp8ahvOIQD8sw==}
path-to-regexp@8.3.0:
resolution: {integrity: sha512-7jdwVIRtsP8MYpdXSwOS0YdD0Du+qOoF/AEPIt88PcCFrZCzx41oxku1jD88hZBwbNUIEfpqvuhjFaMAqMTWnA==}
@@ -690,10 +884,17 @@ packages:
pgpass@1.0.5:
resolution: {integrity: sha512-FdW9r/jQZhSeohs1Z3sI1yxFQNFvMcnmfuj4WBMUTxOrAyLMaTcE1aAMBiTlbMNaXvBCQuVi0R7hd8udDSP7ug==}
picocolors@1.1.1:
resolution: {integrity: sha512-xceH2snhtb5M9liqDsmEw56le376mTZkEX/jEb/RxNFyegNul7eNslCXP9FDj/Lcu0X8KEyMceP2ntpaHrDEVA==}
picomatch@2.3.1:
resolution: {integrity: sha512-JU3teHTNjmE2VCGFzuY8EXzCDVwEqB2a8fsIvwaStHhAWJEeVd1o1QD80CU6+ZdEXXSLbSsuLwJjkCBWqRQUVA==}
engines: {node: '>=8.6'}
pluralize@8.0.0:
resolution: {integrity: sha512-Nc3IT5yHzflTfbjgqWcCPpo7DaKy4FnpB0l/zCAW0Tc7jxAiuqSxHasntB3D7887LSrA93kDJ9IXovxJYxyLCA==}
engines: {node: '>=4'}
postgres-array@2.0.0:
resolution: {integrity: sha512-VpZrUqU5A69eQyW2c5CA1jtLecCsN2U/bD6VilrFDWq5+5UIEVO7nazS3TEcHf1zuPYO/sqGvUvW62g86RXZuA==}
engines: {node: '>=4'}
@@ -733,9 +934,22 @@ packages:
resolution: {integrity: sha512-hOS089on8RduqdbhvQ5Z37A0ESjsqz6qnRcffsMU3495FuTdqSm+7bhJ29JvIOsBDEEnan5DPu9t3To9VRlMzA==}
engines: {node: '>=8.10.0'}
rechoir@0.6.2:
resolution: {integrity: sha512-HFM8rkZ+i3zrV+4LQjwQ0W+ez98pApMGM3HUrN04j3CqzPOzl9nmP15Y8YXNm8QHGv/eacOVEjqhmWpkRV0NAw==}
engines: {node: '>= 0.10'}
resolve-from@4.0.0:
resolution: {integrity: sha512-pb/MYmXstAkysRFx8piNI1tGFNQIFA3vkE3Gq4EuA1dF6gHp/+vgZqsCGJapvy8N3Q+4o7FwvquPJcnZ7RYy4g==}
engines: {node: '>=4'}
resolve-pkg-maps@1.0.0:
resolution: {integrity: sha512-seS2Tj26TBVOC2NIc2rOe2y2ZO7efxITtLZcGSOnHHNOQ7CkiUBfw0Iw2ck6xkIhPwLhKNLS8BO+hEpngQlqzw==}
resolve@1.22.11:
resolution: {integrity: sha512-RfqAvLnMl313r7c9oclB1HhUEAezcpLjz95wFH4LVuhk9JF/r22qmVP9AMmOU4vMX7Q8pN8jwNg/CSpdFnMjTQ==}
engines: {node: '>= 0.4'}
hasBin: true
router@2.2.0:
resolution: {integrity: sha512-nLTrUKm2UyiL7rlhapu/Zl45FwNgkZGaCpZbIHajDYgwlJCOzLSk+cIPAnsEqV955GjILJnKbdQC1nVPz+gAYQ==}
engines: {node: '>= 18'}
@@ -762,6 +976,15 @@ packages:
setprototypeof@1.2.0:
resolution: {integrity: sha512-E5LDX7Wrp85Kil5bhZv46j8jOeboKq5JMmYM3gVGdGH8xFpPWXUMsNrlODCrkoxMEeNi/XZIwuRvY4XNwYMJpw==}
shelljs.exec@1.1.8:
resolution: {integrity: sha512-vFILCw+lzUtiwBAHV8/Ex8JsFjelFMdhONIsgKNLgTzeRckp2AOYRQtHJE/9LhNvdMmE27AGtzWx0+DHpwIwSw==}
engines: {node: '>= 4.0.0'}
shelljs@0.8.5:
resolution: {integrity: sha512-TiwcRcrkhHvbrZbnRcFYMLl30Dfov3HKqzp5tO5b4pt6G/SezKcYhmDg15zXVBswHmctSAQKznqNW2LO5tTDow==}
engines: {node: '>=4'}
hasBin: true
side-channel-list@1.0.0:
resolution: {integrity: sha512-FCLHtRD/gnpCiCHEiJLOwdmFP+wzCmDEkc9y7NsYxeF4u7Btsn1ZuwgwJGxImImHicJArLP4R0yX4c2KCrMrTA==}
engines: {node: '>= 0.4'}
@@ -798,6 +1021,14 @@ packages:
resolution: {integrity: sha512-QjVjwdXIt408MIiAqCX4oUKsgU2EqAGzs2Ppkm4aQYbjm+ZEWEcW4SfFNTr4uMNZma0ey4f5lgLrkB0aX0QMow==}
engines: {node: '>=4'}
supports-color@7.2.0:
resolution: {integrity: sha512-qpCAvRl9stuOHveKsn7HncJRvv501qIacKzQlO/+Lwxc9+0q2wLyv4Dfvt80/DPn2pqOBsJdDiogXGR9+OvwRw==}
engines: {node: '>=8'}
supports-preserve-symlinks-flag@1.0.0:
resolution: {integrity: sha512-ot0WnXS9fgdkgIcePe6RHNk1WA8+muPa6cSjeR3V8K27q9BB1rTE3R1p7Hv0z1ZyAc8s6Vvv8DIyWf681MAt0w==}
engines: {node: '>= 0.4'}
to-regex-range@5.0.1:
resolution: {integrity: sha512-65P7iz6X5yEr1cwcgvQxbbIw7Uk3gOy5dIdtZ4rDveLqhrdJP+Li/Hx6tyK0NEb+2GCyneCMJiGqrADCSNk8sQ==}
engines: {node: '>=8.0'}
@@ -837,6 +1068,9 @@ packages:
resolution: {integrity: sha512-OZs6gsjF4vMp32qrCbiVSkrFmXtG/AZhY3t0iAMrMBiAZyV9oALtXO8hsrHbMXF9x6L3grlFuwW2oAz7cav+Gw==}
engines: {node: '>= 0.6'}
typeid-js@1.2.0:
resolution: {integrity: sha512-t76ZucAnvGC60ea/HjVsB0TSoB0cw9yjnfurUgtInXQWUI/VcrlZGpO23KN3iSe8yOGUgb1zr7W7uEzJ3hSljA==}
typescript@5.9.3:
resolution: {integrity: sha512-jl1vZzPDinLr9eUt3J/t7V6FgNEw9QjvBPdysz9KfQDD41fQrC2Y4vKQdiaUpFT4bXlb1RHhLpp8wtm6M5TgSw==}
engines: {node: '>=14.17'}
@@ -852,6 +1086,10 @@ packages:
resolution: {integrity: sha512-pjy2bYhSsufwWlKwPc+l3cN7+wuJlK6uz0YdJEOlQDbl6jo/YlPi4mb8agUkVC8BF7V8NuzeyPNqRksA3hztKQ==}
engines: {node: '>= 0.8'}
uuid@10.0.0:
resolution: {integrity: sha512-8XkAphELsDnEGrDxUOHB3RGvXz6TeuYSGEZBOjtTtPm2lwhGBjLgOzLHB63IUWfBpNucQjND6d3AOudO+H3RWQ==}
hasBin: true
v8-compile-cache-lib@3.0.1:
resolution: {integrity: sha512-wa7YjyUGfNZngI/vtK0UHAN+lgDCxBPCylVXGp0zu59Fz5aiGtNXaq3DhIov063MorB+VfufLh3JlF2KdTK3xg==}
@@ -875,6 +1113,14 @@ packages:
snapshots:
'@babel/code-frame@7.28.6':
dependencies:
'@babel/helper-validator-identifier': 7.28.5
js-tokens: 4.0.0
picocolors: 1.1.1
'@babel/helper-validator-identifier@7.28.5': {}
'@biomejs/biome@2.3.10':
optionalDependencies:
'@biomejs/cli-darwin-arm64': 2.3.10
@@ -1081,6 +1327,14 @@ snapshots:
acorn@8.15.0: {}
ansi-styles@3.2.1:
dependencies:
color-convert: 1.9.3
ansi-styles@4.3.0:
dependencies:
color-convert: 2.0.1
anymatch@3.1.3:
dependencies:
normalize-path: 3.0.0
@@ -1088,6 +1342,8 @@ snapshots:
arg@4.1.3: {}
argparse@2.0.1: {}
asap@2.0.6: {}
balanced-match@1.0.2: {}
@@ -1129,6 +1385,19 @@ snapshots:
call-bind-apply-helpers: 1.0.2
get-intrinsic: 1.3.0
callsites@3.1.0: {}
chalk@2.4.2:
dependencies:
ansi-styles: 3.2.1
escape-string-regexp: 1.0.5
supports-color: 5.5.0
chalk@4.1.2:
dependencies:
ansi-styles: 4.3.0
supports-color: 7.2.0
chokidar@3.6.0:
dependencies:
anymatch: 3.1.3
@@ -1141,6 +1410,18 @@ snapshots:
optionalDependencies:
fsevents: 2.3.3
color-convert@1.9.3:
dependencies:
color-name: 1.1.3
color-convert@2.0.1:
dependencies:
color-name: 1.1.4
color-name@1.1.3: {}
color-name@1.1.4: {}
commander@5.1.0: {}
concat-map@0.0.1: {}
@@ -1155,6 +1436,15 @@ snapshots:
cookie@0.7.2: {}
cosmiconfig@9.0.0(typescript@5.9.3):
dependencies:
env-paths: 2.2.1
import-fresh: 3.3.1
js-yaml: 4.1.1
parse-json: 5.2.0
optionalDependencies:
typescript: 5.9.3
create-require@1.1.1: {}
debug@4.4.3(supports-color@5.5.0):
@@ -1165,8 +1455,18 @@ snapshots:
depd@2.0.0: {}
diff@3.5.0: {}
diff@4.0.2: {}
dotenv-expand@12.0.3:
dependencies:
dotenv: 16.6.1
dotenv@16.6.1: {}
dotenv@17.2.3: {}
dunder-proto@1.0.1:
dependencies:
call-bind-apply-helpers: 1.0.2
@@ -1177,6 +1477,12 @@ snapshots:
encodeurl@2.0.0: {}
env-paths@2.2.1: {}
error-ex@1.3.4:
dependencies:
is-arrayish: 0.2.1
es-define-property@1.0.1: {}
es-errors@1.3.0: {}
@@ -1216,6 +1522,8 @@ snapshots:
escape-html@1.0.3: {}
escape-string-regexp@1.0.5: {}
etag@1.8.1: {}
express@5.1.0:
@@ -1269,6 +1577,8 @@ snapshots:
fresh@2.0.0: {}
fs.realpath@1.0.0: {}
fsevents@2.3.3:
optional: true
@@ -1296,14 +1606,33 @@ snapshots:
dependencies:
resolve-pkg-maps: 1.0.0
git-diff@2.0.6:
dependencies:
chalk: 2.4.2
diff: 3.5.0
loglevel: 1.9.2
shelljs: 0.8.5
shelljs.exec: 1.1.8
glob-parent@5.1.2:
dependencies:
is-glob: 4.0.3
glob@7.2.3:
dependencies:
fs.realpath: 1.0.0
inflight: 1.0.6
inherits: 2.0.4
minimatch: 3.1.2
once: 1.4.0
path-is-absolute: 1.0.1
gopd@1.2.0: {}
has-flag@3.0.0: {}
has-flag@4.0.0: {}
has-symbols@1.1.0: {}
hasown@2.0.2:
@@ -1328,14 +1657,32 @@ snapshots:
ignore-by-default@1.0.1: {}
import-fresh@3.3.1:
dependencies:
parent-module: 1.0.1
resolve-from: 4.0.0
inflight@1.0.6:
dependencies:
once: 1.4.0
wrappy: 1.0.2
inherits@2.0.4: {}
interpret@1.4.0: {}
ipaddr.js@1.9.1: {}
is-arrayish@0.2.1: {}
is-binary-path@2.1.0:
dependencies:
binary-extensions: 2.3.0
is-core-module@2.16.1:
dependencies:
hasown: 2.0.2
is-extglob@2.1.1: {}
is-glob@4.0.3:
@@ -1346,8 +1693,37 @@ snapshots:
is-promise@4.0.0: {}
js-tokens@4.0.0: {}
js-yaml@4.1.1:
dependencies:
argparse: 2.0.1
json-parse-even-better-errors@2.3.1: {}
kysely-codegen@0.19.0(kysely@0.28.9)(pg@8.16.3)(typescript@5.9.3):
dependencies:
chalk: 4.1.2
cosmiconfig: 9.0.0(typescript@5.9.3)
dotenv: 17.2.3
dotenv-expand: 12.0.3
git-diff: 2.0.6
kysely: 0.28.9
micromatch: 4.0.8
minimist: 1.2.8
pluralize: 8.0.0
zod: 4.1.12
optionalDependencies:
pg: 8.16.3
transitivePeerDependencies:
- typescript
kysely@0.28.9: {}
lines-and-columns@1.2.4: {}
loglevel@1.9.2: {}
make-error@1.3.6: {}
math-intrinsics@1.1.0: {}
@@ -1356,6 +1732,11 @@ snapshots:
merge-descriptors@2.0.0: {}
micromatch@4.0.8:
dependencies:
braces: 3.0.3
picomatch: 2.3.1
mime-db@1.54.0: {}
mime-types@3.0.1:
@@ -1366,6 +1747,8 @@ snapshots:
dependencies:
brace-expansion: 1.1.12
minimist@1.2.8: {}
ms@2.1.3: {}
negotiator@1.0.0: {}
@@ -1403,8 +1786,23 @@ snapshots:
dependencies:
wrappy: 1.0.2
parent-module@1.0.1:
dependencies:
callsites: 3.1.0
parse-json@5.2.0:
dependencies:
'@babel/code-frame': 7.28.6
error-ex: 1.3.4
json-parse-even-better-errors: 2.3.1
lines-and-columns: 1.2.4
parseurl@1.3.3: {}
path-is-absolute@1.0.1: {}
path-parse@1.0.7: {}
path-to-regexp@8.3.0: {}
pg-cloudflare@1.2.7:
@@ -1442,8 +1840,12 @@ snapshots:
dependencies:
split2: 4.2.0
picocolors@1.1.1: {}
picomatch@2.3.1: {}
pluralize@8.0.0: {}
postgres-array@2.0.0: {}
postgres-bytea@1.0.1: {}
@@ -1478,8 +1880,20 @@ snapshots:
dependencies:
picomatch: 2.3.1
rechoir@0.6.2:
dependencies:
resolve: 1.22.11
resolve-from@4.0.0: {}
resolve-pkg-maps@1.0.0: {}
resolve@1.22.11:
dependencies:
is-core-module: 2.16.1
path-parse: 1.0.7
supports-preserve-symlinks-flag: 1.0.0
router@2.2.0:
dependencies:
debug: 4.4.3(supports-color@5.5.0)
@@ -1523,6 +1937,14 @@ snapshots:
setprototypeof@1.2.0: {}
shelljs.exec@1.1.8: {}
shelljs@0.8.5:
dependencies:
glob: 7.2.3
interpret: 1.4.0
rechoir: 0.6.2
side-channel-list@1.0.0:
dependencies:
es-errors: 1.3.0
@@ -1565,6 +1987,12 @@ snapshots:
dependencies:
has-flag: 3.0.0
supports-color@7.2.0:
dependencies:
has-flag: 4.0.0
supports-preserve-symlinks-flag@1.0.0: {}
to-regex-range@5.0.1:
dependencies:
is-number: 7.0.0
@@ -1606,6 +2034,10 @@ snapshots:
media-typer: 1.1.0
mime-types: 3.0.1
typeid-js@1.2.0:
dependencies:
uuid: 10.0.0
typescript@5.9.3: {}
undefsafe@2.0.5: {}
@@ -1614,6 +2046,8 @@ snapshots:
unpipe@1.0.0: {}
uuid@10.0.0: {}
v8-compile-cache-lib@3.0.1: {}
vary@1.1.2: {}


@@ -0,0 +1,179 @@
// Tests for types.ts
// Pure unit tests
import assert from "node:assert/strict";
import { describe, it } from "node:test";
import type { Request as ExpressRequest } from "express";
import { Session } from "./auth/types";
import { contentTypes } from "./content-types";
import { httpCodes } from "./http-codes";
import {
AuthenticationRequired,
AuthorizationDenied,
type Call,
isRedirect,
massageMethod,
methodParser,
type Permission,
type RedirectResult,
type Result,
requireAuth,
requirePermission,
} from "./types";
import { AuthenticatedUser, anonymousUser } from "./user";
// Helper to create a minimal mock Call
function createMockCall(overrides: Partial<Call> = {}): Call {
const defaultSession = new Session(null, anonymousUser);
return {
pattern: "/test",
path: "/test",
method: "GET",
parameters: {},
request: {} as ExpressRequest,
user: anonymousUser,
session: defaultSession,
...overrides,
};
}
describe("types", () => {
describe("methodParser", () => {
it("accepts valid HTTP methods", () => {
assert.equal(methodParser.parse("GET"), "GET");
assert.equal(methodParser.parse("POST"), "POST");
assert.equal(methodParser.parse("PUT"), "PUT");
assert.equal(methodParser.parse("PATCH"), "PATCH");
assert.equal(methodParser.parse("DELETE"), "DELETE");
});
it("rejects invalid methods", () => {
assert.throws(() => methodParser.parse("get"));
assert.throws(() => methodParser.parse("OPTIONS"));
assert.throws(() => methodParser.parse("HEAD"));
assert.throws(() => methodParser.parse(""));
});
});
describe("massageMethod", () => {
it("converts lowercase to uppercase", () => {
assert.equal(massageMethod("get"), "GET");
assert.equal(massageMethod("post"), "POST");
assert.equal(massageMethod("put"), "PUT");
assert.equal(massageMethod("patch"), "PATCH");
assert.equal(massageMethod("delete"), "DELETE");
});
it("handles mixed case", () => {
assert.equal(massageMethod("Get"), "GET");
assert.equal(massageMethod("pOsT"), "POST");
});
it("throws for invalid methods", () => {
assert.throws(() => massageMethod("options"));
assert.throws(() => massageMethod("head"));
});
});
describe("isRedirect", () => {
it("returns true for redirect results", () => {
const result: RedirectResult = {
code: httpCodes.redirection.Found,
contentType: contentTypes.text.html,
result: "",
redirect: "/other",
};
assert.equal(isRedirect(result), true);
});
it("returns false for non-redirect results", () => {
const result: Result = {
code: httpCodes.success.OK,
contentType: contentTypes.text.html,
result: "hello",
};
assert.equal(isRedirect(result), false);
});
});
describe("AuthenticationRequired", () => {
it("has correct name and message", () => {
const err = new AuthenticationRequired();
assert.equal(err.name, "AuthenticationRequired");
assert.equal(err.message, "Authentication required");
});
it("is an instance of Error", () => {
const err = new AuthenticationRequired();
assert.ok(err instanceof Error);
});
});
describe("AuthorizationDenied", () => {
it("has correct name and message", () => {
const err = new AuthorizationDenied();
assert.equal(err.name, "AuthorizationDenied");
assert.equal(err.message, "Authorization denied");
});
it("is an instance of Error", () => {
const err = new AuthorizationDenied();
assert.ok(err instanceof Error);
});
});
describe("requireAuth", () => {
it("returns user for authenticated call", () => {
const user = AuthenticatedUser.create("test@example.com");
const session = new Session(null, user);
const call = createMockCall({ user, session });
const result = requireAuth(call);
assert.equal(result, user);
});
it("throws AuthenticationRequired for anonymous user", () => {
const call = createMockCall({ user: anonymousUser });
assert.throws(() => requireAuth(call), AuthenticationRequired);
});
});
describe("requirePermission", () => {
it("returns user when they have the permission", () => {
const user = AuthenticatedUser.create("test@example.com", {
permissions: ["posts:create" as Permission],
});
const session = new Session(null, user);
const call = createMockCall({ user, session });
const result = requirePermission(
call,
"posts:create" as Permission,
);
assert.equal(result, user);
});
it("throws AuthenticationRequired for anonymous user", () => {
const call = createMockCall({ user: anonymousUser });
assert.throws(
() => requirePermission(call, "posts:create" as Permission),
AuthenticationRequired,
);
});
it("throws AuthorizationDenied when missing permission", () => {
const user = AuthenticatedUser.create("test@example.com", {
permissions: ["posts:read" as Permission],
});
const session = new Session(null, user);
const call = createMockCall({ user, session });
assert.throws(
() => requirePermission(call, "posts:create" as Permission),
AuthorizationDenied,
);
});
});
});


@@ -112,4 +112,6 @@ export function requirePermission(call: Call, permission: Permission): User {
return user;
}
export type Domain = "app" | "fw";
export { methodParser, massageMethod };


@@ -0,0 +1,213 @@
// Tests for user.ts
// These are pure unit tests - no database needed
import assert from "node:assert/strict";
import { describe, it } from "node:test";
import {
AnonymousUser,
AuthenticatedUser,
anonymousUser,
type Permission,
type Role,
} from "./user";
describe("User", () => {
describe("AuthenticatedUser.create", () => {
it("creates a user with default values", () => {
const user = AuthenticatedUser.create("test@example.com");
assert.equal(user.email, "test@example.com");
assert.equal(user.status, "active");
assert.equal(user.isAnonymous(), false);
assert.deepEqual([...user.roles], []);
assert.deepEqual([...user.permissions], []);
});
it("creates a user with custom values", () => {
const user = AuthenticatedUser.create("test@example.com", {
id: "custom-id",
displayName: "Test User",
status: "pending",
roles: ["admin"],
permissions: ["posts:create"],
});
assert.equal(user.id, "custom-id");
assert.equal(user.displayName, "Test User");
assert.equal(user.status, "pending");
assert.deepEqual([...user.roles], ["admin"]);
assert.deepEqual([...user.permissions], ["posts:create"]);
});
});
describe("status checks", () => {
it("isActive returns true for active users", () => {
const user = AuthenticatedUser.create("test@example.com", {
status: "active",
});
assert.equal(user.isActive(), true);
});
it("isActive returns false for suspended users", () => {
const user = AuthenticatedUser.create("test@example.com", {
status: "suspended",
});
assert.equal(user.isActive(), false);
});
it("isActive returns false for pending users", () => {
const user = AuthenticatedUser.create("test@example.com", {
status: "pending",
});
assert.equal(user.isActive(), false);
});
});
describe("role checks", () => {
it("hasRole returns true when user has the role", () => {
const user = AuthenticatedUser.create("test@example.com", {
roles: ["admin", "editor"],
});
assert.equal(user.hasRole("admin"), true);
assert.equal(user.hasRole("editor"), true);
});
it("hasRole returns false when user does not have the role", () => {
const user = AuthenticatedUser.create("test@example.com", {
roles: ["user"],
});
assert.equal(user.hasRole("admin"), false);
});
it("hasAnyRole returns true when user has at least one role", () => {
const user = AuthenticatedUser.create("test@example.com", {
roles: ["editor"],
});
assert.equal(user.hasAnyRole(["admin", "editor"]), true);
});
it("hasAnyRole returns false when user has none of the roles", () => {
const user = AuthenticatedUser.create("test@example.com", {
roles: ["user"],
});
assert.equal(user.hasAnyRole(["admin", "editor"]), false);
});
it("hasAllRoles returns true when user has all roles", () => {
const user = AuthenticatedUser.create("test@example.com", {
roles: ["admin", "editor", "user"],
});
assert.equal(user.hasAllRoles(["admin", "editor"]), true);
});
it("hasAllRoles returns false when user is missing a role", () => {
const user = AuthenticatedUser.create("test@example.com", {
roles: ["admin"],
});
assert.equal(user.hasAllRoles(["admin", "editor"]), false);
});
});
describe("permission checks", () => {
it("hasPermission returns true for direct permissions", () => {
const user = AuthenticatedUser.create("test@example.com", {
permissions: ["posts:create" as Permission],
});
assert.equal(
user.hasPermission("posts:create" as Permission),
true,
);
});
it("hasPermission returns true for role-derived permissions", () => {
const user = AuthenticatedUser.create("test@example.com", {
roles: ["admin" as Role],
});
// admin role has users:read, users:create, users:update, users:delete
      assert.equal(user.hasPermission("users:read" as Permission), true);
      assert.equal(
        user.hasPermission("users:delete" as Permission),
        true,
      );
    });
    it("hasPermission returns false when permission not granted", () => {
      const user = AuthenticatedUser.create("test@example.com", {
        roles: ["user" as Role],
      });
      // user role only has users:read
      assert.equal(
        user.hasPermission("users:delete" as Permission),
        false,
      );
    });
    it("can() is a convenience method for hasPermission", () => {
      const user = AuthenticatedUser.create("test@example.com", {
        roles: ["admin" as Role],
      });
      assert.equal(user.can("read", "users"), true);
      assert.equal(user.can("delete", "users"), true);
      assert.equal(user.can("create", "posts"), false);
    });
  });
  describe("effectivePermissions", () => {
    it("returns combined direct and role-derived permissions", () => {
      const user = AuthenticatedUser.create("test@example.com", {
        roles: ["user" as Role],
        permissions: ["posts:create" as Permission],
      });
      const perms = user.effectivePermissions();
      assert.equal(perms.has("posts:create" as Permission), true);
      assert.equal(perms.has("users:read" as Permission), true); // from user role
    });
    it("returns empty set for user with no roles or permissions", () => {
      const user = AuthenticatedUser.create("test@example.com");
      const perms = user.effectivePermissions();
      assert.equal(perms.size, 0);
    });
  });
  describe("serialization", () => {
    it("toJSON returns plain object", () => {
      const user = AuthenticatedUser.create("test@example.com", {
        id: "test-id",
        displayName: "Test",
        status: "active",
        roles: ["admin"],
        permissions: ["posts:create"],
      });
      const json = user.toJSON();
      assert.equal(json.id, "test-id");
      assert.equal(json.email, "test@example.com");
      assert.equal(json.displayName, "Test");
      assert.equal(json.status, "active");
      assert.deepEqual(json.roles, ["admin"]);
      assert.deepEqual(json.permissions, ["posts:create"]);
    });
    it("toString returns readable string", () => {
      const user = AuthenticatedUser.create("test@example.com", {
        id: "test-id",
      });
      assert.equal(user.toString(), "User(id test-id)");
    });
  });
  describe("AnonymousUser", () => {
    it("isAnonymous returns true", () => {
      const user = AnonymousUser.create("anon@example.com");
      assert.equal(user.isAnonymous(), true);
    });
    it("anonymousUser singleton is anonymous", () => {
      assert.equal(anonymousUser.isAnonymous(), true);
      assert.equal(anonymousUser.id, "-1");
      assert.equal(anonymousUser.email, "anonymous@example.com");
    });
  });
});
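The role/permission behavior these tests pin down can be summarized in a small standalone sketch: roles expand to permission sets, and effective permissions are the union of direct grants and role-derived ones. The role table and free function names here are assumptions for illustration; the real `AuthenticatedUser` class may organize this differently.

```typescript
type Permission = string;
type Role = string;

// Assumed role → permission table, mirroring what the tests imply
// ("user" only has users:read; "admin" also has users:delete).
const rolePermissions: Record<Role, Permission[]> = {
  user: ["users:read"],
  admin: ["users:read", "users:delete"],
};

// Union of direct grants and everything the user's roles confer.
function effectivePermissions(roles: Role[], direct: Permission[]): Set<Permission> {
  const out = new Set<Permission>(direct);
  for (const role of roles) {
    for (const perm of rolePermissions[role] ?? []) out.add(perm);
  }
  return out;
}

// hasPermission is then just membership in the effective set.
function hasPermission(roles: Role[], direct: Permission[], perm: Permission): boolean {
  return effectivePermissions(roles, direct).has(perm);
}
```

With this shape, `can("read", "users")` is naturally a convenience wrapper that joins its arguments into `"users:read"` before checking membership.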


@@ -0,0 +1,61 @@
// Tests for util.ts
// Unit tests that exercise the real filesystem via a temporary directory
import assert from "node:assert/strict";
import { mkdir, rm, writeFile } from "node:fs/promises";
import { join } from "node:path";
import { after, before, describe, it } from "node:test";
import { loadFile } from "./util";

describe("util", () => {
  const testDir = join(import.meta.dirname, ".test-util-tmp");
  before(async () => {
    await mkdir(testDir, { recursive: true });
  });
  after(async () => {
    await rm(testDir, { recursive: true, force: true });
  });
  describe("loadFile", () => {
    it("loads file contents as string", async () => {
      const testFile = join(testDir, "test.txt");
      await writeFile(testFile, "hello world");
      const content = await loadFile(testFile);
      assert.equal(content, "hello world");
    });
    it("handles utf-8 content", async () => {
      const testFile = join(testDir, "utf8.txt");
      await writeFile(testFile, "hello \u{1F511} world");
      const content = await loadFile(testFile);
      assert.equal(content, "hello \u{1F511} world");
    });
    it("handles empty file", async () => {
      const testFile = join(testDir, "empty.txt");
      await writeFile(testFile, "");
      const content = await loadFile(testFile);
      assert.equal(content, "");
    });
    it("handles multiline content", async () => {
      const testFile = join(testDir, "multiline.txt");
      await writeFile(testFile, "line1\nline2\nline3");
      const content = await loadFile(testFile);
      assert.equal(content, "line1\nline2\nline3");
    });
    it("throws for nonexistent file", async () => {
      await assert.rejects(
        loadFile(join(testDir, "nonexistent.txt")),
        /ENOENT/,
      );
    });
  });
});
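For reference, a minimal `loadFile` that would satisfy every test above is a thin wrapper over `readFile` that returns the contents as a UTF-8 string and lets filesystem errors (e.g. ENOENT) propagate. The real `./util` implementation may differ; this is a sketch, not the project's code.

```typescript
import { readFile } from "node:fs/promises";

// Hypothetical loadFile: read a file as UTF-8 text. Errors such as
// ENOENT reject the returned promise, which is what the tests assert.
export async function loadFile(path: string): Promise<string> {
  return readFile(path, { encoding: "utf-8" });
}
```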

backend/generated/db.d.ts vendored Normal file

@@ -0,0 +1,109 @@
/**
 * This file was generated by kysely-codegen.
 * Please do not edit it manually.
 */

import type { ColumnType } from "kysely";

export type Generated<T> =
  T extends ColumnType<infer S, infer I, infer U>
    ? ColumnType<S, I | undefined, U>
    : ColumnType<T, T | undefined, T>;

export type Timestamp = ColumnType<Date, Date | string, Date | string>;

export interface _Migrations {
  applied_at: Generated<Timestamp>;
  id: Generated<number>;
  name: string;
}

export interface Capabilities {
  description: string | null;
  id: string;
  name: string;
}

export interface Groups {
  created_at: Generated<Timestamp>;
  id: string;
  name: string;
}

export interface RoleCapabilities {
  capability_id: string;
  granted_at: Generated<Timestamp>;
  revoked_at: Timestamp | null;
  role_id: string;
}

export interface Roles {
  description: string | null;
  id: string;
  name: string;
}

export interface Sessions {
  auth_method: string;
  created_at: Generated<Timestamp>;
  expires_at: Timestamp;
  id: Generated<string>;
  ip_address: string | null;
  is_used: Generated<boolean | null>;
  revoked_at: Timestamp | null;
  token_hash: string;
  token_type: string;
  user_agent: string | null;
  user_email_id: string | null;
  user_id: string;
}

export interface UserCredentials {
  created_at: Generated<Timestamp>;
  credential_type: Generated<string>;
  id: string;
  password_hash: string | null;
  updated_at: Generated<Timestamp>;
  user_id: string;
}

export interface UserEmails {
  created_at: Generated<Timestamp>;
  email: string;
  id: string;
  is_primary: Generated<boolean>;
  is_verified: Generated<boolean>;
  normalized_email: string;
  revoked_at: Timestamp | null;
  user_id: string;
  verified_at: Timestamp | null;
}

export interface UserGroupRoles {
  granted_at: Generated<Timestamp>;
  group_id: string;
  revoked_at: Timestamp | null;
  role_id: string;
  user_id: string;
}

export interface Users {
  created_at: Generated<Timestamp>;
  display_name: string | null;
  id: string;
  status: Generated<string>;
  updated_at: Generated<Timestamp>;
}

export interface DB {
  _migrations: _Migrations;
  capabilities: Capabilities;
  groups: Groups;
  role_capabilities: RoleCapabilities;
  roles: Roles;
  sessions: Sessions;
  user_credentials: UserCredentials;
  user_emails: UserEmails;
  user_group_roles: UserGroupRoles;
  users: Users;
}


@@ -0,0 +1 @@
CREATE TABLE test_application_table ();


@@ -0,0 +1 @@
CREATE TABLE test_application_table ();

backend/package.json Normal file

@@ -0,0 +1,20 @@
{
  "name": "my-app",
  "version": "0.0.1",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "DB_PORT=5433 DB_USER=diachron_test DB_PASSWORD=diachron_test DB_NAME=diachron_test tsx --test '**/*.{test,spec}.ts'",
    "test:watch": "DB_PORT=5433 DB_USER=diachron_test DB_PASSWORD=diachron_test DB_NAME=diachron_test tsx --test --watch '**/*.{test,spec}.ts'",
    "nodemon": "nodemon dist/index.js",
    "kysely-codegen": "kysely-codegen"
  },
  "keywords": [],
  "author": "",
  "license": "ISC",
  "packageManager": "pnpm@10.12.4",
  "dependencies": {},
  "devDependencies": {}
}


@@ -0,0 +1,2 @@
packages:
- 'diachron'


@@ -2,13 +2,13 @@
 import nunjucks from "nunjucks";
 import { DateTime } from "ts-luxon";
-import { authRoutes } from "./auth/routes";
-import { routes as basicRoutes } from "./basic/routes";
-import { contentTypes } from "./content-types";
-import { core } from "./core";
-import { multiHandler } from "./handlers";
-import { httpCodes } from "./http-codes";
-import type { Call, Result, Route } from "./types";
+import { authRoutes } from "./diachron/auth/routes";
+import { routes as basicRoutes } from "./diachron/basic/routes";
+import { contentTypes } from "./diachron/content-types";
+import { core } from "./diachron/core";
+import { multiHandler } from "./diachron/handlers";
+import { httpCodes } from "./diachron/http-codes";
+import type { Call, Result, Route } from "./diachron/types";
 // FIXME: Obviously put this somewhere else
 const okText = (result: string): Result => {


@@ -6,7 +6,7 @@ DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 check_dir="$DIR"
-source "$check_dir"/../framework/shims/common
-source "$check_dir"/../framework/shims/node.common
+source "$check_dir"/../diachron/shims/common
+source "$check_dir"/../diachron/shims/node.common
 $ROOT/cmd pnpm tsc --showConfig


@@ -9,5 +9,6 @@
     "strict": true,
     "types": ["node"],
     "outDir": "out"
-  }
+  },
+  "exclude": ["**/*.spec.ts", "**/*.test.ts"]
 }


@@ -6,8 +6,8 @@ DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 check_dir="$DIR"
-source "$check_dir"/../framework/shims/common
-source "$check_dir"/../framework/shims/node.common
+source "$check_dir"/../diachron/shims/common
+source "$check_dir"/../diachron/shims/node.common
 # $ROOT/cmd pnpm tsc --lib ES2023 --esModuleInterop -w $check_dir/app.ts
 # $ROOT/cmd pnpm tsc -w $check_dir/app.ts


@@ -10,7 +10,7 @@ cd "$DIR"
 #
 exclusions="SC2002"
-source "$DIR/framework/versions"
+source "$DIR/diachron/versions"
 if [[ $# -ne 0 ]]; then
   shellcheck --exclude="$exclusions" "$@"
@@ -20,10 +20,10 @@ fi
 shell_scripts="$(fd .sh | xargs)"
 # The files we need to check all either end in .sh or else they're the files
-# in framework/cmd.d and framework/shims. -x instructs shellcheck to also
+# in diachron/cmd.d and diachron/shims. -x instructs shellcheck to also
 # check `source`d files.
-shellcheck -x --exclude="$exclusions" "$DIR/cmd" "$DIR"/framework/cmd.d/* "$DIR"/framework/shims/* "$shell_scripts"
+shellcheck -x --exclude="$exclusions" "$DIR/cmd" "$DIR"/diachron/cmd.d/* "$DIR"/diachron/shims/* "$shell_scripts"
 pushd "$DIR/master"
 docker run --rm -v $(pwd):/app -w /app golangci/golangci-lint:$golangci_lint golangci-lint run

cmd

@@ -2,20 +2,26 @@
 # This file belongs to the framework. You are not expected to modify it.
-# FIXME: Obviously this file isn't nearly robust enough. Make it so.
+# Managed binary runner - runs framework-managed binaries like node, pnpm, tsx
+# Usage: ./cmd <command> [args...]
 set -eu
 DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
+if [ $# -lt 1 ]; then
+  echo "Usage: ./cmd <command> [args...]"
+  echo ""
+  echo "Available commands:"
+  for cmd in "$DIR"/diachron/cmd.d/*; do
+    if [ -x "$cmd" ]; then
+      basename "$cmd"
+    fi
+  done
+  exit 1
+fi
 subcmd="$1"
-# echo "$subcmd"
-#exit 3
 shift
-echo will run "$DIR"/framework/cmd.d/"$subcmd" "$@"
-exec "$DIR"/framework/cmd.d/"$subcmd" "$@"
+exec "$DIR"/diachron/cmd.d/"$subcmd" "$@"
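The `./cmd` wrapper above (and `./develop` below) share one dispatcher pattern: subcommands live as executables in a `*.d` directory, the wrapper prints usage when called bare, and otherwise hands off to the chosen subcommand. A throwaway sketch of that pattern — the temp-dir paths are invented for illustration, and the sketch calls the subcommand instead of `exec`ing so it can return:

```shell
#!/bin/bash
# Sketch of the cmd.d/develop.d dispatcher pattern on a temp directory.
set -eu

dir="$(mktemp -d)"
mkdir -p "$dir/cmd.d"
# Install one example subcommand named "hello".
printf '#!/bin/bash\necho "hello $*"\n' > "$dir/cmd.d/hello"
chmod +x "$dir/cmd.d/hello"

run() {
  if [ $# -lt 1 ]; then
    # With no arguments, the real wrappers list executables in the .d dir.
    echo "Usage: run <command> [args...]" >&2
    return 1
  fi
  local subcmd="$1"
  shift
  # The real wrappers exec here, replacing the shell process.
  "$dir/cmd.d/$subcmd" "$@"
}

out="$(run hello world)"
echo "$out"
rm -rf "$dir"
```

The payoff of the pattern is that adding a new top-level command is just dropping an executable into the `.d` directory; the wrapper needs no edits.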

develop Executable file

@@ -0,0 +1,27 @@
#!/bin/bash
# This file belongs to the framework. You are not expected to modify it.
# Development command runner - parallel to ./mgmt for development tasks
# Usage: ./develop <command> [args...]
set -eu
DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
if [ $# -lt 1 ]; then
  echo "Usage: ./develop <command> [args...]"
  echo ""
  echo "Available commands:"
  for cmd in "$DIR"/diachron/develop.d/*; do
    if [ -x "$cmd" ]; then
      basename "$cmd"
    fi
  done
  exit 1
fi
subcmd="$1"
shift
exec "$DIR"/diachron/develop.d/"$subcmd" "$@"

diachron/.nodejs/.gitignore vendored Normal file

diachron/binaries/.gitignore vendored Normal file

diachron/cmd.d/test Executable file

@@ -0,0 +1,15 @@
#!/bin/bash
set -eu
DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
cd "$DIR/../../backend"
if [ $# -eq 0 ]; then
  # Find all test files - use -print0/xargs to handle filenames safely
  find . -type f \( -name '*.spec.ts' -o -name '*.test.ts' \) -print0 |
    xargs -0 "$DIR"/../shims/pnpm tsx --test
else
  "$DIR"/../shims/pnpm tsx --test "$@"
fi


@@ -5,5 +5,5 @@ set -eu
 DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 ROOT="$DIR/../.."
-cd "$ROOT/express"
+cd "$ROOT/backend"
 "$DIR"/tsx migrate.ts "$@"

diachron/develop.d/clear-db Executable file

@@ -0,0 +1,11 @@
#!/bin/bash
# This file belongs to the framework. You are not expected to modify it.
set -eu
DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
ROOT="$DIR/../.."
cd "$ROOT/express"
"$DIR"/../cmd.d/tsx develop/clear-db.ts "$@"

Some files were not shown because too many files have changed in this diff.