95 Commits

Author SHA1 Message Date
5947dcdc86 Add ? prefix for sample files in file-list; fix bootstrap/upgrade
bootstrap.sh wrote .diachron-version to the temp clone directory
instead of the target project, causing upgrade.sh to fail. Fix that
and teach all three scripts (bootstrap, upgrade, diff-upstream) about
the new ? prefix convention in file-list.

Sample files (?-prefixed) are copied on bootstrap but left alone on
upgrade so user modifications are preserved. New samples introduced
in a newer framework version are still copied if absent.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-09 06:58:09 -05:00
f0aca17a0a Pass URL parameters from path-to-regexp match into Call.parameters
Call.parameters was hardcoded to a placeholder object. Now the matched
route parameters are threaded through from the dispatcher to the handler
so handlers can access e.g. call.parameters.word.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-08 17:24:26 -05:00
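The threading this commit describes can be sketched as follows. This is a hypothetical reconstruction (`dispatch` and `matchRoute` are illustrative names, and the `Call` shape is simplified): the real dispatcher uses path-to-regexp's `match()`, which a minimal `:name` extractor stands in for here so the sketch is self-contained.

```typescript
// Simplified Call shape; the framework's real type has more fields.
interface Call {
  path: string;
  parameters: Record<string, string>;
}

// Minimal :name matcher standing in for path-to-regexp's match().
function matchRoute(
  pattern: string,
  path: string,
): Record<string, string> | null {
  const patternParts = pattern.split("/");
  const pathParts = path.split("/");
  if (patternParts.length !== pathParts.length) return null;
  const params: Record<string, string> = {};
  for (let i = 0; i < patternParts.length; i++) {
    if (patternParts[i].startsWith(":")) {
      params[patternParts[i].slice(1)] = decodeURIComponent(pathParts[i]);
    } else if (patternParts[i] !== pathParts[i]) {
      return null;
    }
  }
  return params;
}

// Before the fix, call.parameters was a hardcoded placeholder; after it,
// the dispatcher threads the matched params into the Call it builds.
function dispatch(pattern: string, path: string): Call | null {
  const params = matchRoute(pattern, path);
  if (params === null) return null;
  return { path, parameters: params };
}
```

With this in place a handler can read `call.parameters.word` for a route like `/define/:word`.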
5be7b84972 Point CLAUDE.md and AGENTS.md to diachron/AGENTS.md
Ensures coding agents auto-discover the framework guide regardless
of which root-level instruction file their tool reads.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-08 16:50:07 -05:00
ae077886ba Add diff-upstream.sh and document framework change workflow
New script extracts a diff of framework files against the upstream
ref in .diachron-version.  Both DIACHRON.md and diachron/AGENTS.md
now explain how to use it and recommend keeping framework changes
in discrete, well-described commits to ease upstreaming.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-08 16:43:43 -05:00
8318e60c33 Add newcomer guide, agent guide, and sample app files
DIACHRON.md explains the framework to newcomers joining a
diachron-based project.  diachron/AGENTS.md helps AI coding
agents work with the framework conventions and commands.
backend/types.ts and backend/services.ts are sample starting
points for application-specific types and services.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-08 13:38:10 -05:00
cf04ecc78a Add todo item 2026-02-08 09:42:33 -05:00
82e87577cc Get base files closer to being bootstrappable 2026-02-08 09:42:23 -05:00
98f1f554c1 Fix add-user command 2026-02-08 08:04:59 -05:00
554f0e0044 . 2026-02-08 08:04:49 -05:00
530e3dccef Add todo item 2026-02-08 08:04:38 -05:00
91780b6dca Add formatted error pages for console and browser
Errors now show app frames highlighted with relative paths and library
frames collapsed, both in ANSI on the console and as a styled HTML page
in the browser. Process-level uncaughtException/unhandledRejection
handlers also use the formatter.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-07 18:07:41 -05:00
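The app-frame/library-frame split this commit describes might be classified roughly like this sketch (the names and the plain-text output are illustrative; the real formatter emits ANSI and styled HTML):

```typescript
interface Frame {
  file: string;
  line: number;
}

// App frames live under the project root and outside node_modules.
function isAppFrame(frame: Frame, projectRoot: string): boolean {
  return (
    frame.file.startsWith(projectRoot) && !frame.file.includes("node_modules")
  );
}

// App frames: highlighted, shown with project-relative paths.
// Library frames: collapsed/dimmed.
function formatFrame(frame: Frame, projectRoot: string): string {
  if (isAppFrame(frame, projectRoot)) {
    return `> ${frame.file.slice(projectRoot.length + 1)}:${frame.line}`;
  }
  return `  (${frame.file}:${frame.line})`;
}
```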
a39ed37a03 Enable source maps for readable stack traces
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-07 17:05:18 -05:00
35165dcefe Wire up node_modules 2026-02-07 16:56:13 -05:00
dbd4e0a687 Add a note re necessary software 2026-02-07 16:55:54 -05:00
7b271da2b8 Fix some commands 2026-02-07 16:55:33 -05:00
940cef138e Suppress duplicate tar output in bootstrap and upgrade scripts
Verbose on the sending tar, quiet on the receiving tar, so the
file list prints once.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-07 10:13:22 -05:00
296e460326 Implement upgrade.sh for framework version upgrades
Removes old framework files (per current file-list), copies in new
ones from the target ref, and stages everything for the user to
review before committing. Also adds file-list to itself so it
gets upgraded too.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-07 10:10:09 -05:00
738e622fdc Add bootstrap and cached repository scripts
bootstrap.sh clones from a local mirror and extracts framework
files into the working directory. update-cached-repository.sh
maintains the mirror under ~/.cache/diachron/v1/repositories/.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-07 09:36:20 -05:00
034b035a91 Cache downloaded binaries in ~/.cache/diachron/v1/binaries
Downloads Node.js and pnpm to a shared cache directory, then
copies into the project tree. Repeated project bootstraps skip
the network entirely if the cache is warm.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-07 09:12:21 -05:00
f352ae44e1 Outline what the first version of the upgrade script should do 2026-02-02 20:03:19 -05:00
341db4f821 Add dependency duplication check between app and framework
Adds check-deps.ts which ensures backend/package.json doesn't duplicate
any dependencies already provided by backend/diachron/package.json.
Integrated into backend/check.sh.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-02 19:47:05 -05:00
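The duplication check could look roughly like this sketch (the actual check-deps.ts may differ; `Manifest` and `duplicatedDeps` are illustrative names):

```typescript
type Manifest = {
  dependencies?: Record<string, string>;
  devDependencies?: Record<string, string>;
};

// Returns every dependency name that the app manifest declares even
// though the framework manifest already provides it.
function duplicatedDeps(app: Manifest, framework: Manifest): string[] {
  const frameworkDeps = new Set([
    ...Object.keys(framework.dependencies ?? {}),
    ...Object.keys(framework.devDependencies ?? {}),
  ]);
  const appDeps = [
    ...Object.keys(app.dependencies ?? {}),
    ...Object.keys(app.devDependencies ?? {}),
  ];
  return appDeps.filter((name) => frameworkDeps.has(name)).sort();
}
```

A check script would read both package.json files, call this, and exit non-zero if the list is non-empty.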
eabec3816b Add bootstrap.sh script
It's meant to be used to bootstrap new projects.  It could probably be curled
and piped through bash although this has not been tested yet.
2026-02-02 19:42:59 -05:00
b752eb5080 Make shfmt happier 2026-02-02 18:32:53 -05:00
1ed5aa4b33 Reorder some imports 2026-02-02 18:32:39 -05:00
4d1c30b874 Fix some stragglers 2026-02-02 18:31:03 -05:00
02edf259f0 Rename framework/ to diachron/ and update all references
Update paths in .gitignore, cmd, develop, mgmt, sync.sh, check.sh,
fixup.sh, CLAUDE.md, docs/new-project.md, and backend/*.sh scripts.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-02 18:10:32 -05:00
db1f2151de Rename express/ to backend/ and update references
Update paths in sync.sh, master/main.go, and CLAUDE.md to reflect
the directory rename.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-02 17:54:44 -05:00
6e669d025a Move many files to diachron subdir 2026-02-02 17:22:08 -05:00
a1dbf71de4 Rename directory 2026-02-02 16:53:22 -05:00
0afc3efa5d Fix test script to work on macOS default bash
Replace globstar (bash 4.0+) with find for portability.
macOS ships with bash 3.2 which doesn't support globstar.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-02 12:25:26 -05:00
6f2ca2c15d Tweak marketing blurb 2026-02-02 11:37:05 -05:00
6a41273835 Add macOS x86_64 platform support
Platform detection now happens in framework/platform, sourced by both
sync.sh and the node shim. Uses shasum on macOS, sha256sum on Linux.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-02 11:21:41 -05:00
33251d9b77 Add comprehensive test suite for express modules
Tests for:
- user.ts: User class, roles, permissions, status checks
- util.ts: loadFile utility
- handlers.ts: multiHandler
- types.ts: methodParser, requireAuth, requirePermission
- logging.ts: module structure
- database.ts: connectionConfig, raw queries, PostgresAuthStore
- auth/token.ts: generateToken, hashToken, parseAuthorizationHeader
- auth/password.ts: hashPassword, verifyPassword (scrypt)
- auth/types.ts: Zod parsers, Session class, tokenLifetimes
- auth/store.ts: InMemoryAuthStore
- auth/service.ts: AuthService (login, register, verify, reset)
- basic/*.ts: route structure tests

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-25 20:40:49 -06:00
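The scrypt-based hashing that the auth/password.ts tests cover could look like this sketch using node:crypto (the salt size, key length, and `salt:hash` storage format here are assumptions, not necessarily the framework's actual choices):

```typescript
import { randomBytes, scryptSync, timingSafeEqual } from "node:crypto";

// Derive a 64-byte scrypt hash from the password and a random salt,
// storing both hex-encoded as "salt:hash".
function hashPassword(password: string): string {
  const salt = randomBytes(16);
  const hash = scryptSync(password, salt, 64);
  return `${salt.toString("hex")}:${hash.toString("hex")}`;
}

// Re-derive with the stored salt and compare in constant time.
function verifyPassword(password: string, stored: string): boolean {
  const [saltHex, hashHex] = stored.split(":");
  const hash = scryptSync(password, Buffer.from(saltHex, "hex"), 64);
  return timingSafeEqual(hash, Buffer.from(hashHex, "hex"));
}
```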
408032c30d fmt 2026-01-25 18:21:01 -06:00
19959a0325 . 2026-01-25 18:20:57 -06:00
87c9d1be16 Update todo list 2026-01-25 18:19:32 -06:00
c2748bfcc6 Add test infrastructure for hydrators using node:test
- Add docker-compose.test.yml with isolated PostgreSQL on port 5433
- Add environment variable support for database connection config
- Add test setup utilities and initial user hydrator tests
- Add test and test:watch scripts to package.json

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-25 18:18:15 -06:00
2f5ef7c267 Add automatic restart for crashed worker processes
Workers are now monitored and automatically restarted when they crash.
The worker pool validates addresses before returning them to skip stale
entries from crashed workers.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-25 17:55:43 -06:00
bcd71f2801 Add kysely-codegen command 2026-01-25 17:54:50 -06:00
82a8c03316 Pull in typeid-js 2026-01-25 17:26:09 -06:00
b8065ead79 Add develop db-url 2026-01-25 17:25:57 -06:00
811c446895 Pull in kysely-codegen 2026-01-25 12:28:44 -06:00
5a8c0028d7 Add user_credentials migration 2026-01-25 12:14:34 -06:00
f7e6e56aca Merge branch 'experiments' 2026-01-25 12:12:35 -06:00
cd19a32be5 Add more todo items 2026-01-25 12:12:15 -06:00
478305bc4f Update /home template 2026-01-25 12:12:02 -06:00
421628d49e Add various doc updates
They are still very far from complete.
2026-01-25 12:11:34 -06:00
4f37a72d7b Clean commands up 2026-01-24 16:54:54 -06:00
e30bf5d96d Fix regexp in fixup.sh 2026-01-24 16:39:13 -06:00
8704c4a8d5 Separate framework and app migrations
Also add a new develop command: clear-db.
2026-01-24 16:38:33 -06:00
579a19669e Match user and session schema changes 2026-01-24 15:48:22 -06:00
474420ac1e Add development command to reset the database and rerun migrations 2026-01-24 15:13:34 -06:00
960f78a1ad Update initial tables 2026-01-24 15:13:30 -06:00
d921679058 Rework user types: create AuthenticatedUser and AnonymousUser class
Both are subclasses of an abstract User class which contains almost everything
interesting.
2026-01-17 17:45:36 -06:00
350bf7c865 Run shell scripts through shfmt 2026-01-17 16:30:55 -06:00
8a7682e953 Split services into core and request 2026-01-17 16:20:55 -06:00
e59bb35ac9 Update todo list 2026-01-17 16:10:38 -06:00
a345a2adfb Add directive 2026-01-17 16:10:24 -06:00
00d84d6686 Note that files belong to framework 2026-01-17 15:45:02 -06:00
7ed05695b9 Separate happy path utility functions for requests 2026-01-17 15:43:52 -06:00
03cc4cf4eb Remove prettier; we've been using biome for a while 2026-01-17 13:19:40 -06:00
2121a6b5de Merge remote-tracking branch 'crondiad/experiments' into experiments 2026-01-11 16:08:03 -06:00
Michael Wolf 6ace2163ed Update pnpm version 2026-01-11 16:07:32 -06:00
Michael Wolf 93ab4b5d53 Update node version 2026-01-11 16:07:24 -06:00
Michael Wolf 70ddcb2a94 Note that we need bash 2026-01-11 16:06:48 -06:00
Michael Wolf 1da81089cd Add sync.sh script
This downloads and installs dependencies necessary to run or develop.

Add docker-compose.yml for initial use
2026-01-11 16:06:43 -06:00
f383c6a465 Add logger wrapper script 2026-01-11 15:48:32 -06:00
e34d47b352 Add various todo items 2026-01-11 15:36:15 -06:00
de70be996e Add docker-compose.yml for initial use 2026-01-11 15:33:01 -06:00
096a1235b5 Add basic logout 2026-01-11 15:31:59 -06:00
4a4dc11aa4 Fix formatting 2026-01-11 15:17:58 -06:00
7399cbe785 Add / template 2026-01-11 14:57:51 -06:00
14d20be9a2 Note that file belongs to the framework 2026-01-11 14:57:26 -06:00
55f5cc699d Add request-scoped context for session.getUser()
Use AsyncLocalStorage to provide request context so services can access
the current user without needing Call passed through every function.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-11 14:56:10 -06:00
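The AsyncLocalStorage mechanism described here can be sketched as follows (`RequestContext` and the helper names are illustrative, not the framework's):

```typescript
import { AsyncLocalStorage } from "node:async_hooks";

interface RequestContext {
  userId: string;
}

const contextStorage = new AsyncLocalStorage<RequestContext>();

// The dispatcher wraps each request handler in a context scope...
function withRequestContext<T>(ctx: RequestContext, fn: () => T): T {
  return contextStorage.run(ctx, fn);
}

// ...so deeply nested service code can read the current user without a
// Call object being passed through every function.
function currentUserId(): string {
  const ctx = contextStorage.getStore();
  if (!ctx) throw new Error("called outside a request context");
  return ctx.userId;
}
```

A `session.getUser()`-style method would call something like `currentUserId()` internally instead of taking the request as a parameter.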
afcb447b2b Add a command to add a new user 2026-01-11 14:38:19 -06:00
1c1eeddcbe Add basic login screen with form-based authentication
Adds /login route with HTML template that handles GET (show form) and
POST (authenticate). On successful login, sets session cookie and
redirects to /. Also adds framework support for redirects and cookies
in route handlers.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-11 10:07:02 -06:00
7cecf5326d Make biome happier 2026-01-10 14:02:38 -06:00
47f6bee75f Improve test command to find spec/test files recursively
Use globstar for recursive matching and support both *.spec.ts
and *.test.ts patterns in any subdirectory.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-10 13:55:42 -06:00
6e96c33457 Add very basic support for finding and rendering templates 2026-01-10 13:50:44 -06:00
9e3329fa58 . 2026-01-10 13:38:42 -06:00
05eaf938fa Add test command
For now this just runs typescript tests.  Eventually it'll do more than that.
2026-01-10 13:38:10 -06:00
df2d4eea3f Add initial way to get info about execution context 2026-01-10 13:37:39 -06:00
b235a6be9a Add block for declared var 2026-01-10 13:05:39 -06:00
8cd4b42cc6 Add scripts to run migrations and to connect to the db 2026-01-10 09:05:05 -06:00
241d3e799e Use less ambiguous funcion 2026-01-10 08:55:00 -06:00
49dc0e3fe0 Mark several unused vars as such 2026-01-10 08:54:51 -06:00
c7b8cd33da Clean up imports 2026-01-10 08:54:34 -06:00
6c0895de07 Fix formatting 2026-01-10 08:51:20 -06:00
17ea6ba02d Consider block stmts without braces to be errors 2026-01-09 11:44:09 -06:00
661def8a5c Refmt 2026-01-04 15:24:29 -06:00
74d75d08dd Add Session class to provide getUser() on call.session
Wraps SessionData and user into a Session class that handlers can use
via call.session.getUser() instead of accessing services directly.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-04 15:22:27 -06:00
ad6d405206 Add session data to Call type
- AuthService.validateRequest now returns AuthResult with both user and session
- Call type includes session: SessionData | null
- Handlers can access session metadata (createdAt, authMethod, etc.)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-04 09:50:05 -06:00
e9ccf6d757 Add PostgreSQL database layer with Kysely and migrations
- Add database.ts with connection pool, Kysely query builder, and migration runner
- Create migrations for users and sessions tables (0001, 0002)
- Implement PostgresAuthStore to replace InMemoryAuthStore
- Wire up database service in services/index.ts

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-04 09:43:20 -06:00
34ec5be7ec Pull in kysely and pg deps 2026-01-03 17:20:49 -06:00
a0043fd475 Fix go version 2025-02-08 13:37:29 -06:00
163 changed files with 7423 additions and 635 deletions

.gitignore (vendored), 8 lines changed

@@ -1,5 +1,5 @@
 **/node_modules
-framework/downloads
-framework/binaries
-framework/.nodejs
-framework/.nodejs-config
+diachron/downloads
+diachron/binaries
+diachron/.nodejs
+diachron/.nodejs-config

.go-version (new file), 1 line changed

@@ -0,0 +1 @@
1.23.6

AGENTS.md (new file), 44 lines changed

@@ -0,0 +1,44 @@
# Agent Instructions
Read and follow the instructions in `diachron/AGENTS.md`. That file
contains framework conventions, commands, and structure that apply to
all coding agents working on diachron-based projects.
This project uses **bd** (beads) for issue tracking. Run `bd onboard` to get started.
## Quick Reference
```bash
bd ready # Find available work
bd show <id> # View issue details
bd update <id> --status in_progress # Claim work
bd close <id> # Complete work
bd sync # Sync with git
```
## Landing the Plane (Session Completion)
**When ending a work session**, you MUST complete ALL steps below. Work is NOT complete until `git push` succeeds.
**MANDATORY WORKFLOW:**
1. **File issues for remaining work** - Create issues for anything that needs follow-up
2. **Run quality gates** (if code changed) - Tests, linters, builds
3. **Update issue status** - Close finished work, update in-progress items
4. **PUSH TO REMOTE** - This is MANDATORY:
```bash
git pull --rebase
bd sync
git push
git status # MUST show "up to date with origin"
```
5. **Clean up** - Clear stashes, prune remote branches
6. **Verify** - All changes committed AND pushed
7. **Hand off** - Provide context for next session
**CRITICAL RULES:**
- Work is NOT complete until `git push` succeeds
- NEVER stop before pushing - that leaves work stranded locally
- NEVER say "ready to push when you are" - YOU must push
- If push fails, resolve and retry until it succeeds

CLAUDE.md

@@ -3,6 +3,10 @@
 This file provides guidance to Claude Code (claude.ai/code) when working with
 code in this repository.
+Read and follow the instructions in `diachron/AGENTS.md`. That file
+contains framework conventions, commands, and structure that apply to
+all coding agents working on diachron-based projects.
 ## Project Overview
 Diachron is an opinionated TypeScript/Node.js web framework with a Go-based
@@ -31,14 +35,14 @@ master process. Key design principles:
 ### Development
-**Check shell scripts (shellcheck + shfmt) (eventually go fmt and prettier or similar):**
+**Check shell scripts (shellcheck + shfmt) (eventually go fmt and biome or similar):**
 ```bash
 ./check.sh
 ```
 **Format TypeScript code:**
 ```bash
-cd express && ../cmd pnpm prettier --write .
+cd backend && ../cmd pnpm biome check --write .
 ```
 **Build Go master process:**
@@ -54,9 +58,9 @@ cd master && go build
 ### Components
-- **express/** - TypeScript/Express.js backend application
+- **backend/** - TypeScript/Express.js backend application
 - **master/** - Go-based master process for file watching and process management
-- **framework/** - Managed binaries (Node.js, pnpm), command wrappers, and
+- **diachron/** - Managed binaries (Node.js, pnpm), command wrappers, and
   framework-specific library code
 - **monitor/** - Go file watcher that triggers rebuilds (experimental)
@@ -68,7 +72,7 @@ Responsibilities:
 - Proxy web requests to backend workers
 - Behaves identically in all environments (no dev/prod distinction)
-### Express App Structure
+### Backend App Structure
 - `app.ts` - Main Express application setup with route matching
 - `routes.ts` - Route definitions
@@ -78,7 +82,7 @@ Responsibilities:
 ### Framework Command System
-Commands flow through: `./cmd` → `framework/cmd.d/*` → `framework/shims/*` → managed binaries in `framework/binaries/`
+Commands flow through: `./cmd` → `diachron/cmd.d/*` → `diachron/shims/*` → managed binaries in `diachron/binaries/`
 This ensures consistent tooling versions across the team without system-wide installations.
@@ -94,8 +98,8 @@ This ensures consistent tooling versions across the team without system-wide ins
 ## Platform Requirements
-Linux x86_64 only (currently). Requires:
+Linux or macOS on x86_64. Requires:
-- Modern libc for Go binaries
+- Modern libc for Go binaries (Linux)
 - docker compose (for full stack)
 - fd, shellcheck, shfmt (for development)
@@ -108,6 +112,10 @@ Early stage - most implementations are stubs:
 # meta
+## formatting and sorting
+- When a typescript file exports symbols, they should be listed in order
 ## guidelines for this document
 - Try to keep lines below 80 characters in length, especially prose. But if

DIACHRON.md (new file), 184 lines changed

@@ -0,0 +1,184 @@
# What is diachron?
diachron is a web framework for TypeScript and Node.js. It uses a Go-based
master process that handles file watching, building, process management, and
request proxying. The application code is TypeScript running on Express.js.
If you're joining a project that uses diachron, this document will orient you.
## Why diachron exists
diachron was built around a few frustrations with mainstream web frameworks:
- **No dev/prod split.** Most frameworks behave differently in development and
production. diachron doesn't. The master process watches files, rebuilds,
and manages workers the same way everywhere. There is no `NODE_ENV`.
- **Managed tooling.** Node.js, pnpm, and other tools are downloaded and
pinned to exact versions inside the project. You don't install them
system-wide. Everyone on the team runs the same binaries.
- **PostgreSQL, directly.** No ORM, no database abstraction layer. You write
SQL (via Kysely for type safety) and talk to PostgreSQL. If you need
MySQL or SQLite support, this is not the framework for you.
- **Debuggability over magic.** Everything is explicit and inspectable.
Logging and observability are first-class concerns, not afterthoughts.
diachron is inspired by the
[Taking PHP Seriously](https://slack.engineering/taking-php-seriously/) essay
from Slack Engineering. It's designed for small to medium systems (what we
call "Ring 0 and Ring 1") -- not heavy-compliance or banking-scale
applications.
## How it works
When you run `./master`, the following happens:
1. The Go master process starts and watches your TypeScript source files.
2. It builds the backend using `@vercel/ncc`, producing a single bundled JS
file.
3. It starts one or more Node.js worker processes running your Express app.
4. It proxies HTTP requests from port 8080 to the workers.
5. When you edit a source file, it rebuilds and restarts the workers
automatically.
6. If a worker crashes, it restarts automatically.
There is no separate "dev server" or "hot module replacement." The master
process is the only way to run the application.
## Project structure
A diachron project looks like this:
```
.
├── DIACHRON.md # This file (framework overview for newcomers)
├── master/ # Go master process (framework-owned)
├── logger/ # Go logging service (framework-owned)
├── diachron/ # Managed binaries, shims, framework library
│ ├── AGENTS.md # Guide for AI coding agents
│ ├── binaries/ # Downloaded Node.js, pnpm (gitignored)
│ ├── cmd.d/ # Commands available via ./cmd
│ ├── shims/ # Wrappers that use managed binaries
│ └── ...
├── backend/ # Your application code
│ ├── app.ts # Entry point
│ ├── routes.ts # Route definitions
│ ├── handlers.ts # Route handlers
│ ├── services.ts # Service layer
│ ├── types.ts # Application types
│ ├── config.ts # Application configuration
│ └── diachron/ # Framework library code (framework-owned)
├── cmd # Run managed commands (./cmd pnpm install, etc.)
├── develop # Development-only commands (./develop reset-db, etc.)
├── mgmt # Management commands safe for production
├── sync.sh # Install/update all dependencies
├── master # The compiled master binary (after sync)
└── docker-compose.yml
```
### File ownership
There are two owners of files in a diachron project:
- **You own** everything in `backend/` (except `backend/diachron/`), plus
`docker-compose.yml`, `package.json`, and anything else you create.
- **The framework owns** `master/`, `logger/`, `diachron/`,
`backend/diachron/`, and the top-level scripts (`cmd`, `develop`, `mgmt`,
`sync.sh`, `check.sh`).
Don't modify framework-owned files unless you need to. This separation
keeps framework upgrades clean. If you do need to change framework files
(especially early on, there are rough edges), you can extract your changes
as a patch:
```bash
./diff-upstream.sh # full diff against upstream
./diff-upstream.sh --stat # just list changed files
```
This diffs every file in `file-list` against the upstream ref recorded in
`.diachron-version`.
When you do change framework files, make each change in its own commit with
a clear message explaining what the change is and why it's needed. Mixing
framework fixes with application work in a single commit makes it much
harder to upstream later. A clean history of discrete, well-explained
framework commits is the easiest thing to turn into contributions.
## Getting started
```bash
# Install dependencies and build the master process
./sync.sh
# Start the application
./master
```
The app will be available at `http://localhost:8080`.
You need Linux or macOS on x86_64. For the full stack (database, Redis,
etc.), you also need `docker compose`.
## The command system
diachron has three types of commands, separated by intent and safety:
- **`./cmd <command>`** -- Run managed tools (node, pnpm, tsx, etc.). These
use the project's pinned versions, not whatever is installed on your system.
```bash
./cmd pnpm install
./cmd pnpm test
```
- **`./mgmt <command>`** -- Management commands that are safe to run in
production. Migrations, user management, that sort of thing.
```bash
./mgmt migrate
./mgmt add-user
```
- **`./develop <command>`** -- Development commands that may be destructive.
Database resets, fixture loading, etc. These are gated in production.
```bash
./develop reset-db
./develop db # Open a database shell
```
The rule of thumb: if you'd run it at 3am while tired and worried, it's a
`mgmt` command. If it destroys data on purpose, it's a `develop` command.
## Key concepts
### Call and Result
diachron wraps Express's `Request` and `Response` in its own types called
`Call` and `Result`. This avoids shadowing and keeps the framework's
interface distinct from Express internals. Your handlers receive a `Call`
and return a `Result`.
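A minimal sketch of what a handler looks like under this model (the field names here are illustrative, not the framework's actual `Call`/`Result` definitions):

```typescript
// Illustrative shapes only.
interface Call {
  method: string;
  path: string;
  parameters: Record<string, string>;
}

interface Result {
  status: number;
  body: string;
}

// A handler receives a Call and returns a Result; it never touches
// Express's Request/Response directly.
function timeHandler(call: Call): Result {
  return { status: 200, body: `requested path: ${call.path}` };
}
```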
### Routes
Routes are defined as data (arrays of `Route` objects in `routes.ts`), not
through decorators or method chaining. The framework processes them into
Express handlers.
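Because routes are plain data, they are easy to transform before registration. A hedged sketch (the `Route` shape here is illustrative, not the framework's exact type):

```typescript
interface Route {
  method: "GET" | "POST";
  path: string;
  handler: () => string;
}

const routes: Route[] = [
  { method: "GET", path: "/", handler: () => "home" },
  { method: "GET", path: "/login", handler: () => "login form" },
];

// Plain data can be mapped over -- e.g. prefixing every path to nest a
// group of routes -- without touching the dispatch logic.
function prefixRoutes(prefix: string, rs: Route[]): Route[] {
  return rs.map((r) => ({
    ...r,
    path: prefix + (r.path === "/" ? "" : r.path),
  }));
}
```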
### No environment variables for behavior
There is no `NODE_ENV`, no `DEBUG`, no mode switching. Configuration that
must vary between deployments (database URLs, secrets) lives in
configuration files, but the application's behavior doesn't branch on
environment.
## Further reading
- `README.md` -- Project introduction and requirements
- `diachron/AGENTS.md` -- Guide for AI coding agents
- `docs/` -- Design documents and philosophy
- `docs/commands.md` -- Detailed explanation of the command system
- `docs/concentric-circles.md` -- What diachron is (and isn't) designed for

README.md

@@ -2,16 +2,13 @@ diachron
 ## Introduction
-Is your answer to some of these questions "yes"? If so, you might like
-diachron. (When it comes to that dev/test/prod one, hear us out first, ok?)
 - Do you want to share a lot of backend and frontend code?
 - Are you tired of your web stack breaking when you blink too hard?
 - Have you read [Taking PHP
-  Seriously](https://slack.engineering/taking-php-seriously/) and wish you had
-  something similar for Typescript?
+  Seriously](https://slack.engineering/taking-php-seriously/) and do you wish
+  you had something similar for Typescript?
 - Do you think that ORMs are not all that? Do you wish you had first class
   unmediated access to your database? And do you think that database
@@ -35,6 +32,9 @@ diachron. (When it comes to that dev/test/prod one, hear us out first, ok?)
   you're trying to fix? We're talking authentication, authorization, XSS,
   https, nested paths, all that stuff.
+Is your answer to some of these questions "yes"? If so, you might like
+diachron. (When it comes to that dev/test/prod one, hear us out first, ok?)
 ## Getting started
 Different situations require different getting started docs.
@@ -44,20 +44,28 @@ Different situations require different getting started docs.
 ## Requirements
-To run diachron, you currently need to have a Linux box running x86_64 with a
-new enough libc to run golang binaries. Support for other platforms will come
-eventually.
+To run diachron, you need Linux or macOS on x86_64. Linux requires a new
+enough libc to run golang binaries.
 To run a more complete system, you also need to have docker compose installed.
+### Database
+To connect to the database, you need psql (PostgreSQL client, for
+`./diachron/common.d/db`)
+- macOS: `brew install libpq` (and follow the caveat to add it to your PATH),
+  or `brew install postgresql`
+- Debian/Ubuntu: `apt install postgresql-client`
+- Fedora/RHEL: `dnf install postgresql`
 ### Development requirements
 To hack on diachron itself, you need the following:
+- bash
 - docker and docker compose
 - [fd](https://github.com/sharkdp/fd)
 - golang, version 1.23.6 or greater
 - shellcheck
 - shfmt

TODO.md, 110 lines changed

@@ -1,22 +1,49 @@
## high importance ## high importance
- Many of the UUIDs generated are not UUIDv7. This needs to be fixed.
- [ ] Add unit tests all over the place. - [ ] Add unit tests all over the place.
- ⚠️ Huge task - needs breakdown before starting - ⚠️ Huge task - needs breakdown before starting
- [ ] Create initial docker-compose.yml file for local development - [ ] map exceptions back to source lines
- include most recent stable postgres
- include beanstalkd
- include memcached
- include redis
- include mailpit
- [ ] Add first cut at database access. Remember that ORMs are not all that! - [ ] migrations, seeding, fixtures
```sql
CREATE SCHEMA fw;
CREATE TABLE fw.users (...);
CREATE TABLE fw.groups (...);
```
```sql
CREATE TABLE app.user_profiles (...);
CREATE TABLE app.customer_metadata (...);
```
- [ ] flesh out `mgmt` and `develop` (does not exist yet)
4.1 What belongs in develop
- Create migrations
- Squash migrations
- Reset DB
- Roll back migrations
- Seed large test datasets
- Run tests
- Snapshot / restore local DB state (!!!)
`develop` fails if APP_ENV (or whatever) is `production`. Or maybe even
`testing`.
- [ ] Add default user table(s) to database.
- [ ] Add middleware concept
- [ ] Add authentication - [ ] Add authentication
- password - [ ] password
- third party? - [ ] third party?
- [ ] Add middleware concept
- [ ] Add authorization - [ ] Add authorization
- for specific routes / resources / etc - for specific routes / resources / etc
@@ -25,6 +52,16 @@
Partially done; see the /time route. But we need to figure out where to Partially done; see the /time route. But we need to figure out where to
store templates, static files, etc. store templates, static files, etc.
- [ ] fix process management: if you control-c `master` process sometimes it
leaves around `master-bin`, `logger-bin`, and `diachron:nnnn` processes.
Huge problem.
- [ ] Fix format used by master (and logger?)'s output: it should be logfmt
- A lot of other stuff should probably be logfmt too but maybe we can get to
that later
- [ ] master rebuilds (or tries to) too many times; need some sort of debounce
or whatever it's called
## medium importance ## medium importance
@@ -32,10 +69,35 @@
- with queries - with queries
- convert to logfmt and is there a viewer UI we could pull in and use - convert to logfmt and is there a viewer UI we could pull in and use
instead? instead?
- [ ] add nested routes. Note that this might be easy to do without actually
changing the logic in express/routes.ts. A function that takes an array
of routes and maps over them rewriting them. Maybe.
- [ ] related: add something to do with default templates and stuff... I
think we can make handlers a lot shorter to write, sometimes not even
necessary at all, with some sane defaults and an easy to use override
mechanism
- [ ] time library
- [ ] fill in the rest of express/http-codes.ts
- [ ] fill out express/content-types.ts
- [ ] identify redundant "old skool" and ajax routes, factor out their
commonalities, etc.
- [ ] figure out and add logging to disk
- [ ] I don't really feel close to satisfied with template location /
rendering / etc. Rethink and rework.
- [ ] Add email verification (this is partially done already)
- [ ] Reading .env files and dealing with the environment should be immune to
the extent possible from idiotic errors
- [ ] Update check script:
  - [x] shellcheck on shell scripts
@@ -48,6 +110,17 @@
- upgrade docs
- starting docs
- taking over docs
- reference
- internals
- [ ] make migration creation default to something like yyyy-mm-dd_ssss (are
9999 migrations in a day enough?)
- [ ] clean up `cmd` and `mgmt`: do the right thing with their commonalities
and make very plain which is which for what. Consider additional
commands. Maybe `develop` for specific development tasks,
`operate` for operational tasks, and we keep `cmd` for project-specific
commands. Something like that.
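The nested-routes item above suggests a function that maps over an array of routes and rewrites their paths without touching express/routes.ts. A minimal sketch of that idea, using a simplified `Route` shape (the real type lives in diachron/types.ts; the `nest` helper is hypothetical):

```typescript
// Simplified stand-in for the framework's Route type.
interface SimpleRoute {
  path: string;
  handler: () => string;
}

// Rewrite each route's path under a prefix, leaving handlers untouched.
// Composable: nest("/api", nest("/v1", routes)) would stack prefixes.
function nest(prefix: string, routes: SimpleRoute[]): SimpleRoute[] {
  return routes.map((r) => ({ ...r, path: `${prefix}${r.path}` }));
}
```

Because the output is just another route array, the existing dispatcher logic would not need to change — which is exactly the "without actually changing the logic in express/routes.ts" hope stated above.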
## low importance
@@ -72,6 +145,10 @@
code; repeat
- Slow start them: only start a few at first
- [ ] in express/user.ts: FIXME: set createdAt and updatedAt to start of epoch
## finished
@@ -99,3 +176,12 @@
- [x] Log to logging service from the express backend
  - Fill out types and functions in `express/logging.ts`
- [x] Add first cut at database access. Remember that ORMs are not all that!
- [x] Create initial docker-compose.yml file for local development
- include most recent stable postgres
- include beanstalkd
- include memcached
- include redis
- include mailpit

backend/.npmrc Normal file

@@ -0,0 +1 @@
shamefully-hoist=true

backend/app.ts Normal file

@@ -0,0 +1,20 @@
// This is a sample file provided by diachron. You are encouraged to modify it.
import { core } from "./diachron/core";
import { routes } from "./routes";
import { makeApp } from "./diachron/app";
const app = makeApp({ routes });
core.logging.log({ source: "logging", text: ["1"] });
app.start();


@@ -17,7 +17,10 @@
"linter": { "linter": {
"enabled": true, "enabled": true,
"rules": { "rules": {
"recommended": true "recommended": true,
"style": {
"useBlockStatements": "error"
}
} }
}, },
"javascript": { "javascript": {


@@ -6,4 +6,4 @@ DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
cd "$DIR" cd "$DIR"
../cmd pnpm ncc build ./app.ts -o dist ../cmd pnpm ncc build ./app.ts -o dist --source-map

backend/check-deps.ts Normal file

@@ -0,0 +1,66 @@
import { readFileSync } from "node:fs";
import { dirname, join } from "node:path";
import { fileURLToPath } from "node:url";
const __dirname = dirname(fileURLToPath(import.meta.url));
interface PackageJson {
dependencies?: Record<string, string>;
devDependencies?: Record<string, string>;
}
function readPackageJson(path: string): PackageJson {
const content = readFileSync(path, "utf-8");
return JSON.parse(content);
}
function getAllDependencyNames(pkg: PackageJson): Set<string> {
const names = new Set<string>();
for (const name of Object.keys(pkg.dependencies ?? {})) {
names.add(name);
}
for (const name of Object.keys(pkg.devDependencies ?? {})) {
names.add(name);
}
return names;
}
const diachronPkgPath = join(__dirname, "diachron", "package.json");
const backendPkgPath = join(__dirname, "package.json");
const diachronPkg = readPackageJson(diachronPkgPath);
const backendPkg = readPackageJson(backendPkgPath);
const diachronDeps = getAllDependencyNames(diachronPkg);
const backendDeps = getAllDependencyNames(backendPkg);
const duplicates: string[] = [];
for (const dep of diachronDeps) {
if (backendDeps.has(dep)) {
duplicates.push(dep);
}
}
if (duplicates.length > 0) {
console.error("Error: Duplicate dependencies found.");
console.error("");
console.error(
"The following dependencies exist in both backend/package.json and backend/diachron/package.json:",
);
console.error("");
for (const dep of duplicates.sort()) {
console.error(` - ${dep}`);
}
console.error("");
console.error(
"Dependencies in backend/diachron/package.json are provided by the framework",
);
console.error(
"and must not be duplicated in backend/package.json. Remove them from",
);
console.error("backend/package.json to fix this error.");
process.exit(1);
}
console.log("No duplicate dependencies found.");


@@ -8,7 +8,8 @@ check_dir="$DIR"
out_dir="$check_dir/out"
source "$check_dir"/../diachron/shims/common
source "$check_dir"/../diachron/shims/node.common
$ROOT/cmd tsx "$check_dir/check-deps.ts"
$ROOT/cmd pnpm tsc --outDir "$out_dir"

backend/diachron/app.ts Normal file

@@ -0,0 +1,137 @@
// FIXME: rename this to make-app.ts and adjust imports accordingly
import { contentTypes } from "./content-types";
import { httpCodes } from "./http-codes";
import express, {
type Express,
type NextFunction,
type Request as ExpressRequest,
type Response as ExpressResponse,
} from "express";
import { formatError, formatErrorHtml } from "./errors";
import {
  isRedirect,
  InternalHandler,
  AuthenticationRequired,
  AuthorizationDenied,
  Call,
  type Method,
  type ProcessedRoute,
  methodParser,
  type Result,
  type Route,
  massageMethod,
} from "./types";
import { cli } from "./cli";
import { processRoutes } from "./routing";
process.on('uncaughtException', (err) => {
console.error(formatError(err));
process.exit(1);
});
process.on('unhandledRejection', (reason) => {
console.error(formatError(reason));
});
type MakeAppArgs = {
  routes: Route[];
  processTitle?: string;
};
export interface DiachronApp extends Express {
start: () => void
}
const makeApp = ({ routes, processTitle }: MakeAppArgs) => {
// Use the caller-supplied title if given; previously the processTitle
// parameter was accepted but never used, and the `if (process.title)`
// guard was always true in Node.
process.title = processTitle ?? `diachron:${cli.listen.port}`;
const processedRoutes = processRoutes(routes);
async function handler(
req: ExpressRequest,
_res: ExpressResponse,
): Promise<Result> {
const method = await methodParser.parseAsync(req.method);
const byMethod = processedRoutes[method];
console.log(
"DEBUG: req.path =",
JSON.stringify(req.path),
"method =",
method,
);
for (const [_idx, pr] of byMethod.entries()) {
const match = pr.matcher(req.path);
console.log("DEBUG: trying pattern, match result =", match);
if (match) {
console.log("match", match);
const resp = await pr.handler(req, match.params);
return resp;
}
}
const retval: Result = {
code: httpCodes.clientErrors.NotFound,
contentType: contentTypes.text.plain,
result: "not found!",
};
return retval;
}
// I don't like going around tsc but this is simple enough that it's probably OK.
const app = express() as DiachronApp
app.start = function() {
this.listen(cli.listen.port, cli.listen.host, () => {
console.log(`Listening on ${cli.listen.host}:${cli.listen.port}`);
});
};
app.use(express.json())
app.use(express.urlencoded({ extended: true }));
app.use(async (req: ExpressRequest, res: ExpressResponse) => {
const result0 = await handler(req, res);
const code = result0.code.code;
const result = result0.result;
console.log(result);
// Set any cookies from the result
if (result0.cookies) {
for (const cookie of result0.cookies) {
res.cookie(cookie.name, cookie.value, cookie.options ?? {});
}
}
if (isRedirect(result0)) {
res.redirect(code, result0.redirect);
} else {
res.status(code).send(result);
}
});
app.use(
(
err: Error,
_req: ExpressRequest,
res: ExpressResponse,
_next: NextFunction,
) => {
console.error(formatError(err));
res.status(500).type("html").send(formatErrorHtml(err));
},
);
return app;
}
export { makeApp };
function _isPromise<T>(value: T | Promise<T>): value is Promise<T> {
return typeof (value as any)?.then === "function";
}


@@ -7,7 +7,14 @@
// Import authRoutes directly from "./auth/routes" instead.
export { hashPassword, verifyPassword } from "./password";
export { type AuthResult, AuthService } from "./service";
export { type AuthStore, InMemoryAuthStore } from "./store";
export { generateToken, hashToken, SESSION_COOKIE_NAME } from "./token";
export {
type AuthMethod,
Session,
type SessionData,
type TokenId,
type TokenType,
tokenLifetimes,
} from "./types";


@@ -0,0 +1,80 @@
// Tests for auth/password.ts
// Pure unit tests - no database needed
import assert from "node:assert/strict";
import { describe, it } from "node:test";
import { hashPassword, verifyPassword } from "./password";
describe("password", () => {
describe("hashPassword", () => {
it("returns a scrypt formatted hash", async () => {
const hash = await hashPassword("testpassword");
assert.ok(hash.startsWith("$scrypt$"));
});
it("includes all scrypt parameters", async () => {
const hash = await hashPassword("testpassword");
const parts = hash.split("$");
// Format: $scrypt$N$r$p$salt$hash
assert.equal(parts.length, 7);
assert.equal(parts[0], "");
assert.equal(parts[1], "scrypt");
// N, r, p should be numbers
assert.ok(!Number.isNaN(parseInt(parts[2], 10)));
assert.ok(!Number.isNaN(parseInt(parts[3], 10)));
assert.ok(!Number.isNaN(parseInt(parts[4], 10)));
});
it("generates different hashes for same password (different salt)", async () => {
const hash1 = await hashPassword("testpassword");
const hash2 = await hashPassword("testpassword");
assert.notEqual(hash1, hash2);
});
});
describe("verifyPassword", () => {
it("returns true for correct password", async () => {
const hash = await hashPassword("correctpassword");
const result = await verifyPassword("correctpassword", hash);
assert.equal(result, true);
});
it("returns false for incorrect password", async () => {
const hash = await hashPassword("correctpassword");
const result = await verifyPassword("wrongpassword", hash);
assert.equal(result, false);
});
it("throws for invalid hash format", async () => {
await assert.rejects(
verifyPassword("password", "invalid-hash"),
/Invalid password hash format/,
);
});
it("throws for non-scrypt hash", async () => {
await assert.rejects(
verifyPassword("password", "$bcrypt$10$salt$hash"),
/Invalid password hash format/,
);
});
it("works with empty password", async () => {
const hash = await hashPassword("");
const result = await verifyPassword("", hash);
assert.equal(result, true);
});
it("works with unicode password", async () => {
const hash = await hashPassword("p@$$w0rd\u{1F511}");
const result = await verifyPassword("p@$$w0rd\u{1F511}", hash);
assert.equal(result, true);
});
it("is case sensitive", async () => {
const hash = await hashPassword("Password");
const result = await verifyPassword("password", hash);
assert.equal(result, false);
});
});
});
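The tests above pin down the stored-hash layout `$scrypt$N$r$p$salt$hash` (seven `$`-separated parts, the first empty). As a sketch of how a consumer might decode that layout — the real logic lives in auth/password.ts; this parser and its `ScryptParams` shape are illustrative only:

```typescript
interface ScryptParams {
  N: number;
  r: number;
  p: number;
  salt: string;
  hash: string;
}

// Decode a stored hash of the form $scrypt$N$r$p$salt$hash.
// Splitting on "$" yields ["", "scrypt", N, r, p, salt, hash].
function parseScryptHash(stored: string): ScryptParams {
  const parts = stored.split("$");
  if (parts.length !== 7 || parts[0] !== "" || parts[1] !== "scrypt") {
    throw new Error("Invalid password hash format");
  }
  return {
    N: parseInt(parts[2], 10),
    r: parseInt(parts[3], 10),
    p: parseInt(parts[4], 10),
    salt: parts[5],
    hash: parts[6],
  };
}
```

Embedding N, r, and p in the stored string is what lets verifyPassword keep working after the cost parameters are raised: each hash carries the parameters it was created with.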


@@ -28,8 +28,11 @@ function scryptAsync(
): Promise<Buffer> {
return new Promise((resolve, reject) => {
scrypt(password, salt, keylen, options, (err, derivedKey) => {
if (err) {
reject(err);
} else {
resolve(derivedKey);
}
});
});
}


@@ -5,7 +5,7 @@
import { z } from "zod";
import { contentTypes } from "../content-types";
import { httpCodes } from "../http-codes";
import { request } from "../request";
import type { Call, Result, Route } from "../types";
import {
forgotPasswordInputParser,
@@ -39,7 +39,7 @@ const loginHandler = async (call: Call): Promise<Result> => {
const body = call.request.body;
const { email, password } = loginInputParser.parse(body);
const result = await request.auth.login(email, password, "cookie", {
userAgent: call.request.get("User-Agent"),
ipAddress: call.request.ip,
});
@@ -72,9 +72,9 @@ const loginHandler = async (call: Call): Promise<Result> => {
// POST /auth/logout
const logoutHandler = async (call: Call): Promise<Result> => {
const token = request.auth.extractToken(call.request);
if (token) {
await request.auth.logout(token);
}
return jsonResponse(httpCodes.success.OK, { message: "Logged out" });
@@ -87,7 +87,7 @@ const registerHandler = async (call: Call): Promise<Result> => {
const { email, password, displayName } =
registerInputParser.parse(body);
const result = await request.auth.register(
email,
password,
displayName,
@@ -128,7 +128,7 @@ const forgotPasswordHandler = async (call: Call): Promise<Result> => {
const body = call.request.body;
const { email } = forgotPasswordInputParser.parse(body);
const result = await request.auth.createPasswordResetToken(email);
// Always return success (don't reveal if email exists)
if (result) {
@@ -159,7 +159,7 @@ const resetPasswordHandler = async (call: Call): Promise<Result> => {
const body = call.request.body;
const { token, password } = resetPasswordInputParser.parse(body);
const result = await request.auth.resetPassword(token, password);
if (!result.success) {
return errorResponse(
@@ -195,7 +195,7 @@ const verifyEmailHandler = async (call: Call): Promise<Result> => {
);
}
const result = await request.auth.verifyEmail(token);
if (!result.success) {
return errorResponse(httpCodes.clientErrors.BadRequest, result.error);


@@ -0,0 +1,419 @@
// Tests for auth/service.ts
// Uses InMemoryAuthStore - no database needed
import assert from "node:assert/strict";
import { beforeEach, describe, it } from "node:test";
import { AuthService } from "./service";
import { InMemoryAuthStore } from "./store";
describe("AuthService", () => {
let store: InMemoryAuthStore;
let service: AuthService;
beforeEach(() => {
store = new InMemoryAuthStore();
service = new AuthService(store);
});
describe("register", () => {
it("creates a new user", async () => {
const result = await service.register(
"test@example.com",
"password123",
"Test User",
);
assert.equal(result.success, true);
if (result.success) {
assert.equal(result.user.email, "test@example.com");
assert.equal(result.user.displayName, "Test User");
assert.ok(result.verificationToken.length > 0);
}
});
it("fails when email already registered", async () => {
await service.register("test@example.com", "password123");
const result = await service.register(
"test@example.com",
"password456",
);
assert.equal(result.success, false);
if (!result.success) {
assert.equal(result.error, "Email already registered");
}
});
it("creates user without displayName", async () => {
const result = await service.register(
"test@example.com",
"password123",
);
assert.equal(result.success, true);
if (result.success) {
assert.equal(result.user.displayName, undefined);
}
});
});
describe("login", () => {
beforeEach(async () => {
// Create and verify a user
const result = await service.register(
"test@example.com",
"password123",
"Test User",
);
if (result.success) {
// Verify email to activate user
await service.verifyEmail(result.verificationToken);
}
});
it("succeeds with correct credentials", async () => {
const result = await service.login(
"test@example.com",
"password123",
"cookie",
);
assert.equal(result.success, true);
if (result.success) {
assert.ok(result.token.length > 0);
assert.equal(result.user.email, "test@example.com");
}
});
it("fails with wrong password", async () => {
const result = await service.login(
"test@example.com",
"wrongpassword",
"cookie",
);
assert.equal(result.success, false);
if (!result.success) {
assert.equal(result.error, "Invalid credentials");
}
});
it("fails with unknown email", async () => {
const result = await service.login(
"unknown@example.com",
"password123",
"cookie",
);
assert.equal(result.success, false);
if (!result.success) {
assert.equal(result.error, "Invalid credentials");
}
});
it("fails for inactive user", async () => {
// Create a user but don't verify email (stays pending)
await service.register("pending@example.com", "password123");
const result = await service.login(
"pending@example.com",
"password123",
"cookie",
);
assert.equal(result.success, false);
if (!result.success) {
assert.equal(result.error, "Account is not active");
}
});
it("stores metadata", async () => {
const result = await service.login(
"test@example.com",
"password123",
"cookie",
{ userAgent: "TestAgent", ipAddress: "192.168.1.1" },
);
assert.equal(result.success, true);
});
});
describe("validateToken", () => {
let token: string;
beforeEach(async () => {
const regResult = await service.register(
"test@example.com",
"password123",
);
if (regResult.success) {
await service.verifyEmail(regResult.verificationToken);
}
const loginResult = await service.login(
"test@example.com",
"password123",
"cookie",
);
if (loginResult.success) {
token = loginResult.token;
}
});
it("returns authenticated for valid token", async () => {
const result = await service.validateToken(token);
assert.equal(result.authenticated, true);
if (result.authenticated) {
assert.equal(result.user.email, "test@example.com");
assert.notEqual(result.session, null);
}
});
it("returns unauthenticated for invalid token", async () => {
const result = await service.validateToken("invalid-token");
assert.equal(result.authenticated, false);
assert.equal(result.user.isAnonymous(), true);
assert.equal(result.session, null);
});
});
describe("logout", () => {
it("invalidates the session", async () => {
const regResult = await service.register(
"test@example.com",
"password123",
);
if (regResult.success) {
await service.verifyEmail(regResult.verificationToken);
}
const loginResult = await service.login(
"test@example.com",
"password123",
"cookie",
);
assert.equal(loginResult.success, true);
if (!loginResult.success) return;
const token = loginResult.token;
// Token should be valid before logout
const beforeLogout = await service.validateToken(token);
assert.equal(beforeLogout.authenticated, true);
// Logout
await service.logout(token);
// Token should be invalid after logout
const afterLogout = await service.validateToken(token);
assert.equal(afterLogout.authenticated, false);
});
});
describe("logoutAllSessions", () => {
it("invalidates all user sessions", async () => {
const regResult = await service.register(
"test@example.com",
"password123",
);
if (regResult.success) {
await service.verifyEmail(regResult.verificationToken);
}
// Create multiple sessions
const login1 = await service.login(
"test@example.com",
"password123",
"cookie",
);
const login2 = await service.login(
"test@example.com",
"password123",
"bearer",
);
assert.equal(login1.success, true);
assert.equal(login2.success, true);
if (!login1.success || !login2.success) return;
// Both should be valid
const before1 = await service.validateToken(login1.token);
const before2 = await service.validateToken(login2.token);
assert.equal(before1.authenticated, true);
assert.equal(before2.authenticated, true);
// Logout all
const user = await store.getUserByEmail("test@example.com");
const count = await service.logoutAllSessions(user!.id);
assert.equal(count, 2);
// Both should be invalid
const after1 = await service.validateToken(login1.token);
const after2 = await service.validateToken(login2.token);
assert.equal(after1.authenticated, false);
assert.equal(after2.authenticated, false);
});
});
describe("verifyEmail", () => {
it("activates user with valid token", async () => {
const regResult = await service.register(
"test@example.com",
"password123",
);
assert.equal(regResult.success, true);
if (!regResult.success) return;
const result = await service.verifyEmail(
regResult.verificationToken,
);
assert.equal(result.success, true);
// User should now be active and can login
const loginResult = await service.login(
"test@example.com",
"password123",
"cookie",
);
assert.equal(loginResult.success, true);
});
it("fails with invalid token", async () => {
const result = await service.verifyEmail("invalid-token");
assert.equal(result.success, false);
if (!result.success) {
assert.equal(
result.error,
"Invalid or expired verification token",
);
}
});
it("fails when token already used", async () => {
const regResult = await service.register(
"test@example.com",
"password123",
);
assert.equal(regResult.success, true);
if (!regResult.success) return;
// First verification succeeds
const result1 = await service.verifyEmail(
regResult.verificationToken,
);
assert.equal(result1.success, true);
// Second verification fails (token deleted)
const result2 = await service.verifyEmail(
regResult.verificationToken,
);
assert.equal(result2.success, false);
});
});
describe("createPasswordResetToken", () => {
it("returns token for existing user", async () => {
const regResult = await service.register(
"test@example.com",
"password123",
);
assert.equal(regResult.success, true);
const result =
await service.createPasswordResetToken("test@example.com");
assert.notEqual(result, null);
assert.ok(result!.token.length > 0);
});
it("returns null for unknown email", async () => {
const result = await service.createPasswordResetToken(
"unknown@example.com",
);
assert.equal(result, null);
});
});
describe("resetPassword", () => {
it("changes password with valid token", async () => {
const regResult = await service.register(
"test@example.com",
"oldpassword",
);
if (regResult.success) {
await service.verifyEmail(regResult.verificationToken);
}
const resetToken =
await service.createPasswordResetToken("test@example.com");
assert.notEqual(resetToken, null);
const result = await service.resetPassword(
resetToken!.token,
"newpassword",
);
assert.equal(result.success, true);
// Old password should no longer work
const loginOld = await service.login(
"test@example.com",
"oldpassword",
"cookie",
);
assert.equal(loginOld.success, false);
// New password should work
const loginNew = await service.login(
"test@example.com",
"newpassword",
"cookie",
);
assert.equal(loginNew.success, true);
});
it("fails with invalid token", async () => {
const result = await service.resetPassword(
"invalid-token",
"newpassword",
);
assert.equal(result.success, false);
if (!result.success) {
assert.equal(result.error, "Invalid or expired reset token");
}
});
it("invalidates all existing sessions", async () => {
const regResult = await service.register(
"test@example.com",
"password123",
);
if (regResult.success) {
await service.verifyEmail(regResult.verificationToken);
}
// Create a session
const loginResult = await service.login(
"test@example.com",
"password123",
"cookie",
);
assert.equal(loginResult.success, true);
if (!loginResult.success) return;
const sessionToken = loginResult.token;
// Reset password
const resetToken =
await service.createPasswordResetToken("test@example.com");
await service.resetPassword(resetToken!.token, "newpassword");
// Old session should be invalid
const validateResult = await service.validateToken(sessionToken);
assert.equal(validateResult.authenticated, false);
});
});
});


@@ -4,7 +4,12 @@
// password reset, and email verification.
import type { Request as ExpressRequest } from "express";
import {
type AnonymousUser,
anonymousUser,
type User,
type UserId,
} from "../user";
import { hashPassword, verifyPassword } from "./password";
import type { AuthStore } from "./store";
import {
@@ -12,7 +17,7 @@ import {
parseAuthorizationHeader,
SESSION_COOKIE_NAME,
} from "./token";
import { type SessionData, type TokenId, tokenLifetimes } from "./types";
type LoginResult =
| { success: true; token: string; user: User }
@@ -24,6 +29,11 @@ type RegisterResult =
type SimpleResult = { success: true } | { success: false; error: string };
// Result of validating a request/token - contains both user and session
export type AuthResult =
| { authenticated: true; user: User; session: SessionData }
| { authenticated: false; user: AnonymousUser; session: null };
export class AuthService {
constructor(private store: AuthStore) {}
@@ -68,7 +78,7 @@ export class AuthService {
// === Session Validation ===
async validateRequest(request: ExpressRequest): Promise<AuthResult> {
// Try cookie first (for web requests)
let token = this.extractCookieToken(request);
@@ -78,38 +88,40 @@ export class AuthService {
}
if (!token) {
return { authenticated: false, user: anonymousUser, session: null };
}
return this.validateToken(token);
}
async validateToken(token: string): Promise<AuthResult> {
const tokenId = hashToken(token) as TokenId;
const session = await this.store.getSession(tokenId);
if (!session) {
return { authenticated: false, user: anonymousUser, session: null };
}
if (session.tokenType !== "session") {
return { authenticated: false, user: anonymousUser, session: null };
}
const user = await this.store.getUserById(session.userId as UserId);
if (!user || !user.isActive()) {
return { authenticated: false, user: anonymousUser, session: null };
}
// Update last used (fire and forget)
this.store.updateLastUsed(tokenId).catch(() => {});
return { authenticated: true, user, session };
}
private extractCookieToken(request: ExpressRequest): string | null {
const cookies = request.get("Cookie");
if (!cookies) {
return null;
}
for (const cookie of cookies.split(";")) {
const [name, ...valueParts] = cookie.trim().split("=");
@@ -240,7 +252,9 @@ export class AuthService {
extractToken(request: ExpressRequest): string | null {
// Try Authorization header first
const token = parseAuthorizationHeader(request.get("Authorization"));
if (token) {
return token;
}
// Try cookie
return this.extractCookieToken(request);

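The `AuthResult` discriminated union introduced in this diff is the interesting shape here: checking `authenticated` narrows both `user` and `session` at once. A minimal consumer sketch (the inline types mirror the diff loosely; `describeAuth` itself is hypothetical, not framework code):

```typescript
// Loose stand-in for the AuthResult union from auth/service.ts.
type AuthResult =
  | { authenticated: true; user: { email: string }; session: { userId: string } }
  | { authenticated: false; user: { isAnonymous: () => boolean }; session: null };

function describeAuth(result: AuthResult): string {
  if (result.authenticated) {
    // Narrowing on the discriminant gives us a non-null session here,
    // so no separate null check is needed.
    return `user ${result.user.email} (session for ${result.session.userId})`;
  }
  return "anonymous";
}
```

This is why the diff replaces the old `MaybeUser` return with `AuthResult`: handlers that need the session no longer have to re-fetch it after validating the token.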

@@ -0,0 +1,321 @@
// Tests for auth/store.ts (InMemoryAuthStore)
// Pure unit tests - no database needed
import assert from "node:assert/strict";
import { after, before, beforeEach, describe, it } from "node:test";
import type { UserId } from "../user";
import { InMemoryAuthStore } from "./store";
import { hashToken } from "./token";
import type { TokenId } from "./types";
describe("InMemoryAuthStore", () => {
let store: InMemoryAuthStore;
beforeEach(() => {
store = new InMemoryAuthStore();
});
describe("createUser", () => {
it("creates a user with pending status", async () => {
const user = await store.createUser({
email: "test@example.com",
passwordHash: "hash123",
displayName: "Test User",
});
assert.equal(user.email, "test@example.com");
assert.equal(user.displayName, "Test User");
assert.equal(user.status, "pending");
});
it("creates a user without displayName", async () => {
const user = await store.createUser({
email: "test@example.com",
passwordHash: "hash123",
});
assert.equal(user.email, "test@example.com");
assert.equal(user.displayName, undefined);
});
it("generates a unique id", async () => {
const user1 = await store.createUser({
email: "test1@example.com",
passwordHash: "hash123",
});
const user2 = await store.createUser({
email: "test2@example.com",
passwordHash: "hash456",
});
assert.notEqual(user1.id, user2.id);
});
});
describe("getUserByEmail", () => {
it("returns user when found", async () => {
await store.createUser({
email: "test@example.com",
passwordHash: "hash123",
});
const user = await store.getUserByEmail("test@example.com");
assert.notEqual(user, null);
assert.equal(user!.email, "test@example.com");
});
it("is case-insensitive", async () => {
await store.createUser({
email: "Test@Example.COM",
passwordHash: "hash123",
});
const user = await store.getUserByEmail("test@example.com");
assert.notEqual(user, null);
});
it("returns null when not found", async () => {
const user = await store.getUserByEmail("notfound@example.com");
assert.equal(user, null);
});
});
describe("getUserById", () => {
it("returns user when found", async () => {
const created = await store.createUser({
email: "test@example.com",
passwordHash: "hash123",
});
const user = await store.getUserById(created.id);
assert.notEqual(user, null);
assert.equal(user!.id, created.id);
});
it("returns null when not found", async () => {
const user = await store.getUserById("nonexistent" as UserId);
assert.equal(user, null);
});
});
describe("getUserPasswordHash", () => {
it("returns hash when found", async () => {
const user = await store.createUser({
email: "test@example.com",
passwordHash: "hash123",
});
const hash = await store.getUserPasswordHash(user.id);
assert.equal(hash, "hash123");
});
it("returns null when not found", async () => {
const hash = await store.getUserPasswordHash(
"nonexistent" as UserId,
);
assert.equal(hash, null);
});
});
describe("setUserPassword", () => {
it("updates password hash", async () => {
const user = await store.createUser({
email: "test@example.com",
passwordHash: "oldhash",
});
await store.setUserPassword(user.id, "newhash");
const hash = await store.getUserPasswordHash(user.id);
assert.equal(hash, "newhash");
});
});
describe("updateUserEmailVerified", () => {
it("sets user status to active", async () => {
const created = await store.createUser({
email: "test@example.com",
passwordHash: "hash123",
});
assert.equal(created.status, "pending");
await store.updateUserEmailVerified(created.id);
const user = await store.getUserById(created.id);
assert.equal(user!.status, "active");
});
});
describe("createSession", () => {
it("creates a session with token", async () => {
const user = await store.createUser({
email: "test@example.com",
passwordHash: "hash123",
});
const { token, session } = await store.createSession({
userId: user.id,
tokenType: "session",
authMethod: "cookie",
expiresAt: new Date(Date.now() + 3600000),
});
assert.ok(token.length > 0);
assert.equal(session.userId, user.id);
assert.equal(session.tokenType, "session");
assert.equal(session.authMethod, "cookie");
});
it("stores metadata", async () => {
const user = await store.createUser({
email: "test@example.com",
passwordHash: "hash123",
});
const { session } = await store.createSession({
userId: user.id,
tokenType: "session",
authMethod: "cookie",
expiresAt: new Date(Date.now() + 3600000),
userAgent: "Mozilla/5.0",
ipAddress: "127.0.0.1",
});
assert.equal(session.userAgent, "Mozilla/5.0");
assert.equal(session.ipAddress, "127.0.0.1");
});
});
describe("getSession", () => {
it("returns session when found and not expired", async () => {
const user = await store.createUser({
email: "test@example.com",
passwordHash: "hash123",
});
const { token } = await store.createSession({
userId: user.id,
tokenType: "session",
authMethod: "cookie",
expiresAt: new Date(Date.now() + 3600000), // 1 hour from now
});
const tokenId = hashToken(token) as TokenId;
const session = await store.getSession(tokenId);
assert.notEqual(session, null);
assert.equal(session!.userId, user.id);
});
it("returns null for expired session", async () => {
const user = await store.createUser({
email: "test@example.com",
passwordHash: "hash123",
});
const { token } = await store.createSession({
userId: user.id,
tokenType: "session",
authMethod: "cookie",
expiresAt: new Date(Date.now() - 1000), // Expired 1 second ago
});
const tokenId = hashToken(token) as TokenId;
const session = await store.getSession(tokenId);
assert.equal(session, null);
});
it("returns null for nonexistent session", async () => {
const session = await store.getSession("nonexistent" as TokenId);
assert.equal(session, null);
});
});
describe("deleteSession", () => {
it("removes the session", async () => {
const user = await store.createUser({
email: "test@example.com",
passwordHash: "hash123",
});
const { token } = await store.createSession({
userId: user.id,
tokenType: "session",
authMethod: "cookie",
expiresAt: new Date(Date.now() + 3600000),
});
const tokenId = hashToken(token) as TokenId;
await store.deleteSession(tokenId);
const session = await store.getSession(tokenId);
assert.equal(session, null);
});
});
describe("deleteUserSessions", () => {
it("removes all sessions for user", async () => {
const user = await store.createUser({
email: "test@example.com",
passwordHash: "hash123",
});
const { token: token1 } = await store.createSession({
userId: user.id,
tokenType: "session",
authMethod: "cookie",
expiresAt: new Date(Date.now() + 3600000),
});
const { token: token2 } = await store.createSession({
userId: user.id,
tokenType: "session",
authMethod: "bearer",
expiresAt: new Date(Date.now() + 3600000),
});
const count = await store.deleteUserSessions(user.id);
assert.equal(count, 2);
const session1 = await store.getSession(
hashToken(token1) as TokenId,
);
const session2 = await store.getSession(
hashToken(token2) as TokenId,
);
assert.equal(session1, null);
assert.equal(session2, null);
});
it("returns 0 when user has no sessions", async () => {
const count = await store.deleteUserSessions(
"nonexistent" as UserId,
);
assert.equal(count, 0);
});
});
describe("updateLastUsed", () => {
it("updates lastUsedAt timestamp", async () => {
const user = await store.createUser({
email: "test@example.com",
passwordHash: "hash123",
});
const { token } = await store.createSession({
userId: user.id,
tokenType: "session",
authMethod: "cookie",
expiresAt: new Date(Date.now() + 3600000),
});
const tokenId = hashToken(token) as TokenId;
const beforeUpdate = await store.getSession(tokenId);
assert.equal(beforeUpdate!.lastUsedAt, undefined);
await store.updateLastUsed(tokenId);
const afterUpdate = await store.getSession(tokenId);
assert.ok(afterUpdate!.lastUsedAt instanceof Date);
});
});
});

View File

@@ -3,7 +3,7 @@
// Authentication storage interface and in-memory implementation.
// The interface allows easy migration to PostgreSQL later.
-import { User, type UserId } from "../user";
+import { AuthenticatedUser, type User, type UserId } from "../user";
import { generateToken, hashToken } from "./token";
import type { AuthMethod, SessionData, TokenId, TokenType } from "./types";
@@ -75,7 +75,9 @@ export class InMemoryAuthStore implements AuthStore {
async getSession(tokenId: TokenId): Promise<SessionData | null> {
const session = this.sessions.get(tokenId);
-if (!session) return null;
+if (!session) {
+return null;
+}
// Check expiration
if (new Date() > session.expiresAt) {
@@ -110,7 +112,9 @@ export class InMemoryAuthStore implements AuthStore {
async getUserByEmail(email: string): Promise<User | null> {
const userId = this.usersByEmail.get(email.toLowerCase());
-if (!userId) return null;
+if (!userId) {
+return null;
+}
return this.users.get(userId) ?? null;
}
@@ -119,7 +123,7 @@
}
async createUser(data: CreateUserData): Promise<User> {
-const user = User.create(data.email, {
+const user = AuthenticatedUser.create(data.email, {
displayName: data.displayName,
status: "pending", // Pending until email verified
});
@@ -147,7 +151,7 @@
const user = this.users.get(userId);
if (user) {
// Create new user with active status
-const updatedUser = User.create(user.email, {
+const updatedUser = AuthenticatedUser.create(user.email, {
id: user.id,
displayName: user.displayName,
status: "active",

View File

@@ -0,0 +1,94 @@
// Tests for auth/token.ts
// Pure unit tests - no database needed
import assert from "node:assert/strict";
import { describe, it } from "node:test";
import {
generateToken,
hashToken,
parseAuthorizationHeader,
SESSION_COOKIE_NAME,
} from "./token";
describe("token", () => {
describe("generateToken", () => {
it("generates a non-empty string", () => {
const token = generateToken();
assert.equal(typeof token, "string");
assert.ok(token.length > 0);
});
it("generates unique tokens", () => {
const tokens = new Set<string>();
for (let i = 0; i < 100; i++) {
tokens.add(generateToken());
}
assert.equal(tokens.size, 100);
});
it("generates base64url encoded tokens", () => {
const token = generateToken();
// base64url uses A-Z, a-z, 0-9, -, _
assert.match(token, /^[A-Za-z0-9_-]+$/);
});
});
describe("hashToken", () => {
it("returns a hex string", () => {
const hash = hashToken("test-token");
assert.match(hash, /^[a-f0-9]+$/);
});
it("returns consistent hash for same input", () => {
const hash1 = hashToken("test-token");
const hash2 = hashToken("test-token");
assert.equal(hash1, hash2);
});
it("returns different hash for different input", () => {
const hash1 = hashToken("token-1");
const hash2 = hashToken("token-2");
assert.notEqual(hash1, hash2);
});
it("returns 64 character hash (SHA-256)", () => {
const hash = hashToken("test-token");
assert.equal(hash.length, 64);
});
});
describe("parseAuthorizationHeader", () => {
it("returns null for undefined header", () => {
assert.equal(parseAuthorizationHeader(undefined), null);
});
it("returns null for empty string", () => {
assert.equal(parseAuthorizationHeader(""), null);
});
it("returns null for non-bearer auth", () => {
assert.equal(parseAuthorizationHeader("Basic abc123"), null);
});
it("returns null for malformed header", () => {
assert.equal(parseAuthorizationHeader("Bearer"), null);
assert.equal(parseAuthorizationHeader("Bearer token extra"), null);
});
it("extracts token from valid bearer header", () => {
assert.equal(parseAuthorizationHeader("Bearer abc123"), "abc123");
});
it("is case-insensitive for Bearer keyword", () => {
assert.equal(parseAuthorizationHeader("bearer abc123"), "abc123");
assert.equal(parseAuthorizationHeader("BEARER abc123"), "abc123");
});
});
describe("SESSION_COOKIE_NAME", () => {
it("is defined", () => {
assert.equal(typeof SESSION_COOKIE_NAME, "string");
assert.ok(SESSION_COOKIE_NAME.length > 0);
});
});
});

View File

@@ -19,7 +19,9 @@ function hashToken(token: string): string {
// Parse token from Authorization header
function parseAuthorizationHeader(header: string | undefined): string | null {
-if (!header) return null;
+if (!header) {
+return null;
+}
const parts = header.split(" ");
if (parts.length !== 2 || parts[0].toLowerCase() !== "bearer") {

View File

@@ -0,0 +1,253 @@
// Tests for auth/types.ts
// Pure unit tests - no database needed
import assert from "node:assert/strict";
import { describe, it } from "node:test";
import { z } from "zod";
import { AuthenticatedUser, anonymousUser } from "../user";
import {
authMethodParser,
forgotPasswordInputParser,
loginInputParser,
registerInputParser,
resetPasswordInputParser,
Session,
sessionDataParser,
tokenLifetimes,
tokenTypeParser,
} from "./types";
describe("auth/types", () => {
describe("tokenTypeParser", () => {
it("accepts valid token types", () => {
assert.equal(tokenTypeParser.parse("session"), "session");
assert.equal(
tokenTypeParser.parse("password_reset"),
"password_reset",
);
assert.equal(tokenTypeParser.parse("email_verify"), "email_verify");
});
it("rejects invalid token types", () => {
assert.throws(() => tokenTypeParser.parse("invalid"));
});
});
describe("authMethodParser", () => {
it("accepts valid auth methods", () => {
assert.equal(authMethodParser.parse("cookie"), "cookie");
assert.equal(authMethodParser.parse("bearer"), "bearer");
});
it("rejects invalid auth methods", () => {
assert.throws(() => authMethodParser.parse("basic"));
});
});
describe("sessionDataParser", () => {
it("accepts valid session data", () => {
const data = {
tokenId: "abc123",
userId: "user-1",
tokenType: "session",
authMethod: "cookie",
createdAt: new Date(),
expiresAt: new Date(),
};
const result = sessionDataParser.parse(data);
assert.equal(result.tokenId, "abc123");
assert.equal(result.userId, "user-1");
});
it("coerces date strings to dates", () => {
const data = {
tokenId: "abc123",
userId: "user-1",
tokenType: "session",
authMethod: "cookie",
createdAt: "2025-01-01T00:00:00Z",
expiresAt: "2025-01-02T00:00:00Z",
};
const result = sessionDataParser.parse(data);
assert.ok(result.createdAt instanceof Date);
assert.ok(result.expiresAt instanceof Date);
});
it("accepts optional fields", () => {
const data = {
tokenId: "abc123",
userId: "user-1",
tokenType: "session",
authMethod: "cookie",
createdAt: new Date(),
expiresAt: new Date(),
lastUsedAt: new Date(),
userAgent: "Mozilla/5.0",
ipAddress: "127.0.0.1",
isUsed: true,
};
const result = sessionDataParser.parse(data);
assert.equal(result.userAgent, "Mozilla/5.0");
assert.equal(result.ipAddress, "127.0.0.1");
assert.equal(result.isUsed, true);
});
});
describe("loginInputParser", () => {
it("accepts valid login input", () => {
const result = loginInputParser.parse({
email: "test@example.com",
password: "secret",
});
assert.equal(result.email, "test@example.com");
assert.equal(result.password, "secret");
});
it("rejects invalid email", () => {
assert.throws(() =>
loginInputParser.parse({
email: "not-an-email",
password: "secret",
}),
);
});
it("rejects empty password", () => {
assert.throws(() =>
loginInputParser.parse({
email: "test@example.com",
password: "",
}),
);
});
});
describe("registerInputParser", () => {
it("accepts valid registration input", () => {
const result = registerInputParser.parse({
email: "test@example.com",
password: "password123",
displayName: "Test User",
});
assert.equal(result.email, "test@example.com");
assert.equal(result.password, "password123");
assert.equal(result.displayName, "Test User");
});
it("accepts registration without displayName", () => {
const result = registerInputParser.parse({
email: "test@example.com",
password: "password123",
});
assert.equal(result.displayName, undefined);
});
it("rejects password shorter than 8 characters", () => {
assert.throws(() =>
registerInputParser.parse({
email: "test@example.com",
password: "short",
}),
);
});
});
describe("forgotPasswordInputParser", () => {
it("accepts valid email", () => {
const result = forgotPasswordInputParser.parse({
email: "test@example.com",
});
assert.equal(result.email, "test@example.com");
});
it("rejects invalid email", () => {
assert.throws(() =>
forgotPasswordInputParser.parse({
email: "invalid",
}),
);
});
});
describe("resetPasswordInputParser", () => {
it("accepts valid reset input", () => {
const result = resetPasswordInputParser.parse({
token: "abc123",
password: "newpassword",
});
assert.equal(result.token, "abc123");
assert.equal(result.password, "newpassword");
});
it("rejects empty token", () => {
assert.throws(() =>
resetPasswordInputParser.parse({
token: "",
password: "newpassword",
}),
);
});
it("rejects password shorter than 8 characters", () => {
assert.throws(() =>
resetPasswordInputParser.parse({
token: "abc123",
password: "short",
}),
);
});
});
describe("tokenLifetimes", () => {
it("defines session lifetime", () => {
assert.ok(tokenLifetimes.session > 0);
// 30 days in ms
assert.equal(tokenLifetimes.session, 30 * 24 * 60 * 60 * 1000);
});
it("defines password_reset lifetime", () => {
assert.ok(tokenLifetimes.password_reset > 0);
// 1 hour in ms
assert.equal(tokenLifetimes.password_reset, 1 * 60 * 60 * 1000);
});
it("defines email_verify lifetime", () => {
assert.ok(tokenLifetimes.email_verify > 0);
// 24 hours in ms
assert.equal(tokenLifetimes.email_verify, 24 * 60 * 60 * 1000);
});
});
describe("Session", () => {
it("wraps authenticated session", () => {
const user = AuthenticatedUser.create("test@example.com", {
id: "user-1",
});
const sessionData = {
tokenId: "token-1",
userId: "user-1",
tokenType: "session" as const,
authMethod: "cookie" as const,
createdAt: new Date(),
expiresAt: new Date(),
};
const session = new Session(sessionData, user);
assert.equal(session.isAuthenticated(), true);
assert.equal(session.getUser(), user);
assert.equal(session.getData(), sessionData);
assert.equal(session.tokenId, "token-1");
assert.equal(session.userId, "user-1");
});
it("wraps anonymous session", () => {
const session = new Session(null, anonymousUser);
assert.equal(session.isAuthenticated(), false);
assert.equal(session.getUser(), anonymousUser);
assert.equal(session.getData(), null);
assert.equal(session.tokenId, undefined);
assert.equal(session.userId, undefined);
});
});
});

View File

@@ -62,3 +62,35 @@ export const tokenLifetimes: Record<TokenType, number> = {
password_reset: 1 * 60 * 60 * 1000, // 1 hour
email_verify: 24 * 60 * 60 * 1000, // 24 hours
};
// Import here to avoid circular dependency at module load time
import type { User } from "../user";
// Session wrapper class providing a consistent interface for handlers.
// Always present on Call (never null), but may represent an anonymous session.
export class Session {
constructor(
private readonly data: SessionData | null,
private readonly user: User,
) {}
getUser(): User {
return this.user;
}
getData(): SessionData | null {
return this.data;
}
isAuthenticated(): boolean {
return !this.user.isAnonymous();
}
get tokenId(): string | undefined {
return this.data?.tokenId;
}
get userId(): string | undefined {
return this.data?.userId;
}
}
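The new Session wrapper lets handlers branch on one object instead of null-checking raw session data. A condensed, self-contained sketch of the idea (the inlined `User` and `SessionData` stand-ins are assumptions for illustration, not the framework's real types):

```typescript
// Stand-in types (assumptions; the real ones live in user.ts and auth/types.ts).
type SessionData = { tokenId: string; userId: string };
type User = { email: string; isAnonymous(): boolean };

class Session {
  constructor(
    private readonly data: SessionData | null,
    private readonly user: User,
  ) {}
  getUser(): User {
    return this.user;
  }
  isAuthenticated(): boolean {
    return !this.user.isAnonymous();
  }
  get userId(): string | undefined {
    return this.data?.userId;
  }
}

const alice: User = { email: "alice@example.com", isAnonymous: () => false };
const anon: User = { email: "", isAnonymous: () => true };

const authed = new Session({ tokenId: "t1", userId: "u1" }, alice);
const guest = new Session(null, anon);

// Handlers ask one question instead of null-checking session data:
console.log(authed.isAuthenticated(), authed.userId); // → true u1
console.log(guest.isAuthenticated(), guest.userId); // → false undefined
```

Because the session is always present on Call (anonymous rather than null), handler code never needs a separate "no session" branch.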

View File

@@ -0,0 +1,24 @@
// Tests for basic/login.ts
// These tests verify the route structure and export
import assert from "node:assert/strict";
import { describe, it } from "node:test";
import { loginRoute } from "./login";
describe("basic/login", () => {
describe("loginRoute", () => {
it("has correct path", () => {
assert.equal(loginRoute.path, "/login");
});
it("handles GET and POST methods", () => {
assert.ok(loginRoute.methods.includes("GET"));
assert.ok(loginRoute.methods.includes("POST"));
assert.equal(loginRoute.methods.length, 2);
});
it("has a handler function", () => {
assert.equal(typeof loginRoute.handler, "function");
});
});
});

View File

@@ -0,0 +1,62 @@
import { SESSION_COOKIE_NAME } from "../auth/token";
import { tokenLifetimes } from "../auth/types";
import { request } from "../request";
import { html, redirect, render } from "../request/util";
import type { Call, Result, Route } from "../types";
const loginHandler = async (call: Call): Promise<Result> => {
if (call.method === "GET") {
const c = await render("basic/login", {});
return html(c);
}
// POST - handle login
const { email, password } = call.request.body;
if (!email || !password) {
const c = await render("basic/login", {
error: "Email and password are required",
email,
});
return html(c);
}
const result = await request.auth.login(email, password, "cookie", {
userAgent: call.request.get("User-Agent"),
ipAddress: call.request.ip,
});
if (!result.success) {
const c = await render("basic/login", {
error: result.error,
email,
});
return html(c);
}
// Success - set cookie and redirect to home
const redirectResult = redirect("/");
redirectResult.cookies = [
{
name: SESSION_COOKIE_NAME,
value: result.token,
options: {
httpOnly: true,
secure: false, // Set to true in production with HTTPS
sameSite: "lax",
maxAge: tokenLifetimes.session,
path: "/",
},
},
];
return redirectResult;
};
const loginRoute: Route = {
path: "/login",
methods: ["GET", "POST"],
handler: loginHandler,
};
export { loginRoute };

View File

@@ -0,0 +1,24 @@
// Tests for basic/logout.ts
// These tests verify the route structure and export
import assert from "node:assert/strict";
import { describe, it } from "node:test";
import { logoutRoute } from "./logout";
describe("basic/logout", () => {
describe("logoutRoute", () => {
it("has correct path", () => {
assert.equal(logoutRoute.path, "/logout");
});
it("handles GET and POST methods", () => {
assert.ok(logoutRoute.methods.includes("GET"));
assert.ok(logoutRoute.methods.includes("POST"));
assert.equal(logoutRoute.methods.length, 2);
});
it("has a handler function", () => {
assert.equal(typeof logoutRoute.handler, "function");
});
});
});

View File

@@ -0,0 +1,38 @@
import { SESSION_COOKIE_NAME } from "../auth/token";
import { request } from "../request";
import { redirect } from "../request/util";
import type { Call, Result, Route } from "../types";
const logoutHandler = async (call: Call): Promise<Result> => {
// Extract token from cookie and invalidate the session
const token = request.auth.extractToken(call.request);
if (token) {
await request.auth.logout(token);
}
// Clear the cookie and redirect to login
const redirectResult = redirect("/login");
redirectResult.cookies = [
{
name: SESSION_COOKIE_NAME,
value: "",
options: {
httpOnly: true,
secure: false,
sameSite: "lax",
maxAge: 0,
path: "/",
},
},
];
return redirectResult;
};
const logoutRoute: Route = {
path: "/logout",
methods: ["GET", "POST"],
handler: logoutHandler,
};
export { logoutRoute };

View File

@@ -0,0 +1,73 @@
// Tests for basic/routes.ts
// These tests verify the route structure and exports
import assert from "node:assert/strict";
import { describe, it } from "node:test";
import { routes } from "./routes";
describe("basic/routes", () => {
describe("routes object", () => {
it("exports routes as an object", () => {
assert.equal(typeof routes, "object");
});
it("contains hello route", () => {
assert.ok("hello" in routes);
assert.equal(routes.hello.path, "/hello");
assert.ok(routes.hello.methods.includes("GET"));
});
it("contains home route", () => {
assert.ok("home" in routes);
assert.equal(routes.home.path, "/");
assert.ok(routes.home.methods.includes("GET"));
});
it("contains login route", () => {
assert.ok("login" in routes);
assert.equal(routes.login.path, "/login");
});
it("contains logout route", () => {
assert.ok("logout" in routes);
assert.equal(routes.logout.path, "/logout");
});
it("all routes have handlers", () => {
for (const [name, route] of Object.entries(routes)) {
assert.equal(
typeof route.handler,
"function",
`Route ${name} should have a handler function`,
);
}
});
it("all routes have methods array", () => {
for (const [name, route] of Object.entries(routes)) {
assert.ok(
Array.isArray(route.methods),
`Route ${name} should have methods array`,
);
assert.ok(
route.methods.length > 0,
`Route ${name} should have at least one method`,
);
}
});
it("all routes have path string", () => {
for (const [name, route] of Object.entries(routes)) {
assert.equal(
typeof route.path,
"string",
`Route ${name} should have a path string`,
);
assert.ok(
route.path.startsWith("/"),
`Route ${name} path should start with /`,
);
}
});
});
});

View File

@@ -0,0 +1,58 @@
import { DateTime } from "ts-luxon";
import { get, User } from "../hydrators/user";
import { request } from "../request";
import { html, render } from "../request/util";
import type { Call, Result, Route } from "../types";
import { loginRoute } from "./login";
import { logoutRoute } from "./logout";
const routes: Record<string, Route> = {
hello: {
path: "/hello",
methods: ["GET"],
handler: async (call: Call): Promise<Result> => {
const now = DateTime.now();
const args: { now: DateTime; greeting?: string } = { now };
if (call.path !== "/hello") {
const greeting = call.path.replaceAll("/", "").replaceAll("-", " ");
args.greeting = greeting;
}
const c = await render("basic/hello", args);
return html(c);
},
},
home: {
path: "/",
methods: ["GET"],
handler: async (_call: Call): Promise<Result> => {
const _auth = request.auth;
const me = request.session.getUser();
const id = me.id;
console.log(`*** id: ${id}`);
const u = await get(id);
const email = u?.email || "anonymous@example.com";
const name = u?.display_name || "anonymous";
const showLogin = me.isAnonymous();
const showLogout = !me.isAnonymous();
const c = await render("basic/home", {
name,
email,
showLogin,
showLogout,
});
return html(c);
},
},
login: loginRoute,
logout: logoutRoute,
};
export { routes };

View File

@@ -30,7 +30,7 @@ function parseListenAddress(listen: string | undefined): {
if (lastColon === -1) {
// Just a port number
const port = parseInt(listen, 10);
-if (isNaN(port)) {
+if (Number.isNaN(port)) {
throw new Error(`Invalid listen address: ${listen}`);
}
return { host: defaultHost, port };
@@ -39,7 +39,7 @@
const host = listen.slice(0, lastColon);
const port = parseInt(listen.slice(lastColon + 1), 10);
-if (isNaN(port)) {
+if (Number.isNaN(port)) {
throw new Error(`Invalid port in listen address: ${listen}`);
}

View File

@@ -1,4 +1,4 @@
-import { Extensible } from "./interfaces";
+// This file belongs to the framework. You are not expected to modify it.
export type ContentType = string;

View File

@@ -0,0 +1,27 @@
// context.ts
//
// Request-scoped context using AsyncLocalStorage.
// Allows services to access request data (like the current user) without
// needing to pass Call through every function.
import { AsyncLocalStorage } from "node:async_hooks";
import { anonymousUser, type User } from "./user";
type RequestContext = {
user: User;
};
const asyncLocalStorage = new AsyncLocalStorage<RequestContext>();
// Run a function within a request context
function runWithContext<T>(context: RequestContext, fn: () => T): T {
return asyncLocalStorage.run(context, fn);
}
// Get the current user from context, or AnonymousUser if not in a request
function getCurrentUser(): User {
const context = asyncLocalStorage.getStore();
return context?.user ?? anonymousUser;
}
export { getCurrentUser, runWithContext, type RequestContext };
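The pattern context.ts describes can be exercised in isolation. A minimal, self-contained sketch using Node's AsyncLocalStorage directly (the `User` stand-in is an assumption; the real type lives in user.ts):

```typescript
import { AsyncLocalStorage } from "node:async_hooks";

// Stand-in user type (assumption for illustration).
type User = { name: string };
const anonymousUser: User = { name: "anonymous" };

const als = new AsyncLocalStorage<{ user: User }>();

function getCurrentUser(): User {
  // Falls back to the anonymous user when called outside a request scope.
  return als.getStore()?.user ?? anonymousUser;
}

// A "service" deep in the call stack needs no Call parameter threaded to it:
function whoAmI(): string {
  return getCurrentUser().name;
}

console.log(whoAmI()); // → anonymous (outside any request scope)
als.run({ user: { name: "alice" } }, () => {
  console.log(whoAmI()); // → alice
});
console.log(whoAmI()); // → anonymous (the scope has exited)
```

The store is scoped to the async call chain started by `run`, so concurrent requests each see their own user without any shared mutable state.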

View File

@@ -0,0 +1,50 @@
import nunjucks from "nunjucks";
import { db, migrate, migrationStatus } from "../database";
import { formatError, formatErrorHtml } from "../errors";
import { getLogs, log } from "../logging";
// FIXME: This doesn't belong here; move it somewhere else.
const conf = {
templateEngine: () => {
return {
renderTemplate: (template: string, context: object) => {
return nunjucks.renderString(template, context);
},
};
},
};
const database = {
db,
migrate,
migrationStatus,
};
const logging = {
log,
getLogs,
};
const random = {
randomNumber: () => {
return Math.random();
},
};
const misc = {
sleep: (ms: number) => {
return new Promise((resolve) => setTimeout(resolve, ms));
},
};
// Keep this asciibetically sorted
const core = {
conf,
database,
errors: { formatError, formatErrorHtml },
logging,
misc,
random,
};
export { core };

View File

@@ -0,0 +1,285 @@
// Tests for database.ts
// Requires test PostgreSQL: docker compose -f docker-compose.test.yml up -d
import assert from "node:assert/strict";
import { after, before, beforeEach, describe, it } from "node:test";
import {
connectionConfig,
db,
migrate,
migrationStatus,
PostgresAuthStore,
pool,
raw,
rawPool,
} from "./database";
import type { UserId } from "./user";
describe("database", () => {
before(async () => {
// Run migrations to set up schema
await migrate();
});
after(async () => {
await pool.end();
});
describe("connectionConfig", () => {
it("has required fields", () => {
assert.ok("host" in connectionConfig);
assert.ok("port" in connectionConfig);
assert.ok("user" in connectionConfig);
assert.ok("password" in connectionConfig);
assert.ok("database" in connectionConfig);
});
it("port is a number", () => {
assert.equal(typeof connectionConfig.port, "number");
});
});
describe("raw", () => {
it("executes raw SQL queries", async () => {
const result = await raw<{ one: number }>("SELECT 1 as one");
assert.equal(result.length, 1);
assert.equal(result[0].one, 1);
});
it("supports parameterized queries", async () => {
const result = await raw<{ sum: number }>(
"SELECT $1::int + $2::int as sum",
[2, 3],
);
assert.equal(result[0].sum, 5);
});
});
describe("db (Kysely instance)", () => {
it("can execute SELECT queries", async () => {
const result = await db
.selectFrom("users")
.select("id")
.limit(1)
.execute();
// May be empty, just verify it runs
assert.ok(Array.isArray(result));
});
});
describe("rawPool", () => {
it("is a pg Pool instance", () => {
assert.ok(rawPool.query !== undefined);
});
it("can execute queries", async () => {
const result = await rawPool.query("SELECT 1 as one");
assert.equal(result.rows[0].one, 1);
});
});
describe("migrate", () => {
it("runs without error when migrations are up to date", async () => {
// Should not throw
await migrate();
});
});
describe("migrationStatus", () => {
it("returns applied and pending arrays", async () => {
const status = await migrationStatus();
assert.ok(Array.isArray(status.applied));
assert.ok(Array.isArray(status.pending));
});
it("shows framework migrations as applied", async () => {
const status = await migrationStatus();
// At least the users migration should be applied
const hasUsersMigration = status.applied.some((m) =>
m.includes("users"),
);
assert.ok(hasUsersMigration);
});
});
describe("PostgresAuthStore", () => {
let store: PostgresAuthStore;
before(() => {
store = new PostgresAuthStore();
});
beforeEach(async () => {
// Clean up test data before each test
await rawPool.query("DELETE FROM sessions");
await rawPool.query("DELETE FROM user_credentials");
await rawPool.query("DELETE FROM user_emails");
await rawPool.query("DELETE FROM users");
});
describe("createUser", () => {
it("creates a user with pending status", async () => {
const user = await store.createUser({
email: "test@example.com",
passwordHash: "hash123",
displayName: "Test User",
});
assert.equal(user.email, "test@example.com");
assert.equal(user.displayName, "Test User");
assert.equal(user.status, "pending");
});
it("stores the password hash", async () => {
const user = await store.createUser({
email: "test@example.com",
passwordHash: "secrethash",
});
const hash = await store.getUserPasswordHash(user.id);
assert.equal(hash, "secrethash");
});
});
describe("getUserByEmail", () => {
it("returns user when found", async () => {
await store.createUser({
email: "find@example.com",
passwordHash: "hash",
});
const user = await store.getUserByEmail("find@example.com");
assert.notEqual(user, null);
assert.equal(user!.email, "find@example.com");
});
it("is case-insensitive", async () => {
await store.createUser({
email: "UPPER@EXAMPLE.COM",
passwordHash: "hash",
});
const user = await store.getUserByEmail("upper@example.com");
assert.notEqual(user, null);
});
it("returns null when not found", async () => {
const user = await store.getUserByEmail("notfound@example.com");
assert.equal(user, null);
});
});
describe("getUserById", () => {
it("returns user when found", async () => {
const created = await store.createUser({
email: "test@example.com",
passwordHash: "hash",
});
const user = await store.getUserById(created.id);
assert.notEqual(user, null);
assert.equal(user!.id, created.id);
});
it("returns null when not found", async () => {
const user = await store.getUserById(
"00000000-0000-0000-0000-000000000000" as UserId,
);
assert.equal(user, null);
});
});
describe("setUserPassword", () => {
it("updates the password hash", async () => {
const user = await store.createUser({
email: "test@example.com",
passwordHash: "oldhash",
});
await store.setUserPassword(user.id, "newhash");
const hash = await store.getUserPasswordHash(user.id);
assert.equal(hash, "newhash");
});
});
describe("updateUserEmailVerified", () => {
it("sets user status to active", async () => {
const created = await store.createUser({
email: "test@example.com",
passwordHash: "hash",
});
assert.equal(created.status, "pending");
await store.updateUserEmailVerified(created.id);
const user = await store.getUserById(created.id);
assert.equal(user!.status, "active");
});
});
describe("session operations", () => {
let userId: UserId;
beforeEach(async () => {
const user = await store.createUser({
email: "session@example.com",
passwordHash: "hash",
});
userId = user.id;
});
it("creates and retrieves sessions", async () => {
const { token, session } = await store.createSession({
userId,
tokenType: "session",
authMethod: "cookie",
expiresAt: new Date(Date.now() + 3600000),
});
assert.ok(token.length > 0);
assert.equal(session.userId, userId);
assert.equal(session.tokenType, "session");
});
it("deletes sessions", async () => {
const { session } = await store.createSession({
userId,
tokenType: "session",
authMethod: "cookie",
expiresAt: new Date(Date.now() + 3600000),
});
await store.deleteSession(session.tokenId as any);
// Session should be soft-deleted (revoked)
const retrieved = await store.getSession(
session.tokenId as any,
);
assert.equal(retrieved, null);
});
it("deletes all user sessions", async () => {
await store.createSession({
userId,
tokenType: "session",
authMethod: "cookie",
expiresAt: new Date(Date.now() + 3600000),
});
await store.createSession({
userId,
tokenType: "session",
authMethod: "bearer",
expiresAt: new Date(Date.now() + 3600000),
});
const count = await store.deleteUserSessions(userId);
assert.equal(count, 2);
});
});
});
});


@@ -0,0 +1,548 @@
// database.ts
// PostgreSQL database access with Kysely query builder and simple migrations
import * as fs from "node:fs";
import * as path from "node:path";
import {
type Generated,
Kysely,
PostgresDialect,
type Selectable,
sql,
} from "kysely";
import { Pool } from "pg";
import type {
AuthStore,
CreateSessionData,
CreateUserData,
} from "./auth/store";
import { generateToken, hashToken } from "./auth/token";
import type { SessionData, TokenId } from "./auth/types";
import type { Domain } from "./types";
import { AuthenticatedUser, type User, type UserId } from "./user";
// Connection configuration (supports environment variable overrides)
const connectionConfig = {
host: process.env.DB_HOST ?? "localhost",
port: Number(process.env.DB_PORT ?? 5432),
user: process.env.DB_USER ?? "diachron",
password: process.env.DB_PASSWORD ?? "diachron",
database: process.env.DB_NAME ?? "diachron",
};
// Database schema types for Kysely
// Generated<T> marks columns with database defaults (optional on insert)
interface UsersTable {
id: string;
status: Generated<string>;
display_name: string | null;
created_at: Generated<Date>;
updated_at: Generated<Date>;
}
interface UserEmailsTable {
id: string;
user_id: string;
email: string;
normalized_email: string;
is_primary: Generated<boolean>;
is_verified: Generated<boolean>;
created_at: Generated<Date>;
verified_at: Date | null;
revoked_at: Date | null;
}
interface UserCredentialsTable {
id: string;
user_id: string;
credential_type: Generated<string>;
password_hash: string | null;
created_at: Generated<Date>;
updated_at: Generated<Date>;
}
interface SessionsTable {
id: Generated<string>;
token_hash: string;
user_id: string;
user_email_id: string | null;
token_type: string;
auth_method: string;
created_at: Generated<Date>;
expires_at: Date;
revoked_at: Date | null;
ip_address: string | null;
user_agent: string | null;
is_used: Generated<boolean | null>;
}
interface Database {
users: UsersTable;
user_emails: UserEmailsTable;
user_credentials: UserCredentialsTable;
sessions: SessionsTable;
}
// Create the connection pool
const pool = new Pool(connectionConfig);
// Create the Kysely instance
const db = new Kysely<Database>({
dialect: new PostgresDialect({ pool }),
});
// Raw pool access for when you need it
const rawPool = pool;
// Execute raw SQL (for when Kysely doesn't fit)
async function raw<T = unknown>(
query: string,
params: unknown[] = [],
): Promise<T[]> {
const result = await pool.query(query, params);
return result.rows as T[];
}
// ============================================================================
// Migrations
// ============================================================================
// Migration file naming convention:
// yyyy-mm-dd_ss_description.sql
// e.g., 2025-01-15_01_initial.sql, 2025-01-15_02_add_users.sql
//
// Migration directories: diachron/migrations/ (framework) and migrations/ (app)
const FRAMEWORK_MIGRATIONS_DIR = path.join(__dirname, "diachron/migrations");
const APP_MIGRATIONS_DIR = path.join(__dirname, "migrations");
const MIGRATIONS_TABLE = "_migrations";
interface MigrationRecord {
id: number;
name: string;
applied_at: Date;
}
// Ensure migrations table exists
async function ensureMigrationsTable(): Promise<void> {
await pool.query(`
CREATE TABLE IF NOT EXISTS ${MIGRATIONS_TABLE} (
id SERIAL PRIMARY KEY,
name TEXT NOT NULL UNIQUE,
applied_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
)
`);
}
// Get list of applied migrations
async function getAppliedMigrations(): Promise<string[]> {
const result = await pool.query<MigrationRecord>(
`SELECT name FROM ${MIGRATIONS_TABLE} ORDER BY name`,
);
return result.rows.map((r) => r.name);
}
// Get pending migration files
function getMigrationFiles(kind: Domain): string[] {
const dir = kind === "fw" ? FRAMEWORK_MIGRATIONS_DIR : APP_MIGRATIONS_DIR;
if (!fs.existsSync(dir)) {
return [];
}
const root = __dirname;
const mm = fs
.readdirSync(dir)
.filter((f) => f.endsWith(".sql"))
.filter((f) => /^\d{4}-\d{2}-\d{2}_\d{2}_/.test(f))
.map((f) => `${dir}/${f}`)
.map((f) => f.replace(`${root}/`, ""))
.sort();
return mm;
}
// Run a single migration
async function runMigration(filename: string): Promise<void> {
// getMigrationFiles returns paths relative to __dirname, so resolve against
// it rather than the current working directory
const filepath = path.join(__dirname, filename);
const content = fs.readFileSync(filepath, "utf-8");
process.stdout.write(` Migration: ${filename}...`);
// Run migration in a transaction
const client = await pool.connect();
try {
await client.query("BEGIN");
await client.query(content);
await client.query(
`INSERT INTO ${MIGRATIONS_TABLE} (name) VALUES ($1)`,
[filename],
);
await client.query("COMMIT");
console.log(" ✓");
} catch (err) {
console.log(" ✗");
const message = err instanceof Error ? err.message : String(err);
console.error(` Error: ${message}`);
await client.query("ROLLBACK");
throw err;
} finally {
client.release();
}
}
function getAllMigrationFiles() {
const fw_files = getMigrationFiles("fw");
const app_files = getMigrationFiles("app");
const all = [...fw_files, ...app_files];
return all;
}
// Run all pending migrations
async function migrate(): Promise<void> {
await ensureMigrationsTable();
const applied = new Set(await getAppliedMigrations());
const all = getAllMigrationFiles();
const pending = all.filter((f) => !applied.has(f));
if (pending.length === 0) {
console.log("No pending migrations");
return;
}
console.log(`Applying ${pending.length} migration(s):`);
for (const file of pending) {
await runMigration(file);
}
}
// List migration status
async function migrationStatus(): Promise<{
applied: string[];
pending: string[];
}> {
await ensureMigrationsTable();
const applied = new Set(await getAppliedMigrations());
const all = getAllMigrationFiles();
return {
applied: all.filter((f) => applied.has(f)),
pending: all.filter((f) => !applied.has(f)),
};
}
// ============================================================================
// PostgresAuthStore - Database-backed authentication storage
// ============================================================================
class PostgresAuthStore implements AuthStore {
// Session operations
async createSession(
data: CreateSessionData,
): Promise<{ token: string; session: SessionData }> {
const token = generateToken();
const tokenHash = hashToken(token);
const row = await db
.insertInto("sessions")
.values({
token_hash: tokenHash,
user_id: data.userId,
token_type: data.tokenType,
auth_method: data.authMethod,
expires_at: data.expiresAt,
user_agent: data.userAgent ?? null,
ip_address: data.ipAddress ?? null,
})
.returningAll()
.executeTakeFirstOrThrow();
const session: SessionData = {
tokenId: row.token_hash,
userId: row.user_id,
tokenType: row.token_type as SessionData["tokenType"],
authMethod: row.auth_method as SessionData["authMethod"],
createdAt: row.created_at,
expiresAt: row.expires_at,
userAgent: row.user_agent ?? undefined,
ipAddress: row.ip_address ?? undefined,
isUsed: row.is_used ?? undefined,
};
return { token, session };
}
async getSession(tokenId: TokenId): Promise<SessionData | null> {
const row = await db
.selectFrom("sessions")
.selectAll()
.where("token_hash", "=", tokenId)
.where("expires_at", ">", new Date())
.where("revoked_at", "is", null)
.executeTakeFirst();
if (!row) {
return null;
}
return {
tokenId: row.token_hash,
userId: row.user_id,
tokenType: row.token_type as SessionData["tokenType"],
authMethod: row.auth_method as SessionData["authMethod"],
createdAt: row.created_at,
expiresAt: row.expires_at,
userAgent: row.user_agent ?? undefined,
ipAddress: row.ip_address ?? undefined,
isUsed: row.is_used ?? undefined,
};
}
async updateLastUsed(_tokenId: TokenId): Promise<void> {
// The new schema doesn't have last_used_at column
// This is now a no-op; session activity tracking could be added later
}
async deleteSession(tokenId: TokenId): Promise<void> {
// Soft delete by setting revoked_at
await db
.updateTable("sessions")
.set({ revoked_at: new Date() })
.where("token_hash", "=", tokenId)
.execute();
}
async deleteUserSessions(userId: UserId): Promise<number> {
const result = await db
.updateTable("sessions")
.set({ revoked_at: new Date() })
.where("user_id", "=", userId)
.where("revoked_at", "is", null)
.executeTakeFirst();
return Number(result.numUpdatedRows);
}
// User operations
async getUserByEmail(email: string): Promise<User | null> {
// Find user through user_emails table
const normalizedEmail = email.toLowerCase().trim();
const row = await db
.selectFrom("user_emails")
.innerJoin("users", "users.id", "user_emails.user_id")
.select([
"users.id",
"users.status",
"users.display_name",
"users.created_at",
"users.updated_at",
"user_emails.email",
])
.where("user_emails.normalized_email", "=", normalizedEmail)
.where("user_emails.revoked_at", "is", null)
.executeTakeFirst();
if (!row) {
return null;
}
return this.rowToUser(row);
}
async getUserById(userId: UserId): Promise<User | null> {
// Get user with their primary email
const row = await db
.selectFrom("users")
.leftJoin("user_emails", (join) =>
join
.onRef("user_emails.user_id", "=", "users.id")
.on("user_emails.is_primary", "=", true)
.on("user_emails.revoked_at", "is", null),
)
.select([
"users.id",
"users.status",
"users.display_name",
"users.created_at",
"users.updated_at",
"user_emails.email",
])
.where("users.id", "=", userId)
.executeTakeFirst();
if (!row) {
return null;
}
return this.rowToUser(row);
}
async createUser(data: CreateUserData): Promise<User> {
const userId = crypto.randomUUID();
const emailId = crypto.randomUUID();
const credentialId = crypto.randomUUID();
const now = new Date();
const normalizedEmail = data.email.toLowerCase().trim();
// Create user record
await db
.insertInto("users")
.values({
id: userId,
display_name: data.displayName ?? null,
status: "pending",
created_at: now,
updated_at: now,
})
.execute();
// Create user_email record
await db
.insertInto("user_emails")
.values({
id: emailId,
user_id: userId,
email: data.email,
normalized_email: normalizedEmail,
is_primary: true,
is_verified: false,
created_at: now,
})
.execute();
// Create user_credential record
await db
.insertInto("user_credentials")
.values({
id: credentialId,
user_id: userId,
credential_type: "password",
password_hash: data.passwordHash,
created_at: now,
updated_at: now,
})
.execute();
return new AuthenticatedUser({
id: userId,
email: data.email,
displayName: data.displayName,
status: "pending",
roles: [],
permissions: [],
createdAt: now,
updatedAt: now,
});
}
async getUserPasswordHash(userId: UserId): Promise<string | null> {
const row = await db
.selectFrom("user_credentials")
.select("password_hash")
.where("user_id", "=", userId)
.where("credential_type", "=", "password")
.executeTakeFirst();
return row?.password_hash ?? null;
}
async setUserPassword(userId: UserId, passwordHash: string): Promise<void> {
const now = new Date();
// Try to update existing credential
const result = await db
.updateTable("user_credentials")
.set({ password_hash: passwordHash, updated_at: now })
.where("user_id", "=", userId)
.where("credential_type", "=", "password")
.executeTakeFirst();
// If no existing credential, create one
if (Number(result.numUpdatedRows) === 0) {
await db
.insertInto("user_credentials")
.values({
id: crypto.randomUUID(),
user_id: userId,
credential_type: "password",
password_hash: passwordHash,
created_at: now,
updated_at: now,
})
.execute();
}
// Update user's updated_at
await db
.updateTable("users")
.set({ updated_at: now })
.where("id", "=", userId)
.execute();
}
async updateUserEmailVerified(userId: UserId): Promise<void> {
const now = new Date();
// Update user_emails to mark as verified
await db
.updateTable("user_emails")
.set({
is_verified: true,
verified_at: now,
})
.where("user_id", "=", userId)
.where("is_primary", "=", true)
.execute();
// Update user status to active
await db
.updateTable("users")
.set({
status: "active",
updated_at: now,
})
.where("id", "=", userId)
.execute();
}
// Helper to convert database row to User object
private rowToUser(row: {
id: string;
status: string;
display_name: string | null;
created_at: Date;
updated_at: Date;
email: string | null;
}): User {
return new AuthenticatedUser({
id: row.id,
email: row.email ?? "unknown@example.com",
displayName: row.display_name ?? undefined,
status: row.status as "active" | "suspended" | "pending",
roles: [], // TODO: query from RBAC tables
permissions: [], // TODO: query from RBAC tables
createdAt: row.created_at,
updatedAt: row.updated_at,
});
}
}
// ============================================================================
// Exports
// ============================================================================
export {
db,
raw,
rawPool,
pool,
migrate,
migrationStatus,
connectionConfig,
PostgresAuthStore,
type Database,
};
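
The migration runner above depends entirely on the `yyyy-mm-dd_ss_description.sql` naming convention: zero-padded ISO dates plus a two-digit sequence number make a plain lexicographic sort identical to chronological order, which is all `migrate()` relies on. A minimal sketch with hypothetical filenames:

```typescript
// Hypothetical migration filenames following yyyy-mm-dd_ss_description.sql.
const files = [
  "2025-02-03_01_add_sessions.sql",
  "2025-01-15_02_add_users.sql",
  "2025-01-15_01_initial.sql",
];
// Zero-padded ISO dates and a two-digit sequence number mean lexicographic
// order is chronological order, including for same-day migrations.
const ordered = [...files].sort();
console.log(ordered);
```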


@@ -0,0 +1,17 @@
import { pool } from "../database";
import { dropTables, exitIfUnforced } from "./util";
async function main(): Promise<void> {
exitIfUnforced();
try {
await dropTables();
} finally {
await pool.end();
}
}
main().catch((err) => {
console.error("Failed to clear database:", err.message);
process.exit(1);
});


@@ -0,0 +1,26 @@
// reset-db.ts
// Development command to wipe the database and apply all migrations from scratch
import { migrate, pool } from "../database";
import { dropTables, exitIfUnforced } from "./util";
async function main(): Promise<void> {
exitIfUnforced();
try {
await dropTables();
console.log("");
await migrate();
console.log("");
console.log("Database reset complete.");
} finally {
await pool.end();
}
}
main().catch((err) => {
console.error("Failed to reset database:", err.message);
process.exit(1);
});


@@ -0,0 +1,42 @@
// FIXME: this is at the wrong level of specificity
import { connectionConfig, pool } from "../database";
const exitIfUnforced = () => {
const args = process.argv.slice(2);
// Require explicit confirmation unless --force is passed
if (!args.includes("--force")) {
console.error("This will DROP ALL TABLES in the database!");
console.error(` Database: ${connectionConfig.database}`);
console.error(
` Host: ${connectionConfig.host}:${connectionConfig.port}`,
);
console.error("");
console.error("Run with --force to proceed.");
process.exit(1);
}
};
const dropTables = async () => {
console.log("Dropping all tables...");
// Get all table names in the public schema
const result = await pool.query<{ tablename: string }>(`
SELECT tablename FROM pg_tables
WHERE schemaname = 'public'
`);
if (result.rows.length > 0) {
// Drop all tables with CASCADE to handle foreign key constraints
const tableNames = result.rows
.map((r) => `"${r.tablename}"`)
.join(", ");
await pool.query(`DROP TABLE IF EXISTS ${tableNames} CASCADE`);
console.log(`Dropped ${result.rows.length} table(s)`);
} else {
console.log("No tables to drop");
}
};
export { dropTables, exitIfUnforced };
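
`dropTables` collapses everything into a single `DROP TABLE ... CASCADE` statement so foreign-key ordering doesn't matter. The string-building step in isolation, with hypothetical `pg_tables` rows:

```typescript
// Hypothetical pg_tables query result rows.
const rows = [{ tablename: "users" }, { tablename: "user_emails" }];
// Quote each name and join, then drop everything in one CASCADE statement
// so foreign-key dependencies between tables don't dictate drop order.
const tableNames = rows.map((r) => `"${r.tablename}"`).join(", ");
const stmt = `DROP TABLE IF EXISTS ${tableNames} CASCADE`;
console.log(stmt);
```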

backend/diachron/errors.ts Normal file

@@ -0,0 +1,229 @@
// ANSI escape codes
const bold = "\x1b[1m";
const red = "\x1b[31m";
const cyan = "\x1b[36m";
const dim = "\x1b[2m";
const reset = "\x1b[0m";
interface ParsedFrame {
raw: string;
fn: string;
file: string;
line: string;
col: string;
isApp: boolean;
}
const frameRe = /^\s*at\s+(?:(.+?)\s+)?\(?((?:\/|[a-zA-Z]:\\).+?):(\d+):(\d+)\)?$/;
function parseFrame(line: string): ParsedFrame | null {
const m = line.match(frameRe);
if (!m) return null;
const fn = m[1] ?? "<anonymous>";
const file = m[2];
const lineNum = m[3];
const col = m[4];
const isApp =
!file.includes("node_modules") && !file.startsWith("node:");
return { raw: line, fn, file, line: lineNum, col, isApp };
}
function relativePath(absPath: string): string {
const marker = "backend/";
const idx = absPath.lastIndexOf(marker);
if (idx !== -1) return absPath.slice(idx);
return absPath;
}
function libraryName(file: string): string {
const nmIdx = file.indexOf("node_modules/");
if (nmIdx === -1) return "node";
const after = file.slice(nmIdx + "node_modules/".length);
// Handle scoped packages like @scope/pkg
if (after.startsWith("@")) {
const parts = after.split("/");
return `${parts[0]}/${parts[1]}`;
}
return after.split("/")[0];
}
interface ParsedError {
message: string;
frames: ParsedFrame[];
}
function parseError(error: unknown): ParsedError {
if (!(error instanceof Error)) {
return { message: String(error), frames: [] };
}
const message = error.message ?? String(error);
const stack = error.stack ?? "";
const lines = stack.split("\n");
const frameLines: string[] = [];
for (const line of lines) {
if (line.trimStart().startsWith("at ")) {
frameLines.push(line);
}
}
const frames = frameLines
.map(parseFrame)
.filter((f): f is ParsedFrame => f !== null);
return { message, frames };
}
// Group consecutive library frames into collapsed runs
type FrameGroup =
| { kind: "app"; frame: ParsedFrame }
| { kind: "lib"; count: number; names: string[] };
function groupFrames(frames: ParsedFrame[]): FrameGroup[] {
const groups: FrameGroup[] = [];
let i = 0;
while (i < frames.length) {
if (frames[i].isApp) {
groups.push({ kind: "app", frame: frames[i] });
i++;
} else {
const libNames = new Set<string>();
let count = 0;
while (i < frames.length && !frames[i].isApp) {
libNames.add(libraryName(frames[i].file));
count++;
i++;
}
groups.push({ kind: "lib", count, names: [...libNames] });
}
}
return groups;
}
function libSummary(count: number, names: string[]): string {
const s = count === 1 ? "" : "s";
return `... ${count} internal frame${s} (${names.join(", ")})`;
}
// --- Console formatting (ANSI) ---
function formatError(error: unknown): string {
const { message, frames } = parseError(error);
if (frames.length === 0) {
return `${bold}${red}ERROR${reset} ${message}`;
}
const parts: string[] = [];
parts.push(`${bold}${red}ERROR${reset} ${message}`);
parts.push("");
for (const group of groupFrames(frames)) {
if (group.kind === "app") {
const rel = relativePath(group.frame.file);
const loc = `${rel}:${group.frame.line}`;
parts.push(
` ${bold}${cyan}${loc.padEnd(24)}${reset}at ${group.frame.fn}`,
);
} else {
parts.push(
` ${dim}${libSummary(group.count, group.names)}${reset}`,
);
}
}
return parts.join("\n");
}
// --- HTML formatting (browser) ---
function esc(s: string): string {
return s
.replace(/&/g, "&amp;")
.replace(/</g, "&lt;")
.replace(/>/g, "&gt;")
.replace(/"/g, "&quot;");
}
function formatErrorHtml(error: unknown): string {
const { message, frames } = parseError(error);
const groups = groupFrames(frames);
let frameRows = "";
for (const group of groups) {
if (group.kind === "app") {
const rel = relativePath(group.frame.file);
const loc = `${rel}:${group.frame.line}`;
frameRows += `<tr class="app">
<td class="loc">${esc(loc)}</td>
<td class="fn">at ${esc(group.frame.fn)}</td>
</tr>\n`;
} else {
frameRows += `<tr class="lib">
<td colspan="2">${esc(libSummary(group.count, group.names))}</td>
</tr>\n`;
}
}
return `<!doctype html>
<html>
<head>
<meta charset="utf-8">
<title>Error</title>
<style>
* { margin: 0; padding: 0; box-sizing: border-box; }
body {
font-family: "SF Mono", "Menlo", "Consolas", monospace;
font-size: 14px;
background: #1a1a2e;
color: #e0e0e0;
padding: 40px;
}
.error-label {
display: inline-block;
background: #e74c3c;
color: #fff;
font-weight: 700;
font-size: 12px;
padding: 2px 8px;
border-radius: 3px;
letter-spacing: 0.5px;
}
.message {
margin-top: 12px;
font-size: 18px;
font-weight: 600;
color: #f8f8f2;
line-height: 1.4;
}
table {
margin-top: 24px;
border-collapse: collapse;
}
tr.app td { padding: 4px 0; }
tr.app .loc {
color: #56d4dd;
font-weight: 600;
padding-right: 24px;
white-space: nowrap;
}
tr.app .fn { color: #ccc; }
tr.lib td {
color: #666;
padding: 4px 0;
font-style: italic;
}
</style>
</head>
<body>
<span class="error-label">ERROR</span>
<div class="message">${esc(message)}</div>
<table>${frameRows}</table>
</body>
</html>`;
}
export { formatError, formatErrorHtml };
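
Everything in `errors.ts` hinges on `frameRe` correctly picking apart V8-style stack lines. A standalone check against a sample frame (the path below is hypothetical):

```typescript
// The stack-frame regex from errors.ts, applied to a sample V8 frame line.
const frameRe = /^\s*at\s+(?:(.+?)\s+)?\(?((?:\/|[a-zA-Z]:\\).+?):(\d+):(\d+)\)?$/;
const sample = "    at handler (/srv/app/backend/routes.ts:42:7)";
const m = sample.match(frameRe);
// Captures: [1] function name, [2] absolute file path, [3] line, [4] column.
console.log(m?.[1], m?.[2], m?.[3], m?.[4]);
```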


@@ -0,0 +1,13 @@
import { z } from "zod";
export const executionContextSchema = z.object({
diachron_root: z.string(),
});
export type ExecutionContext = z.infer<typeof executionContextSchema>;
export function parseExecutionContext(
env: Record<string, string | undefined>,
): ExecutionContext {
return executionContextSchema.parse(env);
}


@@ -0,0 +1,38 @@
import assert from "node:assert/strict";
import { describe, it } from "node:test";
import { ZodError } from "zod";
import {
executionContextSchema,
parseExecutionContext,
} from "./execution-context-schema";
describe("parseExecutionContext", () => {
it("parses valid executionContext with diachron_root", () => {
const env = { diachron_root: "/some/path" };
const result = parseExecutionContext(env);
assert.deepEqual(result, { diachron_root: "/some/path" });
});
it("throws ZodError when diachron_root is missing", () => {
const env = {};
assert.throws(() => parseExecutionContext(env), ZodError);
});
it("strips extra fields not in schema", () => {
const env = {
diachron_root: "/some/path",
EXTRA_VAR: "should be stripped",
};
const result = parseExecutionContext(env);
assert.deepEqual(result, { diachron_root: "/some/path" });
assert.equal("EXTRA_VAR" in result, false);
});
});
describe("executionContextSchema", () => {
it("requires diachron_root to be a string", () => {
const result = executionContextSchema.safeParse({ diachron_root: 123 });
assert.equal(result.success, false);
});
});


@@ -0,0 +1,5 @@
import { parseExecutionContext } from "./execution-context-schema";
const executionContext = parseExecutionContext(process.env);
export { executionContext };
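
Parsing `process.env` at module load means the process fails fast at startup if the environment is incomplete. A dependency-free sketch of the same behavior (assuming only `diachron_root` is required, as in the schema above):

```typescript
// Dependency-free sketch of executionContextSchema's behavior: the required
// key must be a string, and unknown keys are dropped from the result.
function parseExecutionContext(
  env: Record<string, string | undefined>,
): { diachron_root: string } {
  const root = env.diachron_root;
  if (typeof root !== "string") throw new Error("diachron_root is required");
  return { diachron_root: root };
}
const ctx = parseExecutionContext({ diachron_root: "/srv/app", EXTRA: "x" });
console.log(ctx);
```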


@@ -1,4 +1,4 @@
import { Extensible } from "./interfaces"; // This file belongs to the framework. You are not expected to modify it.
export type HttpCode = {
code: number;


@@ -0,0 +1,29 @@
import { Kysely, PostgresDialect } from "kysely";
import { Pool } from "pg";
import type { DB } from "../../generated/db";
import { connectionConfig } from "../database";
const db = new Kysely<DB>({
dialect: new PostgresDialect({
pool: new Pool(connectionConfig),
}),
log(event) {
if (event.level === "query") {
// FIXME: Wire this up to the logging system
console.log("SQL:", event.query.sql);
console.log("Params:", event.query.parameters);
}
},
});
abstract class Hydrator<T> {
public db: Kysely<DB>;
protected abstract table: string;
constructor() {
this.db = db;
}
}
export { Hydrator, db };


@@ -0,0 +1 @@
export type Hydrators = {};


@@ -0,0 +1,44 @@
// Test setup for hydrator tests
// Run: DB_PORT=5433 DB_USER=diachron_test DB_PASSWORD=diachron_test DB_NAME=diachron_test npx tsx --test tests/*.test.ts
import { Pool } from "pg";
import { connectionConfig, migrate } from "../../database";
const pool = new Pool(connectionConfig);
export async function setupTestDatabase(): Promise<void> {
await migrate();
}
export async function cleanupTables(): Promise<void> {
// Clean in reverse dependency order
await pool.query("DELETE FROM user_emails");
await pool.query("DELETE FROM users");
}
export async function teardownTestDatabase(): Promise<void> {
await pool.end();
}
export async function insertTestUser(data: {
id: string;
displayName: string;
status: string;
email: string;
}): Promise<void> {
const emailId = crypto.randomUUID();
const normalizedEmail = data.email.toLowerCase().trim();
await pool.query(
`INSERT INTO users (id, display_name, status) VALUES ($1, $2, $3)`,
[data.id, data.displayName, data.status],
);
await pool.query(
`INSERT INTO user_emails (id, user_id, email, normalized_email, is_primary)
VALUES ($1, $2, $3, $4, true)`,
[emailId, data.id, data.email, normalizedEmail],
);
}
export { pool };


@@ -0,0 +1,98 @@
// Tests for user hydrator
// Run with: cd express && DB_PORT=5433 DB_USER=diachron_test DB_PASSWORD=diachron_test DB_NAME=diachron_test ../cmd npx tsx --test diachron/hydrators/tests/user.test.ts
import assert from "node:assert/strict";
import { after, before, beforeEach, describe, it } from "node:test";
import { get } from "../user";
import {
cleanupTables,
insertTestUser,
setupTestDatabase,
teardownTestDatabase,
} from "./setup";
describe("user hydrator", () => {
before(async () => {
await setupTestDatabase();
});
after(async () => {
await teardownTestDatabase();
});
beforeEach(async () => {
await cleanupTables();
});
describe("get", () => {
it("returns null for non-existent user", async () => {
const result = await get("00000000-0000-0000-0000-000000000000");
assert.equal(result, null);
});
it("returns user when found", async () => {
const userId = "cfae0a19-6515-4813-bc2d-1e032b72b203";
await insertTestUser({
id: userId,
displayName: "Test User",
status: "active",
email: "test@example.com",
});
const result = await get(userId);
assert.notEqual(result, null);
assert.equal(result!.id, userId);
assert.equal(result!.display_name, "Test User");
assert.equal(result!.status, "active");
assert.equal(result!.email, "test@example.com");
});
it("validates user data with zod parser", async () => {
const userId = crypto.randomUUID();
await insertTestUser({
id: userId,
displayName: "Valid User",
status: "active",
email: "valid@example.com",
});
const result = await get(userId);
// If we get here without throwing, parsing succeeded
assert.notEqual(result, null);
assert.equal(typeof result!.id, "string");
assert.equal(typeof result!.email, "string");
});
it("returns user with pending status", async () => {
const userId = crypto.randomUUID();
await insertTestUser({
id: userId,
displayName: "Pending User",
status: "pending",
email: "pending@example.com",
});
const result = await get(userId);
assert.notEqual(result, null);
assert.equal(result!.status, "pending");
});
it("returns user with suspended status", async () => {
const userId = crypto.randomUUID();
await insertTestUser({
id: userId,
displayName: "Suspended User",
status: "suspended",
email: "suspended@example.com",
});
const result = await get(userId);
assert.notEqual(result, null);
assert.equal(result!.status, "suspended");
});
});
});


@@ -0,0 +1,59 @@
import {
ColumnType,
Generated,
Insertable,
JSONColumnType,
Selectable,
Updateable,
} from "kysely";
import type { TypeID } from "typeid-js";
import { z } from "zod";
import { db, Hydrator } from "./hydrator";
const parser = z.object({
// id: z.uuidv7(),
id: z.uuid(),
display_name: z.string(),
// FIXME: status is duplicated elsewhere
status: z.union([
z.literal("active"),
z.literal("suspended"),
z.literal("pending"),
]),
email: z.email(),
});
const tp = parser.parse({
id: "cfae0a19-6515-4813-bc2d-1e032b72b203",
display_name: "foo",
status: "active",
email: "mw@philologue.net",
});
export type User = z.infer<typeof parser>;
const get = async (id: string): Promise<null | User> => {
const ret = await db
.selectFrom("users")
.where("users.id", "=", id)
.innerJoin("user_emails", "user_emails.user_id", "users.id")
.select([
"users.id",
"users.status",
"users.display_name",
"user_emails.email",
])
.executeTakeFirst();
if (ret === undefined) {
return null;
}
console.dir(ret);
const parsed = parser.parse(ret);
return parsed;
};
export { get };


@@ -0,0 +1,53 @@
// Tests for logging.ts
// Note: These tests verify the module structure and types.
// Full integration tests would require a running logging service.
import assert from "node:assert/strict";
import { describe, it } from "node:test";
// We can't easily test log() and getLogs() without mocking fetch,
// but we can verify the module exports correctly and types work.
describe("logging", () => {
describe("module structure", () => {
it("exports log function", async () => {
const { log } = await import("./logging");
assert.equal(typeof log, "function");
});
it("exports getLogs function", async () => {
const { getLogs } = await import("./logging");
assert.equal(typeof getLogs, "function");
});
});
describe("Message type", () => {
// Type-level tests - if these compile, the types are correct
it("accepts valid message sources", () => {
type MessageSource = "logging" | "diagnostic" | "user";
const sources: MessageSource[] = ["logging", "diagnostic", "user"];
assert.equal(sources.length, 3);
});
});
describe("FilterArgument type", () => {
// Type-level tests
it("accepts valid filter options", () => {
type FilterArgument = {
limit?: number;
before?: number;
after?: number;
match?: (string | RegExp)[];
};
const filter: FilterArgument = {
limit: 10,
before: Date.now(),
after: Date.now() - 3600000,
match: ["error", /warning/i],
};
assert.ok(filter.limit === 10);
});
});
});


@@ -15,8 +15,8 @@ type Message = {
text: AtLeastOne<string>;
};
const _m1: Message = { timestamp: 123, source: "logging", text: ["foo"] };
const _m2: Message = {
timestamp: 321,
source: "diagnostic",
text: ["ok", "whatever"],


@@ -0,0 +1,69 @@
// add-user.ts
// Management command to create users from the command line
import { hashPassword } from "../auth/password";
import { PostgresAuthStore, pool } from "../database";
async function main(): Promise<void> {
const args = process.argv.slice(2);
if (args.length < 2) {
console.error(
"Usage: ./mgmt add-user <email> <password> [--display-name <name>] [--active]",
);
process.exit(1);
}
const email = args[0];
const password = args[1];
// Parse optional flags
let displayName: string | undefined;
let makeActive = false;
for (let i = 2; i < args.length; i++) {
if (args[i] === "--display-name" && args[i + 1]) {
displayName = args[i + 1];
i++;
} else if (args[i] === "--active") {
makeActive = true;
}
}
try {
const store = new PostgresAuthStore();
// Check if user already exists
const existing = await store.getUserByEmail(email);
if (existing) {
console.error(`Error: User with email '${email}' already exists`);
process.exit(1);
}
// Hash password and create user
const passwordHash = await hashPassword(password);
const user = await store.createUser({
email,
passwordHash,
displayName,
});
// Optionally activate user immediately
if (makeActive) {
await store.updateUserEmailVerified(user.id);
console.log(
`Created and activated user: ${user.email} (${user.id})`,
);
} else {
console.log(`Created user: ${user.email} (${user.id})`);
console.log(" Status: pending (use --active to create as active)");
}
} finally {
await pool.end();
}
}
main().catch((err) => {
console.error("Failed to create user:", err.message);
process.exit(1);
});


@@ -0,0 +1,45 @@
// migrate.ts
// CLI script for running database migrations
import { migrate, migrationStatus, pool } from "./database";
async function main(): Promise<void> {
const command = process.argv[2] || "run";
try {
switch (command) {
case "run":
await migrate();
break;
case "status": {
const status = await migrationStatus();
console.log("Applied migrations:");
for (const name of status.applied) {
console.log(`${name}`);
}
if (status.pending.length > 0) {
console.log("\nPending migrations:");
for (const name of status.pending) {
console.log(`${name}`);
}
} else {
console.log("\nNo pending migrations");
}
break;
}
default:
console.error(`Unknown command: ${command}`);
console.error("Usage: migrate [run|status]");
process.exit(1);
}
} finally {
await pool.end();
}
}
main().catch((err) => {
console.error("Migration failed:", err);
process.exit(1);
});


@@ -0,0 +1,29 @@
-- 0001_users.sql
-- Create users table for authentication
CREATE TABLE users (
id UUID PRIMARY KEY,
status TEXT NOT NULL DEFAULT 'active',
display_name TEXT,
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
CREATE TABLE user_emails (
id UUID PRIMARY KEY,
user_id UUID NOT NULL REFERENCES users(id),
email TEXT NOT NULL,
normalized_email TEXT NOT NULL,
is_primary BOOLEAN NOT NULL DEFAULT FALSE,
is_verified BOOLEAN NOT NULL DEFAULT FALSE,
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
verified_at TIMESTAMPTZ,
revoked_at TIMESTAMPTZ
);
-- Enforce uniqueness only among *active* emails
CREATE UNIQUE INDEX user_emails_unique_active
ON user_emails (normalized_email)
WHERE revoked_at IS NULL;
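The partial index returns revoked addresses to the pool: uniqueness is enforced only where `revoked_at IS NULL`. An illustrative session (the user UUID is a hypothetical placeholder):

```sql
-- A second *active* row with the same normalized_email would violate
-- user_emails_unique_active. Once the first row is revoked, the address
-- can be registered again:
UPDATE user_emails SET revoked_at = NOW()
WHERE normalized_email = 'a@example.com' AND revoked_at IS NULL;

INSERT INTO user_emails (id, user_id, email, normalized_email)
VALUES (gen_random_uuid(), 'some-user-uuid', 'a@example.com', 'a@example.com');
```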


@@ -0,0 +1,26 @@
-- 0002_sessions.sql
-- Create sessions table for auth tokens
CREATE TABLE sessions (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
token_hash TEXT UNIQUE NOT NULL,
user_id UUID NOT NULL REFERENCES users(id),
user_email_id UUID REFERENCES user_emails(id),
token_type TEXT NOT NULL,
auth_method TEXT NOT NULL,
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
expires_at TIMESTAMPTZ NOT NULL,
revoked_at TIMESTAMPTZ,
ip_address INET,
user_agent TEXT,
is_used BOOLEAN DEFAULT FALSE
);
-- Index for user session lookups (logout all, etc.)
CREATE INDEX sessions_user_id_idx ON sessions (user_id);
-- Index for expiration cleanup
CREATE INDEX sessions_expires_at_idx ON sessions (expires_at);
-- Index for token type filtering
CREATE INDEX sessions_token_type_idx ON sessions (token_type);


@@ -0,0 +1,17 @@
-- 0003_user_credentials.sql
-- Create user_credentials table for password storage (extensible for other auth methods)
CREATE TABLE user_credentials (
id UUID PRIMARY KEY,
user_id UUID NOT NULL REFERENCES users(id),
credential_type TEXT NOT NULL DEFAULT 'password',
password_hash TEXT,
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
-- Each user can have at most one credential per type
CREATE UNIQUE INDEX user_credentials_user_type_idx ON user_credentials (user_id, credential_type);
-- Index for user lookups
CREATE INDEX user_credentials_user_id_idx ON user_credentials (user_id);


@@ -0,0 +1,20 @@
CREATE TABLE roles (
id UUID PRIMARY KEY,
name TEXT UNIQUE NOT NULL,
description TEXT
);
CREATE TABLE groups (
id UUID PRIMARY KEY,
name TEXT NOT NULL,
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
CREATE TABLE user_group_roles (
user_id UUID NOT NULL REFERENCES users(id),
group_id UUID NOT NULL REFERENCES groups(id),
role_id UUID NOT NULL REFERENCES roles(id),
granted_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
revoked_at TIMESTAMPTZ,
PRIMARY KEY (user_id, group_id, role_id)
);
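Grants in user_group_roles are soft-revoked via `revoked_at` rather than deleted, so queries for current access presumably filter on it; for example (`:user_id` is a hypothetical bind parameter):

```sql
-- Illustrative query: active roles for one user across all groups.
SELECT DISTINCT r.name
FROM user_group_roles ugr
JOIN roles r ON r.id = ugr.role_id
WHERE ugr.user_id = :user_id
AND ugr.revoked_at IS NULL;
```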


@@ -0,0 +1,14 @@
CREATE TABLE capabilities (
id UUID PRIMARY KEY,
name TEXT UNIQUE NOT NULL,
description TEXT
);
CREATE TABLE role_capabilities (
role_id UUID NOT NULL REFERENCES roles(id),
capability_id UUID NOT NULL REFERENCES capabilities(id),
granted_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
revoked_at TIMESTAMPTZ,
PRIMARY KEY (role_id, capability_id)
);


@@ -0,0 +1,39 @@
{
"name": "diachron",
"version": "1.0.0",
"description": "",
"main": "index.js",
"scripts": {
"test": "DB_PORT=5433 DB_USER=diachron_test DB_PASSWORD=diachron_test DB_NAME=diachron_test tsx --test '**/*.{test,spec}.ts'",
"test:watch": "DB_PORT=5433 DB_USER=diachron_test DB_PASSWORD=diachron_test DB_NAME=diachron_test tsx --test --watch '**/*.{test,spec}.ts'",
"nodemon": "nodemon dist/index.js",
"kysely-codegen": "kysely-codegen"
},
"keywords": [],
"author": "",
"license": "ISC",
"packageManager": "pnpm@10.12.4",
"dependencies": {
"@types/node": "^24.10.1",
"@types/nunjucks": "^3.2.6",
"@vercel/ncc": "^0.38.4",
"express": "^5.1.0",
"kysely": "^0.28.9",
"nodemon": "^3.1.11",
"nunjucks": "^3.2.4",
"path-to-regexp": "^8.3.0",
"pg": "^8.16.3",
"ts-luxon": "^6.2.0",
"ts-node": "^10.9.2",
"tsx": "^4.20.6",
"typeid-js": "^1.2.0",
"typescript": "^5.9.3",
"zod": "^4.1.12"
},
"devDependencies": {
"@biomejs/biome": "2.3.10",
"@types/express": "^5.0.5",
"@types/pg": "^8.16.0",
"kysely-codegen": "^0.19.0"
}
}

File diff suppressed because it is too large.


@@ -0,0 +1,25 @@
import { AuthService } from "../auth";
import { getCurrentUser } from "../context";
import { PostgresAuthStore } from "../database";
import type { User } from "../user";
import { html, redirect, render } from "./util";
const util = { html, redirect, render };
const session = {
getUser: (): User => {
return getCurrentUser();
},
};
// Initialize auth with PostgreSQL store
const authStore = new PostgresAuthStore();
const auth = new AuthService(authStore);
const request = {
auth,
session,
util,
};
export { request };


@@ -0,0 +1,45 @@
import { contentTypes } from "../content-types";
import { core } from "../core";
import { executionContext } from "../execution-context";
import { httpCodes } from "../http-codes";
import type { RedirectResult, Result } from "../types";
import { loadFile } from "../util";
import { request } from "./index";
type NoUser = {
[key: string]: unknown;
} & {
user?: never;
};
const render = async (path: string, ctx?: NoUser): Promise<string> => {
const fullPath = `${executionContext.diachron_root}/templates/${path}.html.njk`;
const template = await loadFile(fullPath);
const user = request.session.getUser();
const context = { user, ...ctx };
const engine = core.conf.templateEngine();
const retval = engine.renderTemplate(template, context);
return retval;
};
const html = (payload: string): Result => {
const retval: Result = {
code: httpCodes.success.OK,
result: payload,
contentType: contentTypes.text.html,
};
return retval;
};
const redirect = (location: string): RedirectResult => {
return {
code: httpCodes.redirection.SeeOther,
contentType: contentTypes.text.plain,
result: "",
redirect: location,
};
};
export { html, redirect, render };
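Handlers built on these helpers return plain Result objects, and the dispatcher can tell a redirect apart structurally. A self-contained sketch with the types re-declared inline (the real definitions live in diachron's types.ts; the literal codes here stand in for the httpCodes constants):

```typescript
// Minimal re-declaration of the Result shapes, for illustration only.
type Result = { code: number; contentType: string; result: string };
type RedirectResult = Result & { redirect: string };

// A redirect is just a Result carrying an extra `redirect` field,
// so a structural check suffices to discriminate.
const isRedirect = (r: Result): r is RedirectResult => "redirect" in r;

const html = (payload: string): Result => ({
code: 200, // OK
contentType: "text/html",
result: payload,
});

const redirect = (location: string): RedirectResult => ({
code: 303, // See Other
contentType: "text/plain",
result: "",
redirect: location,
});
```

Because the discriminator is structural rather than a class or enum tag, handlers can build results with object literals and the dispatcher still routes them correctly.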


@@ -0,0 +1,94 @@
import { contentTypes } from "./content-types";
import { httpCodes } from "./http-codes";
import express, {
type NextFunction,
type Request as ExpressRequest,
type Response as ExpressResponse,
} from "express";
import { isRedirect, InternalHandler, AuthenticationRequired, AuthorizationDenied, Call, type Method, type ProcessedRoute, methodParser, type Result, type Route, massageMethod } from "./types";
import { runWithContext } from "./context";
import { Session } from "./auth";
import { request } from "./request";
import { match } from "path-to-regexp";
type ProcessedRoutes = { [K in Method]: ProcessedRoute[] };
const processRoutes = (routes: Route[]): ProcessedRoutes => {
const retval: ProcessedRoutes = {
GET: [],
POST: [],
PUT: [],
PATCH: [],
DELETE: [],
};
routes.forEach((route: Route, _idx: number, _allRoutes: Route[]) => {
// const pattern /*: URLPattern */ = new URLPattern({ pathname: route.path });
const matcher = match<Record<string, string>>(route.path);
const methodList = route.methods;
const handler: InternalHandler = async (
expressRequest: ExpressRequest,
params: Record<string, string>,
): Promise<Result> => {
const method = massageMethod(expressRequest.method);
console.log("method", method);
if (!methodList.includes(method)) {
// XXX: Worth asserting this?
}
console.log("request.originalUrl", expressRequest.originalUrl);
// Authenticate the request
const auth = await request.auth.validateRequest(expressRequest);
const req: Call = {
pattern: route.path,
path: expressRequest.originalUrl,
method,
parameters: params,
request: expressRequest,
user: auth.user,
session: new Session(auth.session, auth.user),
};
try {
const retval = await runWithContext({ user: auth.user }, () =>
route.handler(req),
);
return retval;
} catch (error) {
// Handle authentication errors
if (error instanceof AuthenticationRequired) {
return {
code: httpCodes.clientErrors.Unauthorized,
contentType: contentTypes.application.json,
result: JSON.stringify({
error: "Authentication required",
}),
};
}
if (error instanceof AuthorizationDenied) {
return {
code: httpCodes.clientErrors.Forbidden,
contentType: contentTypes.application.json,
result: JSON.stringify({ error: "Access denied" }),
};
}
throw error;
}
};
for (const [_idx, method] of methodList.entries()) {
const pr: ProcessedRoute = { matcher, method, handler };
retval[method].push(pr);
}
});
return retval;
};
export { processRoutes };


@@ -0,0 +1,179 @@
// Tests for types.ts
// Pure unit tests
import assert from "node:assert/strict";
import { describe, it } from "node:test";
import type { Request as ExpressRequest } from "express";
import { Session } from "./auth/types";
import { contentTypes } from "./content-types";
import { httpCodes } from "./http-codes";
import {
AuthenticationRequired,
AuthorizationDenied,
type Call,
isRedirect,
massageMethod,
methodParser,
type Permission,
type RedirectResult,
type Result,
requireAuth,
requirePermission,
} from "./types";
import { AuthenticatedUser, anonymousUser } from "./user";
// Helper to create a minimal mock Call
function createMockCall(overrides: Partial<Call> = {}): Call {
const defaultSession = new Session(null, anonymousUser);
return {
pattern: "/test",
path: "/test",
method: "GET",
parameters: {},
request: {} as ExpressRequest,
user: anonymousUser,
session: defaultSession,
...overrides,
};
}
describe("types", () => {
describe("methodParser", () => {
it("accepts valid HTTP methods", () => {
assert.equal(methodParser.parse("GET"), "GET");
assert.equal(methodParser.parse("POST"), "POST");
assert.equal(methodParser.parse("PUT"), "PUT");
assert.equal(methodParser.parse("PATCH"), "PATCH");
assert.equal(methodParser.parse("DELETE"), "DELETE");
});
it("rejects invalid methods", () => {
assert.throws(() => methodParser.parse("get"));
assert.throws(() => methodParser.parse("OPTIONS"));
assert.throws(() => methodParser.parse("HEAD"));
assert.throws(() => methodParser.parse(""));
});
});
describe("massageMethod", () => {
it("converts lowercase to uppercase", () => {
assert.equal(massageMethod("get"), "GET");
assert.equal(massageMethod("post"), "POST");
assert.equal(massageMethod("put"), "PUT");
assert.equal(massageMethod("patch"), "PATCH");
assert.equal(massageMethod("delete"), "DELETE");
});
it("handles mixed case", () => {
assert.equal(massageMethod("Get"), "GET");
assert.equal(massageMethod("pOsT"), "POST");
});
it("throws for invalid methods", () => {
assert.throws(() => massageMethod("options"));
assert.throws(() => massageMethod("head"));
});
});
describe("isRedirect", () => {
it("returns true for redirect results", () => {
const result: RedirectResult = {
code: httpCodes.redirection.Found,
contentType: contentTypes.text.html,
result: "",
redirect: "/other",
};
assert.equal(isRedirect(result), true);
});
it("returns false for non-redirect results", () => {
const result: Result = {
code: httpCodes.success.OK,
contentType: contentTypes.text.html,
result: "hello",
};
assert.equal(isRedirect(result), false);
});
});
describe("AuthenticationRequired", () => {
it("has correct name and message", () => {
const err = new AuthenticationRequired();
assert.equal(err.name, "AuthenticationRequired");
assert.equal(err.message, "Authentication required");
});
it("is an instance of Error", () => {
const err = new AuthenticationRequired();
assert.ok(err instanceof Error);
});
});
describe("AuthorizationDenied", () => {
it("has correct name and message", () => {
const err = new AuthorizationDenied();
assert.equal(err.name, "AuthorizationDenied");
assert.equal(err.message, "Authorization denied");
});
it("is an instance of Error", () => {
const err = new AuthorizationDenied();
assert.ok(err instanceof Error);
});
});
describe("requireAuth", () => {
it("returns user for authenticated call", () => {
const user = AuthenticatedUser.create("test@example.com");
const session = new Session(null, user);
const call = createMockCall({ user, session });
const result = requireAuth(call);
assert.equal(result, user);
});
it("throws AuthenticationRequired for anonymous user", () => {
const call = createMockCall({ user: anonymousUser });
assert.throws(() => requireAuth(call), AuthenticationRequired);
});
});
describe("requirePermission", () => {
it("returns user when they have the permission", () => {
const user = AuthenticatedUser.create("test@example.com", {
permissions: ["posts:create" as Permission],
});
const session = new Session(null, user);
const call = createMockCall({ user, session });
const result = requirePermission(
call,
"posts:create" as Permission,
);
assert.equal(result, user);
});
it("throws AuthenticationRequired for anonymous user", () => {
const call = createMockCall({ user: anonymousUser });
assert.throws(
() => requirePermission(call, "posts:create" as Permission),
AuthenticationRequired,
);
});
it("throws AuthorizationDenied when missing permission", () => {
const user = AuthenticatedUser.create("test@example.com", {
permissions: ["posts:read" as Permission],
});
const session = new Session(null, user);
const call = createMockCall({ user, session });
assert.throws(
() => requirePermission(call, "posts:create" as Permission),
AuthorizationDenied,
);
});
});
});


@@ -2,20 +2,13 @@
 // FIXME: split this up into types used by app developers and types internal
 // to the framework.
-import {
-type Request as ExpressRequest,
-Response as ExpressResponse,
-} from "express";
+import type { Request as ExpressRequest } from "express";
 import type { MatchFunction } from "path-to-regexp";
 import { z } from "zod";
-import { type ContentType, contentTypes } from "./content-types";
-import { type HttpCode, httpCodes } from "./http-codes";
-import {
-AnonymousUser,
-type MaybeUser,
-type Permission,
-type User,
-} from "./user";
+import type { Session } from "./auth/types";
+import type { ContentType } from "./content-types";
+import type { HttpCode } from "./http-codes";
+import type { Permission, User } from "./user";
 const methodParser = z.union([
 z.literal("GET"),
@@ -36,12 +29,13 @@ export type Call = {
 pattern: string;
 path: string;
 method: Method;
-parameters: object;
+parameters: Record<string, string>;
 request: ExpressRequest;
-user: MaybeUser;
+user: User;
+session: Session;
 };
-export type InternalHandler = (req: ExpressRequest) => Promise<Result>;
+export type InternalHandler = (req: ExpressRequest, params: Record<string, string>) => Promise<Result>;
 export type Handler = (call: Call) => Promise<Result>;
 export type ProcessedRoute = {
@@ -50,12 +44,35 @@ export type ProcessedRoute = {
 handler: InternalHandler;
 };
+export type CookieOptions = {
+httpOnly?: boolean;
+secure?: boolean;
+sameSite?: "strict" | "lax" | "none";
+maxAge?: number;
+path?: string;
+};
+export type Cookie = {
+name: string;
+value: string;
+options?: CookieOptions;
+};
 export type Result = {
 code: HttpCode;
 contentType: ContentType;
 result: string;
+cookies?: Cookie[];
 };
+export type RedirectResult = Result & {
+redirect: string;
+};
+export function isRedirect(result: Result): result is RedirectResult {
+return "redirect" in result;
+}
 export type Route = {
 path: string;
 methods: Method[];
@@ -80,7 +97,7 @@ export class AuthorizationDenied extends Error {
 // Helper for handlers to require authentication
 export function requireAuth(call: Call): User {
-if (call.user === AnonymousUser) {
+if (call.user.isAnonymous()) {
 throw new AuthenticationRequired();
 }
 return call.user;
@@ -95,4 +112,6 @@ export function requirePermission(call: Call, permission: Permission): User {
 return user;
 }
+export type Domain = "app" | "fw";
 export { methodParser, massageMethod };


@@ -0,0 +1,213 @@
// Tests for user.ts
// These are pure unit tests - no database needed
import assert from "node:assert/strict";
import { describe, it } from "node:test";
import {
AnonymousUser,
AuthenticatedUser,
anonymousUser,
type Permission,
type Role,
} from "./user";
describe("User", () => {
describe("AuthenticatedUser.create", () => {
it("creates a user with default values", () => {
const user = AuthenticatedUser.create("test@example.com");
assert.equal(user.email, "test@example.com");
assert.equal(user.status, "active");
assert.equal(user.isAnonymous(), false);
assert.deepEqual([...user.roles], []);
assert.deepEqual([...user.permissions], []);
});
it("creates a user with custom values", () => {
const user = AuthenticatedUser.create("test@example.com", {
id: "custom-id",
displayName: "Test User",
status: "pending",
roles: ["admin"],
permissions: ["posts:create"],
});
assert.equal(user.id, "custom-id");
assert.equal(user.displayName, "Test User");
assert.equal(user.status, "pending");
assert.deepEqual([...user.roles], ["admin"]);
assert.deepEqual([...user.permissions], ["posts:create"]);
});
});
describe("status checks", () => {
it("isActive returns true for active users", () => {
const user = AuthenticatedUser.create("test@example.com", {
status: "active",
});
assert.equal(user.isActive(), true);
});
it("isActive returns false for suspended users", () => {
const user = AuthenticatedUser.create("test@example.com", {
status: "suspended",
});
assert.equal(user.isActive(), false);
});
it("isActive returns false for pending users", () => {
const user = AuthenticatedUser.create("test@example.com", {
status: "pending",
});
assert.equal(user.isActive(), false);
});
});
describe("role checks", () => {
it("hasRole returns true when user has the role", () => {
const user = AuthenticatedUser.create("test@example.com", {
roles: ["admin", "editor"],
});
assert.equal(user.hasRole("admin"), true);
assert.equal(user.hasRole("editor"), true);
});
it("hasRole returns false when user does not have the role", () => {
const user = AuthenticatedUser.create("test@example.com", {
roles: ["user"],
});
assert.equal(user.hasRole("admin"), false);
});
it("hasAnyRole returns true when user has at least one role", () => {
const user = AuthenticatedUser.create("test@example.com", {
roles: ["editor"],
});
assert.equal(user.hasAnyRole(["admin", "editor"]), true);
});
it("hasAnyRole returns false when user has none of the roles", () => {
const user = AuthenticatedUser.create("test@example.com", {
roles: ["user"],
});
assert.equal(user.hasAnyRole(["admin", "editor"]), false);
});
it("hasAllRoles returns true when user has all roles", () => {
const user = AuthenticatedUser.create("test@example.com", {
roles: ["admin", "editor", "user"],
});
assert.equal(user.hasAllRoles(["admin", "editor"]), true);
});
it("hasAllRoles returns false when user is missing a role", () => {
const user = AuthenticatedUser.create("test@example.com", {
roles: ["admin"],
});
assert.equal(user.hasAllRoles(["admin", "editor"]), false);
});
});
describe("permission checks", () => {
it("hasPermission returns true for direct permissions", () => {
const user = AuthenticatedUser.create("test@example.com", {
permissions: ["posts:create" as Permission],
});
assert.equal(
user.hasPermission("posts:create" as Permission),
true,
);
});
it("hasPermission returns true for role-derived permissions", () => {
const user = AuthenticatedUser.create("test@example.com", {
roles: ["admin" as Role],
});
// admin role has users:read, users:create, users:update, users:delete
assert.equal(user.hasPermission("users:read" as Permission), true);
assert.equal(
user.hasPermission("users:delete" as Permission),
true,
);
});
it("hasPermission returns false when permission not granted", () => {
const user = AuthenticatedUser.create("test@example.com", {
roles: ["user" as Role],
});
// user role only has users:read
assert.equal(
user.hasPermission("users:delete" as Permission),
false,
);
});
it("can() is a convenience method for hasPermission", () => {
const user = AuthenticatedUser.create("test@example.com", {
roles: ["admin" as Role],
});
assert.equal(user.can("read", "users"), true);
assert.equal(user.can("delete", "users"), true);
assert.equal(user.can("create", "posts"), false);
});
});
describe("effectivePermissions", () => {
it("returns combined direct and role-derived permissions", () => {
const user = AuthenticatedUser.create("test@example.com", {
roles: ["user" as Role],
permissions: ["posts:create" as Permission],
});
const perms = user.effectivePermissions();
assert.equal(perms.has("posts:create" as Permission), true);
assert.equal(perms.has("users:read" as Permission), true); // from user role
});
it("returns empty set for user with no roles or permissions", () => {
const user = AuthenticatedUser.create("test@example.com");
const perms = user.effectivePermissions();
assert.equal(perms.size, 0);
});
});
describe("serialization", () => {
it("toJSON returns plain object", () => {
const user = AuthenticatedUser.create("test@example.com", {
id: "test-id",
displayName: "Test",
status: "active",
roles: ["admin"],
permissions: ["posts:create"],
});
const json = user.toJSON();
assert.equal(json.id, "test-id");
assert.equal(json.email, "test@example.com");
assert.equal(json.displayName, "Test");
assert.equal(json.status, "active");
assert.deepEqual(json.roles, ["admin"]);
assert.deepEqual(json.permissions, ["posts:create"]);
});
it("toString returns readable string", () => {
const user = AuthenticatedUser.create("test@example.com", {
id: "test-id",
});
assert.equal(user.toString(), "User(id test-id)");
});
});
describe("AnonymousUser", () => {
it("isAnonymous returns true", () => {
const user = AnonymousUser.create("anon@example.com");
assert.equal(user.isAnonymous(), true);
});
it("anonymousUser singleton is anonymous", () => {
assert.equal(anonymousUser.isAnonymous(), true);
assert.equal(anonymousUser.id, "-1");
assert.equal(anonymousUser.email, "anonymous@example.com");
});
});
});


@@ -51,39 +51,15 @@ const defaultRolePermissions: RolePermissionMap = new Map([
["user", ["users:read"]], ["user", ["users:read"]],
]); ]);
export class User { export abstract class User {
private readonly data: UserData; protected readonly data: UserData;
private rolePermissions: RolePermissionMap; protected rolePermissions: RolePermissionMap;
constructor(data: UserData, rolePermissions?: RolePermissionMap) { constructor(data: UserData, rolePermissions?: RolePermissionMap) {
this.data = userDataParser.parse(data); this.data = userDataParser.parse(data);
this.rolePermissions = rolePermissions ?? defaultRolePermissions; this.rolePermissions = rolePermissions ?? defaultRolePermissions;
} }
// Factory for creating new users with sensible defaults
static create(
email: string,
options?: {
id?: string;
displayName?: string;
status?: UserStatus;
roles?: Role[];
permissions?: Permission[];
},
): User {
const now = new Date();
return new User({
id: options?.id ?? crypto.randomUUID(),
email,
displayName: options?.displayName,
status: options?.status ?? "active",
roles: options?.roles ?? [],
permissions: options?.permissions ?? [],
createdAt: now,
updatedAt: now,
});
}
// Identity // Identity
get id(): UserId { get id(): UserId {
return this.data.id as UserId; return this.data.id as UserId;
@@ -185,15 +161,72 @@ export class User {
toString(): string { toString(): string {
return `User(id ${this.id})`; return `User(id ${this.id})`;
} }
abstract isAnonymous(): boolean;
}
export class AuthenticatedUser extends User {
// Factory for creating new users with sensible defaults
static create(
email: string,
options?: {
id?: string;
displayName?: string;
status?: UserStatus;
roles?: Role[];
permissions?: Permission[];
},
): User {
const now = new Date();
return new AuthenticatedUser({
id: options?.id ?? crypto.randomUUID(),
email,
displayName: options?.displayName,
status: options?.status ?? "active",
roles: options?.roles ?? [],
permissions: options?.permissions ?? [],
createdAt: now,
updatedAt: now,
});
}
isAnonymous(): boolean {
return false;
}
} }
// For representing "no user" in contexts where user is optional // For representing "no user" in contexts where user is optional
export const AnonymousUser = Symbol("AnonymousUser"); export class AnonymousUser extends User {
// FIXME: this is C&Ped with only minimal changes. No bueno.
static create(
email: string,
options?: {
id?: string;
displayName?: string;
status?: UserStatus;
roles?: Role[];
permissions?: Permission[];
},
): AnonymousUser {
const now = new Date(0);
return new AnonymousUser({
id: options?.id ?? crypto.randomUUID(),
email,
displayName: options?.displayName,
status: options?.status ?? "active",
roles: options?.roles ?? [],
permissions: options?.permissions ?? [],
createdAt: now,
updatedAt: now,
});
}
export const anonymousUser = User.create("anonymous@example.com", { isAnonymous(): boolean {
return true;
}
}
export const anonymousUser = AnonymousUser.create("anonymous@example.com", {
id: "-1", id: "-1",
displayName: "Anonymous User", displayName: "Anonymous User",
// FIXME: set createdAt and updatedAt to start of epoch
}); });
export type MaybeUser = User | typeof AnonymousUser;


@@ -0,0 +1,61 @@
// Tests for util.ts
// Pure unit tests with filesystem
import assert from "node:assert/strict";
import { mkdir, rm, writeFile } from "node:fs/promises";
import { join } from "node:path";
import { after, before, describe, it } from "node:test";
import { loadFile } from "./util";
describe("util", () => {
const testDir = join(import.meta.dirname, ".test-util-tmp");
before(async () => {
await mkdir(testDir, { recursive: true });
});
after(async () => {
await rm(testDir, { recursive: true, force: true });
});
describe("loadFile", () => {
it("loads file contents as string", async () => {
const testFile = join(testDir, "test.txt");
await writeFile(testFile, "hello world");
const content = await loadFile(testFile);
assert.equal(content, "hello world");
});
it("handles utf-8 content", async () => {
const testFile = join(testDir, "utf8.txt");
await writeFile(testFile, "hello \u{1F511} world");
const content = await loadFile(testFile);
assert.equal(content, "hello \u{1F511} world");
});
it("handles empty file", async () => {
const testFile = join(testDir, "empty.txt");
await writeFile(testFile, "");
const content = await loadFile(testFile);
assert.equal(content, "");
});
it("handles multiline content", async () => {
const testFile = join(testDir, "multiline.txt");
await writeFile(testFile, "line1\nline2\nline3");
const content = await loadFile(testFile);
assert.equal(content, "line1\nline2\nline3");
});
it("throws for nonexistent file", async () => {
await assert.rejects(
loadFile(join(testDir, "nonexistent.txt")),
/ENOENT/,
);
});
});
});

backend/diachron/util.ts

@@ -0,0 +1,11 @@
import { readFile } from "node:fs/promises";
// FIXME: Handle the error here
const loadFile = async (path: string): Promise<string> => {
// Specifying 'utf8' returns a string; otherwise, it returns a Buffer
const data = await readFile(path, "utf8");
return data;
};
export { loadFile };

backend/generated/db.d.ts

@@ -0,0 +1,109 @@
/**
* This file was generated by kysely-codegen.
* Please do not edit it manually.
*/
import type { ColumnType } from "kysely";
export type Generated<T> =
T extends ColumnType<infer S, infer I, infer U>
? ColumnType<S, I | undefined, U>
: ColumnType<T, T | undefined, T>;
export type Timestamp = ColumnType<Date, Date | string, Date | string>;
export interface _Migrations {
applied_at: Generated<Timestamp>;
id: Generated<number>;
name: string;
}
export interface Capabilities {
description: string | null;
id: string;
name: string;
}
export interface Groups {
created_at: Generated<Timestamp>;
id: string;
name: string;
}
export interface RoleCapabilities {
capability_id: string;
granted_at: Generated<Timestamp>;
revoked_at: Timestamp | null;
role_id: string;
}
export interface Roles {
description: string | null;
id: string;
name: string;
}
export interface Sessions {
auth_method: string;
created_at: Generated<Timestamp>;
expires_at: Timestamp;
id: Generated<string>;
ip_address: string | null;
is_used: Generated<boolean | null>;
revoked_at: Timestamp | null;
token_hash: string;
token_type: string;
user_agent: string | null;
user_email_id: string | null;
user_id: string;
}
export interface UserCredentials {
created_at: Generated<Timestamp>;
credential_type: Generated<string>;
id: string;
password_hash: string | null;
updated_at: Generated<Timestamp>;
user_id: string;
}
export interface UserEmails {
created_at: Generated<Timestamp>;
email: string;
id: string;
is_primary: Generated<boolean>;
is_verified: Generated<boolean>;
normalized_email: string;
revoked_at: Timestamp | null;
user_id: string;
verified_at: Timestamp | null;
}
export interface UserGroupRoles {
granted_at: Generated<Timestamp>;
group_id: string;
revoked_at: Timestamp | null;
role_id: string;
user_id: string;
}
export interface Users {
created_at: Generated<Timestamp>;
display_name: string | null;
id: string;
status: Generated<string>;
updated_at: Generated<Timestamp>;
}
export interface DB {
_migrations: _Migrations;
capabilities: Capabilities;
groups: Groups;
role_capabilities: RoleCapabilities;
roles: Roles;
sessions: Sessions;
user_credentials: UserCredentials;
user_emails: UserEmails;
user_group_roles: UserGroupRoles;
users: Users;
}

backend/handlers.spec.ts

@@ -0,0 +1,71 @@
// Tests for handlers.ts
// These tests use mock Call objects
import assert from "node:assert/strict";
import { describe, it } from "node:test";
import type { Request as ExpressRequest } from "express";
import { Session } from "./diachron/auth/types";
import { contentTypes } from "./diachron/content-types";
import { multiHandler } from "./handlers";
import { httpCodes } from "./diachron/http-codes";
import type { Call } from "./diachron/types";
import { anonymousUser } from "./diachron/user";

// Helper to create a minimal mock Call
function createMockCall(overrides: Partial<Call> = {}): Call {
  const defaultSession = new Session(null, anonymousUser);
  return {
    pattern: "/test",
    path: "/test",
    method: "GET",
    parameters: {},
    request: {} as ExpressRequest,
    user: anonymousUser,
    session: defaultSession,
    ...overrides,
  };
}

describe("handlers", () => {
  describe("multiHandler", () => {
    it("returns OK status", async () => {
      const call = createMockCall({ method: "GET" });
      const result = await multiHandler(call);
      assert.equal(result.code, httpCodes.success.OK);
    });

    it("returns text/plain content type", async () => {
      const call = createMockCall();
      const result = await multiHandler(call);
      assert.equal(result.contentType, contentTypes.text.plain);
    });

    it("includes method in result", async () => {
      const call = createMockCall({ method: "POST" });
      const result = await multiHandler(call);
      assert.ok(result.result.includes("POST"));
    });

    it("includes a random number in result", async () => {
      const call = createMockCall();
      const result = await multiHandler(call);
      // Result format: "that was GET (0.123456789)"
      assert.match(result.result, /that was \w+ \(\d+\.?\d*\)/);
    });

    it("works with different HTTP methods", async () => {
      const methods = ["GET", "POST", "PUT", "PATCH", "DELETE"] as const;
      for (const method of methods) {
        const call = createMockCall({ method });
        const result = await multiHandler(call);
        assert.ok(result.result.includes(method));
      }
    });
  });
});

backend/handlers.ts Normal file

@@ -0,0 +1,21 @@
// This is a sample file provided by diachron. You are encouraged to modify it.
import { contentTypes } from "./diachron/content-types";
import { core } from "./diachron/core";
import { httpCodes } from "./diachron/http-codes";
import type { Call, Handler, Result } from "./diachron/types";

const multiHandler: Handler = async (call: Call): Promise<Result> => {
  const code = httpCodes.success.OK;
  const rn = core.random.randomNumber();
  const retval: Result = {
    code,
    result: `that was ${call.method} (${rn})`,
    contentType: contentTypes.text.plain,
  };
  return retval;
};

export { multiHandler };


@@ -0,0 +1 @@
CREATE TABLE test_application_table ();


@@ -0,0 +1 @@
CREATE TABLE test_application_table ();

backend/package.json Normal file

@@ -0,0 +1,21 @@
{
  "name": "my-app",
  "version": "0.0.1",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "DB_PORT=5433 DB_USER=diachron_test DB_PASSWORD=diachron_test DB_NAME=diachron_test tsx --test '**/*.{test,spec}.ts'",
    "test:watch": "DB_PORT=5433 DB_USER=diachron_test DB_PASSWORD=diachron_test DB_NAME=diachron_test tsx --test --watch '**/*.{test,spec}.ts'",
    "nodemon": "nodemon dist/index.js",
    "kysely-codegen": "kysely-codegen"
  },
  "keywords": [],
  "author": "",
  "license": "ISC",
  "packageManager": "pnpm@10.12.4",
  "dependencies": {
    "diachron": "workspace:*"
  },
  "devDependencies": {}
}


@@ -0,0 +1,2 @@
packages:
- 'diachron'


@@ -1,13 +1,16 @@
+// This is a sample file provided by diachron. You are encouraged to modify it.
 /// <reference lib="dom" />
 import nunjucks from "nunjucks";
 import { DateTime } from "ts-luxon";
-import { authRoutes } from "./auth/routes";
-import { contentTypes } from "./content-types";
+import { authRoutes } from "./diachron/auth/routes";
+import { routes as basicRoutes } from "./diachron/basic/routes";
+import { contentTypes } from "./diachron/content-types";
+import { core } from "./diachron/core";
 import { multiHandler } from "./handlers";
-import { HttpCode, httpCodes } from "./http-codes";
-import { services } from "./services";
-import { type Call, ProcessedRoute, type Result, type Route } from "./types";
+import { httpCodes } from "./diachron/http-codes";
+import type { Call, Result, Route } from "./diachron/types";
 // FIXME: Obviously put this somewhere else
 const okText = (result: string): Result => {
@@ -24,13 +27,20 @@ const okText = (result: string): Result => {
 const routes: Route[] = [
   ...authRoutes,
+  basicRoutes.home,
+  basicRoutes.hello,
+  { ...basicRoutes.hello,
+    path: "/yo-whats-up"
+  },
+  basicRoutes.login,
+  basicRoutes.logout,
   {
     path: "/slow",
     methods: ["GET"],
     handler: async (_call: Call): Promise<Result> => {
       console.log("starting slow request");
-      await services.misc.sleep(2);
+      await core.misc.sleep(5000);
       console.log("finishing slow request");
       const retval = okText("that was slow");
@@ -41,7 +51,7 @@ const routes: Route[] = [
   {
     path: "/list",
     methods: ["GET"],
-    handler: async (call: Call): Promise<Result> => {
+    handler: async (_call: Call): Promise<Result> => {
       const code = httpCodes.success.OK;
       const lr = (rr: Route[]) => {
         const ret = rr.map((r: Route) => {
@@ -51,19 +61,34 @@ const routes: Route[] = [
         return ret;
       };
-      const listing = lr(routes).join(", ");
+      const rrr = lr(routes);
+      const template = `
+        <html>
+        <head></head>
+        <body>
+          <ul>
+          {% for route in rrr %}
+            <li><a href="{{ route }}">{{ route }}</a></li>
+          {% endfor %}
+          </ul>
+        </body>
+        </html>
+      `;
+      const result = nunjucks.renderString(template, { rrr });
       return {
         code,
-        result: listing + "\n",
-        contentType: contentTypes.text.plain,
+        result,
+        contentType: contentTypes.text.html,
       };
     },
   },
   {
     path: "/whoami",
     methods: ["GET"],
-    handler: async (_call: Call): Promise<Result> => {
-      const me = services.session.getUser();
+    handler: async (call: Call): Promise<Result> => {
+      const me = call.session.getUser();
       const template = `
       <html>
       <head></head>

backend/run.sh Executable file

@@ -0,0 +1,9 @@
#!/bin/bash
set -eu
DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
cd "$DIR"
exec ../cmd node --enable-source-maps dist/index.js "$@"

backend/services.ts Normal file

@@ -0,0 +1,15 @@
// This is a sample file provided by diachron. You are encouraged to modify it.
// Application services go here. A service encapsulates a capability that
// handlers depend on: database queries, external API calls, business logic
// that doesn't belong in a handler.
//
// The framework provides core services via `core` (from ./diachron/core):
// core.database, core.logging, core.misc, etc. This file is for your
// application's own services.
import { core } from "./diachron/core";
const db = core.database.db;
export { db };


@@ -6,7 +6,7 @@ DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 check_dir="$DIR"
-source "$check_dir"/../framework/shims/common
-source "$check_dir"/../framework/shims/node.common
+source "$check_dir"/../diachron/shims/common
+source "$check_dir"/../diachron/shims/node.common
 $ROOT/cmd pnpm tsc --showConfig


@@ -9,5 +9,6 @@
   "strict": true,
   "types": ["node"],
   "outDir": "out"
-  }
+  },
+  "exclude": ["**/*.spec.ts", "**/*.test.ts", "check-deps.ts"]
 }

backend/types.ts Normal file

@@ -0,0 +1,8 @@
// This is a sample file provided by diachron. You are encouraged to modify it.
// Application-specific types go here. Framework types (Call, Result, Route,
// Handler, etc.) are defined in ./diachron/types and should be imported from
// there.
//
// This file is for your domain types: the nouns and shapes that are specific
// to your application.


@@ -6,8 +6,8 @@ DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 check_dir="$DIR"
-source "$check_dir"/../framework/shims/common
-source "$check_dir"/../framework/shims/node.common
+source "$check_dir"/../diachron/shims/common
+source "$check_dir"/../diachron/shims/node.common
 # $ROOT/cmd pnpm tsc --lib ES2023 --esModuleInterop -w $check_dir/app.ts
 # $ROOT/cmd pnpm tsc -w $check_dir/app.ts

bootstrap.sh Executable file

@@ -0,0 +1,49 @@
#!/bin/bash
# shellcheck disable=SC2002
set -eu
set -o pipefail
IFS=$'\n\t'
# print useful message on failure
trap 's=$?; echo >&2 "$0: Error on line "$LINENO": $BASH_COMMAND"; exit $s' ERR
# shellcheck disable=SC2034
DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
# cd "$DIR"
here="$PWD"
"$DIR/update-cached-repository.sh"
# repository="${2:-https://gitea.philologue.net/philologue/diachron}"
repository="${2:-$HOME/.cache/diachron/v1/repositories/diachron.git}"
ref="${1:-hydrators-kysely}"
echo will bootstrap ref "$ref" of repo "$repository"
into=$(mktemp -d)
cd "$into"
echo "I am in $(pwd)"
echo I will clone repository "$repository", ref "$ref"
git clone "$repository"
r=$(ls -1)
cd "$r"
echo "I am in $(pwd)"
git checkout "$ref"
ls
echo "working dir: $PWD"
# ls backend
# exit 0
tar cvf - $(cat "$PWD/file-list" | grep -v '^#' | sed 's/^?//') | (cd "$here" && tar xf -)
echo "$ref" > "$here/.diachron-version"
echo "Now, run the command ./sync.sh"
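The `?` prefix that `sed 's/^?//'` strips above can be sketched as a small copy filter, following the semantics the commit message describes (sample files are copied only when absent; framework files are always replaced). `should_copy` is a hypothetical helper for illustration, not code from the actual bootstrap.sh or upgrade.sh:

```shell
#!/bin/bash
# Hypothetical sketch (not the real bootstrap.sh/upgrade.sh code) of the
# ?-prefix rule: sample entries are copied only when the target file does
# not exist yet; framework-owned entries are always copied.
should_copy() {
  local entry="$1" target_dir="$2"
  case "$entry" in
    \?*)
      # Sample file: keep the user's copy if one already exists.
      [ ! -e "$target_dir/${entry#\?}" ]
      ;;
    *)
      # Framework-owned file: always replaced.
      true
      ;;
  esac
}
```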


@@ -10,7 +10,7 @@ cd "$DIR"
 #
 exclusions="SC2002"
-source "$DIR/framework/versions"
+source "$DIR/diachron/versions"
 if [[ $# -ne 0 ]]; then
   shellcheck --exclude="$exclusions" "$@"
@@ -20,10 +20,10 @@ fi
 shell_scripts="$(fd .sh | xargs)"
 # The files we need to check all either end in .sh or else they're the files
-# in framework/cmd.d and framework/shims. -x instructs shellcheck to also
+# in diachron/cmd.d and diachron/shims. -x instructs shellcheck to also
 # check `source`d files.
-shellcheck -x --exclude="$exclusions" "$DIR/cmd" "$DIR"/framework/cmd.d/* "$DIR"/framework/shims/* "$shell_scripts"
+shellcheck -x --exclude="$exclusions" "$DIR/cmd" "$DIR"/diachron/cmd.d/* "$DIR"/diachron/shims/* "$shell_scripts"
 pushd "$DIR/master"
 docker run --rm -v $(pwd):/app -w /app golangci/golangci-lint:$golangci_lint golangci-lint run

cmd

@@ -2,20 +2,26 @@
 # This file belongs to the framework. You are not expected to modify it.
-# FIXME: Obviously this file isn't nearly robust enough. Make it so.
+# Managed binary runner - runs framework-managed binaries like node, pnpm, tsx
+# Usage: ./cmd <command> [args...]
 set -eu
 DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
+if [ $# -lt 1 ]; then
+  echo "Usage: ./cmd <command> [args...]"
+  echo ""
+  echo "Available commands:"
+  for cmd in "$DIR"/diachron/cmd.d/*; do
+    if [ -x "$cmd" ]; then
+      basename "$cmd"
+    fi
+  done
+  exit 1
+fi
 subcmd="$1"
-# echo "$subcmd"
-#exit 3
 shift
-echo will run "$DIR"/framework/cmd.d/"$subcmd" "$@"
-exec "$DIR"/framework/cmd.d/"$subcmd" "$@"
+exec "$DIR"/diachron/cmd.d/"$subcmd" "$@"

develop Executable file

@@ -0,0 +1,27 @@
#!/bin/bash
# This file belongs to the framework. You are not expected to modify it.
# Development command runner - parallel to ./mgmt for development tasks
# Usage: ./develop <command> [args...]
set -eu
DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

if [ $# -lt 1 ]; then
  echo "Usage: ./develop <command> [args...]"
  echo ""
  echo "Available commands:"
  for cmd in "$DIR"/diachron/develop.d/*; do
    if [ -x "$cmd" ]; then
      basename "$cmd"
    fi
  done
  exit 1
fi

subcmd="$1"
shift
exec "$DIR"/diachron/develop.d/"$subcmd" "$@"

diachron/.nodejs/.gitignore vendored Normal file

diachron/AGENTS.md Normal file

@@ -0,0 +1,214 @@
# Working with diachron (Agent Guide)
This document helps AI coding agents work effectively with projects built on
the diachron framework. It covers the conventions, commands, and structures
you need to know.
## Quick orientation
diachron is a TypeScript/Express web framework with a Go master process.
Your application code lives in `backend/`. The framework owns `master/`,
`logger/`, `diachron/`, and `backend/diachron/`.
**Do not modify framework-owned files** unless explicitly asked to work on
the framework itself.
## Running the application
```bash
./sync.sh # Install dependencies (run once, or after pulling changes)
./master # Start the application (watches files, rebuilds, proxies)
```
By default, the app listens on port 8080 (proxy) and workers run on
ports starting at 3000.
## Commands you'll need
All tools (node, pnpm, tsx, etc.) are managed by the framework. Do not
invoke system-installed versions.
```bash
./cmd pnpm install # Install npm packages
./cmd pnpm test # Run tests
./cmd pnpm biome check . # Lint (run from backend/)
./develop db # Open database shell
./develop reset-db # Drop and recreate database
./develop migrate # Run migrations (development)
./mgmt migrate # Run migrations (production-safe)
```
### Formatting and linting
```bash
cd backend && ../cmd pnpm biome check --write .
```
### Building Go code
```bash
cd master && go build
cd logger && go build
```
### Quality checks
```bash
./check.sh # shellcheck + golangci-lint
```
## Application structure
### Where to put code
| What | Where |
|--------------------------|-----------------------------|
| Application entry point | `backend/app.ts` |
| Route definitions | `backend/routes.ts` |
| Route handlers | `backend/handlers.ts` |
| Service layer | `backend/services.ts` |
| Application types | `backend/types.ts` |
| Application config | `backend/config.ts` |
| Database migrations | `backend/migrations/` |
| Framework library code | `backend/diachron/` |
### Types and naming
- HTTP request wrapper: `Call` (not Request)
- HTTP response wrapper: `Result` (not Response)
- Route definitions: arrays of `Route` objects
- Handlers: functions that take a `Call` and return a `Result`
These names are intentional. Use them consistently.
Import framework types from `./diachron/types`:
```typescript
import type { Call, Result, Route, Handler } from "./diachron/types";
```
Application-specific domain types go in `backend/types.ts`.
### Services
Application services go in `backend/services.ts`. Framework services are
accessed through the `core` object:
```typescript
import { core } from "./diachron/core";
core.database.db // Kysely database instance
core.logging.log // Logging
core.misc.sleep // Utilities
```
### Exports
When a TypeScript file exports symbols, they should be listed in
alphabetical order.
## Database
diachron uses PostgreSQL exclusively, accessed through Kysely (type-safe
query builder). There is no ORM.
- Write SQL via Kysely, not raw query strings (unless Kysely can't express
the query)
- Migrations live in `backend/migrations/`
- Run `./develop codegen` after schema changes to regenerate Kysely types
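The migration shown elsewhere in this diff is a bare SQL statement, so a migration appears to be a plain `.sql` file. As a sketch (the filename convention and the `words` table are illustrative assumptions, not framework requirements):

```sql
-- Hypothetical migration file in backend/migrations/
-- After applying it with ./develop migrate, run ./develop codegen so the
-- generated Kysely types pick up the new table.
CREATE TABLE words (
    id integer PRIMARY KEY GENERATED ALWAYS AS IDENTITY,
    word text NOT NULL UNIQUE
);
```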
## Key conventions
### No dev/prod distinction
There is no `NODE_ENV`. The application behaves identically everywhere.
Do not introduce environment-based branching.
### Managed tooling
Never reference globally installed `node`, `npm`, `npx`, or `pnpm`.
Always use `./cmd node`, `./cmd pnpm`, etc.
### File ownership boundary
```
You may edit: backend/* (except backend/diachron/)
Do not edit: master/*, logger/*, diachron/*, backend/diachron/*
```
If a task requires framework changes, confirm with the user first.
When framework files are modified, the changes can be extracted as a
diff against upstream with `./diff-upstream.sh` (or `--stat` to list
changed files only).
When committing framework changes, keep them in separate commits from
application code. Each framework commit should have a clear message
explaining what was changed and why. This makes it much easier to
upstream the changes later.
### Command safety tiers
- `./cmd` -- Tool wrappers, always safe
- `./mgmt` -- Production-safe operations (migrations, user management)
- `./develop` -- Destructive operations, development only
Never use `./develop` commands against production data.
## Common tasks
### Add a new route
1. Define the route in `backend/routes.ts` as a `Route` object
2. Implement the handler in `backend/handlers.ts`
3. Add any needed types to `backend/types.ts`
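The steps above can be sketched as follows. The `Call`, `Result`, `Route`, and `Handler` shapes are redeclared locally (mirroring the fields visible in the sample `routes.ts` and `handlers.ts`) purely so the sketch is self-contained; in a real project, import them from `./diachron/types` instead:

```typescript
// Minimal local stand-ins for the framework types. These mirror the
// fields used by the samples; the real definitions live in ./diachron/types.
type Call = { path: string; method: string; parameters: Record<string, string> };
type Result = { code: number; result: string; contentType: string };
type Handler = (call: Call) => Promise<Result>;
type Route = { path: string; methods: string[]; handler: Handler };

// Step 2: the handler takes a Call and returns a Result (cf. backend/handlers.ts).
const greetHandler: Handler = async (call) => ({
  code: 200, // in real code: httpCodes.success.OK
  result: `hello, ${call.parameters.name ?? "world"}\n`,
  contentType: "text/plain", // in real code: contentTypes.text.plain
});

// Step 1: the route ties a path pattern and methods to the handler
// (cf. the Route objects in backend/routes.ts).
const greetRoute: Route = {
  path: "/greet/:name",
  methods: ["GET"],
  handler: greetHandler,
};
```

The `/greet/:name` path and `greetHandler` are hypothetical; URL parameters reach the handler via `call.parameters`, as described in the commit that threads path-to-regexp matches into `Call.parameters`.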
### Add a database migration
1. Create a migration file in `backend/migrations/`
2. Run `./develop migrate` to apply it
3. Run `./develop codegen` to regenerate Kysely types
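Assuming the migration layout visible in this diff (a plain SQL file per migration), the whole cycle looks roughly like this; the filename is a hypothetical example:

```sql
-- backend/migrations/<name>.sql (hypothetical filename)
-- Step 1: create the migration file with the schema change.
CREATE TABLE example_table (
    id integer PRIMARY KEY GENERATED ALWAYS AS IDENTITY
);
-- Steps 2 and 3: ./develop migrate, then ./develop codegen.
```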
### Install a package
```bash
cd backend && ../cmd pnpm add <package>
```
### Run a one-off TypeScript script
```bash
./develop tsx backend/path/to/script.ts
```
## file-list
The root `file-list` file is a manifest of all files that ship with the
framework. When you create or delete a file that is part of the project
(not a scratch file or generated output), you must update `file-list` to
match. Keep it sorted alphabetically.
Entries can have a `?` prefix (e.g. `?backend/app.ts`). These are
**sample files** -- starter code that `bootstrap.sh` copies into a new
project but that `upgrade.sh` will not overwrite. Once the user has the
file, it belongs to them. On upgrade, new sample files that don't exist
yet in the project are copied in; existing ones are left untouched.
Unprefixed entries are **framework-owned** and are always replaced on
upgrade. When adding a new file to `file-list`, decide which category
it belongs to:
- Framework-owned (no prefix): infrastructure scripts, framework
library code, build tooling, config that must stay in sync.
- Sample (`?` prefix): application starter code the user is expected
to edit (routes, handlers, services, types, package.json, etc.).
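Putting the two categories together, a `file-list` excerpt might look like this. The entries are taken from files shown in this diff; the comment lines are supported because `bootstrap.sh` filters them with `grep -v '^#'`, though whether `?`-prefixed entries sort before or among unprefixed ones is an assumption:

```
# sample files: copied on bootstrap, preserved on upgrade
?backend/handlers.ts
?backend/package.json
?backend/services.ts
?backend/types.ts
# framework-owned: always replaced on upgrade
bootstrap.sh
cmd
develop
diachron/AGENTS.md
```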
## Things to avoid
- Do not introduce `.env` files or `dotenv` without checking with the
team first. The configuration story is still being decided.
- Do not introduce webpack, vite, or other bundlers. The master process
handles building via `@vercel/ncc`.
- Do not add express middleware directly. Use the framework's route
processing in `backend/diachron/routing.ts`.
- Do not use `npm` or globally installed `pnpm`. Use `./cmd pnpm`.
- Do not add `NODE_ENV` checks or development/production branches.

Some files were not shown because too many files have changed in this diff.