108 Commits

Author SHA1 Message Date
cd19a32be5 Add more todo items 2026-01-25 12:12:15 -06:00
478305bc4f Update /home template 2026-01-25 12:12:02 -06:00
421628d49e Add various doc updates
They are still very far from complete.
2026-01-25 12:11:34 -06:00
4f37a72d7b Clean commands up 2026-01-24 16:54:54 -06:00
e30bf5d96d Fix regexp in fixup.sh 2026-01-24 16:39:13 -06:00
8704c4a8d5 Separate framework and app migrations
Also add a new develop command: clear-db.
2026-01-24 16:38:33 -06:00
579a19669e Match user and session schema changes 2026-01-24 15:48:22 -06:00
474420ac1e Add development command to reset the database and rerun migrations 2026-01-24 15:13:34 -06:00
960f78a1ad Update initial tables 2026-01-24 15:13:30 -06:00
d921679058 Rework user types: create AuthenticatedUser and AnonymousUser class
Both are subclasses of an abstract User class which contains almost everything
interesting.
2026-01-17 17:45:36 -06:00
350bf7c865 Run shell scripts through shfmt 2026-01-17 16:30:55 -06:00
8a7682e953 Split services into core and request 2026-01-17 16:20:55 -06:00
e59bb35ac9 Update todo list 2026-01-17 16:10:38 -06:00
a345a2adfb Add directive 2026-01-17 16:10:24 -06:00
00d84d6686 Note that files belong to framework 2026-01-17 15:45:02 -06:00
7ed05695b9 Separate happy path utility functions for requests 2026-01-17 15:43:52 -06:00
03cc4cf4eb Remove prettier; we've been using biome for a while 2026-01-17 13:19:40 -06:00
2121a6b5de Merge remote-tracking branch 'crondiad/experiments' into experiments 2026-01-11 16:08:03 -06:00
Michael Wolf
6ace2163ed Update pnpm version 2026-01-11 16:07:32 -06:00
93ab4b5d53 Update node version 2026-01-11 16:07:24 -06:00
70ddcb2a94 Note that we need bash 2026-01-11 16:06:48 -06:00
1da81089cd Add sync.sh script
This downloads and installs dependencies necessary to run or develop.

Add docker-compose.yml for initial use
2026-01-11 16:06:43 -06:00
f383c6a465 Add logger wrapper script 2026-01-11 15:48:32 -06:00
e34d47b352 Add various todo items 2026-01-11 15:36:15 -06:00
de70be996e Add docker-compose.yml for initial use 2026-01-11 15:33:01 -06:00
096a1235b5 Add basic logout 2026-01-11 15:31:59 -06:00
4a4dc11aa4 Fix formatting 2026-01-11 15:17:58 -06:00
7399cbe785 Add / template 2026-01-11 14:57:51 -06:00
14d20be9a2 Note that file belongs to the framework 2026-01-11 14:57:26 -06:00
55f5cc699d Add request-scoped context for session.getUser()
Use AsyncLocalStorage to provide request context so services can access
the current user without needing Call passed through every function.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-11 14:56:10 -06:00
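The AsyncLocalStorage pattern this commit describes can be sketched roughly as follows. This is illustrative only: names like `RequestContext` and `withRequestContext` are assumptions, not the repo's actual API, and the real `Call`/session types live in the framework.

```typescript
import { AsyncLocalStorage } from "node:async_hooks";

// Hypothetical shape of a per-request context (not the repo's real type).
interface RequestContext {
  userId: string | null;
}

const requestContext = new AsyncLocalStorage<RequestContext>();

// Run `fn` with a per-request context; anything on the same async call chain
// can read it without the context being threaded through every function.
function withRequestContext<T>(ctx: RequestContext, fn: () => T): T {
  return requestContext.run(ctx, fn);
}

// Deep inside a service, no Call parameter needed:
function currentUserId(): string | null {
  return requestContext.getStore()?.userId ?? null;
}
```

Outside of any `withRequestContext` call, `getStore()` returns `undefined`, so services degrade gracefully when no request is in flight.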
afcb447b2b Add a command to add a new user 2026-01-11 14:38:19 -06:00
1c1eeddcbe Add basic login screen with form-based authentication
Adds /login route with HTML template that handles GET (show form) and
POST (authenticate). On successful login, sets session cookie and
redirects to /. Also adds framework support for redirects and cookies
in route handlers.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-11 10:07:02 -06:00
7cecf5326d Make biome happier 2026-01-10 14:02:38 -06:00
47f6bee75f Improve test command to find spec/test files recursively
Use globstar for recursive matching and support both *.spec.ts
and *.test.ts patterns in any subdirectory.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-10 13:55:42 -06:00
6e96c33457 Add very basic support for finding and rendering templates 2026-01-10 13:50:44 -06:00
9e3329fa58 . 2026-01-10 13:38:42 -06:00
05eaf938fa Add test command
For now this just runs typescript tests.  Eventually it'll do more than that.
2026-01-10 13:38:10 -06:00
df2d4eea3f Add initial way to get info about execution context 2026-01-10 13:37:39 -06:00
b235a6be9a Add block for declared var 2026-01-10 13:05:39 -06:00
8cd4b42cc6 Add scripts to run migrations and to connect to the db 2026-01-10 09:05:05 -06:00
241d3e799e Use less ambiguous function 2026-01-10 08:55:00 -06:00
49dc0e3fe0 Mark several unused vars as such 2026-01-10 08:54:51 -06:00
c7b8cd33da Clean up imports 2026-01-10 08:54:34 -06:00
6c0895de07 Fix formatting 2026-01-10 08:51:20 -06:00
17ea6ba02d Consider block stmts without braces to be errors 2026-01-09 11:44:09 -06:00
661def8a5c Refmt 2026-01-04 15:24:29 -06:00
74d75d08dd Add Session class to provide getUser() on call.session
Wraps SessionData and user into a Session class that handlers can use
via call.session.getUser() instead of accessing services directly.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-04 15:22:27 -06:00
ad6d405206 Add session data to Call type
- AuthService.validateRequest now returns AuthResult with both user and session
- Call type includes session: SessionData | null
- Handlers can access session metadata (createdAt, authMethod, etc.)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-04 09:50:05 -06:00
e9ccf6d757 Add PostgreSQL database layer with Kysely and migrations
- Add database.ts with connection pool, Kysely query builder, and migration runner
- Create migrations for users and sessions tables (0001, 0002)
- Implement PostgresAuthStore to replace InMemoryAuthStore
- Wire up database service in services/index.ts

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-04 09:43:20 -06:00
34ec5be7ec Pull in kysely and pg deps 2026-01-03 17:20:49 -06:00
e136c07928 Add some stub user stuff 2026-01-03 17:06:54 -06:00
c926f15aab Fix circular dependency breaking ncc bundle
Don't export authRoutes from barrel file to break the cycle:
services.ts → auth/index.ts → auth/routes.ts → services.ts

Import authRoutes directly from ./auth/routes instead.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-03 14:24:53 -06:00
39cd93c81e Move services.ts 2026-01-03 14:12:27 -06:00
c246e0384f Add authentication system with session-based auth
Implements full auth flows with opaque tokens (not JWT) for easy revocation:
- Login/logout with cookie or bearer token support
- Registration with email verification
- Password reset with one-time tokens
- scrypt password hashing (no external deps)

New files in express/auth/:
- token.ts: 256-bit token generation, SHA-256 hashing
- password.ts: scrypt hashing with timing-safe verification
- types.ts: Session schemas, token types, input validation
- store.ts: AuthStore interface + InMemoryAuthStore
- service.ts: AuthService with all auth operations
- routes.ts: 6 auth endpoints

Modified:
- types.ts: Added user field to Call, requireAuth/requirePermission helpers
- app.ts: JSON body parsing, populates call.user, handles auth errors
- services.ts: Added services.auth
- routes.ts: Includes auth routes

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-03 13:59:02 -06:00
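A minimal sketch of the opaque-token and scrypt approach this commit describes, assuming Node's built-in `crypto` module. Function names and storage formats here are illustrative, not the actual contents of `express/auth/token.ts` or `password.ts`.

```typescript
import { createHash, randomBytes, scryptSync, timingSafeEqual } from "node:crypto";

// 256-bit opaque token; easy to revoke because the server stores its hash.
function generateToken(): string {
  return randomBytes(32).toString("base64url");
}

// Store only the SHA-256 of the token, never the token itself.
function hashToken(token: string): string {
  return createHash("sha256").update(token).digest("hex");
}

// scrypt hashing with a per-password random salt (no external deps).
function hashPassword(password: string): string {
  const salt = randomBytes(16).toString("hex");
  const hash = scryptSync(password, salt, 64).toString("hex");
  return `${salt}:${hash}`;
}

// Timing-safe verification: compare fixed-length buffers, not strings.
function verifyPassword(password: string, stored: string): boolean {
  const [salt, hash] = stored.split(":");
  const candidate = scryptSync(password, salt, 64);
  return timingSafeEqual(candidate, Buffer.from(hash, "hex"));
}
```

The design tradeoff versus JWT: the store must be consulted on every request, but deleting the token hash row revokes the session immediately.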
788ea2ab19 Add User class with role and permission-based authorization
Foundation for authentication/authorization with:
- Stable UUID id for database keys, email as human identifier
- Account status (active/suspended/pending)
- Role-based auth with role-to-permission mappings
- Direct permissions in resource:action format
- Methods: hasRole(), hasPermission(), can(), effectivePermissions()

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-03 12:59:47 -06:00
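The role-plus-direct-permissions model this commit describes might look like the sketch below. The role table and class shape are assumptions for illustration; the real `User` class also carries id, email, account status, and `effectivePermissions()`.

```typescript
// Permissions are "resource:action" strings.
type Permission = `${string}:${string}`;

// Hypothetical role-to-permission mapping (not the repo's actual data).
const rolePermissions: Record<string, Permission[]> = {
  admin: ["users:write", "users:read"],
  viewer: ["users:read"],
};

class User {
  constructor(
    readonly roles: string[],
    readonly directPermissions: Permission[] = [],
  ) {}

  hasRole(role: string): boolean {
    return this.roles.includes(role);
  }

  // A user can do something if it's granted directly or via any role.
  can(permission: Permission): boolean {
    if (this.directPermissions.includes(permission)) return true;
    return this.roles.some((r) => (rolePermissions[r] ?? []).includes(permission));
  }
}
```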
6297a95d3c Reformat more files 2026-01-01 21:20:45 -06:00
63cf0a670d Update todo list 2026-01-01 21:20:38 -06:00
5524eaf18f ? 2026-01-01 21:12:55 -06:00
03980e114b Add basic template rendering route 2026-01-01 21:12:38 -06:00
539717efda Add todo item 2026-01-01 21:11:28 -06:00
8be88bb696 Move TODOs re logging to the end 2026-01-01 21:11:10 -06:00
ab74695f4c Have master start and manage the logger process
Master now:
- Starts logger on startup with configurable port and capacity
- Restarts logger automatically if it crashes
- Stops logger gracefully on shutdown

New flags:
- --logger-port (default 8085)
- --logger-capacity (default 1000000)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-01 20:53:29 -06:00
dc5a70ba33 Add logging service
New Go program (logger/) that:
- Accepts POSTed JSON log messages via POST /log
- Stores last N messages in a ring buffer (default 1M)
- Retrieves logs via GET /logs with limit/before/after filters
- Shows status via GET /status

Also updates express/logging.ts to POST messages to the logger service.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-01 20:45:34 -06:00
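The "last N messages" retention the Go logger implements is a ring buffer. A TypeScript sketch of the same semantics (illustrative only; the actual service is the Go program in logger/):

```typescript
// Keep the most recent `capacity` items, silently dropping the oldest.
class RingBuffer<T> {
  private items: T[] = [];
  constructor(private capacity: number) {}

  push(item: T): void {
    this.items.push(item);
    if (this.items.length > this.capacity) this.items.shift();
  }

  // Most recent first, up to `limit` items (akin to GET /logs?limit=...).
  recent(limit: number): T[] {
    return this.items.slice(-limit).reverse();
  }
}
```

With the default capacity of one million, memory use is bounded regardless of how chatty the workers get.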
4adf6cf358 Add another TODO item 2026-01-01 20:37:16 -06:00
bee6938a67 Add some logging related stubs to express backend 2026-01-01 20:18:37 -06:00
b0ee53f7d5 Listen by default on port 3500
The master process will continue to start at port 3000.  In practice, this
ought to make conflicts between master-supervised processes and ones run by
hand less of an issue.
2026-01-01 20:17:26 -06:00
5c93c9e982 Add TODO items 2026-01-01 20:17:15 -06:00
5606a59614 Note that you need docker as well as docker compose 2026-01-01 17:36:22 -06:00
22dde8c213 Add wrapper script for master program 2026-01-01 17:35:56 -06:00
30463b60a5 Use CLI flags instead of environment variables for master config
Replace env var parsing with Go's flag package:
- --watch (default: ../express)
- --workers (default: 1)
- --base-port (default: 3000)
- --port (default: 8080)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-01 17:31:08 -06:00
e2ea472a10 Make biome happier 2026-01-01 17:22:04 -06:00
20e5da0d54 Teach fixup.sh to use biome 2026-01-01 17:16:46 -06:00
13d02d86be Pull in and set up biome 2026-01-01 17:16:02 -06:00
d35e7bace2 Make shfmt happier 2026-01-01 16:53:19 -06:00
cb4a730838 Make go fmt happier 2026-01-01 16:53:00 -06:00
58f88e3695 Add check.sh and fixup.sh scripts 2026-01-01 16:47:50 -06:00
7b8eaac637 Add TODO.md and instructions 2026-01-01 15:45:43 -06:00
f504576f3e Add first cut at a pool 2026-01-01 15:43:49 -06:00
8722062f4a Change process names again 2026-01-01 15:12:01 -06:00
9cc1991d07 Name backend process 2026-01-01 14:54:17 -06:00
5d5a2430ad Fix arg in build script 2026-01-01 14:37:11 -06:00
a840137f83 Mark build.sh as executable 2026-01-01 14:34:31 -06:00
c330da49fc Add rudimentary command line parsing to express app 2026-01-01 14:34:16 -06:00
db81129724 Add build.sh 2026-01-01 14:17:09 -06:00
43ff2edad2 Pull in nunjucks 2026-01-01 14:14:03 -06:00
ad95f652b8 Fix bogus path expansion 2026-01-01 14:10:57 -06:00
51d24209b0 Use build.sh script 2026-01-01 14:09:51 -06:00
1083655a3b Add and use a simpler run script 2026-01-01 14:08:46 -06:00
615cd89656 Ignore more node_modules directories 2026-01-01 13:24:50 -06:00
321b2abd23 Sort of run node app 2026-01-01 13:24:36 -06:00
642c7d9434 Update CLAUDE.md 2026-01-01 13:06:21 -06:00
8e5b46d426 Add first cut at a CLAUDE.md file 2026-01-01 12:31:35 -06:00
a178536472 Rename monitor to master 2026-01-01 12:30:58 -06:00
3bece46638 Add first cut at golang monitor program 2026-01-01 12:26:54 -06:00
4257a9b615 Improve wording in a few places 2025-12-05 19:47:58 -06:00
b0eaf6b136 Add stub nomenclature doc 2025-11-17 19:54:53 -06:00
8ca89b75cd Rewrite README.md 2025-11-17 19:54:35 -06:00
a797cae0e6 Add placeholder files for docs to come 2025-11-17 19:54:10 -06:00
666f1447f4 Add first cut at docs to create a new project 2025-11-17 19:53:57 -06:00
bd3779acef Fill out http codes object 2025-11-17 18:06:59 -06:00
c21638c5d5 Fill out content types object 2025-11-17 18:06:42 -06:00
d125f61c4c Stake out dir 2025-11-17 11:38:36 -06:00
2641a8d29d Update shell code
Now it (mostly) passes shellcheck and is formatted with shfmt.
2025-11-17 11:38:04 -06:00
1a13fd0909 Add a first cut at an express-based backend 2025-11-17 10:58:54 -06:00
c346a70cce Drop deno dir 2025-11-16 15:56:27 -06:00
96d861f043 Use variable names less likely to be shadowed 2025-11-16 12:25:57 -06:00
0201d08009 Add ts-node and tsx command wrappers 2025-11-16 11:13:21 -06:00
d4d5a72b3e Ignore addl dir 2025-11-08 09:58:45 -06:00
133 changed files with 7242 additions and 719 deletions

.beads/issues.jsonl (new file, +4)

@@ -0,0 +1,4 @@
{"id":"diachron-2vh","title":"Add unit testing to golang programs","status":"open","priority":2,"issue_type":"task","created_at":"2026-01-03T12:31:41.281891462-06:00","created_by":"mw","updated_at":"2026-01-03T12:31:41.281891462-06:00"}
{"id":"diachron-64w","title":"Add unit testing to express backend","status":"open","priority":2,"issue_type":"task","created_at":"2026-01-03T12:31:30.439206099-06:00","created_by":"mw","updated_at":"2026-01-03T12:31:30.439206099-06:00"}
{"id":"diachron-fzd","title":"Add generic 'user' functionality","status":"open","priority":2,"issue_type":"task","created_at":"2026-01-03T12:35:53.73213604-06:00","created_by":"mw","updated_at":"2026-01-03T12:35:53.73213604-06:00"}
{"id":"diachron-ngx","title":"Teach the master and/or build process to send messages with notify-send when builds fail or succeed. Ideally this will be fairly generic.","status":"open","priority":2,"issue_type":"task","created_at":"2026-01-03T14:10:11.773218844-06:00","created_by":"mw","updated_at":"2026-01-03T14:10:11.773218844-06:00"}

.claude/instructions.md (new file, +2)

@@ -0,0 +1,2 @@
When asked "what's next?" or during downtime, check TODO.md and suggest items to work on.

.gitignore (vendored, 2 lines changed)

@@ -1,4 +1,4 @@
framework/node/node_modules
**/node_modules
framework/downloads
framework/binaries
framework/.nodejs

CLAUDE.md (new file, +122)

@@ -0,0 +1,122 @@
# CLAUDE.md
This file provides guidance to Claude Code (claude.ai/code) when working with
code in this repository.
## Project Overview
Diachron is an opinionated TypeScript/Node.js web framework with a Go-based
master process. Key design principles:
- No development/production distinction - single mode of operation everywhere
- Everything loggable and inspectable for debuggability
- Minimal magic, explicit behavior
- PostgreSQL-only (no database abstraction)
- Inspired by "Taking PHP Seriously" essay
## Commands
### General
**Install dependencies:**
```bash
./sync.sh
```
**Run an app:**
```bash
./master
```
### Development
**Check shell scripts (shellcheck + shfmt; eventually also go fmt and biome or similar):**
```bash
./check.sh
```
**Format TypeScript code:**
```bash
cd express && ../cmd pnpm biome check --write .
```
**Build Go master process:**
```bash
cd master && go build
```
### Operational
(to be written)
## Architecture
### Components
- **express/** - TypeScript/Express.js backend application
- **master/** - Go-based master process for file watching and process management
- **framework/** - Managed binaries (Node.js, pnpm), command wrappers, and
framework-specific library code
- **monitor/** - Go file watcher that triggers rebuilds (experimental)
### Master Process (Go)
Responsibilities:
- Watch TypeScript source for changes and trigger rebuilds
- Manage worker processes
- Proxy web requests to backend workers
- Behaves identically in all environments (no dev/prod distinction)
### Express App Structure
- `app.ts` - Main Express application setup with route matching
- `routes.ts` - Route definitions
- `handlers.ts` - Route handlers
- `services.ts` - Service layer (database, logging, misc)
- `types.ts` - TypeScript type definitions (Route, Call, Handler, Result, Method)
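A hedged sketch of the core types listed above. The real definitions in `express/types.ts` are richer (result variants, parameter typing, session fields); this is only the general shape:

```typescript
type Method = "GET" | "POST" | "PUT" | "PATCH" | "DELETE";

// What a handler receives for one request (simplified).
interface Call {
  path: string;
  method: Method;
  parameters: Record<string, string>;
}

// What a handler returns (simplified).
interface Result {
  status: number;
  body: string;
}

type Handler = (call: Call) => Result | Promise<Result>;

interface Route {
  path: string;
  methods: Method[];
  handler: Handler;
}

// Example route in this shape (illustrative; the real /time handler differs):
const timeRoute: Route = {
  path: "/time",
  methods: ["GET"],
  handler: (call) => ({ status: 200, body: `requested ${call.path}` }),
};
```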
### Framework Command System
Commands flow through: `./cmd` → `framework/cmd.d/*` → `framework/shims/*` → managed binaries in `framework/binaries/`
This ensures consistent tooling versions across the team without system-wide installations.
## Tech Stack
- TypeScript 5.9+ / Node.js 22.15
- Express.js 5.1
- Go 1.23.3+ (master process)
- pnpm 10.12.4 (package manager)
- Zod (runtime validation)
- Nunjucks (templating)
- @vercel/ncc (bundling)
## Platform Requirements
Linux x86_64 only (currently). Requires:
- Modern libc for Go binaries
- docker compose (for full stack)
- fd, shellcheck, shfmt (for development)
## Current Status
Early stage - most implementations are stubs:
- Database service is placeholder
- Logging functions marked WRITEME
- No test framework configured yet
# meta
## formatting and sorting
- When a typescript file exports symbols, they should be listed in order
## guidelines for this document
- Try to keep lines below 80 characters in length, especially prose. But if
embedded code or literals are longer, that's fine.
- Use formatting such as bold or italics sparingly
- In general, we treat this document like source code insofar as it should be
both human-readable and machine-readable
- Keep this meta section at the end of the file.


@@ -1,18 +1,62 @@
diachron
## Introduction
Is your answer to some of these questions "yes"? If so, you might like
diachron. (When it comes to that dev/test/prod one, hear us out first, ok?)
- Do you want to share a lot of backend and frontend code?
- Are you tired of your web stack breaking when you blink too hard?
- Have you read [Taking PHP
Seriously](https://slack.engineering/taking-php-seriously/) and wish you had
something similar for Typescript?
- Do you think that ORMs are not all that? Do you wish you had first class
unmediated access to your database? And do you think that database
agnosticism is overrated?
- Do you think dev/testing/prod distinctions are a bad idea? (Hear us out on
this one.)
- Have you ever lost hours getting everyone on your team to have the exact
same environment, yet you're not willing to take the plunge and use a tool
like [nix](https://nixos.org)?
- Are you frustrated by unclear documentation? Is ramping up a frequent
problem?
- Do you want a framework that's not only easy to write but also easy to get
inside and debug?
- Have you been bogged down with details that are not relevant to the problems
you're trying to solve, the features you're trying to implement, the bugs
you're trying to fix? We're talking authentication, authorization, XSS,
https, nested paths, all that stuff.
## Getting started
Different situations require different getting started docs.
- [How to create a new project](docs/new-project.md)
- [How to work on an existing project](docs/existing-project.md)
## Requirements
To run diachron, you currently need the following requirements:
To run diachron, you currently need to have a Linux box running x86_64 with a
new enough libc to run golang binaries. Support for other platforms will come
eventually.
- docker compose
- deno
To run a more complete system, you also need to have docker compose installed.
## Development requirements
### Development requirements
To hack on diachron, you need the following:
To hack on diachron itself, you need the following:
- docker compose
- deno
- bash
- docker and docker compose
- [fd](https://github.com/sharkdp/fd)
- golang, version 1.23.6 or greater
- shellcheck
- shfmt

TODO.md (new file, +177)

@@ -0,0 +1,177 @@
## high importance
- [ ] Add unit tests all over the place.
- ⚠️ Huge task - needs breakdown before starting
- [ ] migrations, seeding, fixtures
```sql
CREATE SCHEMA fw;
CREATE TABLE fw.users (...);
CREATE TABLE fw.groups (...);
```
```sql
CREATE TABLE app.user_profiles (...);
CREATE TABLE app.customer_metadata (...);
```
- [ ] flesh out `mgmt` and `develop` (does not exist yet)
What belongs in `develop`:
- Create migrations
- Squash migrations
- Reset DB
- Roll back migrations
- Seed large test datasets
- Run tests
- Snapshot / restore local DB state (!!!)
`develop` fails if APP_ENV (or whatever) is `production`. Or maybe even
`testing`.
- [ ] Add default user table(s) to database.
- [ ] Add authentication
- [ ] password
- [ ] third party?
- [ ] Add middleware concept
- [ ] Add authorization
- for specific routes / resources / etc
- [ ] Add basic text views
Partially done; see the /time route. But we need to figure out where to
store templates, static files, etc.
- [ ] fix process management: if you control-c `master` process sometimes it
leaves around `master-bin`, `logger-bin`, and `diachron:nnnn` processes.
Huge problem.
## medium importance
- [ ] Add a log viewer
- with queries
- convert to logfmt and is there a viewer UI we could pull in and use
instead?
- [ ] add nested routes. Note that this might be easy to do without actually
changing the logic in express/routes.ts. A function that takes an array
of routes and maps over them rewriting them. Maybe.
- [ ] related: add something to do with default templates and stuff... I
think we can make handlers a lot shorter to write, sometimes not even
necessary at all, with some sane defaults and an easy to use override
mechanism
- [ ] time library
- [ ] fill in the rest of express/http-codes.ts
- [ ] fill out express/content-types.ts
- [ ] identify redundant "old skool" and ajax routes, factor out their
commonalities, etc.
- [ ] figure out and add logging to disk
- [ ] I don't really feel close to satisfied with template location /
rendering / etc. Rethink and rework.
- [ ] Add email verification (this is partially done already)
- [ ] Reading .env files and dealing with the environment should be as immune
as possible to idiotic errors
- [ ] Update check script:
- [x] shellcheck on shell scripts
- [x] `go vet` on go files
- [x] `golangci-lint` on go files
- [x] Run `go fmt` on all .go files
- [ ] Eventually, run unit tests
- [ ] write docs
- upgrade docs
- starting docs
- taking over docs
- reference
- internals
- [ ] make migration creation default to something like yyyy-mm-dd_ssss (are
9999 migrations in a day enough?)
- [ ] clean up `cmd` and `mgmt`: do the right thing with their commonalities
and make very plain which is which for what. Consider additional
commands. Maybe `develop` for specific development tasks,
`operate` for operational tasks, and we keep `cmd` for project-specific
commands. Something like that.
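The `yyyy-mm-dd_ssss` migration-name scheme proposed above could be produced by a helper like this (a hypothetical function, not anything in the repo; it zero-pads the sequence to four digits, allowing 9999 migrations per day):

```typescript
// Build a migration name like "2026-01-24_0003" from a date and a
// per-day sequence number. Uses UTC to keep names machine-sortable.
function migrationName(date: Date, seq: number): string {
  const yyyy = date.getUTCFullYear();
  const mm = String(date.getUTCMonth() + 1).padStart(2, "0");
  const dd = String(date.getUTCDate()).padStart(2, "0");
  const ssss = String(seq).padStart(4, "0");
  return `${yyyy}-${mm}-${dd}_${ssss}`;
}
```

Lexicographic order of these names matches chronological order, which is the property a migration runner needs.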
## low importance
- [ ] add a prometheus-style `/metrics` endpoint to master
- [ ] create a metrics server analogous to the logging server
- accept various stats from the workers (TBD)
- [ ] move `master-bin` into a subdir like `master/cmd` or whatever is
idiomatic for golang programs; adapt `master` wrapper shell script
accordingly
- [ ] flesh out the `sync.sh` script
- [ ] update framework-managed node
- [ ] update framework-managed pnpm
- [ ] update pnpm-managed deps
- [ ] rebuild golang programs
- [ ] If the number of workers is large, then there is a long lapse between
when you change a file and when the server responds
- One solution: start and stop workers serially: stop one, restart it with new
code; repeat
- Slow start them: only start a few at first
- [ ] in express/user.ts: FIXME: set createdAt and updatedAt to start of epoch
## finished
- [x] Reimplement fixup.sh
- [x] run shfmt on all shell scripts (and the files they `source`)
- [x] Run `go fmt` on all .go files
- [x] Run ~~prettier~~ biome on all .ts files and maybe others
- [x] Adapt master program so that it reads configuration from command line
args instead of from environment variables
- Should have sane defaults
- Adding new arguments should be easy and obvious
- [x] Add wrapper script to run master program (so that various assumptions related
to relative paths are safer)
- [x] Add logging service
- New golang program, in the same directory as master
- Intended to be started by master
- Listens to a port specified command line arg
- Accepts POSTed (or possibly PUT) json messages, currently in a
to-be-defined format. We will work on this format later.
- Keeps the most recent N messages in memory. N can be a fairly large
number; let's start by assuming 1 million.
- [x] Log to logging service from the express backend
- Fill out types and functions in `express/logging.ts`
- [x] Add first cut at database access. Remember that ORMs are not all that!
- [x] Create initial docker-compose.yml file for local development
- include most recent stable postgres
- include beanstalkd
- include memcached
- include redis
- include mailpit

check.sh (new executable file, +30)

@@ -0,0 +1,30 @@
#!/bin/bash
DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
cd "$DIR"
# Keep exclusions sorted. And list them here.
#
# - SC2002 is useless use of cat
#
exclusions="SC2002"
source "$DIR/framework/versions"
if [[ $# -ne 0 ]]; then
shellcheck --exclude="$exclusions" "$@"
exit $?
fi
shell_scripts="$(fd .sh | xargs)"
# The files we need to check all either end in .sh or else they're the files
# in framework/cmd.d and framework/shims. -x instructs shellcheck to also
# check `source`d files.
# shellcheck disable=SC2086  # $shell_scripts is intentionally word-split
shellcheck -x --exclude="$exclusions" "$DIR/cmd" "$DIR"/framework/cmd.d/* "$DIR"/framework/shims/* $shell_scripts
pushd "$DIR/master"
docker run --rm -v "$(pwd):/app" -w /app "golangci/golangci-lint:$golangci_lint" golangci-lint run
popd

cmd (24 lines changed)

@@ -2,20 +2,26 @@
# This file belongs to the framework. You are not expected to modify it.
# FIXME: Obviously this file isn't nearly robust enough. Make it so.
# Managed binary runner - runs framework-managed binaries like node, pnpm, tsx
# Usage: ./cmd <command> [args...]
set -eu
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
if [ $# -lt 1 ]; then
echo "Usage: ./cmd <command> [args...]"
echo ""
echo "Available commands:"
for cmd in "$DIR"/framework/cmd.d/*; do
if [ -x "$cmd" ]; then
basename "$cmd"
fi
done
exit 1
fi
subcmd="$1"
shift
exec "$DIR"/framework/cmd.d/"$subcmd" "$@"


@@ -1 +0,0 @@
((typescript-mode . ((lsp-disabled-clients . (ts-ls)))))


@@ -1,84 +0,0 @@
import { services } from "./services.ts";
import {
DenoRequest,
DenoResponse,
InternalHandler,
massageMethod,
Method,
ProcessedRoute,
Request as Request,
Route,
} from "./types.ts";
import { routes } from "./routes.ts";
services.logging.log({ source: "logging", text: ["1"] });
const processedRoutes: { [K in Method]: ProcessedRoute[] } = {
"GET": [],
"POST": [],
"PUT": [],
"PATCH": [],
"DELETE": [],
};
function isPromise<T>(value: T | Promise<T>): value is Promise<T> {
return typeof (value as any)?.then === "function";
}
routes.forEach(
(route: Route, _idx: number, _allRoutes: Route[]) => {
const pattern: URLPattern = new URLPattern({ pathname: route.path });
const methodList = route.methods;
const handler: InternalHandler = async (denoRequest: DenoRequest) => {
const method = massageMethod(denoRequest.method);
if (!methodList.includes(method)) {
// XXX: Worth asserting this?
}
const p = new URL(denoRequest.url);
const path = p.pathname;
const req: Request = {
pattern: route.path,
path,
method,
parameters: { one: 1, two: 2 },
denoRequest,
};
const retval = route.handler(req);
if (isPromise(retval)) {
return await retval;
} else {
return retval;
}
};
for (const [_idx, method] of methodList.entries()) {
const pr: ProcessedRoute = { pattern, method, handler };
processedRoutes[method].push(pr);
}
},
);
async function handler(req: DenoRequest): Promise<DenoResponse> {
const m = req.method;
const m1 = massageMethod(m);
const byMethod = processedRoutes[m1];
for (const [_idx, pr] of byMethod.entries()) {
const match = pr.pattern.exec(req.url);
if (match) {
const resp = await pr.handler(req);
return new globalThis.Response(resp.result);
}
}
return new globalThis.Response("not found", { status: 404 });
}
Deno.serve(handler);


@@ -1,38 +0,0 @@
import { Extensible } from "./interfaces.ts";
export type ContentType = string;
const contentTypes = {
text: {
plain: "text/plain",
html: "text/html",
css: "text/css",
javascript: "text/javascript",
xml: "text/xml",
},
image: {
jpeg: "image/jpeg",
png: "image/png",
gif: "image/gif",
svgPlusXml: "image/svg+xml",
webp: "image/webp",
},
audio: {
"mpeg": "audio/mpeg",
"wav": "audio/wav",
},
video: {
mp4: "video/mp4",
webm: "video/webm",
xMsvideo: "video/x-msvideo",
},
application: {
json: "application/json",
pdf: "application/pdf",
zip: "application/zip",
xWwwFormUrlencoded: "x-www-form-urlencoded",
octetStream: "octet-stream",
},
};
export { contentTypes };


@@ -1,15 +0,0 @@
{
"compilerOptions": {
"lib": ["deno.ns", "dom"]
},
"tasks": {
"dev": "deno run --watch main.ts"
},
"imports": {
"@std/assert": "jsr:@std/assert@1",
"zod": "npm:zod@^3.24.2"
},
"fmt": {
"indentWidth": 4
}
}

deno/deno.lock (generated, deleted)

@@ -1,202 +0,0 @@
{
"version": "4",
"specifiers": {
"jsr:@std/assert@1": "1.0.11",
"jsr:@std/async@1": "1.0.10",
"jsr:@std/bytes@1": "1.0.5",
"jsr:@std/bytes@^1.0.2-rc.3": "1.0.5",
"jsr:@std/internal@^1.0.5": "1.0.5",
"jsr:@std/io@0.224.5": "0.224.5",
"npm:zod@^3.24.2": "3.24.2"
},
"jsr": {
"@std/assert@1.0.11": {
"integrity": "2461ef3c368fe88bc60e186e7744a93112f16fd110022e113a0849e94d1c83c1",
"dependencies": [
"jsr:@std/internal"
]
},
"@std/async@1.0.10": {
"integrity": "2ff1b1c7d33d1416159989b0f69e59ec7ee8cb58510df01e454def2108b3dbec"
},
"@std/bytes@1.0.5": {
"integrity": "4465dd739d7963d964c809202ebea6d5c6b8e3829ef25c6a224290fbb8a1021e"
},
"@std/internal@1.0.5": {
"integrity": "54a546004f769c1ac9e025abd15a76b6671ddc9687e2313b67376125650dc7ba"
},
"@std/io@0.224.5": {
"integrity": "cb84fe655d1273fca94efcff411465027a8b0b4225203f19d6ee98d9c8920a2d",
"dependencies": [
"jsr:@std/bytes@^1.0.2-rc.3"
]
}
},
"npm": {
"zod@3.24.2": {
"integrity": "sha512-lY7CDW43ECgW9u1TcT3IoXHflywfVqDYze4waEz812jR/bZ8FHDsl7pFQoSZTz5N+2NqRXs8GBwnAwo3ZNxqhQ=="
}
},
"redirects": {
"https://deno.land/x/random_number/mod.ts": "https://deno.land/x/random_number@2.0.0/mod.ts",
"https://deno.land/x/sleep/mod.ts": "https://deno.land/x/sleep@v1.3.0/mod.ts"
},
"remote": {
"https://deno.land/std@0.214.0/assert/assert.ts": "bec068b2fccdd434c138a555b19a2c2393b71dfaada02b7d568a01541e67cdc5",
"https://deno.land/std@0.214.0/assert/assertion_error.ts": "9f689a101ee586c4ce92f52fa7ddd362e86434ffdf1f848e45987dc7689976b8",
"https://deno.land/std@0.214.0/async/delay.ts": "8e1d18fe8b28ff95885e2bc54eccec1713f57f756053576d8228e6ca110793ad",
"https://deno.land/std@0.214.0/bytes/copy.ts": "f29c03168853720dfe82eaa57793d0b9e3543ebfe5306684182f0f1e3bfd422a",
"https://deno.land/std@0.214.0/crypto/_fnv/fnv32.ts": "ba2c5ef976b9f047d7ce2d33dfe18671afc75154bcf20ef89d932b2fe8820535",
"https://deno.land/std@0.214.0/crypto/_fnv/fnv64.ts": "580cadfe2ff333fe253d15df450f927c8ac7e408b704547be26aab41b5772558",
"https://deno.land/std@0.214.0/crypto/_fnv/mod.ts": "8dbb60f062a6e77b82f7a62ac11fabfba52c3cd408c21916b130d8f57a880f96",
"https://deno.land/std@0.214.0/crypto/_fnv/util.ts": "27b36ce3440d0a180af6bf1cfc2c326f68823288540a354dc1d636b781b9b75f",
"https://deno.land/std@0.214.0/crypto/_wasm/lib/deno_std_wasm_crypto.generated.mjs": "76c727912539737def4549bb62a96897f37eb334b979f49c57b8af7a1617635e",
"https://deno.land/std@0.214.0/crypto/_wasm/mod.ts": "c55f91473846827f077dfd7e5fc6e2726dee5003b6a5747610707cdc638a22ba",
"https://deno.land/std@0.214.0/crypto/crypto.ts": "4448f8461c797adba8d70a2c60f7795a546d7a0926e96366391bffdd06491c16",
"https://deno.land/std@0.214.0/datetime/_common.ts": "a62214c1924766e008e27d3d843ceba4b545dc2aa9880de0ecdef9966d5736b6",
"https://deno.land/std@0.214.0/datetime/parse.ts": "bb248bbcb3cd54bcaf504a1ee670fc4695e429d9019c06af954bbe2bcb8f1d02",
"https://deno.land/std@0.214.0/encoding/_util.ts": "beacef316c1255da9bc8e95afb1fa56ed69baef919c88dc06ae6cb7a6103d376",
"https://deno.land/std@0.214.0/encoding/base64.ts": "96e61a556d933201266fea84ae500453293f2aff130057b579baafda096a96bc",
"https://deno.land/std@0.214.0/encoding/hex.ts": "4d47d3b25103cf81a2ed38f54b394d39a77b63338e1eaa04b70c614cb45ec2e6",
"https://deno.land/std@0.214.0/fmt/colors.ts": "aeaee795471b56fc62a3cb2e174ed33e91551b535f44677f6320336aabb54fbb",
"https://deno.land/std@0.214.0/io/buf_reader.ts": "c73aad99491ee6db3d6b001fa4a780e9245c67b9296f5bad9c0fa7384e35d47a",
"https://deno.land/std@0.214.0/io/buf_writer.ts": "f82f640c8b3a820f600a8da429ad0537037c7d6a78426bbca2396fb1f75d3ef4",
"https://deno.land/std@0.214.0/io/types.ts": "748bbb3ac96abda03594ef5a0db15ce5450dcc6c0d841c8906f8b10ac8d32c96",
"https://deno.land/std@0.214.0/path/_common/assert_path.ts": "2ca275f36ac1788b2acb60fb2b79cb06027198bc2ba6fb7e163efaedde98c297",
"https://deno.land/std@0.214.0/path/_common/basename.ts": "569744855bc8445f3a56087fd2aed56bdad39da971a8d92b138c9913aecc5fa2",
"https://deno.land/std@0.214.0/path/_common/common.ts": "6157c7ec1f4db2b4a9a187efd6ce76dcaf1e61cfd49f87e40d4ea102818df031",
"https://deno.land/std@0.214.0/path/_common/constants.ts": "dc5f8057159f4b48cd304eb3027e42f1148cf4df1fb4240774d3492b5d12ac0c",
"https://deno.land/std@0.214.0/path/_common/dirname.ts": "684df4aa71a04bbcc346c692c8485594fc8a90b9408dfbc26ff32cf3e0c98cc8",
"https://deno.land/std@0.214.0/path/_common/format.ts": "92500e91ea5de21c97f5fe91e178bae62af524b72d5fcd246d6d60ae4bcada8b",
"https://deno.land/std@0.214.0/path/_common/from_file_url.ts": "d672bdeebc11bf80e99bf266f886c70963107bdd31134c4e249eef51133ceccf",
"https://deno.land/std@0.214.0/path/_common/glob_to_reg_exp.ts": "2007aa87bed6eb2c8ae8381adcc3125027543d9ec347713c1ad2c68427330770",
"https://deno.land/std@0.214.0/path/_common/normalize.ts": "684df4aa71a04bbcc346c692c8485594fc8a90b9408dfbc26ff32cf3e0c98cc8",
"https://deno.land/std@0.214.0/path/_common/normalize_string.ts": "dfdf657a1b1a7db7999f7c575ee7e6b0551d9c20f19486c6c3f5ff428384c965",
"https://deno.land/std@0.214.0/path/_common/relative.ts": "faa2753d9b32320ed4ada0733261e3357c186e5705678d9dd08b97527deae607",
"https://deno.land/std@0.214.0/path/_common/strip_trailing_separators.ts": "7024a93447efcdcfeaa9339a98fa63ef9d53de363f1fbe9858970f1bba02655a",
"https://deno.land/std@0.214.0/path/_common/to_file_url.ts": "7f76adbc83ece1bba173e6e98a27c647712cab773d3f8cbe0398b74afc817883",
"https://deno.land/std@0.214.0/path/_interface.ts": "a1419fcf45c0ceb8acdccc94394e3e94f99e18cfd32d509aab514c8841799600",
"https://deno.land/std@0.214.0/path/_os.ts": "8fb9b90fb6b753bd8c77cfd8a33c2ff6c5f5bc185f50de8ca4ac6a05710b2c15",
"https://deno.land/std@0.214.0/path/basename.ts": "5d341aadb7ada266e2280561692c165771d071c98746fcb66da928870cd47668",
"https://deno.land/std@0.214.0/path/common.ts": "03e52e22882402c986fe97ca3b5bb4263c2aa811c515ce84584b23bac4cc2643",
"https://deno.land/std@0.214.0/path/constants.ts": "0c206169ca104938ede9da48ac952de288f23343304a1c3cb6ec7625e7325f36",
"https://deno.land/std@0.214.0/path/dirname.ts": "85bd955bf31d62c9aafdd7ff561c4b5fb587d11a9a5a45e2b01aedffa4238a7c",
"https://deno.land/std@0.214.0/path/extname.ts": "593303db8ae8c865cbd9ceec6e55d4b9ac5410c1e276bfd3131916591b954441",
"https://deno.land/std@0.214.0/path/format.ts": "98fad25f1af7b96a48efb5b67378fcc8ed77be895df8b9c733b86411632162af",
"https://deno.land/std@0.214.0/path/from_file_url.ts": "911833ae4fd10a1c84f6271f36151ab785955849117dc48c6e43b929504ee069",
"https://deno.land/std@0.214.0/path/glob_to_regexp.ts": "83c5fd36a8c86f5e72df9d0f45317f9546afa2ce39acaafe079d43a865aced08",
"https://deno.land/std@0.214.0/path/is_absolute.ts": "4791afc8bfd0c87f0526eaa616b0d16e7b3ab6a65b62942e50eac68de4ef67d7",
"https://deno.land/std@0.214.0/path/is_glob.ts": "a65f6195d3058c3050ab905705891b412ff942a292bcbaa1a807a74439a14141",
"https://deno.land/std@0.214.0/path/join.ts": "ae2ec5ca44c7e84a235fd532e4a0116bfb1f2368b394db1c4fb75e3c0f26a33a",
"https://deno.land/std@0.214.0/path/join_globs.ts": "e9589869a33dc3982101898ee50903db918ca00ad2614dbe3934d597d7b1fbea",
"https://deno.land/std@0.214.0/path/mod.ts": "ffeaccb713dbe6c72e015b7c767f753f8ec5fbc3b621ff5eeee486ffc2c0ddda",
"https://deno.land/std@0.214.0/path/normalize.ts": "4155743ccceeed319b350c1e62e931600272fad8ad00c417b91df093867a8352",
"https://deno.land/std@0.214.0/path/normalize_glob.ts": "98ee8268fad271193603271c203ae973280b5abfbdd2cbca1053fd2af71869ca",
"https://deno.land/std@0.214.0/path/parse.ts": "65e8e285f1a63b714e19ef24b68f56e76934c3df0b6e65fd440d3991f4f8aefb",
"https://deno.land/std@0.214.0/path/posix/_util.ts": "1e3937da30f080bfc99fe45d7ed23c47dd8585c5e473b2d771380d3a6937cf9d",
"https://deno.land/std@0.214.0/path/posix/basename.ts": "39ee27a29f1f35935d3603ccf01d53f3d6e0c5d4d0f84421e65bd1afeff42843",
"https://deno.land/std@0.214.0/path/posix/common.ts": "26f60ccc8b2cac3e1613000c23ac5a7d392715d479e5be413473a37903a2b5d4",
"https://deno.land/std@0.214.0/path/posix/constants.ts": "93481efb98cdffa4c719c22a0182b994e5a6aed3047e1962f6c2c75b7592bef1",
"https://deno.land/std@0.214.0/path/posix/dirname.ts": "6535d2bdd566118963537b9dda8867ba9e2a361015540dc91f5afbb65c0cce8b",
"https://deno.land/std@0.214.0/path/posix/extname.ts": "8d36ae0082063c5e1191639699e6f77d3acf501600a3d87b74943f0ae5327427",
"https://deno.land/std@0.214.0/path/posix/format.ts": "185e9ee2091a42dd39e2a3b8e4925370ee8407572cee1ae52838aed96310c5c1",
"https://deno.land/std@0.214.0/path/posix/from_file_url.ts": "951aee3a2c46fd0ed488899d024c6352b59154c70552e90885ed0c2ab699bc40",
"https://deno.land/std@0.214.0/path/posix/glob_to_regexp.ts": "54d3ff40f309e3732ab6e5b19d7111d2d415248bcd35b67a99defcbc1972e697",
"https://deno.land/std@0.214.0/path/posix/is_absolute.ts": "cebe561ad0ae294f0ce0365a1879dcfca8abd872821519b4fcc8d8967f888ede",
"https://deno.land/std@0.214.0/path/posix/is_glob.ts": "8a8b08c08bf731acf2c1232218f1f45a11131bc01de81e5f803450a5914434b9",
"https://deno.land/std@0.214.0/path/posix/join.ts": "aef88d5fa3650f7516730865dbb951594d1a955b785e2450dbee93b8e32694f3",
"https://deno.land/std@0.214.0/path/posix/join_globs.ts": "ee2f4676c5b8a0dfa519da58b8ade4d1c4aa8dd3fe35619edec883ae9df1f8c9",
"https://deno.land/std@0.214.0/path/posix/mod.ts": "563a18c2b3ddc62f3e4a324ff0f583e819b8602a72ad880cb98c9e2e34f8db5b",
"https://deno.land/std@0.214.0/path/posix/normalize.ts": "baeb49816a8299f90a0237d214cef46f00ba3e95c0d2ceb74205a6a584b58a91",
"https://deno.land/std@0.214.0/path/posix/normalize_glob.ts": "65f0138fa518ef9ece354f32889783fc38cdf985fb02dcf1c3b14fa47d665640",
"https://deno.land/std@0.214.0/path/posix/parse.ts": "d5bac4eb21262ab168eead7e2196cb862940c84cee572eafedd12a0d34adc8fb",
"https://deno.land/std@0.214.0/path/posix/relative.ts": "3907d6eda41f0ff723d336125a1ad4349112cd4d48f693859980314d5b9da31c",
"https://deno.land/std@0.214.0/path/posix/resolve.ts": "bac20d9921beebbbb2b73706683b518b1d0c1b1da514140cee409e90d6b2913a",
"https://deno.land/std@0.214.0/path/posix/separator.ts": "c9ecae5c843170118156ac5d12dc53e9caf6a1a4c96fc8b1a0ab02dff5c847b0",
"https://deno.land/std@0.214.0/path/posix/to_file_url.ts": "7aa752ba66a35049e0e4a4be5a0a31ac6b645257d2e031142abb1854de250aaf",
"https://deno.land/std@0.214.0/path/posix/to_namespaced_path.ts": "28b216b3c76f892a4dca9734ff1cc0045d135532bfd9c435ae4858bfa5a2ebf0",
"https://deno.land/std@0.214.0/path/relative.ts": "ab739d727180ed8727e34ed71d976912461d98e2b76de3d3de834c1066667add",
"https://deno.land/std@0.214.0/path/resolve.ts": "a6f977bdb4272e79d8d0ed4333e3d71367cc3926acf15ac271f1d059c8494d8d",
"https://deno.land/std@0.214.0/path/separator.ts": "c6c890507f944a1f5cb7d53b8d638d6ce3cf0f34609c8d84a10c1eaa400b77a9",
"https://deno.land/std@0.214.0/path/to_file_url.ts": "88f049b769bce411e2d2db5bd9e6fd9a185a5fbd6b9f5ad8f52bef517c4ece1b",
"https://deno.land/std@0.214.0/path/to_namespaced_path.ts": "b706a4103b104cfadc09600a5f838c2ba94dbcdb642344557122dda444526e40",
"https://deno.land/std@0.214.0/path/windows/_util.ts": "d5f47363e5293fced22c984550d5e70e98e266cc3f31769e1710511803d04808",
"https://deno.land/std@0.214.0/path/windows/basename.ts": "e2dbf31d1d6385bfab1ce38c333aa290b6d7ae9e0ecb8234a654e583cf22f8fe",
"https://deno.land/std@0.214.0/path/windows/common.ts": "26f60ccc8b2cac3e1613000c23ac5a7d392715d479e5be413473a37903a2b5d4",
"https://deno.land/std@0.214.0/path/windows/constants.ts": "5afaac0a1f67b68b0a380a4ef391bf59feb55856aa8c60dfc01bd3b6abb813f5",
"https://deno.land/std@0.214.0/path/windows/dirname.ts": "33e421be5a5558a1346a48e74c330b8e560be7424ed7684ea03c12c21b627bc9",
"https://deno.land/std@0.214.0/path/windows/extname.ts": "165a61b00d781257fda1e9606a48c78b06815385e7d703232548dbfc95346bef",
"https://deno.land/std@0.214.0/path/windows/format.ts": "bbb5ecf379305b472b1082cd2fdc010e44a0020030414974d6029be9ad52aeb6",
"https://deno.land/std@0.214.0/path/windows/from_file_url.ts": "ced2d587b6dff18f963f269d745c4a599cf82b0c4007356bd957cb4cb52efc01",
"https://deno.land/std@0.214.0/path/windows/glob_to_regexp.ts": "6dcd1242bd8907aa9660cbdd7c93446e6927b201112b0cba37ca5d80f81be51b",
"https://deno.land/std@0.214.0/path/windows/is_absolute.ts": "4a8f6853f8598cf91a835f41abed42112cebab09478b072e4beb00ec81f8ca8a",
"https://deno.land/std@0.214.0/path/windows/is_glob.ts": "8a8b08c08bf731acf2c1232218f1f45a11131bc01de81e5f803450a5914434b9",
"https://deno.land/std@0.214.0/path/windows/join.ts": "e0b3356615c1a75c56ebb6a7311157911659e11fd533d80d724800126b761ac3",
"https://deno.land/std@0.214.0/path/windows/join_globs.ts": "ee2f4676c5b8a0dfa519da58b8ade4d1c4aa8dd3fe35619edec883ae9df1f8c9",
"https://deno.land/std@0.214.0/path/windows/mod.ts": "7d6062927bda47c47847ffb55d8f1a37b0383840aee5c7dfc93984005819689c",
"https://deno.land/std@0.214.0/path/windows/normalize.ts": "78126170ab917f0ca355a9af9e65ad6bfa5be14d574c5fb09bb1920f52577780",
"https://deno.land/std@0.214.0/path/windows/normalize_glob.ts": "179c86ba89f4d3fe283d2addbe0607341f79ee9b1ae663abcfb3439db2e97810",
"https://deno.land/std@0.214.0/path/windows/parse.ts": "b9239edd892a06a06625c1b58425e199f018ce5649ace024d144495c984da734",
"https://deno.land/std@0.214.0/path/windows/relative.ts": "3e1abc7977ee6cc0db2730d1f9cb38be87b0ce4806759d271a70e4997fc638d7",
"https://deno.land/std@0.214.0/path/windows/resolve.ts": "75b2e3e1238d840782cee3d8864d82bfaa593c7af8b22f19c6422cf82f330ab3",
"https://deno.land/std@0.214.0/path/windows/separator.ts": "e51c5522140eff4f8402617c5c68a201fdfa3a1a8b28dc23587cff931b665e43",
"https://deno.land/std@0.214.0/path/windows/to_file_url.ts": "1cd63fd35ec8d1370feaa4752eccc4cc05ea5362a878be8dc7db733650995484",
"https://deno.land/std@0.214.0/path/windows/to_namespaced_path.ts": "4ffa4fb6fae321448d5fe810b3ca741d84df4d7897e61ee29be961a6aac89a4c",
"https://deno.land/x/memcached@v1.0.0/mod.ts": "3c8528631e603638ded48f8fadcf61cde00fb1d2054e8647cc033bc9f2ea5867",
"https://deno.land/x/postgres@v0.19.3/client.ts": "d141c65c20484c545a1119c9af7a52dcc24f75c1a5633de2b9617b0f4b2ed5c1",
"https://deno.land/x/postgres@v0.19.3/client/error.ts": "05b0e35d65caf0ba21f7f6fab28c0811da83cd8b4897995a2f411c2c83391036",
"https://deno.land/x/postgres@v0.19.3/connection/auth.ts": "db15c1659742ef4d2791b32834950278dc7a40cb931f8e434e6569298e58df51",
"https://deno.land/x/postgres@v0.19.3/connection/connection.ts": "198a0ecf92a0d2aa72db3bb88b8f412d3b1f6b87d464d5f7bff9aa3b6aff8370",
"https://deno.land/x/postgres@v0.19.3/connection/connection_params.ts": "463d7a9ed559f537a55d6928cab62e1c31b808d08cd0411b6ae461d0c0183c93",
"https://deno.land/x/postgres@v0.19.3/connection/message.ts": "20da5d80fc4d7ddb7b850083e0b3fa8734eb26642221dad89c62e27d78e57a4d",
"https://deno.land/x/postgres@v0.19.3/connection/message_code.ts": "12bcb110df6945152f9f6c63128786558d7ad1e61006920daaa16ef85b3bab7d",
"https://deno.land/x/postgres@v0.19.3/connection/packet.ts": "050aeff1fc13c9349e89451a155ffcd0b1343dc313a51f84439e3e45f64b56c8",
"https://deno.land/x/postgres@v0.19.3/connection/scram.ts": "532d4d58b565a2ab48fb5e1e14dc9bfb3bb283d535011e371e698eb4a89dd994",
"https://deno.land/x/postgres@v0.19.3/debug.ts": "8add17699191f11e6830b8c95d9de25857d221bb2cf6c4ae22254d395895c1f9",
"https://deno.land/x/postgres@v0.19.3/deps.ts": "c312038fe64b8368f8a294119f11d8f235fe67de84d7c3b0ef67b3a56628171a",
"https://deno.land/x/postgres@v0.19.3/mod.ts": "4930c7b44f8d16ea71026f7e3ef22a2322d84655edceacd55f7461a9218d8560",
"https://deno.land/x/postgres@v0.19.3/pool.ts": "2289f029e7a3bd3d460d4faa71399a920b7406c92a97c0715d6e31dbf1380ec3",
"https://deno.land/x/postgres@v0.19.3/query/array_parser.ts": "ff72d3e026e3022a1a223a6530be5663f8ebbd911ed978291314e7fe6c2f2464",
"https://deno.land/x/postgres@v0.19.3/query/decode.ts": "3e89ad2a662eab66a4f4e195ff0924d71d199af3c2f5637d1ae650301a03fa9b",
"https://deno.land/x/postgres@v0.19.3/query/decoders.ts": "6a73da1024086ab91e233648c850dccbde59248b90d87054bbbd7f0bf4a50681",
"https://deno.land/x/postgres@v0.19.3/query/encode.ts": "5b1c305bc7352a6f9fe37f235dddfc23e26419c77a133b4eaea42cf136481aa6",
"https://deno.land/x/postgres@v0.19.3/query/oid.ts": "21fc714ac212350ba7df496f88ea9e01a4ee0458911d0f2b6a81498e12e7af4c",
"https://deno.land/x/postgres@v0.19.3/query/query.ts": "510f9a27da87ed7b31b5cbcd14bf3028b441ac2ddc368483679d0b86a9d9f213",
"https://deno.land/x/postgres@v0.19.3/query/transaction.ts": "8f4eef68f8e9b4be216199404315e6e08fe1fe98afb2e640bffd077662f79678",
"https://deno.land/x/postgres@v0.19.3/query/types.ts": "540f6f973d493d63f2c0059a09f3368071f57931bba68bea408a635a3e0565d6",
"https://deno.land/x/postgres@v0.19.3/utils/deferred.ts": "5420531adb6c3ea29ca8aac57b9b59bd3e4b9a938a4996bbd0947a858f611080",
"https://deno.land/x/postgres@v0.19.3/utils/utils.ts": "ca47193ea03ff5b585e487a06f106d367e509263a960b787197ce0c03113a738",
"https://deno.land/x/random_number@2.0.0/mod.ts": "83010e4a0192b015ba4491d8bb8c73a458f352ebc613b847ff6349961d1c7827",
"https://deno.land/x/redis@v0.37.1/backoff.ts": "33e4a6e245f8743fbae0ce583993a671a3ac2ecee433a3e7f0bd77b5dd541d84",
"https://deno.land/x/redis@v0.37.1/command.ts": "2d1da4b32495ea852bdff0c2e7fd191a056779a696b9f83fb648c5ebac45cfc3",
"https://deno.land/x/redis@v0.37.1/connection.ts": "cee30a6310298441de17d1028d4ce3fd239dcf05a92294fba7173a922d0596cb",
"https://deno.land/x/redis@v0.37.1/deps/std/async.ts": "5a588aefb041cca49f0e6b7e3c397119693a3e07bea89c54cf7fe4a412e37bbf",
"https://deno.land/x/redis@v0.37.1/deps/std/bytes.ts": "f5b437ebcac77600101a81ef457188516e4944b3c2a931dff5ced3fa0c239b62",
"https://deno.land/x/redis@v0.37.1/deps/std/io.ts": "b7505c5e738384f5f7a021d7bbd78380490c059cc7c83cd8dada1f86ec16e835",
"https://deno.land/x/redis@v0.37.1/errors.ts": "8293f56a70ea8388cb80b6e1caa15d350ed1719529fc06573b01a443d0caad69",
"https://deno.land/x/redis@v0.37.1/events.ts": "704767b1beed2d5acfd5e86bd1ef93befdc8a8f8c8bb4ae1b4485664a8a6a625",
"https://deno.land/x/redis@v0.37.1/executor.ts": "5ac4c1f7bec44d12ebc0f3702bf074bd3ba6c1aae74953582f6358d2948718e7",
"https://deno.land/x/redis@v0.37.1/internal/encoding.ts": "0525f7f444a96b92cd36423abdfe221f8d8de4a018dc5cb6750a428a5fc897c2",
"https://deno.land/x/redis@v0.37.1/internal/symbols.ts": "e36097bab1da1c9fe84a3bb9cb0ed1ec10c3dc7dd0b557769c5c54e15d110dd2",
"https://deno.land/x/redis@v0.37.1/mod.ts": "e11d9384c2ffe1b3d81ce0ad275254519990635ad1ba39f46d49a73b3c35238d",
"https://deno.land/x/redis@v0.37.1/pipeline.ts": "974fff59bf0befa2ad7eee50ecba40006c47364f5e3285e1335c9f9541b7ebae",
"https://deno.land/x/redis@v0.37.1/protocol/deno_streams/command.ts": "5c5e5fb639cae22c1f9bfdc87631edcd67bb28bf8590161ae484d293b733aa01",
"https://deno.land/x/redis@v0.37.1/protocol/deno_streams/mod.ts": "b084bf64d6b795f6c1d0b360d9be221e246a9c033e5d88fd1e82fa14f711d25b",
"https://deno.land/x/redis@v0.37.1/protocol/deno_streams/reply.ts": "639de34541f207f793393a3cd45f9a23ef308f094d9d3d6ce62f84b175d3af47",
"https://deno.land/x/redis@v0.37.1/protocol/shared/command.ts": "e75f6be115ff73bd865e01be4e2a28077a9993b1e0c54ed96b6825bfe997d382",
"https://deno.land/x/redis@v0.37.1/protocol/shared/protocol.ts": "5b9284ee28ec74dfc723c7c7f07dca8d5f9d303414f36689503622dfdde12551",
"https://deno.land/x/redis@v0.37.1/protocol/shared/reply.ts": "3311ff66357bacbd60785cb43b97539c341d8a7d963bc5e80cb864ac81909ea5",
"https://deno.land/x/redis@v0.37.1/protocol/shared/types.ts": "c6bf2b9eafd69e358a972823d94b8b478c00bac195b87b33b7437de2a9bb7fb4",
"https://deno.land/x/redis@v0.37.1/pubsub.ts": "a36892455b0a4a50af169332a165b0985cc90d84486087f036e507e3137b2afb",
"https://deno.land/x/redis@v0.37.1/redis.ts": "4904772596c8a82d7112092e7edea45243eae38809b2f2ea8db61a4207fe246b",
"https://deno.land/x/redis@v0.37.1/stream.ts": "d43076815d046eb8428fcd2799544a9fd07b3480099f5fc67d2ba12fdc73725f",
"https://deno.land/x/sleep@v1.3.0/mod.ts": "e9955ecd3228a000e29d46726cd6ab14b65cf83904e9b365f3a8d64ec61c1af3",
"https://deno.land/x/sleep@v1.3.0/sleep.ts": "b6abaca093b094b0c2bba94f287b19a60946a8d15764d168f83fcf555f5bb59e"
},
"workspace": {
"dependencies": [
"jsr:@std/assert@1",
"npm:zod@^3.24.2"
]
}
}


@@ -1,7 +0,0 @@
// Database
export { Client as PostgresClient } from "https://deno.land/x/postgres@v0.19.3/mod.ts";
export type { ClientOptions as PostgresOptions } from "https://deno.land/x/postgres@v0.19.3/mod.ts";
// Redis
export { connect as redisConnect } from "https://deno.land/x/redis@v0.37.1/mod.ts";
export type { Redis } from "https://deno.land/x/redis@v0.37.1/mod.ts";


@@ -1,19 +0,0 @@
import { contentTypes } from "./content-types.ts";
import { httpCodes } from "./http-codes.ts";
import { services } from "./services.ts";
import { Handler, Request, Response } from "./types.ts";
const multiHandler: Handler = (req: Request): Response => {
const code = httpCodes.success.OK;
const rn = services.random.randomNumber();
const retval: Response = {
code,
result: `that was ${req.method} (${rn})`,
contentType: contentTypes.text.plain,
};
return retval;
};
export { multiHandler };


@@ -1,43 +0,0 @@
import { Extensible } from "./interfaces.ts";
export type HttpCode = {
code: number;
name: string;
description?: string;
};
type Group = "success" | "redirection" | "clientErrors" | "serverErrors";
type CodeDefinitions = {
[K in Group]: {
[K: string]: HttpCode;
};
};
// FIXME: Figure out how to brand CodeDefinitions in a way that isn't
// tedious.
const httpCodes: CodeDefinitions = {
success: {
OK: { code: 200, name: "OK" },
Created: { code: 201, name: "Created" },
Accepted: { code: 202, name: "Accepted" },
NoContent: { code: 204, name: "No Content" },
},
redirection: {
// later
},
clientErrors: {
BadRequest: { code: 400, name: "Bad Request" },
Unauthorized: { code: 401, name: "Unauthorized" },
Forbidden: { code: 403, name: "Forbidden" },
NotFound: { code: 404, name: "Not Found" },
MethodNotAllowed: { code: 405, name: "Method Not Allowed" },
NotAcceptable: { code: 406, name: "Not Acceptable" },
// More later
},
serverErrors: {
InternalServerError: { code: 500, name: "Internal Server Error" },
NotImplemented: { code: 501, name: "Not Implemented" },
// more later
},
};
export { httpCodes };
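As a quick illustration of how the table above is consumed, here is a self-contained sketch; the `status` helper is hypothetical, not part of the framework, and the table is abbreviated to two entries copied from the diff.

```typescript
// Minimal standalone copy of the HttpCode shape from http-codes.ts.
type HttpCode = {
  code: number;
  name: string;
  description?: string;
};

// Abbreviated table, mirroring two entries from the definitions above.
const httpCodes = {
  success: {
    OK: { code: 200, name: "OK" } as HttpCode,
  },
  clientErrors: {
    NotFound: { code: 404, name: "Not Found" } as HttpCode,
  },
};

// Hypothetical helper: render a status line from an HttpCode entry.
const status = (c: HttpCode): string => `${c.code} ${c.name}`;

console.log(status(httpCodes.clientErrors.NotFound)); // → "404 Not Found"
```

Keeping the codes grouped by class (`success`, `clientErrors`, …) lets handlers reference them by name rather than by magic number.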


@@ -1,44 +0,0 @@
// internal-logging.ts
// FIXME: Move this to somewhere more appropriate
type AtLeastOne<T> = [T, ...T[]];
type MessageSource = "logging" | "diagnostic" | "user";
type Message = {
// FIXME: number probably isn't what we want here
timestamp?: number;
source: MessageSource;
text: AtLeastOne<string>;
};
const m1: Message = { timestamp: 123, source: "logging", text: ["foo"] };
const m2: Message = {
timestamp: 321,
source: "diagnostic",
text: ["ok", "whatever"],
};
type FilterArgument = {
limit?: number;
before?: number;
after?: number;
// FIXME: add offsets to use instead of or in addition to before/after
match?: (string | RegExp)[];
};
const log = (_message: Message) => {
// WRITEME
};
const getLogs = (_filter: FilterArgument) => {
// WRITEME
};
// FIXME: there's scope for more specialized functions although they
// probably should be defined in terms of the basic ones here.
export { getLogs, log };


@@ -1,8 +0,0 @@
export function add(a: number, b: number): number {
return a + b;
}
// Learn more at https://docs.deno.com/runtime/manual/examples/module_metadata#concepts
if (import.meta.main) {
console.log("Add 2 + 3 =", add(2, 3));
}


@@ -1,6 +0,0 @@
import { assertEquals } from "@std/assert";
import { add } from "./main.ts";
Deno.test(function addTest() {
assertEquals(add(2, 3), 5);
});


@@ -1,82 +0,0 @@
/// <reference lib="dom" />
import { sleep } from "https://deno.land/x/sleep/mod.ts";
import { httpCodes } from "./http-codes.ts";
import { contentTypes } from "./content-types.ts";
import { multiHandler } from "./handlers.ts";
import { Request, Route } from "./types.ts";
// FIXME: Obviously put this somewhere else
const okText = (out: string) => {
const code = httpCodes.success.OK;
return {
code,
result: out,
contentType: contentTypes.text.plain,
};
};
const routes: Route[] = [
{
path: "/slow",
methods: ["GET"],
handler: async (_req: Request) => {
console.log("starting slow request");
await sleep(2);
console.log("finishing slow request");
return okText("that was slow");
},
},
{
path: "/list",
methods: ["GET"],
handler: (_req: Request) => {
const code = httpCodes.success.OK;
const lr = (rr: Route[]) => {
const ret = rr.map((r: Route) => {
return r.path;
});
return ret;
};
const listing = lr(routes).join(", ");
return {
code,
result: listing + "\n",
contentType: contentTypes.text.plain,
};
},
},
{
path: "/ok",
methods: ["GET", "POST", "PUT"],
handler: multiHandler,
},
{
path: "/alsook",
methods: ["GET"],
handler: (_req) => {
const code = httpCodes.success.OK;
return {
code,
result: "it is also ok",
contentType: contentTypes.text.plain,
};
},
},
];
export { routes };


@@ -1,11 +0,0 @@
#!/bin/bash
set -e
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
cd "$DIR"
deno run --allow-net --unstable-sloppy-imports --watch app.ts


@@ -1,28 +0,0 @@
// services.ts
import { randomNumber } from "https://deno.land/x/random_number/mod.ts";
import { config } from "./config.ts";
import { getLogs, log } from "./logging.ts";
//const database = Client({
//})
const database = {};
const logging = {
log,
getLogs,
};
const random = {
randomNumber,
};
const services = {
database,
logging,
random,
};
export { services };


@@ -1,63 +0,0 @@
// types.ts
// FIXME: split this up into types used by app developers and types internal
// to the framework.
// FIXME: the use of types like Request and Response cause problems because it's
// easy to forget to import them and then all sorts of random typechecking errors
// start showing up even if the code is sound. So find other names for them.
import { z } from "zod";
import { HttpCode } from "./http-codes.ts";
import { ContentType } from "./content-types.ts";
const methodParser = z.union([
z.literal("GET"),
z.literal("POST"),
z.literal("PUT"),
z.literal("PATCH"),
z.literal("DELETE"),
]);
export type Method = z.infer<typeof methodParser>;
const massageMethod = (input: string): Method => {
const r = methodParser.parse(input.toUpperCase());
return r;
};
export type DenoRequest = globalThis.Request;
export type DenoResponse = globalThis.Response;
export type UserRequest = {};
export type Request = {
pattern: string;
path: string;
method: Method;
parameters: object;
denoRequest: globalThis.Request;
};
export type InternalHandler = (req: DenoRequest) => Promise<Response>;
export type Handler = (req: Request) => Promise<Response> | Response;
export type ProcessedRoute = {
pattern: URLPattern;
method: Method;
handler: InternalHandler;
};
export type Response = {
code: HttpCode;
contentType: ContentType;
result: string;
};
export type Route = {
path: string;
methods: Method[];
handler: Handler;
interruptable?: boolean;
};
export { massageMethod };

develop Executable file

@@ -0,0 +1,27 @@
#!/bin/bash
# This file belongs to the framework. You are not expected to modify it.
# Development command runner - parallel to ./mgmt for development tasks
# Usage: ./develop <command> [args...]
set -eu
DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
if [ $# -lt 1 ]; then
echo "Usage: ./develop <command> [args...]"
echo ""
echo "Available commands:"
for cmd in "$DIR"/framework/develop.d/*; do
if [ -x "$cmd" ]; then
basename "$cmd"
fi
done
exit 1
fi
subcmd="$1"
shift
exec "$DIR"/framework/develop.d/"$subcmd" "$@"
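The dispatch pattern above can be exercised in isolation. The sketch below recreates a `framework/develop.d/` layout in a throwaway directory with a hypothetical `hello` subcommand; none of these paths are part of a real checkout.

```shell
set -eu

# Stand-in for the repository root; everything here is temporary.
DIR="$(mktemp -d)"
mkdir -p "$DIR/framework/develop.d"

# A hypothetical subcommand, created only for this demonstration.
cat > "$DIR/framework/develop.d/hello" <<'EOF'
#!/bin/bash
echo "hello from develop.d ($*)"
EOF
chmod +x "$DIR/framework/develop.d/hello"

# Dispatch exactly as ./develop does: first argument selects the
# executable, remaining arguments are passed through.
subcmd="hello"
"$DIR"/framework/develop.d/"$subcmd" --flag
```

Because `exec` replaces the wrapper process, the subcommand's exit status becomes the exit status of `./develop` itself, which keeps scripting around it straightforward.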

docker-compose.yml Normal file

@@ -0,0 +1,35 @@
services:
postgres:
image: postgres:17
ports:
- "5432:5432"
environment:
POSTGRES_USER: diachron
POSTGRES_PASSWORD: diachron
POSTGRES_DB: diachron
volumes:
- postgres_data:/var/lib/postgresql/data
redis:
image: redis:7
ports:
- "6379:6379"
memcached:
image: memcached:1.6
ports:
- "11211:11211"
beanstalkd:
image: schickling/beanstalkd
ports:
- "11300:11300"
mailpit:
image: axllent/mailpit
ports:
- "1025:1025" # SMTP
- "8025:8025" # Web UI
volumes:
postgres_data:
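With the stack above running (`docker compose up -d`), the Postgres service is reachable with the credentials from its environment block. The connection URL below is derived directly from those values; the variable name `PG_URL` is illustrative.

```shell
# Connection string assembled from the docker-compose.yml environment
# block above: user, password, and database are all "diachron", and
# port 5432 is published to the host.
PG_URL="postgres://diachron:diachron@localhost:5432/diachron"
echo "$PG_URL"
```

The redis (6379), memcached (11211), beanstalkd (11300), and mailpit (1025/8025) ports follow the same convention: each container port is published unchanged on localhost.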

docs/commands.md Normal file

@@ -0,0 +1,125 @@
# The Three Types of Commands
This framework deliberately separates *how* you interact with the system into three distinct command types. The split is not cosmetic; it encodes safety, intent, and operational assumptions directly into the tooling so that mistakes are harder to make under stress.
The guiding idea: **production should feel boring and safe; exploration should feel powerful and a little dangerous; the application itself should not care how it is being operated.**
---
## 1. Application Commands (`app`)
**What they are**
Commands defined *by the application itself*, for its own domain needs. They are not part of the framework, even though they are built on top of it.
The framework provides structure and affordances; the application supplies meaning.
**Core properties**
* Express domain behavior, not infrastructure concerns
* Safe by definition
* Deterministic and repeatable
* No environment-dependent semantics
* Identical behavior in dev, staging, and production
**Examples**
* Handling HTTP requests
* Rendering templates
* Running background jobs / queues
* Sending emails triggered by application logic
**Non-goals**
* No schema changes
* No data backfills
* No destructive behavior
* No operational or lifecycle management
**Rule of thumb**
If removing the framework would require rewriting *how* it runs but not *what* it does, the command belongs here.
---
## 2. Management Commands (`mgmt`)
**What they are**
Operational, *production-safe* commands used to evolve and maintain a live system.
These commands assume real data exists and must not be casually destroyed.
**Core properties**
* Forward-only
* Idempotent or safely repeatable
* Designed to run in production
* Explicit, auditable intent
**Examples**
* Applying migrations
* Running seeders that assert invariant data
* Reindexing or rebuilding derived data
* Rotating keys, recalculating counters
**Design constraints**
* No implicit rollbacks
* No hidden destructive actions
* Fail fast if assumptions are violated
**Rule of thumb**
If you would run it at 3am while tired and worried, it must live here.
---
## 3. Development Commands (`develop`)
**What they are**
Sharp, *unsafe by design* tools meant exclusively for local development and experimentation.
These commands optimize for speed, learning, and iteration — not safety.
**Core properties**
* Destructive operations allowed
* May reset or mutate large amounts of data
* Assume a clean or disposable environment
* Explicitly gated in production
**Examples**
* Dropping and recreating databases
* Rolling migrations backward
* Loading fixtures or scenarios
* Generating fake or randomized data
**Safety model**
* Hard to run in production
* Requires explicit opt-in if ever enabled
* Clear, noisy warnings when invoked
**Rule of thumb**
If it would be irresponsible to run against real user data, it belongs here.
---
## Why This Split Matters
Many frameworks blur these concerns, leading to:
* Fearful production operations
* Overpowered dev tools leaking into prod
* Environment-specific behavior and bugs
By naming and enforcing these three command types:
* Intent is visible at the CLI level
* Safety properties are architectural, not cultural
* Developers can move fast *without* normalizing risk
---
## One-Sentence Summary
> **App commands run the system, mgmt commands evolve it safely, and develop commands let you break things on purpose — but only where it's allowed.**


@@ -0,0 +1,37 @@
Let's consider a bullseye with the following concentric circles:
- Ring 0: small, simple systems
- Single jurisdiction
- Email + password
- A few roles
- Naïve or soft deletion
- Minimal audit needs
- Ring 1: grown-up systems
- Long-lived data
- Changing requirements
- Shared accounts
- GDPR-style erasure/anonymization
- Some cross-border concerns
- Historical data must remain usable
- “Oops, we should have thought about that” moments
- Ring 2: heavy compliance
- Formal audit trails
- Legal hold
- Non-repudiation
- Regulatory reporting
- Strong identity guarantees
- Jurisdiction-aware data partitioning
- Ring 3: banking / defense / healthcare at scale
- Cryptographic auditability
- Append-only ledgers
- Explicit legal models
- Independent compliance teams
- Lawyers embedded in engineering
diachron is designed to be suitable for Rings 0 and 1. Occasionally we may
look over the fence into Ring 2, but it's not what we've principally designed
for. Please take this framing into account when evaluating diachron for
greenfield projects.

docs/deployment.md Normal file

@@ -0,0 +1 @@
.


@@ -0,0 +1,142 @@
# Freedom, Hacking, and Responsibility
This framework is **free and open source software**.
That fact is not incidental. It is a deliberate ethical, practical, and technical choice.
This document explains how freedom to modify coexists with strong guidance about *how the framework is meant to be used* — without contradiction, and without apology.
---
## The short version
* This is free software. You are free to modify it.
* The framework has documented invariants for good reasons.
* You are encouraged to explore, question, and patch.
* You are discouraged from casually undermining guarantees you still expect to rely on.
* Clarity beats enforcement.
Freedom with understanding beats both lock-in and chaos.
---
## Your Freedom
You are free to:
* study the source code
* run the software for any purpose
* modify it in any way
* fork it
* redistribute it, with or without changes
* submit patches, extensions, or experiments
…subject only to the terms of the license.
These freedoms are foundational. They are not granted reluctantly, and they are not symbolic. They exist so that:
* you can understand what your software is really doing
* you are not trapped by vendor control
* the system can outlive its original authors
---
## Freedom Is Not the Same as Endorsement
While you are free to change anything, **not all changes are equally wise**.
Some parts of the framework are carefully constrained because they encode:
* security assumptions
* lifecycle invariants
* hard-won lessons from real systems under stress
You are free to violate these constraints in your own fork.
But the framework's documentation will often say things like:
* “do not modify this”
* “application code must not depend on this”
* “this table or class is framework-owned”
These statements are **technical guidance**, not legal restrictions.
They exist to answer the question:
> *If you want this system to remain upgradeable, predictable, and boring — what should you leave alone?*
---
## The Intended Social Contract
The framework makes a clear offer:
* We expose our internals so you can learn.
* We provide explicit extension points so you can adapt.
* We document invariants so you don't have to rediscover them the hard way.
In return, we ask that:
* application code respects documented boundaries
* extensions use explicit seams rather than hidden hooks
* patches that change invariants are proposed consciously, not accidentally
Nothing here is enforced by technical locks.
It is enforced — insofar as it is enforced at all — by clarity and shared expectations.
---
## Hacking Is Welcome
Exploration is not just allowed; it is encouraged.
Good reasons to hack on the framework include:
* understanding how it works
* evaluating whether its constraints make sense
* adapting it to unfamiliar environments
* testing alternative designs
* discovering better abstractions
Fork it. Instrument it. Break it. Learn from it.
Many of the framework's constraints exist *because* someone once ignored them and paid the price.
---
## Patches, Not Patches-in-Place
If you discover a problem or a better design:
* patches are welcome
* discussions are welcome
* disagreements are welcome
What is discouraged is **quietly patching around framework invariants inside application code**.
That approach:
* obscures intent
* creates one-off local truths
* makes systems harder to reason about
If the framework is wrong, it should be corrected *at the framework level*, or consciously forked.
---
## Why This Is Not a Contradiction
Strong opinions and free software are not enemies.
Freedom means you can change the software.
Responsibility means understanding what you are changing, and why.
A system that pretends every modification is equally safe is dishonest.
A system that hides its internals to prevent modification is hostile.
This framework aims for neither.

docs/groups-and-roles.md Normal file

@@ -0,0 +1,27 @@
- Role: a named bundle of responsibilities (editor, admin, member)
- Group: a scope or context (org, team, project, publication)
- Permission / Capability (capability preferred in code): a boolean fact about
allowed behavior
## tips
- In the database, capabilities are boolean values. Their names should be
verb-subject. Don't include `can` and definitely do not include `cannot`.
✔️ `edit_post`
❌ `cannot_remove_comment`
- The capabilities table is deliberately flat. If you need to group them, use
`.` as a delimiter and sort and filter accordingly in queries and in your
UI.
✔️ `blog.edit_post`
✔️ `blog.moderate_comment`
or
✔️ `blog.post.edit`
✔️ `blog.post.delete`
✔️ `blog.comment.moderate`
✔️ `blog.comment.edit`
are all fine.
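A flat, dot-delimited scheme like this can be grouped and filtered with plain string operations. A minimal TypeScript sketch (the `capabilities` array and the `capabilitiesUnder` helper are illustrative, not framework API):

```typescript
// Flat capability names, grouped only by the "." delimiter (illustrative data).
const capabilities: string[] = [
  "blog.post.edit",
  "blog.post.delete",
  "blog.comment.moderate",
  "billing.invoice.view",
];

// Every capability under a given prefix, sorted for display.
function capabilitiesUnder(prefix: string, all: string[]): string[] {
  return all.filter((name) => name.startsWith(`${prefix}.`)).sort();
}

capabilitiesUnder("blog.post", capabilities);
// → ["blog.post.delete", "blog.post.edit"]
```

The same prefix filter works in SQL with `LIKE 'blog.post.%'`, which is part of the point of keeping the table flat.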

docs/index.md Normal file

@@ -0,0 +1,17 @@
misc notes for now. of course this needs to be written up for real.
## execution context
The execution context represents facts such as the runtime directory, the
operating system, hardware, and filesystem layout, distinct from environment
variables or request-scoped context.
## philosophy
- TODO-DESIGN.md
- concentric-circles.md
- nomenclature.md
- mutability.md
- commands.md
- groups-and-roles.md


@@ -0,0 +1,34 @@
Some database tables are owned by diachron and some are owned by the
application.
This also applies to seeders: some are owned by diachron and some by the
application.
The database's structure is managed by migrations written in SQL.
Each migration gets its own file. These files' names should match
`yyyy-mm-dd_ss-description.sql`, e.g. `2026-01-01_01-users.sql`.
Files are sorted lexicographically by name and applied in order.
Note: in the future we may relax or modify the restriction on migration file
names, but they'll continue to be applied in lexicographical order.
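Because apply order is purely lexicographical, it can be computed with a plain filter-and-sort. A sketch (the `migrationOrder` helper and the strict name pattern are assumptions for illustration, not diachron's actual code):

```typescript
// Matches yyyy-mm-dd_ss-description.sql, e.g. 2026-01-01_01-users.sql
const MIGRATION_NAME = /^\d{4}-\d{2}-\d{2}_\d{2}-.+\.sql$/;

// Given directory entries, return migrations in apply order:
// keep well-formed names, then sort lexicographically.
function migrationOrder(entries: string[]): string[] {
  return entries.filter((name) => MIGRATION_NAME.test(name)).sort();
}

migrationOrder([
  "2026-01-02_01-sessions.sql",
  "2026-01-01_02-posts.sql",
  "2026-01-01_01-users.sql",
  "README.md",
]);
// → ["2026-01-01_01-users.sql", "2026-01-01_02-posts.sql", "2026-01-02_01-sessions.sql"]
```

The zero-padded `ss` sequence number is what keeps same-day migrations ordered correctly under a plain string sort.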
## framework and application migrations
Migrations owned by the framework are kept in a separate directory from those
owned by applications. Pending framework migrations, if any, are applied
before pending application migrations, if any.
diachron will go to some lengths to ensure that framework migrations do not
break applications.
## no downward migrations
diachron does not provide them. "The only way out is through."
When developing locally, you can use the command `develop reset-db`. **NEVER
USE THIS IN PRODUCTION!** Always be sure that you can "get back to where you
were". Being careful when creating migrations and seeders can help, but
dumping and restoring known-good copies of the database can also take you a
long way.

docs/mutability.md Normal file

@@ -0,0 +1 @@
Describe and define what is expected to be mutable and what is not.

docs/new-project.md Normal file

@@ -0,0 +1,84 @@
If any of the steps here don't work or are unclear in any way, it is
probably a bug and we want to fix it!
## how to create a new diachron project
1. Create an empty directory for your project. This directory can be inside a
git repository but it doesn't have to be.
2. Download the sync program and put it in the empty directory created in the
previous step. There is a sync program for every version of diachron.
You'll usually want to use the most recent stable version. [FIXME: explain
why you'd want to use something else.] And you'll want the version for
the operating system and hardware you're using.
3. Run the `sync` program. This program is [FIXME: will be] written in
[go](https://go.dev), so as long as you have downloaded the right file, it
ought to work.
This will create several files and directories. It will also download a number
of binaries and put them in different places in some of the directories
that are created.
4. At this point, you should have a usable, if not very useful, diachron
application. To see what it does, run the program `develop run`; it will run a
simple web application on localhost:3000. To make changes, have a look at
the files `src/app.ts` and `src/routes.ts`.
## where do we go from here?
Now that we have a very simple project, we need to attend to a few other
important matters.
### version control
(These instructions assume you're using git. If you're using a different
version control system then you will need to make allowances. In particular,
you should convert the `.gitignore` file to whatever your version control
system uses.)
You should add the whole directory to git and commit it. There will be two
`.gitignore` files, one in the root, and one in the `framework/` directory.
The root `.gitignore` created for you will be a good starting point, but you
can make changes to it as you see fit. However, you should not ever modify
`framework/.gitignore`. More on this in the next section.
### working with diachron
There are four commands to know about:
- `sync` is used to install all dependencies, including the ones you specify
as well as the ones that diachron provides
- `develop` is used to run "development-related" tasks. Run `develop help` to
get an overview of what it can do.
- `operate` is used to run "operations-related" tasks. Run `operate help` to
get an overview of what it can do.
- `cmd` runs diachron-managed commands, such as `pnpm`. When working on a
diachron project, you should always use these diachron-managed commands
instead of whatever else you may have available.
### what files belong to your project, what files belong to the framework
In a new diachron project, there are some files and directories that are
"owned" by the framework and others that are "owned" by the programmer.
In particular, you own everything in the directory `src/`. You own
`README.md` and `package.json` and `pnpm-lock.yaml`. You own any other files
or directories you create.
Everything else _belongs to the framework_ and you are not expected to change
it except when upgrading.
This is just an overview. It is exhaustively documented in
[ownership.md](ownership.md).
### updates
### when the docs sound a bit authoritarian...
Finally, remember that diachron's license allows you to do whatever you like
with it, with very few limitations. This includes making changes to files
about which, in the documentation, we say "you must not change" or "you are
not expected to change."

docs/nomenclature.md Normal file

@@ -0,0 +1,15 @@
We use `Call` and `Result` for our own types that wrap `Request` and
`Response`.
This hopefully will make things less confusing and avoid problems with shadowing.
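For instance, a handler written against the wrappers never mentions `Request` or `Response`, so the Express (and DOM fetch) names stay unshadowed. A trimmed sketch (the real `Call` and `Result` carry more fields than shown here):

```typescript
// Our wrapper around the incoming request (trimmed for illustration).
type Call = {
  path: string;
  method: "GET" | "POST";
};

// Our wrapper around what a handler produces.
type Result = {
  code: number;
  contentType: string;
  result: string;
};

// No Request/Response in sight, so nothing shadows the platform types.
const helloHandler = (call: Call): Result => ({
  code: 200,
  contentType: "text/plain",
  result: `hello from ${call.path}`,
});

helloHandler({ path: "/hi", method: "GET" });
// → { code: 200, contentType: "text/plain", result: "hello from /hi" }
```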
## meta
- We use _algorithmic complexity_ for performance discussions, when
things like Big-O come up, etc
- We use _conceptual complexity_ for design and architecture
- We use _cognitive load_ when talking about developer experience
- We use _operational burden_ when talking about production reality

docs/ownership.md Normal file

@@ -0,0 +1,219 @@
# Framework vs Application Ownership
This document defines **ownership boundaries** between the framework and application code. These boundaries are intentional and non-negotiable: they exist to preserve upgradeability, predictability, and developer sanity under stress.
Ownership answers a simple question:
> **Who is allowed to change this, and under what rules?**
The framework draws a hard line between *framework-owned* and *application-owned* concerns, while still encouraging extension through explicit, visible mechanisms.
---
## Core Principle
The framework is not a library of suggestions. It is a **runtime with invariants**.
Application code:
* **uses** the framework
* **extends** it through defined seams
* **never mutates or overrides its invariants**
Framework code:
* guarantees stable behavior
* owns critical lifecycle and security concerns
* must remain internally consistent across versions
Breaking this boundary creates systems that work *until they don't*, usually during upgrades or emergencies.
---
## Database Ownership
### Framework-Owned Tables
Certain database tables are **owned and managed exclusively by the framework**.
Examples (illustrative, not exhaustive):
* authentication primitives
* session or token state
* internal capability/permission metadata
* migration bookkeeping
* framework feature flags or invariants
#### Rules
Application code **must not**:
* modify schema
* add columns
* delete rows
* update rows directly
* rely on undocumented columns or behaviors
Application code **may**:
* read via documented framework APIs
* reference stable identifiers explicitly exposed by the framework
Think of these tables as **private internal state** — even though they live in your database.
> If the framework needs you to interact with this data, it will expose an API for it.
#### Rationale
These tables:
* encode security or correctness invariants
* may change structure across framework versions
* must remain globally coherent
Treating them as app-owned data tightly couples your app to framework internals and blocks safe upgrades.
---
### Application-Owned Tables
All domain data belongs to the application.
Examples:
* users (as domain actors, not auth primitives)
* posts, orders, comments, invoices
* business-specific joins and projections
* denormalized or performance-oriented tables
#### Rules
Application code:
* owns schema design
* owns migrations
* owns constraints and indexes
* may evolve these tables freely
The framework:
* never mutates application tables implicitly
* interacts only through explicit queries or contracts
#### Integration Pattern
Where framework concepts must relate to app data:
* use **foreign keys to framework-exposed identifiers**, or
* introduce **explicit join tables** owned by the application
No hidden coupling, no magic backfills.
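The "explicit join table" shape, sketched in TypeScript (every name here is invented for illustration; real integration would go through the framework's documented APIs and an application-owned migration):

```typescript
// Framework-exposed, stable identifier for a user (illustrative).
type UserId = string;

// Application-owned domain row.
type Post = { id: string; title: string };

// Application-owned join table: the only place the two worlds meet.
// The relationship is spelled out in app schema, not inferred.
type PostAuthor = { postId: string; authorId: UserId };

function authorsOf(postId: string, joins: PostAuthor[]): UserId[] {
  return joins.filter((j) => j.postId === postId).map((j) => j.authorId);
}

authorsOf("p1", [
  { postId: "p1", authorId: "u1" },
  { postId: "p2", authorId: "u2" },
]);
// → ["u1"]
```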
---
## Code Ownership
### Framework-Owned Code
Some classes, constants, and modules are **framework-owned**.
These include:
* core request/response abstractions
* auth and user primitives
* capability/permission evaluation logic
* lifecycle hooks
* low-level utilities relied on by the framework itself
#### Rules
Application code **must not**:
* modify framework source
* monkey-patch or override internals
* rely on undocumented behavior
* change constant values or internal defaults
Framework code is treated as **read-only** from the app's perspective.
---
### Extension Is Encouraged (But Explicit)
Ownership does **not** mean rigidity.
The framework is designed to be extended via **intentional seams**, such as:
* subclassing
* composition
* adapters
* delegation
* configuration objects
* explicit registration APIs
#### Preferred Patterns
* **Subclass when behavior is stable and conceptual**
* **Compose when behavior is contextual or optional**
* **Delegate when authority should remain with the framework**
What matters is that extension is:
* visible in code
* locally understandable
* reversible
No spooky action at a distance.
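As a concrete shape, here is the compose-and-delegate pattern with invented names (nothing here is framework API; it only illustrates the seam):

```typescript
// Framework-owned interface (illustrative): the seam we extend through.
interface Logger {
  log(text: string): void;
}

class ConsoleLogger implements Logger {
  log(text: string): void {
    console.log(text);
  }
}

// Application-owned extension by composition: wraps a Logger,
// adds a prefix, and delegates the actual work back to it.
class PrefixedLogger implements Logger {
  constructor(
    private inner: Logger,
    private prefix: string,
  ) {}
  log(text: string): void {
    this.inner.log(`[${this.prefix}] ${text}`);
  }
}

// Visible in code, locally understandable, reversible.
const logger: Logger = new PrefixedLogger(new ConsoleLogger(), "app");
logger.log("started");
// prints "[app] started"
```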
---
## What the App Owns Completely
The application fully owns:
* domain models and data shapes
* SQL queries and result parsing
* business rules
* authorization policy *inputs* (not the engine)
* rendering decisions
* feature flags specific to the app
* performance tradeoffs
The framework does not attempt to infer intent from your domain.
---
## What the Framework Guarantees
In return for respecting ownership boundaries, the framework guarantees:
* stable semantics across versions
* forward-only migrations for its own tables
* explicit deprecations
* no silent behavior changes
* identical runtime behavior in dev and prod
The framework may evolve internally — **but never by reaching into your app's data or code**.
---
## A Useful Mental Model
* Framework-owned things are **constitutional law**
* Application-owned things are **legislation**
You can write any laws you want — but you don't amend the constitution inline.
If you need a new power, the framework should expose it deliberately.
---
## Summary
* Ownership is about **who is allowed to change what**
* Framework-owned tables and code are read-only to the app
* Application-owned tables and code are sovereign
* Extension is encouraged, mutation is not
* Explicit seams beat clever hacks
Respecting these boundaries keeps systems boring — and boring systems survive stress.

docs/upgrades.md Normal file

@@ -0,0 +1 @@
.

express/.gitignore vendored Normal file

@@ -0,0 +1,2 @@
out/
dist/

express/app.ts Normal file

@@ -0,0 +1,173 @@
import express, {
type Request as ExpressRequest,
type Response as ExpressResponse,
} from "express";
import { match } from "path-to-regexp";
import { Session } from "./auth";
import { cli } from "./cli";
import { contentTypes } from "./content-types";
import { runWithContext } from "./context";
import { core } from "./core";
import { httpCodes } from "./http-codes";
import { request } from "./request";
import { routes } from "./routes";
// import { URLPattern } from 'node:url';
import {
AuthenticationRequired,
AuthorizationDenied,
type Call,
type InternalHandler,
isRedirect,
type Method,
massageMethod,
methodParser,
type ProcessedRoute,
type Result,
type Route,
} from "./types";
const app = express();
// Parse request bodies
app.use(express.json());
app.use(express.urlencoded({ extended: true }));
core.logging.log({ source: "logging", text: ["1"] });
const processedRoutes: { [K in Method]: ProcessedRoute[] } = {
GET: [],
POST: [],
PUT: [],
PATCH: [],
DELETE: [],
};
function _isPromise<T>(value: T | Promise<T>): value is Promise<T> {
return typeof (value as any)?.then === "function";
}
routes.forEach((route: Route, _idx: number, _allRoutes: Route[]) => {
// const pattern /*: URLPattern */ = new URLPattern({ pathname: route.path });
const matcher = match<Record<string, string>>(route.path);
const methodList = route.methods;
const handler: InternalHandler = async (
expressRequest: ExpressRequest,
): Promise<Result> => {
const method = massageMethod(expressRequest.method);
console.log("method", method);
if (!methodList.includes(method)) {
// XXX: Worth asserting this?
}
console.log("request.originalUrl", expressRequest.originalUrl);
// Authenticate the request
const auth = await request.auth.validateRequest(expressRequest);
const req: Call = {
pattern: route.path,
path: expressRequest.originalUrl,
method,
parameters: { one: 1, two: 2 },
request: expressRequest,
user: auth.user,
session: new Session(auth.session, auth.user),
};
try {
const retval = await runWithContext({ user: auth.user }, () =>
route.handler(req),
);
return retval;
} catch (error) {
// Handle authentication errors
if (error instanceof AuthenticationRequired) {
return {
code: httpCodes.clientErrors.Unauthorized,
contentType: contentTypes.application.json,
result: JSON.stringify({
error: "Authentication required",
}),
};
}
if (error instanceof AuthorizationDenied) {
return {
code: httpCodes.clientErrors.Forbidden,
contentType: contentTypes.application.json,
result: JSON.stringify({ error: "Access denied" }),
};
}
throw error;
}
};
for (const [_idx, method] of methodList.entries()) {
const pr: ProcessedRoute = { matcher, method, handler };
processedRoutes[method].push(pr);
}
});
async function handler(
req: ExpressRequest,
_res: ExpressResponse,
): Promise<Result> {
const method = await methodParser.parseAsync(req.method);
const byMethod = processedRoutes[method];
console.log(
"DEBUG: req.path =",
JSON.stringify(req.path),
"method =",
method,
);
for (const [_idx, pr] of byMethod.entries()) {
const match = pr.matcher(req.path);
console.log("DEBUG: trying pattern, match result =", match);
if (match) {
console.log("match", match);
const resp = await pr.handler(req);
return resp;
}
}
const retval: Result = {
code: httpCodes.clientErrors.NotFound,
contentType: contentTypes.text.plain,
result: "not found!",
};
return retval;
}
app.use(async (req: ExpressRequest, res: ExpressResponse) => {
const result0 = await handler(req, res);
const code = result0.code.code;
const result = result0.result;
console.log(result);
// Set any cookies from the result
if (result0.cookies) {
for (const cookie of result0.cookies) {
res.cookie(cookie.name, cookie.value, cookie.options ?? {});
}
}
if (isRedirect(result0)) {
res.redirect(code, result0.redirect);
} else {
res.status(code).send(result);
}
});
process.title = `diachron:${cli.listen.port}`;
app.listen(cli.listen.port, cli.listen.host, () => {
console.log(`Listening on ${cli.listen.host}:${cli.listen.port}`);
});

express/auth/index.ts Normal file

@@ -0,0 +1,20 @@
// index.ts
//
// Barrel export for auth module.
//
// NOTE: authRoutes is NOT exported here to avoid circular dependency:
// services.ts → auth/index.ts → auth/routes.ts → services.ts
// Import authRoutes directly from "./auth/routes" instead.
export { hashPassword, verifyPassword } from "./password";
export { type AuthResult, AuthService } from "./service";
export { type AuthStore, InMemoryAuthStore } from "./store";
export { generateToken, hashToken, SESSION_COOKIE_NAME } from "./token";
export {
type AuthMethod,
Session,
type SessionData,
type TokenId,
type TokenType,
tokenLifetimes,
} from "./types";

express/auth/password.ts Normal file

@@ -0,0 +1,70 @@
// password.ts
//
// Password hashing using Node.js scrypt (no external dependencies).
// Format: $scrypt$N$r$p$salt$hash (all base64)
import {
randomBytes,
type ScryptOptions,
scrypt,
timingSafeEqual,
} from "node:crypto";
// Configuration
const SALT_LENGTH = 32;
const KEY_LENGTH = 64;
const SCRYPT_PARAMS: ScryptOptions = {
N: 16384, // CPU/memory cost parameter (2^14)
r: 8, // Block size
p: 1, // Parallelization
};
// Promisified scrypt with options support
function scryptAsync(
password: string,
salt: Buffer,
keylen: number,
options: ScryptOptions,
): Promise<Buffer> {
return new Promise((resolve, reject) => {
scrypt(password, salt, keylen, options, (err, derivedKey) => {
if (err) {
reject(err);
} else {
resolve(derivedKey);
}
});
});
}
async function hashPassword(password: string): Promise<string> {
const salt = randomBytes(SALT_LENGTH);
const hash = await scryptAsync(password, salt, KEY_LENGTH, SCRYPT_PARAMS);
const { N, r, p } = SCRYPT_PARAMS;
return `$scrypt$${N}$${r}$${p}$${salt.toString("base64")}$${hash.toString("base64")}`;
}
async function verifyPassword(
password: string,
stored: string,
): Promise<boolean> {
const parts = stored.split("$");
if (parts[1] !== "scrypt" || parts.length !== 7) {
throw new Error("Invalid password hash format");
}
const [, , nStr, rStr, pStr, saltB64, hashB64] = parts;
const salt = Buffer.from(saltB64, "base64");
const storedHash = Buffer.from(hashB64, "base64");
const computedHash = await scryptAsync(password, salt, storedHash.length, {
N: parseInt(nStr, 10),
r: parseInt(rStr, 10),
p: parseInt(pStr, 10),
});
return timingSafeEqual(storedHash, computedHash);
}
export { hashPassword, verifyPassword };

express/auth/routes.ts Normal file

@@ -0,0 +1,231 @@
// routes.ts
//
// Authentication route handlers.
import { z } from "zod";
import { contentTypes } from "../content-types";
import { httpCodes } from "../http-codes";
import { request } from "../request";
import type { Call, Result, Route } from "../types";
import {
forgotPasswordInputParser,
loginInputParser,
registerInputParser,
resetPasswordInputParser,
} from "./types";
// Helper for JSON responses
const jsonResponse = (
code: (typeof httpCodes.success)[keyof typeof httpCodes.success],
data: object,
): Result => ({
code,
contentType: contentTypes.application.json,
result: JSON.stringify(data),
});
const errorResponse = (
code: (typeof httpCodes.clientErrors)[keyof typeof httpCodes.clientErrors],
error: string,
): Result => ({
code,
contentType: contentTypes.application.json,
result: JSON.stringify({ error }),
});
// POST /auth/login
const loginHandler = async (call: Call): Promise<Result> => {
try {
const body = call.request.body;
const { email, password } = loginInputParser.parse(body);
const result = await request.auth.login(email, password, "cookie", {
userAgent: call.request.get("User-Agent"),
ipAddress: call.request.ip,
});
if (!result.success) {
return errorResponse(
httpCodes.clientErrors.Unauthorized,
result.error,
);
}
return jsonResponse(httpCodes.success.OK, {
token: result.token,
user: {
id: result.user.id,
email: result.user.email,
displayName: result.user.displayName,
},
});
} catch (error) {
if (error instanceof z.ZodError) {
return errorResponse(
httpCodes.clientErrors.BadRequest,
"Invalid input",
);
}
throw error;
}
};
// POST /auth/logout
const logoutHandler = async (call: Call): Promise<Result> => {
const token = request.auth.extractToken(call.request);
if (token) {
await request.auth.logout(token);
}
return jsonResponse(httpCodes.success.OK, { message: "Logged out" });
};
// POST /auth/register
const registerHandler = async (call: Call): Promise<Result> => {
try {
const body = call.request.body;
const { email, password, displayName } =
registerInputParser.parse(body);
const result = await request.auth.register(
email,
password,
displayName,
);
if (!result.success) {
return errorResponse(httpCodes.clientErrors.Conflict, result.error);
}
// TODO: Send verification email with result.verificationToken
// For now, log it for development
console.log(
`[AUTH] Verification token for ${email}: ${result.verificationToken}`,
);
return jsonResponse(httpCodes.success.Created, {
message:
"Registration successful. Please check your email to verify your account.",
user: {
id: result.user.id,
email: result.user.email,
},
});
} catch (error) {
if (error instanceof z.ZodError) {
return errorResponse(
httpCodes.clientErrors.BadRequest,
"Invalid input",
);
}
throw error;
}
};
// POST /auth/forgot-password
const forgotPasswordHandler = async (call: Call): Promise<Result> => {
try {
const body = call.request.body;
const { email } = forgotPasswordInputParser.parse(body);
const result = await request.auth.createPasswordResetToken(email);
// Always return success (don't reveal if email exists)
if (result) {
// TODO: Send password reset email
console.log(
`[AUTH] Password reset token for ${email}: ${result.token}`,
);
}
return jsonResponse(httpCodes.success.OK, {
message:
"If an account exists with that email, a password reset link has been sent.",
});
} catch (error) {
if (error instanceof z.ZodError) {
return errorResponse(
httpCodes.clientErrors.BadRequest,
"Invalid input",
);
}
throw error;
}
};
// POST /auth/reset-password
const resetPasswordHandler = async (call: Call): Promise<Result> => {
try {
const body = call.request.body;
const { token, password } = resetPasswordInputParser.parse(body);
const result = await request.auth.resetPassword(token, password);
if (!result.success) {
return errorResponse(
httpCodes.clientErrors.BadRequest,
result.error,
);
}
return jsonResponse(httpCodes.success.OK, {
message:
"Password has been reset. You can now log in with your new password.",
});
} catch (error) {
if (error instanceof z.ZodError) {
return errorResponse(
httpCodes.clientErrors.BadRequest,
"Invalid input",
);
}
throw error;
}
};
// GET /auth/verify-email?token=xxx
const verifyEmailHandler = async (call: Call): Promise<Result> => {
const url = new URL(call.path, "http://localhost");
const token = url.searchParams.get("token");
if (!token) {
return errorResponse(
httpCodes.clientErrors.BadRequest,
"Missing token",
);
}
const result = await request.auth.verifyEmail(token);
if (!result.success) {
return errorResponse(httpCodes.clientErrors.BadRequest, result.error);
}
return jsonResponse(httpCodes.success.OK, {
message: "Email verified successfully. You can now log in.",
});
};
// Export routes
const authRoutes: Route[] = [
{ path: "/auth/login", methods: ["POST"], handler: loginHandler },
{ path: "/auth/logout", methods: ["POST"], handler: logoutHandler },
{ path: "/auth/register", methods: ["POST"], handler: registerHandler },
{
path: "/auth/forgot-password",
methods: ["POST"],
handler: forgotPasswordHandler,
},
{
path: "/auth/reset-password",
methods: ["POST"],
handler: resetPasswordHandler,
},
{
path: "/auth/verify-email",
methods: ["GET"],
handler: verifyEmailHandler,
},
];
export { authRoutes };

express/auth/service.ts Normal file

@@ -0,0 +1,262 @@
// service.ts
//
// Core authentication service providing login, logout, registration,
// password reset, and email verification.
import type { Request as ExpressRequest } from "express";
import {
type AnonymousUser,
anonymousUser,
type User,
type UserId,
} from "../user";
import { hashPassword, verifyPassword } from "./password";
import type { AuthStore } from "./store";
import {
hashToken,
parseAuthorizationHeader,
SESSION_COOKIE_NAME,
} from "./token";
import { type SessionData, type TokenId, tokenLifetimes } from "./types";
type LoginResult =
| { success: true; token: string; user: User }
| { success: false; error: string };
type RegisterResult =
| { success: true; user: User; verificationToken: string }
| { success: false; error: string };
type SimpleResult = { success: true } | { success: false; error: string };
// Result of validating a request/token - contains both user and session
export type AuthResult =
| { authenticated: true; user: User; session: SessionData }
| { authenticated: false; user: AnonymousUser; session: null };
export class AuthService {
constructor(private store: AuthStore) {}
// === Login ===
async login(
email: string,
password: string,
authMethod: "cookie" | "bearer",
metadata?: { userAgent?: string; ipAddress?: string },
): Promise<LoginResult> {
const user = await this.store.getUserByEmail(email);
if (!user) {
return { success: false, error: "Invalid credentials" };
}
if (!user.isActive()) {
return { success: false, error: "Account is not active" };
}
const passwordHash = await this.store.getUserPasswordHash(user.id);
if (!passwordHash) {
return { success: false, error: "Invalid credentials" };
}
const valid = await verifyPassword(password, passwordHash);
if (!valid) {
return { success: false, error: "Invalid credentials" };
}
const { token } = await this.store.createSession({
userId: user.id,
tokenType: "session",
authMethod,
expiresAt: new Date(Date.now() + tokenLifetimes.session),
userAgent: metadata?.userAgent,
ipAddress: metadata?.ipAddress,
});
return { success: true, token, user };
}
// === Session Validation ===
async validateRequest(request: ExpressRequest): Promise<AuthResult> {
// Try cookie first (for web requests)
let token = this.extractCookieToken(request);
// Fall back to Authorization header (for API requests)
if (!token) {
token = parseAuthorizationHeader(request.get("Authorization"));
}
if (!token) {
return { authenticated: false, user: anonymousUser, session: null };
}
return this.validateToken(token);
}
async validateToken(token: string): Promise<AuthResult> {
const tokenId = hashToken(token) as TokenId;
const session = await this.store.getSession(tokenId);
if (!session) {
return { authenticated: false, user: anonymousUser, session: null };
}
if (session.tokenType !== "session") {
return { authenticated: false, user: anonymousUser, session: null };
}
const user = await this.store.getUserById(session.userId as UserId);
if (!user || !user.isActive()) {
return { authenticated: false, user: anonymousUser, session: null };
}
// Update last used (fire and forget)
this.store.updateLastUsed(tokenId).catch(() => {});
return { authenticated: true, user, session };
}
private extractCookieToken(request: ExpressRequest): string | null {
const cookies = request.get("Cookie");
if (!cookies) {
return null;
}
for (const cookie of cookies.split(";")) {
const [name, ...valueParts] = cookie.trim().split("=");
if (name === SESSION_COOKIE_NAME) {
return valueParts.join("="); // Handle = in token value
}
}
return null;
}
// === Logout ===
async logout(token: string): Promise<void> {
const tokenId = hashToken(token) as TokenId;
await this.store.deleteSession(tokenId);
}
async logoutAllSessions(userId: UserId): Promise<number> {
return this.store.deleteUserSessions(userId);
}
// === Registration ===
async register(
email: string,
password: string,
displayName?: string,
): Promise<RegisterResult> {
const existing = await this.store.getUserByEmail(email);
if (existing) {
return { success: false, error: "Email already registered" };
}
const passwordHash = await hashPassword(password);
const user = await this.store.createUser({
email,
passwordHash,
displayName,
});
// Create email verification token
const { token: verificationToken } = await this.store.createSession({
userId: user.id,
tokenType: "email_verify",
authMethod: "bearer",
expiresAt: new Date(Date.now() + tokenLifetimes.email_verify),
});
return { success: true, user, verificationToken };
}
// === Email Verification ===
async verifyEmail(token: string): Promise<SimpleResult> {
const tokenId = hashToken(token) as TokenId;
const session = await this.store.getSession(tokenId);
if (!session || session.tokenType !== "email_verify") {
return {
success: false,
error: "Invalid or expired verification token",
};
}
if (session.isUsed) {
return { success: false, error: "Token already used" };
}
await this.store.updateUserEmailVerified(session.userId as UserId);
await this.store.deleteSession(tokenId);
return { success: true };
}
// === Password Reset ===
async createPasswordResetToken(
email: string,
): Promise<{ token: string } | null> {
const user = await this.store.getUserByEmail(email);
if (!user) {
// Don't reveal whether email exists
return null;
}
const { token } = await this.store.createSession({
userId: user.id,
tokenType: "password_reset",
authMethod: "bearer",
expiresAt: new Date(Date.now() + tokenLifetimes.password_reset),
});
return { token };
}
async resetPassword(
token: string,
newPassword: string,
): Promise<SimpleResult> {
const tokenId = hashToken(token) as TokenId;
const session = await this.store.getSession(tokenId);
if (!session || session.tokenType !== "password_reset") {
return { success: false, error: "Invalid or expired reset token" };
}
if (session.isUsed) {
return { success: false, error: "Token already used" };
}
const passwordHash = await hashPassword(newPassword);
await this.store.setUserPassword(
session.userId as UserId,
passwordHash,
);
// Invalidate all existing sessions (security: password changed)
await this.store.deleteUserSessions(session.userId as UserId);
// Delete the reset token
await this.store.deleteSession(tokenId);
return { success: true };
}
// === Token Extraction Helper (for routes) ===
extractToken(request: ExpressRequest): string | null {
// Try Authorization header first
const token = parseAuthorizationHeader(request.get("Authorization"));
if (token) {
return token;
}
// Try cookie
return this.extractCookieToken(request);
}
}

express/auth/store.ts Normal file

@@ -0,0 +1,164 @@
// store.ts
//
// Authentication storage interface and in-memory implementation.
// The interface allows easy migration to PostgreSQL later.
import { AuthenticatedUser, type User, type UserId } from "../user";
import { generateToken, hashToken } from "./token";
import type { AuthMethod, SessionData, TokenId, TokenType } from "./types";
// Data for creating a new session (tokenId generated internally)
export type CreateSessionData = {
userId: string;
tokenType: TokenType;
authMethod: AuthMethod;
expiresAt: Date;
userAgent?: string;
ipAddress?: string;
};
// Data for creating a new user
export type CreateUserData = {
email: string;
passwordHash: string;
displayName?: string;
};
// Abstract interface for auth storage - implement for PostgreSQL later
export interface AuthStore {
// Session operations
createSession(
data: CreateSessionData,
): Promise<{ token: string; session: SessionData }>;
getSession(tokenId: TokenId): Promise<SessionData | null>;
updateLastUsed(tokenId: TokenId): Promise<void>;
deleteSession(tokenId: TokenId): Promise<void>;
deleteUserSessions(userId: UserId): Promise<number>;
// User operations
getUserByEmail(email: string): Promise<User | null>;
getUserById(userId: UserId): Promise<User | null>;
createUser(data: CreateUserData): Promise<User>;
getUserPasswordHash(userId: UserId): Promise<string | null>;
setUserPassword(userId: UserId, passwordHash: string): Promise<void>;
updateUserEmailVerified(userId: UserId): Promise<void>;
}
// In-memory implementation for development
export class InMemoryAuthStore implements AuthStore {
private sessions: Map<string, SessionData> = new Map();
private users: Map<string, User> = new Map();
private usersByEmail: Map<string, string> = new Map();
private passwordHashes: Map<string, string> = new Map();
private emailVerified: Map<string, boolean> = new Map();
async createSession(
data: CreateSessionData,
): Promise<{ token: string; session: SessionData }> {
const token = generateToken();
const tokenId = hashToken(token);
const session: SessionData = {
tokenId,
userId: data.userId,
tokenType: data.tokenType,
authMethod: data.authMethod,
createdAt: new Date(),
expiresAt: data.expiresAt,
userAgent: data.userAgent,
ipAddress: data.ipAddress,
};
this.sessions.set(tokenId, session);
return { token, session };
}
async getSession(tokenId: TokenId): Promise<SessionData | null> {
const session = this.sessions.get(tokenId);
if (!session) {
return null;
}
// Check expiration
if (new Date() > session.expiresAt) {
this.sessions.delete(tokenId);
return null;
}
return session;
}
async updateLastUsed(tokenId: TokenId): Promise<void> {
const session = this.sessions.get(tokenId);
if (session) {
session.lastUsedAt = new Date();
}
}
async deleteSession(tokenId: TokenId): Promise<void> {
this.sessions.delete(tokenId);
}
async deleteUserSessions(userId: UserId): Promise<number> {
let count = 0;
for (const [tokenId, session] of this.sessions) {
if (session.userId === userId) {
this.sessions.delete(tokenId);
count++;
}
}
return count;
}
async getUserByEmail(email: string): Promise<User | null> {
const userId = this.usersByEmail.get(email.toLowerCase());
if (!userId) {
return null;
}
return this.users.get(userId) ?? null;
}
async getUserById(userId: UserId): Promise<User | null> {
return this.users.get(userId) ?? null;
}
async createUser(data: CreateUserData): Promise<User> {
const user = AuthenticatedUser.create(data.email, {
displayName: data.displayName,
status: "pending", // Pending until email verified
});
this.users.set(user.id, user);
this.usersByEmail.set(data.email.toLowerCase(), user.id);
this.passwordHashes.set(user.id, data.passwordHash);
this.emailVerified.set(user.id, false);
return user;
}
async getUserPasswordHash(userId: UserId): Promise<string | null> {
return this.passwordHashes.get(userId) ?? null;
}
async setUserPassword(userId: UserId, passwordHash: string): Promise<void> {
this.passwordHashes.set(userId, passwordHash);
}
async updateUserEmailVerified(userId: UserId): Promise<void> {
this.emailVerified.set(userId, true);
// Update user status to active
const user = this.users.get(userId);
if (user) {
// Create new user with active status
const updatedUser = AuthenticatedUser.create(user.email, {
id: user.id,
displayName: user.displayName,
status: "active",
roles: [...user.roles],
permissions: [...user.permissions],
});
this.users.set(userId, updatedUser);
}
}
}

express/auth/token.ts Normal file

@@ -0,0 +1,42 @@
// token.ts
//
// Token generation and hashing utilities for authentication.
// Raw tokens are never stored - only their SHA-256 hashes.
import { createHash, randomBytes } from "node:crypto";
const TOKEN_BYTES = 32; // 256 bits of entropy
// Generate a cryptographically secure random token
function generateToken(): string {
return randomBytes(TOKEN_BYTES).toString("base64url");
}
// Hash token for storage (never store raw tokens)
function hashToken(token: string): string {
return createHash("sha256").update(token).digest("hex");
}
// Parse token from Authorization header
function parseAuthorizationHeader(header: string | undefined): string | null {
if (!header) {
return null;
}
const parts = header.split(" ");
if (parts.length !== 2 || parts[0].toLowerCase() !== "bearer") {
return null;
}
return parts[1];
}
// Cookie name for web sessions
const SESSION_COOKIE_NAME = "diachron_session";
export {
generateToken,
hashToken,
parseAuthorizationHeader,
SESSION_COOKIE_NAME,
};
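The scheme in token.ts (issue a 32-byte base64url token, store only its SHA-256 hash) can be exercised in isolation. This is a standalone sketch that mirrors the file's helpers rather than importing the framework module; `timingSafeEqual` for the comparison is an addition here, not something the file above does.

```typescript
import { createHash, randomBytes, timingSafeEqual } from "node:crypto";

const TOKEN_BYTES = 32; // 256 bits of entropy

// Issue a raw token to the client; keep only its hash server-side.
function generateToken(): string {
    return randomBytes(TOKEN_BYTES).toString("base64url");
}

function hashToken(token: string): string {
    return createHash("sha256").update(token).digest("hex");
}

// The lookup key is the hash, so a leaked store never exposes usable tokens.
const raw = generateToken();
const stored = hashToken(raw);

// On a later request, re-hash the presented token and compare hashes.
const presented = raw;
const matches = timingSafeEqual(
    Buffer.from(hashToken(presented), "hex"),
    Buffer.from(stored, "hex"),
);
console.log(matches); // true
```

Note the raw token is 43 characters (32 bytes, base64url, no padding) while the stored hash is always 64 hex characters.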

express/auth/types.ts Normal file

@@ -0,0 +1,96 @@
// types.ts
//
// Authentication types and Zod schemas.
import { z } from "zod";
// Branded type for token IDs (the hash, not the raw token)
export type TokenId = string & { readonly __brand: "TokenId" };
// Token types for different purposes
export const tokenTypeParser = z.enum([
"session",
"password_reset",
"email_verify",
]);
export type TokenType = z.infer<typeof tokenTypeParser>;
// Authentication method - how the token was delivered
export const authMethodParser = z.enum(["cookie", "bearer"]);
export type AuthMethod = z.infer<typeof authMethodParser>;
// Session data schema - what gets stored
export const sessionDataParser = z.object({
tokenId: z.string().min(1),
userId: z.string().min(1),
tokenType: tokenTypeParser,
authMethod: authMethodParser,
createdAt: z.coerce.date(),
expiresAt: z.coerce.date(),
lastUsedAt: z.coerce.date().optional(),
userAgent: z.string().optional(),
ipAddress: z.string().optional(),
isUsed: z.boolean().optional(), // For one-time tokens
});
export type SessionData = z.infer<typeof sessionDataParser>;
// Input validation schemas for auth endpoints
export const loginInputParser = z.object({
email: z.string().email(),
password: z.string().min(1),
});
export const registerInputParser = z.object({
email: z.string().email(),
password: z.string().min(8),
displayName: z.string().optional(),
});
export const forgotPasswordInputParser = z.object({
email: z.string().email(),
});
export const resetPasswordInputParser = z.object({
token: z.string().min(1),
password: z.string().min(8),
});
// Token lifetimes in milliseconds
export const tokenLifetimes: Record<TokenType, number> = {
session: 30 * 24 * 60 * 60 * 1000, // 30 days
password_reset: 1 * 60 * 60 * 1000, // 1 hour
email_verify: 24 * 60 * 60 * 1000, // 24 hours
};
// Import here to avoid circular dependency at module load time
import type { User } from "../user";
// Session wrapper class providing a consistent interface for handlers.
// Always present on Call (never null), but may represent an anonymous session.
export class Session {
constructor(
private readonly data: SessionData | null,
private readonly user: User,
) {}
getUser(): User {
return this.user;
}
getData(): SessionData | null {
return this.data;
}
isAuthenticated(): boolean {
return !this.user.isAnonymous();
}
get tokenId(): string | undefined {
return this.data?.tokenId;
}
get userId(): string | undefined {
return this.data?.userId;
}
}
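The `tokenLifetimes` table above is consumed as plain millisecond arithmetic (e.g. `new Date(Date.now() + tokenLifetimes.password_reset)` in the service). A standalone sketch of that computation; `expiryFor` is a name invented here for illustration:

```typescript
// Token lifetimes in milliseconds, mirroring the table above.
const tokenLifetimes = {
    session: 30 * 24 * 60 * 60 * 1000, // 30 days
    password_reset: 1 * 60 * 60 * 1000, // 1 hour
    email_verify: 24 * 60 * 60 * 1000, // 24 hours
} as const;

// Compute an absolute expiry instant for a newly issued token.
function expiryFor(kind: keyof typeof tokenLifetimes, now = Date.now()): Date {
    return new Date(now + tokenLifetimes[kind]);
}

// A password-reset token issued at noon UTC expires exactly one hour later.
const issuedAt = Date.UTC(2026, 0, 1, 12, 0, 0);
const expires = expiryFor("password_reset", issuedAt);
console.log(expires.toISOString()); // 2026-01-01T13:00:00.000Z
```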

express/basic/login.ts Normal file

@@ -0,0 +1,62 @@
import { SESSION_COOKIE_NAME } from "../auth/token";
import { tokenLifetimes } from "../auth/types";
import { request } from "../request";
import { html, redirect, render } from "../request/util";
import type { Call, Result, Route } from "../types";
const loginHandler = async (call: Call): Promise<Result> => {
if (call.method === "GET") {
const c = await render("basic/login", {});
return html(c);
}
// POST - handle login
const { email, password } = call.request.body;
if (!email || !password) {
const c = await render("basic/login", {
error: "Email and password are required",
email,
});
return html(c);
}
const result = await request.auth.login(email, password, "cookie", {
userAgent: call.request.get("User-Agent"),
ipAddress: call.request.ip,
});
if (!result.success) {
const c = await render("basic/login", {
error: result.error,
email,
});
return html(c);
}
// Success - set cookie and redirect to home
const redirectResult = redirect("/");
redirectResult.cookies = [
{
name: SESSION_COOKIE_NAME,
value: result.token,
options: {
httpOnly: true,
secure: false, // Set to true in production with HTTPS
sameSite: "lax",
maxAge: tokenLifetimes.session,
path: "/",
},
},
];
return redirectResult;
};
const loginRoute: Route = {
path: "/login",
methods: ["GET", "POST"],
handler: loginHandler,
};
export { loginRoute };

express/basic/logout.ts Normal file

@@ -0,0 +1,38 @@
import { SESSION_COOKIE_NAME } from "../auth/token";
import { request } from "../request";
import { redirect } from "../request/util";
import type { Call, Result, Route } from "../types";
const logoutHandler = async (call: Call): Promise<Result> => {
// Extract token from cookie and invalidate the session
const token = request.auth.extractToken(call.request);
if (token) {
await request.auth.logout(token);
}
// Clear the cookie and redirect to login
const redirectResult = redirect("/login");
redirectResult.cookies = [
{
name: SESSION_COOKIE_NAME,
value: "",
options: {
httpOnly: true,
secure: false,
sameSite: "lax",
maxAge: 0,
path: "/",
},
},
];
return redirectResult;
};
const logoutRoute: Route = {
path: "/logout",
methods: ["GET", "POST"],
handler: logoutHandler,
};
export { logoutRoute };

express/basic/routes.ts Normal file

@@ -0,0 +1,43 @@
import { DateTime } from "ts-luxon";
import { request } from "../request";
import { html, render } from "../request/util";
import type { Call, Result, Route } from "../types";
import { loginRoute } from "./login";
import { logoutRoute } from "./logout";
const routes: Record<string, Route> = {
hello: {
path: "/hello",
methods: ["GET"],
handler: async (_call: Call): Promise<Result> => {
const now = DateTime.now();
const c = await render("basic/hello", { now });
return html(c);
},
},
home: {
path: "/",
methods: ["GET"],
handler: async (_call: Call): Promise<Result> => {
const _auth = request.auth;
const me = request.session.getUser();
const email = me.toString();
const showLogin = me.isAnonymous();
const showLogout = !me.isAnonymous();
const c = await render("basic/home", {
email,
showLogin,
showLogout,
});
return html(c);
},
},
login: loginRoute,
logout: logoutRoute,
};
export { routes };

express/biome.jsonc Normal file

@@ -0,0 +1,39 @@
{
"$schema": "https://biomejs.dev/schemas/2.3.10/schema.json",
"vcs": {
"enabled": true,
"clientKind": "git",
"useIgnoreFile": true
},
"files": {
"includes": ["**", "!!**/dist"]
},
"formatter": {
"enabled": true,
"indentStyle": "space",
"indentWidth": 4
},
"linter": {
"enabled": true,
"rules": {
"recommended": true,
"style": {
"useBlockStatements": "error"
}
}
},
"javascript": {
"formatter": {
"quoteStyle": "double"
}
},
"assist": {
"enabled": true,
"actions": {
"source": {
"organizeImports": "on"
}
}
}
}

express/build.sh Executable file

@@ -0,0 +1,9 @@
#!/bin/bash
set -eu
DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
cd "$DIR"
../cmd pnpm ncc build ./app.ts -o dist

express/check.sh Executable file

@@ -0,0 +1,14 @@
#!/bin/bash
set -eu
DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
check_dir="$DIR"
out_dir="$check_dir/out"
source "$check_dir"/../framework/shims/common
source "$check_dir"/../framework/shims/node.common
$ROOT/cmd pnpm tsc --outDir "$out_dir"

express/cli.ts Normal file

@@ -0,0 +1,55 @@
import { parseArgs } from "node:util";
const { values } = parseArgs({
options: {
listen: {
type: "string",
short: "l",
},
"log-address": {
type: "string",
default: "8085",
},
},
strict: true,
allowPositionals: false,
});
function parseListenAddress(listen: string | undefined): {
host: string;
port: number;
} {
const defaultHost = "127.0.0.1";
const defaultPort = 3500;
if (!listen) {
return { host: defaultHost, port: defaultPort };
}
const lastColon = listen.lastIndexOf(":");
if (lastColon === -1) {
// Just a port number
const port = parseInt(listen, 10);
if (Number.isNaN(port)) {
throw new Error(`Invalid listen address: ${listen}`);
}
return { host: defaultHost, port };
}
const host = listen.slice(0, lastColon);
const port = parseInt(listen.slice(lastColon + 1), 10);
if (Number.isNaN(port)) {
throw new Error(`Invalid port in listen address: ${listen}`);
}
return { host, port };
}
const listenAddress = parseListenAddress(values.listen);
const logAddress = parseListenAddress(values["log-address"]);
export const cli = {
listen: listenAddress,
logAddress,
};
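The `--listen` handling above accepts either a bare port or `host:port`, splitting at the last colon so the host part may itself contain colons. A self-contained copy of the parser to show the three cases (this mirrors cli.ts rather than importing it):

```typescript
// Parse a listen address of the form "port" or "host:port".
function parseListenAddress(listen: string | undefined): {
    host: string;
    port: number;
} {
    const defaultHost = "127.0.0.1";
    const defaultPort = 3500;
    if (!listen) {
        return { host: defaultHost, port: defaultPort };
    }
    const lastColon = listen.lastIndexOf(":");
    if (lastColon === -1) {
        // A bare number is treated as a port on the default host.
        const port = parseInt(listen, 10);
        if (Number.isNaN(port)) {
            throw new Error(`Invalid listen address: ${listen}`);
        }
        return { host: defaultHost, port };
    }
    const host = listen.slice(0, lastColon);
    const port = parseInt(listen.slice(lastColon + 1), 10);
    if (Number.isNaN(port)) {
        throw new Error(`Invalid port in listen address: ${listen}`);
    }
    return { host, port };
}

console.log(parseListenAddress(undefined)); // { host: "127.0.0.1", port: 3500 }
console.log(parseListenAddress("8080")); // { host: "127.0.0.1", port: 8080 }
console.log(parseListenAddress("0.0.0.0:9000")); // { host: "0.0.0.0", port: 9000 }
```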

express/content-types.ts Normal file

@@ -0,0 +1,122 @@
// This file belongs to the framework. You are not expected to modify it.
export type ContentType = string;
// tx claude https://claude.ai/share/344fc7bd-5321-4763-af2f-b82275e9f865
const contentTypes = {
text: {
plain: "text/plain",
html: "text/html",
css: "text/css",
javascript: "text/javascript",
xml: "text/xml",
csv: "text/csv",
markdown: "text/markdown",
calendar: "text/calendar",
},
image: {
jpeg: "image/jpeg",
png: "image/png",
gif: "image/gif",
svgPlusXml: "image/svg+xml",
webp: "image/webp",
bmp: "image/bmp",
ico: "image/x-icon",
tiff: "image/tiff",
avif: "image/avif",
},
audio: {
mpeg: "audio/mpeg",
wav: "audio/wav",
ogg: "audio/ogg",
webm: "audio/webm",
aac: "audio/aac",
midi: "audio/midi",
opus: "audio/opus",
flac: "audio/flac",
},
video: {
mp4: "video/mp4",
webm: "video/webm",
xMsvideo: "video/x-msvideo",
mpeg: "video/mpeg",
ogg: "video/ogg",
quicktime: "video/quicktime",
xMatroska: "video/x-matroska",
},
application: {
json: "application/json",
pdf: "application/pdf",
zip: "application/zip",
xWwwFormUrlencoded: "application/x-www-form-urlencoded",
octetStream: "application/octet-stream",
xml: "application/xml",
gzip: "application/gzip",
javascript: "application/javascript",
ld_json: "application/ld+json",
msword: "application/msword",
vndOpenxmlformatsOfficedocumentWordprocessingmlDocument:
"application/vnd.openxmlformats-officedocument.wordprocessingml.document",
vndMsExcel: "application/vnd.ms-excel",
vndOpenxmlformatsOfficedocumentSpreadsheetmlSheet:
"application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
vndMsPowerpoint: "application/vnd.ms-powerpoint",
vndOpenxmlformatsOfficedocumentPresentationmlPresentation:
"application/vnd.openxmlformats-officedocument.presentationml.presentation",
sql: "application/sql",
graphql: "application/graphql",
wasm: "application/wasm",
xTar: "application/x-tar",
x7zCompressed: "application/x-7z-compressed",
xRarCompressed: "application/x-rar-compressed",
},
multipart: {
formData: "multipart/form-data",
byteranges: "multipart/byteranges",
},
font: {
woff: "font/woff",
woff2: "font/woff2",
ttf: "font/ttf",
otf: "font/otf",
},
};
export { contentTypes };
/*
possible additions for later
Looking at what's there, here are a few gaps that might be worth filling:
Streaming/Modern Web:
application/x-ndjson or application/jsonlines - newline-delimited JSON (popular for streaming APIs)
text/event-stream - Server-Sent Events
API/Data Exchange:
application/yaml or text/yaml - YAML files
application/protobuf - Protocol Buffers
application/msgpack - MessagePack
Archives you're missing:
application/x-bzip2 - bzip2 compression
Images:
image/heic - HEIC/HEIF (common on iOS)
Fonts:
application/vnd.ms-fontobject - EOT fonts (legacy but still seen)
Text:
text/rtf - Rich Text Format
The most impactful would probably be text/event-stream (if you do any SSE), application/x-ndjson (common in modern APIs), and maybe text/yaml. The rest are more situational.
But honestly, what you have covers 95% of common web development scenarios. You can definitely add as you go when you encounter specific needs!
*/

express/context.ts Normal file

@@ -0,0 +1,27 @@
// context.ts
//
// Request-scoped context using AsyncLocalStorage.
// Allows services to access request data (like the current user) without
// needing to pass Call through every function.
import { AsyncLocalStorage } from "node:async_hooks";
import { anonymousUser, type User } from "./user";
type RequestContext = {
user: User;
};
const asyncLocalStorage = new AsyncLocalStorage<RequestContext>();
// Run a function within a request context
function runWithContext<T>(context: RequestContext, fn: () => T): T {
return asyncLocalStorage.run(context, fn);
}
// Get the current user from context, or AnonymousUser if not in a request
function getCurrentUser(): User {
const context = asyncLocalStorage.getStore();
return context?.user ?? anonymousUser;
}
export { getCurrentUser, runWithContext, type RequestContext };
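The context.ts pattern — an `AsyncLocalStorage` carrying request state so services deep in the call stack can read the current user without threading `Call` through every function — demonstrated self-contained. The string-valued context here is a simplification standing in for the framework's `User` type:

```typescript
import { AsyncLocalStorage } from "node:async_hooks";

// Simplified stand-in for the framework's RequestContext/User types.
type Ctx = { user: string };

const als = new AsyncLocalStorage<Ctx>();

// Outside a request scope, fall back to an anonymous identity.
function currentUser(): string {
    return als.getStore()?.user ?? "anonymous";
}

// A "service" several frames down reads the user with no parameters.
function serviceCall(): string {
    return `hello, ${currentUser()}`;
}

console.log(serviceCall()); // hello, anonymous
const inRequest = als.run({ user: "alice" }, () => serviceCall());
console.log(inRequest); // hello, alice
console.log(serviceCall()); // hello, anonymous (context ended with run())
```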

express/core/index.ts Normal file

@@ -0,0 +1,48 @@
import nunjucks from "nunjucks";
import { db, migrate, migrationStatus } from "../database";
import { getLogs, log } from "../logging";
// FIXME: This doesn't belong here; move it somewhere else.
const conf = {
templateEngine: () => {
return {
renderTemplate: (template: string, context: object) => {
return nunjucks.renderString(template, context);
},
};
},
};
const database = {
db,
migrate,
migrationStatus,
};
const logging = {
log,
getLogs,
};
const random = {
randomNumber: () => {
return Math.random();
},
};
const misc = {
sleep: (ms: number) => {
return new Promise((resolve) => setTimeout(resolve, ms));
},
};
// Keep this asciibetically sorted
const core = {
conf,
database,
logging,
misc,
random,
};
export { core };

express/database.ts Normal file

@@ -0,0 +1,548 @@
// database.ts
// PostgreSQL database access with Kysely query builder and simple migrations
import * as fs from "node:fs";
import * as path from "node:path";
import {
type Generated,
Kysely,
PostgresDialect,
type Selectable,
sql,
} from "kysely";
import { Pool } from "pg";
import type {
AuthStore,
CreateSessionData,
CreateUserData,
} from "./auth/store";
import { generateToken, hashToken } from "./auth/token";
import type { SessionData, TokenId } from "./auth/types";
import type { Domain } from "./types";
import { AuthenticatedUser, type User, type UserId } from "./user";
// Connection configuration
const connectionConfig = {
host: "localhost",
port: 5432,
user: "diachron",
password: "diachron",
database: "diachron",
};
// Database schema types for Kysely
// Generated<T> marks columns with database defaults (optional on insert)
interface UsersTable {
id: string;
status: Generated<string>;
display_name: string | null;
created_at: Generated<Date>;
updated_at: Generated<Date>;
}
interface UserEmailsTable {
id: string;
user_id: string;
email: string;
normalized_email: string;
is_primary: Generated<boolean>;
is_verified: Generated<boolean>;
created_at: Generated<Date>;
verified_at: Date | null;
revoked_at: Date | null;
}
interface UserCredentialsTable {
id: string;
user_id: string;
credential_type: Generated<string>;
password_hash: string | null;
created_at: Generated<Date>;
updated_at: Generated<Date>;
}
interface SessionsTable {
id: Generated<string>;
token_hash: string;
user_id: string;
user_email_id: string | null;
token_type: string;
auth_method: string;
created_at: Generated<Date>;
expires_at: Date;
revoked_at: Date | null;
ip_address: string | null;
user_agent: string | null;
is_used: Generated<boolean | null>;
}
interface Database {
users: UsersTable;
user_emails: UserEmailsTable;
user_credentials: UserCredentialsTable;
sessions: SessionsTable;
}
// Create the connection pool
const pool = new Pool(connectionConfig);
// Create the Kysely instance
const db = new Kysely<Database>({
dialect: new PostgresDialect({ pool }),
});
// Raw pool access for when you need it
const rawPool = pool;
// Execute raw SQL (for when Kysely doesn't fit)
async function raw<T = unknown>(
query: string,
params: unknown[] = [],
): Promise<T[]> {
const result = await pool.query(query, params);
return result.rows as T[];
}
// ============================================================================
// Migrations
// ============================================================================
// Migration file naming convention:
// yyyy-mm-dd_ss_description.sql
// e.g., 2025-01-15_01_initial.sql, 2025-01-15_02_add_users.sql
//
// Migrations directory: express/migrations/
const FRAMEWORK_MIGRATIONS_DIR = path.join(__dirname, "framework/migrations");
const APP_MIGRATIONS_DIR = path.join(__dirname, "migrations");
const MIGRATIONS_TABLE = "_migrations";
interface MigrationRecord {
id: number;
name: string;
applied_at: Date;
}
// Ensure migrations table exists
async function ensureMigrationsTable(): Promise<void> {
await pool.query(`
CREATE TABLE IF NOT EXISTS ${MIGRATIONS_TABLE} (
id SERIAL PRIMARY KEY,
name TEXT NOT NULL UNIQUE,
applied_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
)
`);
}
// Get list of applied migrations
async function getAppliedMigrations(): Promise<string[]> {
const result = await pool.query<MigrationRecord>(
`SELECT name FROM ${MIGRATIONS_TABLE} ORDER BY name`,
);
return result.rows.map((r) => r.name);
}
// Get pending migration files
function getMigrationFiles(kind: Domain): string[] {
const dir = kind === "fw" ? FRAMEWORK_MIGRATIONS_DIR : APP_MIGRATIONS_DIR;
if (!fs.existsSync(dir)) {
return [];
}
const root = __dirname;
const mm = fs
.readdirSync(dir)
.filter((f) => f.endsWith(".sql"))
.filter((f) => /^\d{4}-\d{2}-\d{2}_\d{2}_/.test(f))
.map((f) => `${dir}/${f}`)
.map((f) => f.replace(`${root}/`, ""))
.sort();
return mm;
}
// Run a single migration
async function runMigration(filename: string): Promise<void> {
const filepath = filename;
const content = fs.readFileSync(filepath, "utf-8");
process.stdout.write(` Migration: ${filename}...`);
// Run migration in a transaction
const client = await pool.connect();
try {
await client.query("BEGIN");
await client.query(content);
await client.query(
`INSERT INTO ${MIGRATIONS_TABLE} (name) VALUES ($1)`,
[filename],
);
await client.query("COMMIT");
console.log(" ✓");
} catch (err) {
console.log(" ✗");
const message = err instanceof Error ? err.message : String(err);
console.error(` Error: ${message}`);
await client.query("ROLLBACK");
throw err;
} finally {
client.release();
}
}
function getAllMigrationFiles() {
const fw_files = getMigrationFiles("fw");
const app_files = getMigrationFiles("app");
const all = [...fw_files, ...app_files];
return all;
}
// Run all pending migrations
async function migrate(): Promise<void> {
await ensureMigrationsTable();
const applied = new Set(await getAppliedMigrations());
const all = getAllMigrationFiles();
const pending = all.filter((name) => !applied.has(name));
if (pending.length === 0) {
console.log("No pending migrations");
return;
}
console.log(`Applying ${pending.length} migration(s):`);
for (const file of pending) {
await runMigration(file);
}
}
// List migration status
async function migrationStatus(): Promise<{
applied: string[];
pending: string[];
}> {
await ensureMigrationsTable();
const applied = new Set(await getAppliedMigrations());
const ff = getAllMigrationFiles();
return {
applied: ff.filter((name) => applied.has(name)),
pending: ff.filter((name) => !applied.has(name)),
};
}
// ============================================================================
// PostgresAuthStore - Database-backed authentication storage
// ============================================================================
class PostgresAuthStore implements AuthStore {
// Session operations
async createSession(
data: CreateSessionData,
): Promise<{ token: string; session: SessionData }> {
const token = generateToken();
const tokenHash = hashToken(token);
const row = await db
.insertInto("sessions")
.values({
token_hash: tokenHash,
user_id: data.userId,
token_type: data.tokenType,
auth_method: data.authMethod,
expires_at: data.expiresAt,
user_agent: data.userAgent ?? null,
ip_address: data.ipAddress ?? null,
})
.returningAll()
.executeTakeFirstOrThrow();
const session: SessionData = {
tokenId: row.token_hash,
userId: row.user_id,
tokenType: row.token_type as SessionData["tokenType"],
authMethod: row.auth_method as SessionData["authMethod"],
createdAt: row.created_at,
expiresAt: row.expires_at,
userAgent: row.user_agent ?? undefined,
ipAddress: row.ip_address ?? undefined,
isUsed: row.is_used ?? undefined,
};
return { token, session };
}
async getSession(tokenId: TokenId): Promise<SessionData | null> {
const row = await db
.selectFrom("sessions")
.selectAll()
.where("token_hash", "=", tokenId)
.where("expires_at", ">", new Date())
.where("revoked_at", "is", null)
.executeTakeFirst();
if (!row) {
return null;
}
return {
tokenId: row.token_hash,
userId: row.user_id,
tokenType: row.token_type as SessionData["tokenType"],
authMethod: row.auth_method as SessionData["authMethod"],
createdAt: row.created_at,
expiresAt: row.expires_at,
userAgent: row.user_agent ?? undefined,
ipAddress: row.ip_address ?? undefined,
isUsed: row.is_used ?? undefined,
};
}
async updateLastUsed(_tokenId: TokenId): Promise<void> {
// The new schema doesn't have last_used_at column
// This is now a no-op; session activity tracking could be added later
}
async deleteSession(tokenId: TokenId): Promise<void> {
// Soft delete by setting revoked_at
await db
.updateTable("sessions")
.set({ revoked_at: new Date() })
.where("token_hash", "=", tokenId)
.execute();
}
async deleteUserSessions(userId: UserId): Promise<number> {
const result = await db
.updateTable("sessions")
.set({ revoked_at: new Date() })
.where("user_id", "=", userId)
.where("revoked_at", "is", null)
.executeTakeFirst();
return Number(result.numUpdatedRows);
}
// User operations
async getUserByEmail(email: string): Promise<User | null> {
// Find user through user_emails table
const normalizedEmail = email.toLowerCase().trim();
const row = await db
.selectFrom("user_emails")
.innerJoin("users", "users.id", "user_emails.user_id")
.select([
"users.id",
"users.status",
"users.display_name",
"users.created_at",
"users.updated_at",
"user_emails.email",
])
.where("user_emails.normalized_email", "=", normalizedEmail)
.where("user_emails.revoked_at", "is", null)
.executeTakeFirst();
if (!row) {
return null;
}
return this.rowToUser(row);
}
async getUserById(userId: UserId): Promise<User | null> {
// Get user with their primary email
const row = await db
.selectFrom("users")
.leftJoin("user_emails", (join) =>
join
.onRef("user_emails.user_id", "=", "users.id")
.on("user_emails.is_primary", "=", true)
.on("user_emails.revoked_at", "is", null),
)
.select([
"users.id",
"users.status",
"users.display_name",
"users.created_at",
"users.updated_at",
"user_emails.email",
])
.where("users.id", "=", userId)
.executeTakeFirst();
if (!row) {
return null;
}
return this.rowToUser(row);
}
async createUser(data: CreateUserData): Promise<User> {
const userId = crypto.randomUUID();
const emailId = crypto.randomUUID();
const credentialId = crypto.randomUUID();
const now = new Date();
const normalizedEmail = data.email.toLowerCase().trim();
// Create user record
await db
.insertInto("users")
.values({
id: userId,
display_name: data.displayName ?? null,
status: "pending",
created_at: now,
updated_at: now,
})
.execute();
// Create user_email record
await db
.insertInto("user_emails")
.values({
id: emailId,
user_id: userId,
email: data.email,
normalized_email: normalizedEmail,
is_primary: true,
is_verified: false,
created_at: now,
})
.execute();
// Create user_credential record
await db
.insertInto("user_credentials")
.values({
id: credentialId,
user_id: userId,
credential_type: "password",
password_hash: data.passwordHash,
created_at: now,
updated_at: now,
})
.execute();
return new AuthenticatedUser({
id: userId,
email: data.email,
displayName: data.displayName,
status: "pending",
roles: [],
permissions: [],
createdAt: now,
updatedAt: now,
});
}
async getUserPasswordHash(userId: UserId): Promise<string | null> {
const row = await db
.selectFrom("user_credentials")
.select("password_hash")
.where("user_id", "=", userId)
.where("credential_type", "=", "password")
.executeTakeFirst();
return row?.password_hash ?? null;
}
async setUserPassword(userId: UserId, passwordHash: string): Promise<void> {
const now = new Date();
// Try to update existing credential
const result = await db
.updateTable("user_credentials")
.set({ password_hash: passwordHash, updated_at: now })
.where("user_id", "=", userId)
.where("credential_type", "=", "password")
.executeTakeFirst();
// If no existing credential, create one
if (Number(result.numUpdatedRows) === 0) {
await db
.insertInto("user_credentials")
.values({
id: crypto.randomUUID(),
user_id: userId,
credential_type: "password",
password_hash: passwordHash,
created_at: now,
updated_at: now,
})
.execute();
}
// Update user's updated_at
await db
.updateTable("users")
.set({ updated_at: now })
.where("id", "=", userId)
.execute();
}
async updateUserEmailVerified(userId: UserId): Promise<void> {
const now = new Date();
// Update user_emails to mark as verified
await db
.updateTable("user_emails")
.set({
is_verified: true,
verified_at: now,
})
.where("user_id", "=", userId)
.where("is_primary", "=", true)
.execute();
// Update user status to active
await db
.updateTable("users")
.set({
status: "active",
updated_at: now,
})
.where("id", "=", userId)
.execute();
}
// Helper to convert database row to User object
private rowToUser(row: {
id: string;
status: string;
display_name: string | null;
created_at: Date;
updated_at: Date;
email: string | null;
}): User {
return new AuthenticatedUser({
id: row.id,
email: row.email ?? "unknown@example.com",
displayName: row.display_name ?? undefined,
status: row.status as "active" | "suspended" | "pending",
roles: [], // TODO: query from RBAC tables
permissions: [], // TODO: query from RBAC tables
createdAt: row.created_at,
updatedAt: row.updated_at,
});
}
}
// ============================================================================
// Exports
// ============================================================================
export {
db,
raw,
rawPool,
pool,
migrate,
migrationStatus,
connectionConfig,
PostgresAuthStore,
type Database,
};
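The `yyyy-mm-dd_ss_description.sql` naming convention documented above exists so that a plain lexicographic string sort yields chronological apply order, which is all `getMigrationFiles` relies on. A quick check with hypothetical filenames (these are examples, not files from the repo):

```typescript
// Hypothetical migration filenames following yyyy-mm-dd_ss_description.sql.
const files = [
    "2025-02-01_01_add_sessions.sql",
    "2025-01-15_02_add_users.sql",
    "2025-01-15_01_initial.sql",
];

// Filter to the documented naming convention, then rely on string sort
// for chronological ordering, mirroring getMigrationFiles.
const ordered = files
    .filter((f) => /^\d{4}-\d{2}-\d{2}_\d{2}_/.test(f))
    .sort();

console.log(ordered[0]); // 2025-01-15_01_initial.sql
```

Zero-padded dates and sequence numbers are what make the string order and the chronological order coincide.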

express/deps.ts Normal file

@@ -0,0 +1,7 @@
// Database
//export { Client as PostgresClient } from "https://deno.land/x/postgres@v0.19.3/mod.ts";
//export type { ClientOptions as PostgresOptions } from "https://deno.land/x/postgres@v0.19.3/mod.ts";
// Redis
//export { connect as redisConnect } from "https://deno.land/x/redis@v0.37.1/mod.ts";
//export type { Redis } from "https://deno.land/x/redis@v0.37.1/mod.ts";


@@ -0,0 +1,17 @@
import { pool } from "../database";
import { dropTables, exitIfUnforced } from "./util";
async function main(): Promise<void> {
exitIfUnforced();
try {
await dropTables();
} finally {
await pool.end();
}
}
main().catch((err) => {
console.error("Failed to clear database:", err.message);
process.exit(1);
});


@@ -0,0 +1,26 @@
// reset-db.ts
// Development command to wipe the database and apply all migrations from scratch
import { connectionConfig, migrate, pool } from "../database";
import { dropTables, exitIfUnforced } from "./util";
async function main(): Promise<void> {
exitIfUnforced();
try {
await dropTables();
console.log("");
await migrate();
console.log("");
console.log("Database reset complete.");
} finally {
await pool.end();
}
}
main().catch((err) => {
console.error("Failed to reset database:", err.message);
process.exit(1);
});

42
express/develop/util.ts Normal file

@@ -0,0 +1,42 @@
// FIXME: this is at the wrong level of specificity
import { connectionConfig, pool } from "../database";
const exitIfUnforced = () => {
const args = process.argv.slice(2);
// Require explicit confirmation unless --force is passed
if (!args.includes("--force")) {
console.error("This will DROP ALL TABLES in the database!");
console.error(` Database: ${connectionConfig.database}`);
console.error(
` Host: ${connectionConfig.host}:${connectionConfig.port}`,
);
console.error("");
console.error("Run with --force to proceed.");
process.exit(1);
}
};
const dropTables = async () => {
console.log("Dropping all tables...");
// Get all table names in the public schema
const result = await pool.query<{ tablename: string }>(`
SELECT tablename FROM pg_tables
WHERE schemaname = 'public'
`);
if (result.rows.length > 0) {
// Drop all tables with CASCADE to handle foreign key constraints
const tableNames = result.rows
.map((r) => `"${r.tablename}"`)
.join(", ");
await pool.query(`DROP TABLE IF EXISTS ${tableNames} CASCADE`);
console.log(`Dropped ${result.rows.length} table(s)`);
} else {
console.log("No tables to drop");
}
};
export { dropTables, exitIfUnforced };


@@ -0,0 +1,13 @@
import { z } from "zod";
export const executionContextSchema = z.object({
diachron_root: z.string(),
});
export type ExecutionContext = z.infer<typeof executionContextSchema>;
export function parseExecutionContext(
env: Record<string, string | undefined>,
): ExecutionContext {
return executionContextSchema.parse(env);
}


@@ -0,0 +1,38 @@
import assert from "node:assert/strict";
import { describe, it } from "node:test";
import { ZodError } from "zod";
import {
executionContextSchema,
parseExecutionContext,
} from "./execution-context-schema";
describe("parseExecutionContext", () => {
it("parses valid executionContext with diachron_root", () => {
const env = { diachron_root: "/some/path" };
const result = parseExecutionContext(env);
assert.deepEqual(result, { diachron_root: "/some/path" });
});
it("throws ZodError when diachron_root is missing", () => {
const env = {};
assert.throws(() => parseExecutionContext(env), ZodError);
});
it("strips extra fields not in schema", () => {
const env = {
diachron_root: "/some/path",
EXTRA_VAR: "should be stripped",
};
const result = parseExecutionContext(env);
assert.deepEqual(result, { diachron_root: "/some/path" });
assert.equal("EXTRA_VAR" in result, false);
});
});
describe("executionContextSchema", () => {
it("requires diachron_root to be a string", () => {
const result = executionContextSchema.safeParse({ diachron_root: 123 });
assert.equal(result.success, false);
});
});


@@ -0,0 +1,5 @@
import { parseExecutionContext } from "./execution-context-schema";
const executionContext = parseExecutionContext(process.env);
export { executionContext };


@@ -0,0 +1,29 @@
-- 0001_users.sql
-- Create users table for authentication
CREATE TABLE users (
id UUID PRIMARY KEY,
status TEXT NOT NULL DEFAULT 'active',
display_name TEXT,
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
CREATE TABLE user_emails (
id UUID PRIMARY KEY,
user_id UUID NOT NULL REFERENCES users(id),
email TEXT NOT NULL,
normalized_email TEXT NOT NULL,
is_primary BOOLEAN NOT NULL DEFAULT FALSE,
is_verified BOOLEAN NOT NULL DEFAULT FALSE,
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
verified_at TIMESTAMPTZ,
revoked_at TIMESTAMPTZ
);
-- Enforce uniqueness only among *active* emails
CREATE UNIQUE INDEX user_emails_unique_active
ON user_emails (normalized_email)
WHERE revoked_at IS NULL;
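Note: the migration stores both `email` and `normalized_email`, but the normalization rule itself is not part of this diff. A minimal sketch of what such a helper might look like (`normalizeEmail` is a hypothetical name, not code from the repo):

```typescript
// Hypothetical sketch: derive normalized_email before inserting into
// user_emails. The actual normalization rules are not shown in this diff;
// this version just trims whitespace and lowercases the whole address.
const normalizeEmail = (email: string): string => email.trim().toLowerCase();
```

The partial unique index then deduplicates on this normalized form, but only among rows where `revoked_at IS NULL`, so a revoked address can later be re-registered.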


@@ -0,0 +1,26 @@
-- 0002_sessions.sql
-- Create sessions table for auth tokens
CREATE TABLE sessions (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
token_hash TEXT UNIQUE NOT NULL,
user_id UUID NOT NULL REFERENCES users(id),
user_email_id UUID REFERENCES user_emails(id),
token_type TEXT NOT NULL,
auth_method TEXT NOT NULL,
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
expires_at TIMESTAMPTZ NOT NULL,
revoked_at TIMESTAMPTZ,
ip_address INET,
user_agent TEXT,
is_used BOOLEAN DEFAULT FALSE
);
-- Index for user session lookups (logout all, etc.)
CREATE INDEX sessions_user_id_idx ON sessions (user_id);
-- Index for expiration cleanup
CREATE INDEX sessions_expires_at_idx ON sessions (expires_at);
-- Index for token type filtering
CREATE INDEX sessions_token_type_idx ON sessions (token_type);


@@ -0,0 +1,20 @@
CREATE TABLE roles (
id UUID PRIMARY KEY,
name TEXT UNIQUE NOT NULL,
description TEXT
);
CREATE TABLE groups (
id UUID PRIMARY KEY,
name TEXT NOT NULL,
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
CREATE TABLE user_group_roles (
user_id UUID NOT NULL REFERENCES users(id),
group_id UUID NOT NULL REFERENCES groups(id),
role_id UUID NOT NULL REFERENCES roles(id),
granted_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
revoked_at TIMESTAMPTZ,
PRIMARY KEY (user_id, group_id, role_id)
);


@@ -0,0 +1,14 @@
CREATE TABLE capabilities (
id UUID PRIMARY KEY,
name TEXT UNIQUE NOT NULL,
description TEXT
);
CREATE TABLE role_capabilities (
role_id UUID NOT NULL REFERENCES roles(id),
capability_id UUID NOT NULL REFERENCES capabilities(id),
granted_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
revoked_at TIMESTAMPTZ,
PRIMARY KEY (role_id, capability_id)
);

19
express/handlers.ts Normal file

@@ -0,0 +1,19 @@
import { contentTypes } from "./content-types";
import { core } from "./core";
import { httpCodes } from "./http-codes";
import type { Call, Handler, Result } from "./types";
const multiHandler: Handler = async (call: Call): Promise<Result> => {
const code = httpCodes.success.OK;
const rn = core.random.randomNumber();
const retval: Result = {
code,
result: `that was ${call.method} (${rn})`,
contentType: contentTypes.text.plain,
};
return retval;
};
export { multiHandler };

76
express/http-codes.ts Normal file

@@ -0,0 +1,76 @@
// This file belongs to the framework. You are not expected to modify it.
export type HttpCode = {
code: number;
name: string;
description?: string;
};
type Group = "success" | "redirection" | "clientErrors" | "serverErrors";
type CodeDefinitions = {
[K in Group]: {
[K: string]: HttpCode;
};
};
// tx claude https://claude.ai/share/344fc7bd-5321-4763-af2f-b82275e9f865
const httpCodes: CodeDefinitions = {
success: {
OK: { code: 200, name: "OK" },
Created: { code: 201, name: "Created" },
Accepted: { code: 202, name: "Accepted" },
NonAuthoritativeInformation: {
code: 203,
name: "Non-Authoritative Information",
},
NoContent: { code: 204, name: "No Content" },
ResetContent: { code: 205, name: "Reset Content" },
PartialContent: { code: 206, name: "Partial Content" },
},
redirection: {
MultipleChoices: { code: 300, name: "Multiple Choices" },
MovedPermanently: { code: 301, name: "Moved Permanently" },
Found: { code: 302, name: "Found" },
SeeOther: { code: 303, name: "See Other" },
NotModified: { code: 304, name: "Not Modified" },
TemporaryRedirect: { code: 307, name: "Temporary Redirect" },
PermanentRedirect: { code: 308, name: "Permanent Redirect" },
},
clientErrors: {
BadRequest: { code: 400, name: "Bad Request" },
Unauthorized: { code: 401, name: "Unauthorized" },
PaymentRequired: { code: 402, name: "Payment Required" },
Forbidden: { code: 403, name: "Forbidden" },
NotFound: { code: 404, name: "Not Found" },
MethodNotAllowed: { code: 405, name: "Method Not Allowed" },
NotAcceptable: { code: 406, name: "Not Acceptable" },
ProxyAuthenticationRequired: {
code: 407,
name: "Proxy Authentication Required",
},
RequestTimeout: { code: 408, name: "Request Timeout" },
Conflict: { code: 409, name: "Conflict" },
Gone: { code: 410, name: "Gone" },
LengthRequired: { code: 411, name: "Length Required" },
PreconditionFailed: { code: 412, name: "Precondition Failed" },
PayloadTooLarge: { code: 413, name: "Payload Too Large" },
URITooLong: { code: 414, name: "URI Too Long" },
UnsupportedMediaType: { code: 415, name: "Unsupported Media Type" },
RangeNotSatisfiable: { code: 416, name: "Range Not Satisfiable" },
ExpectationFailed: { code: 417, name: "Expectation Failed" },
ImATeapot: { code: 418, name: "I'm a teapot" },
UnprocessableEntity: { code: 422, name: "Unprocessable Entity" },
TooManyRequests: { code: 429, name: "Too Many Requests" },
},
serverErrors: {
InternalServerError: { code: 500, name: "Internal Server Error" },
NotImplemented: { code: 501, name: "Not Implemented" },
BadGateway: { code: 502, name: "Bad Gateway" },
ServiceUnavailable: { code: 503, name: "Service Unavailable" },
GatewayTimeout: { code: 504, name: "Gateway Timeout" },
HTTPVersionNotSupported: {
code: 505,
name: "HTTP Version Not Supported",
},
},
};
export { httpCodes };

73
express/logging.ts Normal file

@@ -0,0 +1,73 @@
// internal-logging.ts
import { cli } from "./cli";
// FIXME: Move this to somewhere more appropriate
type AtLeastOne<T> = [T, ...T[]];
type MessageSource = "logging" | "diagnostic" | "user";
type Message = {
// FIXME: number probably isn't what we want here
timestamp?: number;
source: MessageSource;
text: AtLeastOne<string>;
};
const _m1: Message = { timestamp: 123, source: "logging", text: ["foo"] };
const _m2: Message = {
timestamp: 321,
source: "diagnostic",
text: ["ok", "whatever"],
};
type FilterArgument = {
limit?: number;
before?: number;
after?: number;
// FIXME: add offsets to use instead of or in addition to before/after
match?: (string | RegExp)[];
};
const loggerUrl = `http://${cli.logAddress.host}:${cli.logAddress.port}`;
const log = (message: Message) => {
const payload = {
timestamp: message.timestamp ?? Date.now(),
source: message.source,
text: message.text,
};
fetch(`${loggerUrl}/log`, {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify(payload),
}).catch((err) => {
console.error("[logging] Failed to send log:", err.message);
});
};
const getLogs = async (filter: FilterArgument): Promise<Message[]> => {
const params = new URLSearchParams();
if (filter.limit) {
params.set("limit", String(filter.limit));
}
if (filter.before) {
params.set("before", String(filter.before));
}
if (filter.after) {
params.set("after", String(filter.after));
}
const url = `${loggerUrl}/logs?${params.toString()}`;
const response = await fetch(url);
return response.json();
};
// FIXME: there's scope for more specialized functions although they
// probably should be defined in terms of the basic ones here.
export { getLogs, log };
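The `FilterArgument` shape implies the logger service filters by a timestamp window and caps the result count. A standalone sketch of that semantics, as an assumption about the server side (which is not shown in this diff):

```typescript
// Assumed filter semantics for the logger service: keep messages inside
// the (after, before) timestamp window, then cap to the most recent
// `limit` entries. Stand-in types; the real Message type lives in logging.ts.
type Msg = { timestamp: number; text: string[] };

const applyFilter = (
  messages: Msg[],
  filter: { limit?: number; before?: number; after?: number },
): Msg[] => {
  let out = messages.filter(
    (m) =>
      (filter.after === undefined || m.timestamp > filter.after) &&
      (filter.before === undefined || m.timestamp < filter.before),
  );
  if (filter.limit !== undefined) {
    out = out.slice(-filter.limit); // keep the most recent entries
  }
  return out;
};
```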

69
express/mgmt/add-user.ts Normal file

@@ -0,0 +1,69 @@
// add-user.ts
// Management command to create users from the command line
import { hashPassword } from "../auth/password";
import { PostgresAuthStore, pool } from "../database";
async function main(): Promise<void> {
const args = process.argv.slice(2);
if (args.length < 2) {
console.error(
"Usage: ./mgmt add-user <email> <password> [--display-name <name>] [--active]",
);
process.exit(1);
}
const email = args[0];
const password = args[1];
// Parse optional flags
let displayName: string | undefined;
let makeActive = false;
for (let i = 2; i < args.length; i++) {
if (args[i] === "--display-name" && args[i + 1]) {
displayName = args[i + 1];
i++;
} else if (args[i] === "--active") {
makeActive = true;
}
}
try {
const store = new PostgresAuthStore();
// Check if user already exists
const existing = await store.getUserByEmail(email);
if (existing) {
console.error(`Error: User with email '${email}' already exists`);
process.exit(1);
}
// Hash password and create user
const passwordHash = await hashPassword(password);
const user = await store.createUser({
email,
passwordHash,
displayName,
});
// Optionally activate user immediately
if (makeActive) {
await store.updateUserEmailVerified(user.id);
console.log(
`Created and activated user: ${user.email} (${user.id})`,
);
} else {
console.log(`Created user: ${user.email} (${user.id})`);
console.log(" Status: pending (use --active to create as active)");
}
} finally {
await pool.end();
}
}
main().catch((err) => {
console.error("Failed to create user:", err.message);
process.exit(1);
});

45
express/migrate.ts Normal file

@@ -0,0 +1,45 @@
// migrate.ts
// CLI script for running database migrations
import { migrate, migrationStatus, pool } from "./database";
async function main(): Promise<void> {
const command = process.argv[2] || "run";
try {
switch (command) {
case "run":
await migrate();
break;
case "status": {
const status = await migrationStatus();
console.log("Applied migrations:");
for (const name of status.applied) {
console.log(`${name}`);
}
if (status.pending.length > 0) {
console.log("\nPending migrations:");
for (const name of status.pending) {
console.log(`${name}`);
}
} else {
console.log("\nNo pending migrations");
}
break;
}
default:
console.error(`Unknown command: ${command}`);
console.error("Usage: migrate [run|status]");
process.exit(1);
}
} finally {
await pool.end();
}
}
main().catch((err) => {
console.error("Migration failed:", err);
process.exit(1);
});

35
express/package.json Normal file

@@ -0,0 +1,35 @@
{
"name": "express",
"version": "1.0.0",
"description": "",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1",
"nodemon": "nodemon dist/index.js"
},
"keywords": [],
"author": "",
"license": "ISC",
"packageManager": "pnpm@10.12.4",
"dependencies": {
"@types/node": "^24.10.1",
"@types/nunjucks": "^3.2.6",
"@vercel/ncc": "^0.38.4",
"express": "^5.1.0",
"kysely": "^0.28.9",
"nodemon": "^3.1.11",
"nunjucks": "^3.2.4",
"path-to-regexp": "^8.3.0",
"pg": "^8.16.3",
"ts-luxon": "^6.2.0",
"ts-node": "^10.9.2",
"tsx": "^4.20.6",
"typescript": "^5.9.3",
"zod": "^4.1.12"
},
"devDependencies": {
"@biomejs/biome": "2.3.10",
"@types/express": "^5.0.5",
"@types/pg": "^8.16.0"
}
}

1627
express/pnpm-lock.yaml generated Normal file

File diff suppressed because it is too large

25
express/request/index.ts Normal file

@@ -0,0 +1,25 @@
import { AuthService } from "../auth";
import { getCurrentUser } from "../context";
import { PostgresAuthStore } from "../database";
import type { User } from "../user";
import { html, redirect, render } from "./util";
const util = { html, redirect, render };
const session = {
getUser: (): User => {
return getCurrentUser();
},
};
// Initialize auth with PostgreSQL store
const authStore = new PostgresAuthStore();
const auth = new AuthService(authStore);
const request = {
auth,
session,
util,
};
export { request };

45
express/request/util.ts Normal file

@@ -0,0 +1,45 @@
import { contentTypes } from "../content-types";
import { core } from "../core";
import { executionContext } from "../execution-context";
import { httpCodes } from "../http-codes";
import type { RedirectResult, Result } from "../types";
import { loadFile } from "../util";
import { request } from "./index";
type NoUser = {
[key: string]: unknown;
} & {
user?: never;
};
const render = async (path: string, ctx?: NoUser): Promise<string> => {
const fullPath = `${executionContext.diachron_root}/templates/${path}.html.njk`;
const template = await loadFile(fullPath);
const user = request.session.getUser();
const context = { user, ...ctx };
const engine = core.conf.templateEngine();
const retval = engine.renderTemplate(template, context);
return retval;
};
const html = (payload: string): Result => {
const retval: Result = {
code: httpCodes.success.OK,
result: payload,
contentType: contentTypes.text.html,
};
return retval;
};
const redirect = (location: string): RedirectResult => {
return {
code: httpCodes.redirection.SeeOther,
contentType: contentTypes.text.plain,
result: "",
redirect: location,
};
};
export { html, redirect, render };

148
express/routes.ts Normal file

@@ -0,0 +1,148 @@
/// <reference lib="dom" />
import nunjucks from "nunjucks";
import { DateTime } from "ts-luxon";
import { authRoutes } from "./auth/routes";
import { routes as basicRoutes } from "./basic/routes";
import { contentTypes } from "./content-types";
import { core } from "./core";
import { multiHandler } from "./handlers";
import { httpCodes } from "./http-codes";
import type { Call, Result, Route } from "./types";
// FIXME: Obviously put this somewhere else
const okText = (result: string): Result => {
const code = httpCodes.success.OK;
const retval: Result = {
code,
result,
contentType: contentTypes.text.plain,
};
return retval;
};
const routes: Route[] = [
...authRoutes,
basicRoutes.home,
basicRoutes.hello,
basicRoutes.login,
basicRoutes.logout,
{
path: "/slow",
methods: ["GET"],
handler: async (_call: Call): Promise<Result> => {
console.log("starting slow request");
await core.misc.sleep(2);
console.log("finishing slow request");
const retval = okText("that was slow");
return retval;
},
},
{
path: "/list",
methods: ["GET"],
handler: async (_call: Call): Promise<Result> => {
const code = httpCodes.success.OK;
const lr = (rr: Route[]) => {
const ret = rr.map((r: Route) => {
return r.path;
});
return ret;
};
const rrr = lr(routes);
const template = `
<html>
<head></head>
<body>
<ul>
{% for route in rrr %}
<li><a href="{{ route }}">{{ route }}</a></li>
{% endfor %}
</ul>
</body>
</html>
`;
const result = nunjucks.renderString(template, { rrr });
const _listing = lr(routes).join(", ");
return {
code,
result,
contentType: contentTypes.text.html,
};
},
},
{
path: "/whoami",
methods: ["GET"],
handler: async (call: Call): Promise<Result> => {
const me = call.session.getUser();
const template = `
<html>
<head></head>
<body>
{{ me }}
</body>
</html>
`;
const result = nunjucks.renderString(template, { me });
return {
code: httpCodes.success.OK,
contentType: contentTypes.text.html,
result,
};
},
},
{
path: "/ok",
methods: ["GET", "POST", "PUT"],
handler: multiHandler,
},
{
path: "/alsook",
methods: ["GET"],
handler: async (_req): Promise<Result> => {
const code = httpCodes.success.OK;
return {
code,
result: "it is also ok",
contentType: contentTypes.text.plain,
};
},
},
{
path: "/time",
methods: ["GET"],
handler: async (_req): Promise<Result> => {
const now = DateTime.now();
const template = `
<html>
<head></head>
<body>
{{ now }}
</body>
</html>
`;
const result = nunjucks.renderString(template, { now });
return {
code: httpCodes.success.OK,
contentType: contentTypes.text.html,
result,
};
},
},
];
export { routes };

9
express/run.sh Executable file

@@ -0,0 +1,9 @@
#!/bin/bash
set -eu
DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
cd "$DIR"
exec ../cmd node dist/index.js "$@"

12
express/show-config.sh Executable file

@@ -0,0 +1,12 @@
#!/bin/bash
set -e
DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
check_dir="$DIR"
source "$check_dir"/../framework/shims/common
source "$check_dir"/../framework/shims/node.common
$ROOT/cmd pnpm tsc --showConfig

13
express/tsconfig.json Normal file

@@ -0,0 +1,13 @@
{
"compilerOptions": {
"esModuleInterop": true,
"target": "ES2022",
"lib": ["ES2023"],
"module": "NodeNext",
"moduleResolution": "NodeNext",
"noImplicitAny": true,
"strict": true,
"types": ["node"],
"outDir": "out"
}
}

117
express/types.ts Normal file

@@ -0,0 +1,117 @@
// types.ts
// FIXME: split this up into types used by app developers and types internal
// to the framework.
import type { Request as ExpressRequest } from "express";
import type { MatchFunction } from "path-to-regexp";
import { z } from "zod";
import type { Session } from "./auth/types";
import type { ContentType } from "./content-types";
import type { HttpCode } from "./http-codes";
import type { Permission, User } from "./user";
const methodParser = z.union([
z.literal("GET"),
z.literal("POST"),
z.literal("PUT"),
z.literal("PATCH"),
z.literal("DELETE"),
]);
export type Method = z.infer<typeof methodParser>;
const massageMethod = (input: string): Method => {
const r = methodParser.parse(input.toUpperCase());
return r;
};
export type Call = {
pattern: string;
path: string;
method: Method;
parameters: object;
request: ExpressRequest;
user: User;
session: Session;
};
export type InternalHandler = (req: ExpressRequest) => Promise<Result>;
export type Handler = (call: Call) => Promise<Result>;
export type ProcessedRoute = {
matcher: MatchFunction<Record<string, string>>;
method: Method;
handler: InternalHandler;
};
export type CookieOptions = {
httpOnly?: boolean;
secure?: boolean;
sameSite?: "strict" | "lax" | "none";
maxAge?: number;
path?: string;
};
export type Cookie = {
name: string;
value: string;
options?: CookieOptions;
};
export type Result = {
code: HttpCode;
contentType: ContentType;
result: string;
cookies?: Cookie[];
};
export type RedirectResult = Result & {
redirect: string;
};
export function isRedirect(result: Result): result is RedirectResult {
return "redirect" in result;
}
export type Route = {
path: string;
methods: Method[];
handler: Handler;
interruptable?: boolean;
};
// Authentication error classes
export class AuthenticationRequired extends Error {
constructor() {
super("Authentication required");
this.name = "AuthenticationRequired";
}
}
export class AuthorizationDenied extends Error {
constructor() {
super("Authorization denied");
this.name = "AuthorizationDenied";
}
}
// Helper for handlers to require authentication
export function requireAuth(call: Call): User {
if (call.user.isAnonymous()) {
throw new AuthenticationRequired();
}
return call.user;
}
// Helper for handlers to require specific permission
export function requirePermission(call: Call, permission: Permission): User {
const user = requireAuth(call);
if (!user.hasPermission(permission)) {
throw new AuthorizationDenied();
}
return user;
}
export type Domain = "app" | "fw";
export { methodParser, massageMethod };
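The `requireAuth`/`requirePermission` helpers above are meant to be called at the top of a handler. A self-contained sketch of the flow, using stand-in types in place of the framework's `Call` and `User`:

```typescript
// Standalone sketch of the requireAuth/requirePermission pattern from
// types.ts. MiniUser is a stand-in so this runs on its own; the real
// versions take a Call and use User.isAnonymous()/hasPermission().
class AuthenticationRequired extends Error {}
class AuthorizationDenied extends Error {}

type MiniUser = { anonymous: boolean; permissions: string[] };

const requireAuth = (user: MiniUser): MiniUser => {
  if (user.anonymous) throw new AuthenticationRequired();
  return user;
};

const requirePermission = (user: MiniUser, permission: string): MiniUser => {
  const u = requireAuth(user);
  if (!u.permissions.includes(permission)) throw new AuthorizationDenied();
  return u;
};
```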

232
express/user.ts Normal file

@@ -0,0 +1,232 @@
// user.ts
//
// User model for authentication and authorization.
//
// Design notes:
// - `id` is the stable internal identifier (UUID when database-backed)
// - `email` is the primary human-facing identifier
// - Roles provide coarse-grained authorization (admin, editor, etc.)
// - Permissions provide fine-grained authorization (posts:create, etc.)
// - Users can have both roles (which grant permissions) and direct permissions
import { z } from "zod";
// Branded type for user IDs to prevent accidental mixing with other strings
export type UserId = string & { readonly __brand: "UserId" };
// User account status
const userStatusParser = z.enum(["active", "suspended", "pending"]);
export type UserStatus = z.infer<typeof userStatusParser>;
// Role - simple string identifier
const roleParser = z.string().min(1);
export type Role = z.infer<typeof roleParser>;
// Permission format: "resource:action" e.g. "posts:create", "users:delete"
const permissionParser = z.string().regex(/^[a-z_]+:[a-z_]+$/, {
message: "Permission must be in format 'resource:action'",
});
export type Permission = z.infer<typeof permissionParser>;
// Core user data schema - this is what gets stored/serialized
const userDataParser = z.object({
id: z.string().min(1),
email: z.email(),
displayName: z.string().optional(),
status: userStatusParser,
roles: z.array(roleParser),
permissions: z.array(permissionParser),
createdAt: z.coerce.date(),
updatedAt: z.coerce.date(),
});
export type UserData = z.infer<typeof userDataParser>;
// Role-to-permission mappings
// In a real system this might be database-driven or configurable
type RolePermissionMap = Map<Role, Permission[]>;
const defaultRolePermissions: RolePermissionMap = new Map([
["admin", ["users:read", "users:create", "users:update", "users:delete"]],
["user", ["users:read"]],
]);
export abstract class User {
protected readonly data: UserData;
protected rolePermissions: RolePermissionMap;
constructor(data: UserData, rolePermissions?: RolePermissionMap) {
this.data = userDataParser.parse(data);
this.rolePermissions = rolePermissions ?? defaultRolePermissions;
}
// Identity
get id(): UserId {
return this.data.id as UserId;
}
get email(): string {
return this.data.email;
}
get displayName(): string | undefined {
return this.data.displayName;
}
// Status
get status(): UserStatus {
return this.data.status;
}
isActive(): boolean {
return this.data.status === "active";
}
// Roles
get roles(): readonly Role[] {
return this.data.roles;
}
hasRole(role: Role): boolean {
return this.data.roles.includes(role);
}
hasAnyRole(roles: Role[]): boolean {
return roles.some((role) => this.hasRole(role));
}
hasAllRoles(roles: Role[]): boolean {
return roles.every((role) => this.hasRole(role));
}
// Permissions
get permissions(): readonly Permission[] {
return this.data.permissions;
}
// Get all permissions: direct + role-derived
effectivePermissions(): Set<Permission> {
const perms = new Set<Permission>(this.data.permissions);
for (const role of this.data.roles) {
const rolePerms = this.rolePermissions.get(role);
if (rolePerms) {
for (const p of rolePerms) {
perms.add(p);
}
}
}
return perms;
}
// Check if user has a specific permission (direct or via role)
hasPermission(permission: Permission): boolean {
// Check direct permissions first
if (this.data.permissions.includes(permission)) {
return true;
}
// Check role-derived permissions
for (const role of this.data.roles) {
const rolePerms = this.rolePermissions.get(role);
if (rolePerms?.includes(permission)) {
return true;
}
}
return false;
}
// Convenience method: can user perform action on resource?
can(action: string, resource: string): boolean {
const permission = `${resource}:${action}` as Permission;
return this.hasPermission(permission);
}
// Timestamps
get createdAt(): Date {
return this.data.createdAt;
}
get updatedAt(): Date {
return this.data.updatedAt;
}
// Serialization - returns plain object for storage/transmission
toJSON(): UserData {
return { ...this.data };
}
toString(): string {
return `User(id ${this.id})`;
}
abstract isAnonymous(): boolean;
}
export class AuthenticatedUser extends User {
// Factory for creating new users with sensible defaults
static create(
email: string,
options?: {
id?: string;
displayName?: string;
status?: UserStatus;
roles?: Role[];
permissions?: Permission[];
},
): User {
const now = new Date();
return new AuthenticatedUser({
id: options?.id ?? crypto.randomUUID(),
email,
displayName: options?.displayName,
status: options?.status ?? "active",
roles: options?.roles ?? [],
permissions: options?.permissions ?? [],
createdAt: now,
updatedAt: now,
});
}
isAnonymous(): boolean {
return false;
}
}
// For representing "no user" in contexts where user is optional
export class AnonymousUser extends User {
// FIXME: this is C&Ped with only minimal changes. No bueno.
static create(
email: string,
options?: {
id?: string;
displayName?: string;
status?: UserStatus;
roles?: Role[];
permissions?: Permission[];
},
): AnonymousUser {
const now = new Date(0);
return new AnonymousUser({
id: options?.id ?? crypto.randomUUID(),
email,
displayName: options?.displayName,
status: options?.status ?? "active",
roles: options?.roles ?? [],
permissions: options?.permissions ?? [],
createdAt: now,
updatedAt: now,
});
}
isAnonymous(): boolean {
return true;
}
}
export const anonymousUser = AnonymousUser.create("anonymous@example.com", {
id: "-1",
displayName: "Anonymous User",
});
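For reference, the merge performed by `effectivePermissions()` above can be sketched standalone (the role map here is a stand-in mirroring `defaultRolePermissions`):

```typescript
// Standalone sketch of how effectivePermissions() merges direct
// permissions with role-derived ones. Stand-in map mirroring the
// defaultRolePermissions table in user.ts.
const rolePermissions = new Map<string, string[]>([
  ["admin", ["users:read", "users:create", "users:update", "users:delete"]],
  ["user", ["users:read"]],
]);

const effectivePermissions = (
  direct: string[],
  roles: string[],
): Set<string> => {
  const perms = new Set(direct);
  for (const role of roles) {
    for (const p of rolePermissions.get(role) ?? []) perms.add(p);
  }
  return perms;
};
```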

11
express/util.ts Normal file

@@ -0,0 +1,11 @@
import { readFile } from "node:fs/promises";
// FIXME: Handle the error here
const loadFile = async (path: string): Promise<string> => {
// Specifying 'utf8' returns a string; otherwise, it returns a Buffer
const data = await readFile(path, "utf8");
return data;
};
export { loadFile };

14
express/watch.sh Executable file

@@ -0,0 +1,14 @@
#!/bin/bash
set -e
DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
check_dir="$DIR"
source "$check_dir"/../framework/shims/common
source "$check_dir"/../framework/shims/node.common
# $ROOT/cmd pnpm tsc --lib ES2023 --esModuleInterop -w $check_dir/app.ts
# $ROOT/cmd pnpm tsc -w $check_dir/app.ts
$ROOT/cmd pnpm tsc --watch --project ./tsconfig.json

26
fixup.sh Executable file

@@ -0,0 +1,26 @@
#!/bin/bash
set -eu
DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
cd "$DIR"
# uv run ruff check --select I --fix .
# uv run ruff format .
shell_scripts="$(fd '\.sh$' | xargs)"
shfmt -i 4 -w "$DIR/cmd" "$DIR"/framework/cmd.d/* "$DIR"/framework/shims/* "$DIR"/master/master "$DIR"/logger/logger
# "$shell_scripts"
for ss in $shell_scripts; do
shfmt -i 4 -w "$ss"
done
pushd "$DIR/master"
go fmt
popd
pushd "$DIR/express"
../cmd pnpm biome check --write
popd


@@ -2,7 +2,7 @@
 set -eu
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 cd "$DIR"


@@ -2,6 +2,6 @@
 set -eu
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
-"$DIR"/../shims/node "$@"
+exec "$DIR"/../shims/node "$@"


@@ -2,6 +2,6 @@
 set -eu
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 "$DIR"/../shims/pnpm "$@"


@@ -2,7 +2,7 @@
 set -eu
-DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 # figure out the platform we're on
@@ -11,11 +11,8 @@ DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
# download $nodejs_version
# verify its checksum against $nodejs_checksum
cd "$DIR/../node"
cd $DIR/../node
$DIR/pnpm install
"$DIR"/pnpm install
echo we will download other files here later

15
framework/cmd.d/test Executable file

@@ -0,0 +1,15 @@
#!/bin/bash
set -eu
shopt -s globstar nullglob
DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
cd "$DIR/../../express"
if [ $# -eq 0 ]; then
"$DIR"/../shims/pnpm tsx --test ./**/*.spec.ts ./**/*.test.ts
else
"$DIR"/../shims/pnpm tsx --test "$@"
fi

5
framework/cmd.d/ts-node Executable file

@@ -0,0 +1,5 @@
#!/bin/bash
DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
"$DIR"/../shims/pnpm ts-node "$@"

5
framework/cmd.d/tsx Executable file

@@ -0,0 +1,5 @@
#!/bin/bash
DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
"$DIR"/../shims/pnpm tsx "$@"

9
framework/common.d/db Executable file

@@ -0,0 +1,9 @@
#!/bin/bash
set -eu
DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
ROOT="$DIR/../.."
# FIXME: don't hard code this of course
PGPASSWORD=diachron psql -U diachron -h localhost diachron

9
framework/common.d/migrate Executable file

@@ -0,0 +1,9 @@
#!/bin/bash
set -eu
DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
ROOT="$DIR/../.."
cd "$ROOT/express"
"$DIR"/tsx migrate.ts "$@"

11
framework/develop.d/clear-db Executable file

@@ -0,0 +1,11 @@
#!/bin/bash
# This file belongs to the framework. You are not expected to modify it.
set -eu
DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
ROOT="$DIR/../.."
cd "$ROOT/express"
"$DIR"/../cmd.d/tsx develop/clear-db.ts "$@"

1
framework/develop.d/db Symbolic link

@@ -0,0 +1 @@
../common.d/db

Some files were not shown because too many files have changed in this diff