Compare commits: 19d4309272...master (142 commits)

Commit SHA1s:

eb12471941 60643409f6 5a8c0028d7 f7e6e56aca cd19a32be5 478305bc4f
421628d49e 4f37a72d7b e30bf5d96d 8704c4a8d5 579a19669e 474420ac1e
960f78a1ad d921679058 350bf7c865 8a7682e953 e59bb35ac9 a345a2adfb
00d84d6686 7ed05695b9 03cc4cf4eb 2121a6b5de 6ace2163ed 93ab4b5d53
70ddcb2a94 1da81089cd f383c6a465 e34d47b352 de70be996e 096a1235b5
4a4dc11aa4 7399cbe785 14d20be9a2 55f5cc699d afcb447b2b 1c1eeddcbe
7cecf5326d 47f6bee75f 6e96c33457 9e3329fa58 05eaf938fa df2d4eea3f
b235a6be9a 8cd4b42cc6 241d3e799e 49dc0e3fe0 c7b8cd33da 6c0895de07
17ea6ba02d 661def8a5c 74d75d08dd ad6d405206 e9ccf6d757 34ec5be7ec
e136c07928 c926f15aab 39cd93c81e c246e0384f 788ea2ab19 6297a95d3c
63cf0a670d 5524eaf18f 03980e114b 539717efda 8be88bb696 ab74695f4c
dc5a70ba33 4adf6cf358 bee6938a67 b0ee53f7d5 5c93c9e982 5606a59614
22dde8c213 30463b60a5 e2ea472a10 20e5da0d54 13d02d86be d35e7bace2
cb4a730838 58f88e3695 7b8eaac637 f504576f3e 8722062f4a 9cc1991d07
5d5a2430ad a840137f83 c330da49fc db81129724 43ff2edad2 ad95f652b8
51d24209b0 1083655a3b 615cd89656 321b2abd23 642c7d9434 8e5b46d426
a178536472 3bece46638 4257a9b615 b0eaf6b136 8ca89b75cd a797cae0e6
666f1447f4 bd3779acef c21638c5d5 d125f61c4c 2641a8d29d 1a13fd0909
c346a70cce 96d861f043 0201d08009 d4d5a72b3e 292ce4be7f 6a4a2f7eef
de7dbf45cd 15187ed752 f2513f7be0 235d2b50dd 553cd680dc 72284505a0
ecdbedc135 6ed81e871f 40f3e4ef51 d1d4e03885 8d494cdf3b 5a08baab5e
c850815e97 6d2779ba83 c19e6a9537 d48a533d42 d6fda3fdb3 ad2b85bc2b
d6fccc3172 a078f36d08 efa9a7a3de e7425a2685 0c5b8b734c a441563c91
503e82c2f4 82a0a290d4 b0f69a6dbe a0043fd475

.beads/issues.jsonl (new file)
@@ -0,0 +1,4 @@
+{"id":"diachron-2vh","title":"Add unit testing to golang programs","status":"open","priority":2,"issue_type":"task","created_at":"2026-01-03T12:31:41.281891462-06:00","created_by":"mw","updated_at":"2026-01-03T12:31:41.281891462-06:00"}
+{"id":"diachron-64w","title":"Add unit testing to express backend","status":"open","priority":2,"issue_type":"task","created_at":"2026-01-03T12:31:30.439206099-06:00","created_by":"mw","updated_at":"2026-01-03T12:31:30.439206099-06:00"}
+{"id":"diachron-fzd","title":"Add generic 'user' functionality","status":"open","priority":2,"issue_type":"task","created_at":"2026-01-03T12:35:53.73213604-06:00","created_by":"mw","updated_at":"2026-01-03T12:35:53.73213604-06:00"}
+{"id":"diachron-ngx","title":"Teach the master and/or build process to send messages with notify-send when builds fail or succeed. Ideally this will be fairly generic.","status":"open","priority":2,"issue_type":"task","created_at":"2026-01-03T14:10:11.773218844-06:00","created_by":"mw","updated_at":"2026-01-03T14:10:11.773218844-06:00"}

.claude/instructions.md (new file)
@@ -0,0 +1,2 @@
+When asked "what's next?" or during downtime, check TODO.md and suggest items to work on.
+

.gitignore (vendored, new file)
@@ -0,0 +1,5 @@
+**/node_modules
+framework/downloads
+framework/binaries
+framework/.nodejs
+framework/.nodejs-config

.go-version (new file)
@@ -0,0 +1 @@
+1.23.6

CLAUDE.md (new file)
@@ -0,0 +1,122 @@
+# CLAUDE.md
+
+This file provides guidance to Claude Code (claude.ai/code) when working with
+code in this repository.
+
+## Project Overview
+
+Diachron is an opinionated TypeScript/Node.js web framework with a Go-based
+master process. Key design principles:
+- No development/production distinction - single mode of operation everywhere
+- Everything loggable and inspectable for debuggability
+- Minimal magic, explicit behavior
+- PostgreSQL-only (no database abstraction)
+- Inspired by "Taking PHP Seriously" essay
+
+## Commands
+
+### General
+
+**Install dependencies:**
+```bash
+./sync.sh
+```
+
+**Run an app:**
+```bash
+./master
+```
+
+
+
+### Development
+
+**Check shell scripts (shellcheck + shfmt) (eventually go fmt and biome or similar):**
+```bash
+./check.sh
+```
+
+**Format TypeScript code:**
+```bash
+cd express && ../cmd pnpm biome check --write .
+```
+
+**Build Go master process:**
+```bash
+cd master && go build
+```
+
+### Operational
+
+(to be written)
+
+## Architecture
+
+### Components
+
+- **express/** - TypeScript/Express.js backend application
+- **master/** - Go-based master process for file watching and process management
+- **framework/** - Managed binaries (Node.js, pnpm), command wrappers, and
+  framework-specific library code
+- **monitor/** - Go file watcher that triggers rebuilds (experimental)
+
+### Master Process (Go)
+
+Responsibilities:
+- Watch TypeScript source for changes and trigger rebuilds
+- Manage worker processes
+- Proxy web requests to backend workers
+- Behaves identically in all environments (no dev/prod distinction)
+
+### Express App Structure
+
+- `app.ts` - Main Express application setup with route matching
+- `routes.ts` - Route definitions
+- `handlers.ts` - Route handlers
+- `services.ts` - Service layer (database, logging, misc)
+- `types.ts` - TypeScript type definitions (Route, Call, Handler, Result, Method)
+
+### Framework Command System
+
+Commands flow through: `./cmd` → `framework/cmd.d/*` → `framework/shims/*` → managed binaries in `framework/binaries/`
+
+This ensures consistent tooling versions across the team without system-wide installations.
+
+## Tech Stack
+
+- TypeScript 5.9+ / Node.js 22.15
+- Express.js 5.1
+- Go 1.23.3+ (master process)
+- pnpm 10.12.4 (package manager)
+- Zod (runtime validation)
+- Nunjucks (templating)
+- @vercel/ncc (bundling)
+
+## Platform Requirements
+
+Linux or macOS on x86_64. Requires:
+- Modern libc for Go binaries (Linux)
+- docker compose (for full stack)
+- fd, shellcheck, shfmt (for development)
+
+## Current Status
+
+Early stage - most implementations are stubs:
+- Database service is placeholder
+- Logging functions marked WRITEME
+- No test framework configured yet
+
+# meta
+
+## formatting and sorting
+
+- When a typescript file exports symbols, they should be listed in order
+
+## guidelines for this document
+
+- Try to keep lines below 80 characters in length, especially prose. But if
+  embedded code or literals are longer, that's fine.
+- Use formatting such as bold or italics sparingly
+- In general, we treat this document like source code insofar as it should be
+  both human-readable and machine-readable
+- Keep this meta section at the end of the file.

README.md (changed, 18 → 61 lines)
@@ -1,18 +1,61 @@
 diachron
+
+## Introduction
+
+- Do you want to share a lot of backend and frontend code?
+
+- Are you tired of your web stack breaking when you blink too hard?
+
+- Have you read [Taking PHP
+  Seriously](https://slack.engineering/taking-php-seriously/) and do you wish
+  you had something similar for Typescript?
+
+- Do you think that ORMs are not all that? Do you wish you had first class
+  unmediated access to your database? And do you think that database
+  agnosticism is overrated?
+
+- Do you think dev/testing/prod distinctions are a bad idea? (Hear us out on
+  this one.)
+
+- Have you ever lost hours getting everyone on your team to have the exact
+  same environment, yet you're not willing to take the plunge and use a tool
+  like [nix](https://nixos.org)?
+
+- Are you frustrated by unclear documentation? Is ramping up a frequent
+  problem?
+
+- Do you want a framework that's not only easy to write but also easy to get
+  inside and debug?
+
+- Have you been bogged down with details that are not relevant to the problems
+  you're trying to solve, the features you're trying to implement, the bugs
+  you're trying to fix? We're talking authentication, authorization, XSS,
+  https, nested paths, all that stuff.
+
+Is your answer to some of these questions "yes"? If so, you might like
+diachron. (When it comes to that dev/test/prod one, hear us out first, ok?)
+
+## Getting started
+
+Different situations require different getting started docs.
+
+- [How to create a new project](docs/new-project.md)
+- [How to work on an existing project](docs/existing-project.md)
+
 ## Requirements
 
-To run diachron, you currently need the following requirements:
+To run diachron, you need Linux or macOS on x86_64. Linux requires a new
+enough libc to run golang binaries.
 
-- docker compose
-- deno
+To run a more complete system, you also need to have docker compose installed.
 
-## Development requirements
+### Development requirements
 
-To hack on diachron, you need the following:
+To hack on diachron itself, you need the following:
 
-- docker compose
-- deno
+- bash
+- docker and docker compose
 - [fd](https://github.com/sharkdp/fd)
 - golang, version 1.23.6 or greater
 - shellcheck
 - shfmt

TODO.md (new file)
@@ -0,0 +1,177 @@
+## high importance
+
+- [ ] Add unit tests all over the place.
+  - ⚠️ Huge task - needs breakdown before starting
+
+
+- [ ] migrations, seeding, fixtures
+
+```sql
+CREATE SCHEMA fw;
+CREATE TABLE fw.users (...);
+CREATE TABLE fw.groups (...);
+```
+
+```sql
+CREATE TABLE app.user_profiles (...);
+CREATE TABLE app.customer_metadata (...);
+```
+
+- [ ] flesh out `mgmt` and `develop` (does not exist yet)
+
+  4.1 What belongs in develop
+
+  - Create migrations
+  - Squash migrations
+  - Reset DB
+  - Roll back migrations
+  - Seed large test datasets
+  - Run tests
+  - Snapshot / restore local DB state (!!!)
+
+  `develop` fails if APP_ENV (or whatever) is `production`. Or maybe even
+  `testing`.
+
+- [ ] Add default user table(s) to database.
+
+
+- [ ] Add authentication
+  - [ ] password
+  - [ ] third party?
+
+
+- [ ] Add middleware concept
+
+- [ ] Add authorization
+  - for specific routes / resources / etc
+
+- [ ] Add basic text views
+  Partially done; see the /time route. But we need to figure out where to
+  store templates, static files, etc.
+
+- [ ] fix process management: if you control-c `master` process sometimes it
+  leaves around `master-bin`, `logger-bin`, and `diachron:nnnn` processes.
+  Huge problem.
+
+## medium importance
+
+- [ ] Add a log viewer
+  - with queries
+  - convert to logfmt and is there a viewer UI we could pull in and use
+    instead?
+
+- [ ] add nested routes. Note that this might be easy to do without actually
+  changing the logic in express/routes.ts. A function that takes an array
+  of routes and maps over them rewriting them. Maybe.
+
+- [ ] related: add something to do with default templates and stuff... I
+  think we can make handlers a lot shorter to write, sometimes not even
+  necessary at all, with some sane defaults and an easy to use override
+  mechanism
+
+- [ ] time library
+
+- [ ] fill in the rest of express/http-codes.ts
+
+- [ ] fill out express/content-types.ts
+
+
+- [ ] identify redundant "old skool" and ajax routes, factor out their
+  commonalities, etc.
+
+- [ ] figure out and add logging to disk
+
+- [ ] I don't really feel close to satisfied with template location /
+  rendering / etc. Rethink and rework.
+
+- [ ] Add email verification (this is partially done already)
+
+- [ ] Reading .env files and dealing with the environment should be immune to
+  the extent possible from idiotic errors
+
+- [ ] Update check script:
+  - [x] shellcheck on shell scripts
+  - [x] `go vet` on go files
+  - [x] `golangci-lint` on go files
+  - [x] Run `go fmt` on all .go files
+  - [ ] Eventually, run unit tests
+
+- [ ] write docs
+  - upgrade docs
+  - starting docs
+  - taking over docs
+  - reference
+  - internals
+
+- [ ] make migration creation default to something like yyyy-mm-dd_ssss (are
+  9999 migrations in a day enough?)
+
+- [ ] clean up `cmd` and `mgmt`: do the right thing with their commonalities
+  and make very plain which is which for what. Consider additional
+  commands. Maybe `develop` for specific development tasks,
+  `operate` for operational tasks, and we keep `cmd` for project-specific
+  commands. Something like that.
+
+
+## low importance
+
+- [ ] add a prometheus-style `/metrics` endpoint to master
+- [ ] create a metrics server analogous to the logging server
+  - accept various stats from the workers (TBD)
+
+- [ ] move `master-bin` into a subdir like `master/cmd` or whatever is
+  idiomatic for golang programs; adapt `master` wrapper shell script
+  accordingly
+
+- [ ] flesh out the `sync.sh` script
+  - [ ] update framework-managed node
+  - [ ] update framework-managed pnpm
+  - [ ] update pnpm-managed deps
+  - [ ] rebuild golang programs
+
+- [ ] If the number of workers is large, then there is a long lapse between
+  when you change a file and when the server responds
+  - One solution: start and stop workers serially: stop one, restart it with new
+    code; repeat
+  - Slow start them: only start a few at first
+
+- [ ] in express/user.ts: FIXME: set createdAt and updatedAt to start of epoch
+
+
+
+
+## finished
+
+- [x] Reimplement fixup.sh
+- [x] run shfmt on all shell scripts (and the files they `source`)
+- [x] Run `go fmt` on all .go files
+- [x] Run ~~prettier~~ biome on all .ts files and maybe others
+
+- [x] Adapt master program so that it reads configuration from command line
+  args instead of from environment variables
+  - Should have sane defaults
+  - Adding new arguments should be easy and obvious
+
+- [x] Add wrapper script to run master program (so that various assumptions related
+  to relative paths are safer)
+
+- [x] Add logging service
+  - New golang program, in the same directory as master
+  - Intended to be started by master
+  - Listens to a port specified command line arg
+  - Accepts POSTed (or possibly PUT) json messages, currently in a
+    to-be-defined format. We will work on this format later.
+  - Keeps the most recent N messages in memory. N can be a fairly large
+    number; let's start by assuming 1 million.
+
+- [x] Log to logging service from the express backend
+  - Fill out types and functions in `express/logging.ts`
+
+- [x] Add first cut at database access. Remember that ORMs are not all that!
+
+- [x] Create initial docker-compose.yml file for local development
+  - include most recent stable postgres
+  - include beanstalkd
+  - include memcached
+  - include redis
+  - include mailpit

check.sh (new executable file)
@@ -0,0 +1,30 @@
+#!/bin/bash
+
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
+
+cd "$DIR"
+
+# Keep exclusions sorted. And list them here.
+#
+# - SC2002 is useless use of cat
+#
+exclusions="SC2002"
+
+source "$DIR/framework/versions"
+
+if [[ $# -ne 0 ]]; then
+	shellcheck --exclude="$exclusions" "$@"
+	exit $?
+fi
+
+shell_scripts="$(fd .sh | xargs)"
+
+# The files we need to check all either end in .sh or else they're the files
+# in framework/cmd.d and framework/shims. -x instructs shellcheck to also
+# check `source`d files.
+
+shellcheck -x --exclude="$exclusions" "$DIR/cmd" "$DIR"/framework/cmd.d/* "$DIR"/framework/shims/* "$shell_scripts"
+
+pushd "$DIR/master"
+docker run --rm -v $(pwd):/app -w /app golangci/golangci-lint:$golangci_lint golangci-lint run
+popd
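The two-mode dispatch in check.sh (lint exactly the files the caller names, otherwise lint a default set) can be sketched in isolation. In this sketch `echo` stands in for shellcheck so it runs even where shellcheck is not installed, and the default file list is hypothetical.

```shell
#!/bin/bash
# Sketch of check.sh's argument-forwarding branch. `echo` stands in for
# shellcheck; the default file list below is a hypothetical placeholder.
set -eu
exclusions="SC2002"

run_lint() {
  echo "shellcheck --exclude=$exclusions $*"
}

if [[ $# -ne 0 ]]; then
  run_lint "$@"                # check only the files the caller named
else
  run_lint ./cmd ./check.sh    # hypothetical default list
fi
```

Passing files on the command line lets you re-lint one script quickly, while the bare invocation covers the whole tree.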

cmd (new executable file)
@@ -0,0 +1,27 @@
+#!/bin/bash
+
+# This file belongs to the framework. You are not expected to modify it.
+
+# Managed binary runner - runs framework-managed binaries like node, pnpm, tsx
+# Usage: ./cmd <command> [args...]
+
+set -eu
+
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
+
+if [ $# -lt 1 ]; then
+	echo "Usage: ./cmd <command> [args...]"
+	echo ""
+	echo "Available commands:"
+	for cmd in "$DIR"/framework/cmd.d/*; do
+		if [ -x "$cmd" ]; then
+			basename "$cmd"
+		fi
+	done
+	exit 1
+fi
+
+subcmd="$1"
+shift
+
+exec "$DIR"/framework/cmd.d/"$subcmd" "$@"
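The dispatch pattern in `cmd` above resolves a subcommand name to an executable in a command directory and execs it. A self-contained sketch of that resolution step, using a temporary directory in place of framework/cmd.d and a made-up `greet` subcommand:

```shell
#!/bin/bash
# Sketch of the ./cmd dispatch: a subcommand name selects an executable
# from a command directory. A temp dir stands in for framework/cmd.d.
set -eu
dir="$(mktemp -d)"
trap 'rm -rf "$dir"' EXIT

mkdir -p "$dir/cmd.d"
printf '#!/bin/bash\necho "hello from greet"\n' > "$dir/cmd.d/greet"
chmod +x "$dir/cmd.d/greet"

subcmd="greet"
out="$("$dir/cmd.d/$subcmd")"
echo "$out"    # prints "hello from greet"
```

Because dispatch is just path resolution plus `exec`, adding a new subcommand means dropping one executable file into the directory; no registration table needs editing.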

Deleted file:
@@ -1,36 +0,0 @@
-import { Extensible } from "./interfaces";
-
-const contentTypes: Extensible = {
-  text: {
-    plain: "text/plain",
-    html: "text/html",
-    css: "text/css",
-    javascript: "text/javascript",
-    xml: "text/xml",
-  },
-  image: {
-    jpeg: "image/jpeg",
-    png: "image/png",
-    gif: "image/gif",
-    svgPlusXml: "image/svg+xml",
-    webp: "image/webp",
-  },
-  audio: {
-    "mpeg": "audio/mpeg",
-    "wav": "audio/wav",
-  },
-  video: {
-    mp4: "video/mp4",
-    webm: "video/webm",
-    xMsvideo: "video/x-msvideo",
-  },
-  application: {
-    json: "application/json",
-    pdf: "application/pdf",
-    zip: "application/zip",
-    xWwwFormUrlencoded: "x-www-form-urlencoded",
-    octetStream: "octet-stream",
-  },
-};
-
-export { contentTypes };

Deleted file:
@@ -1,41 +0,0 @@
-import { Extensible } from "./interfaces";
-
-type HttpCode = {
-  code: number;
-  name: string;
-  description?: string;
-};
-type Group = "success" | "redirection" | "clientErrors" | "serverErrors";
-type CodeDefinitions = {
-  [K in Group]: {
-    [K: string]: HttpCode;
-  };
-};
-
-const httpCodes: CodeDefinitions & Extensible = {
-  success: {
-    OK: { code: 200, name: "OK", "description": "" },
-    Created: { code: 201, name: "Created" },
-    Accepted: { code: 202, name: "Accepted" },
-    NoContent: { code: 204, name: "No content" },
-  },
-  redirection: {
-    // later
-  },
-  clientErrors: {
-    BadRequest: { code: 400, name: "Bad Request" },
-    Unauthorized: { code: 401, name: "Unauthorized" },
-    Forbidden: { code: 403, name: "Forbidden" },
-    NotFound: { code: 404, name: "Not Found" },
-    MethodNotAllowed: { code: 405, name: "Method Not Allowed" },
-    NotAcceptable: { code: 406, name: "Not Acceptable" },
-    // More later
-  },
-  serverErrors: {
-    InternalServerError: { code: 500, name: "Internal Server Error" },
-    NotImplemented: { code: 500, name: "Not implemented" },
-    // more later
-  },
-};
-
-export { httpCodes };

Deleted file:
@@ -1 +0,0 @@
-export interface Extensible {}

develop (new executable file)
@@ -0,0 +1,27 @@
+#!/bin/bash
+
+# This file belongs to the framework. You are not expected to modify it.
+
+# Development command runner - parallel to ./mgmt for development tasks
+# Usage: ./develop <command> [args...]
+
+set -eu
+
+DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
+
+if [ $# -lt 1 ]; then
+	echo "Usage: ./develop <command> [args...]"
+	echo ""
+	echo "Available commands:"
+	for cmd in "$DIR"/framework/develop.d/*; do
+		if [ -x "$cmd" ]; then
+			basename "$cmd"
+		fi
+	done
+	exit 1
+fi
+
+subcmd="$1"
+shift
+
+exec "$DIR"/framework/develop.d/"$subcmd" "$@"

docker-compose.yml (new file)
@@ -0,0 +1,35 @@
+services:
+  postgres:
+    image: postgres:17
+    ports:
+      - "5432:5432"
+    environment:
+      POSTGRES_USER: diachron
+      POSTGRES_PASSWORD: diachron
+      POSTGRES_DB: diachron
+    volumes:
+      - postgres_data:/var/lib/postgresql/data
+
+  redis:
+    image: redis:7
+    ports:
+      - "6379:6379"
+
+  memcached:
+    image: memcached:1.6
+    ports:
+      - "11211:11211"
+
+  beanstalkd:
+    image: schickling/beanstalkd
+    ports:
+      - "11300:11300"
+
+  mailpit:
+    image: axllent/mailpit
+    ports:
+      - "1025:1025" # SMTP
+      - "8025:8025" # Web UI
+
+volumes:
+  postgres_data:

docs/commands.md (new file)
@@ -0,0 +1,125 @@
+# The Three Types of Commands
+
+This framework deliberately separates *how* you interact with the system into three distinct command types. The split is not cosmetic; it encodes safety, intent, and operational assumptions directly into the tooling so that mistakes are harder to make under stress.
+
+The guiding idea: **production should feel boring and safe; exploration should feel powerful and a little dangerous; the application itself should not care how it is being operated.**
+
+---
+
+## 1. Application Commands (`app`)
+
+**What they are**
+Commands defined *by the application itself*, for its own domain needs. They are not part of the framework, even though they are built on top of it.
+
+The framework provides structure and affordances; the application supplies meaning.
+
+**Core properties**
+
+* Express domain behavior, not infrastructure concerns
+* Safe by definition
+* Deterministic and repeatable
+* No environment‑dependent semantics
+* Identical behavior in dev, staging, and production
+
+**Examples**
+
+* Handling HTTP requests
+* Rendering templates
+* Running background jobs / queues
+* Sending emails triggered by application logic
+
+**Non‑goals**
+
+* No schema changes
+* No data backfills
+* No destructive behavior
+* No operational or lifecycle management
+
+**Rule of thumb**
+If removing the framework would require rewriting *how* it runs but not *what* it does, the command belongs here.
+
+---
+
+## 2. Management Commands (`mgmt`)
+
+**What they are**
+Operational, *production‑safe* commands used to evolve and maintain a live system.
+
+These commands assume real data exists and must not be casually destroyed.
+
+**Core properties**
+
+* Forward‑only
+* Idempotent or safely repeatable
+* Designed to run in production
+* Explicit, auditable intent
+
+**Examples**
+
+* Applying migrations
+* Running seeders that assert invariant data
+* Reindexing or rebuilding derived data
+* Rotating keys, recalculating counters
+
+**Design constraints**
+
+* No implicit rollbacks
+* No hidden destructive actions
+* Fail fast if assumptions are violated
+
+**Rule of thumb**
+If you would run it at 3am while tired and worried, it must live here.
+
+---
+
+## 3. Development Commands (`develop`)
+
+**What they are**
+Sharp, *unsafe by design* tools meant exclusively for local development and experimentation.
+
+These commands optimize for speed, learning, and iteration — not safety.
+
+**Core properties**
+
+* Destructive operations allowed
+* May reset or mutate large amounts of data
+* Assume a clean or disposable environment
+* Explicitly gated in production
+
+**Examples**
+
+* Dropping and recreating databases
+* Rolling migrations backward
+* Loading fixtures or scenarios
+* Generating fake or randomized data
+
+**Safety model**
+
+* Hard to run in production
+* Requires explicit opt‑in if ever enabled
+* Clear, noisy warnings when invoked
+
+**Rule of thumb**
+If it would be irresponsible to run against real user data, it belongs here.
+
+---
+
+## Why This Split Matters
+
+Many frameworks blur these concerns, leading to:
+
+* Fearful production operations
+* Overpowered dev tools leaking into prod
+* Environment‑specific behavior and bugs
+
+By naming and enforcing these three command types:
+
+* Intent is visible at the CLI level
+* Safety properties are architectural, not cultural
+* Developers can move fast *without* normalizing risk
+
+---
+
+## One‑Sentence Summary
+
+> **App commands run the system, mgmt commands evolve it safely, and develop commands let you break things on purpose — but only where it’s allowed.**

docs/concentric-circles.md (new file)
@@ -0,0 +1,37 @@
+Let's consider a bullseye with the following concentric circles:
+
+- Ring 0: small, simple systems
+  - Single jurisdiction
+  - Email + password
+  - A few roles
+  - Naïve or soft deletion
+  - Minimal audit needs
+
+- Ring 1: grown-up systems
+  - Long-lived data
+  - Changing requirements
+  - Shared accounts
+  - GDPR-style erasure/anonymization
+  - Some cross-border concerns
+  - Historical data must remain usable
+  - “Oops, we should have thought about that” moments
+
+- Ring 2: heavy compliance
+  - Formal audit trails
+  - Legal hold
+  - Non-repudiation
+  - Regulatory reporting
+  - Strong identity guarantees
+  - Jurisdiction-aware data partitioning
+
+- Ring 3: banking / defense / healthcare at scale
+  - Cryptographic auditability
+  - Append-only ledgers
+  - Explicit legal models
+  - Independent compliance teams
+  - Lawyers embedded in engineering
+
+diachron is designed to be suitable for Rings 0 and 1. Occasionally we may
+look over the fence into Ring 2, but it's not what we've principally designed
+for. Please take this framing into account when evaluating diachron for
+greenfield projects.
1
docs/deployment.md
Normal file
@@ -0,0 +1 @@
.
142
docs/freedom-hacking-and-responsibility.md
Normal file
@@ -0,0 +1,142 @@
# Freedom, Hacking, and Responsibility

This framework is **free and open source software**.

That fact is not incidental. It is a deliberate ethical, practical, and technical choice.

This document explains how freedom to modify coexists with strong guidance about *how the framework is meant to be used* — without contradiction, and without apology.

---

## The short version

* This is free software. You are free to modify it.
* The framework has documented invariants for good reasons.
* You are encouraged to explore, question, and patch.
* You are discouraged from casually undermining guarantees you still expect to rely on.
* Clarity beats enforcement.

Freedom with understanding beats both lock-in and chaos.

---

## Your Freedom

You are free to:

* study the source code
* run the software for any purpose
* modify it in any way
* fork it
* redistribute it, with or without changes
* submit patches, extensions, or experiments

…subject only to the terms of the license.

These freedoms are foundational. They are not granted reluctantly, and they are not symbolic. They exist so that:

* you can understand what your software is really doing
* you are not trapped by vendor control
* the system can outlive its original authors

---

## Freedom Is Not the Same as Endorsement

While you are free to change anything, **not all changes are equally wise**.

Some parts of the framework are carefully constrained because they encode:

* security assumptions
* lifecycle invariants
* hard-won lessons from real systems under stress

You are free to violate these constraints in your own fork.

But the framework’s documentation will often say things like:

* “do not modify this”
* “application code must not depend on this”
* “this table or class is framework-owned”

These statements are **technical guidance**, not legal restrictions.

They exist to answer the question:

> *If you want this system to remain upgradeable, predictable, and boring — what should you leave alone?*

---

## The Intended Social Contract

The framework makes a clear offer:

* We expose our internals so you can learn.
* We provide explicit extension points so you can adapt.
* We document invariants so you don’t have to rediscover them the hard way.

In return, we ask that:

* application code respects documented boundaries
* extensions use explicit seams rather than hidden hooks
* patches that change invariants are proposed consciously, not accidentally

Nothing here is enforced by technical locks.

It is enforced — insofar as it is enforced at all — by clarity and shared expectations.

---

## Hacking Is Welcome

Exploration is not just allowed; it is encouraged.

Good reasons to hack on the framework include:

* understanding how it works
* evaluating whether its constraints make sense
* adapting it to unfamiliar environments
* testing alternative designs
* discovering better abstractions

Fork it. Instrument it. Break it. Learn from it.

Many of the framework’s constraints exist *because* someone once ignored them and paid the price.

---

## Patches, Not Patches-in-Place

If you discover a problem or a better design:

* patches are welcome
* discussions are welcome
* disagreements are welcome

What is discouraged is **quietly patching around framework invariants inside application code**.

That approach:

* obscures intent
* creates one-off local truths
* makes systems harder to reason about

If the framework is wrong, it should be corrected *at the framework level*, or consciously forked.

---

## Why This Is Not a Contradiction

Strong opinions and free software are not enemies.

Freedom means you can change the software.

Responsibility means understanding what you are changing, and why.

A system that pretends every modification is equally safe is dishonest.

A system that hides its internals to prevent modification is hostile.

This framework aims for neither.
27
docs/groups-and-roles.md
Normal file
@@ -0,0 +1,27 @@
- Role: a named bundle of responsibilities (editor, admin, member)

- Group: a scope or context (org, team, project, publication)

- Permission / Capability (capability preferred in code): a boolean fact about
  allowed behavior

## tips

- In the database, capabilities are boolean values. Their names should be
  verb-subject. Don't include `can` and definitely do not include `cannot`.

  ✔️ `edit_post`

  ❌ `cannot_remove_comment`

- The capabilities table is deliberately flat. If you need to group them, use
  `.` as a delimiter and sort and filter accordingly in queries and in your
  UI.

  ✔️ `blog.edit_post`

  ✔️ `blog.moderate_comment`

  or

  ✔️ `blog.post.edit`

  ✔️ `blog.post.delete`

  ✔️ `blog.comment.moderate`

  ✔️ `blog.comment.edit`

  are all fine.
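The prefix-grouping convention above can be sketched in a few lines. Note that `capabilitiesInGroup` is a hypothetical helper, not part of diachron:

```typescript
// Sketch: grouping a flat list of capability names by `.` prefix.
// `capabilitiesInGroup` is a hypothetical helper, not part of diachron.
function capabilitiesInGroup(all: string[], group: string): string[] {
  return all.filter((name) => name.startsWith(`${group}.`)).sort();
}

const caps = [
  "blog.post.edit",
  "blog.post.delete",
  "blog.comment.moderate",
  "blog.edit_post",
];

console.log(capabilitiesInGroup(caps, "blog.post"));
// [ 'blog.post.delete', 'blog.post.edit' ]
```

The same `LIKE 'blog.post.%'`-style filtering works in SQL queries against the flat capabilities table.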
17
docs/index.md
Normal file
@@ -0,0 +1,17 @@
misc notes for now. of course this needs to be written up for real.

## execution context

The execution context represents facts such as the runtime directory, the
operating system, hardware, and filesystem layout, distinct from environment
variables or request-scoped context.

## philosophy

- TODO-DESIGN.md
- concentric-circles.md
- nomenclature.md
- mutability.md
- commands.md
- groups-and-roles.md
34
docs/migrations-and-seeders-and-database-table-ownership.md
Normal file
@@ -0,0 +1,34 @@
Some database tables are owned by diachron and some are owned by the
application.

This also applies to seeders: some are owned by diachron and some by the
application.

The database's structure is managed by migrations written in SQL.

Each migration gets its own file. These files' names should match
`yyyy-mm-dd_ss-description.sql`, e.g. `2026-01-01_01-users.sql`.

Files are sorted lexicographically by name and applied in order.

Note: in the future we may relax or modify the restriction on migration file
names, but they'll continue to be applied in lexicographical order.

## framework and application migrations

Migrations owned by the framework are kept in a separate directory from those
owned by applications. Pending framework migrations, if any, are applied
before pending application migrations, if any.

diachron will go to some lengths to ensure that framework migrations do not
break applications.

## no downward migrations

diachron does not provide them. "The only way out is through."

When developing locally, you can use the command `develop reset-db`. **NEVER
USE THIS IN PRODUCTION!** Always be sure that you can "get back to where you
were". Being careful when creating migrations and seeders can help, but
dumping and restoring known-good copies of the database can also take you a
long way.
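As a quick illustration of the ordering rule: a plain lexicographic sort of filenames in the `yyyy-mm-dd_ss-description.sql` format yields the apply order (the filenames below are invented):

```typescript
// Invented filenames following the yyyy-mm-dd_ss-description.sql convention.
const files = [
  "2026-01-02_01-sessions.sql",
  "2026-01-01_02-capabilities.sql",
  "2026-01-01_01-users.sql",
];

// Plain lexicographic sort; this is the apply order.
const applyOrder = [...files].sort();

console.log(applyOrder);
// [ '2026-01-01_01-users.sql', '2026-01-01_02-capabilities.sql', '2026-01-02_01-sessions.sql' ]
```

Because the date and sequence number come first, zero-padded, string order and chronological order coincide.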
1
docs/mutability.md
Normal file
@@ -0,0 +1 @@
Describe and define what is expected to be mutable and what is not.
84
docs/new-project.md
Normal file
@@ -0,0 +1,84 @@
If any of the steps here don't work or are unclear in any way, it is
probably a bug and we want to fix it!

## how to create a new diachron project

1. Create an empty directory for your project. This directory can be inside of a
   git repository but it doesn't have to be.

2. Download the sync program and put it in the empty directory created in the
   previous step. There is a sync program for every version of diachron.
   You'll usually want to use the most recent stable version. [FIXME: explain
   why you'd want to use something else.] And you'll want the version for
   the operating system and hardware you're using.

3. Run the `setup` program. This program is [FIXME: will be] written in
   [go](https://go.dev), so as long as you have downloaded the right file, it
   ought to work.

   This will create several files and directories. It will also download a number
   of binaries and put them in different places in some of the directories
   that are created.

4. At this point, you should have a usable, if not very useful, diachron
   application. To see what it does, run `develop run`; it will run a
   simple web application on localhost:3000. To make changes, have a look at
   the files `src/app.ts` and `src/routes.ts`.
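To give a feel for what `src/routes.ts` contains, here is a minimal, self-contained sketch of the route shape, with field types simplified from the `Route`/`Call`/`Result` definitions used elsewhere in this repository; names and shapes in a generated project may differ:

```typescript
// Simplified stand-ins for diachron's Route/Call/Result types.
type Method = "GET" | "POST" | "PUT" | "PATCH" | "DELETE";

interface Call {
  path: string;
  method: Method;
}

interface Result {
  code: number; // simplified: the real type wraps the status code
  contentType: string;
  result: string;
}

interface Route {
  path: string;
  methods: Method[];
  handler: (call: Call) => Promise<Result>;
}

// A single illustrative route.
const hello: Route = {
  path: "/hello",
  methods: ["GET"],
  handler: async (call) => ({
    code: 200,
    contentType: "text/plain",
    result: `hello from ${call.path}`,
  }),
};

export const routes: Route[] = [hello];
```

A real project imports these types from the framework rather than declaring them locally.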
## where do we go from here?

Now that we have a very simple project, we need to attend to a few other
important matters.

### version control

(These instructions assume you're using git. If you're using a different
version control system then you will need to make allowances. In particular,
you should convert the `.gitignore` file to whatever your version control
system uses.)

You should add the whole directory to git and commit it. There will be two
`.gitignore` files, one in the root, and one in the `framework/` directory.

The root `.gitignore` created for you will be a good starting point, but you
can make changes to it as you see fit. However, you should not ever modify
`framework/.gitignore`. More on this in the next section.

### working with diachron

There are four commands to know about:

- `sync` is used to install all dependencies, including the ones you specify
  as well as the ones that diachron provides
- `develop` is used to run "development-related" tasks. Run `develop help` to
  get an overview of what it can do.
- `operate` is used to run "operations-related" tasks. Run `operate help` to
  get an overview of what it can do.
- `cmd` runs diachron-managed commands, such as `pnpm`. When working on a
  diachron project, you should always use these diachron-managed commands
  instead of whatever else you may have available.

### what files belong to your project, what files belong to the framework

In a new diachron project, there are some files and directories that are
"owned" by the framework and others that are "owned" by the programmer.

In particular, you own everything in the directory `src/`. You own
`README.md` and `package.json` and `pnpm-lock.yaml`. You own any other files
or directories you create.

Everything else _belongs to the framework_ and you are not expected to change
it except when upgrading.

This is just an overview. It is exhaustively documented in
[ownership.md](ownership.md).

### updates

### when the docs sound a bit authoritarian...

Finally, remember that diachron's license allows you to do whatever you like
with it, with very few limitations. This includes making changes to files
about which, in the documentation, we say "you must not change" or "you are
not expected to change."
15
docs/nomenclature.md
Normal file
@@ -0,0 +1,15 @@
We use `Call` and `Result` for our own types that wrap `Request` and
`Response`.

This hopefully will make things less confusing and avoid problems with shadowing.
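A minimal sketch of the idea, with illustrative field shapes drawn from `express/app.ts` (the real definitions live in the framework's types module and carry more fields):

```typescript
// Illustrative shapes only; the real Call also carries user, session, and
// route parameters, and Result's status code is a structured object.
interface Call {
  pattern: string;  // route pattern the request matched
  path: string;     // concrete URL requested
  method: string;
  request: unknown; // the wrapped Express Request
}

interface Result {
  code: { code: number }; // e.g. httpCodes.clientErrors.NotFound
  contentType: string;
  result: string;         // response body
}

const notFound: Result = {
  code: { code: 404 },
  contentType: "text/plain",
  result: "not found!",
};

console.log(notFound.result);
```

Handlers receive a `Call` and return a `Result`; the underlying Express `Request`/`Response` names stay free for Express itself.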
## meta

- We use _algorithmic complexity_ for performance discussions, when
  things like Big-O come up, etc

- We use _conceptual complexity_ for design and architecture

- We use _cognitive load_ when talking about developer experience

- We use _operational burden_ when talking about production reality
219
docs/ownership.md
Normal file
@@ -0,0 +1,219 @@
# Framework vs Application Ownership

This document defines **ownership boundaries** between the framework and application code. These boundaries are intentional and non-negotiable: they exist to preserve upgradeability, predictability, and developer sanity under stress.

Ownership answers a simple question:

> **Who is allowed to change this, and under what rules?**

The framework draws a hard line between *framework‑owned* and *application‑owned* concerns, while still encouraging extension through explicit, visible mechanisms.

---

## Core Principle

The framework is not a library of suggestions. It is a **runtime with invariants**.

Application code:

* **uses** the framework
* **extends** it through defined seams
* **never mutates or overrides its invariants**

Framework code:

* guarantees stable behavior
* owns critical lifecycle and security concerns
* must remain internally consistent across versions

Breaking this boundary creates systems that work *until they don’t*, usually during upgrades or emergencies.

---

## Database Ownership

### Framework‑Owned Tables

Certain database tables are **owned and managed exclusively by the framework**.

Examples (illustrative, not exhaustive):

* authentication primitives
* session or token state
* internal capability/permission metadata
* migration bookkeeping
* framework feature flags or invariants

#### Rules

Application code **must not**:

* modify schema
* add columns
* delete rows
* update rows directly
* rely on undocumented columns or behaviors

Application code **may**:

* read via documented framework APIs
* reference stable identifiers explicitly exposed by the framework

Think of these tables as **private internal state** — even though they live in your database.

> If the framework needs you to interact with this data, it will expose an API for it.

#### Rationale

These tables:

* encode security or correctness invariants
* may change structure across framework versions
* must remain globally coherent

Treating them as app‑owned data tightly couples your app to framework internals and blocks safe upgrades.

---

### Application‑Owned Tables

All domain data belongs to the application.

Examples:

* users (as domain actors, not auth primitives)
* posts, orders, comments, invoices
* business‑specific joins and projections
* denormalized or performance‑oriented tables

#### Rules

Application code:

* owns schema design
* owns migrations
* owns constraints and indexes
* may evolve these tables freely

The framework:

* never mutates application tables implicitly
* interacts only through explicit queries or contracts

#### Integration Pattern

Where framework concepts must relate to app data:

* use **foreign keys to framework‑exposed identifiers**, or
* introduce **explicit join tables** owned by the application

No hidden coupling, no magic backfills.

---

## Code Ownership

### Framework‑Owned Code

Some classes, constants, and modules are **framework‑owned**.

These include:

* core request/response abstractions
* auth and user primitives
* capability/permission evaluation logic
* lifecycle hooks
* low‑level utilities relied on by the framework itself

#### Rules

Application code **must not**:

* modify framework source
* monkey‑patch or override internals
* rely on undocumented behavior
* change constant values or internal defaults

Framework code is treated as **read‑only** from the app’s perspective.

---

### Extension Is Encouraged (But Explicit)

Ownership does **not** mean rigidity.

The framework is designed to be extended via **intentional seams**, such as:

* subclassing
* composition
* adapters
* delegation
* configuration objects
* explicit registration APIs

#### Preferred Patterns

* **Subclass when behavior is stable and conceptual**
* **Compose when behavior is contextual or optional**
* **Delegate when authority should remain with the framework**

What matters is that extension is:

* visible in code
* locally understandable
* reversible

No spooky action at a distance.
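The compose-and-delegate pattern can be sketched as wrapping a framework service with app-specific behavior instead of patching its internals. The `CapabilityChecker` interface below is hypothetical, standing in for any framework-exposed seam:

```typescript
// Hypothetical seam: an interface the framework exposes for extension.
interface CapabilityChecker {
  allows(capability: string): boolean;
}

// Framework-owned implementation (treated as read-only by the app).
class FrameworkChecker implements CapabilityChecker {
  constructor(private granted: Set<string>) {}
  allows(capability: string): boolean {
    return this.granted.has(capability);
  }
}

// Application-owned wrapper: adds logging, delegates the actual decision.
class LoggingChecker implements CapabilityChecker {
  constructor(private inner: CapabilityChecker) {}
  allows(capability: string): boolean {
    const ok = this.inner.allows(capability);
    console.log(`capability ${capability}: ${ok ? "allowed" : "denied"}`);
    return ok;
  }
}

const checker: CapabilityChecker = new LoggingChecker(
  new FrameworkChecker(new Set(["blog.post.edit"])),
);
```

The wrapper is visible in code, locally understandable, and reversible: delete it and the framework behavior is unchanged.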
---

## What the App Owns Completely

The application fully owns:

* domain models and data shapes
* SQL queries and result parsing
* business rules
* authorization policy *inputs* (not the engine)
* rendering decisions
* feature flags specific to the app
* performance trade‑offs

The framework does not attempt to infer intent from your domain.

---

## What the Framework Guarantees

In return for respecting ownership boundaries, the framework guarantees:

* stable semantics across versions
* forward‑only migrations for its own tables
* explicit deprecations
* no silent behavior changes
* identical runtime behavior in dev and prod

The framework may evolve internally — **but never by reaching into your app’s data or code**.

---

## A Useful Mental Model

* Framework‑owned things are **constitutional law**
* Application‑owned things are **legislation**

You can write any laws you want — but you don’t amend the constitution inline.

If you need a new power, the framework should expose it deliberately.

---

## Summary

* Ownership is about **who is allowed to change what**
* Framework‑owned tables and code are read‑only to the app
* Application‑owned tables and code are sovereign
* Extension is encouraged, mutation is not
* Explicit seams beat clever hacks

Respecting these boundaries keeps systems boring — and boring systems survive stress.
1
docs/upgrades.md
Normal file
@@ -0,0 +1 @@
.
2
express/.gitignore
vendored
Normal file
@@ -0,0 +1,2 @@
out/
dist/
173
express/app.ts
Normal file
@@ -0,0 +1,173 @@
import express, {
  type Request as ExpressRequest,
  type Response as ExpressResponse,
} from "express";
import { match } from "path-to-regexp";
import { Session } from "./auth";
import { cli } from "./cli";
import { contentTypes } from "./content-types";
import { runWithContext } from "./context";
import { core } from "./core";
import { httpCodes } from "./http-codes";
import { request } from "./request";
import { routes } from "./routes";

// import { URLPattern } from 'node:url';
import {
  AuthenticationRequired,
  AuthorizationDenied,
  type Call,
  type InternalHandler,
  isRedirect,
  type Method,
  massageMethod,
  methodParser,
  type ProcessedRoute,
  type Result,
  type Route,
} from "./types";

const app = express();

// Parse request bodies
app.use(express.json());
app.use(express.urlencoded({ extended: true }));

core.logging.log({ source: "logging", text: ["1"] });
const processedRoutes: { [K in Method]: ProcessedRoute[] } = {
  GET: [],
  POST: [],
  PUT: [],
  PATCH: [],
  DELETE: [],
};

function _isPromise<T>(value: T | Promise<T>): value is Promise<T> {
  return typeof (value as any)?.then === "function";
}

routes.forEach((route: Route, _idx: number, _allRoutes: Route[]) => {
  // const pattern /*: URLPattern */ = new URLPattern({ pathname: route.path });
  const matcher = match<Record<string, string>>(route.path);
  const methodList = route.methods;

  const handler: InternalHandler = async (
    expressRequest: ExpressRequest,
  ): Promise<Result> => {
    const method = massageMethod(expressRequest.method);

    console.log("method", method);

    if (!methodList.includes(method)) {
      // XXX: Worth asserting this?
    }

    console.log("request.originalUrl", expressRequest.originalUrl);

    // Authenticate the request
    const auth = await request.auth.validateRequest(expressRequest);

    const req: Call = {
      pattern: route.path,
      path: expressRequest.originalUrl,
      method,
      parameters: { one: 1, two: 2 },
      request: expressRequest,
      user: auth.user,
      session: new Session(auth.session, auth.user),
    };

    try {
      const retval = await runWithContext({ user: auth.user }, () =>
        route.handler(req),
      );
      return retval;
    } catch (error) {
      // Handle authentication errors
      if (error instanceof AuthenticationRequired) {
        return {
          code: httpCodes.clientErrors.Unauthorized,
          contentType: contentTypes.application.json,
          result: JSON.stringify({
            error: "Authentication required",
          }),
        };
      }
      if (error instanceof AuthorizationDenied) {
        return {
          code: httpCodes.clientErrors.Forbidden,
          contentType: contentTypes.application.json,
          result: JSON.stringify({ error: "Access denied" }),
        };
      }
      throw error;
    }
  };

  for (const [_idx, method] of methodList.entries()) {
    const pr: ProcessedRoute = { matcher, method, handler };

    processedRoutes[method].push(pr);
  }
});

async function handler(
  req: ExpressRequest,
  _res: ExpressResponse,
): Promise<Result> {
  const method = await methodParser.parseAsync(req.method);

  const byMethod = processedRoutes[method];
  console.log(
    "DEBUG: req.path =",
    JSON.stringify(req.path),
    "method =",
    method,
  );
  for (const [_idx, pr] of byMethod.entries()) {
    const match = pr.matcher(req.path);
    console.log("DEBUG: trying pattern, match result =", match);
    if (match) {
      console.log("match", match);
      const resp = await pr.handler(req);

      return resp;
    }
  }

  const retval: Result = {
    code: httpCodes.clientErrors.NotFound,
    contentType: contentTypes.text.plain,
    result: "not found!",
  };

  return retval;
}

app.use(async (req: ExpressRequest, res: ExpressResponse) => {
  const result0 = await handler(req, res);

  const code = result0.code.code;
  const result = result0.result;

  console.log(result);

  // Set any cookies from the result
  if (result0.cookies) {
    for (const cookie of result0.cookies) {
      res.cookie(cookie.name, cookie.value, cookie.options ?? {});
    }
  }

  if (isRedirect(result0)) {
    res.redirect(code, result0.redirect);
  } else {
    res.status(code).send(result);
  }
});

process.title = `diachron:${cli.listen.port}`;

app.listen(cli.listen.port, cli.listen.host, () => {
  console.log(`Listening on ${cli.listen.host}:${cli.listen.port}`);
});
20
express/auth/index.ts
Normal file
@@ -0,0 +1,20 @@
// index.ts
//
// Barrel export for auth module.
//
// NOTE: authRoutes is NOT exported here to avoid circular dependency:
//   services.ts → auth/index.ts → auth/routes.ts → services.ts
// Import authRoutes directly from "./auth/routes" instead.

export { hashPassword, verifyPassword } from "./password";
export { type AuthResult, AuthService } from "./service";
export { type AuthStore, InMemoryAuthStore } from "./store";
export { generateToken, hashToken, SESSION_COOKIE_NAME } from "./token";
export {
  type AuthMethod,
  Session,
  type SessionData,
  type TokenId,
  type TokenType,
  tokenLifetimes,
} from "./types";
70
express/auth/password.ts
Normal file
@@ -0,0 +1,70 @@
|
|||||||
|
// password.ts
|
||||||
|
//
|
||||||
|
// Password hashing using Node.js scrypt (no external dependencies).
|
||||||
|
// Format: $scrypt$N$r$p$salt$hash (all base64)
|
||||||
|
|
||||||
|
import {
  randomBytes,
  type ScryptOptions,
  scrypt,
  timingSafeEqual,
} from "node:crypto";

// Configuration
const SALT_LENGTH = 32;
const KEY_LENGTH = 64;
const SCRYPT_PARAMS: ScryptOptions = {
  N: 16384, // CPU/memory cost parameter (2^14)
  r: 8, // Block size
  p: 1, // Parallelization
};

// Promisified scrypt with options support
function scryptAsync(
  password: string,
  salt: Buffer,
  keylen: number,
  options: ScryptOptions,
): Promise<Buffer> {
  return new Promise((resolve, reject) => {
    scrypt(password, salt, keylen, options, (err, derivedKey) => {
      if (err) {
        reject(err);
      } else {
        resolve(derivedKey);
      }
    });
  });
}

async function hashPassword(password: string): Promise<string> {
  const salt = randomBytes(SALT_LENGTH);
  const hash = await scryptAsync(password, salt, KEY_LENGTH, SCRYPT_PARAMS);

  const { N, r, p } = SCRYPT_PARAMS;
  return `$scrypt$${N}$${r}$${p}$${salt.toString("base64")}$${hash.toString("base64")}`;
}

async function verifyPassword(
  password: string,
  stored: string,
): Promise<boolean> {
  const parts = stored.split("$");
  if (parts.length !== 7 || parts[1] !== "scrypt") {
    throw new Error("Invalid password hash format");
  }

  const [, , nStr, rStr, pStr, saltB64, hashB64] = parts;
  const salt = Buffer.from(saltB64, "base64");
  const storedHash = Buffer.from(hashB64, "base64");

  const computedHash = await scryptAsync(password, salt, storedHash.length, {
    N: parseInt(nStr, 10),
    r: parseInt(rStr, 10),
    p: parseInt(pStr, 10),
  });

  return timingSafeEqual(storedHash, computedHash);
}

export { hashPassword, verifyPassword };
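The PHC-style string above can be exercised end to end. A minimal sketch, using the synchronous `scryptSync` variant for brevity (the module itself uses the callback form); the password and parameter values here are illustrative:

```typescript
import { randomBytes, scryptSync, timingSafeEqual } from "node:crypto";

// Derive a key and serialize it in the same layout:
// $scrypt$N$r$p$<salt base64>$<hash base64>
const params = { N: 16384, r: 8, p: 1 };
const salt = randomBytes(32);
const hash = scryptSync("correct horse", salt, 64, params);
const stored = `$scrypt$${params.N}$${params.r}$${params.p}$${salt.toString(
  "base64",
)}$${hash.toString("base64")}`;

// Verify by re-deriving with the parameters parsed back out of the string.
// Base64 never contains "$", so splitting on "$" is safe here.
const parts = stored.split("$");
const computed = scryptSync("correct horse", Buffer.from(parts[5], "base64"), 64, {
  N: Number(parts[2]),
  r: Number(parts[3]),
  p: Number(parts[4]),
});
console.log(timingSafeEqual(hash, computed)); // true
```

Embedding N, r, and p in the stored string is what lets old hashes keep verifying after the defaults are raised.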
231 express/auth/routes.ts Normal file
@@ -0,0 +1,231 @@
// routes.ts
//
// Authentication route handlers.

import { z } from "zod";
import { contentTypes } from "../content-types";
import { httpCodes } from "../http-codes";
import { request } from "../request";
import type { Call, Result, Route } from "../types";
import {
  forgotPasswordInputParser,
  loginInputParser,
  registerInputParser,
  resetPasswordInputParser,
} from "./types";

// Helper for JSON responses
const jsonResponse = (
  code: (typeof httpCodes.success)[keyof typeof httpCodes.success],
  data: object,
): Result => ({
  code,
  contentType: contentTypes.application.json,
  result: JSON.stringify(data),
});

const errorResponse = (
  code: (typeof httpCodes.clientErrors)[keyof typeof httpCodes.clientErrors],
  error: string,
): Result => ({
  code,
  contentType: contentTypes.application.json,
  result: JSON.stringify({ error }),
});

// POST /auth/login
const loginHandler = async (call: Call): Promise<Result> => {
  try {
    const body = call.request.body;
    const { email, password } = loginInputParser.parse(body);

    const result = await request.auth.login(email, password, "cookie", {
      userAgent: call.request.get("User-Agent"),
      ipAddress: call.request.ip,
    });

    if (!result.success) {
      return errorResponse(httpCodes.clientErrors.Unauthorized, result.error);
    }

    return jsonResponse(httpCodes.success.OK, {
      token: result.token,
      user: {
        id: result.user.id,
        email: result.user.email,
        displayName: result.user.displayName,
      },
    });
  } catch (error) {
    if (error instanceof z.ZodError) {
      return errorResponse(httpCodes.clientErrors.BadRequest, "Invalid input");
    }
    throw error;
  }
};

// POST /auth/logout
const logoutHandler = async (call: Call): Promise<Result> => {
  const token = request.auth.extractToken(call.request);
  if (token) {
    await request.auth.logout(token);
  }

  return jsonResponse(httpCodes.success.OK, { message: "Logged out" });
};

// POST /auth/register
const registerHandler = async (call: Call): Promise<Result> => {
  try {
    const body = call.request.body;
    const { email, password, displayName } = registerInputParser.parse(body);

    const result = await request.auth.register(email, password, displayName);

    if (!result.success) {
      return errorResponse(httpCodes.clientErrors.Conflict, result.error);
    }

    // TODO: Send verification email with result.verificationToken
    // For now, log it for development
    console.log(
      `[AUTH] Verification token for ${email}: ${result.verificationToken}`,
    );

    return jsonResponse(httpCodes.success.Created, {
      message:
        "Registration successful. Please check your email to verify your account.",
      user: {
        id: result.user.id,
        email: result.user.email,
      },
    });
  } catch (error) {
    if (error instanceof z.ZodError) {
      return errorResponse(httpCodes.clientErrors.BadRequest, "Invalid input");
    }
    throw error;
  }
};

// POST /auth/forgot-password
const forgotPasswordHandler = async (call: Call): Promise<Result> => {
  try {
    const body = call.request.body;
    const { email } = forgotPasswordInputParser.parse(body);

    const result = await request.auth.createPasswordResetToken(email);

    // Always return success (don't reveal if email exists)
    if (result) {
      // TODO: Send password reset email
      console.log(`[AUTH] Password reset token for ${email}: ${result.token}`);
    }

    return jsonResponse(httpCodes.success.OK, {
      message:
        "If an account exists with that email, a password reset link has been sent.",
    });
  } catch (error) {
    if (error instanceof z.ZodError) {
      return errorResponse(httpCodes.clientErrors.BadRequest, "Invalid input");
    }
    throw error;
  }
};

// POST /auth/reset-password
const resetPasswordHandler = async (call: Call): Promise<Result> => {
  try {
    const body = call.request.body;
    const { token, password } = resetPasswordInputParser.parse(body);

    const result = await request.auth.resetPassword(token, password);

    if (!result.success) {
      return errorResponse(httpCodes.clientErrors.BadRequest, result.error);
    }

    return jsonResponse(httpCodes.success.OK, {
      message:
        "Password has been reset. You can now log in with your new password.",
    });
  } catch (error) {
    if (error instanceof z.ZodError) {
      return errorResponse(httpCodes.clientErrors.BadRequest, "Invalid input");
    }
    throw error;
  }
};

// GET /auth/verify-email?token=xxx
const verifyEmailHandler = async (call: Call): Promise<Result> => {
  const url = new URL(call.path, "http://localhost");
  const token = url.searchParams.get("token");

  if (!token) {
    return errorResponse(httpCodes.clientErrors.BadRequest, "Missing token");
  }

  const result = await request.auth.verifyEmail(token);

  if (!result.success) {
    return errorResponse(httpCodes.clientErrors.BadRequest, result.error);
  }

  return jsonResponse(httpCodes.success.OK, {
    message: "Email verified successfully. You can now log in.",
  });
};

// Export routes
const authRoutes: Route[] = [
  { path: "/auth/login", methods: ["POST"], handler: loginHandler },
  { path: "/auth/logout", methods: ["POST"], handler: logoutHandler },
  { path: "/auth/register", methods: ["POST"], handler: registerHandler },
  {
    path: "/auth/forgot-password",
    methods: ["POST"],
    handler: forgotPasswordHandler,
  },
  {
    path: "/auth/reset-password",
    methods: ["POST"],
    handler: resetPasswordHandler,
  },
  {
    path: "/auth/verify-email",
    methods: ["GET"],
    handler: verifyEmailHandler,
  },
];

export { authRoutes };
262 express/auth/service.ts Normal file
@@ -0,0 +1,262 @@
// service.ts
//
// Core authentication service providing login, logout, registration,
// password reset, and email verification.

import type { Request as ExpressRequest } from "express";
import {
  type AnonymousUser,
  anonymousUser,
  type User,
  type UserId,
} from "../user";
import { hashPassword, verifyPassword } from "./password";
import type { AuthStore } from "./store";
import {
  hashToken,
  parseAuthorizationHeader,
  SESSION_COOKIE_NAME,
} from "./token";
import { type SessionData, type TokenId, tokenLifetimes } from "./types";

type LoginResult =
  | { success: true; token: string; user: User }
  | { success: false; error: string };

type RegisterResult =
  | { success: true; user: User; verificationToken: string }
  | { success: false; error: string };

type SimpleResult = { success: true } | { success: false; error: string };

// Result of validating a request/token - contains both user and session
export type AuthResult =
  | { authenticated: true; user: User; session: SessionData }
  | { authenticated: false; user: AnonymousUser; session: null };

export class AuthService {
  constructor(private store: AuthStore) {}

  // === Login ===

  async login(
    email: string,
    password: string,
    authMethod: "cookie" | "bearer",
    metadata?: { userAgent?: string; ipAddress?: string },
  ): Promise<LoginResult> {
    const user = await this.store.getUserByEmail(email);
    if (!user) {
      return { success: false, error: "Invalid credentials" };
    }

    if (!user.isActive()) {
      return { success: false, error: "Account is not active" };
    }

    const passwordHash = await this.store.getUserPasswordHash(user.id);
    if (!passwordHash) {
      return { success: false, error: "Invalid credentials" };
    }

    const valid = await verifyPassword(password, passwordHash);
    if (!valid) {
      return { success: false, error: "Invalid credentials" };
    }

    const { token } = await this.store.createSession({
      userId: user.id,
      tokenType: "session",
      authMethod,
      expiresAt: new Date(Date.now() + tokenLifetimes.session),
      userAgent: metadata?.userAgent,
      ipAddress: metadata?.ipAddress,
    });

    return { success: true, token, user };
  }

  // === Session Validation ===

  async validateRequest(request: ExpressRequest): Promise<AuthResult> {
    // Try cookie first (for web requests)
    let token = this.extractCookieToken(request);

    // Fall back to Authorization header (for API requests)
    if (!token) {
      token = parseAuthorizationHeader(request.get("Authorization"));
    }

    if (!token) {
      return { authenticated: false, user: anonymousUser, session: null };
    }

    return this.validateToken(token);
  }

  async validateToken(token: string): Promise<AuthResult> {
    const tokenId = hashToken(token) as TokenId;
    const session = await this.store.getSession(tokenId);

    if (!session) {
      return { authenticated: false, user: anonymousUser, session: null };
    }

    if (session.tokenType !== "session") {
      return { authenticated: false, user: anonymousUser, session: null };
    }

    const user = await this.store.getUserById(session.userId as UserId);
    if (!user || !user.isActive()) {
      return { authenticated: false, user: anonymousUser, session: null };
    }

    // Update last used (fire and forget)
    this.store.updateLastUsed(tokenId).catch(() => {});

    return { authenticated: true, user, session };
  }

  private extractCookieToken(request: ExpressRequest): string | null {
    const cookies = request.get("Cookie");
    if (!cookies) {
      return null;
    }

    for (const cookie of cookies.split(";")) {
      const [name, ...valueParts] = cookie.trim().split("=");
      if (name === SESSION_COOKIE_NAME) {
        return valueParts.join("="); // Handle = in token value
      }
    }
    return null;
  }

  // === Logout ===

  async logout(token: string): Promise<void> {
    const tokenId = hashToken(token) as TokenId;
    await this.store.deleteSession(tokenId);
  }

  async logoutAllSessions(userId: UserId): Promise<number> {
    return this.store.deleteUserSessions(userId);
  }

  // === Registration ===

  async register(
    email: string,
    password: string,
    displayName?: string,
  ): Promise<RegisterResult> {
    const existing = await this.store.getUserByEmail(email);
    if (existing) {
      return { success: false, error: "Email already registered" };
    }

    const passwordHash = await hashPassword(password);
    const user = await this.store.createUser({
      email,
      passwordHash,
      displayName,
    });

    // Create email verification token
    const { token: verificationToken } = await this.store.createSession({
      userId: user.id,
      tokenType: "email_verify",
      authMethod: "bearer",
      expiresAt: new Date(Date.now() + tokenLifetimes.email_verify),
    });

    return { success: true, user, verificationToken };
  }

  // === Email Verification ===

  async verifyEmail(token: string): Promise<SimpleResult> {
    const tokenId = hashToken(token) as TokenId;
    const session = await this.store.getSession(tokenId);

    if (!session || session.tokenType !== "email_verify") {
      return {
        success: false,
        error: "Invalid or expired verification token",
      };
    }

    if (session.isUsed) {
      return { success: false, error: "Token already used" };
    }

    await this.store.updateUserEmailVerified(session.userId as UserId);
    await this.store.deleteSession(tokenId);

    return { success: true };
  }

  // === Password Reset ===

  async createPasswordResetToken(
    email: string,
  ): Promise<{ token: string } | null> {
    const user = await this.store.getUserByEmail(email);
    if (!user) {
      // Don't reveal whether email exists
      return null;
    }

    const { token } = await this.store.createSession({
      userId: user.id,
      tokenType: "password_reset",
      authMethod: "bearer",
      expiresAt: new Date(Date.now() + tokenLifetimes.password_reset),
    });

    return { token };
  }

  async resetPassword(
    token: string,
    newPassword: string,
  ): Promise<SimpleResult> {
    const tokenId = hashToken(token) as TokenId;
    const session = await this.store.getSession(tokenId);

    if (!session || session.tokenType !== "password_reset") {
      return { success: false, error: "Invalid or expired reset token" };
    }

    if (session.isUsed) {
      return { success: false, error: "Token already used" };
    }

    const passwordHash = await hashPassword(newPassword);
    await this.store.setUserPassword(session.userId as UserId, passwordHash);

    // Invalidate all existing sessions (security: password changed)
    await this.store.deleteUserSessions(session.userId as UserId);

    // Delete the reset token
    await this.store.deleteSession(tokenId);

    return { success: true };
  }

  // === Token Extraction Helper (for routes) ===

  extractToken(request: ExpressRequest): string | null {
    // Try Authorization header first
    const token = parseAuthorizationHeader(request.get("Authorization"));
    if (token) {
      return token;
    }

    // Try cookie
    return this.extractCookieToken(request);
  }
}
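The split-then-rejoin in `extractCookieToken` matters because a cookie value may itself contain `=`; naively taking the second split element would truncate it. A standalone sketch of the same logic (the header string here is invented):

```typescript
const header = "theme=dark; diachron_session=abc=def==; lang=en";

let token: string | null = null;
for (const cookie of header.split(";")) {
  // Split on every "=", then rejoin everything after the first one,
  // so "=" characters inside the value survive intact.
  const [name, ...valueParts] = cookie.trim().split("=");
  if (name === "diachron_session") {
    token = valueParts.join("=");
  }
}

console.log(token); // "abc=def=="
```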
164 express/auth/store.ts Normal file
@@ -0,0 +1,164 @@
// store.ts
//
// Authentication storage interface and in-memory implementation.
// The interface allows easy migration to PostgreSQL later.

import { AuthenticatedUser, type User, type UserId } from "../user";
import { generateToken, hashToken } from "./token";
import type { AuthMethod, SessionData, TokenId, TokenType } from "./types";

// Data for creating a new session (tokenId generated internally)
export type CreateSessionData = {
  userId: string;
  tokenType: TokenType;
  authMethod: AuthMethod;
  expiresAt: Date;
  userAgent?: string;
  ipAddress?: string;
};

// Data for creating a new user
export type CreateUserData = {
  email: string;
  passwordHash: string;
  displayName?: string;
};

// Abstract interface for auth storage - implement for PostgreSQL later
export interface AuthStore {
  // Session operations
  createSession(
    data: CreateSessionData,
  ): Promise<{ token: string; session: SessionData }>;
  getSession(tokenId: TokenId): Promise<SessionData | null>;
  updateLastUsed(tokenId: TokenId): Promise<void>;
  deleteSession(tokenId: TokenId): Promise<void>;
  deleteUserSessions(userId: UserId): Promise<number>;

  // User operations
  getUserByEmail(email: string): Promise<User | null>;
  getUserById(userId: UserId): Promise<User | null>;
  createUser(data: CreateUserData): Promise<User>;
  getUserPasswordHash(userId: UserId): Promise<string | null>;
  setUserPassword(userId: UserId, passwordHash: string): Promise<void>;
  updateUserEmailVerified(userId: UserId): Promise<void>;
}

// In-memory implementation for development
export class InMemoryAuthStore implements AuthStore {
  private sessions: Map<string, SessionData> = new Map();
  private users: Map<string, User> = new Map();
  private usersByEmail: Map<string, string> = new Map();
  private passwordHashes: Map<string, string> = new Map();
  private emailVerified: Map<string, boolean> = new Map();

  async createSession(
    data: CreateSessionData,
  ): Promise<{ token: string; session: SessionData }> {
    const token = generateToken();
    const tokenId = hashToken(token);

    const session: SessionData = {
      tokenId,
      userId: data.userId,
      tokenType: data.tokenType,
      authMethod: data.authMethod,
      createdAt: new Date(),
      expiresAt: data.expiresAt,
      userAgent: data.userAgent,
      ipAddress: data.ipAddress,
    };

    this.sessions.set(tokenId, session);
    return { token, session };
  }

  async getSession(tokenId: TokenId): Promise<SessionData | null> {
    const session = this.sessions.get(tokenId);
    if (!session) {
      return null;
    }

    // Check expiration
    if (new Date() > session.expiresAt) {
      this.sessions.delete(tokenId);
      return null;
    }

    return session;
  }

  async updateLastUsed(tokenId: TokenId): Promise<void> {
    const session = this.sessions.get(tokenId);
    if (session) {
      session.lastUsedAt = new Date();
    }
  }

  async deleteSession(tokenId: TokenId): Promise<void> {
    this.sessions.delete(tokenId);
  }

  async deleteUserSessions(userId: UserId): Promise<number> {
    let count = 0;
    for (const [tokenId, session] of this.sessions) {
      if (session.userId === userId) {
        this.sessions.delete(tokenId);
        count++;
      }
    }
    return count;
  }

  async getUserByEmail(email: string): Promise<User | null> {
    const userId = this.usersByEmail.get(email.toLowerCase());
    if (!userId) {
      return null;
    }
    return this.users.get(userId) ?? null;
  }

  async getUserById(userId: UserId): Promise<User | null> {
    return this.users.get(userId) ?? null;
  }

  async createUser(data: CreateUserData): Promise<User> {
    const user = AuthenticatedUser.create(data.email, {
      displayName: data.displayName,
      status: "pending", // Pending until email verified
    });

    this.users.set(user.id, user);
    this.usersByEmail.set(data.email.toLowerCase(), user.id);
    this.passwordHashes.set(user.id, data.passwordHash);
    this.emailVerified.set(user.id, false);

    return user;
  }

  async getUserPasswordHash(userId: UserId): Promise<string | null> {
    return this.passwordHashes.get(userId) ?? null;
  }

  async setUserPassword(userId: UserId, passwordHash: string): Promise<void> {
    this.passwordHashes.set(userId, passwordHash);
  }

  async updateUserEmailVerified(userId: UserId): Promise<void> {
    this.emailVerified.set(userId, true);

    // Update user status to active
    const user = this.users.get(userId);
    if (user) {
      // Create new user with active status
      const updatedUser = AuthenticatedUser.create(user.email, {
        id: user.id,
        displayName: user.displayName,
        status: "active",
        roles: [...user.roles],
        permissions: [...user.permissions],
      });
      this.users.set(userId, updatedUser);
    }
  }
}
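`getSession` treats an expired entry as missing and evicts it lazily on read, so no background sweeper is needed for the in-memory store. The behavior can be sketched in isolation (the map contents here are invented):

```typescript
const sessions = new Map<string, { expiresAt: Date }>();
sessions.set("expired", { expiresAt: new Date(Date.now() - 1_000) });
sessions.set("live", { expiresAt: new Date(Date.now() + 60_000) });

function getSession(id: string): { expiresAt: Date } | null {
  const session = sessions.get(id);
  if (!session) return null;
  if (new Date() > session.expiresAt) {
    sessions.delete(id); // lazy eviction on read
    return null;
  }
  return session;
}

console.log(getSession("expired")); // null
console.log(sessions.has("expired")); // false - evicted by the failed read
console.log(getSession("live") !== null); // true
```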
42 express/auth/token.ts Normal file
@@ -0,0 +1,42 @@
// token.ts
//
// Token generation and hashing utilities for authentication.
// Raw tokens are never stored - only their SHA-256 hashes.

import { createHash, randomBytes } from "node:crypto";

const TOKEN_BYTES = 32; // 256 bits of entropy

// Generate a cryptographically secure random token
function generateToken(): string {
  return randomBytes(TOKEN_BYTES).toString("base64url");
}

// Hash token for storage (never store raw tokens)
function hashToken(token: string): string {
  return createHash("sha256").update(token).digest("hex");
}

// Parse token from Authorization header
function parseAuthorizationHeader(header: string | undefined): string | null {
  if (!header) {
    return null;
  }

  const parts = header.split(" ");
  if (parts.length !== 2 || parts[0].toLowerCase() !== "bearer") {
    return null;
  }

  return parts[1];
}

// Cookie name for web sessions
const SESSION_COOKIE_NAME = "diachron_session";

export {
  generateToken,
  hashToken,
  parseAuthorizationHeader,
  SESSION_COOKIE_NAME,
};
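A quick check of the shapes these helpers produce: 32 random bytes encode to a 43-character base64url string (no padding), and the SHA-256 hex digest used as the stored token ID is always 64 characters, regardless of input:

```typescript
import { createHash, randomBytes } from "node:crypto";

const token = randomBytes(32).toString("base64url"); // raw token, sent to the client
const tokenId = createHash("sha256").update(token).digest("hex"); // stored server-side

console.log(token.length); // 43
console.log(tokenId.length); // 64
```

Because only the digest is stored, a leaked sessions table does not yield usable tokens; the lookup key is recomputed from the presented token on every request.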
96 express/auth/types.ts Normal file
@@ -0,0 +1,96 @@
// types.ts
//
// Authentication types and Zod schemas.

import { z } from "zod";

// Branded type for token IDs (the hash, not the raw token)
export type TokenId = string & { readonly __brand: "TokenId" };

// Token types for different purposes
export const tokenTypeParser = z.enum([
  "session",
  "password_reset",
  "email_verify",
]);
export type TokenType = z.infer<typeof tokenTypeParser>;

// Authentication method - how the token was delivered
export const authMethodParser = z.enum(["cookie", "bearer"]);
export type AuthMethod = z.infer<typeof authMethodParser>;

// Session data schema - what gets stored
export const sessionDataParser = z.object({
  tokenId: z.string().min(1),
  userId: z.string().min(1),
  tokenType: tokenTypeParser,
  authMethod: authMethodParser,
  createdAt: z.coerce.date(),
  expiresAt: z.coerce.date(),
  lastUsedAt: z.coerce.date().optional(),
  userAgent: z.string().optional(),
  ipAddress: z.string().optional(),
  isUsed: z.boolean().optional(), // For one-time tokens
});

export type SessionData = z.infer<typeof sessionDataParser>;

// Input validation schemas for auth endpoints
export const loginInputParser = z.object({
  email: z.string().email(),
  password: z.string().min(1),
});

export const registerInputParser = z.object({
  email: z.string().email(),
  password: z.string().min(8),
  displayName: z.string().optional(),
});

export const forgotPasswordInputParser = z.object({
  email: z.string().email(),
});

export const resetPasswordInputParser = z.object({
  token: z.string().min(1),
  password: z.string().min(8),
});

// Token lifetimes in milliseconds
export const tokenLifetimes: Record<TokenType, number> = {
  session: 30 * 24 * 60 * 60 * 1000, // 30 days
  password_reset: 1 * 60 * 60 * 1000, // 1 hour
  email_verify: 24 * 60 * 60 * 1000, // 24 hours
};

// Import here to avoid circular dependency at module load time
import type { User } from "../user";

// Session wrapper class providing a consistent interface for handlers.
// Always present on Call (never null), but may represent an anonymous session.
export class Session {
  constructor(
    private readonly data: SessionData | null,
    private readonly user: User,
  ) {}

  getUser(): User {
    return this.user;
  }

  getData(): SessionData | null {
    return this.data;
  }

  isAuthenticated(): boolean {
    return !this.user.isAnonymous();
  }

  get tokenId(): string | undefined {
    return this.data?.tokenId;
  }

  get userId(): string | undefined {
    return this.data?.userId;
  }
}
62 express/basic/login.ts Normal file
@@ -0,0 +1,62 @@
import { SESSION_COOKIE_NAME } from "../auth/token";
import { tokenLifetimes } from "../auth/types";
import { request } from "../request";
import { html, redirect, render } from "../request/util";
import type { Call, Result, Route } from "../types";

const loginHandler = async (call: Call): Promise<Result> => {
  if (call.method === "GET") {
    const c = await render("basic/login", {});
    return html(c);
  }

  // POST - handle login
  const { email, password } = call.request.body;

  if (!email || !password) {
    const c = await render("basic/login", {
      error: "Email and password are required",
      email,
    });
    return html(c);
  }

  const result = await request.auth.login(email, password, "cookie", {
    userAgent: call.request.get("User-Agent"),
    ipAddress: call.request.ip,
  });

  if (!result.success) {
    const c = await render("basic/login", {
      error: result.error,
      email,
    });
    return html(c);
  }

  // Success - set cookie and redirect to home
  const redirectResult = redirect("/");
  redirectResult.cookies = [
    {
      name: SESSION_COOKIE_NAME,
      value: result.token,
      options: {
        httpOnly: true,
        secure: false, // Set to true in production with HTTPS
        sameSite: "lax",
        maxAge: tokenLifetimes.session,
        path: "/",
      },
    },
  ];

  return redirectResult;
};

const loginRoute: Route = {
  path: "/login",
  methods: ["GET", "POST"],
  handler: loginHandler,
};

export { loginRoute };
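For reference, cookie options like the ones set on login serialize into a Set-Cookie header roughly as below. Note that the `Max-Age` attribute on the wire is in seconds, so a millisecond lifetime must be divided by 1000; this sketch assumes the framework's `maxAge` option takes milliseconds, as Express's `res.cookie` does:

```typescript
const sessionLifetimeMs = 30 * 24 * 60 * 60 * 1000; // mirrors tokenLifetimes.session

const setCookie = [
  "diachron_session=<token>", // "<token>" stands in for the real session token
  `Max-Age=${Math.floor(sessionLifetimeMs / 1000)}`,
  "Path=/",
  "HttpOnly",
  "SameSite=Lax",
].join("; ");

console.log(setCookie);
// diachron_session=<token>; Max-Age=2592000; Path=/; HttpOnly; SameSite=Lax
```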
38 express/basic/logout.ts Normal file
@@ -0,0 +1,38 @@
import { SESSION_COOKIE_NAME } from "../auth/token";
import { request } from "../request";
import { redirect } from "../request/util";
import type { Call, Result, Route } from "../types";

const logoutHandler = async (call: Call): Promise<Result> => {
  // Extract token from cookie and invalidate the session
  const token = request.auth.extractToken(call.request);
  if (token) {
    await request.auth.logout(token);
  }

  // Clear the cookie and redirect to login
  const redirectResult = redirect("/login");
  redirectResult.cookies = [
    {
      name: SESSION_COOKIE_NAME,
      value: "",
      options: {
        httpOnly: true,
        secure: false,
        sameSite: "lax",
        maxAge: 0,
        path: "/",
      },
    },
  ];

  return redirectResult;
};

const logoutRoute: Route = {
  path: "/logout",
  methods: ["GET", "POST"],
  handler: logoutHandler,
};

export { logoutRoute };
express/basic/routes.ts (Normal file, 43 lines)
@@ -0,0 +1,43 @@
import { DateTime } from "ts-luxon";
import { request } from "../request";
import { html, render } from "../request/util";
import type { Call, Result, Route } from "../types";
import { loginRoute } from "./login";
import { logoutRoute } from "./logout";

const routes: Record<string, Route> = {
    hello: {
        path: "/hello",
        methods: ["GET"],
        handler: async (_call: Call): Promise<Result> => {
            const now = DateTime.now();
            const c = await render("basic/hello", { now });

            return html(c);
        },
    },
    home: {
        path: "/",
        methods: ["GET"],
        handler: async (_call: Call): Promise<Result> => {
            const _auth = request.auth;
            const me = request.session.getUser();

            const email = me.toString();
            const showLogin = me.isAnonymous();
            const showLogout = !me.isAnonymous();

            const c = await render("basic/home", {
                email,
                showLogin,
                showLogout,
            });

            return html(c);
        },
    },
    login: loginRoute,
    logout: logoutRoute,
};

export { routes };
express/biome.jsonc (Normal file, 39 lines)
@@ -0,0 +1,39 @@
{
    "$schema": "https://biomejs.dev/schemas/2.3.10/schema.json",
    "vcs": {
        "enabled": true,
        "clientKind": "git",
        "useIgnoreFile": true
    },
    "files": {
        "includes": ["**", "!!**/dist"]
    },
    "formatter": {
        "enabled": true,
        "indentStyle": "space",
        "indentWidth": 4
    },

    "linter": {
        "enabled": true,
        "rules": {
            "recommended": true,
            "style": {
                "useBlockStatements": "error"
            }
        }
    },
    "javascript": {
        "formatter": {
            "quoteStyle": "double"
        }
    },
    "assist": {
        "enabled": true,
        "actions": {
            "source": {
                "organizeImports": "on"
            }
        }
    }
}
express/build.sh (Executable file, 9 lines)
@@ -0,0 +1,9 @@
#!/bin/bash

set -eu

DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

cd "$DIR"

../cmd pnpm ncc build ./app.ts -o dist
express/check.sh (Executable file, 14 lines)
@@ -0,0 +1,14 @@
#!/bin/bash

set -eu

DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

check_dir="$DIR"

out_dir="$check_dir/out"

source "$check_dir"/../framework/shims/common
source "$check_dir"/../framework/shims/node.common

$ROOT/cmd pnpm tsc --outDir "$out_dir"
express/cli.ts (Normal file, 55 lines)
@@ -0,0 +1,55 @@
import { parseArgs } from "node:util";

const { values } = parseArgs({
    options: {
        listen: {
            type: "string",
            short: "l",
        },
        "log-address": {
            type: "string",
            default: "8085",
        },
    },
    strict: true,
    allowPositionals: false,
});

function parseListenAddress(listen: string | undefined): {
    host: string;
    port: number;
} {
    const defaultHost = "127.0.0.1";
    const defaultPort = 3500;

    if (!listen) {
        return { host: defaultHost, port: defaultPort };
    }

    const lastColon = listen.lastIndexOf(":");
    if (lastColon === -1) {
        // Just a port number
        const port = parseInt(listen, 10);
        if (Number.isNaN(port)) {
            throw new Error(`Invalid listen address: ${listen}`);
        }
        return { host: defaultHost, port };
    }

    const host = listen.slice(0, lastColon);
    const port = parseInt(listen.slice(lastColon + 1), 10);

    if (Number.isNaN(port)) {
        throw new Error(`Invalid port in listen address: ${listen}`);
    }

    return { host, port };
}

const listenAddress = parseListenAddress(values.listen);
const logAddress = parseListenAddress(values["log-address"]);

export const cli = {
    listen: listenAddress,
    logAddress,
};
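The address grammar in cli.ts accepts either a bare port or host:port, splitting on the last colon so a host that itself contains colons keeps them. A minimal standalone sketch of that rule (the function name and defaults here simply mirror the file above, not an exported API):

```typescript
// Sketch of the cli.ts listen-address rule: bare port gets the default
// host; otherwise split on the LAST colon so the host keeps any earlier
// colons intact.
function parseListen(
    listen: string,
    defaultHost = "127.0.0.1",
): { host: string; port: number } {
    const lastColon = listen.lastIndexOf(":");
    if (lastColon === -1) {
        const port = parseInt(listen, 10);
        if (Number.isNaN(port)) {
            throw new Error(`Invalid listen address: ${listen}`);
        }
        return { host: defaultHost, port };
    }
    const host = listen.slice(0, lastColon);
    const port = parseInt(listen.slice(lastColon + 1), 10);
    if (Number.isNaN(port)) {
        throw new Error(`Invalid port in listen address: ${listen}`);
    }
    return { host, port };
}

console.log(parseListen("8080"));         // { host: "127.0.0.1", port: 8080 }
console.log(parseListen("0.0.0.0:3500")); // { host: "0.0.0.0", port: 3500 }
```

Note that `parseInt` is lenient ("3500abc" parses as 3500); the file relies on that looseness rather than validating the whole string.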
express/config.ts (Normal file, 11 lines)
@@ -0,0 +1,11 @@
const config = {
    database: {
        user: "abc123",
        password: "abc123",
        host: "localhost",
        port: "5432",
        database: "abc123",
    },
};

export { config };
express/content-types.ts (Normal file, 122 lines)
@@ -0,0 +1,122 @@
// This file belongs to the framework. You are not expected to modify it.

export type ContentType = string;

// tx claude https://claude.ai/share/344fc7bd-5321-4763-af2f-b82275e9f865
const contentTypes = {
    text: {
        plain: "text/plain",
        html: "text/html",
        css: "text/css",
        javascript: "text/javascript",
        xml: "text/xml",
        csv: "text/csv",
        markdown: "text/markdown",
        calendar: "text/calendar",
    },
    image: {
        jpeg: "image/jpeg",
        png: "image/png",
        gif: "image/gif",
        svgPlusXml: "image/svg+xml",
        webp: "image/webp",
        bmp: "image/bmp",
        ico: "image/x-icon",
        tiff: "image/tiff",
        avif: "image/avif",
    },
    audio: {
        mpeg: "audio/mpeg",
        wav: "audio/wav",
        ogg: "audio/ogg",
        webm: "audio/webm",
        aac: "audio/aac",
        midi: "audio/midi",
        opus: "audio/opus",
        flac: "audio/flac",
    },
    video: {
        mp4: "video/mp4",
        webm: "video/webm",
        xMsvideo: "video/x-msvideo",
        mpeg: "video/mpeg",
        ogg: "video/ogg",
        quicktime: "video/quicktime",
        xMatroska: "video/x-matroska",
    },
    application: {
        json: "application/json",
        pdf: "application/pdf",
        zip: "application/zip",
        xWwwFormUrlencoded: "application/x-www-form-urlencoded",
        octetStream: "application/octet-stream",
        xml: "application/xml",
        gzip: "application/gzip",
        javascript: "application/javascript",
        ld_json: "application/ld+json",
        msword: "application/msword",
        vndOpenxmlformatsOfficedocumentWordprocessingmlDocument:
            "application/vnd.openxmlformats-officedocument.wordprocessingml.document",
        vndMsExcel: "application/vnd.ms-excel",
        vndOpenxmlformatsOfficedocumentSpreadsheetmlSheet:
            "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
        vndMsPowerpoint: "application/vnd.ms-powerpoint",
        vndOpenxmlformatsOfficedocumentPresentationmlPresentation:
            "application/vnd.openxmlformats-officedocument.presentationml.presentation",
        sql: "application/sql",
        graphql: "application/graphql",
        wasm: "application/wasm",
        xTar: "application/x-tar",
        x7zCompressed: "application/x-7z-compressed",
        xRarCompressed: "application/x-rar-compressed",
    },
    multipart: {
        formData: "multipart/form-data",
        byteranges: "multipart/byteranges",
    },
    font: {
        woff: "font/woff",
        woff2: "font/woff2",
        ttf: "font/ttf",
        otf: "font/otf",
    },
};

export { contentTypes };

/*

possible additions for later

Looking at what's there, here are a few gaps that might be worth filling:

Streaming/Modern Web:
application/x-ndjson or application/jsonlines - newline-delimited JSON (popular for streaming APIs)
text/event-stream - Server-Sent Events

API/Data Exchange:
application/yaml or text/yaml - YAML files
application/protobuf - Protocol Buffers
application/msgpack - MessagePack

Archives you're missing:
application/x-bzip2 - bzip2 compression

Images:
image/heic - HEIC/HEIF (common on iOS)

Fonts:
application/vnd.ms-fontobject - EOT fonts (legacy but still seen)

Text:
text/rtf - Rich Text Format

The most impactful would probably be text/event-stream (if you do any SSE), application/x-ndjson (common in modern APIs), and maybe text/yaml. The rest are more situational.

But honestly, what you have covers 95% of common web development scenarios. You can definitely add as you go when you encounter specific needs!

*/
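A table like contentTypes is typically consumed by mapping a file extension to a MIME type, falling back to application/octet-stream. The helper below is hypothetical (no such function exists in the file above); it just illustrates one way to consume such a table:

```typescript
// Hypothetical helper, not part of content-types.ts: resolve a filename's
// extension against a flat extension-to-MIME lookup.
const byExtension: Record<string, string> = {
    html: "text/html",
    css: "text/css",
    js: "text/javascript",
    json: "application/json",
    png: "image/png",
};

function contentTypeFor(filename: string): string {
    const ext = filename.split(".").pop()?.toLowerCase() ?? "";
    // Unknown extensions fall back to the generic binary type.
    return byExtension[ext] ?? "application/octet-stream";
}

console.log(contentTypeFor("index.html")); // "text/html"
console.log(contentTypeFor("data.bin"));   // "application/octet-stream"
```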
express/context.ts (Normal file, 27 lines)
@@ -0,0 +1,27 @@
// context.ts
//
// Request-scoped context using AsyncLocalStorage.
// Allows services to access request data (like the current user) without
// needing to pass Call through every function.

import { AsyncLocalStorage } from "node:async_hooks";
import { anonymousUser, type User } from "./user";

type RequestContext = {
    user: User;
};

const asyncLocalStorage = new AsyncLocalStorage<RequestContext>();

// Run a function within a request context
function runWithContext<T>(context: RequestContext, fn: () => T): T {
    return asyncLocalStorage.run(context, fn);
}

// Get the current user from context, or AnonymousUser if not in a request
function getCurrentUser(): User {
    const context = asyncLocalStorage.getStore();
    return context?.user ?? anonymousUser;
}

export { getCurrentUser, runWithContext, type RequestContext };
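The mechanism context.ts relies on is that AsyncLocalStorage scopes a store to everything (including async continuations) executed inside `run()`, and `getStore()` returns undefined outside it. A self-contained sketch of that behavior with a simplified context (string user instead of the User type from ./user):

```typescript
import { AsyncLocalStorage } from "node:async_hooks";

// Simplified stand-in for RequestContext: just a user string.
type Ctx = { user: string };
const als = new AsyncLocalStorage<Ctx>();

// Mirrors getCurrentUser(): fall back to an anonymous identity when no
// request context is active.
function currentUser(): string {
    return als.getStore()?.user ?? "anonymous";
}

als.run({ user: "alice@example.com" }, () => {
    // Inside run(), every call sees this request's context.
    console.log(currentUser()); // "alice@example.com"
});

// Outside any run(), getStore() is undefined and the fallback applies.
console.log(currentUser()); // "anonymous"
```

This is why services can call getCurrentUser() without the Call object being threaded through: each request handler is wrapped in one `run()`, so concurrent requests never see each other's user.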
express/core/index.ts (Normal file, 48 lines)
@@ -0,0 +1,48 @@
import nunjucks from "nunjucks";
import { db, migrate, migrationStatus } from "../database";
import { getLogs, log } from "../logging";

// FIXME: This doesn't belong here; move it somewhere else.
const conf = {
    templateEngine: () => {
        return {
            renderTemplate: (template: string, context: object) => {
                return nunjucks.renderString(template, context);
            },
        };
    },
};

const database = {
    db,
    migrate,
    migrationStatus,
};

const logging = {
    log,
    getLogs,
};

const random = {
    randomNumber: () => {
        return Math.random();
    },
};

const misc = {
    sleep: (ms: number) => {
        return new Promise((resolve) => setTimeout(resolve, ms));
    },
};

// Keep this asciibetically sorted
const core = {
    conf,
    database,
    logging,
    misc,
    random,
};

export { core };
express/database.ts (Normal file, 548 lines)
@@ -0,0 +1,548 @@
// database.ts
// PostgreSQL database access with Kysely query builder and simple migrations

import * as fs from "node:fs";
import * as path from "node:path";
import {
    type Generated,
    Kysely,
    PostgresDialect,
    type Selectable,
    sql,
} from "kysely";
import { Pool } from "pg";
import type {
    AuthStore,
    CreateSessionData,
    CreateUserData,
} from "./auth/store";
import { generateToken, hashToken } from "./auth/token";
import type { SessionData, TokenId } from "./auth/types";
import type { Domain } from "./types";
import { AuthenticatedUser, type User, type UserId } from "./user";

// Connection configuration
const connectionConfig = {
    host: "localhost",
    port: 5432,
    user: "diachron",
    password: "diachron",
    database: "diachron",
};

// Database schema types for Kysely
// Generated<T> marks columns with database defaults (optional on insert)
interface UsersTable {
    id: string;
    status: Generated<string>;
    display_name: string | null;
    created_at: Generated<Date>;
    updated_at: Generated<Date>;
}

interface UserEmailsTable {
    id: string;
    user_id: string;
    email: string;
    normalized_email: string;
    is_primary: Generated<boolean>;
    is_verified: Generated<boolean>;
    created_at: Generated<Date>;
    verified_at: Date | null;
    revoked_at: Date | null;
}

interface UserCredentialsTable {
    id: string;
    user_id: string;
    credential_type: Generated<string>;
    password_hash: string | null;
    created_at: Generated<Date>;
    updated_at: Generated<Date>;
}

interface SessionsTable {
    id: Generated<string>;
    token_hash: string;
    user_id: string;
    user_email_id: string | null;
    token_type: string;
    auth_method: string;
    created_at: Generated<Date>;
    expires_at: Date;
    revoked_at: Date | null;
    ip_address: string | null;
    user_agent: string | null;
    is_used: Generated<boolean | null>;
}

interface Database {
    users: UsersTable;
    user_emails: UserEmailsTable;
    user_credentials: UserCredentialsTable;
    sessions: SessionsTable;
}

// Create the connection pool
const pool = new Pool(connectionConfig);

// Create the Kysely instance
const db = new Kysely<Database>({
    dialect: new PostgresDialect({ pool }),
});

// Raw pool access for when you need it
const rawPool = pool;

// Execute raw SQL (for when Kysely doesn't fit)
async function raw<T = unknown>(
    query: string,
    params: unknown[] = [],
): Promise<T[]> {
    const result = await pool.query(query, params);
    return result.rows as T[];
}

// ============================================================================
// Migrations
// ============================================================================

// Migration file naming convention:
// yyyy-mm-dd_ss_description.sql
// e.g., 2025-01-15_01_initial.sql, 2025-01-15_02_add_users.sql
//
// Migrations directory: express/migrations/

const FRAMEWORK_MIGRATIONS_DIR = path.join(__dirname, "framework/migrations");
const APP_MIGRATIONS_DIR = path.join(__dirname, "migrations");
const MIGRATIONS_TABLE = "_migrations";

interface MigrationRecord {
    id: number;
    name: string;
    applied_at: Date;
}

// Ensure migrations table exists
async function ensureMigrationsTable(): Promise<void> {
    await pool.query(`
        CREATE TABLE IF NOT EXISTS ${MIGRATIONS_TABLE} (
            id SERIAL PRIMARY KEY,
            name TEXT NOT NULL UNIQUE,
            applied_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
        )
    `);
}

// Get list of applied migrations
async function getAppliedMigrations(): Promise<string[]> {
    const result = await pool.query<MigrationRecord>(
        `SELECT name FROM ${MIGRATIONS_TABLE} ORDER BY name`,
    );
    return result.rows.map((r) => r.name);
}

// Get pending migration files
function getMigrationFiles(kind: Domain): string[] {
    const dir = kind === "fw" ? FRAMEWORK_MIGRATIONS_DIR : APP_MIGRATIONS_DIR;

    if (!fs.existsSync(dir)) {
        return [];
    }

    const root = __dirname;

    // Match the documented yyyy-mm-dd_ss_description.sql convention
    const mm = fs
        .readdirSync(dir)
        .filter((f) => f.endsWith(".sql"))
        .filter((f) => /^\d{4}-\d{2}-\d{2}_\d{2}_/.test(f))
        .map((f) => `${dir}/${f}`)
        .map((f) => f.replace(`${root}/`, ""))
        .sort();

    return mm;
}

// Run a single migration
async function runMigration(filename: string): Promise<void> {
    // const filepath = path.join(MIGRATIONS_DIR, filename);
    const filepath = filename;
    const content = fs.readFileSync(filepath, "utf-8");

    process.stdout.write(`  Migration: ${filename}...`);

    // Run migration in a transaction
    const client = await pool.connect();
    try {
        await client.query("BEGIN");
        await client.query(content);
        await client.query(
            `INSERT INTO ${MIGRATIONS_TABLE} (name) VALUES ($1)`,
            [filename],
        );
        await client.query("COMMIT");
        console.log(" ✓");
    } catch (err) {
        console.log(" ✗");
        const message = err instanceof Error ? err.message : String(err);
        console.error(`    Error: ${message}`);
        await client.query("ROLLBACK");
        throw err;
    } finally {
        client.release();
    }
}

function getAllMigrationFiles() {
    const fw_files = getMigrationFiles("fw");
    const app_files = getMigrationFiles("app");
    const all = [...fw_files, ...app_files];

    return all;
}

// Run all pending migrations
async function migrate(): Promise<void> {
    await ensureMigrationsTable();

    const applied = new Set(await getAppliedMigrations());
    const all = getAllMigrationFiles();
    const pending = all.filter((f) => !applied.has(f));

    if (pending.length === 0) {
        console.log("No pending migrations");
        return;
    }

    console.log(`Applying ${pending.length} migration(s):`);
    for (const file of pending) {
        await runMigration(file);
    }
}

// List migration status
async function migrationStatus(): Promise<{
    applied: string[];
    pending: string[];
}> {
    await ensureMigrationsTable();
    const applied = new Set(await getAppliedMigrations());
    const files = getAllMigrationFiles();
    return {
        applied: files.filter((f) => applied.has(f)),
        pending: files.filter((f) => !applied.has(f)),
    };
}
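The migration runner never parses dates; it relies on the naming convention alone. Because both the date and the two-digit sequence number are zero-padded, a plain lexical `.sort()` yields chronological order, which is why getMigrationFiles can just sort strings. A quick illustration (filenames invented to match the documented convention):

```typescript
// Why .sort() suffices for migration ordering: zero-padded
// yyyy-mm-dd_ss_ prefixes make lexical order equal chronological order.
const files = [
    "2025-01-15_02_add_users.sql",
    "2025-02-01_01_add_sessions.sql",
    "2025-01-15_01_initial.sql",
];

const ordered = [...files].sort();
console.log(ordered[0]); // "2025-01-15_01_initial.sql"
console.log(ordered[2]); // "2025-02-01_01_add_sessions.sql"
```

Note the corollary: an unpadded sequence number ("_10_" after "_9_") or a non-ISO date would break this ordering silently, which is what the filename-pattern filter guards against.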
// ============================================================================
// PostgresAuthStore - Database-backed authentication storage
// ============================================================================

class PostgresAuthStore implements AuthStore {
    // Session operations

    async createSession(
        data: CreateSessionData,
    ): Promise<{ token: string; session: SessionData }> {
        const token = generateToken();
        const tokenHash = hashToken(token);

        const row = await db
            .insertInto("sessions")
            .values({
                token_hash: tokenHash,
                user_id: data.userId,
                token_type: data.tokenType,
                auth_method: data.authMethod,
                expires_at: data.expiresAt,
                user_agent: data.userAgent ?? null,
                ip_address: data.ipAddress ?? null,
            })
            .returningAll()
            .executeTakeFirstOrThrow();

        const session: SessionData = {
            tokenId: row.token_hash,
            userId: row.user_id,
            tokenType: row.token_type as SessionData["tokenType"],
            authMethod: row.auth_method as SessionData["authMethod"],
            createdAt: row.created_at,
            expiresAt: row.expires_at,
            userAgent: row.user_agent ?? undefined,
            ipAddress: row.ip_address ?? undefined,
            isUsed: row.is_used ?? undefined,
        };

        return { token, session };
    }

    async getSession(tokenId: TokenId): Promise<SessionData | null> {
        const row = await db
            .selectFrom("sessions")
            .selectAll()
            .where("token_hash", "=", tokenId)
            .where("expires_at", ">", new Date())
            .where("revoked_at", "is", null)
            .executeTakeFirst();

        if (!row) {
            return null;
        }

        return {
            tokenId: row.token_hash,
            userId: row.user_id,
            tokenType: row.token_type as SessionData["tokenType"],
            authMethod: row.auth_method as SessionData["authMethod"],
            createdAt: row.created_at,
            expiresAt: row.expires_at,
            userAgent: row.user_agent ?? undefined,
            ipAddress: row.ip_address ?? undefined,
            isUsed: row.is_used ?? undefined,
        };
    }

    async updateLastUsed(_tokenId: TokenId): Promise<void> {
        // The new schema doesn't have a last_used_at column.
        // This is now a no-op; session activity tracking could be added later.
    }

    async deleteSession(tokenId: TokenId): Promise<void> {
        // Soft delete by setting revoked_at
        await db
            .updateTable("sessions")
            .set({ revoked_at: new Date() })
            .where("token_hash", "=", tokenId)
            .execute();
    }

    async deleteUserSessions(userId: UserId): Promise<number> {
        const result = await db
            .updateTable("sessions")
            .set({ revoked_at: new Date() })
            .where("user_id", "=", userId)
            .where("revoked_at", "is", null)
            .executeTakeFirst();

        return Number(result.numUpdatedRows);
    }

    // User operations

    async getUserByEmail(email: string): Promise<User | null> {
        // Find user through user_emails table
        const normalizedEmail = email.toLowerCase().trim();

        const row = await db
            .selectFrom("user_emails")
            .innerJoin("users", "users.id", "user_emails.user_id")
            .select([
                "users.id",
                "users.status",
                "users.display_name",
                "users.created_at",
                "users.updated_at",
                "user_emails.email",
            ])
            .where("user_emails.normalized_email", "=", normalizedEmail)
            .where("user_emails.revoked_at", "is", null)
            .executeTakeFirst();

        if (!row) {
            return null;
        }
        return this.rowToUser(row);
    }

    async getUserById(userId: UserId): Promise<User | null> {
        // Get user with their primary email
        const row = await db
            .selectFrom("users")
            .leftJoin("user_emails", (join) =>
                join
                    .onRef("user_emails.user_id", "=", "users.id")
                    .on("user_emails.is_primary", "=", true)
                    .on("user_emails.revoked_at", "is", null),
            )
            .select([
                "users.id",
                "users.status",
                "users.display_name",
                "users.created_at",
                "users.updated_at",
                "user_emails.email",
            ])
            .where("users.id", "=", userId)
            .executeTakeFirst();

        if (!row) {
            return null;
        }
        return this.rowToUser(row);
    }

    async createUser(data: CreateUserData): Promise<User> {
        const userId = crypto.randomUUID();
        const emailId = crypto.randomUUID();
        const credentialId = crypto.randomUUID();
        const now = new Date();
        const normalizedEmail = data.email.toLowerCase().trim();

        // Create user record
        await db
            .insertInto("users")
            .values({
                id: userId,
                display_name: data.displayName ?? null,
                status: "pending",
                created_at: now,
                updated_at: now,
            })
            .execute();

        // Create user_email record
        await db
            .insertInto("user_emails")
            .values({
                id: emailId,
                user_id: userId,
                email: data.email,
                normalized_email: normalizedEmail,
                is_primary: true,
                is_verified: false,
                created_at: now,
            })
            .execute();

        // Create user_credential record
        await db
            .insertInto("user_credentials")
            .values({
                id: credentialId,
                user_id: userId,
                credential_type: "password",
                password_hash: data.passwordHash,
                created_at: now,
                updated_at: now,
            })
            .execute();

        return new AuthenticatedUser({
            id: userId,
            email: data.email,
            displayName: data.displayName,
            status: "pending",
            roles: [],
            permissions: [],
            createdAt: now,
            updatedAt: now,
        });
    }

    async getUserPasswordHash(userId: UserId): Promise<string | null> {
        const row = await db
            .selectFrom("user_credentials")
            .select("password_hash")
            .where("user_id", "=", userId)
            .where("credential_type", "=", "password")
            .executeTakeFirst();

        return row?.password_hash ?? null;
    }

    async setUserPassword(userId: UserId, passwordHash: string): Promise<void> {
        const now = new Date();

        // Try to update existing credential
        const result = await db
            .updateTable("user_credentials")
            .set({ password_hash: passwordHash, updated_at: now })
            .where("user_id", "=", userId)
            .where("credential_type", "=", "password")
            .executeTakeFirst();

        // If no existing credential, create one
        if (Number(result.numUpdatedRows) === 0) {
            await db
                .insertInto("user_credentials")
                .values({
                    id: crypto.randomUUID(),
                    user_id: userId,
                    credential_type: "password",
                    password_hash: passwordHash,
                    created_at: now,
                    updated_at: now,
                })
                .execute();
        }

        // Update user's updated_at
        await db
            .updateTable("users")
            .set({ updated_at: now })
            .where("id", "=", userId)
            .execute();
    }

    async updateUserEmailVerified(userId: UserId): Promise<void> {
        const now = new Date();

        // Update user_emails to mark as verified
        await db
            .updateTable("user_emails")
            .set({
                is_verified: true,
                verified_at: now,
            })
            .where("user_id", "=", userId)
            .where("is_primary", "=", true)
            .execute();

        // Update user status to active
        await db
            .updateTable("users")
            .set({
                status: "active",
                updated_at: now,
            })
            .where("id", "=", userId)
            .execute();
    }

    // Helper to convert database row to User object
    private rowToUser(row: {
        id: string;
        status: string;
        display_name: string | null;
        created_at: Date;
        updated_at: Date;
        email: string | null;
    }): User {
        return new AuthenticatedUser({
            id: row.id,
            email: row.email ?? "unknown@example.com",
            displayName: row.display_name ?? undefined,
            status: row.status as "active" | "suspended" | "pending",
            roles: [], // TODO: query from RBAC tables
            permissions: [], // TODO: query from RBAC tables
            createdAt: row.created_at,
            updatedAt: row.updated_at,
        });
    }
}

// ============================================================================
// Exports
// ============================================================================

export {
    db,
    raw,
    rawPool,
    pool,
    migrate,
    migrationStatus,
    connectionConfig,
    PostgresAuthStore,
    type Database,
};
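Note that createSession returns the raw token to the caller but stores only hashToken(token) in the token_hash column, and getSession looks sessions up by that same hash. The real generateToken/hashToken live in auth/token and are not shown here, so the implementations below are assumptions; a common shape for this pattern is a random token plus a SHA-256 digest:

```typescript
import { createHash, randomBytes } from "node:crypto";

// ASSUMED sketch of the auth/token pair used by PostgresAuthStore.
// The raw token goes to the client (in the session cookie); only its
// digest is stored, so a leaked sessions table cannot be replayed.
function generateToken(): string {
    return randomBytes(32).toString("base64url");
}

function hashToken(token: string): string {
    return createHash("sha256").update(token).digest("hex");
}

const token = generateToken();
const stored = hashToken(token);

// Lookup works by re-hashing the presented token: deterministic, no salt,
// because the input is already high-entropy random data (unlike a password).
console.log(stored.length);                // 64 (hex-encoded SHA-256)
console.log(hashToken(token) === stored);  // true
```

This is also why SessionData's tokenId is the hash, not the token: nothing server-side ever needs the raw value after the cookie is set.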
express/deps.ts (Normal file, 7 lines)
@@ -0,0 +1,7 @@
// Database
//export { Client as PostgresClient } from "https://deno.land/x/postgres@v0.19.3/mod.ts";
//export type { ClientOptions as PostgresOptions } from "https://deno.land/x/postgres@v0.19.3/mod.ts";

// Redis
//export { connect as redisConnect } from "https://deno.land/x/redis@v0.37.1/mod.ts";
//export type { Redis } from "https://deno.land/x/redis@v0.37.1/mod.ts";
express/develop/clear-db.ts (Normal file, 17 lines)
@@ -0,0 +1,17 @@
import { connectionConfig, migrate, pool } from "../database";
import { dropTables, exitIfUnforced } from "./util";

async function main(): Promise<void> {
  exitIfUnforced();

  try {
    await dropTables();
  } finally {
    await pool.end();
  }
}

main().catch((err) => {
  console.error("Failed to clear database:", err.message);
  process.exit(1);
});
express/develop/reset-db.ts (Normal file, 26 lines)
@@ -0,0 +1,26 @@
// reset-db.ts
// Development command to wipe the database and apply all migrations from scratch

import { connectionConfig, migrate, pool } from "../database";
import { dropTables, exitIfUnforced } from "./util";

async function main(): Promise<void> {
  exitIfUnforced();

  try {
    await dropTables();

    console.log("");
    await migrate();

    console.log("");
    console.log("Database reset complete.");
  } finally {
    await pool.end();
  }
}

main().catch((err) => {
  console.error("Failed to reset database:", err.message);
  process.exit(1);
});
express/develop/util.ts (Normal file, 42 lines)
@@ -0,0 +1,42 @@
// FIXME: this is at the wrong level of specificity

import { connectionConfig, migrate, pool } from "../database";

const exitIfUnforced = () => {
  const args = process.argv.slice(2);

  // Require explicit confirmation unless --force is passed
  if (!args.includes("--force")) {
    console.error("This will DROP ALL TABLES in the database!");
    console.error(`  Database: ${connectionConfig.database}`);
    console.error(
      `  Host: ${connectionConfig.host}:${connectionConfig.port}`,
    );
    console.error("");
    console.error("Run with --force to proceed.");
    process.exit(1);
  }
};

const dropTables = async () => {
  console.log("Dropping all tables...");

  // Get all table names in the public schema
  const result = await pool.query<{ tablename: string }>(`
    SELECT tablename FROM pg_tables
    WHERE schemaname = 'public'
  `);

  if (result.rows.length > 0) {
    // Drop all tables with CASCADE to handle foreign key constraints
    const tableNames = result.rows
      .map((r) => `"${r.tablename}"`)
      .join(", ");
    await pool.query(`DROP TABLE IF EXISTS ${tableNames} CASCADE`);
    console.log(`Dropped ${result.rows.length} table(s)`);
  } else {
    console.log("No tables to drop");
  }
};

export { dropTables, exitIfUnforced };
express/execution-context-schema.ts (Normal file, 13 lines)
@@ -0,0 +1,13 @@
import { z } from "zod";

export const executionContextSchema = z.object({
  diachron_root: z.string(),
});

export type ExecutionContext = z.infer<typeof executionContextSchema>;

export function parseExecutionContext(
  env: Record<string, string | undefined>,
): ExecutionContext {
  return executionContextSchema.parse(env);
}
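The schema above leans on zod's default `z.object` behavior: required keys are validated and unknown keys are stripped. A minimal zod-free sketch of that same contract (the function name `parseExecutionContextSketch` is hypothetical, for illustration only):

```typescript
// Hypothetical, dependency-free sketch of the contract that
// parseExecutionContext gets from zod: the required key must be a
// string, and keys outside the schema do not survive parsing.
type ExecutionContextSketch = { diachron_root: string };

function parseExecutionContextSketch(
  env: Record<string, string | undefined>,
): ExecutionContextSketch {
  const value = env.diachron_root;
  if (typeof value !== "string") {
    throw new Error("diachron_root: expected string");
  }
  // Only schema keys are copied, mirroring z.object's stripping.
  return { diachron_root: value };
}
```

The real schema additionally gives you `ZodError` details and `z.infer` for free, which is why the spec file below asserts on `ZodError` rather than a plain `Error`.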
express/execution-context.spec.ts (Normal file, 38 lines)
@@ -0,0 +1,38 @@
import assert from "node:assert/strict";
import { describe, it } from "node:test";
import { ZodError } from "zod";

import {
  executionContextSchema,
  parseExecutionContext,
} from "./execution-context-schema";

describe("parseExecutionContext", () => {
  it("parses valid executionContext with diachron_root", () => {
    const env = { diachron_root: "/some/path" };
    const result = parseExecutionContext(env);
    assert.deepEqual(result, { diachron_root: "/some/path" });
  });

  it("throws ZodError when diachron_root is missing", () => {
    const env = {};
    assert.throws(() => parseExecutionContext(env), ZodError);
  });

  it("strips extra fields not in schema", () => {
    const env = {
      diachron_root: "/some/path",
      EXTRA_VAR: "should be stripped",
    };
    const result = parseExecutionContext(env);
    assert.deepEqual(result, { diachron_root: "/some/path" });
    assert.equal("EXTRA_VAR" in result, false);
  });
});

describe("executionContextSchema", () => {
  it("requires diachron_root to be a string", () => {
    const result = executionContextSchema.safeParse({ diachron_root: 123 });
    assert.equal(result.success, false);
  });
});
express/execution-context.ts (Normal file, 5 lines)
@@ -0,0 +1,5 @@
import { parseExecutionContext } from "./execution-context-schema";

const executionContext = parseExecutionContext(process.env);

export { executionContext };
express/extensible.ts (Normal file, 0 lines)
express/framework/migrations/2026-01-01_01-users.sql (Normal file, 29 lines)
@@ -0,0 +1,29 @@
-- 0001_users.sql
-- Create users table for authentication

CREATE TABLE users (
  id UUID PRIMARY KEY,
  status TEXT NOT NULL DEFAULT 'active',
  display_name TEXT,
  created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
  updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

CREATE TABLE user_emails (
  id UUID PRIMARY KEY,
  user_id UUID NOT NULL REFERENCES users(id),
  email TEXT NOT NULL,
  normalized_email TEXT NOT NULL,
  is_primary BOOLEAN NOT NULL DEFAULT FALSE,
  is_verified BOOLEAN NOT NULL DEFAULT FALSE,
  created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
  verified_at TIMESTAMPTZ,
  revoked_at TIMESTAMPTZ
);

-- Enforce uniqueness only among *active* emails
CREATE UNIQUE INDEX user_emails_unique_active
  ON user_emails (normalized_email)
  WHERE revoked_at IS NULL;
express/framework/migrations/2026-01-01_02-sessions.sql (Normal file, 26 lines)
@@ -0,0 +1,26 @@
-- 0002_sessions.sql
-- Create sessions table for auth tokens

CREATE TABLE sessions (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  token_hash TEXT UNIQUE NOT NULL,
  user_id UUID NOT NULL REFERENCES users(id),
  user_email_id UUID REFERENCES user_emails(id),
  token_type TEXT NOT NULL,
  auth_method TEXT NOT NULL,
  created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
  expires_at TIMESTAMPTZ NOT NULL,
  revoked_at TIMESTAMPTZ,
  ip_address INET,
  user_agent TEXT,
  is_used BOOLEAN DEFAULT FALSE
);

-- Index for user session lookups (logout all, etc.)
CREATE INDEX sessions_user_id_idx ON sessions (user_id);

-- Index for expiration cleanup
CREATE INDEX sessions_expires_at_idx ON sessions (expires_at);

-- Index for token type filtering
CREATE INDEX sessions_token_type_idx ON sessions (token_type);
@@ -0,0 +1,17 @@
-- 0003_user_credentials.sql
-- Create user_credentials table for password storage (extensible for other auth methods)

CREATE TABLE user_credentials (
  id UUID PRIMARY KEY,
  user_id UUID NOT NULL REFERENCES users(id),
  credential_type TEXT NOT NULL DEFAULT 'password',
  password_hash TEXT,
  created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
  updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

-- Each user can have at most one credential per type
CREATE UNIQUE INDEX user_credentials_user_type_idx ON user_credentials (user_id, credential_type);

-- Index for user lookups
CREATE INDEX user_credentials_user_id_idx ON user_credentials (user_id);
@@ -0,0 +1,20 @@
CREATE TABLE roles (
  id UUID PRIMARY KEY,
  name TEXT UNIQUE NOT NULL,
  description TEXT
);

CREATE TABLE groups (
  id UUID PRIMARY KEY,
  name TEXT NOT NULL,
  created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

CREATE TABLE user_group_roles (
  user_id UUID NOT NULL REFERENCES users(id),
  group_id UUID NOT NULL REFERENCES groups(id),
  role_id UUID NOT NULL REFERENCES roles(id),
  granted_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
  revoked_at TIMESTAMPTZ,
  PRIMARY KEY (user_id, group_id, role_id)
);
express/framework/migrations/2026-01-24_02-capabilities.sql (Normal file, 14 lines)
@@ -0,0 +1,14 @@
CREATE TABLE capabilities (
  id UUID PRIMARY KEY,
  name TEXT UNIQUE NOT NULL,
  description TEXT
);

CREATE TABLE role_capabilities (
  role_id UUID NOT NULL REFERENCES roles(id),
  capability_id UUID NOT NULL REFERENCES capabilities(id),
  granted_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
  revoked_at TIMESTAMPTZ,
  PRIMARY KEY (role_id, capability_id)
);
express/handlers.ts (Normal file, 19 lines)
@@ -0,0 +1,19 @@
import { contentTypes } from "./content-types";
import { core } from "./core";
import { httpCodes } from "./http-codes";
import type { Call, Handler, Result } from "./types";

const multiHandler: Handler = async (call: Call): Promise<Result> => {
  const code = httpCodes.success.OK;
  const rn = core.random.randomNumber();

  const retval: Result = {
    code,
    result: `that was ${call.method} (${rn})`,
    contentType: contentTypes.text.plain,
  };

  return retval;
};

export { multiHandler };
express/http-codes.ts (Normal file, 76 lines)
@@ -0,0 +1,76 @@
// This file belongs to the framework. You are not expected to modify it.

export type HttpCode = {
  code: number;
  name: string;
  description?: string;
};
type Group = "success" | "redirection" | "clientErrors" | "serverErrors";
type CodeDefinitions = {
  [K in Group]: {
    [K: string]: HttpCode;
  };
};

// tx claude https://claude.ai/share/344fc7bd-5321-4763-af2f-b82275e9f865
const httpCodes: CodeDefinitions = {
  success: {
    OK: { code: 200, name: "OK", description: "" },
    Created: { code: 201, name: "Created" },
    Accepted: { code: 202, name: "Accepted" },
    NonAuthoritativeInformation: {
      code: 203,
      name: "Non-Authoritative Information",
    },
    NoContent: { code: 204, name: "No Content" },
    ResetContent: { code: 205, name: "Reset Content" },
    PartialContent: { code: 206, name: "Partial Content" },
  },
  redirection: {
    MultipleChoices: { code: 300, name: "Multiple Choices" },
    MovedPermanently: { code: 301, name: "Moved Permanently" },
    Found: { code: 302, name: "Found" },
    SeeOther: { code: 303, name: "See Other" },
    NotModified: { code: 304, name: "Not Modified" },
    TemporaryRedirect: { code: 307, name: "Temporary Redirect" },
    PermanentRedirect: { code: 308, name: "Permanent Redirect" },
  },
  clientErrors: {
    BadRequest: { code: 400, name: "Bad Request" },
    Unauthorized: { code: 401, name: "Unauthorized" },
    PaymentRequired: { code: 402, name: "Payment Required" },
    Forbidden: { code: 403, name: "Forbidden" },
    NotFound: { code: 404, name: "Not Found" },
    MethodNotAllowed: { code: 405, name: "Method Not Allowed" },
    NotAcceptable: { code: 406, name: "Not Acceptable" },
    ProxyAuthenticationRequired: {
      code: 407,
      name: "Proxy Authentication Required",
    },
    RequestTimeout: { code: 408, name: "Request Timeout" },
    Conflict: { code: 409, name: "Conflict" },
    Gone: { code: 410, name: "Gone" },
    LengthRequired: { code: 411, name: "Length Required" },
    PreconditionFailed: { code: 412, name: "Precondition Failed" },
    PayloadTooLarge: { code: 413, name: "Payload Too Large" },
    URITooLong: { code: 414, name: "URI Too Long" },
    UnsupportedMediaType: { code: 415, name: "Unsupported Media Type" },
    RangeNotSatisfiable: { code: 416, name: "Range Not Satisfiable" },
    ExpectationFailed: { code: 417, name: "Expectation Failed" },
    ImATeapot: { code: 418, name: "I'm a teapot" },
    UnprocessableEntity: { code: 422, name: "Unprocessable Entity" },
    TooManyRequests: { code: 429, name: "Too Many Requests" },
  },
  serverErrors: {
    InternalServerError: { code: 500, name: "Internal Server Error" },
    NotImplemented: { code: 501, name: "Not Implemented" },
    BadGateway: { code: 502, name: "Bad Gateway" },
    ServiceUnavailable: { code: 503, name: "Service Unavailable" },
    GatewayTimeout: { code: 504, name: "Gateway Timeout" },
    HTTPVersionNotSupported: {
      code: 505,
      name: "HTTP Version Not Supported",
    },
  },
};
export { httpCodes };
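A handler picks a status by navigating group, then name, and gets a structured `HttpCode` back rather than a bare number. A self-contained sketch using a trimmed copy of the table (the entries mirror the file above; only a subset is reproduced here):

```typescript
// Trimmed, self-contained subset of the httpCodes table, just to show
// the lookup shape a handler uses.
type HttpCode = { code: number; name: string; description?: string };

const httpCodes: Record<string, Record<string, HttpCode>> = {
  success: { OK: { code: 200, name: "OK" } },
  clientErrors: { NotFound: { code: 404, name: "Not Found" } },
};

// A Result carries the whole HttpCode object, so the dispatcher can
// format both the numeric status and the reason phrase.
const status = httpCodes.clientErrors.NotFound;
const statusLine = `${status.code} ${status.name}`;
console.log(statusLine); // "404 Not Found"
```

Keeping the name and code together is what lets `Result.code` be an `HttpCode` object (see `types.ts` below) instead of a raw `number`.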
express/interfaces.ts (Normal file, 3 lines)
@@ -0,0 +1,3 @@
type Brand<K, T> = K & { readonly __brand: T };

export type Extensible = Brand<"Extensible", {}>;
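`Brand` is the standard nominal-typing trick: intersecting a base type with a phantom `__brand` makes two structurally identical types incompatible. A minimal sketch of one common way to use it (the `UserId`/`SessionId` names are illustrative, not from this repo):

```typescript
// Same helper as in interfaces.ts.
type Brand<K, T> = K & { readonly __brand: T };

// Two string-based IDs that the compiler now keeps apart.
type UserId = Brand<string, "UserId">;
type SessionId = Brand<string, "SessionId">;

// The only way in is an explicit assertion, typically at a validation
// boundary.
const asUserId = (raw: string): UserId => raw as UserId;

function lookupUser(id: UserId): string {
  return `user:${id}`;
}

const id = asUserId("42");
const label = lookupUser(id);
console.log(label); // "user:42"
// Passing a SessionId to lookupUser would be a compile-time error,
// even though both are plain strings at runtime.
```

The `__brand` property never exists at runtime; it costs nothing and only constrains the type checker.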
express/logging.ts (Normal file, 73 lines)
@@ -0,0 +1,73 @@
// internal-logging.ts

import { cli } from "./cli";

// FIXME: Move this to somewhere more appropriate
type AtLeastOne<T> = [T, ...T[]];

type MessageSource = "logging" | "diagnostic" | "user";

type Message = {
  // FIXME: number probably isn't what we want here
  timestamp?: number;
  source: MessageSource;

  text: AtLeastOne<string>;
};

const _m1: Message = { timestamp: 123, source: "logging", text: ["foo"] };
const _m2: Message = {
  timestamp: 321,
  source: "diagnostic",
  text: ["ok", "whatever"],
};

type FilterArgument = {
  limit?: number;
  before?: number;
  after?: number;

  // FIXME: add offsets to use instead of or in addition to before/after

  match?: (string | RegExp)[];
};

const loggerUrl = `http://${cli.logAddress.host}:${cli.logAddress.port}`;

const log = (message: Message) => {
  const payload = {
    timestamp: message.timestamp ?? Date.now(),
    source: message.source,
    text: message.text,
  };

  fetch(`${loggerUrl}/log`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  }).catch((err) => {
    console.error("[logging] Failed to send log:", err.message);
  });
};

const getLogs = async (filter: FilterArgument): Promise<Message[]> => {
  const params = new URLSearchParams();
  if (filter.limit) {
    params.set("limit", String(filter.limit));
  }
  if (filter.before) {
    params.set("before", String(filter.before));
  }
  if (filter.after) {
    params.set("after", String(filter.after));
  }

  const url = `${loggerUrl}/logs?${params.toString()}`;
  const response = await fetch(url);
  return response.json();
};

// FIXME: there's scope for more specialized functions although they
// probably should be defined in terms of the basic ones here.

export { getLogs, log };
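The `AtLeastOne<T>` alias used for `Message.text` is a tuple with a rest element, which rules out empty arrays at the type level. A short self-contained sketch of what that buys:

```typescript
// Same alias as in logging.ts: a non-empty array, enforced by the
// type checker rather than a runtime length check.
type AtLeastOne<T> = [T, ...T[]];

const lines: AtLeastOne<string> = ["first", "second"];
// const empty: AtLeastOne<string> = []; // compile-time error

// The first element is typed as string, not string | undefined, so a
// consumer can take the head without a guard.
const head: string = lines[0];
console.log(head); // "first"
```

That is why `Message.text` can be joined or printed without defending against a zero-line message.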
express/mgmt/add-user.ts (Normal file, 69 lines)
@@ -0,0 +1,69 @@
// add-user.ts
// Management command to create users from the command line

import { hashPassword } from "../auth/password";
import { PostgresAuthStore, pool } from "../database";

async function main(): Promise<void> {
  const args = process.argv.slice(2);

  if (args.length < 2) {
    console.error(
      "Usage: ./mgmt add-user <email> <password> [--display-name <name>] [--active]",
    );
    process.exit(1);
  }

  const email = args[0];
  const password = args[1];

  // Parse optional flags
  let displayName: string | undefined;
  let makeActive = false;

  for (let i = 2; i < args.length; i++) {
    if (args[i] === "--display-name" && args[i + 1]) {
      displayName = args[i + 1];
      i++;
    } else if (args[i] === "--active") {
      makeActive = true;
    }
  }

  try {
    const store = new PostgresAuthStore();

    // Check if user already exists
    const existing = await store.getUserByEmail(email);
    if (existing) {
      console.error(`Error: User with email '${email}' already exists`);
      process.exit(1);
    }

    // Hash password and create user
    const passwordHash = await hashPassword(password);
    const user = await store.createUser({
      email,
      passwordHash,
      displayName,
    });

    // Optionally activate user immediately
    if (makeActive) {
      await store.updateUserEmailVerified(user.id);
      console.log(
        `Created and activated user: ${user.email} (${user.id})`,
      );
    } else {
      console.log(`Created user: ${user.email} (${user.id})`);
      console.log("  Status: pending (use --active to create as active)");
    }
  } finally {
    await pool.end();
  }
}

main().catch((err) => {
  console.error("Failed to create user:", err.message);
  process.exit(1);
});
express/migrate.ts (Normal file, 45 lines)
@@ -0,0 +1,45 @@
// migrate.ts
// CLI script for running database migrations

import { migrate, migrationStatus, pool } from "./database";

async function main(): Promise<void> {
  const command = process.argv[2] || "run";

  try {
    switch (command) {
      case "run":
        await migrate();
        break;

      case "status": {
        const status = await migrationStatus();
        console.log("Applied migrations:");
        for (const name of status.applied) {
          console.log(`  ✓ ${name}`);
        }
        if (status.pending.length > 0) {
          console.log("\nPending migrations:");
          for (const name of status.pending) {
            console.log(`  • ${name}`);
          }
        } else {
          console.log("\nNo pending migrations");
        }
        break;
      }

      default:
        console.error(`Unknown command: ${command}`);
        console.error("Usage: migrate [run|status]");
        process.exit(1);
    }
  } finally {
    await pool.end();
  }
}

main().catch((err) => {
  console.error("Migration failed:", err);
  process.exit(1);
});
express/package.json (Normal file, 35 lines)
@@ -0,0 +1,35 @@
{
  "name": "express",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1",
    "nodemon": "nodemon dist/index.js"
  },
  "keywords": [],
  "author": "",
  "license": "ISC",
  "packageManager": "pnpm@10.12.4",
  "dependencies": {
    "@types/node": "^24.10.1",
    "@types/nunjucks": "^3.2.6",
    "@vercel/ncc": "^0.38.4",
    "express": "^5.1.0",
    "kysely": "^0.28.9",
    "nodemon": "^3.1.11",
    "nunjucks": "^3.2.4",
    "path-to-regexp": "^8.3.0",
    "pg": "^8.16.3",
    "ts-luxon": "^6.2.0",
    "ts-node": "^10.9.2",
    "tsx": "^4.20.6",
    "typescript": "^5.9.3",
    "zod": "^4.1.12"
  },
  "devDependencies": {
    "@biomejs/biome": "2.3.10",
    "@types/express": "^5.0.5",
    "@types/pg": "^8.16.0"
  }
}
express/pnpm-lock.yaml (generated, Normal file, 1627 lines)
File diff suppressed because it is too large
express/request/index.ts (Normal file, 25 lines)
@@ -0,0 +1,25 @@
import { AuthService } from "../auth";
import { getCurrentUser } from "../context";
import { PostgresAuthStore } from "../database";
import type { User } from "../user";
import { html, redirect, render } from "./util";

const util = { html, redirect, render };

const session = {
  getUser: (): User => {
    return getCurrentUser();
  },
};

// Initialize auth with PostgreSQL store
const authStore = new PostgresAuthStore();
const auth = new AuthService(authStore);

const request = {
  auth,
  session,
  util,
};

export { request };
express/request/util.ts (Normal file, 45 lines)
@@ -0,0 +1,45 @@
import { contentTypes } from "../content-types";
import { core } from "../core";
import { executionContext } from "../execution-context";
import { httpCodes } from "../http-codes";
import type { RedirectResult, Result } from "../types";
import { loadFile } from "../util";
import { request } from "./index";

type NoUser = {
  [key: string]: unknown;
} & {
  user?: never;
};

const render = async (path: string, ctx?: NoUser): Promise<string> => {
  const fullPath = `${executionContext.diachron_root}/templates/${path}.html.njk`;
  const template = await loadFile(fullPath);
  const user = request.session.getUser();
  const context = { user, ...ctx };
  const engine = core.conf.templateEngine();
  const retval = engine.renderTemplate(template, context);

  return retval;
};

const html = (payload: string): Result => {
  const retval: Result = {
    code: httpCodes.success.OK,
    result: payload,
    contentType: contentTypes.text.html,
  };

  return retval;
};

const redirect = (location: string): RedirectResult => {
  return {
    code: httpCodes.redirection.SeeOther,
    contentType: contentTypes.text.plain,
    result: "",
    redirect: location,
  };
};

export { html, redirect, render };
express/routes.ts (Normal file, 148 lines)
@@ -0,0 +1,148 @@
/// <reference lib="dom" />

import nunjucks from "nunjucks";
import { DateTime } from "ts-luxon";
import { authRoutes } from "./auth/routes";
import { routes as basicRoutes } from "./basic/routes";
import { contentTypes } from "./content-types";
import { core } from "./core";
import { multiHandler } from "./handlers";
import { httpCodes } from "./http-codes";
import type { Call, Result, Route } from "./types";

// FIXME: Obviously put this somewhere else
const okText = (result: string): Result => {
  const code = httpCodes.success.OK;

  const retval: Result = {
    code,
    result,
    contentType: contentTypes.text.plain,
  };

  return retval;
};

const routes: Route[] = [
  ...authRoutes,
  basicRoutes.home,
  basicRoutes.hello,
  basicRoutes.login,
  basicRoutes.logout,
  {
    path: "/slow",
    methods: ["GET"],
    handler: async (_call: Call): Promise<Result> => {
      console.log("starting slow request");

      await core.misc.sleep(2);

      console.log("finishing slow request");
      const retval = okText("that was slow");

      return retval;
    },
  },
  {
    path: "/list",
    methods: ["GET"],
    handler: async (_call: Call): Promise<Result> => {
      const code = httpCodes.success.OK;
      const lr = (rr: Route[]) => {
        const ret = rr.map((r: Route) => {
          return r.path;
        });

        return ret;
      };

      const rrr = lr(routes);

      const template = `
        <html>
          <head></head>
          <body>
            <ul>
              {% for route in rrr %}
              <li><a href="{{ route }}">{{ route }}</a></li>
              {% endfor %}
            </ul>
          </body>
        </html>
      `;
      const result = nunjucks.renderString(template, { rrr });

      const _listing = lr(routes).join(", ");
      return {
        code,
        result,
        contentType: contentTypes.text.html,
      };
    },
  },
  {
    path: "/whoami",
    methods: ["GET"],
    handler: async (call: Call): Promise<Result> => {
      const me = call.session.getUser();
      const template = `
        <html>
          <head></head>
          <body>
            {{ me }}
          </body>
        </html>
      `;

      const result = nunjucks.renderString(template, { me });

      return {
        code: httpCodes.success.OK,
        contentType: contentTypes.text.html,
        result,
      };
    },
  },
  {
    path: "/ok",
    methods: ["GET", "POST", "PUT"],
    handler: multiHandler,
  },
  {
    path: "/alsook",
    methods: ["GET"],
    handler: async (_req): Promise<Result> => {
      const code = httpCodes.success.OK;
      return {
        code,
        result: "it is also ok",
        contentType: contentTypes.text.plain,
      };
    },
  },
  {
    path: "/time",
    methods: ["GET"],
    handler: async (_req): Promise<Result> => {
      const now = DateTime.now();
      const template = `
        <html>
          <head></head>
          <body>
            {{ now }}
          </body>
        </html>
      `;

      const result = nunjucks.renderString(template, { now });

      return {
        code: httpCodes.success.OK,
        contentType: contentTypes.text.html,
        result,
      };
    },
  },
];

export { routes };
express/run.sh (Executable file, 9 lines)
@@ -0,0 +1,9 @@
#!/bin/bash

set -eu

DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

cd "$DIR"

exec ../cmd node dist/index.js "$@"
express/show-config.sh (Executable file, 12 lines)
@@ -0,0 +1,12 @@
#!/bin/bash

set -e

DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

check_dir="$DIR"

source "$check_dir"/../framework/shims/common
source "$check_dir"/../framework/shims/node.common

$ROOT/cmd pnpm tsc --showConfig
express/tsconfig.json (Normal file, 13 lines)
@@ -0,0 +1,13 @@
{
  "compilerOptions": {
    "esModuleInterop": true,
    "target": "ES2022",
    "lib": ["ES2023"],
    "module": "NodeNext",
    "moduleResolution": "NodeNext",
    "noImplicitAny": true,
    "strict": true,
    "types": ["node"],
    "outDir": "out"
  }
}
117
express/types.ts
Normal file
@@ -0,0 +1,117 @@
// types.ts

// FIXME: split this up into types used by app developers and types internal
// to the framework.
import type { Request as ExpressRequest } from "express";
import type { MatchFunction } from "path-to-regexp";
import { z } from "zod";
import type { Session } from "./auth/types";
import type { ContentType } from "./content-types";
import type { HttpCode } from "./http-codes";
import type { Permission, User } from "./user";

const methodParser = z.union([
  z.literal("GET"),
  z.literal("POST"),
  z.literal("PUT"),
  z.literal("PATCH"),
  z.literal("DELETE"),
]);

export type Method = z.infer<typeof methodParser>;
const massageMethod = (input: string): Method => {
  const r = methodParser.parse(input.toUpperCase());

  return r;
};

export type Call = {
  pattern: string;
  path: string;
  method: Method;
  parameters: object;
  request: ExpressRequest;
  user: User;
  session: Session;
};

export type InternalHandler = (req: ExpressRequest) => Promise<Result>;

export type Handler = (call: Call) => Promise<Result>;
export type ProcessedRoute = {
  matcher: MatchFunction<Record<string, string>>;
  method: Method;
  handler: InternalHandler;
};

export type CookieOptions = {
  httpOnly?: boolean;
  secure?: boolean;
  sameSite?: "strict" | "lax" | "none";
  maxAge?: number;
  path?: string;
};

export type Cookie = {
  name: string;
  value: string;
  options?: CookieOptions;
};

export type Result = {
  code: HttpCode;
  contentType: ContentType;
  result: string;
  cookies?: Cookie[];
};

export type RedirectResult = Result & {
  redirect: string;
};

export function isRedirect(result: Result): result is RedirectResult {
  return "redirect" in result;
}

export type Route = {
  path: string;
  methods: Method[];
  handler: Handler;
  interruptable?: boolean;
};

// Authentication error classes
export class AuthenticationRequired extends Error {
  constructor() {
    super("Authentication required");
    this.name = "AuthenticationRequired";
  }
}

export class AuthorizationDenied extends Error {
  constructor() {
    super("Authorization denied");
    this.name = "AuthorizationDenied";
  }
}

// Helper for handlers to require authentication
export function requireAuth(call: Call): User {
  if (call.user.isAnonymous()) {
    throw new AuthenticationRequired();
  }
  return call.user;
}

// Helper for handlers to require specific permission
export function requirePermission(call: Call, permission: Permission): User {
  const user = requireAuth(call);
  if (!user.hasPermission(permission)) {
    throw new AuthorizationDenied();
  }
  return user;
}

export type Domain = "app" | "fw";

export { methodParser, massageMethod };
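The `isRedirect` guard in types.ts narrows a `Result` to `RedirectResult` with a structural `"redirect" in result` check. A self-contained sketch of how that plays out, with `HttpCode` and `ContentType` simplified to primitives (the real types come from modules elsewhere in this diff):

```typescript
// Minimal stand-ins for Result / RedirectResult from types.ts;
// HttpCode and ContentType are reduced to number and string here.
type Result = { code: number; contentType: string; result: string };
type RedirectResult = Result & { redirect: string };

function isRedirect(result: Result): result is RedirectResult {
  return "redirect" in result;
}

const plain: Result = { code: 200, contentType: "text/html", result: "ok" };
const moved: RedirectResult = {
  code: 302,
  contentType: "text/html",
  result: "",
  redirect: "/login",
};

for (const r of [plain, moved]) {
  if (isRedirect(r)) {
    // Inside this branch, TypeScript knows `r.redirect` exists.
    console.log("redirect to", r.redirect);
  }
}
```

The guard relies on the `redirect` key being present, so a plain `Result` must never carry a stray `redirect` property; the type system enforces that only where object literals are checked for excess properties.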
232
express/user.ts
Normal file
@@ -0,0 +1,232 @@
// user.ts
//
// User model for authentication and authorization.
//
// Design notes:
// - `id` is the stable internal identifier (UUID when database-backed)
// - `email` is the primary human-facing identifier
// - Roles provide coarse-grained authorization (admin, editor, etc.)
// - Permissions provide fine-grained authorization (posts:create, etc.)
// - Users can have both roles (which grant permissions) and direct permissions

import { z } from "zod";

// Branded type for user IDs to prevent accidental mixing with other strings
export type UserId = string & { readonly __brand: "UserId" };

// User account status
const userStatusParser = z.enum(["active", "suspended", "pending"]);
export type UserStatus = z.infer<typeof userStatusParser>;

// Role - simple string identifier
const roleParser = z.string().min(1);
export type Role = z.infer<typeof roleParser>;

// Permission format: "resource:action" e.g. "posts:create", "users:delete"
const permissionParser = z.string().regex(/^[a-z_]+:[a-z_]+$/, {
  message: "Permission must be in format 'resource:action'",
});
export type Permission = z.infer<typeof permissionParser>;

// Core user data schema - this is what gets stored/serialized
const userDataParser = z.object({
  id: z.string().min(1),
  email: z.email(),
  displayName: z.string().optional(),
  status: userStatusParser,
  roles: z.array(roleParser),
  permissions: z.array(permissionParser),
  createdAt: z.coerce.date(),
  updatedAt: z.coerce.date(),
});

export type UserData = z.infer<typeof userDataParser>;

// Role-to-permission mappings
// In a real system this might be database-driven or configurable
type RolePermissionMap = Map<Role, Permission[]>;

const defaultRolePermissions: RolePermissionMap = new Map([
  ["admin", ["users:read", "users:create", "users:update", "users:delete"]],
  ["user", ["users:read"]],
]);

export abstract class User {
  protected readonly data: UserData;
  protected rolePermissions: RolePermissionMap;

  constructor(data: UserData, rolePermissions?: RolePermissionMap) {
    this.data = userDataParser.parse(data);
    this.rolePermissions = rolePermissions ?? defaultRolePermissions;
  }

  // Identity
  get id(): UserId {
    return this.data.id as UserId;
  }

  get email(): string {
    return this.data.email;
  }

  get displayName(): string | undefined {
    return this.data.displayName;
  }

  // Status
  get status(): UserStatus {
    return this.data.status;
  }

  isActive(): boolean {
    return this.data.status === "active";
  }

  // Roles
  get roles(): readonly Role[] {
    return this.data.roles;
  }

  hasRole(role: Role): boolean {
    return this.data.roles.includes(role);
  }

  hasAnyRole(roles: Role[]): boolean {
    return roles.some((role) => this.hasRole(role));
  }

  hasAllRoles(roles: Role[]): boolean {
    return roles.every((role) => this.hasRole(role));
  }

  // Permissions
  get permissions(): readonly Permission[] {
    return this.data.permissions;
  }

  // Get all permissions: direct + role-derived
  effectivePermissions(): Set<Permission> {
    const perms = new Set<Permission>(this.data.permissions);

    for (const role of this.data.roles) {
      const rolePerms = this.rolePermissions.get(role);
      if (rolePerms) {
        for (const p of rolePerms) {
          perms.add(p);
        }
      }
    }

    return perms;
  }

  // Check if user has a specific permission (direct or via role)
  hasPermission(permission: Permission): boolean {
    // Check direct permissions first
    if (this.data.permissions.includes(permission)) {
      return true;
    }

    // Check role-derived permissions
    for (const role of this.data.roles) {
      const rolePerms = this.rolePermissions.get(role);
      if (rolePerms?.includes(permission)) {
        return true;
      }
    }

    return false;
  }

  // Convenience method: can user perform action on resource?
  can(action: string, resource: string): boolean {
    const permission = `${resource}:${action}` as Permission;
    return this.hasPermission(permission);
  }

  // Timestamps
  get createdAt(): Date {
    return this.data.createdAt;
  }

  get updatedAt(): Date {
    return this.data.updatedAt;
  }

  // Serialization - returns plain object for storage/transmission
  toJSON(): UserData {
    return { ...this.data };
  }

  toString(): string {
    return `User(id ${this.id})`;
  }

  abstract isAnonymous(): boolean;
}

export class AuthenticatedUser extends User {
  // Factory for creating new users with sensible defaults
  static create(
    email: string,
    options?: {
      id?: string;
      displayName?: string;
      status?: UserStatus;
      roles?: Role[];
      permissions?: Permission[];
    },
  ): User {
    const now = new Date();
    return new AuthenticatedUser({
      id: options?.id ?? crypto.randomUUID(),
      email,
      displayName: options?.displayName,
      status: options?.status ?? "active",
      roles: options?.roles ?? [],
      permissions: options?.permissions ?? [],
      createdAt: now,
      updatedAt: now,
    });
  }

  isAnonymous(): boolean {
    return false;
  }
}

// For representing "no user" in contexts where user is optional
export class AnonymousUser extends User {
  // FIXME: this is C&Ped with only minimal changes. No bueno.
  static create(
    email: string,
    options?: {
      id?: string;
      displayName?: string;
      status?: UserStatus;
      roles?: Role[];
      permissions?: Permission[];
    },
  ): AnonymousUser {
    const now = new Date(0);
    return new AnonymousUser({
      id: options?.id ?? crypto.randomUUID(),
      email,
      displayName: options?.displayName,
      status: options?.status ?? "active",
      roles: options?.roles ?? [],
      permissions: options?.permissions ?? [],
      createdAt: now,
      updatedAt: now,
    });
  }

  isAnonymous(): boolean {
    return true;
  }
}

export const anonymousUser = AnonymousUser.create("anonymous@example.com", {
  id: "-1",
  displayName: "Anonymous User",
});
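The core of user.ts is how direct permissions combine with role-derived ones. A self-contained sketch of that resolution logic, mirroring `effectivePermissions()` but with the class machinery and zod validation stripped out (plain strings stand in for the branded types):

```typescript
type Role = string;
type Permission = string;

// Same shape as defaultRolePermissions in user.ts
const rolePermissions = new Map<Role, Permission[]>([
  ["admin", ["users:read", "users:create", "users:update", "users:delete"]],
  ["user", ["users:read"]],
]);

// Mirrors User.effectivePermissions(): the union of direct permissions
// and everything granted by each of the user's roles.
function effectivePermissions(
  roles: Role[],
  direct: Permission[],
): Set<Permission> {
  const perms = new Set<Permission>(direct);
  for (const role of roles) {
    for (const p of rolePermissions.get(role) ?? []) {
      perms.add(p);
    }
  }
  return perms;
}

const perms = effectivePermissions(["user"], ["posts:create"]);
console.log([...perms].sort()); // ["posts:create", "users:read"]
```

Note that `hasPermission()` in the class takes a short-circuit path (direct first, then per-role) rather than materializing the full set, which avoids building a `Set` on every check.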
11
express/util.ts
Normal file
@@ -0,0 +1,11 @@
import { readFile } from "node:fs/promises";

// FIXME: Handle the error here
const loadFile = async (path: string): Promise<string> => {
  // Specifying 'utf8' returns a string; otherwise, it returns a Buffer
  const data = await readFile(path, "utf8");

  return data;
};

export { loadFile };
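One way the FIXME in util.ts could be resolved is to surface "file missing" as `null` instead of a rejected promise. This is only a sketch of one policy, not the framework's chosen one (it may prefer to rethrow a domain error); the name `tryLoadFile` is hypothetical:

```typescript
import { readFile } from "node:fs/promises";

// Hypothetical error-handling variant of util.ts's loadFile:
// a missing file yields null; every other failure still propagates.
const tryLoadFile = async (path: string): Promise<string | null> => {
  try {
    return await readFile(path, "utf8");
  } catch (err) {
    // ENOENT is the expected "not there" case.
    if ((err as NodeJS.ErrnoException).code === "ENOENT") {
      return null;
    }
    throw err;
  }
};

tryLoadFile("/no/such/file").then((data) => console.log(data)); // null
```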
14
express/watch.sh
Executable file
@@ -0,0 +1,14 @@
#!/bin/bash

set -e

DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

check_dir="$DIR"

source "$check_dir"/../framework/shims/common
source "$check_dir"/../framework/shims/node.common

# $ROOT/cmd pnpm tsc --lib ES2023 --esModuleInterop -w $check_dir/app.ts
# $ROOT/cmd pnpm tsc -w $check_dir/app.ts
"$ROOT"/cmd pnpm tsc --watch --project ./tsconfig.json
26
fixup.sh
Executable file
@@ -0,0 +1,26 @@
#!/bin/bash

set -eu

DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

cd "$DIR"

# uv run ruff check --select I --fix .

# uv run ruff format .

shell_scripts="$(fd '.sh$' | xargs)"
shfmt -i 4 -w "$DIR/cmd" "$DIR"/framework/cmd.d/* "$DIR"/framework/shims/* "$DIR"/master/master "$DIR"/logger/logger
# "$shell_scripts"
for ss in $shell_scripts; do
    shfmt -i 4 -w "$ss"
done

pushd "$DIR/master"
go fmt
popd

pushd "$DIR/express"
../cmd pnpm biome check --write
popd
0
framework/.nodejs-config/.gitignore
vendored
Normal file
0
framework/.nodejs/.gitignore
vendored
Normal file
0
framework/binaries/.gitignore
vendored
Normal file
9
framework/cmd.d/list
Executable file
@@ -0,0 +1,9 @@
#!/bin/bash

set -eu

DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

cd "$DIR"

ls .
7
framework/cmd.d/node
Executable file
@@ -0,0 +1,7 @@
#!/bin/bash

set -eu

DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

exec "$DIR"/../shims/node "$@"
7
framework/cmd.d/pnpm
Executable file
@@ -0,0 +1,7 @@
#!/bin/bash

set -eu

DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

"$DIR"/../shims/pnpm "$@"
18
framework/cmd.d/sync
Executable file
@@ -0,0 +1,18 @@
#!/bin/bash

set -eu

DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

# figure out the platform we're on

# source ../framework/versions
# [eventually: check for it in user's cache dir
# download $nodejs_version
# verify its checksum against $nodejs_checksum

cd "$DIR/../node"

"$DIR"/pnpm install

echo we will download other files here later
15
framework/cmd.d/test
Executable file
@@ -0,0 +1,15 @@
#!/bin/bash

set -eu

shopt -s globstar nullglob

DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

cd "$DIR/../../express"

if [ $# -eq 0 ]; then
    "$DIR"/../shims/pnpm tsx --test ./**/*.spec.ts ./**/*.test.ts
else
    "$DIR"/../shims/pnpm tsx --test "$@"
fi
5
framework/cmd.d/ts-node
Executable file
@@ -0,0 +1,5 @@
#!/bin/bash

DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

"$DIR"/../shims/pnpm ts-node "$@"
5
framework/cmd.d/tsx
Executable file
@@ -0,0 +1,5 @@
#!/bin/bash

DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

"$DIR"/../shims/pnpm tsx "$@"
9
framework/common.d/db
Executable file
@@ -0,0 +1,9 @@
#!/bin/bash

set -eu

DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
ROOT="$DIR/../.."

# FIXME: don't hard code this of course
PGPASSWORD=diachron psql -U diachron -h localhost diachron
9
framework/common.d/migrate
Executable file
@@ -0,0 +1,9 @@
#!/bin/bash

set -eu

DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
ROOT="$DIR/../.."

cd "$ROOT/express"
"$DIR"/tsx migrate.ts "$@"
11
framework/develop.d/clear-db
Executable file
@@ -0,0 +1,11 @@
#!/bin/bash

# This file belongs to the framework. You are not expected to modify it.

set -eu

DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
ROOT="$DIR/../.."

cd "$ROOT/express"
"$DIR"/../cmd.d/tsx develop/clear-db.ts "$@"
1
framework/develop.d/db
Symbolic link
@@ -0,0 +1 @@
../common.d/db
1
framework/develop.d/migrate
Symbolic link
@@ -0,0 +1 @@
../common.d/migrate
9
framework/develop.d/reset-db
Executable file
@@ -0,0 +1,9 @@
#!/bin/bash

set -eu

DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
ROOT="$DIR/../.."

cd "$ROOT/express"
"$DIR"/../cmd.d/tsx develop/reset-db.ts "$@"
0
framework/downloads/.gitignore
vendored
Normal file
9
framework/mgmt.d/add-user
Executable file
@@ -0,0 +1,9 @@
#!/bin/bash

set -eu

DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
ROOT="$DIR/../.."

cd "$ROOT/express"
"$DIR"/../cmd.d/tsx mgmt/add-user.ts "$@"
1
framework/mgmt.d/db
Symbolic link
@@ -0,0 +1 @@
../common.d/db
1
framework/mgmt.d/migrate
Symbolic link
@@ -0,0 +1 @@
../common.d/migrate
26
framework/platform
Normal file
@@ -0,0 +1,26 @@
# shellcheck shell=bash

# Detect platform (OS and architecture)

os=$(uname -s | tr '[:upper:]' '[:lower:]')
arch=$(uname -m)

case "$os" in
linux) platform_os=linux ;;
darwin) platform_os=darwin ;;
*) echo "Unsupported OS: $os" >&2; exit 1 ;;
esac

case "$arch" in
x86_64) platform_arch=x86_64 ;;
*) echo "Unsupported architecture: $arch" >&2; exit 1 ;;
esac

platform="${platform_os}_${platform_arch}"

# Platform-specific checksum command
if [ "$platform_os" = "darwin" ]; then
    sha256_check() { shasum -a 256 -c -; }
else
    sha256_check() { sha256sum -c -; }
fi
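The platform file builds a `<os>_<arch>` string from `uname` output. The same detection can be done from inside the Node side of the framework; a hedged sketch (the `archMap` name is mine, and unlike the shell script this version does not reject non-x86_64 machines):

```typescript
import * as os from "node:os";

// Node's names differ slightly from uname's: process.arch / os.arch()
// report "x64" where `uname -m` prints "x86_64". Map accordingly.
const archMap: Record<string, string> = { x64: "x86_64" };

const platformOs = os.platform(); // "linux" | "darwin" | ...
const platformArch = archMap[os.arch()] ?? os.arch();

// Same "<os>_<arch>" shape the shell fragment builds:
const platform = `${platformOs}_${platformArch}`;
console.log(platform); // e.g. "linux_x86_64"
```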
Some files were not shown because too many files have changed in this diff.