Compare commits: e2ea472a10...hydrators (74 commits)
| SHA1 | Author | Date | |
|---|---|---|---|
| 811c446895 | |||
| 5a8c0028d7 | |||
| f7e6e56aca | |||
| cd19a32be5 | |||
| 478305bc4f | |||
| 421628d49e | |||
| 4f37a72d7b | |||
| e30bf5d96d | |||
| 8704c4a8d5 | |||
| 579a19669e | |||
| 474420ac1e | |||
| 960f78a1ad | |||
| d921679058 | |||
| 350bf7c865 | |||
| 8a7682e953 | |||
| e59bb35ac9 | |||
| a345a2adfb | |||
| 00d84d6686 | |||
| 7ed05695b9 | |||
| 03cc4cf4eb | |||
| 2121a6b5de | |||
| 6ace2163ed | |||
| 93ab4b5d53 | |||
| 70ddcb2a94 | |||
| 1da81089cd | |||
| f383c6a465 | |||
| e34d47b352 | |||
| de70be996e | |||
| 096a1235b5 | |||
| 4a4dc11aa4 | |||
| 7399cbe785 | |||
| 14d20be9a2 | |||
| 55f5cc699d | |||
| afcb447b2b | |||
| 1c1eeddcbe | |||
| 7cecf5326d | |||
| 47f6bee75f | |||
| 6e96c33457 | |||
| 9e3329fa58 | |||
| 05eaf938fa | |||
| df2d4eea3f | |||
| b235a6be9a | |||
| 8cd4b42cc6 | |||
| 241d3e799e | |||
| 49dc0e3fe0 | |||
| c7b8cd33da | |||
| 6c0895de07 | |||
| 17ea6ba02d | |||
| 661def8a5c | |||
| 74d75d08dd | |||
| ad6d405206 | |||
| e9ccf6d757 | |||
| 34ec5be7ec | |||
| e136c07928 | |||
| c926f15aab | |||
| 39cd93c81e | |||
| c246e0384f | |||
| 788ea2ab19 | |||
| 6297a95d3c | |||
| 63cf0a670d | |||
| 5524eaf18f | |||
| 03980e114b | |||
| 539717efda | |||
| 8be88bb696 | |||
| ab74695f4c | |||
| dc5a70ba33 | |||
| 4adf6cf358 | |||
| bee6938a67 | |||
| b0ee53f7d5 | |||
| 5c93c9e982 | |||
| 5606a59614 | |||
| 22dde8c213 | |||
| 30463b60a5 | |||
| a0043fd475 |
4
.beads/issues.jsonl
Normal file
@@ -0,0 +1,4 @@
{"id":"diachron-2vh","title":"Add unit testing to golang programs","status":"open","priority":2,"issue_type":"task","created_at":"2026-01-03T12:31:41.281891462-06:00","created_by":"mw","updated_at":"2026-01-03T12:31:41.281891462-06:00"}
{"id":"diachron-64w","title":"Add unit testing to express backend","status":"open","priority":2,"issue_type":"task","created_at":"2026-01-03T12:31:30.439206099-06:00","created_by":"mw","updated_at":"2026-01-03T12:31:30.439206099-06:00"}
{"id":"diachron-fzd","title":"Add generic 'user' functionality","status":"open","priority":2,"issue_type":"task","created_at":"2026-01-03T12:35:53.73213604-06:00","created_by":"mw","updated_at":"2026-01-03T12:35:53.73213604-06:00"}
{"id":"diachron-ngx","title":"Teach the master and/or build process to send messages with notify-send when builds fail or succeed. Ideally this will be fairly generic.","status":"open","priority":2,"issue_type":"task","created_at":"2026-01-03T14:10:11.773218844-06:00","created_by":"mw","updated_at":"2026-01-03T14:10:11.773218844-06:00"}
1
.go-version
Normal file
@@ -0,0 +1 @@
1.23.6
@@ -31,14 +31,14 @@ master process. Key design principles:

### Development

**Check shell scripts (shellcheck + shfmt) (eventually go fmt and prettier or similar):**
**Check shell scripts (shellcheck + shfmt) (eventually go fmt and biome or similar):**

```bash
./check.sh
```

**Format TypeScript code:**

```bash
cd express && ../cmd pnpm prettier --write .
cd express && ../cmd pnpm biome check --write .
```

**Build Go master process:**
@@ -108,6 +108,10 @@ Early stage - most implementations are stubs:

# meta

## formatting and sorting

- When a typescript file exports symbols, they should be listed in order

## guidelines for this document

- Try to keep lines below 80 characters in length, especially prose. But if
@@ -54,10 +54,9 @@ To run a more complete system, you also need to have docker compose installed.

To hack on diachron itself, you need the following:

- docker compose
- bash
- docker and docker compose
- [fd](https://github.com/sharkdp/fd)
- golang, version 1.23.6 or greater
- shellcheck
- shfmt

180
TODO.md
@@ -1,21 +1,177 @@
- [ ] Update check script:
  - [ ] Run `go fmt` on all .go files
  - [ ] Run prettier on all .ts files
  - [ ] Eventually, run unit tests

- [ ] Adapt master program so that it reads configuration from command line
  args instead of from environment variables
  - Should have sane defaults
  - Adding new arguments should be easy and obvious

- [ ] Add wrapper script to run main program (so that various assumptions related
  to relative paths are safer)

## high importance

- [ ] Add unit tests all over the place.
  - ⚠️ Huge task - needs breakdown before starting

- [ ] migrations, seeding, fixtures

```sql
CREATE SCHEMA fw;
CREATE TABLE fw.users (...);
CREATE TABLE fw.groups (...);
```

```sql
CREATE TABLE app.user_profiles (...);
CREATE TABLE app.customer_metadata (...);
```

- [ ] flesh out `mgmt` and `develop` (does not exist yet)

  4.1 What belongs in develop

  - Create migrations
  - Squash migrations
  - Reset DB
  - Roll back migrations
  - Seed large test datasets
  - Run tests
  - Snapshot / restore local DB state (!!!)

  `develop` fails if APP_ENV (or whatever) is `production`. Or maybe even
  `testing`.

- [ ] Add default user table(s) to database.

- [ ] Add authentication
  - [ ] password
  - [ ] third party?

- [ ] Add middleware concept

- [ ] Add authorization
  - for specific routes / resources / etc

- [ ] Add basic text views
  Partially done; see the /time route. But we need to figure out where to
  store templates, static files, etc.

- [ ] fix process management: if you control-c `master` process sometimes it
  leaves around `master-bin`, `logger-bin`, and `diachron:nnnn` processes.
  Huge problem.

## medium importance

- [ ] Add a log viewer
  - with queries
  - convert to logfmt and is there a viewer UI we could pull in and use
    instead?

- [ ] add nested routes. Note that this might be easy to do without actually
  changing the logic in express/routes.ts. A function that takes an array
  of routes and maps over them rewriting them. Maybe.
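The nested-routes idea above (a function that takes an array of routes and maps over them, rewriting paths) might look like the following sketch. The `Route` shape and the `flattenRoutes` name are hypothetical, not the actual types in express/routes.ts:

```typescript
// Hypothetical route shapes; the real ones live in express/routes.ts
// and may differ. This only sketches the "map over routes" idea.
type Handler = () => void;

interface Route {
  path: string;
  handler: Handler;
  children?: Route[];
}

interface FlatRoute {
  path: string;
  handler: Handler;
}

// Flatten nested routes by joining each child's path onto its parent's,
// producing the flat array the existing routing logic already understands.
function flattenRoutes(routes: Route[], prefix = ""): FlatRoute[] {
  return routes.flatMap((route) => {
    const path = `${prefix}${route.path}`;
    const self: FlatRoute[] = [{ path, handler: route.handler }];
    const kids = route.children ? flattenRoutes(route.children, path) : [];
    return [...self, ...kids];
  });
}
```

If this holds up, nesting stays purely a notational convenience: the router itself never changes.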

- [ ] related: add something to do with default templates and stuff... I
  think we can make handlers a lot shorter to write, sometimes not even
  necessary at all, with some sane defaults and an easy to use override
  mechanism

- [ ] time library

- [ ] fill in the rest of express/http-codes.ts

- [ ] fill out express/content-types.ts

- [ ] identify redundant "old skool" and ajax routes, factor out their
  commonalities, etc.

- [ ] figure out and add logging to disk

- [ ] I don't really feel close to satisfied with template location /
  rendering / etc. Rethink and rework.

- [ ] Add email verification (this is partially done already)

- [ ] Reading .env files and dealing with the environment should be immune,
  to the extent possible, from idiotic errors

- [ ] Update check script:
  - [x] shellcheck on shell scripts
  - [x] `go vet` on go files
  - [x] `golangci-lint` on go files
  - [x] Run `go fmt` on all .go files
  - [ ] Eventually, run unit tests

- [ ] write docs
  - upgrade docs
  - starting docs
  - taking over docs
  - reference
  - internals

- [ ] make migration creation default to something like yyyy-mm-dd_ssss (are
  9999 migrations in a day enough?)

- [ ] clean up `cmd` and `mgmt`: do the right thing with their commonalities
  and make very plain which is which for what. Consider additional
  commands. Maybe `develop` for specific development tasks,
  `operate` for operational tasks, and we keep `cmd` for project-specific
  commands. Something like that.

## low importance

- [ ] add a prometheus-style `/metrics` endpoint to master
- [ ] create a metrics server analogous to the logging server
  - accept various stats from the workers (TBD)

- [ ] move `master-bin` into a subdir like `master/cmd` or whatever is
  idiomatic for golang programs; adapt `master` wrapper shell script
  accordingly

- [ ] flesh out the `sync.sh` script
  - [ ] update framework-managed node
  - [ ] update framework-managed pnpm
  - [ ] update pnpm-managed deps
  - [ ] rebuild golang programs

- [ ] If the number of workers is large, then there is a long lapse between
  when you change a file and when the server responds
  - One solution: start and stop workers serially: stop one, restart it with new
    code; repeat
  - Slow start them: only start a few at first

- [ ] in express/user.ts: FIXME: set createdAt and updatedAt to start of epoch

## finished

- [x] Reimplement fixup.sh
  - [x] run shfmt on all shell scripts (and the files they `source`)
  - [x] Run `go fmt` on all .go files
  - [x] Run ~~prettier~~ biome on all .ts files and maybe others

- [x] Adapt master program so that it reads configuration from command line
  args instead of from environment variables
  - Should have sane defaults
  - Adding new arguments should be easy and obvious

- [x] Add wrapper script to run master program (so that various assumptions related
  to relative paths are safer)

- [x] Add logging service
  - New golang program, in the same directory as master
  - Intended to be started by master
  - Listens to a port specified by a command line arg
  - Accepts POSTed (or possibly PUT) json messages, currently in a
    to-be-defined format. We will work on this format later.
  - Keeps the most recent N messages in memory. N can be a fairly large
    number; let's start by assuming 1 million.
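The "most recent N messages" behavior described above can be sketched as a simple bounded buffer. The real logging service is a Go program; this TypeScript sketch (hypothetical `RecentMessages` class) only illustrates the eviction rule:

```typescript
// Keep the most recent `capacity` messages, dropping the oldest on overflow.
// A production version at N = 1 million would want a real ring buffer
// (fixed array + head index) instead of shift(), which is O(n).
class RecentMessages<T> {
  private buf: T[] = [];
  constructor(private readonly capacity: number) {}

  push(msg: T): void {
    this.buf.push(msg);
    if (this.buf.length > this.capacity) {
      this.buf.shift(); // evict the oldest message
    }
  }

  all(): T[] {
    return [...this.buf]; // oldest first
  }
}
```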
- [x] Log to logging service from the express backend
  - Fill out types and functions in `express/logging.ts`

- [x] Add first cut at database access. Remember that ORMs are not all that!

- [x] Create initial docker-compose.yml file for local development
  - include most recent stable postgres
  - include beanstalkd
  - include memcached
  - include redis
  - include mailpit
22
cmd
@@ -2,20 +2,26 @@

# This file belongs to the framework. You are not expected to modify it.

# FIXME: Obviously this file isn't nearly robust enough. Make it so.
# Managed binary runner - runs framework-managed binaries like node, pnpm, tsx
# Usage: ./cmd <command> [args...]

set -eu

DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

if [ $# -lt 1 ]; then
	echo "Usage: ./cmd <command> [args...]"
	echo ""
	echo "Available commands:"
	for cmd in "$DIR"/framework/cmd.d/*; do
		if [ -x "$cmd" ]; then
			basename "$cmd"
		fi
	done
	exit 1
fi

subcmd="$1"

# echo "$subcmd"

#exit 3

shift

echo will run "$DIR"/framework/cmd.d/"$subcmd" "$@"

exec "$DIR"/framework/cmd.d/"$subcmd" "$@"
27
develop
Executable file
@@ -0,0 +1,27 @@
#!/bin/bash

# This file belongs to the framework. You are not expected to modify it.

# Development command runner - parallel to ./mgmt for development tasks
# Usage: ./develop <command> [args...]

set -eu

DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

if [ $# -lt 1 ]; then
	echo "Usage: ./develop <command> [args...]"
	echo ""
	echo "Available commands:"
	for cmd in "$DIR"/framework/develop.d/*; do
		if [ -x "$cmd" ]; then
			basename "$cmd"
		fi
	done
	exit 1
fi

subcmd="$1"
shift

exec "$DIR"/framework/develop.d/"$subcmd" "$@"
35
docker-compose.yml
Normal file
@@ -0,0 +1,35 @@
services:
  postgres:
    image: postgres:17
    ports:
      - "5432:5432"
    environment:
      POSTGRES_USER: diachron
      POSTGRES_PASSWORD: diachron
      POSTGRES_DB: diachron
    volumes:
      - postgres_data:/var/lib/postgresql/data

  redis:
    image: redis:7
    ports:
      - "6379:6379"

  memcached:
    image: memcached:1.6
    ports:
      - "11211:11211"

  beanstalkd:
    image: schickling/beanstalkd
    ports:
      - "11300:11300"

  mailpit:
    image: axllent/mailpit
    ports:
      - "1025:1025" # SMTP
      - "8025:8025" # Web UI

volumes:
  postgres_data:
125
docs/commands.md
Normal file
@@ -0,0 +1,125 @@
# The Three Types of Commands

This framework deliberately separates *how* you interact with the system into three distinct command types. The split is not cosmetic; it encodes safety, intent, and operational assumptions directly into the tooling so that mistakes are harder to make under stress.

The guiding idea: **production should feel boring and safe; exploration should feel powerful and a little dangerous; the application itself should not care how it is being operated.**

---

## 1. Application Commands (`app`)

**What they are**
Commands defined *by the application itself*, for its own domain needs. They are not part of the framework, even though they are built on top of it.

The framework provides structure and affordances; the application supplies meaning.

**Core properties**

* Express domain behavior, not infrastructure concerns
* Safe by definition
* Deterministic and repeatable
* No environment‑dependent semantics
* Identical behavior in dev, staging, and production

**Examples**

* Handling HTTP requests
* Rendering templates
* Running background jobs / queues
* Sending emails triggered by application logic

**Non‑goals**

* No schema changes
* No data backfills
* No destructive behavior
* No operational or lifecycle management

**Rule of thumb**
If removing the framework would require rewriting *how* it runs but not *what* it does, the command belongs here.

---

## 2. Management Commands (`mgmt`)

**What they are**
Operational, *production‑safe* commands used to evolve and maintain a live system.

These commands assume real data exists and must not be casually destroyed.

**Core properties**

* Forward‑only
* Idempotent or safely repeatable
* Designed to run in production
* Explicit, auditable intent

**Examples**

* Applying migrations
* Running seeders that assert invariant data
* Reindexing or rebuilding derived data
* Rotating keys, recalculating counters

**Design constraints**

* No implicit rollbacks
* No hidden destructive actions
* Fail fast if assumptions are violated

**Rule of thumb**
If you would run it at 3am while tired and worried, it must live here.

---

## 3. Development Commands (`develop`)

**What they are**
Sharp, *unsafe by design* tools meant exclusively for local development and experimentation.

These commands optimize for speed, learning, and iteration — not safety.

**Core properties**

* Destructive operations allowed
* May reset or mutate large amounts of data
* Assume a clean or disposable environment
* Explicitly gated in production

**Examples**

* Dropping and recreating databases
* Rolling migrations backward
* Loading fixtures or scenarios
* Generating fake or randomized data

**Safety model**

* Hard to run in production
* Requires explicit opt‑in if ever enabled
* Clear, noisy warnings when invoked

**Rule of thumb**
If it would be irresponsible to run against real user data, it belongs here.

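The "explicitly gated in production" rule above can be sketched as a guard that destructive commands call before doing anything. The `APP_ENV` variable and the `assertDevelopAllowed` helper here are assumptions for illustration, not necessarily the framework's real mechanism:

```typescript
// Environments where develop commands refuse to run. The TODO notes
// suggest blocking at least "production", maybe "testing" too.
const BLOCKED_ENVS: string[] = ["production", "testing"];

// Throw loudly before any destructive work happens.
function assertDevelopAllowed(appEnv: string | undefined): void {
  const env = (appEnv ?? "development").toLowerCase();
  if (BLOCKED_ENVS.includes(env)) {
    throw new Error(
      `develop commands are destructive and refuse to run when APP_ENV=${env}`,
    );
  }
}
```

A real implementation would likely also print the noisy warning the safety model calls for, but failing fast is the essential part.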
---

## Why This Split Matters

Many frameworks blur these concerns, leading to:

* Fearful production operations
* Overpowered dev tools leaking into prod
* Environment‑specific behavior and bugs

By naming and enforcing these three command types:

* Intent is visible at the CLI level
* Safety properties are architectural, not cultural
* Developers can move fast *without* normalizing risk

---

## One‑Sentence Summary

> **App commands run the system, mgmt commands evolve it safely, and develop commands let you break things on purpose — but only where it’s allowed.**
37
docs/concentric-circles.md
Normal file
@@ -0,0 +1,37 @@
Let's consider a bullseye with the following concentric circles:

- Ring 0: small, simple systems
  - Single jurisdiction
  - Email + password
  - A few roles
  - Naïve or soft deletion
  - Minimal audit needs

- Ring 1: grown-up systems
  - Long-lived data
  - Changing requirements
  - Shared accounts
  - GDPR-style erasure/anonymization
  - Some cross-border concerns
  - Historical data must remain usable
  - “Oops, we should have thought about that” moments

- Ring 2: heavy compliance
  - Formal audit trails
  - Legal hold
  - Non-repudiation
  - Regulatory reporting
  - Strong identity guarantees
  - Jurisdiction-aware data partitioning

- Ring 3: banking / defense / healthcare at scale
  - Cryptographic auditability
  - Append-only ledgers
  - Explicit legal models
  - Independent compliance teams
  - Lawyers embedded in engineering

diachron is designed to be suitable for Rings 0 and 1. Occasionally we may
look over the fence into Ring 2, but it's not what we've principally designed
for. Please take this framing into account when evaluating diachron for
greenfield projects.
142
docs/freedom-hacking-and-responsibility.md
Normal file
@@ -0,0 +1,142 @@
# Freedom, Hacking, and Responsibility

This framework is **free and open source software**.

That fact is not incidental. It is a deliberate ethical, practical, and technical choice.

This document explains how freedom to modify coexists with strong guidance about *how the framework is meant to be used* — without contradiction, and without apology.

---

## The short version

* This is free software. You are free to modify it.
* The framework has documented invariants for good reasons.
* You are encouraged to explore, question, and patch.
* You are discouraged from casually undermining guarantees you still expect to rely on.
* Clarity beats enforcement.

Freedom with understanding beats both lock-in and chaos.

---

## Your Freedom

You are free to:

* study the source code
* run the software for any purpose
* modify it in any way
* fork it
* redistribute it, with or without changes
* submit patches, extensions, or experiments

…subject only to the terms of the license.

These freedoms are foundational. They are not granted reluctantly, and they are not symbolic. They exist so that:

* you can understand what your software is really doing
* you are not trapped by vendor control
* the system can outlive its original authors

---

## Freedom Is Not the Same as Endorsement

While you are free to change anything, **not all changes are equally wise**.

Some parts of the framework are carefully constrained because they encode:

* security assumptions
* lifecycle invariants
* hard-won lessons from real systems under stress

You are free to violate these constraints in your own fork.

But the framework’s documentation will often say things like:

* “do not modify this”
* “application code must not depend on this”
* “this table or class is framework-owned”

These statements are **technical guidance**, not legal restrictions.

They exist to answer the question:

> *If you want this system to remain upgradeable, predictable, and boring — what should you leave alone?*

---

## The Intended Social Contract

The framework makes a clear offer:

* We expose our internals so you can learn.
* We provide explicit extension points so you can adapt.
* We document invariants so you don’t have to rediscover them the hard way.

In return, we ask that:

* application code respects documented boundaries
* extensions use explicit seams rather than hidden hooks
* patches that change invariants are proposed consciously, not accidentally

Nothing here is enforced by technical locks.

It is enforced — insofar as it is enforced at all — by clarity and shared expectations.

---

## Hacking Is Welcome

Exploration is not just allowed; it is encouraged.

Good reasons to hack on the framework include:

* understanding how it works
* evaluating whether its constraints make sense
* adapting it to unfamiliar environments
* testing alternative designs
* discovering better abstractions

Fork it. Instrument it. Break it. Learn from it.

Many of the framework’s constraints exist *because* someone once ignored them and paid the price.

---

## Patches, Not Patches-in-Place

If you discover a problem or a better design:

* patches are welcome
* discussions are welcome
* disagreements are welcome

What is discouraged is **quietly patching around framework invariants inside application code**.

That approach:

* obscures intent
* creates one-off local truths
* makes systems harder to reason about

If the framework is wrong, it should be corrected *at the framework level*, or consciously forked.

---

## Why This Is Not a Contradiction

Strong opinions and free software are not enemies.

Freedom means you can change the software.

Responsibility means understanding what you are changing, and why.

A system that pretends every modification is equally safe is dishonest.

A system that hides its internals to prevent modification is hostile.

This framework aims for neither.
27
docs/groups-and-roles.md
Normal file
@@ -0,0 +1,27 @@
- Role: a named bundle of responsibilities (editor, admin, member)

- Group: a scope or context (org, team, project, publication)

- Permission / Capability (capability preferred in code): a boolean fact about
  allowed behavior

## tips

- In the database, capabilities are boolean values. Their names should be
  verb-subject. Don't include `can` and definitely do not include `cannot`.

  ✔️ `edit_post`
  ❌ `cannot_remove_comment`

- The capabilities table is deliberately flat. If you need to group them, use
  `.` as a delimiter and sort and filter accordingly in queries and in your
  UI.

  ✔️ `blog.edit_post`
  ✔️ `blog.moderate_comment`

  or

  ✔️ `blog.post.edit`
  ✔️ `blog.post.delete`
  ✔️ `blog.comment.moderate`
  ✔️ `blog.comment.edit`

  are all fine.
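The naming and grouping tips above can be checked mechanically. This sketch uses hypothetical helpers and an illustrative regex (not diachron API) to validate verb-subject capability names and to filter a flat capability list by its `.`-delimited group:

```typescript
// Each dot-separated segment: lowercase snake_case, no can/cannot prefixes.
const SEGMENT = /^[a-z][a-z0-9_]*$/;

function isValidCapabilityName(name: string): boolean {
  const parts = name.split(".");
  return (
    parts.length > 0 &&
    parts.every(
      (p) =>
        SEGMENT.test(p) &&
        !p.startsWith("can_") &&
        !p.startsWith("cannot_"),
    )
  );
}

// Filter a flat capability list down to one group, using "." as delimiter,
// the way a query or UI might.
function capabilitiesInGroup(caps: string[], group: string): string[] {
  return caps.filter((c) => c.startsWith(`${group}.`)).sort();
}
```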
17
docs/index.md
Normal file
@@ -0,0 +1,17 @@
misc notes for now. of course this needs to be written up for real.

## execution context

The execution context represents facts such as the runtime directory, the
operating system, hardware, and filesystem layout, distinct from environment
variables or request-scoped context.

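As a sketch, such a context might be modeled as a plain record. Every field name here is hypothetical, not diachron API; the point is only what belongs in it (and what does not):

```typescript
// Facts about where and on what the process is running. Deliberately
// excludes environment variables and request-scoped data, which live
// elsewhere.
interface ExecutionContext {
  runtimeDir: string; // directory the process was started from
  os: string; // operating system identifier
  arch: string; // hardware architecture
  frameworkDir: string; // filesystem layout: where framework code lives
}

function describeContext(ctx: ExecutionContext): string {
  return `${ctx.os}/${ctx.arch} in ${ctx.runtimeDir}`;
}
```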
## philosophy

- TODO-DESIGN.md
- concentric-circles.md
- nomenclature.md
- mutability.md
- commands.md
- groups-and-roles.md
34
docs/migrations-and-seeders-and-database-table-ownership.md
Normal file
@@ -0,0 +1,34 @@
Some database tables are owned by diachron and some are owned by the
application.

This also applies to seeders: some are owned by diachron and some by the
application.

The database's structure is managed by migrations written in SQL.

Each migration gets its own file. These files' names should match
`yyyy-mm-dd_ss-description.sql`, eg `2026-01-01_01-users.sql`.

Files are sorted lexicographically by name and applied in order.

Note: in the future we may relax or modify the restriction on migration file
names, but they'll continue to be applied in lexicographical order.

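The naming and ordering rules above can be sketched in a few lines. The `MIGRATION_NAME` pattern and the `orderMigrations` helper are illustrative, not part of diachron:

```typescript
// yyyy-mm-dd_ss-description.sql, e.g. 2026-01-01_01-users.sql
const MIGRATION_NAME = /^\d{4}-\d{2}-\d{2}_\d{2}-[a-z0-9-]+\.sql$/;

// Reject badly named files, then return them in application order.
// Lexicographic filename order IS the application order, which is why
// the date-first naming scheme works.
function orderMigrations(files: string[]): string[] {
  const bad = files.filter((f) => !MIGRATION_NAME.test(f));
  if (bad.length > 0) {
    throw new Error(`badly named migration file(s): ${bad.join(", ")}`);
  }
  return [...files].sort();
}
```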
## framework and application migrations

Migrations owned by the framework are kept in a separate directory from those
owned by applications. Pending framework migrations, if any, are applied
before pending application migrations, if any.

diachron will go to some lengths to ensure that framework migrations do not
break applications.

## no downward migrations

diachron does not provide them. "The only way out is through."

When developing locally, you can use the command `develop reset-db`. **NEVER
USE THIS IN PRODUCTION!** Always be sure that you can "get back to where you
were". Being careful when creating migrations and seeders can help, but
dumping and restoring known-good copies of the database can also take you a
long way.
1
docs/mutability.md
Normal file
@@ -0,0 +1 @@
Describe and define what is expected to be mutable and what is not.
@@ -2,3 +2,14 @@ We use `Call` and `Result` for our own types that wrap `Request` and
`Response`.

Hopefully this will make things less confusing and avoid problems with shadowing.

## meta

- We use _algorithmic complexity_ for performance discussions, when
  things like Big-O come up, etc

- We use _conceptual complexity_ for design and architecture

- We use _cognitive load_ when talking about developer experience

- We use _operational burden_ when talking about production reality

@@ -1 +1,219 @@
|
||||
.
|
||||
# Framework vs Application Ownership
|
||||
|
||||
This document defines **ownership boundaries** between the framework and application code. These boundaries are intentional and non-negotiable: they exist to preserve upgradeability, predictability, and developer sanity under stress.
|
||||
|
||||
Ownership answers a simple question:
|
||||
|
||||
> **Who is allowed to change this, and under what rules?**
|
||||
|
||||
The framework draws a hard line between *framework‑owned* and *application‑owned* concerns, while still encouraging extension through explicit, visible mechanisms.
|
||||
|
||||
---
|
||||
|
||||
## Core Principle
|
||||
|
||||
The framework is not a library of suggestions. It is a **runtime with invariants**.
|
||||
|
||||
Application code:
|
||||
|
||||
* **uses** the framework
|
||||
* **extends** it through defined seams
|
||||
* **never mutates or overrides its invariants**
|
||||
|
||||
Framework code:
|
||||
|
||||
* guarantees stable behavior
|
||||
* owns critical lifecycle and security concerns
|
||||
* must remain internally consistent across versions
|
||||
|
||||
Breaking this boundary creates systems that work *until they don’t*, usually during upgrades or emergencies.
|
||||
|
||||
---
|
||||
|
||||
## Database Ownership
|
||||
|
||||
### Framework‑Owned Tables
|
||||
|
||||
Certain database tables are **owned and managed exclusively by the framework**.
|
||||
|
||||
Examples (illustrative, not exhaustive):
|
||||
|
||||
* authentication primitives
|
||||
* session or token state
|
||||
* internal capability/permission metadata
|
||||
* migration bookkeeping
|
||||
* framework feature flags or invariants
|
||||
|
||||
#### Rules
|
||||
|
||||
Application code **must not**:
|
||||
|
||||
* modify schema
|
||||
* add columns
|
||||
* delete rows
|
||||
* update rows directly
|
||||
* rely on undocumented columns or behaviors
|
||||
|
||||
Application code **may**:
|
||||
|
||||
* read via documented framework APIs
|
||||
* reference stable identifiers explicitly exposed by the framework
|
||||
|
||||
Think of these tables as **private internal state** — even though they live in your database.
|
||||
|
||||
> If the framework needs you to interact with this data, it will expose an API for it.
|
||||
|
||||
#### Rationale
|
||||
|
||||
These tables:
|
||||
|
||||
* encode security or correctness invariants
|
||||
* may change structure across framework versions
|
||||
* must remain globally coherent
|
||||
|
||||
Treating them as app‑owned data tightly couples your app to framework internals and blocks safe upgrades.
|
||||
|
||||
---
|
||||
|
||||
### Application‑Owned Tables
|
||||
|
||||
All domain data belongs to the application.
|
||||
|
||||
Examples:
|
||||
|
||||
* users (as domain actors, not auth primitives)
|
||||
* posts, orders, comments, invoices
|
||||
* business‑specific joins and projections
|
||||
* denormalized or performance‑oriented tables
|
||||
|
||||
#### Rules
|
||||
|
||||
Application code:
|
||||
|
||||
* owns schema design
|
||||
* owns migrations
|
||||
* owns constraints and indexes
|
||||
* may evolve these tables freely
|
||||
|
||||
The framework:
|
||||
|
||||
* never mutates application tables implicitly
|
||||
* interacts only through explicit queries or contracts
|
||||
|
||||
#### Integration Pattern
|
||||
|
||||
Where framework concepts must relate to app data:
|
||||
|
||||
* use **foreign keys to framework‑exposed identifiers**, or
|
||||
* introduce **explicit join tables** owned by the application
|
||||
|
||||
No hidden coupling, no magic backfills.
|
||||
|
||||
---

## Code Ownership

### Framework‑Owned Code

Some classes, constants, and modules are **framework‑owned**.

These include:

* core request/response abstractions
* auth and user primitives
* capability/permission evaluation logic
* lifecycle hooks
* low‑level utilities relied on by the framework itself

#### Rules

Application code **must not**:

* modify framework source
* monkey‑patch or override internals
* rely on undocumented behavior
* change constant values or internal defaults

Framework code is treated as **read‑only** from the app’s perspective.

---

### Extension Is Encouraged (But Explicit)

Ownership does **not** mean rigidity.

The framework is designed to be extended via **intentional seams**, such as:

* subclassing
* composition
* adapters
* delegation
* configuration objects
* explicit registration APIs

#### Preferred Patterns

* **Subclass when behavior is stable and conceptual**
* **Compose when behavior is contextual or optional**
* **Delegate when authority should remain with the framework**

What matters is that extension is:

* visible in code
* locally understandable
* reversible

No spooky action at a distance.
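As a minimal sketch of such a seam (the `Handler` type, `registry`, and `register` function are hypothetical stand‑ins, not part of the framework): the app extends behavior by composing through a visible registration point rather than patching internals.

```typescript
type Handler = (input: string) => string;

// A visible, explicit registration point (stand-in for a real framework API).
const registry = new Map<string, Handler>();

function register(name: string, handler: Handler): void {
  registry.set(name, handler);
}

// Application-owned: wrap existing behavior by composition,
// never by mutating framework internals.
const base: Handler = (s) => s.trim();
register("normalize", (s) => base(s).toLowerCase());

console.log(registry.get("normalize")!("  Hello  ")); // "hello"
```

Because the extension lives at an explicit call site, it is visible in code, locally understandable, and trivially reversible: delete the `register` call and the framework's default behavior is back.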
---

## What the App Owns Completely

The application fully owns:

* domain models and data shapes
* SQL queries and result parsing
* business rules
* authorization policy *inputs* (not the engine)
* rendering decisions
* feature flags specific to the app
* performance trade‑offs

The framework does not attempt to infer intent from your domain.

---

## What the Framework Guarantees

In return for respecting ownership boundaries, the framework guarantees:

* stable semantics across versions
* forward‑only migrations for its own tables
* explicit deprecations
* no silent behavior changes
* identical runtime behavior in dev and prod

The framework may evolve internally — **but never by reaching into your app’s data or code**.

---

## A Useful Mental Model

* Framework‑owned things are **constitutional law**
* Application‑owned things are **legislation**

You can write any laws you want — but you don’t amend the constitution inline.

If you need a new power, the framework should expose it deliberately.

---

## Summary

* Ownership is about **who is allowed to change what**
* Framework‑owned tables and code are read‑only to the app
* Application‑owned tables and code are sovereign
* Extension is encouraged, mutation is not
* Explicit seams beat clever hacks

Respecting these boundaries keeps systems boring — and boring systems survive stress.
@@ -3,15 +3,22 @@ import express, {
  type Response as ExpressResponse,
} from "express";
import { match } from "path-to-regexp";
import { Session } from "./auth";
import { cli } from "./cli";
import { contentTypes } from "./content-types";
import { runWithContext } from "./context";
import { core } from "./core";
import { httpCodes } from "./http-codes";
import { request } from "./request";
import { routes } from "./routes";
import { services } from "./services";

// import { URLPattern } from 'node:url';
import {
  AuthenticationRequired,
  AuthorizationDenied,
  type Call,
  type InternalHandler,
  isRedirect,
  type Method,
  massageMethod,
  methodParser,
@@ -22,7 +29,11 @@ import {

const app = express();

services.logging.log({ source: "logging", text: ["1"] });
// Parse request bodies
app.use(express.json());
app.use(express.urlencoded({ extended: true }));

core.logging.log({ source: "logging", text: ["1"] });
const processedRoutes: { [K in Method]: ProcessedRoute[] } = {
  GET: [],
  POST: [],
@@ -31,7 +42,7 @@ const processedRoutes: { [K in Method]: ProcessedRoute[] } = {
  DELETE: [],
};

function isPromise<T>(value: T | Promise<T>): value is Promise<T> {
function _isPromise<T>(value: T | Promise<T>): value is Promise<T> {
  return typeof (value as any)?.then === "function";
}

@@ -41,9 +52,9 @@ routes.forEach((route: Route, _idx: number, _allRoutes: Route[]) => {
  const methodList = route.methods;

  const handler: InternalHandler = async (
    request: ExpressRequest,
    expressRequest: ExpressRequest,
  ): Promise<Result> => {
    const method = massageMethod(request.method);
    const method = massageMethod(expressRequest.method);

    console.log("method", method);

@@ -51,27 +62,46 @@ routes.forEach((route: Route, _idx: number, _allRoutes: Route[]) => {
      // XXX: Worth asserting this?
    }

    console.log("request.originalUrl", request.originalUrl);
    console.log("beavis");
    console.log("request.originalUrl", expressRequest.originalUrl);

    // const p = new URL(request.originalUrl);
    // const path = p.pathname;

    // console.log("p, path", p, path)

    console.log("ok");
    // Authenticate the request
    const auth = await request.auth.validateRequest(expressRequest);

    const req: Call = {
      pattern: route.path,
      // path,
      path: request.originalUrl,
      path: expressRequest.originalUrl,
      method,
      parameters: { one: 1, two: 2 },
      request,
      request: expressRequest,
      user: auth.user,
      session: new Session(auth.session, auth.user),
    };

    const retval = await route.handler(req);
    return retval;
    try {
      const retval = await runWithContext({ user: auth.user }, () =>
        route.handler(req),
      );
      return retval;
    } catch (error) {
      // Handle authentication errors
      if (error instanceof AuthenticationRequired) {
        return {
          code: httpCodes.clientErrors.Unauthorized,
          contentType: contentTypes.application.json,
          result: JSON.stringify({
            error: "Authentication required",
          }),
        };
      }
      if (error instanceof AuthorizationDenied) {
        return {
          code: httpCodes.clientErrors.Forbidden,
          contentType: contentTypes.application.json,
          result: JSON.stringify({ error: "Access denied" }),
        };
      }
      throw error;
    }
  };

  for (const [_idx, method] of methodList.entries()) {
@@ -88,8 +118,15 @@ async function handler(
  const method = await methodParser.parseAsync(req.method);

  const byMethod = processedRoutes[method];
  console.log(
    "DEBUG: req.path =",
    JSON.stringify(req.path),
    "method =",
    method,
  );
  for (const [_idx, pr] of byMethod.entries()) {
    const match = pr.matcher(req.url);
    const match = pr.matcher(req.path);
    console.log("DEBUG: trying pattern, match result =", match);
    if (match) {
      console.log("match", match);
      const resp = await pr.handler(req);
@@ -101,7 +138,7 @@ async function handler(
  const retval: Result = {
    code: httpCodes.clientErrors.NotFound,
    contentType: contentTypes.text.plain,
    result: "not found",
    result: "not found!",
  };

  return retval;
@@ -115,7 +152,18 @@ app.use(async (req: ExpressRequest, res: ExpressResponse) => {

  console.log(result);

  res.status(code).send(result);
  // Set any cookies from the result
  if (result0.cookies) {
    for (const cookie of result0.cookies) {
      res.cookie(cookie.name, cookie.value, cookie.options ?? {});
    }
  }

  if (isRedirect(result0)) {
    res.redirect(code, result0.redirect);
  } else {
    res.status(code).send(result);
  }
});

process.title = `diachron:${cli.listen.port}`;
20
express/auth/index.ts
Normal file
@@ -0,0 +1,20 @@
// index.ts
//
// Barrel export for auth module.
//
// NOTE: authRoutes is NOT exported here to avoid circular dependency:
//   services.ts → auth/index.ts → auth/routes.ts → services.ts
//   Import authRoutes directly from "./auth/routes" instead.

export { hashPassword, verifyPassword } from "./password";
export { type AuthResult, AuthService } from "./service";
export { type AuthStore, InMemoryAuthStore } from "./store";
export { generateToken, hashToken, SESSION_COOKIE_NAME } from "./token";
export {
  type AuthMethod,
  Session,
  type SessionData,
  type TokenId,
  type TokenType,
  tokenLifetimes,
} from "./types";
70
express/auth/password.ts
Normal file
@@ -0,0 +1,70 @@
// password.ts
//
// Password hashing using Node.js scrypt (no external dependencies).
// Format: $scrypt$N$r$p$salt$hash (all base64)

import {
  randomBytes,
  type ScryptOptions,
  scrypt,
  timingSafeEqual,
} from "node:crypto";

// Configuration
const SALT_LENGTH = 32;
const KEY_LENGTH = 64;
const SCRYPT_PARAMS: ScryptOptions = {
  N: 16384, // CPU/memory cost parameter (2^14)
  r: 8, // Block size
  p: 1, // Parallelization
};

// Promisified scrypt with options support
function scryptAsync(
  password: string,
  salt: Buffer,
  keylen: number,
  options: ScryptOptions,
): Promise<Buffer> {
  return new Promise((resolve, reject) => {
    scrypt(password, salt, keylen, options, (err, derivedKey) => {
      if (err) {
        reject(err);
      } else {
        resolve(derivedKey);
      }
    });
  });
}

async function hashPassword(password: string): Promise<string> {
  const salt = randomBytes(SALT_LENGTH);
  const hash = await scryptAsync(password, salt, KEY_LENGTH, SCRYPT_PARAMS);

  const { N, r, p } = SCRYPT_PARAMS;
  return `$scrypt$${N}$${r}$${p}$${salt.toString("base64")}$${hash.toString("base64")}`;
}

async function verifyPassword(
  password: string,
  stored: string,
): Promise<boolean> {
  const parts = stored.split("$");
  if (parts[1] !== "scrypt" || parts.length !== 7) {
    throw new Error("Invalid password hash format");
  }

  const [, , nStr, rStr, pStr, saltB64, hashB64] = parts;
  const salt = Buffer.from(saltB64, "base64");
  const storedHash = Buffer.from(hashB64, "base64");

  const computedHash = await scryptAsync(password, salt, storedHash.length, {
    N: parseInt(nStr, 10),
    r: parseInt(rStr, 10),
    p: parseInt(pStr, 10),
  });

  return timingSafeEqual(storedHash, computedHash);
}

export { hashPassword, verifyPassword };
231
express/auth/routes.ts
Normal file
@@ -0,0 +1,231 @@
// routes.ts
//
// Authentication route handlers.

import { z } from "zod";
import { contentTypes } from "../content-types";
import { httpCodes } from "../http-codes";
import { request } from "../request";
import type { Call, Result, Route } from "../types";
import {
  forgotPasswordInputParser,
  loginInputParser,
  registerInputParser,
  resetPasswordInputParser,
} from "./types";

// Helper for JSON responses
const jsonResponse = (
  code: (typeof httpCodes.success)[keyof typeof httpCodes.success],
  data: object,
): Result => ({
  code,
  contentType: contentTypes.application.json,
  result: JSON.stringify(data),
});

const errorResponse = (
  code: (typeof httpCodes.clientErrors)[keyof typeof httpCodes.clientErrors],
  error: string,
): Result => ({
  code,
  contentType: contentTypes.application.json,
  result: JSON.stringify({ error }),
});

// POST /auth/login
const loginHandler = async (call: Call): Promise<Result> => {
  try {
    const body = call.request.body;
    const { email, password } = loginInputParser.parse(body);

    const result = await request.auth.login(email, password, "cookie", {
      userAgent: call.request.get("User-Agent"),
      ipAddress: call.request.ip,
    });

    if (!result.success) {
      return errorResponse(
        httpCodes.clientErrors.Unauthorized,
        result.error,
      );
    }

    return jsonResponse(httpCodes.success.OK, {
      token: result.token,
      user: {
        id: result.user.id,
        email: result.user.email,
        displayName: result.user.displayName,
      },
    });
  } catch (error) {
    if (error instanceof z.ZodError) {
      return errorResponse(
        httpCodes.clientErrors.BadRequest,
        "Invalid input",
      );
    }
    throw error;
  }
};

// POST /auth/logout
const logoutHandler = async (call: Call): Promise<Result> => {
  const token = request.auth.extractToken(call.request);
  if (token) {
    await request.auth.logout(token);
  }

  return jsonResponse(httpCodes.success.OK, { message: "Logged out" });
};

// POST /auth/register
const registerHandler = async (call: Call): Promise<Result> => {
  try {
    const body = call.request.body;
    const { email, password, displayName } =
      registerInputParser.parse(body);

    const result = await request.auth.register(
      email,
      password,
      displayName,
    );

    if (!result.success) {
      return errorResponse(httpCodes.clientErrors.Conflict, result.error);
    }

    // TODO: Send verification email with result.verificationToken
    // For now, log it for development
    console.log(
      `[AUTH] Verification token for ${email}: ${result.verificationToken}`,
    );

    return jsonResponse(httpCodes.success.Created, {
      message:
        "Registration successful. Please check your email to verify your account.",
      user: {
        id: result.user.id,
        email: result.user.email,
      },
    });
  } catch (error) {
    if (error instanceof z.ZodError) {
      return errorResponse(
        httpCodes.clientErrors.BadRequest,
        "Invalid input",
      );
    }
    throw error;
  }
};

// POST /auth/forgot-password
const forgotPasswordHandler = async (call: Call): Promise<Result> => {
  try {
    const body = call.request.body;
    const { email } = forgotPasswordInputParser.parse(body);

    const result = await request.auth.createPasswordResetToken(email);

    // Always return success (don't reveal if email exists)
    if (result) {
      // TODO: Send password reset email
      console.log(
        `[AUTH] Password reset token for ${email}: ${result.token}`,
      );
    }

    return jsonResponse(httpCodes.success.OK, {
      message:
        "If an account exists with that email, a password reset link has been sent.",
    });
  } catch (error) {
    if (error instanceof z.ZodError) {
      return errorResponse(
        httpCodes.clientErrors.BadRequest,
        "Invalid input",
      );
    }
    throw error;
  }
};

// POST /auth/reset-password
const resetPasswordHandler = async (call: Call): Promise<Result> => {
  try {
    const body = call.request.body;
    const { token, password } = resetPasswordInputParser.parse(body);

    const result = await request.auth.resetPassword(token, password);

    if (!result.success) {
      return errorResponse(
        httpCodes.clientErrors.BadRequest,
        result.error,
      );
    }

    return jsonResponse(httpCodes.success.OK, {
      message:
        "Password has been reset. You can now log in with your new password.",
    });
  } catch (error) {
    if (error instanceof z.ZodError) {
      return errorResponse(
        httpCodes.clientErrors.BadRequest,
        "Invalid input",
      );
    }
    throw error;
  }
};

// GET /auth/verify-email?token=xxx
const verifyEmailHandler = async (call: Call): Promise<Result> => {
  const url = new URL(call.path, "http://localhost");
  const token = url.searchParams.get("token");

  if (!token) {
    return errorResponse(
      httpCodes.clientErrors.BadRequest,
      "Missing token",
    );
  }

  const result = await request.auth.verifyEmail(token);

  if (!result.success) {
    return errorResponse(httpCodes.clientErrors.BadRequest, result.error);
  }

  return jsonResponse(httpCodes.success.OK, {
    message: "Email verified successfully. You can now log in.",
  });
};

// Export routes
const authRoutes: Route[] = [
  { path: "/auth/login", methods: ["POST"], handler: loginHandler },
  { path: "/auth/logout", methods: ["POST"], handler: logoutHandler },
  { path: "/auth/register", methods: ["POST"], handler: registerHandler },
  {
    path: "/auth/forgot-password",
    methods: ["POST"],
    handler: forgotPasswordHandler,
  },
  {
    path: "/auth/reset-password",
    methods: ["POST"],
    handler: resetPasswordHandler,
  },
  {
    path: "/auth/verify-email",
    methods: ["GET"],
    handler: verifyEmailHandler,
  },
];

export { authRoutes };
262
express/auth/service.ts
Normal file
@@ -0,0 +1,262 @@
// service.ts
//
// Core authentication service providing login, logout, registration,
// password reset, and email verification.

import type { Request as ExpressRequest } from "express";
import {
  type AnonymousUser,
  anonymousUser,
  type User,
  type UserId,
} from "../user";
import { hashPassword, verifyPassword } from "./password";
import type { AuthStore } from "./store";
import {
  hashToken,
  parseAuthorizationHeader,
  SESSION_COOKIE_NAME,
} from "./token";
import { type SessionData, type TokenId, tokenLifetimes } from "./types";

type LoginResult =
  | { success: true; token: string; user: User }
  | { success: false; error: string };

type RegisterResult =
  | { success: true; user: User; verificationToken: string }
  | { success: false; error: string };

type SimpleResult = { success: true } | { success: false; error: string };

// Result of validating a request/token - contains both user and session
export type AuthResult =
  | { authenticated: true; user: User; session: SessionData }
  | { authenticated: false; user: AnonymousUser; session: null };

export class AuthService {
  constructor(private store: AuthStore) {}

  // === Login ===

  async login(
    email: string,
    password: string,
    authMethod: "cookie" | "bearer",
    metadata?: { userAgent?: string; ipAddress?: string },
  ): Promise<LoginResult> {
    const user = await this.store.getUserByEmail(email);
    if (!user) {
      return { success: false, error: "Invalid credentials" };
    }

    if (!user.isActive()) {
      return { success: false, error: "Account is not active" };
    }

    const passwordHash = await this.store.getUserPasswordHash(user.id);
    if (!passwordHash) {
      return { success: false, error: "Invalid credentials" };
    }

    const valid = await verifyPassword(password, passwordHash);
    if (!valid) {
      return { success: false, error: "Invalid credentials" };
    }

    const { token } = await this.store.createSession({
      userId: user.id,
      tokenType: "session",
      authMethod,
      expiresAt: new Date(Date.now() + tokenLifetimes.session),
      userAgent: metadata?.userAgent,
      ipAddress: metadata?.ipAddress,
    });

    return { success: true, token, user };
  }

  // === Session Validation ===

  async validateRequest(request: ExpressRequest): Promise<AuthResult> {
    // Try cookie first (for web requests)
    let token = this.extractCookieToken(request);

    // Fall back to Authorization header (for API requests)
    if (!token) {
      token = parseAuthorizationHeader(request.get("Authorization"));
    }

    if (!token) {
      return { authenticated: false, user: anonymousUser, session: null };
    }

    return this.validateToken(token);
  }

  async validateToken(token: string): Promise<AuthResult> {
    const tokenId = hashToken(token) as TokenId;
    const session = await this.store.getSession(tokenId);

    if (!session) {
      return { authenticated: false, user: anonymousUser, session: null };
    }

    if (session.tokenType !== "session") {
      return { authenticated: false, user: anonymousUser, session: null };
    }

    const user = await this.store.getUserById(session.userId as UserId);
    if (!user || !user.isActive()) {
      return { authenticated: false, user: anonymousUser, session: null };
    }

    // Update last used (fire and forget)
    this.store.updateLastUsed(tokenId).catch(() => {});

    return { authenticated: true, user, session };
  }

  private extractCookieToken(request: ExpressRequest): string | null {
    const cookies = request.get("Cookie");
    if (!cookies) {
      return null;
    }

    for (const cookie of cookies.split(";")) {
      const [name, ...valueParts] = cookie.trim().split("=");
      if (name === SESSION_COOKIE_NAME) {
        return valueParts.join("="); // Handle = in token value
      }
    }
    return null;
  }

  // === Logout ===

  async logout(token: string): Promise<void> {
    const tokenId = hashToken(token) as TokenId;
    await this.store.deleteSession(tokenId);
  }

  async logoutAllSessions(userId: UserId): Promise<number> {
    return this.store.deleteUserSessions(userId);
  }

  // === Registration ===

  async register(
    email: string,
    password: string,
    displayName?: string,
  ): Promise<RegisterResult> {
    const existing = await this.store.getUserByEmail(email);
    if (existing) {
      return { success: false, error: "Email already registered" };
    }

    const passwordHash = await hashPassword(password);
    const user = await this.store.createUser({
      email,
      passwordHash,
      displayName,
    });

    // Create email verification token
    const { token: verificationToken } = await this.store.createSession({
      userId: user.id,
      tokenType: "email_verify",
      authMethod: "bearer",
      expiresAt: new Date(Date.now() + tokenLifetimes.email_verify),
    });

    return { success: true, user, verificationToken };
  }

  // === Email Verification ===

  async verifyEmail(token: string): Promise<SimpleResult> {
    const tokenId = hashToken(token) as TokenId;
    const session = await this.store.getSession(tokenId);

    if (!session || session.tokenType !== "email_verify") {
      return {
        success: false,
        error: "Invalid or expired verification token",
      };
    }

    if (session.isUsed) {
      return { success: false, error: "Token already used" };
    }

    await this.store.updateUserEmailVerified(session.userId as UserId);
    await this.store.deleteSession(tokenId);

    return { success: true };
  }

  // === Password Reset ===

  async createPasswordResetToken(
    email: string,
  ): Promise<{ token: string } | null> {
    const user = await this.store.getUserByEmail(email);
    if (!user) {
      // Don't reveal whether email exists
      return null;
    }

    const { token } = await this.store.createSession({
      userId: user.id,
      tokenType: "password_reset",
      authMethod: "bearer",
      expiresAt: new Date(Date.now() + tokenLifetimes.password_reset),
    });

    return { token };
  }

  async resetPassword(
    token: string,
    newPassword: string,
  ): Promise<SimpleResult> {
    const tokenId = hashToken(token) as TokenId;
    const session = await this.store.getSession(tokenId);

    if (!session || session.tokenType !== "password_reset") {
      return { success: false, error: "Invalid or expired reset token" };
    }

    if (session.isUsed) {
      return { success: false, error: "Token already used" };
    }

    const passwordHash = await hashPassword(newPassword);
    await this.store.setUserPassword(
      session.userId as UserId,
      passwordHash,
    );

    // Invalidate all existing sessions (security: password changed)
    await this.store.deleteUserSessions(session.userId as UserId);

    // Delete the reset token
    await this.store.deleteSession(tokenId);

    return { success: true };
  }

  // === Token Extraction Helper (for routes) ===

  extractToken(request: ExpressRequest): string | null {
    // Try Authorization header first
    const token = parseAuthorizationHeader(request.get("Authorization"));
    if (token) {
      return token;
    }

    // Try cookie
    return this.extractCookieToken(request);
  }
}
164
express/auth/store.ts
Normal file
@@ -0,0 +1,164 @@
// store.ts
|
||||
//
|
||||
// Authentication storage interface and in-memory implementation.
|
||||
// The interface allows easy migration to PostgreSQL later.
|
||||
|
||||
import { AuthenticatedUser, type User, type UserId } from "../user";
|
||||
import { generateToken, hashToken } from "./token";
|
||||
import type { AuthMethod, SessionData, TokenId, TokenType } from "./types";
|
||||
|
||||
// Data for creating a new session (tokenId generated internally)
|
||||
export type CreateSessionData = {
|
||||
userId: string;
|
||||
tokenType: TokenType;
|
||||
authMethod: AuthMethod;
|
||||
expiresAt: Date;
|
||||
userAgent?: string;
|
||||
ipAddress?: string;
|
||||
};
|
||||
|
||||
// Data for creating a new user
|
||||
export type CreateUserData = {
|
||||
email: string;
|
||||
passwordHash: string;
|
||||
displayName?: string;
|
||||
};
|
||||
|
||||
// Abstract interface for auth storage - implement for PostgreSQL later
|
||||
export interface AuthStore {
|
||||
// Session operations
|
||||
createSession(
|
||||
data: CreateSessionData,
|
||||
): Promise<{ token: string; session: SessionData }>;
|
||||
getSession(tokenId: TokenId): Promise<SessionData | null>;
|
||||
updateLastUsed(tokenId: TokenId): Promise<void>;
|
||||
deleteSession(tokenId: TokenId): Promise<void>;
|
||||
deleteUserSessions(userId: UserId): Promise<number>;
|
||||
|
||||
// User operations
|
||||
getUserByEmail(email: string): Promise<User | null>;
|
||||
getUserById(userId: UserId): Promise<User | null>;
|
||||
createUser(data: CreateUserData): Promise<User>;
|
||||
getUserPasswordHash(userId: UserId): Promise<string | null>;
|
||||
setUserPassword(userId: UserId, passwordHash: string): Promise<void>;
|
||||
updateUserEmailVerified(userId: UserId): Promise<void>;
|
||||
}
|
||||
|
||||
// In-memory implementation for development
|
||||
export class InMemoryAuthStore implements AuthStore {
|
||||
private sessions: Map<string, SessionData> = new Map();
|
||||
private users: Map<string, User> = new Map();
|
||||
private usersByEmail: Map<string, string> = new Map();
|
||||
private passwordHashes: Map<string, string> = new Map();
|
||||
private emailVerified: Map<string, boolean> = new Map();
|
||||
|
||||
async createSession(
|
||||
data: CreateSessionData,
|
||||
): Promise<{ token: string; session: SessionData }> {
|
||||
const token = generateToken();
|
||||
const tokenId = hashToken(token);
|
||||
|
||||
const session: SessionData = {
|
||||
tokenId,
|
||||
userId: data.userId,
|
||||
tokenType: data.tokenType,
|
||||
authMethod: data.authMethod,
|
||||
createdAt: new Date(),
|
||||
expiresAt: data.expiresAt,
|
||||
userAgent: data.userAgent,
|
||||
ipAddress: data.ipAddress,
|
||||
};
|
||||
|
||||
this.sessions.set(tokenId, session);
|
||||
return { token, session };
|
||||
}
|
||||
|
||||
  async getSession(tokenId: TokenId): Promise<SessionData | null> {
    const session = this.sessions.get(tokenId);
    if (!session) {
      return null;
    }

    // Check expiration
    if (new Date() > session.expiresAt) {
      this.sessions.delete(tokenId);
      return null;
    }

    return session;
  }

  async updateLastUsed(tokenId: TokenId): Promise<void> {
    const session = this.sessions.get(tokenId);
    if (session) {
      session.lastUsedAt = new Date();
    }
  }

  async deleteSession(tokenId: TokenId): Promise<void> {
    this.sessions.delete(tokenId);
  }

  async deleteUserSessions(userId: UserId): Promise<number> {
    let count = 0;
    for (const [tokenId, session] of this.sessions) {
      if (session.userId === userId) {
        this.sessions.delete(tokenId);
        count++;
      }
    }
    return count;
  }

  async getUserByEmail(email: string): Promise<User | null> {
    const userId = this.usersByEmail.get(email.toLowerCase());
    if (!userId) {
      return null;
    }
    return this.users.get(userId) ?? null;
  }

  async getUserById(userId: UserId): Promise<User | null> {
    return this.users.get(userId) ?? null;
  }

  async createUser(data: CreateUserData): Promise<User> {
    const user = AuthenticatedUser.create(data.email, {
      displayName: data.displayName,
      status: "pending", // Pending until email verified
    });

    this.users.set(user.id, user);
    this.usersByEmail.set(data.email.toLowerCase(), user.id);
    this.passwordHashes.set(user.id, data.passwordHash);
    this.emailVerified.set(user.id, false);

    return user;
  }

  async getUserPasswordHash(userId: UserId): Promise<string | null> {
    return this.passwordHashes.get(userId) ?? null;
  }

  async setUserPassword(userId: UserId, passwordHash: string): Promise<void> {
    this.passwordHashes.set(userId, passwordHash);
  }

  async updateUserEmailVerified(userId: UserId): Promise<void> {
    this.emailVerified.set(userId, true);

    // Update user status to active
    const user = this.users.get(userId);
    if (user) {
      // Create new user with active status
      const updatedUser = AuthenticatedUser.create(user.email, {
        id: user.id,
        displayName: user.displayName,
        status: "active",
        roles: [...user.roles],
        permissions: [...user.permissions],
      });
      this.users.set(userId, updatedUser);
    }
  }
}
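The getSession method above uses lazy expiry: an entry past its expiresAt is deleted at read time rather than by a background sweep. A minimal sketch of that pattern, with string values standing in for SessionData (an illustrative simplification, not the store's actual types):

```typescript
// Lazy-expiry map: expired entries are evicted on read, not by a timer.
// String values stand in for SessionData to keep the sketch small.
type Entry = { value: string; expiresAt: Date };
const store = new Map<string, Entry>();

function get(key: string): string | null {
  const entry = store.get(key);
  if (!entry) {
    return null;
  }
  if (new Date() > entry.expiresAt) {
    store.delete(key); // evict on read
    return null;
  }
  return entry.value;
}

store.set("live", { value: "a", expiresAt: new Date(Date.now() + 60_000) });
store.set("stale", { value: "b", expiresAt: new Date(Date.now() - 1) });
```

The trade-off is that stale entries linger until someone reads them, which is fine for an in-memory dev store.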
42 express/auth/token.ts Normal file
@@ -0,0 +1,42 @@
// token.ts
//
// Token generation and hashing utilities for authentication.
// Raw tokens are never stored - only their SHA-256 hashes.

import { createHash, randomBytes } from "node:crypto";

const TOKEN_BYTES = 32; // 256 bits of entropy

// Generate a cryptographically secure random token
function generateToken(): string {
  return randomBytes(TOKEN_BYTES).toString("base64url");
}

// Hash token for storage (never store raw tokens)
function hashToken(token: string): string {
  return createHash("sha256").update(token).digest("hex");
}

// Parse token from Authorization header
function parseAuthorizationHeader(header: string | undefined): string | null {
  if (!header) {
    return null;
  }

  const parts = header.split(" ");
  if (parts.length !== 2 || parts[0].toLowerCase() !== "bearer") {
    return null;
  }

  return parts[1];
}

// Cookie name for web sessions
const SESSION_COOKIE_NAME = "diachron_session";

export {
  generateToken,
  hashToken,
  parseAuthorizationHeader,
  SESSION_COOKIE_NAME,
};
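The three helpers above compose into the full bearer-token round trip: issue a raw token, persist only its hash, then recompute the hash from whatever arrives in the Authorization header. A self-contained sketch mirroring those helpers (re-declared locally so the example runs on its own):

```typescript
import { createHash, randomBytes } from "node:crypto";

// Local mirrors of the helpers above, assumed to behave identically.
const generateToken = (): string => randomBytes(32).toString("base64url");
const hashToken = (t: string): string =>
  createHash("sha256").update(t).digest("hex");
const parseAuthorizationHeader = (h: string | undefined): string | null => {
  if (!h) {
    return null;
  }
  const parts = h.split(" ");
  return parts.length === 2 && parts[0].toLowerCase() === "bearer"
    ? parts[1]
    : null;
};

// Issue a token, send it as a bearer header, recover the stored hash.
const token = generateToken();
const stored = hashToken(token); // 64 hex chars; the only thing persisted
const received = parseAuthorizationHeader(`Bearer ${token}`);
const lookupKey = received === null ? null : hashToken(received);
```

Because only the hash is stored, a leaked sessions table does not expose usable tokens; the raw value exists only on the client.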
96 express/auth/types.ts Normal file
@@ -0,0 +1,96 @@
// types.ts
//
// Authentication types and Zod schemas.

import { z } from "zod";

// Branded type for token IDs (the hash, not the raw token)
export type TokenId = string & { readonly __brand: "TokenId" };

// Token types for different purposes
export const tokenTypeParser = z.enum([
  "session",
  "password_reset",
  "email_verify",
]);
export type TokenType = z.infer<typeof tokenTypeParser>;

// Authentication method - how the token was delivered
export const authMethodParser = z.enum(["cookie", "bearer"]);
export type AuthMethod = z.infer<typeof authMethodParser>;

// Session data schema - what gets stored
export const sessionDataParser = z.object({
  tokenId: z.string().min(1),
  userId: z.string().min(1),
  tokenType: tokenTypeParser,
  authMethod: authMethodParser,
  createdAt: z.coerce.date(),
  expiresAt: z.coerce.date(),
  lastUsedAt: z.coerce.date().optional(),
  userAgent: z.string().optional(),
  ipAddress: z.string().optional(),
  isUsed: z.boolean().optional(), // For one-time tokens
});

export type SessionData = z.infer<typeof sessionDataParser>;

// Input validation schemas for auth endpoints
export const loginInputParser = z.object({
  email: z.string().email(),
  password: z.string().min(1),
});

export const registerInputParser = z.object({
  email: z.string().email(),
  password: z.string().min(8),
  displayName: z.string().optional(),
});

export const forgotPasswordInputParser = z.object({
  email: z.string().email(),
});

export const resetPasswordInputParser = z.object({
  token: z.string().min(1),
  password: z.string().min(8),
});

// Token lifetimes in milliseconds
export const tokenLifetimes: Record<TokenType, number> = {
  session: 30 * 24 * 60 * 60 * 1000, // 30 days
  password_reset: 1 * 60 * 60 * 1000, // 1 hour
  email_verify: 24 * 60 * 60 * 1000, // 24 hours
};
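The lifetime table above keeps everything in milliseconds, matching what cookie maxAge and Date arithmetic expect. The same arithmetic in concrete numbers:

```typescript
// The tokenLifetimes arithmetic above, spelled out in milliseconds.
const DAY_MS = 24 * 60 * 60 * 1000; // 86,400,000
const sessionMs = 30 * DAY_MS; // 30 days
const passwordResetMs = 1 * 60 * 60 * 1000; // 1 hour
const emailVerifyMs = DAY_MS; // 24 hours
```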
// Import here to avoid circular dependency at module load time
import type { User } from "../user";

// Session wrapper class providing a consistent interface for handlers.
// Always present on Call (never null), but may represent an anonymous session.
export class Session {
  constructor(
    private readonly data: SessionData | null,
    private readonly user: User,
  ) {}

  getUser(): User {
    return this.user;
  }

  getData(): SessionData | null {
    return this.data;
  }

  isAuthenticated(): boolean {
    return !this.user.isAnonymous();
  }

  get tokenId(): string | undefined {
    return this.data?.tokenId;
  }

  get userId(): string | undefined {
    return this.data?.userId;
  }
}

62 express/basic/login.ts Normal file
@@ -0,0 +1,62 @@
import { SESSION_COOKIE_NAME } from "../auth/token";
import { tokenLifetimes } from "../auth/types";
import { request } from "../request";
import { html, redirect, render } from "../request/util";
import type { Call, Result, Route } from "../types";

const loginHandler = async (call: Call): Promise<Result> => {
  if (call.method === "GET") {
    const c = await render("basic/login", {});
    return html(c);
  }

  // POST - handle login
  const { email, password } = call.request.body;

  if (!email || !password) {
    const c = await render("basic/login", {
      error: "Email and password are required",
      email,
    });
    return html(c);
  }

  const result = await request.auth.login(email, password, "cookie", {
    userAgent: call.request.get("User-Agent"),
    ipAddress: call.request.ip,
  });

  if (!result.success) {
    const c = await render("basic/login", {
      error: result.error,
      email,
    });
    return html(c);
  }

  // Success - set cookie and redirect to home
  const redirectResult = redirect("/");
  redirectResult.cookies = [
    {
      name: SESSION_COOKIE_NAME,
      value: result.token,
      options: {
        httpOnly: true,
        secure: false, // Set to true in production with HTTPS
        sameSite: "lax",
        maxAge: tokenLifetimes.session,
        path: "/",
      },
    },
  ];

  return redirectResult;
};

const loginRoute: Route = {
  path: "/login",
  methods: ["GET", "POST"],
  handler: loginHandler,
};

export { loginRoute };
38 express/basic/logout.ts Normal file
@@ -0,0 +1,38 @@
import { SESSION_COOKIE_NAME } from "../auth/token";
import { request } from "../request";
import { redirect } from "../request/util";
import type { Call, Result, Route } from "../types";

const logoutHandler = async (call: Call): Promise<Result> => {
  // Extract token from cookie and invalidate the session
  const token = request.auth.extractToken(call.request);
  if (token) {
    await request.auth.logout(token);
  }

  // Clear the cookie and redirect to login
  const redirectResult = redirect("/login");
  redirectResult.cookies = [
    {
      name: SESSION_COOKIE_NAME,
      value: "",
      options: {
        httpOnly: true,
        secure: false,
        sameSite: "lax",
        maxAge: 0,
        path: "/",
      },
    },
  ];

  return redirectResult;
};

const logoutRoute: Route = {
  path: "/logout",
  methods: ["GET", "POST"],
  handler: logoutHandler,
};

export { logoutRoute };
43 express/basic/routes.ts Normal file
@@ -0,0 +1,43 @@
import { DateTime } from "ts-luxon";
import { request } from "../request";
import { html, render } from "../request/util";
import type { Call, Result, Route } from "../types";
import { loginRoute } from "./login";
import { logoutRoute } from "./logout";

const routes: Record<string, Route> = {
  hello: {
    path: "/hello",
    methods: ["GET"],
    handler: async (_call: Call): Promise<Result> => {
      const now = DateTime.now();
      const c = await render("basic/hello", { now });

      return html(c);
    },
  },
  home: {
    path: "/",
    methods: ["GET"],
    handler: async (_call: Call): Promise<Result> => {
      const _auth = request.auth;
      const me = request.session.getUser();

      const email = me.toString();
      const showLogin = me.isAnonymous();
      const showLogout = !me.isAnonymous();

      const c = await render("basic/home", {
        email,
        showLogin,
        showLogout,
      });

      return html(c);
    },
  },
  login: loginRoute,
  logout: logoutRoute,
};

export { routes };
@@ -17,7 +17,10 @@
  "linter": {
    "enabled": true,
    "rules": {
      "recommended": true
      "recommended": true,
      "style": {
        "useBlockStatements": "error"
      }
    }
  },
  "javascript": {
@@ -6,6 +6,10 @@ const { values } = parseArgs({
|
||||
type: "string",
|
||||
short: "l",
|
||||
},
|
||||
"log-address": {
|
||||
type: "string",
|
||||
default: "8085",
|
||||
},
|
||||
},
|
||||
strict: true,
|
||||
allowPositionals: false,
|
||||
@@ -16,7 +20,7 @@ function parseListenAddress(listen: string | undefined): {
|
||||
port: number;
|
||||
} {
|
||||
const defaultHost = "127.0.0.1";
|
||||
const defaultPort = 3000;
|
||||
const defaultPort = 3500;
|
||||
|
||||
if (!listen) {
|
||||
return { host: defaultHost, port: defaultPort };
|
||||
@@ -26,7 +30,7 @@ function parseListenAddress(listen: string | undefined): {
|
||||
if (lastColon === -1) {
|
||||
// Just a port number
|
||||
const port = parseInt(listen, 10);
|
||||
if (isNaN(port)) {
|
||||
if (Number.isNaN(port)) {
|
||||
throw new Error(`Invalid listen address: ${listen}`);
|
||||
}
|
||||
return { host: defaultHost, port };
|
||||
@@ -35,7 +39,7 @@ function parseListenAddress(listen: string | undefined): {
|
||||
const host = listen.slice(0, lastColon);
|
||||
const port = parseInt(listen.slice(lastColon + 1), 10);
|
||||
|
||||
if (isNaN(port)) {
|
||||
if (Number.isNaN(port)) {
|
||||
throw new Error(`Invalid port in listen address: ${listen}`);
|
||||
}
|
||||
|
||||
@@ -43,7 +47,9 @@ function parseListenAddress(listen: string | undefined): {
|
||||
}
|
||||
|
||||
const listenAddress = parseListenAddress(values.listen);
|
||||
const logAddress = parseListenAddress(values["log-address"]);
|
||||
|
||||
export const cli = {
|
||||
listen: listenAddress,
|
||||
logAddress,
|
||||
};
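The hunks above swap the global isNaN for Number.isNaN. The distinction: Number.isNaN checks whether the value is literally NaN, while the global isNaN coerces its argument to a number first, so arbitrary non-numeric strings also report true. A small sketch of the difference (the casts are only to satisfy the stricter type signature):

```typescript
// Global isNaN coerces, Number.isNaN does not.
const loose = isNaN("8080px" as unknown as number); // coerces to NaN, so true
const strict = Number.isNaN("8080px" as unknown as number); // not the NaN value, so false

// For a parseInt result the two agree, so the swap is behavior-preserving here.
const port = parseInt("abc", 10); // NaN
const agree = isNaN(port) === Number.isNaN(port);
```

Number.isNaN is the safer default because it can never misclassify a non-number input.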
@@ -1,4 +1,4 @@
import { Extensible } from "./interfaces";
// This file belongs to the framework. You are not expected to modify it.

export type ContentType = string;

27 express/context.ts Normal file
@@ -0,0 +1,27 @@
// context.ts
//
// Request-scoped context using AsyncLocalStorage.
// Allows services to access request data (like the current user) without
// needing to pass Call through every function.

import { AsyncLocalStorage } from "node:async_hooks";
import { anonymousUser, type User } from "./user";

type RequestContext = {
  user: User;
};

const asyncLocalStorage = new AsyncLocalStorage<RequestContext>();

// Run a function within a request context
function runWithContext<T>(context: RequestContext, fn: () => T): T {
  return asyncLocalStorage.run(context, fn);
}

// Get the current user from context, or AnonymousUser if not in a request
function getCurrentUser(): User {
  const context = asyncLocalStorage.getStore();
  return context?.user ?? anonymousUser;
}

export { getCurrentUser, runWithContext, type RequestContext };
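The pattern above lets any code running inside a request see that request's user without parameter threading. A self-contained sketch with a plain string in place of the User type (a simplification for illustration):

```typescript
import { AsyncLocalStorage } from "node:async_hooks";

// Same shape as the module above, with a string user for brevity.
type Ctx = { user: string };
const als = new AsyncLocalStorage<Ctx>();

const runWithContext = <T>(ctx: Ctx, fn: () => T): T => als.run(ctx, fn);
const getCurrentUser = (): string => als.getStore()?.user ?? "anonymous";

// Outside any request scope, the anonymous fallback applies.
const outside = getCurrentUser();

// Inside a scope, arbitrarily nested calls see the same context.
const inside = runWithContext({ user: "alice" }, () => getCurrentUser());
```

The key property is that the store is scoped to the async execution tree started by run, so concurrent requests cannot observe each other's context.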
48 express/core/index.ts Normal file
@@ -0,0 +1,48 @@
import nunjucks from "nunjucks";
import { db, migrate, migrationStatus } from "../database";
import { getLogs, log } from "../logging";

// FIXME: This doesn't belong here; move it somewhere else.
const conf = {
  templateEngine: () => {
    return {
      renderTemplate: (template: string, context: object) => {
        return nunjucks.renderString(template, context);
      },
    };
  },
};

const database = {
  db,
  migrate,
  migrationStatus,
};

const logging = {
  log,
  getLogs,
};

const random = {
  randomNumber: () => {
    return Math.random();
  },
};

const misc = {
  sleep: (ms: number) => {
    return new Promise((resolve) => setTimeout(resolve, ms));
  },
};

// Keep this asciibetically sorted
const core = {
  conf,
  database,
  logging,
  misc,
  random,
};

export { core };
548 express/database.ts Normal file
@@ -0,0 +1,548 @@
// database.ts
// PostgreSQL database access with Kysely query builder and simple migrations

import * as fs from "node:fs";
import * as path from "node:path";
import {
  type Generated,
  Kysely,
  PostgresDialect,
  type Selectable,
  sql,
} from "kysely";
import { Pool } from "pg";
import type {
  AuthStore,
  CreateSessionData,
  CreateUserData,
} from "./auth/store";
import { generateToken, hashToken } from "./auth/token";
import type { SessionData, TokenId } from "./auth/types";
import type { Domain } from "./types";
import { AuthenticatedUser, type User, type UserId } from "./user";

// Connection configuration
const connectionConfig = {
  host: "localhost",
  port: 5432,
  user: "diachron",
  password: "diachron",
  database: "diachron",
};

// Database schema types for Kysely
// Generated<T> marks columns with database defaults (optional on insert)
interface UsersTable {
  id: string;
  status: Generated<string>;
  display_name: string | null;
  created_at: Generated<Date>;
  updated_at: Generated<Date>;
}

interface UserEmailsTable {
  id: string;
  user_id: string;
  email: string;
  normalized_email: string;
  is_primary: Generated<boolean>;
  is_verified: Generated<boolean>;
  created_at: Generated<Date>;
  verified_at: Date | null;
  revoked_at: Date | null;
}

interface UserCredentialsTable {
  id: string;
  user_id: string;
  credential_type: Generated<string>;
  password_hash: string | null;
  created_at: Generated<Date>;
  updated_at: Generated<Date>;
}

interface SessionsTable {
  id: Generated<string>;
  token_hash: string;
  user_id: string;
  user_email_id: string | null;
  token_type: string;
  auth_method: string;
  created_at: Generated<Date>;
  expires_at: Date;
  revoked_at: Date | null;
  ip_address: string | null;
  user_agent: string | null;
  is_used: Generated<boolean | null>;
}

interface Database {
  users: UsersTable;
  user_emails: UserEmailsTable;
  user_credentials: UserCredentialsTable;
  sessions: SessionsTable;
}

// Create the connection pool
const pool = new Pool(connectionConfig);

// Create the Kysely instance
const db = new Kysely<Database>({
  dialect: new PostgresDialect({ pool }),
});

// Raw pool access for when you need it
const rawPool = pool;

// Execute raw SQL (for when Kysely doesn't fit)
async function raw<T = unknown>(
  query: string,
  params: unknown[] = [],
): Promise<T[]> {
  const result = await pool.query(query, params);
  return result.rows as T[];
}

// ============================================================================
// Migrations
// ============================================================================

// Migration file naming convention:
//   yyyy-mm-dd_ss_description.sql
//   e.g., 2025-01-15_01_initial.sql, 2025-01-15_02_add_users.sql
//
// Migrations directory: express/migrations/

const FRAMEWORK_MIGRATIONS_DIR = path.join(__dirname, "framework/migrations");
const APP_MIGRATIONS_DIR = path.join(__dirname, "migrations");
const MIGRATIONS_TABLE = "_migrations";

interface MigrationRecord {
  id: number;
  name: string;
  applied_at: Date;
}

// Ensure migrations table exists
async function ensureMigrationsTable(): Promise<void> {
  await pool.query(`
    CREATE TABLE IF NOT EXISTS ${MIGRATIONS_TABLE} (
      id SERIAL PRIMARY KEY,
      name TEXT NOT NULL UNIQUE,
      applied_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
    )
  `);
}

// Get list of applied migrations
async function getAppliedMigrations(): Promise<string[]> {
  const result = await pool.query<MigrationRecord>(
    `SELECT name FROM ${MIGRATIONS_TABLE} ORDER BY name`,
  );
  return result.rows.map((r) => r.name);
}

// Get pending migration files
function getMigrationFiles(kind: Domain): string[] {
  const dir = kind === "fw" ? FRAMEWORK_MIGRATIONS_DIR : APP_MIGRATIONS_DIR;

  if (!fs.existsSync(dir)) {
    return [];
  }

  const root = __dirname;
  const mm = fs
    .readdirSync(dir)
    .filter((f) => f.endsWith(".sql"))
    .filter((f) => /^\d{4}-\d{2}-\d{2}_\d{2}_/.test(f))
    .map((f) => `${dir}/${f}`)
    .map((f) => f.replace(`${root}/`, ""))
    .sort();

  return mm;
}
// Run a single migration
async function runMigration(filename: string): Promise<void> {
  // const filepath = path.join(MIGRATIONS_DIR, filename);
  const filepath = filename;
  const content = fs.readFileSync(filepath, "utf-8");
  process.stdout.write(`  Migration: ${filename}...`);

  // Run migration in a transaction
  const client = await pool.connect();
  try {
    await client.query("BEGIN");
    await client.query(content);
    await client.query(
      `INSERT INTO ${MIGRATIONS_TABLE} (name) VALUES ($1)`,
      [filename],
    );
    await client.query("COMMIT");
    console.log(" ✓");
  } catch (err) {
    console.log(" ✗");
    const message = err instanceof Error ? err.message : String(err);
    console.error(`    Error: ${message}`);
    await client.query("ROLLBACK");
    throw err;
  } finally {
    client.release();
  }
}
function getAllMigrationFiles() {
  const fw_files = getMigrationFiles("fw");
  const app_files = getMigrationFiles("app");
  const all = [...fw_files, ...app_files];

  return all;
}

// Run all pending migrations
async function migrate(): Promise<void> {
  await ensureMigrationsTable();

  const applied = new Set(await getAppliedMigrations());
  const all = getAllMigrationFiles();
  const pending = all.filter((f) => !applied.has(f));

  if (pending.length === 0) {
    console.log("No pending migrations");
    return;
  }

  console.log(`Applying ${pending.length} migration(s):`);
  for (const file of pending) {
    await runMigration(file);
  }
}
// List migration status
async function migrationStatus(): Promise<{
  applied: string[];
  pending: string[];
}> {
  await ensureMigrationsTable();
  const applied = new Set(await getAppliedMigrations());
  const ff = getAllMigrationFiles();
  return {
    applied: ff.filter((f) => applied.has(f)),
    pending: ff.filter((f) => !applied.has(f)),
  };
}
// ============================================================================
// PostgresAuthStore - Database-backed authentication storage
// ============================================================================

class PostgresAuthStore implements AuthStore {
  // Session operations

  async createSession(
    data: CreateSessionData,
  ): Promise<{ token: string; session: SessionData }> {
    const token = generateToken();
    const tokenHash = hashToken(token);

    const row = await db
      .insertInto("sessions")
      .values({
        token_hash: tokenHash,
        user_id: data.userId,
        token_type: data.tokenType,
        auth_method: data.authMethod,
        expires_at: data.expiresAt,
        user_agent: data.userAgent ?? null,
        ip_address: data.ipAddress ?? null,
      })
      .returningAll()
      .executeTakeFirstOrThrow();

    const session: SessionData = {
      tokenId: row.token_hash,
      userId: row.user_id,
      tokenType: row.token_type as SessionData["tokenType"],
      authMethod: row.auth_method as SessionData["authMethod"],
      createdAt: row.created_at,
      expiresAt: row.expires_at,
      userAgent: row.user_agent ?? undefined,
      ipAddress: row.ip_address ?? undefined,
      isUsed: row.is_used ?? undefined,
    };

    return { token, session };
  }

  async getSession(tokenId: TokenId): Promise<SessionData | null> {
    const row = await db
      .selectFrom("sessions")
      .selectAll()
      .where("token_hash", "=", tokenId)
      .where("expires_at", ">", new Date())
      .where("revoked_at", "is", null)
      .executeTakeFirst();

    if (!row) {
      return null;
    }

    return {
      tokenId: row.token_hash,
      userId: row.user_id,
      tokenType: row.token_type as SessionData["tokenType"],
      authMethod: row.auth_method as SessionData["authMethod"],
      createdAt: row.created_at,
      expiresAt: row.expires_at,
      userAgent: row.user_agent ?? undefined,
      ipAddress: row.ip_address ?? undefined,
      isUsed: row.is_used ?? undefined,
    };
  }

  async updateLastUsed(_tokenId: TokenId): Promise<void> {
    // The new schema doesn't have last_used_at column
    // This is now a no-op; session activity tracking could be added later
  }

  async deleteSession(tokenId: TokenId): Promise<void> {
    // Soft delete by setting revoked_at
    await db
      .updateTable("sessions")
      .set({ revoked_at: new Date() })
      .where("token_hash", "=", tokenId)
      .execute();
  }

  async deleteUserSessions(userId: UserId): Promise<number> {
    const result = await db
      .updateTable("sessions")
      .set({ revoked_at: new Date() })
      .where("user_id", "=", userId)
      .where("revoked_at", "is", null)
      .executeTakeFirst();

    return Number(result.numUpdatedRows);
  }

  // User operations

  async getUserByEmail(email: string): Promise<User | null> {
    // Find user through user_emails table
    const normalizedEmail = email.toLowerCase().trim();

    const row = await db
      .selectFrom("user_emails")
      .innerJoin("users", "users.id", "user_emails.user_id")
      .select([
        "users.id",
        "users.status",
        "users.display_name",
        "users.created_at",
        "users.updated_at",
        "user_emails.email",
      ])
      .where("user_emails.normalized_email", "=", normalizedEmail)
      .where("user_emails.revoked_at", "is", null)
      .executeTakeFirst();

    if (!row) {
      return null;
    }
    return this.rowToUser(row);
  }

  async getUserById(userId: UserId): Promise<User | null> {
    // Get user with their primary email
    const row = await db
      .selectFrom("users")
      .leftJoin("user_emails", (join) =>
        join
          .onRef("user_emails.user_id", "=", "users.id")
          .on("user_emails.is_primary", "=", true)
          .on("user_emails.revoked_at", "is", null),
      )
      .select([
        "users.id",
        "users.status",
        "users.display_name",
        "users.created_at",
        "users.updated_at",
        "user_emails.email",
      ])
      .where("users.id", "=", userId)
      .executeTakeFirst();

    if (!row) {
      return null;
    }
    return this.rowToUser(row);
  }

  async createUser(data: CreateUserData): Promise<User> {
    const userId = crypto.randomUUID();
    const emailId = crypto.randomUUID();
    const credentialId = crypto.randomUUID();
    const now = new Date();
    const normalizedEmail = data.email.toLowerCase().trim();

    // Create user record
    await db
      .insertInto("users")
      .values({
        id: userId,
        display_name: data.displayName ?? null,
        status: "pending",
        created_at: now,
        updated_at: now,
      })
      .execute();

    // Create user_email record
    await db
      .insertInto("user_emails")
      .values({
        id: emailId,
        user_id: userId,
        email: data.email,
        normalized_email: normalizedEmail,
        is_primary: true,
        is_verified: false,
        created_at: now,
      })
      .execute();

    // Create user_credential record
    await db
      .insertInto("user_credentials")
      .values({
        id: credentialId,
        user_id: userId,
        credential_type: "password",
        password_hash: data.passwordHash,
        created_at: now,
        updated_at: now,
      })
      .execute();

    return new AuthenticatedUser({
      id: userId,
      email: data.email,
      displayName: data.displayName,
      status: "pending",
      roles: [],
      permissions: [],
      createdAt: now,
      updatedAt: now,
    });
  }

  async getUserPasswordHash(userId: UserId): Promise<string | null> {
    const row = await db
      .selectFrom("user_credentials")
      .select("password_hash")
      .where("user_id", "=", userId)
      .where("credential_type", "=", "password")
      .executeTakeFirst();

    return row?.password_hash ?? null;
  }

  async setUserPassword(userId: UserId, passwordHash: string): Promise<void> {
    const now = new Date();

    // Try to update existing credential
    const result = await db
      .updateTable("user_credentials")
      .set({ password_hash: passwordHash, updated_at: now })
      .where("user_id", "=", userId)
      .where("credential_type", "=", "password")
      .executeTakeFirst();

    // If no existing credential, create one
    if (Number(result.numUpdatedRows) === 0) {
      await db
        .insertInto("user_credentials")
        .values({
          id: crypto.randomUUID(),
          user_id: userId,
          credential_type: "password",
          password_hash: passwordHash,
          created_at: now,
          updated_at: now,
        })
        .execute();
    }

    // Update user's updated_at
    await db
      .updateTable("users")
      .set({ updated_at: now })
      .where("id", "=", userId)
      .execute();
  }

  async updateUserEmailVerified(userId: UserId): Promise<void> {
    const now = new Date();

    // Update user_emails to mark as verified
    await db
      .updateTable("user_emails")
      .set({
        is_verified: true,
        verified_at: now,
      })
      .where("user_id", "=", userId)
      .where("is_primary", "=", true)
      .execute();

    // Update user status to active
    await db
      .updateTable("users")
      .set({
        status: "active",
        updated_at: now,
      })
      .where("id", "=", userId)
      .execute();
  }

  // Helper to convert database row to User object
  private rowToUser(row: {
    id: string;
    status: string;
    display_name: string | null;
    created_at: Date;
    updated_at: Date;
    email: string | null;
  }): User {
    return new AuthenticatedUser({
      id: row.id,
      email: row.email ?? "unknown@example.com",
      displayName: row.display_name ?? undefined,
      status: row.status as "active" | "suspended" | "pending",
      roles: [], // TODO: query from RBAC tables
      permissions: [], // TODO: query from RBAC tables
      createdAt: row.created_at,
      updatedAt: row.updated_at,
    });
  }
}

// ============================================================================
// Exports
// ============================================================================

export {
  db,
  raw,
  rawPool,
  pool,
  migrate,
  migrationStatus,
  connectionConfig,
  PostgresAuthStore,
  type Database,
};
17 express/develop/clear-db.ts Normal file
@@ -0,0 +1,17 @@
import { pool } from "../database";
import { dropTables, exitIfUnforced } from "./util";

async function main(): Promise<void> {
  exitIfUnforced();

  try {
    await dropTables();
  } finally {
    await pool.end();
  }
}

main().catch((err) => {
  console.error("Failed to clear database:", err.message);
  process.exit(1);
});
26 express/develop/reset-db.ts Normal file
@@ -0,0 +1,26 @@
// reset-db.ts
// Development command to wipe the database and apply all migrations from scratch

import { migrate, pool } from "../database";
import { dropTables, exitIfUnforced } from "./util";

async function main(): Promise<void> {
  exitIfUnforced();

  try {
    await dropTables();

    console.log("");
    await migrate();

    console.log("");
    console.log("Database reset complete.");
  } finally {
    await pool.end();
  }
}

main().catch((err) => {
  console.error("Failed to reset database:", err.message);
  process.exit(1);
});
42 express/develop/util.ts Normal file
@@ -0,0 +1,42 @@
// FIXME: this is at the wrong level of specificity

import { connectionConfig, pool } from "../database";

const exitIfUnforced = () => {
  const args = process.argv.slice(2);

  // Require explicit confirmation unless --force is passed
  if (!args.includes("--force")) {
    console.error("This will DROP ALL TABLES in the database!");
    console.error(`  Database: ${connectionConfig.database}`);
    console.error(
      `  Host: ${connectionConfig.host}:${connectionConfig.port}`,
    );
    console.error("");
    console.error("Run with --force to proceed.");
    process.exit(1);
  }
};

const dropTables = async () => {
  console.log("Dropping all tables...");

  // Get all table names in the public schema
  const result = await pool.query<{ tablename: string }>(`
    SELECT tablename FROM pg_tables
    WHERE schemaname = 'public'
  `);

  if (result.rows.length > 0) {
    // Drop all tables with CASCADE to handle foreign key constraints
    const tableNames = result.rows
      .map((r) => `"${r.tablename}"`)
      .join(", ");
    await pool.query(`DROP TABLE IF EXISTS ${tableNames} CASCADE`);
    console.log(`Dropped ${result.rows.length} table(s)`);
  } else {
    console.log("No tables to drop");
  }
};

export { dropTables, exitIfUnforced };
13
express/execution-context-schema.ts
Normal file
13
express/execution-context-schema.ts
Normal file
@@ -0,0 +1,13 @@
|
||||
import { z } from "zod";
|
||||
|
||||
export const executionContextSchema = z.object({
|
||||
diachron_root: z.string(),
|
||||
});
|
||||
|
||||
export type ExecutionContext = z.infer<typeof executionContextSchema>;
|
||||
|
||||
export function parseExecutionContext(
|
||||
env: Record<string, string | undefined>,
|
||||
): ExecutionContext {
|
||||
return executionContextSchema.parse(env);
|
||||
}
|
||||
38
express/execution-context.spec.ts
Normal file
38
express/execution-context.spec.ts
Normal file
@@ -0,0 +1,38 @@
|
||||
import assert from "node:assert/strict";
|
||||
import { describe, it } from "node:test";
|
||||
import { ZodError } from "zod";
|
||||
|
||||
import {
|
||||
executionContextSchema,
|
||||
parseExecutionContext,
|
||||
} from "./execution-context-schema";
|
||||
|
||||
describe("parseExecutionContext", () => {
|
||||
it("parses valid executionContext with diachron_root", () => {
|
||||
const env = { diachron_root: "/some/path" };
|
||||
const result = parseExecutionContext(env);
|
||||
assert.deepEqual(result, { diachron_root: "/some/path" });
|
||||
});
|
||||
|
||||
it("throws ZodError when diachron_root is missing", () => {
|
||||
const env = {};
|
||||
assert.throws(() => parseExecutionContext(env), ZodError);
|
||||
});
|
||||
|
||||
it("strips extra fields not in schema", () => {
|
||||
const env = {
|
||||
diachron_root: "/some/path",
|
||||
EXTRA_VAR: "should be stripped",
|
||||
};
|
||||
const result = parseExecutionContext(env);
|
||||
assert.deepEqual(result, { diachron_root: "/some/path" });
|
||||
assert.equal("EXTRA_VAR" in result, false);
|
||||
});
|
||||
});
|
||||
|
||||
describe("executionContextSchema", () => {
|
||||
it("requires diachron_root to be a string", () => {
|
||||
const result = executionContextSchema.safeParse({ diachron_root: 123 });
|
||||
assert.equal(result.success, false);
|
||||
});
|
||||
});
|
||||
5
express/execution-context.ts
Normal file
5
express/execution-context.ts
Normal file
@@ -0,0 +1,5 @@
|
||||
import { parseExecutionContext } from "./execution-context-schema";
|
||||
|
||||
const executionContext = parseExecutionContext(process.env);
|
||||
|
||||
export { executionContext };
|
||||
29
express/framework/migrations/2026-01-01_01-users.sql
Normal file
29
express/framework/migrations/2026-01-01_01-users.sql
Normal file
@@ -0,0 +1,29 @@
|
||||
-- 0001_users.sql
|
||||
-- Create users table for authentication
|
||||
|
||||
CREATE TABLE users (
|
||||
id UUID PRIMARY KEY,
|
||||
status TEXT NOT NULL DEFAULT 'active',
|
||||
display_name TEXT,
|
||||
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
|
||||
updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
|
||||
);
|
||||
|
||||
CREATE TABLE user_emails (
|
||||
id UUID PRIMARY KEY,
|
||||
user_id UUID NOT NULL REFERENCES users(id),
|
||||
email TEXT NOT NULL,
|
||||
normalized_email TEXT NOT NULL,
|
||||
is_primary BOOLEAN NOT NULL DEFAULT FALSE,
|
||||
is_verified BOOLEAN NOT NULL DEFAULT FALSE,
|
||||
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
|
||||
verified_at TIMESTAMPTZ,
|
||||
revoked_at TIMESTAMPTZ
|
||||
);
|
||||
|
||||
-- Enforce uniqueness only among *active* emails
|
||||
CREATE UNIQUE INDEX user_emails_unique_active
|
||||
ON user_emails (normalized_email)
|
||||
WHERE revoked_at IS NULL;
|
||||
|
||||
|
||||
26
express/framework/migrations/2026-01-01_02-sessions.sql
Normal file
26
express/framework/migrations/2026-01-01_02-sessions.sql
Normal file
@@ -0,0 +1,26 @@
|
||||
-- 0002_sessions.sql
|
||||
-- Create sessions table for auth tokens
|
||||
|
||||
CREATE TABLE sessions (
|
||||
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
|
||||
token_hash TEXT UNIQUE NOT NULL,
|
||||
user_id UUID NOT NULL REFERENCES users(id),
|
||||
user_email_id UUID REFERENCES user_emails(id),
|
||||
token_type TEXT NOT NULL,
|
||||
auth_method TEXT NOT NULL,
|
||||
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
|
||||
expires_at TIMESTAMPTZ NOT NULL,
|
||||
revoked_at TIMESTAMPTZ,
|
||||
ip_address INET,
|
||||
user_agent TEXT,
|
||||
is_used BOOLEAN DEFAULT FALSE
|
||||
);
|
||||
|
||||
-- Index for user session lookups (logout all, etc.)
|
||||
CREATE INDEX sessions_user_id_idx ON sessions (user_id);
|
||||
|
||||
-- Index for expiration cleanup
|
||||
CREATE INDEX sessions_expires_at_idx ON sessions (expires_at);
|
||||
|
||||
-- Index for token type filtering
|
||||
CREATE INDEX sessions_token_type_idx ON sessions (token_type);
|
||||
@@ -0,0 +1,17 @@
|
||||
-- 0003_user_credentials.sql
|
||||
-- Create user_credentials table for password storage (extensible for other auth methods)
|
||||
|
||||
CREATE TABLE user_credentials (
|
||||
id UUID PRIMARY KEY,
|
||||
user_id UUID NOT NULL REFERENCES users(id),
|
||||
credential_type TEXT NOT NULL DEFAULT 'password',
|
||||
password_hash TEXT,
|
||||
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
|
||||
updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
|
||||
);
|
||||
|
||||
-- Each user can have at most one credential per type
|
||||
CREATE UNIQUE INDEX user_credentials_user_type_idx ON user_credentials (user_id, credential_type);
|
||||
|
||||
-- Index for user lookups
|
||||
CREATE INDEX user_credentials_user_id_idx ON user_credentials (user_id);
|
||||
@@ -0,0 +1,20 @@
|
||||
CREATE TABLE roles (
|
||||
id UUID PRIMARY KEY,
|
||||
name TEXT UNIQUE NOT NULL,
|
||||
description TEXT
|
||||
);
|
||||
|
||||
CREATE TABLE groups (
|
||||
id UUID PRIMARY KEY,
|
||||
name TEXT NOT NULL,
|
||||
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
|
||||
);
|
||||
|
||||
CREATE TABLE user_group_roles (
|
||||
user_id UUID NOT NULL REFERENCES users(id),
|
||||
group_id UUID NOT NULL REFERENCES groups(id),
|
||||
role_id UUID NOT NULL REFERENCES roles(id),
|
||||
granted_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
|
||||
revoked_at TIMESTAMPTZ,
|
||||
PRIMARY KEY (user_id, group_id, role_id)
|
||||
);
|
||||
14
express/framework/migrations/2026-01-24_02-capabilities.sql
Normal file
14
express/framework/migrations/2026-01-24_02-capabilities.sql
Normal file
@@ -0,0 +1,14 @@
|
||||
CREATE TABLE capabilities (
|
||||
id UUID PRIMARY KEY,
|
||||
name TEXT UNIQUE NOT NULL,
|
||||
description TEXT
|
||||
);
|
||||
|
||||
CREATE TABLE role_capabilities (
|
||||
role_id UUID NOT NULL REFERENCES roles(id),
|
||||
capability_id UUID NOT NULL REFERENCES capabilities(id),
|
||||
granted_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
|
||||
revoked_at TIMESTAMPTZ,
|
||||
PRIMARY KEY (role_id, capability_id)
|
||||
);
|
||||
|
||||
@@ -1,11 +1,11 @@
|
||||
import { contentTypes } from "./content-types";
|
||||
import { core } from "./core";
|
||||
import { httpCodes } from "./http-codes";
|
||||
import { services } from "./services";
|
||||
import type { Call, Handler, Result } from "./types";
|
||||
|
||||
const multiHandler: Handler = async (call: Call): Promise<Result> => {
|
||||
const code = httpCodes.success.OK;
|
||||
const rn = services.random.randomNumber();
|
||||
const rn = core.random.randomNumber();
|
||||
|
||||
const retval: Result = {
|
||||
code,
|
||||
|
||||
@@ -1,4 +1,4 @@
|
||||
import { Extensible } from "./interfaces";
|
||||
// This file belongs to the framework. You are not expected to modify it.
|
||||
|
||||
export type HttpCode = {
|
||||
code: number;
|
||||
|
||||
@@ -1,5 +1,7 @@
|
||||
// internal-logging.ts
|
||||
|
||||
import { cli } from "./cli";
|
||||
|
||||
// FIXME: Move this to somewhere more appropriate
|
||||
type AtLeastOne<T> = [T, ...T[]];
|
||||
|
||||
@@ -13,8 +15,8 @@ type Message = {
|
||||
text: AtLeastOne<string>;
|
||||
};
|
||||
|
||||
const m1: Message = { timestamp: 123, source: "logging", text: ["foo"] };
|
||||
const m2: Message = {
|
||||
const _m1: Message = { timestamp: 123, source: "logging", text: ["foo"] };
|
||||
const _m2: Message = {
|
||||
timestamp: 321,
|
||||
source: "diagnostic",
|
||||
text: ["ok", "whatever"],
|
||||
@@ -30,12 +32,39 @@ type FilterArgument = {
|
||||
match?: (string | RegExp)[];
|
||||
};
|
||||
|
||||
const log = (_message: Message) => {
|
||||
// WRITEME
|
||||
const loggerUrl = `http://${cli.logAddress.host}:${cli.logAddress.port}`;
|
||||
|
||||
const log = (message: Message) => {
|
||||
const payload = {
|
||||
timestamp: message.timestamp ?? Date.now(),
|
||||
source: message.source,
|
||||
text: message.text,
|
||||
};
|
||||
|
||||
fetch(`${loggerUrl}/log`, {
|
||||
method: "POST",
|
||||
headers: { "Content-Type": "application/json" },
|
||||
body: JSON.stringify(payload),
|
||||
}).catch((err) => {
|
||||
console.error("[logging] Failed to send log:", err.message);
|
||||
});
|
||||
};
|
||||
|
||||
const getLogs = (filter: FilterArgument) => {
|
||||
// WRITEME
|
||||
const getLogs = async (filter: FilterArgument): Promise<Message[]> => {
|
||||
const params = new URLSearchParams();
|
||||
if (filter.limit) {
|
||||
params.set("limit", String(filter.limit));
|
||||
}
|
||||
if (filter.before) {
|
||||
params.set("before", String(filter.before));
|
||||
}
|
||||
if (filter.after) {
|
||||
params.set("after", String(filter.after));
|
||||
}
|
||||
|
||||
const url = `${loggerUrl}/logs?${params.toString()}`;
|
||||
const response = await fetch(url);
|
||||
return response.json();
|
||||
};
|
||||
|
||||
// FIXME: there's scope for more specialized functions although they
|
||||
|
||||
69
express/mgmt/add-user.ts
Normal file
69
express/mgmt/add-user.ts
Normal file
@@ -0,0 +1,69 @@
|
||||
// add-user.ts
|
||||
// Management command to create users from the command line
|
||||
|
||||
import { hashPassword } from "../auth/password";
|
||||
import { PostgresAuthStore, pool } from "../database";
|
||||
|
||||
async function main(): Promise<void> {
|
||||
const args = process.argv.slice(2);
|
||||
|
||||
if (args.length < 2) {
|
||||
console.error(
|
||||
"Usage: ./mgmt add-user <email> <password> [--display-name <name>] [--active]",
|
||||
);
|
||||
process.exit(1);
|
||||
}
|
||||
|
||||
const email = args[0];
|
||||
const password = args[1];
|
||||
|
||||
// Parse optional flags
|
||||
let displayName: string | undefined;
|
||||
let makeActive = false;
|
||||
|
||||
for (let i = 2; i < args.length; i++) {
|
||||
if (args[i] === "--display-name" && args[i + 1]) {
|
||||
displayName = args[i + 1];
|
||||
i++;
|
||||
} else if (args[i] === "--active") {
|
||||
makeActive = true;
|
||||
}
|
||||
}
|
||||
|
||||
try {
|
||||
const store = new PostgresAuthStore();
|
||||
|
||||
// Check if user already exists
|
||||
const existing = await store.getUserByEmail(email);
|
||||
if (existing) {
|
||||
console.error(`Error: User with email '${email}' already exists`);
|
||||
process.exit(1);
|
||||
}
|
||||
|
||||
// Hash password and create user
|
||||
const passwordHash = await hashPassword(password);
|
||||
const user = await store.createUser({
|
||||
email,
|
||||
passwordHash,
|
||||
displayName,
|
||||
});
|
||||
|
||||
// Optionally activate user immediately
|
||||
if (makeActive) {
|
||||
await store.updateUserEmailVerified(user.id);
|
||||
console.log(
|
||||
`Created and activated user: ${user.email} (${user.id})`,
|
||||
);
|
||||
} else {
|
||||
console.log(`Created user: ${user.email} (${user.id})`);
|
||||
console.log(" Status: pending (use --active to create as active)");
|
||||
}
|
||||
} finally {
|
||||
await pool.end();
|
||||
}
|
||||
}
|
||||
|
||||
main().catch((err) => {
|
||||
console.error("Failed to create user:", err.message);
|
||||
process.exit(1);
|
||||
});
|
||||
45
express/migrate.ts
Normal file
45
express/migrate.ts
Normal file
@@ -0,0 +1,45 @@
|
||||
// migrate.ts
|
||||
// CLI script for running database migrations
|
||||
|
||||
import { migrate, migrationStatus, pool } from "./database";
|
||||
|
||||
async function main(): Promise<void> {
|
||||
const command = process.argv[2] || "run";
|
||||
|
||||
try {
|
||||
switch (command) {
|
||||
case "run":
|
||||
await migrate();
|
||||
break;
|
||||
|
||||
case "status": {
|
||||
const status = await migrationStatus();
|
||||
console.log("Applied migrations:");
|
||||
for (const name of status.applied) {
|
||||
console.log(` ✓ ${name}`);
|
||||
}
|
||||
if (status.pending.length > 0) {
|
||||
console.log("\nPending migrations:");
|
||||
for (const name of status.pending) {
|
||||
console.log(` • ${name}`);
|
||||
}
|
||||
} else {
|
||||
console.log("\nNo pending migrations");
|
||||
}
|
||||
break;
|
||||
}
|
||||
|
||||
default:
|
||||
console.error(`Unknown command: ${command}`);
|
||||
console.error("Usage: migrate [run|status]");
|
||||
process.exit(1);
|
||||
}
|
||||
} finally {
|
||||
await pool.end();
|
||||
}
|
||||
}
|
||||
|
||||
main().catch((err) => {
|
||||
console.error("Migration failed:", err);
|
||||
process.exit(1);
|
||||
});
|
||||
@@ -5,7 +5,6 @@
|
||||
"main": "index.js",
|
||||
"scripts": {
|
||||
"test": "echo \"Error: no test specified\" && exit 1",
|
||||
"prettier": "prettier",
|
||||
"nodemon": "nodemon dist/index.js"
|
||||
},
|
||||
"keywords": [],
|
||||
@@ -13,37 +12,25 @@
|
||||
"license": "ISC",
|
||||
"packageManager": "pnpm@10.12.4",
|
||||
"dependencies": {
|
||||
"@ianvs/prettier-plugin-sort-imports": "^4.7.0",
|
||||
"@types/node": "^24.10.1",
|
||||
"@types/nunjucks": "^3.2.6",
|
||||
"@vercel/ncc": "^0.38.4",
|
||||
"express": "^5.1.0",
|
||||
"kysely": "^0.28.9",
|
||||
"nodemon": "^3.1.11",
|
||||
"nunjucks": "^3.2.4",
|
||||
"path-to-regexp": "^8.3.0",
|
||||
"prettier": "^3.6.2",
|
||||
"pg": "^8.16.3",
|
||||
"ts-luxon": "^6.2.0",
|
||||
"ts-node": "^10.9.2",
|
||||
"tsx": "^4.20.6",
|
||||
"typescript": "^5.9.3",
|
||||
"zod": "^4.1.12"
|
||||
},
|
||||
"prettier": {
|
||||
"arrowParens": "always",
|
||||
"bracketSpacing": true,
|
||||
"trailingComma": "all",
|
||||
"tabWidth": 4,
|
||||
"semi": true,
|
||||
"singleQuote": false,
|
||||
"importOrder": [
|
||||
"<THIRD_PARTY_MODULES>",
|
||||
"^[./]"
|
||||
],
|
||||
"importOrderCaseSensitive": true,
|
||||
"plugins": [
|
||||
"@ianvs/prettier-plugin-sort-imports"
|
||||
]
|
||||
},
|
||||
"devDependencies": {
|
||||
"@biomejs/biome": "2.3.10",
|
||||
"@types/express": "^5.0.5"
|
||||
"@types/express": "^5.0.5",
|
||||
"@types/pg": "^8.16.0",
|
||||
"kysely-codegen": "^0.19.0"
|
||||
}
|
||||
}
|
||||
|
||||
669
express/pnpm-lock.yaml
generated
669
express/pnpm-lock.yaml
generated
File diff suppressed because it is too large
Load Diff
25
express/request/index.ts
Normal file
25
express/request/index.ts
Normal file
@@ -0,0 +1,25 @@
|
||||
import { AuthService } from "../auth";
|
||||
import { getCurrentUser } from "../context";
|
||||
import { PostgresAuthStore } from "../database";
|
||||
import type { User } from "../user";
|
||||
import { html, redirect, render } from "./util";
|
||||
|
||||
const util = { html, redirect, render };
|
||||
|
||||
const session = {
|
||||
getUser: (): User => {
|
||||
return getCurrentUser();
|
||||
},
|
||||
};
|
||||
|
||||
// Initialize auth with PostgreSQL store
|
||||
const authStore = new PostgresAuthStore();
|
||||
const auth = new AuthService(authStore);
|
||||
|
||||
const request = {
|
||||
auth,
|
||||
session,
|
||||
util,
|
||||
};
|
||||
|
||||
export { request };
|
||||
45
express/request/util.ts
Normal file
45
express/request/util.ts
Normal file
@@ -0,0 +1,45 @@
|
||||
import { contentTypes } from "../content-types";
|
||||
import { core } from "../core";
|
||||
import { executionContext } from "../execution-context";
|
||||
import { httpCodes } from "../http-codes";
|
||||
import type { RedirectResult, Result } from "../types";
|
||||
import { loadFile } from "../util";
|
||||
import { request } from "./index";
|
||||
|
||||
type NoUser = {
|
||||
[key: string]: unknown;
|
||||
} & {
|
||||
user?: never;
|
||||
};
|
||||
|
||||
const render = async (path: string, ctx?: NoUser): Promise<string> => {
|
||||
const fullPath = `${executionContext.diachron_root}/templates/${path}.html.njk`;
|
||||
const template = await loadFile(fullPath);
|
||||
const user = request.session.getUser();
|
||||
const context = { user, ...ctx };
|
||||
const engine = core.conf.templateEngine();
|
||||
const retval = engine.renderTemplate(template, context);
|
||||
|
||||
return retval;
|
||||
};
|
||||
|
||||
const html = (payload: string): Result => {
|
||||
const retval: Result = {
|
||||
code: httpCodes.success.OK,
|
||||
result: payload,
|
||||
contentType: contentTypes.text.html,
|
||||
};
|
||||
|
||||
return retval;
|
||||
};
|
||||
|
||||
const redirect = (location: string): RedirectResult => {
|
||||
return {
|
||||
code: httpCodes.redirection.SeeOther,
|
||||
contentType: contentTypes.text.plain,
|
||||
result: "",
|
||||
redirect: location,
|
||||
};
|
||||
};
|
||||
|
||||
export { html, redirect, render };
|
||||
@@ -1,10 +1,14 @@
|
||||
/// <reference lib="dom" />
|
||||
|
||||
import nunjucks from "nunjucks";
|
||||
import { DateTime } from "ts-luxon";
|
||||
import { authRoutes } from "./auth/routes";
|
||||
import { routes as basicRoutes } from "./basic/routes";
|
||||
import { contentTypes } from "./content-types";
|
||||
import { core } from "./core";
|
||||
import { multiHandler } from "./handlers";
|
||||
import { HttpCode, httpCodes } from "./http-codes";
|
||||
import { services } from "./services";
|
||||
import { type Call, ProcessedRoute, type Result, type Route } from "./types";
|
||||
import { httpCodes } from "./http-codes";
|
||||
import type { Call, Result, Route } from "./types";
|
||||
|
||||
// FIXME: Obviously put this somewhere else
|
||||
const okText = (result: string): Result => {
|
||||
@@ -20,13 +24,18 @@ const okText = (result: string): Result => {
|
||||
};
|
||||
|
||||
const routes: Route[] = [
|
||||
...authRoutes,
|
||||
basicRoutes.home,
|
||||
basicRoutes.hello,
|
||||
basicRoutes.login,
|
||||
basicRoutes.logout,
|
||||
{
|
||||
path: "/slow",
|
||||
methods: ["GET"],
|
||||
handler: async (_call: Call): Promise<Result> => {
|
||||
console.log("starting slow request");
|
||||
|
||||
await services.misc.sleep(2);
|
||||
await core.misc.sleep(2);
|
||||
|
||||
console.log("finishing slow request");
|
||||
const retval = okText("that was slow");
|
||||
@@ -37,7 +46,7 @@ const routes: Route[] = [
|
||||
{
|
||||
path: "/list",
|
||||
methods: ["GET"],
|
||||
handler: async (call: Call): Promise<Result> => {
|
||||
handler: async (_call: Call): Promise<Result> => {
|
||||
const code = httpCodes.success.OK;
|
||||
const lr = (rr: Route[]) => {
|
||||
const ret = rr.map((r: Route) => {
|
||||
@@ -47,11 +56,50 @@ const routes: Route[] = [
|
||||
return ret;
|
||||
};
|
||||
|
||||
const listing = lr(routes).join(", ");
|
||||
const rrr = lr(routes);
|
||||
|
||||
const template = `
|
||||
<html>
|
||||
<head></head>
|
||||
<body>
|
||||
<ul>
|
||||
{% for route in rrr %}
|
||||
<li><a href="{{ route }}">{{ route }}</a></li>
|
||||
{% endfor %}
|
||||
</ul>
|
||||
</body>
|
||||
</html>
|
||||
`;
|
||||
const result = nunjucks.renderString(template, { rrr });
|
||||
|
||||
const _listing = lr(routes).join(", ");
|
||||
return {
|
||||
code,
|
||||
result: listing + "\n",
|
||||
contentType: contentTypes.text.plain,
|
||||
result,
|
||||
contentType: contentTypes.text.html,
|
||||
};
|
||||
},
|
||||
},
|
||||
{
|
||||
path: "/whoami",
|
||||
methods: ["GET"],
|
||||
handler: async (call: Call): Promise<Result> => {
|
||||
const me = call.session.getUser();
|
||||
const template = `
|
||||
<html>
|
||||
<head></head>
|
||||
<body>
|
||||
{{ me }}
|
||||
</body>
|
||||
</html>
|
||||
`;
|
||||
|
||||
const result = nunjucks.renderString(template, { me });
|
||||
|
||||
return {
|
||||
code: httpCodes.success.OK,
|
||||
contentType: contentTypes.text.html,
|
||||
result,
|
||||
};
|
||||
},
|
||||
},
|
||||
@@ -72,6 +120,29 @@ const routes: Route[] = [
|
||||
};
|
||||
},
|
||||
},
|
||||
{
|
||||
path: "/time",
|
||||
methods: ["GET"],
|
||||
handler: async (_req): Promise<Result> => {
|
||||
const now = DateTime.now();
|
||||
const template = `
|
||||
<html>
|
||||
<head></head>
|
||||
<body>
|
||||
{{ now }}
|
||||
</body>
|
||||
</html>
|
||||
`;
|
||||
|
||||
const result = nunjucks.renderString(template, { now });
|
||||
|
||||
return {
|
||||
code: httpCodes.success.OK,
|
||||
contentType: contentTypes.text.html,
|
||||
result,
|
||||
};
|
||||
},
|
||||
},
|
||||
];
|
||||
|
||||
export { routes };
|
||||
|
||||
@@ -1,36 +0,0 @@
|
||||
// services.ts
|
||||
|
||||
import { config } from "./config";
|
||||
import { getLogs, log } from "./logging";
|
||||
|
||||
//const database = Client({
|
||||
|
||||
//})
|
||||
|
||||
const database = {};
|
||||
|
||||
const logging = {
|
||||
log,
|
||||
getLogs,
|
||||
};
|
||||
|
||||
const random = {
|
||||
randomNumber: () => {
|
||||
return Math.random();
|
||||
},
|
||||
};
|
||||
|
||||
const misc = {
|
||||
sleep: (ms: number) => {
|
||||
return new Promise((resolve) => setTimeout(resolve, ms));
|
||||
},
|
||||
};
|
||||
|
||||
const services = {
|
||||
database,
|
||||
logging,
|
||||
misc,
|
||||
random,
|
||||
};
|
||||
|
||||
export { services };
|
||||
@@ -2,14 +2,13 @@
|
||||
|
||||
// FIXME: split this up into types used by app developers and types internal
|
||||
// to the framework.
|
||||
import {
|
||||
type Request as ExpressRequest,
|
||||
Response as ExpressResponse,
|
||||
} from "express";
|
||||
import type { Request as ExpressRequest } from "express";
|
||||
import type { MatchFunction } from "path-to-regexp";
|
||||
import { z } from "zod";
|
||||
import { type ContentType, contentTypes } from "./content-types";
|
||||
import { type HttpCode, httpCodes } from "./http-codes";
|
||||
import type { Session } from "./auth/types";
|
||||
import type { ContentType } from "./content-types";
|
||||
import type { HttpCode } from "./http-codes";
|
||||
import type { Permission, User } from "./user";
|
||||
|
||||
const methodParser = z.union([
|
||||
z.literal("GET"),
|
||||
@@ -32,6 +31,8 @@ export type Call = {
|
||||
method: Method;
|
||||
parameters: object;
|
||||
request: ExpressRequest;
|
||||
user: User;
|
||||
session: Session;
|
||||
};
|
||||
|
||||
export type InternalHandler = (req: ExpressRequest) => Promise<Result>;
|
||||
@@ -43,12 +44,35 @@ export type ProcessedRoute = {
|
||||
handler: InternalHandler;
|
||||
};
|
||||
|
||||
export type CookieOptions = {
|
||||
httpOnly?: boolean;
|
||||
secure?: boolean;
|
||||
sameSite?: "strict" | "lax" | "none";
|
||||
maxAge?: number;
|
||||
path?: string;
|
||||
};
|
||||
|
||||
export type Cookie = {
|
||||
name: string;
|
||||
value: string;
|
||||
options?: CookieOptions;
|
||||
};
|
||||
|
||||
export type Result = {
|
||||
code: HttpCode;
|
||||
contentType: ContentType;
|
||||
result: string;
|
||||
cookies?: Cookie[];
|
||||
};
|
||||
|
||||
export type RedirectResult = Result & {
|
||||
redirect: string;
|
||||
};
|
||||
|
||||
export function isRedirect(result: Result): result is RedirectResult {
|
||||
return "redirect" in result;
|
||||
}
|
||||
|
||||
export type Route = {
|
||||
path: string;
|
||||
methods: Method[];
|
||||
@@ -56,4 +80,38 @@ export type Route = {
|
||||
interruptable?: boolean;
|
||||
};
|
||||
|
||||
// Authentication error classes
|
||||
export class AuthenticationRequired extends Error {
|
||||
constructor() {
|
||||
super("Authentication required");
|
||||
this.name = "AuthenticationRequired";
|
||||
}
|
||||
}
|
||||
|
||||
export class AuthorizationDenied extends Error {
|
||||
constructor() {
|
||||
super("Authorization denied");
|
||||
this.name = "AuthorizationDenied";
|
||||
}
|
||||
}
|
||||
|
||||
// Helper for handlers to require authentication
|
||||
export function requireAuth(call: Call): User {
|
||||
if (call.user.isAnonymous()) {
|
||||
throw new AuthenticationRequired();
|
||||
}
|
||||
return call.user;
|
||||
}
|
||||
|
||||
// Helper for handlers to require specific permission
|
||||
export function requirePermission(call: Call, permission: Permission): User {
|
||||
const user = requireAuth(call);
|
||||
if (!user.hasPermission(permission)) {
|
||||
throw new AuthorizationDenied();
|
||||
}
|
||||
return user;
|
||||
}
|
||||
|
||||
export type Domain = "app" | "fw";
|
||||
|
||||
export { methodParser, massageMethod };
|
||||
|
||||
232
express/user.ts
Normal file
232
express/user.ts
Normal file
@@ -0,0 +1,232 @@
|
||||
// user.ts
|
||||
//
|
||||
// User model for authentication and authorization.
|
||||
//
|
||||
// Design notes:
|
||||
// - `id` is the stable internal identifier (UUID when database-backed)
|
||||
// - `email` is the primary human-facing identifier
|
||||
// - Roles provide coarse-grained authorization (admin, editor, etc.)
|
||||
// - Permissions provide fine-grained authorization (posts:create, etc.)
|
||||
// - Users can have both roles (which grant permissions) and direct permissions
|
||||
|
||||
import { z } from "zod";
|
||||
|
||||
// Branded type for user IDs to prevent accidental mixing with other strings
|
||||
export type UserId = string & { readonly __brand: "UserId" };
|
||||
|
||||
// User account status
|
||||
const userStatusParser = z.enum(["active", "suspended", "pending"]);
|
||||
export type UserStatus = z.infer<typeof userStatusParser>;
|
||||
|
||||
// Role - simple string identifier
|
||||
const roleParser = z.string().min(1);
|
||||
export type Role = z.infer<typeof roleParser>;
|
||||
|
||||
// Permission format: "resource:action" e.g. "posts:create", "users:delete"
|
||||
const permissionParser = z.string().regex(/^[a-z_]+:[a-z_]+$/, {
|
||||
message: "Permission must be in format 'resource:action'",
|
||||
});
|
||||
export type Permission = z.infer<typeof permissionParser>;
|
||||
|
||||
// Core user data schema - this is what gets stored/serialized
|
||||
const userDataParser = z.object({
|
||||
id: z.string().min(1),
|
||||
email: z.email(),
|
||||
displayName: z.string().optional(),
|
||||
status: userStatusParser,
|
||||
roles: z.array(roleParser),
|
||||
permissions: z.array(permissionParser),
|
||||
createdAt: z.coerce.date(),
|
||||
updatedAt: z.coerce.date(),
|
||||
});
|
||||
|
||||
export type UserData = z.infer<typeof userDataParser>;
|
||||
|
||||
// Role-to-permission mappings
|
||||
// In a real system this might be database-driven or configurable
|
||||
type RolePermissionMap = Map<Role, Permission[]>;
|
||||
|
||||
const defaultRolePermissions: RolePermissionMap = new Map([
|
||||
["admin", ["users:read", "users:create", "users:update", "users:delete"]],
|
||||
["user", ["users:read"]],
|
||||
]);
|
||||
|
||||
export abstract class User {
|
||||
protected readonly data: UserData;
|
||||
protected rolePermissions: RolePermissionMap;
|
||||
|
||||
constructor(data: UserData, rolePermissions?: RolePermissionMap) {
|
||||
this.data = userDataParser.parse(data);
|
||||
this.rolePermissions = rolePermissions ?? defaultRolePermissions;
|
||||
}
|
||||
|
||||
// Identity
|
||||
get id(): UserId {
|
||||
return this.data.id as UserId;
|
||||
}
|
||||
|
||||
get email(): string {
|
||||
return this.data.email;
|
||||
}
|
||||
|
||||
get displayName(): string | undefined {
|
||||
return this.data.displayName;
|
||||
}
|
||||
|
||||
// Status
|
||||
get status(): UserStatus {
|
||||
return this.data.status;
|
||||
}
|
||||
|
||||
isActive(): boolean {
|
||||
return this.data.status === "active";
|
||||
}
|
||||
|
||||
// Roles
|
||||
get roles(): readonly Role[] {
|
||||
return this.data.roles;
|
||||
}
|
||||
|
||||
hasRole(role: Role): boolean {
|
||||
return this.data.roles.includes(role);
|
||||
}
|
||||
|
||||
hasAnyRole(roles: Role[]): boolean {
|
||||
return roles.some((role) => this.hasRole(role));
|
||||
}
|
||||
|
||||
hasAllRoles(roles: Role[]): boolean {
|
||||
return roles.every((role) => this.hasRole(role));
|
||||
}
|
||||
|
||||
// Permissions
|
||||
get permissions(): readonly Permission[] {
|
||||
return this.data.permissions;
|
||||
}
|
||||
|
||||
// Get all permissions: direct + role-derived
|
||||
effectivePermissions(): Set<Permission> {
|
||||
const perms = new Set<Permission>(this.data.permissions);
|
||||
|
||||
for (const role of this.data.roles) {
|
||||
const rolePerms = this.rolePermissions.get(role);
|
||||
if (rolePerms) {
|
||||
for (const p of rolePerms) {
|
||||
perms.add(p);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
return perms;
|
||||
}
|
||||
|
||||
// Check if user has a specific permission (direct or via role)
|
||||
hasPermission(permission: Permission): boolean {
|
||||
// Check direct permissions first
|
||||
if (this.data.permissions.includes(permission)) {
|
||||
return true;
|
||||
}
|
||||
|
||||
// Check role-derived permissions
|
||||
for (const role of this.data.roles) {
|
||||
const rolePerms = this.rolePermissions.get(role);
|
||||
      if (rolePerms?.includes(permission)) {
        return true;
      }
    }

    return false;
  }

  // Convenience method: can user perform action on resource?
  can(action: string, resource: string): boolean {
    const permission = `${resource}:${action}` as Permission;
    return this.hasPermission(permission);
  }

  // Timestamps
  get createdAt(): Date {
    return this.data.createdAt;
  }

  get updatedAt(): Date {
    return this.data.updatedAt;
  }

  // Serialization - returns plain object for storage/transmission
  toJSON(): UserData {
    return { ...this.data };
  }

  toString(): string {
    return `User(id ${this.id})`;
  }

  abstract isAnonymous(): boolean;
}

export class AuthenticatedUser extends User {
  // Factory for creating new users with sensible defaults
  static create(
    email: string,
    options?: {
      id?: string;
      displayName?: string;
      status?: UserStatus;
      roles?: Role[];
      permissions?: Permission[];
    },
  ): User {
    const now = new Date();
    return new AuthenticatedUser({
      id: options?.id ?? crypto.randomUUID(),
      email,
      displayName: options?.displayName,
      status: options?.status ?? "active",
      roles: options?.roles ?? [],
      permissions: options?.permissions ?? [],
      createdAt: now,
      updatedAt: now,
    });
  }

  isAnonymous(): boolean {
    return false;
  }
}

// For representing "no user" in contexts where user is optional
export class AnonymousUser extends User {
  // FIXME: this is C&Ped with only minimal changes. No bueno.
  static create(
    email: string,
    options?: {
      id?: string;
      displayName?: string;
      status?: UserStatus;
      roles?: Role[];
      permissions?: Permission[];
    },
  ): AnonymousUser {
    const now = new Date(0);
    return new AnonymousUser({
      id: options?.id ?? crypto.randomUUID(),
      email,
      displayName: options?.displayName,
      status: options?.status ?? "active",
      roles: options?.roles ?? [],
      permissions: options?.permissions ?? [],
      createdAt: now,
      updatedAt: now,
    });
  }

  isAnonymous(): boolean {
    return true;
  }
}

export const anonymousUser = AnonymousUser.create("anonymous@example.com", {
  id: "-1",
  displayName: "Anonymous User",
});
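The `can` convenience above reduces to composing a `"<resource>:<action>"` key and then checking direct and role-derived permissions. A minimal standalone sketch, assuming hypothetical role data (the real `Role`, `Permission`, and `UserData` types live elsewhere in this file; `rolePermissions` here is illustrative only):

```typescript
// Hypothetical, simplified mirror of the permission check above.
// rolePermissions is made-up example data, not the project's real roles.
type Permission = string;

const rolePermissions: Record<string, Permission[]> = {
  editor: ["post:read", "post:write"],
  viewer: ["post:read"],
};

function hasPermission(
  roles: string[],
  direct: Permission[],
  permission: Permission,
): boolean {
  // Direct grants win; otherwise fall back to role-derived permissions.
  if (direct.includes(permission)) return true;
  return roles.some((role) => rolePermissions[role]?.includes(permission));
}

// can(action, resource) composes the "<resource>:<action>" key.
function can(
  roles: string[],
  direct: Permission[],
  action: string,
  resource: string,
): boolean {
  return hasPermission(roles, direct, `${resource}:${action}`);
}
```

So `can(["viewer"], [], "read", "post")` passes because the viewer role carries `post:read`, while `can(["viewer"], [], "write", "post")` fails.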
11
express/util.ts
Normal file
@@ -0,0 +1,11 @@
import { readFile } from "node:fs/promises";

// FIXME: Handle the error here
const loadFile = async (path: string): Promise<string> => {
  // Specifying 'utf8' returns a string; otherwise, it returns a Buffer
  const data = await readFile(path, "utf8");

  return data;
};

export { loadFile };
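One possible shape for the FIXME above: wrap the raw fs error with the offending path so callers get an actionable message. A hedged sketch only, not the project's chosen approach (`loadFileOrThrow` is a hypothetical name):

```typescript
import { readFile } from "node:fs/promises";

// Hypothetical variant of loadFile that annotates the underlying
// fs error with the path that failed to load.
const loadFileOrThrow = async (path: string): Promise<string> => {
  try {
    // 'utf8' makes readFile return a string rather than a Buffer
    return await readFile(path, "utf8");
  } catch (err) {
    throw new Error(`Failed to load ${path}: ${(err as Error).message}`);
  }
};

export { loadFileOrThrow };
```

Whether rethrowing, returning a default, or propagating a typed result is right depends on the call sites, which is presumably why the FIXME is still open.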
4
fixup.sh
@@ -10,8 +10,8 @@ cd "$DIR"
 # uv run ruff format .

-shell_scripts="$(fd .sh | xargs)"
-shfmt -i 4 -w "$DIR/cmd" "$DIR"/framework/cmd.d/* "$DIR"/framework/shims/*
+shell_scripts="$(fd '.sh$' | xargs)"
+shfmt -i 4 -w "$DIR/cmd" "$DIR"/framework/cmd.d/* "$DIR"/framework/shims/* "$DIR"/master/master "$DIR"/logger/logger
 # "$shell_scripts"
 for ss in $shell_scripts; do
     shfmt -i 4 -w $ss
15
framework/cmd.d/test
Executable file
@@ -0,0 +1,15 @@
#!/bin/bash

set -eu

shopt -s globstar nullglob

DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

cd "$DIR/../../express"

if [ $# -eq 0 ]; then
    "$DIR"/../shims/pnpm tsx --test ./**/*.spec.ts ./**/*.test.ts
else
    "$DIR"/../shims/pnpm tsx --test "$@"
fi
9
framework/common.d/db
Executable file
@@ -0,0 +1,9 @@
#!/bin/bash

set -eu

DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
ROOT="$DIR/../.."

# FIXME: don't hard code this of course
PGPASSWORD=diachron psql -U diachron -h localhost diachron
9
framework/common.d/migrate
Executable file
@@ -0,0 +1,9 @@
#!/bin/bash

set -eu

DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
ROOT="$DIR/../.."

cd "$ROOT/express"
"$DIR"/tsx migrate.ts "$@"
11
framework/develop.d/clear-db
Executable file
@@ -0,0 +1,11 @@
#!/bin/bash

# This file belongs to the framework. You are not expected to modify it.

set -eu

DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
ROOT="$DIR/../.."

cd "$ROOT/express"
"$DIR"/../cmd.d/tsx develop/clear-db.ts "$@"
1
framework/develop.d/db
Symbolic link
@@ -0,0 +1 @@
../common.d/db
1
framework/develop.d/migrate
Symbolic link
@@ -0,0 +1 @@
../common.d/migrate
9
framework/develop.d/reset-db
Executable file
@@ -0,0 +1,9 @@
#!/bin/bash

set -eu

DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
ROOT="$DIR/../.."

cd "$ROOT/express"
"$DIR"/../cmd.d/tsx develop/reset-db.ts "$@"
9
framework/mgmt.d/add-user
Executable file
@@ -0,0 +1,9 @@
#!/bin/bash

set -eu

DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
ROOT="$DIR/../.."

cd "$ROOT/express"
"$DIR"/../cmd.d/tsx mgmt/add-user.ts "$@"
1
framework/mgmt.d/db
Symbolic link
@@ -0,0 +1 @@
../common.d/db
1
framework/mgmt.d/migrate
Symbolic link
@@ -0,0 +1 @@
../common.d/migrate
@@ -5,10 +5,8 @@
set -eu

node_shim_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
export node_shim_DIR

source "$node_shim_DIR"/../versions

# shellcheck source=node.common
source "$node_shim_DIR"/node.common

exec "$nodejs_binary_dir/node" "$@"
@@ -2,23 +2,19 @@
# shellcheck shell=bash

node_common_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
project_root="$node_common_DIR/../.."

# FIXME this shouldn't be hardcoded here of course
nodejs_binary_dir="$node_common_DIR/../binaries/node-v22.15.1-linux-x64/bin"
# shellcheck source=../versions
source "$node_common_DIR"/../versions

nodejs_binary_dir="$project_root/$nodejs_bin_dir"

# This might be too restrictive. Or not restrictive enough.
PATH="$nodejs_binary_dir":/bin:/usr/bin

project_root="$node_common_DIR/../.."
node_dist_dir="$project_root/$nodejs_dist_dir"

node_dir="$project_root/$nodejs_binary_dir"

export NPM_CONFIG_PREFIX="$node_dir/npm"
export NPM_CONFIG_CACHE="$node_dir/cache"
export NPM_CONFIG_TMP="$node_dir/tmp"
export NODE_PATH="$node_dir/node_modules"

# echo $NPM_CONFIG_PREFIX
# echo $NPM_CONFIG_CACHE
# echo $NPM_CONFIG_TMP
# echo $NODE_PATH
export NPM_CONFIG_PREFIX="$node_dist_dir/npm"
export NPM_CONFIG_CACHE="$node_dist_dir/cache"
export NPM_CONFIG_TMP="$node_dist_dir/tmp"
export NODE_PATH="$node_dist_dir/node_modules"
19
framework/versions
Normal file
@@ -0,0 +1,19 @@
# shellcheck shell=bash

# This file belongs to the framework. You are not expected to modify it.

# https://nodejs.org/dist
nodejs_binary_linux_x86_64=https://nodejs.org/dist/v24.12.0/node-v24.12.0-linux-x64.tar.xz
nodejs_checksum_linux_x86_64=bdebee276e58d0ef5448f3d5ac12c67daa963dd5e0a9bb621a53d1cefbc852fd
nodejs_dist_dir=framework/binaries/node-v22.15.1-linux-x64
nodejs_bin_dir="$nodejs_dist_dir/bin"

caddy_binary_linux_x86_64=fixme
caddy_checksum_linux_x86_64=fixmetoo

# https://github.com/pnpm/pnpm/releases
pnpm_binary_linux_x86_64=https://github.com/pnpm/pnpm/releases/download/v10.28.0/pnpm-linux-x64
pnpm_checksum_linux_x86_64=sha256:348e863d17a62411a65f900e8d91395acabae9e9237653ccc3c36cb385965f28


golangci_lint=v2.7.2-alpine
1
logger/.gitignore
vendored
Normal file
@@ -0,0 +1 @@
logger-bin
3
logger/go.mod
Normal file
@@ -0,0 +1,3 @@
module philologue.net/diachron/logger-bin

go 1.23.3
7
logger/logger
Executable file
@@ -0,0 +1,7 @@
#!/bin/bash

DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

cd "$DIR"

./logger-bin "$@"
70
logger/main.go
Normal file
@@ -0,0 +1,70 @@
package main

import (
    "encoding/json"
    "flag"
    "fmt"
    "log"
    "net/http"
    "strconv"
)

func main() {
    port := flag.Int("port", 8085, "port to listen on")
    capacity := flag.Int("capacity", 1000000, "max messages to store")

    flag.Parse()

    store := NewLogStore(*capacity)

    http.HandleFunc("POST /log", func(w http.ResponseWriter, r *http.Request) {
        var msg Message
        if err := json.NewDecoder(r.Body).Decode(&msg); err != nil {
            http.Error(w, "invalid JSON", http.StatusBadRequest)
            return
        }

        store.Add(msg)
        w.WriteHeader(http.StatusCreated)
    })

    http.HandleFunc("GET /logs", func(w http.ResponseWriter, r *http.Request) {
        params := FilterParams{}

        if limit := r.URL.Query().Get("limit"); limit != "" {
            if n, err := strconv.Atoi(limit); err == nil {
                params.Limit = n
            }
        }
        if before := r.URL.Query().Get("before"); before != "" {
            if ts, err := strconv.ParseInt(before, 10, 64); err == nil {
                params.Before = ts
            }
        }
        if after := r.URL.Query().Get("after"); after != "" {
            if ts, err := strconv.ParseInt(after, 10, 64); err == nil {
                params.After = ts
            }
        }

        messages := store.GetFiltered(params)

        w.Header().Set("Content-Type", "application/json")
        json.NewEncoder(w).Encode(messages)
    })

    http.HandleFunc("GET /status", func(w http.ResponseWriter, r *http.Request) {
        status := map[string]any{
            "count":    store.Count(),
            "capacity": *capacity,
        }
        w.Header().Set("Content-Type", "application/json")
        json.NewEncoder(w).Encode(status)
    })

    listenAddr := fmt.Sprintf(":%d", *port)
    log.Printf("[logger] Listening on %s (capacity: %d)", listenAddr, *capacity)
    if err := http.ListenAndServe(listenAddr, nil); err != nil {
        log.Fatalf("[logger] Failed to start: %v", err)
    }
}
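The Message struct's comment says these entries come from the express backend. A hypothetical TypeScript client for the POST /log handler above, with field names mirroring the Go struct's JSON tags (the `sendLog`/`makeMessage` names and base URL are assumptions, not code from this repo; the 201 status comes from the handler's `WriteHeader(http.StatusCreated)`):

```typescript
// Hypothetical express-side client for the logger's POST /log endpoint.
interface LogMessage {
  timestamp: number;
  source: "logging" | "diagnostic" | "user";
  text: string[];
}

// Build a message stamped with the current time.
const makeMessage = (
  source: LogMessage["source"],
  text: string[],
): LogMessage => ({ timestamp: Date.now(), source, text });

// POST the message; the handler replies 201 Created on success.
const sendLog = async (baseUrl: string, msg: LogMessage): Promise<boolean> => {
  const res = await fetch(`${baseUrl}/log`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(msg),
  });
  return res.status === 201;
};

export { makeMessage, sendLog };
```

Usage would look like `await sendLog("http://localhost:8085", makeMessage("user", ["login ok"]))`, matching the logger's default port.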
126
logger/store.go
Normal file
@@ -0,0 +1,126 @@
package main

import (
    "sync"
)

// Message represents a log entry from the express backend
type Message struct {
    Timestamp int64    `json:"timestamp"`
    Source    string   `json:"source"` // "logging" | "diagnostic" | "user"
    Text      []string `json:"text"`
}

// LogStore is a thread-safe ring buffer for log messages
type LogStore struct {
    mu       sync.RWMutex
    messages []Message
    head     int  // next write position
    full     bool // whether buffer has wrapped
    capacity int
}

// NewLogStore creates a new log store with the given capacity
func NewLogStore(capacity int) *LogStore {
    return &LogStore{
        messages: make([]Message, capacity),
        capacity: capacity,
    }
}

// Add inserts a new message into the store
func (s *LogStore) Add(msg Message) {
    s.mu.Lock()
    defer s.mu.Unlock()

    s.messages[s.head] = msg
    s.head++
    if s.head >= s.capacity {
        s.head = 0
        s.full = true
    }
}

// Count returns the number of messages in the store
func (s *LogStore) Count() int {
    s.mu.RLock()
    defer s.mu.RUnlock()

    if s.full {
        return s.capacity
    }
    return s.head
}

// GetRecent returns the most recent n messages, newest first
func (s *LogStore) GetRecent(n int) []Message {
    s.mu.RLock()
    defer s.mu.RUnlock()

    count := s.Count()
    if n > count {
        n = count
    }
    if n == 0 {
        return nil
    }

    result := make([]Message, n)
    pos := s.head - 1
    for i := 0; i < n; i++ {
        if pos < 0 {
            pos = s.capacity - 1
        }
        result[i] = s.messages[pos]
        pos--
    }
    return result
}

// Filter parameters for retrieving logs
type FilterParams struct {
    Limit  int   // max messages to return (0 = default 100)
    Before int64 // only messages before this timestamp
    After  int64 // only messages after this timestamp
}

// GetFiltered returns messages matching the filter criteria
func (s *LogStore) GetFiltered(params FilterParams) []Message {
    s.mu.RLock()
    defer s.mu.RUnlock()

    limit := params.Limit
    if limit <= 0 {
        limit = 100
    }

    count := s.Count()
    if count == 0 {
        return nil
    }

    result := make([]Message, 0, limit)
    pos := s.head - 1

    for i := 0; i < count && len(result) < limit; i++ {
        if pos < 0 {
            pos = s.capacity - 1
        }
        msg := s.messages[pos]

        // Apply filters
        if params.Before > 0 && msg.Timestamp >= params.Before {
            pos--
            continue
        }
        if params.After > 0 && msg.Timestamp <= params.After {
            pos--
            continue
        }

        result = append(result, msg)
        pos--
    }

    return result
}
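The head/full bookkeeping in LogStore is the classic ring-buffer pattern: writes advance `head`, wrap to zero at capacity, and set `full` once the buffer has wrapped so `count` can distinguish a partially filled buffer from a saturated one. A standalone TypeScript sketch of the same semantics, for illustration only (the real implementation is the Go code above):

```typescript
// Minimal ring buffer mirroring LogStore's head/full bookkeeping.
class RingBuffer<T> {
  private items: T[];
  private head = 0; // next write position
  private full = false; // whether the buffer has wrapped

  constructor(private capacity: number) {
    this.items = new Array<T>(capacity);
  }

  add(item: T): void {
    this.items[this.head] = item;
    this.head++;
    if (this.head >= this.capacity) {
      this.head = 0;
      this.full = true; // oldest entries are overwritten from here on
    }
  }

  count(): number {
    return this.full ? this.capacity : this.head;
  }

  // Most recent n items, newest first (walks backwards from head).
  recent(n: number): T[] {
    const count = this.count();
    n = Math.min(n, count);
    const out: T[] = [];
    let pos = this.head - 1;
    for (let i = 0; i < n; i++) {
      if (pos < 0) pos = this.capacity - 1;
      out.push(this.items[pos]);
      pos--;
    }
    return out;
  }
}
```

With capacity 3, adding 1 through 5 overwrites 1 and 2, leaving count 3 and `recent(2)` returning `[5, 4]`.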
2
master/.gitignore
vendored
@@ -1 +1 @@
-master
+master-bin
@@ -1,4 +1,4 @@
-module philologue.net/diachron/master
+module philologue.net/diachron/master-bin

 go 1.23.3
@@ -1,46 +1,42 @@
 package main

 import (
     "flag"
     "fmt"
     "os"
     "os/signal"
-    "strconv"
     "syscall"
 )

 func main() {
-    watchedDir := os.Getenv("WATCHED_DIR")
+    watchDir := flag.String("watch", "../express", "directory to watch for changes")
+    workers := flag.Int("workers", 1, "number of worker processes")
+    basePort := flag.Int("base-port", 3000, "base port for worker processes")
+    listenPort := flag.Int("port", 8080, "port for the reverse proxy to listen on")
+    loggerPort := flag.Int("logger-port", 8085, "port for the logger service")
+    loggerCapacity := flag.Int("logger-capacity", 1000000, "max messages for logger to store")

-    numChildProcesses := 1
-    if n, err := strconv.Atoi(os.Getenv("NUM_CHILD_PROCESSES")); err == nil && n > 0 {
-        numChildProcesses = n
-    }
-
-    basePort := 3000
-    if p, err := strconv.Atoi(os.Getenv("BASE_PORT")); err == nil && p > 0 {
-        basePort = p
-    }
-
-    listenPort := 8080
-    if p, err := strconv.Atoi(os.Getenv("LISTEN_PORT")); err == nil && p > 0 {
-        listenPort = p
-    }
-
-    // Create worker pool
-    pool := NewWorkerPool()
+    flag.Parse()

     // Setup signal handling
     sigCh := make(chan os.Signal, 1)
     signal.Notify(sigCh, os.Interrupt, syscall.SIGTERM)

+    // Start and manage the logger process
+    stopLogger := startLogger(*loggerPort, *loggerCapacity)
+    defer stopLogger()
+
+    // Create worker pool
+    pool := NewWorkerPool()
+
     fileChanges := make(chan FileChange, 10)

-    go watchFiles(watchedDir, fileChanges)
+    go watchFiles(*watchDir, fileChanges)

-    go runExpress(fileChanges, numChildProcesses, basePort, pool)
+    go runExpress(fileChanges, *workers, *basePort, pool)

     // Start the reverse proxy
-    listenAddr := fmt.Sprintf(":%d", listenPort)
+    listenAddr := fmt.Sprintf(":%d", *listenPort)
     go startProxy(listenAddr, pool)

     // Wait for interrupt signal
9
master/master
Executable file
@@ -0,0 +1,9 @@
#!/bin/bash

DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

cd "$DIR"

export diachron_root="$DIR/.."

./master-bin "$@"
106
master/runlogger.go
Normal file
@@ -0,0 +1,106 @@
package main

import (
    "log"
    "os"
    "os/exec"
    "strconv"
    "sync"
    "syscall"
    "time"
)

// startLogger starts the logger process and returns a function to stop it.
// It automatically restarts the logger if it crashes.
func startLogger(port int, capacity int) func() {
    var mu sync.Mutex
    var cmd *exec.Cmd
    var stopping bool

    portStr := strconv.Itoa(port)
    capacityStr := strconv.Itoa(capacity)

    start := func() *exec.Cmd {
        c := exec.Command("../logger/logger", "--port", portStr, "--capacity", capacityStr)
        c.Stdout = os.Stdout
        c.Stderr = os.Stderr

        if err := c.Start(); err != nil {
            log.Printf("[logger] Failed to start: %v", err)
            return nil
        }

        log.Printf("[logger] Started (pid %d) on port %s", c.Process.Pid, portStr)
        return c
    }

    // Start initial logger
    cmd = start()

    // Monitor and restart on crash
    go func() {
        for {
            mu.Lock()
            currentCmd := cmd
            mu.Unlock()

            if currentCmd == nil {
                time.Sleep(time.Second)
                mu.Lock()
                if !stopping {
                    cmd = start()
                }
                mu.Unlock()
                continue
            }

            err := currentCmd.Wait()

            mu.Lock()
            if stopping {
                mu.Unlock()
                return
            }

            if err != nil {
                log.Printf("[logger] Process exited: %v, restarting...", err)
            } else {
                log.Printf("[logger] Process exited normally, restarting...")
            }

            time.Sleep(time.Second)
            cmd = start()
            mu.Unlock()
        }
    }()

    // Return stop function
    return func() {
        mu.Lock()
        defer mu.Unlock()

        stopping = true

        if cmd == nil || cmd.Process == nil {
            return
        }

        log.Printf("[logger] Stopping (pid %d)", cmd.Process.Pid)
        cmd.Process.Signal(syscall.SIGTERM)

        // Wait briefly for graceful shutdown
        done := make(chan struct{})
        go func() {
            cmd.Wait()
            close(done)
        }()

        select {
        case <-done:
            log.Printf("[logger] Stopped gracefully")
        case <-time.After(5 * time.Second):
            log.Printf("[logger] Force killing")
            cmd.Process.Kill()
        }
    }
}
27
mgmt
Executable file
@@ -0,0 +1,27 @@
#!/bin/bash

# This file belongs to the framework. You are not expected to modify it.

# Management command runner - parallel to ./cmd for operational tasks
# Usage: ./mgmt <command> [args...]

set -eu

DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

if [ $# -lt 1 ]; then
    echo "Usage: ./mgmt <command> [args...]"
    echo ""
    echo "Available commands:"
    for cmd in "$DIR"/framework/mgmt.d/*; do
        if [ -x "$cmd" ]; then
            basename "$cmd"
        fi
    done
    exit 1
fi

subcmd="$1"
shift

exec "$DIR"/framework/mgmt.d/"$subcmd" "$@"
66
sync.sh
Executable file
@@ -0,0 +1,66 @@
#!/bin/bash

# Note: This is kind of AI slop and needs to be more carefully reviewed.

set -eu

DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

# shellcheck source=framework/versions
source "$DIR/framework/versions"

# Ensure correct node version is installed
node_installed_checksum_file="$DIR/framework/binaries/.node.checksum"
node_installed_checksum=""
if [ -f "$node_installed_checksum_file" ]; then
    node_installed_checksum=$(cat "$node_installed_checksum_file")
fi

if [ "$node_installed_checksum" != "$nodejs_checksum_linux_x86_64" ]; then
    echo "Downloading Node.js..."
    node_archive="$DIR/framework/downloads/node.tar.xz"
    curl -fsSL "$nodejs_binary_linux_x86_64" -o "$node_archive"

    echo "Verifying checksum..."
    echo "$nodejs_checksum_linux_x86_64  $node_archive" | sha256sum -c -

    echo "Extracting Node.js..."
    tar -xf "$node_archive" -C "$DIR/framework/binaries"
    rm "$node_archive"

    echo "$nodejs_checksum_linux_x86_64" >"$node_installed_checksum_file"
fi

# Ensure correct pnpm version is installed
pnpm_binary="$DIR/framework/binaries/pnpm"
pnpm_installed_checksum_file="$DIR/framework/binaries/.pnpm.checksum"
pnpm_installed_checksum=""
if [ -f "$pnpm_installed_checksum_file" ]; then
    pnpm_installed_checksum=$(cat "$pnpm_installed_checksum_file")
fi

# pnpm checksum includes "sha256:" prefix, strip it for sha256sum
pnpm_checksum="${pnpm_checksum_linux_x86_64#sha256:}"

if [ "$pnpm_installed_checksum" != "$pnpm_checksum" ]; then
    echo "Downloading pnpm..."
    curl -fsSL "$pnpm_binary_linux_x86_64" -o "$pnpm_binary"

    echo "Verifying checksum..."
    echo "$pnpm_checksum  $pnpm_binary" | sha256sum -c -

    chmod +x "$pnpm_binary"

    echo "$pnpm_checksum" >"$pnpm_installed_checksum_file"
fi

# Get golang binaries in place
cd "$DIR/master"
go build

cd "$DIR/logger"
go build

# Update framework code
cd "$DIR/express"
../cmd pnpm install
11
templates/basic/hello.html.njk
Normal file
@@ -0,0 +1,11 @@
<html>
  <head></head>
  <body>
    <p>
      Hello.
    </p>
    <p>
      The current time is {{ now }}.
    </p>
  </body>
</html>
19
templates/basic/home.html.njk
Normal file
@@ -0,0 +1,19 @@
<html>
  <head></head>
  <body>
    <p>
      home
    </p>
    <p>
      {{ email }}
    </p>
    {% if showLogin %}
      <a href="/login">login</a>
    {% endif %}

    {% if showLogout %}
      <a href="/logout">logout</a>
    {% endif %}
  </body>
</html>
55
templates/basic/login.html.njk
Normal file
@@ -0,0 +1,55 @@
<html>
  <head>
    <title>Login</title>
    <style>
      body {
        font-family: system-ui, sans-serif;
        max-width: 400px;
        margin: 50px auto;
        padding: 20px;
      }
      form {
        display: flex;
        flex-direction: column;
        gap: 15px;
      }
      label {
        display: flex;
        flex-direction: column;
        gap: 5px;
      }
      input {
        padding: 8px;
        font-size: 16px;
      }
      button {
        padding: 10px;
        font-size: 16px;
        cursor: pointer;
      }
      .error {
        color: red;
        padding: 10px;
        background: #fee;
        border: 1px solid #fcc;
      }
    </style>
  </head>
  <body>
    <h1>Login</h1>
    {% if error %}
      <div class="error">{{ error }}</div>
    {% endif %}
    <form method="POST" action="/login">
      <label>
        Email
        <input type="email" name="email" required value="{{ email | default('') }}">
      </label>
      <label>
        Password
        <input type="password" name="password" required>
      </label>
      <button type="submit">Login</button>
    </form>
  </body>
</html>