Compare commits: master...8318e60c33 (38 commits)
| SHA1 |
|---|
| 8318e60c33 |
| cf04ecc78a |
| 82e87577cc |
| 98f1f554c1 |
| 554f0e0044 |
| 530e3dccef |
| 91780b6dca |
| a39ed37a03 |
| 35165dcefe |
| dbd4e0a687 |
| 7b271da2b8 |
| 940cef138e |
| 296e460326 |
| 738e622fdc |
| 034b035a91 |
| f352ae44e1 |
| 341db4f821 |
| eabec3816b |
| b752eb5080 |
| 1ed5aa4b33 |
| 4d1c30b874 |
| 02edf259f0 |
| db1f2151de |
| 6e669d025a |
| a1dbf71de4 |
| 0afc3efa5d |
| 6f2ca2c15d |
| 6a41273835 |
| 33251d9b77 |
| 408032c30d |
| 19959a0325 |
| 87c9d1be16 |
| c2748bfcc6 |
| 2f5ef7c267 |
| bcd71f2801 |
| 82a8c03316 |
| b8065ead79 |
| 811c446895 |
**.gitignore** (8 changes, vendored)
```diff
@@ -1,5 +1,5 @@
 **/node_modules
-framework/downloads
-framework/binaries
-framework/.nodejs
-framework/.nodejs-config
+diachron/downloads
+diachron/binaries
+diachron/.nodejs
+diachron/.nodejs-config
```
**CLAUDE.md** (14 changes)
````diff
@@ -38,7 +38,7 @@ master process. Key design principles:
 
 **Format TypeScript code:**
 ```bash
-cd express && ../cmd pnpm biome check --write .
+cd backend && ../cmd pnpm biome check --write .
 ```
 
 **Build Go master process:**
@@ -54,9 +54,9 @@ cd master && go build
 
 ### Components
 
-- **express/** - TypeScript/Express.js backend application
+- **backend/** - TypeScript/Express.js backend application
 - **master/** - Go-based master process for file watching and process management
-- **framework/** - Managed binaries (Node.js, pnpm), command wrappers, and
+- **diachron/** - Managed binaries (Node.js, pnpm), command wrappers, and
   framework-specific library code
 - **monitor/** - Go file watcher that triggers rebuilds (experimental)
 
@@ -68,7 +68,7 @@ Responsibilities:
 - Proxy web requests to backend workers
 - Behaves identically in all environments (no dev/prod distinction)
 
-### Express App Structure
+### Backend App Structure
 
 - `app.ts` - Main Express application setup with route matching
 - `routes.ts` - Route definitions
@@ -78,7 +78,7 @@ Responsibilities:
 
 ### Framework Command System
 
-Commands flow through: `./cmd` → `framework/cmd.d/*` → `framework/shims/*` → managed binaries in `framework/binaries/`
+Commands flow through: `./cmd` → `diachron/cmd.d/*` → `diachron/shims/*` → managed binaries in `diachron/binaries/`
 
 This ensures consistent tooling versions across the team without system-wide installations.
 
@@ -94,8 +94,8 @@ This ensures consistent tooling versions across the team without system-wide ins
 
 ## Platform Requirements
 
-Linux x86_64 only (currently). Requires:
-- Modern libc for Go binaries
+Linux or macOS on x86_64. Requires:
+- Modern libc for Go binaries (Linux)
 - docker compose (for full stack)
 - fd, shellcheck, shfmt (for development)
 
````
**DIACHRON.md** (168 additions, new file)
@@ -0,0 +1,168 @@

# What is diachron?

diachron is a web framework for TypeScript and Node.js. It uses a Go-based
master process that handles file watching, building, process management, and
request proxying. The application code is TypeScript running on Express.js.

If you're joining a project that uses diachron, this document will orient you.

## Why diachron exists

diachron was built around a few frustrations with mainstream web frameworks:

- **No dev/prod split.** Most frameworks behave differently in development and
  production. diachron doesn't. The master process watches files, rebuilds,
  and manages workers the same way everywhere. There is no `NODE_ENV`.

- **Managed tooling.** Node.js, pnpm, and other tools are downloaded and
  pinned to exact versions inside the project. You don't install them
  system-wide. Everyone on the team runs the same binaries.

- **PostgreSQL, directly.** No ORM, no database abstraction layer. You write
  SQL (via Kysely for type safety) and talk to PostgreSQL. If you need
  MySQL or SQLite support, this is not the framework for you.

- **Debuggability over magic.** Everything is explicit and inspectable.
  Logging and observability are first-class concerns, not afterthoughts.

diachron is inspired by the
[Taking PHP Seriously](https://slack.engineering/taking-php-seriously/) essay
from Slack Engineering. It's designed for small to medium systems (what we
call "Ring 0 and Ring 1") -- not heavy-compliance or banking-scale
applications.

## How it works

When you run `./master`, the following happens:

1. The Go master process starts and watches your TypeScript source files.
2. It builds the backend using `@vercel/ncc`, producing a single bundled JS
   file.
3. It starts one or more Node.js worker processes running your Express app.
4. It proxies HTTP requests from port 8080 to the workers.
5. When you edit a source file, it rebuilds and restarts the workers
   automatically.
6. If a worker crashes, it restarts automatically.

There is no separate "dev server" or "hot module replacement." The master
process is the only way to run the application.

## Project structure

A diachron project looks like this:

```
.
├── DIACHRON.md          # This file (framework overview for newcomers)
├── master/              # Go master process (framework-owned)
├── logger/              # Go logging service (framework-owned)
├── diachron/            # Managed binaries, shims, framework library
│   ├── AGENTS.md        # Guide for AI coding agents
│   ├── binaries/        # Downloaded Node.js, pnpm (gitignored)
│   ├── cmd.d/           # Commands available via ./cmd
│   ├── shims/           # Wrappers that use managed binaries
│   └── ...
├── backend/             # Your application code
│   ├── app.ts           # Entry point
│   ├── routes.ts        # Route definitions
│   ├── handlers.ts      # Route handlers
│   ├── services.ts      # Service layer
│   ├── types.ts         # Application types
│   ├── config.ts        # Application configuration
│   └── diachron/        # Framework library code (framework-owned)
├── cmd                  # Run managed commands (./cmd pnpm install, etc.)
├── develop              # Development-only commands (./develop reset-db, etc.)
├── mgmt                 # Management commands safe for production
├── sync.sh              # Install/update all dependencies
├── master               # The compiled master binary (after sync)
└── docker-compose.yml
```

### File ownership

There are two owners of files in a diachron project:

- **You own** everything in `backend/` (except `backend/diachron/`), plus
  `docker-compose.yml`, `package.json`, and anything else you create.
- **The framework owns** `master/`, `logger/`, `diachron/`,
  `backend/diachron/`, and the top-level scripts (`cmd`, `develop`, `mgmt`,
  `sync.sh`, `check.sh`).

Don't modify framework-owned files. This separation keeps framework upgrades
clean.

## Getting started

```bash
# Install dependencies and build the master process
./sync.sh

# Start the application
./master
```

The app will be available at `http://localhost:8080`.

You need Linux or macOS on x86_64. For the full stack (database, Redis,
etc.), you also need `docker compose`.

## The command system

diachron has three types of commands, separated by intent and safety:

- **`./cmd <command>`** -- Run managed tools (node, pnpm, tsx, etc.). These
  use the project's pinned versions, not whatever is installed on your system.

  ```bash
  ./cmd pnpm install
  ./cmd pnpm test
  ```

- **`./mgmt <command>`** -- Management commands that are safe to run in
  production. Migrations, user management, that sort of thing.

  ```bash
  ./mgmt migrate
  ./mgmt add-user
  ```

- **`./develop <command>`** -- Development commands that may be destructive.
  Database resets, fixture loading, etc. These are gated in production.

  ```bash
  ./develop reset-db
  ./develop db   # Open a database shell
  ```

The rule of thumb: if you'd run it at 3am while tired and worried, it's a
`mgmt` command. If it destroys data on purpose, it's a `develop` command.

## Key concepts

### Call and Result

diachron wraps Express's `Request` and `Response` in its own types called
`Call` and `Result`. This avoids shadowing and keeps the framework's
interface distinct from Express internals. Your handlers receive a `Call`
and return a `Result`.
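To make that concrete, here is a minimal sketch of what a handler can look like. The `Call` and `Result` shapes below are stand-ins inferred from this document and from the not-found `Result` built in `backend/diachron/app.ts` (`code`, `contentType`, `result`); the real types live in `backend/diachron/types.ts` and will differ.

```typescript
// Sketch only: stand-in shapes, not the framework's actual types.
type Call = { path: string };
type Result = {
  code: { code: number }; // HTTP status, as used via httpCodes.*
  contentType: string;    // e.g. contentTypes.text.plain
  result: string;         // response body
};

// A hypothetical handler: takes a Call, returns a Result.
async function hello(call: Call): Promise<Result> {
  return {
    code: { code: 200 },
    contentType: "text/plain",
    result: `hello from ${call.path}`,
  };
}
```

Note how the handler never touches an Express `Response`; the framework middleware is what turns the returned `Result` into `res.status(...).send(...)`.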
### Routes

Routes are defined as data (arrays of `Route` objects in `routes.ts`), not
through decorators or method chaining. The framework processes them into
Express handlers.
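A sketch of the routes-as-data idea. The field names (`pattern`, `handler`) are illustrative assumptions, not the framework's actual `Route` type, but the dispatch loop roughly mirrors the `matcher`/`handler` loop visible in `backend/diachron/app.ts`:

```typescript
// Sketch: a route is plain data, and dispatch is just a loop over matchers.
type Handler = (path: string) => string;
interface ExampleRoute {
  method: "GET" | "POST";
  pattern: RegExp;
  handler: Handler;
}

const exampleRoutes: ExampleRoute[] = [
  { method: "GET", pattern: /^\/$/, handler: () => "home" },
  { method: "GET", pattern: /^\/users\/\d+$/, handler: (p) => `user at ${p}` },
];

// First matching route wins; no match falls through to a not-found response,
// like the framework's "not found!" Result.
function dispatch(method: "GET" | "POST", path: string): string {
  for (const r of exampleRoutes) {
    if (r.method === method && r.pattern.test(path)) {
      return r.handler(path);
    }
  }
  return "not found!";
}
```

Because the table is data, you can inspect, sort, or test it without starting a server.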
### No environment variables for behavior

There is no `NODE_ENV`, no `DEBUG`, no mode switching. Configuration that
must vary between deployments (database URLs, secrets) lives in
configuration files, but the application's behavior doesn't branch on
environment.
## Further reading

- `README.md` -- Project introduction and requirements
- `diachron/AGENTS.md` -- Guide for AI coding agents
- `docs/` -- Design documents and philosophy
- `docs/commands.md` -- Detailed explanation of the command system
- `docs/concentric-circles.md` -- What diachron is (and isn't) designed for
**README.md** (25 changes)
```diff
@@ -2,16 +2,13 @@ diachron
 
 ## Introduction
 
-Is your answer to some of these questions "yes"? If so, you might like
-diachron. (When it comes to that dev/test/prod one, hear us out first, ok?)
-
 - Do you want to share a lot of backend and frontend code?
 
 - Are you tired of your web stack breaking when you blink too hard?
 
 - Have you read [Taking PHP
-  Seriously](https://slack.engineering/taking-php-seriously/) and wish you had
-  something similar for Typescript?
+  Seriously](https://slack.engineering/taking-php-seriously/) and do you wish
+  you had something similar for Typescript?
 
 - Do you think that ORMs are not all that? Do you wish you had first class
   unmediated access to your database? And do you think that database
@@ -35,6 +32,9 @@ diachron. (When it comes to that dev/test/prod one, hear us out first, ok?)
   you're trying to fix? We're talking authentication, authorization, XSS,
   https, nested paths, all that stuff.
 
+Is your answer to some of these questions "yes"? If so, you might like
+diachron. (When it comes to that dev/test/prod one, hear us out first, ok?)
+
 ## Getting started
 
 Different situations require different getting started docs.
@@ -44,12 +44,21 @@ Different situations require different getting started docs.
 
 ## Requirements
 
-To run diachron, you currently need to have a Linux box running x86_64 with a
-new enough libc to run golang binaries. Support for other platforms will come
-eventually.
+To run diachron, you need Linux or macOS on x86_64. Linux requires a new
+enough libc to run golang binaries.
 
 To run a more complete system, you also need to have docker compose installed.
 
+### Database
+
+To connect to the database, you need psql (PostgreSQL client, for
+`./diachron/common.d/db`)
+
+- macOS: `brew install libpq` (and follow the caveat to add it to your PATH),
+  or `brew install postgresql`
+- Debian/Ubuntu: `apt install postgresql-client`
+- Fedora/RHEL: `dnf install postgresql`
+
 ### Development requirements
 
 To hack on diachron itself, you need the following:
```
**TODO.md** (10 changes)
```diff
@@ -1,8 +1,11 @@
 ## high importance
 
+- Many of the UUIDs generated are not UUIDv7. This needs to be fixed.
+
 - [ ] Add unit tests all over the place.
+  - ⚠️ Huge task - needs breakdown before starting
 
 - [ ] map exceptions back to source lines
 
 - [ ] migrations, seeding, fixtures
 
@@ -53,6 +56,13 @@ CREATE TABLE app.customer_metadata (...);
   leaves around `master-bin`, `logger-bin`, and `diachron:nnnn` processes.
   Huge problem.
 
+- [ ] Fix format used by master (and logger?)'s output: it should be logfmt
+  - A lot of other stuff should probably be logfmt too but maybe we can get to
+    that later
+
+- [ ] master rebuilds (or tries to) too many times; need some sort of debounce
+  or whatever it's called
+
 ## medium importance
 
 - [ ] Add a log viewer
```
**express/.gitignore → backend/.gitignore** (renamed, 0 changes, vendored)
**backend/.npmrc** (1 addition, new file)
```diff
@@ -0,0 +1 @@
+shamefully-hoist=true
```
**backend/app.ts** (20 additions, new file)
@@ -0,0 +1,20 @@

```typescript
// This is a sample file provided by diachron. You are encouraged to modify it.

import { makeApp } from "./diachron/app";
import { core } from "./diachron/core";
import { routes } from "./routes";

const app = makeApp({ routes });

core.logging.log({ source: "logging", text: ["1"] });

app.start();
```
```diff
@@ -6,4 +6,4 @@ DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 
 cd "$DIR"
 
-../cmd pnpm ncc build ./app.ts -o dist
+../cmd pnpm ncc build ./app.ts -o dist --source-map
```
**backend/check-deps.ts** (66 additions, new file)
@@ -0,0 +1,66 @@

```typescript
import { readFileSync } from "node:fs";
import { dirname, join } from "node:path";
import { fileURLToPath } from "node:url";

const __dirname = dirname(fileURLToPath(import.meta.url));

interface PackageJson {
  dependencies?: Record<string, string>;
  devDependencies?: Record<string, string>;
}

function readPackageJson(path: string): PackageJson {
  const content = readFileSync(path, "utf-8");
  return JSON.parse(content);
}

function getAllDependencyNames(pkg: PackageJson): Set<string> {
  const names = new Set<string>();
  for (const name of Object.keys(pkg.dependencies ?? {})) {
    names.add(name);
  }
  for (const name of Object.keys(pkg.devDependencies ?? {})) {
    names.add(name);
  }
  return names;
}

const diachronPkgPath = join(__dirname, "diachron", "package.json");
const backendPkgPath = join(__dirname, "package.json");

const diachronPkg = readPackageJson(diachronPkgPath);
const backendPkg = readPackageJson(backendPkgPath);

const diachronDeps = getAllDependencyNames(diachronPkg);
const backendDeps = getAllDependencyNames(backendPkg);

const duplicates: string[] = [];

for (const dep of diachronDeps) {
  if (backendDeps.has(dep)) {
    duplicates.push(dep);
  }
}

if (duplicates.length > 0) {
  console.error("Error: Duplicate dependencies found.");
  console.error("");
  console.error(
    "The following dependencies exist in both backend/package.json and backend/diachron/package.json:",
  );
  console.error("");
  for (const dep of duplicates.sort()) {
    console.error(`  - ${dep}`);
  }
  console.error("");
  console.error(
    "Dependencies in backend/diachron/package.json are provided by the framework",
  );
  console.error(
    "and must not be duplicated in backend/package.json. Remove them from",
  );
  console.error("backend/package.json to fix this error.");
  process.exit(1);
}

console.log("No duplicate dependencies found.");
```
```diff
@@ -8,7 +8,8 @@ check_dir="$DIR"
 
 out_dir="$check_dir/out"
 
-source "$check_dir"/../framework/shims/common
-source "$check_dir"/../framework/shims/node.common
+source "$check_dir"/../diachron/shims/common
+source "$check_dir"/../diachron/shims/node.common
 
+$ROOT/cmd tsx "$check_dir/check-deps.ts"
 $ROOT/cmd pnpm tsc --outDir "$out_dir"
```
**backend/diachron/app.ts** (137 additions, new file)
@@ -0,0 +1,137 @@

```typescript
// FIXME: rename this to make-app.ts and adjust imports accordingly

import express, {
  type Express,
  type NextFunction,
  type Request as ExpressRequest,
  type Response as ExpressResponse,
} from "express";
import { cli } from "./cli";
import { contentTypes } from "./content-types";
import { formatError, formatErrorHtml } from "./errors";
import { httpCodes } from "./http-codes";
import { processRoutes } from "./routing";
import {
  AuthenticationRequired,
  AuthorizationDenied,
  Call,
  InternalHandler,
  isRedirect,
  massageMethod,
  type Method,
  methodParser,
  type ProcessedRoute,
  type Result,
  type Route,
} from "./types";

process.on("uncaughtException", (err) => {
  console.error(formatError(err));
  process.exit(1);
});

process.on("unhandledRejection", (reason) => {
  console.error(formatError(reason));
});

type MakeAppArgs = {
  routes: Route[];
  processTitle?: string;
};

export interface DiachronApp extends Express {
  start: () => void;
}

const makeApp = ({ routes, processTitle }: MakeAppArgs) => {
  // Note: processTitle is accepted but currently unused; the guard below
  // checks process.title (almost always truthy) and the title is always
  // derived from the listen port. This may be intended to read processTitle.
  if (process.title) {
    process.title = `diachron:${cli.listen.port}`;
  }

  const processedRoutes = processRoutes(routes);

  async function handler(
    req: ExpressRequest,
    _res: ExpressResponse,
  ): Promise<Result> {
    const method = await methodParser.parseAsync(req.method);

    const byMethod = processedRoutes[method];
    console.log(
      "DEBUG: req.path =",
      JSON.stringify(req.path),
      "method =",
      method,
    );
    for (const [_idx, pr] of byMethod.entries()) {
      const match = pr.matcher(req.path);
      console.log("DEBUG: trying pattern, match result =", match);
      if (match) {
        console.log("match", match);
        const resp = await pr.handler(req);

        return resp;
      }
    }

    const retval: Result = {
      code: httpCodes.clientErrors.NotFound,
      contentType: contentTypes.text.plain,
      result: "not found!",
    };

    return retval;
  }

  // I don't like going around tsc but this is simple enough that it's probably OK.
  const app = express() as DiachronApp;
  app.start = function () {
    this.listen(cli.listen.port, cli.listen.host, () => {
      console.log(`Listening on ${cli.listen.host}:${cli.listen.port}`);
    });
  };

  app.use(express.json());
  app.use(express.urlencoded({ extended: true }));

  app.use(async (req: ExpressRequest, res: ExpressResponse) => {
    const result0 = await handler(req, res);

    const code = result0.code.code;
    const result = result0.result;

    console.log(result);

    // Set any cookies from the result
    if (result0.cookies) {
      for (const cookie of result0.cookies) {
        res.cookie(cookie.name, cookie.value, cookie.options ?? {});
      }
    }

    if (isRedirect(result0)) {
      res.redirect(code, result0.redirect);
    } else {
      res.status(code).send(result);
    }
  });

  app.use(
    (
      err: Error,
      _req: ExpressRequest,
      res: ExpressResponse,
      _next: NextFunction,
    ) => {
      console.error(formatError(err));
      res.status(500).type("html").send(formatErrorHtml(err));
    },
  );

  return app;
};

export { makeApp };

function _isPromise<T>(value: T | Promise<T>): value is Promise<T> {
  return typeof (value as any)?.then === "function";
}
```
**backend/diachron/auth/password.spec.ts** (80 additions, new file)
@@ -0,0 +1,80 @@

```typescript
// Tests for auth/password.ts
// Pure unit tests - no database needed

import assert from "node:assert/strict";
import { describe, it } from "node:test";
import { hashPassword, verifyPassword } from "./password";

describe("password", () => {
  describe("hashPassword", () => {
    it("returns a scrypt formatted hash", async () => {
      const hash = await hashPassword("testpassword");
      assert.ok(hash.startsWith("$scrypt$"));
    });

    it("includes all scrypt parameters", async () => {
      const hash = await hashPassword("testpassword");
      const parts = hash.split("$");
      // Format: $scrypt$N$r$p$salt$hash
      assert.equal(parts.length, 7);
      assert.equal(parts[0], "");
      assert.equal(parts[1], "scrypt");
      // N, r, p should be numbers
      assert.ok(!Number.isNaN(parseInt(parts[2], 10)));
      assert.ok(!Number.isNaN(parseInt(parts[3], 10)));
      assert.ok(!Number.isNaN(parseInt(parts[4], 10)));
    });

    it("generates different hashes for same password (different salt)", async () => {
      const hash1 = await hashPassword("testpassword");
      const hash2 = await hashPassword("testpassword");
      assert.notEqual(hash1, hash2);
    });
  });

  describe("verifyPassword", () => {
    it("returns true for correct password", async () => {
      const hash = await hashPassword("correctpassword");
      const result = await verifyPassword("correctpassword", hash);
      assert.equal(result, true);
    });

    it("returns false for incorrect password", async () => {
      const hash = await hashPassword("correctpassword");
      const result = await verifyPassword("wrongpassword", hash);
      assert.equal(result, false);
    });

    it("throws for invalid hash format", async () => {
      await assert.rejects(
        verifyPassword("password", "invalid-hash"),
        /Invalid password hash format/,
      );
    });

    it("throws for non-scrypt hash", async () => {
      await assert.rejects(
        verifyPassword("password", "$bcrypt$10$salt$hash"),
        /Invalid password hash format/,
      );
    });

    it("works with empty password", async () => {
      const hash = await hashPassword("");
      const result = await verifyPassword("", hash);
      assert.equal(result, true);
    });

    it("works with unicode password", async () => {
      const hash = await hashPassword("p@$$w0rd\u{1F511}");
      const result = await verifyPassword("p@$$w0rd\u{1F511}", hash);
      assert.equal(result, true);
    });

    it("is case sensitive", async () => {
      const hash = await hashPassword("Password");
      const result = await verifyPassword("password", hash);
      assert.equal(result, false);
    });
  });
});
```
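For orientation, here is a sketch of what a `hashPassword`/`verifyPassword` pair matching these tests could look like, built on Node's `crypto.scrypt`. The cost parameters, key length, and base64 encoding are assumptions; the tests only pin down the `$scrypt$N$r$p$salt$hash` format and the error message, so the framework's real `auth/password.ts` may differ.

```typescript
// Sketch, not the framework's implementation: parameter values are assumed.
import { randomBytes, scrypt as scryptCb, timingSafeEqual } from "node:crypto";
import { promisify } from "node:util";

const scrypt = promisify(scryptCb) as (
  password: string,
  salt: Buffer,
  keylen: number,
  options: { N: number; r: number; p: number },
) => Promise<Buffer>;

// Assumed cost parameters (within Node's default maxmem of 32 MiB).
const N = 16384;
const r = 8;
const p = 1;
const KEYLEN = 32;

export async function hashPassword(password: string): Promise<string> {
  const salt = randomBytes(16);
  const hash = await scrypt(password, salt, KEYLEN, { N, r, p });
  // Base64 never contains "$", so "$" is safe as a field separator.
  return `$scrypt$${N}$${r}$${p}$${salt.toString("base64")}$${hash.toString("base64")}`;
}

export async function verifyPassword(password: string, stored: string): Promise<boolean> {
  const parts = stored.split("$");
  if (parts.length !== 7 || parts[1] !== "scrypt") {
    throw new Error("Invalid password hash format");
  }
  const [, , nStr, rStr, pStr, saltB64, hashB64] = parts;
  const salt = Buffer.from(saltB64, "base64");
  const expected = Buffer.from(hashB64, "base64");
  const actual = await scrypt(password, salt, expected.length, {
    N: parseInt(nStr, 10),
    r: parseInt(rStr, 10),
    p: parseInt(pStr, 10),
  });
  // Constant-time comparison to avoid leaking how many bytes matched.
  return timingSafeEqual(actual, expected);
}
```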
**backend/diachron/auth/service.spec.ts** (419 additions, new file, truncated below)
@@ -0,0 +1,419 @@

```typescript
// Tests for auth/service.ts
// Uses InMemoryAuthStore - no database needed

import assert from "node:assert/strict";
import { beforeEach, describe, it } from "node:test";
import { AuthService } from "./service";
import { InMemoryAuthStore } from "./store";

describe("AuthService", () => {
  let store: InMemoryAuthStore;
  let service: AuthService;

  beforeEach(() => {
    store = new InMemoryAuthStore();
    service = new AuthService(store);
  });

  describe("register", () => {
    it("creates a new user", async () => {
      const result = await service.register(
        "test@example.com",
        "password123",
        "Test User",
      );

      assert.equal(result.success, true);
      if (result.success) {
        assert.equal(result.user.email, "test@example.com");
        assert.equal(result.user.displayName, "Test User");
        assert.ok(result.verificationToken.length > 0);
      }
    });

    it("fails when email already registered", async () => {
      await service.register("test@example.com", "password123");
      const result = await service.register(
        "test@example.com",
        "password456",
      );

      assert.equal(result.success, false);
      if (!result.success) {
        assert.equal(result.error, "Email already registered");
      }
    });

    it("creates user without displayName", async () => {
      const result = await service.register(
        "test@example.com",
        "password123",
      );

      assert.equal(result.success, true);
      if (result.success) {
        assert.equal(result.user.displayName, undefined);
      }
    });
  });

  describe("login", () => {
    beforeEach(async () => {
      // Create and verify a user
      const result = await service.register(
        "test@example.com",
        "password123",
        "Test User",
      );
      if (result.success) {
        // Verify email to activate user
        await service.verifyEmail(result.verificationToken);
      }
    });

    it("succeeds with correct credentials", async () => {
      const result = await service.login(
        "test@example.com",
        "password123",
        "cookie",
      );

      assert.equal(result.success, true);
      if (result.success) {
        assert.ok(result.token.length > 0);
        assert.equal(result.user.email, "test@example.com");
      }
    });

    it("fails with wrong password", async () => {
      const result = await service.login(
        "test@example.com",
        "wrongpassword",
        "cookie",
      );

      assert.equal(result.success, false);
      if (!result.success) {
        assert.equal(result.error, "Invalid credentials");
      }
    });

    it("fails with unknown email", async () => {
      const result = await service.login(
        "unknown@example.com",
        "password123",
        "cookie",
      );

      assert.equal(result.success, false);
      if (!result.success) {
        assert.equal(result.error, "Invalid credentials");
      }
    });

    it("fails for inactive user", async () => {
      // Create a user but don't verify email (stays pending)
      await service.register("pending@example.com", "password123");

      const result = await service.login(
        "pending@example.com",
        "password123",
        "cookie",
      );

      assert.equal(result.success, false);
      if (!result.success) {
        assert.equal(result.error, "Account is not active");
      }
    });

    it("stores metadata", async () => {
      const result = await service.login(
        "test@example.com",
        "password123",
        "cookie",
        { userAgent: "TestAgent", ipAddress: "192.168.1.1" },
      );

      assert.equal(result.success, true);
    });
  });

  describe("validateToken", () => {
    let token: string;

    beforeEach(async () => {
      const regResult = await service.register(
        "test@example.com",
        "password123",
      );
      if (regResult.success) {
        await service.verifyEmail(regResult.verificationToken);
      }

      const loginResult = await service.login(
        "test@example.com",
        "password123",
        "cookie",
      );
      if (loginResult.success) {
        token = loginResult.token;
      }
    });

    it("returns authenticated for valid token", async () => {
      const result = await service.validateToken(token);

      assert.equal(result.authenticated, true);
      if (result.authenticated) {
        assert.equal(result.user.email, "test@example.com");
        assert.notEqual(result.session, null);
      }
    });

    it("returns unauthenticated for invalid token", async () => {
      const result = await service.validateToken("invalid-token");

      assert.equal(result.authenticated, false);
      assert.equal(result.user.isAnonymous(), true);
      assert.equal(result.session, null);
    });
  });

  describe("logout", () => {
    it("invalidates the session", async () => {
      const regResult = await service.register(
        "test@example.com",
        "password123",
      );
      if (regResult.success) {
        await service.verifyEmail(regResult.verificationToken);
      }

      const loginResult = await service.login(
        "test@example.com",
        "password123",
        "cookie",
      );
      assert.equal(loginResult.success, true);
      if (!loginResult.success) return;

      const token = loginResult.token;

      // Token should be valid before logout
      const beforeLogout = await service.validateToken(token);
      assert.equal(beforeLogout.authenticated, true);

      // Logout
      await service.logout(token);

      // Token should be invalid after logout
      const afterLogout = await service.validateToken(token);
      assert.equal(afterLogout.authenticated, false);
    });
  });

  describe("logoutAllSessions", () => {
    it("invalidates all user sessions", async () => {
      const regResult = await service.register(
        "test@example.com",
        "password123",
      );
      if (regResult.success) {
        await service.verifyEmail(regResult.verificationToken);
      }

      // Create multiple sessions
      const login1 = await service.login(
        "test@example.com",
        "password123",
        "cookie",
      );
      const login2 = await service.login(
        "test@example.com",
        "password123",
        "bearer",
      );

      assert.equal(login1.success, true);
      assert.equal(login2.success, true);
      if (!login1.success || !login2.success) return;

      // Both should be valid
      const before1 = await service.validateToken(login1.token);
      const before2 = await service.validateToken(login2.token);
      assert.equal(before1.authenticated, true);
      assert.equal(before2.authenticated, true);

      // Logout all
      const user = await store.getUserByEmail("test@example.com");
      const count = await service.logoutAllSessions(user!.id);
      assert.equal(count, 2);

      // Both should be invalid
      const after1 = await service.validateToken(login1.token);
      const after2 = await service.validateToken(login2.token);
      assert.equal(after1.authenticated, false);
      assert.equal(after2.authenticated, false);
    });
  });

  describe("verifyEmail", () => {
    it("activates user with valid token", async () => {
      const regResult = await service.register(
        "test@example.com",
        "password123",
      );
      assert.equal(regResult.success, true);
      if (!regResult.success) return;

      const result = await service.verifyEmail(
        regResult.verificationToken,
      );
      assert.equal(result.success, true);

      // User should now be active and can login
      const loginResult = await service.login(
        "test@example.com",
        "password123",
        "cookie",
      );
      assert.equal(loginResult.success, true);
    });

    it("fails with invalid token", async () => {
      const result = await service.verifyEmail("invalid-token");

      assert.equal(result.success, false);
      if (!result.success) {
        assert.equal(
          result.error,
          "Invalid or expired verification token",
        );
      }
    });

    it("fails when token already used", async () => {
      const regResult = await service.register(
        "test@example.com",
        "password123",
      );
      assert.equal(regResult.success, true);
      if (!regResult.success) return;

      // First verification succeeds
      const result1 = await service.verifyEmail(
        regResult.verificationToken,
      );
      assert.equal(result1.success, true);

      // Second verification fails (token deleted)
      const result2 = await service.verifyEmail(
        regResult.verificationToken,
      );
      assert.equal(result2.success, false);
    });
  });

  describe("createPasswordResetToken", () => {
    it("returns token for existing user", async () => {
      const regResult = await service.register(
```
|
||||
"test@example.com",
|
||||
"password123",
|
||||
);
|
||||
assert.equal(regResult.success, true);
|
||||
|
||||
const result =
|
||||
await service.createPasswordResetToken("test@example.com");
|
||||
assert.notEqual(result, null);
|
||||
assert.ok(result!.token.length > 0);
|
||||
});
|
||||
|
||||
it("returns null for unknown email", async () => {
|
||||
const result = await service.createPasswordResetToken(
|
||||
"unknown@example.com",
|
||||
);
|
||||
assert.equal(result, null);
|
||||
});
|
||||
});
|
||||
|
||||
describe("resetPassword", () => {
|
||||
it("changes password with valid token", async () => {
|
||||
const regResult = await service.register(
|
||||
"test@example.com",
|
||||
"oldpassword",
|
||||
);
|
||||
if (regResult.success) {
|
||||
await service.verifyEmail(regResult.verificationToken);
|
||||
}
|
||||
|
||||
const resetToken =
|
||||
await service.createPasswordResetToken("test@example.com");
|
||||
assert.notEqual(resetToken, null);
|
||||
|
||||
const result = await service.resetPassword(
|
||||
resetToken!.token,
|
||||
"newpassword",
|
||||
);
|
||||
assert.equal(result.success, true);
|
||||
|
||||
// Old password should no longer work
|
||||
const loginOld = await service.login(
|
||||
"test@example.com",
|
||||
"oldpassword",
|
||||
"cookie",
|
||||
);
|
||||
assert.equal(loginOld.success, false);
|
||||
|
||||
// New password should work
|
||||
const loginNew = await service.login(
|
||||
"test@example.com",
|
||||
"newpassword",
|
||||
"cookie",
|
||||
);
|
||||
assert.equal(loginNew.success, true);
|
||||
});
|
||||
|
||||
it("fails with invalid token", async () => {
|
||||
const result = await service.resetPassword(
|
||||
"invalid-token",
|
||||
"newpassword",
|
||||
);
|
||||
|
||||
assert.equal(result.success, false);
|
||||
if (!result.success) {
|
||||
assert.equal(result.error, "Invalid or expired reset token");
|
||||
}
|
||||
});
|
||||
|
||||
it("invalidates all existing sessions", async () => {
|
||||
const regResult = await service.register(
|
||||
"test@example.com",
|
||||
"password123",
|
||||
);
|
||||
if (regResult.success) {
|
||||
await service.verifyEmail(regResult.verificationToken);
|
||||
}
|
||||
|
||||
// Create a session
|
||||
const loginResult = await service.login(
|
||||
"test@example.com",
|
||||
"password123",
|
||||
"cookie",
|
||||
);
|
||||
assert.equal(loginResult.success, true);
|
||||
if (!loginResult.success) return;
|
||||
|
||||
const sessionToken = loginResult.token;
|
||||
|
||||
// Reset password
|
||||
const resetToken =
|
||||
await service.createPasswordResetToken("test@example.com");
|
||||
await service.resetPassword(resetToken!.token, "newpassword");
|
||||
|
||||
// Old session should be invalid
|
||||
const validateResult = await service.validateToken(sessionToken);
|
||||
assert.equal(validateResult.authenticated, false);
|
||||
});
|
||||
});
|
||||
});
|
||||
321
backend/diachron/auth/store.spec.ts
Normal file
@@ -0,0 +1,321 @@
// Tests for auth/store.ts (InMemoryAuthStore)
// Pure unit tests - no database needed

import assert from "node:assert/strict";
import { after, before, beforeEach, describe, it } from "node:test";
import type { UserId } from "../user";
import { InMemoryAuthStore } from "./store";
import { hashToken } from "./token";
import type { TokenId } from "./types";

describe("InMemoryAuthStore", () => {
  let store: InMemoryAuthStore;

  beforeEach(() => {
    store = new InMemoryAuthStore();
  });

  describe("createUser", () => {
    it("creates a user with pending status", async () => {
      const user = await store.createUser({
        email: "test@example.com",
        passwordHash: "hash123",
        displayName: "Test User",
      });

      assert.equal(user.email, "test@example.com");
      assert.equal(user.displayName, "Test User");
      assert.equal(user.status, "pending");
    });

    it("creates a user without displayName", async () => {
      const user = await store.createUser({
        email: "test@example.com",
        passwordHash: "hash123",
      });

      assert.equal(user.email, "test@example.com");
      assert.equal(user.displayName, undefined);
    });

    it("generates a unique id", async () => {
      const user1 = await store.createUser({
        email: "test1@example.com",
        passwordHash: "hash123",
      });
      const user2 = await store.createUser({
        email: "test2@example.com",
        passwordHash: "hash456",
      });

      assert.notEqual(user1.id, user2.id);
    });
  });

  describe("getUserByEmail", () => {
    it("returns user when found", async () => {
      await store.createUser({
        email: "test@example.com",
        passwordHash: "hash123",
      });

      const user = await store.getUserByEmail("test@example.com");
      assert.notEqual(user, null);
      assert.equal(user!.email, "test@example.com");
    });

    it("is case-insensitive", async () => {
      await store.createUser({
        email: "Test@Example.COM",
        passwordHash: "hash123",
      });

      const user = await store.getUserByEmail("test@example.com");
      assert.notEqual(user, null);
    });

    it("returns null when not found", async () => {
      const user = await store.getUserByEmail("notfound@example.com");
      assert.equal(user, null);
    });
  });

  describe("getUserById", () => {
    it("returns user when found", async () => {
      const created = await store.createUser({
        email: "test@example.com",
        passwordHash: "hash123",
      });

      const user = await store.getUserById(created.id);
      assert.notEqual(user, null);
      assert.equal(user!.id, created.id);
    });

    it("returns null when not found", async () => {
      const user = await store.getUserById("nonexistent" as UserId);
      assert.equal(user, null);
    });
  });

  describe("getUserPasswordHash", () => {
    it("returns hash when found", async () => {
      const user = await store.createUser({
        email: "test@example.com",
        passwordHash: "hash123",
      });

      const hash = await store.getUserPasswordHash(user.id);
      assert.equal(hash, "hash123");
    });

    it("returns null when not found", async () => {
      const hash = await store.getUserPasswordHash(
        "nonexistent" as UserId,
      );
      assert.equal(hash, null);
    });
  });

  describe("setUserPassword", () => {
    it("updates password hash", async () => {
      const user = await store.createUser({
        email: "test@example.com",
        passwordHash: "oldhash",
      });

      await store.setUserPassword(user.id, "newhash");

      const hash = await store.getUserPasswordHash(user.id);
      assert.equal(hash, "newhash");
    });
  });

  describe("updateUserEmailVerified", () => {
    it("sets user status to active", async () => {
      const created = await store.createUser({
        email: "test@example.com",
        passwordHash: "hash123",
      });
      assert.equal(created.status, "pending");

      await store.updateUserEmailVerified(created.id);

      const user = await store.getUserById(created.id);
      assert.equal(user!.status, "active");
    });
  });

  describe("createSession", () => {
    it("creates a session with token", async () => {
      const user = await store.createUser({
        email: "test@example.com",
        passwordHash: "hash123",
      });

      const { token, session } = await store.createSession({
        userId: user.id,
        tokenType: "session",
        authMethod: "cookie",
        expiresAt: new Date(Date.now() + 3600000),
      });

      assert.ok(token.length > 0);
      assert.equal(session.userId, user.id);
      assert.equal(session.tokenType, "session");
      assert.equal(session.authMethod, "cookie");
    });

    it("stores metadata", async () => {
      const user = await store.createUser({
        email: "test@example.com",
        passwordHash: "hash123",
      });

      const { session } = await store.createSession({
        userId: user.id,
        tokenType: "session",
        authMethod: "cookie",
        expiresAt: new Date(Date.now() + 3600000),
        userAgent: "Mozilla/5.0",
        ipAddress: "127.0.0.1",
      });

      assert.equal(session.userAgent, "Mozilla/5.0");
      assert.equal(session.ipAddress, "127.0.0.1");
    });
  });

  describe("getSession", () => {
    it("returns session when found and not expired", async () => {
      const user = await store.createUser({
        email: "test@example.com",
        passwordHash: "hash123",
      });

      const { token } = await store.createSession({
        userId: user.id,
        tokenType: "session",
        authMethod: "cookie",
        expiresAt: new Date(Date.now() + 3600000), // 1 hour from now
      });

      const tokenId = hashToken(token) as TokenId;
      const session = await store.getSession(tokenId);
      assert.notEqual(session, null);
      assert.equal(session!.userId, user.id);
    });

    it("returns null for expired session", async () => {
      const user = await store.createUser({
        email: "test@example.com",
        passwordHash: "hash123",
      });

      const { token } = await store.createSession({
        userId: user.id,
        tokenType: "session",
        authMethod: "cookie",
        expiresAt: new Date(Date.now() - 1000), // Expired 1 second ago
      });

      const tokenId = hashToken(token) as TokenId;
      const session = await store.getSession(tokenId);
      assert.equal(session, null);
    });

    it("returns null for nonexistent session", async () => {
      const session = await store.getSession("nonexistent" as TokenId);
      assert.equal(session, null);
    });
  });

  describe("deleteSession", () => {
    it("removes the session", async () => {
      const user = await store.createUser({
        email: "test@example.com",
        passwordHash: "hash123",
      });

      const { token } = await store.createSession({
        userId: user.id,
        tokenType: "session",
        authMethod: "cookie",
        expiresAt: new Date(Date.now() + 3600000),
      });

      const tokenId = hashToken(token) as TokenId;
      await store.deleteSession(tokenId);

      const session = await store.getSession(tokenId);
      assert.equal(session, null);
    });
  });

  describe("deleteUserSessions", () => {
    it("removes all sessions for user", async () => {
      const user = await store.createUser({
        email: "test@example.com",
        passwordHash: "hash123",
      });

      const { token: token1 } = await store.createSession({
        userId: user.id,
        tokenType: "session",
        authMethod: "cookie",
        expiresAt: new Date(Date.now() + 3600000),
      });

      const { token: token2 } = await store.createSession({
        userId: user.id,
        tokenType: "session",
        authMethod: "bearer",
        expiresAt: new Date(Date.now() + 3600000),
      });

      const count = await store.deleteUserSessions(user.id);
      assert.equal(count, 2);

      const session1 = await store.getSession(
        hashToken(token1) as TokenId,
      );
      const session2 = await store.getSession(
        hashToken(token2) as TokenId,
      );
      assert.equal(session1, null);
      assert.equal(session2, null);
    });

    it("returns 0 when user has no sessions", async () => {
      const count = await store.deleteUserSessions(
        "nonexistent" as UserId,
      );
      assert.equal(count, 0);
    });
  });

  describe("updateLastUsed", () => {
    it("updates lastUsedAt timestamp", async () => {
      const user = await store.createUser({
        email: "test@example.com",
        passwordHash: "hash123",
      });

      const { token } = await store.createSession({
        userId: user.id,
        tokenType: "session",
        authMethod: "cookie",
        expiresAt: new Date(Date.now() + 3600000),
      });

      const tokenId = hashToken(token) as TokenId;
      const beforeUpdate = await store.getSession(tokenId);
      assert.equal(beforeUpdate!.lastUsedAt, undefined);

      await store.updateLastUsed(tokenId);

      const afterUpdate = await store.getSession(tokenId);
      assert.ok(afterUpdate!.lastUsedAt instanceof Date);
    });
  });
});
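The expiry behaviour these `getSession` tests pin down can be sketched in isolation. The following is a hypothetical miniature (`TinySessionStore` is not a class in this repo), assuming only that lookups treat expired entries as absent:

```typescript
// Hypothetical sketch of an in-memory session map whose lookups hide
// expired entries, matching the behaviour the tests above assert.
type StoredSession = { userId: string; expiresAt: Date };

class TinySessionStore {
  private sessions = new Map<string, StoredSession>();

  set(tokenId: string, session: StoredSession): void {
    this.sessions.set(tokenId, session);
  }

  // Returns null for missing OR expired entries, so callers never
  // observe a session past its expiresAt.
  get(tokenId: string, now: Date = new Date()): StoredSession | null {
    const session = this.sessions.get(tokenId);
    if (!session) return null;
    if (session.expiresAt.getTime() <= now.getTime()) return null;
    return session;
  }
}
```

Filtering at read time (rather than sweeping expired rows eagerly) is the simplest design that satisfies the "returns null for expired session" test.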
94
backend/diachron/auth/token.spec.ts
Normal file
@@ -0,0 +1,94 @@
// Tests for auth/token.ts
// Pure unit tests - no database needed

import assert from "node:assert/strict";
import { describe, it } from "node:test";
import {
  generateToken,
  hashToken,
  parseAuthorizationHeader,
  SESSION_COOKIE_NAME,
} from "./token";

describe("token", () => {
  describe("generateToken", () => {
    it("generates a non-empty string", () => {
      const token = generateToken();
      assert.equal(typeof token, "string");
      assert.ok(token.length > 0);
    });

    it("generates unique tokens", () => {
      const tokens = new Set<string>();
      for (let i = 0; i < 100; i++) {
        tokens.add(generateToken());
      }
      assert.equal(tokens.size, 100);
    });

    it("generates base64url encoded tokens", () => {
      const token = generateToken();
      // base64url uses A-Z, a-z, 0-9, -, _
      assert.match(token, /^[A-Za-z0-9_-]+$/);
    });
  });

  describe("hashToken", () => {
    it("returns a hex string", () => {
      const hash = hashToken("test-token");
      assert.match(hash, /^[a-f0-9]+$/);
    });

    it("returns consistent hash for same input", () => {
      const hash1 = hashToken("test-token");
      const hash2 = hashToken("test-token");
      assert.equal(hash1, hash2);
    });

    it("returns different hash for different input", () => {
      const hash1 = hashToken("token-1");
      const hash2 = hashToken("token-2");
      assert.notEqual(hash1, hash2);
    });

    it("returns 64 character hash (SHA-256)", () => {
      const hash = hashToken("test-token");
      assert.equal(hash.length, 64);
    });
  });

  describe("parseAuthorizationHeader", () => {
    it("returns null for undefined header", () => {
      assert.equal(parseAuthorizationHeader(undefined), null);
    });

    it("returns null for empty string", () => {
      assert.equal(parseAuthorizationHeader(""), null);
    });

    it("returns null for non-bearer auth", () => {
      assert.equal(parseAuthorizationHeader("Basic abc123"), null);
    });

    it("returns null for malformed header", () => {
      assert.equal(parseAuthorizationHeader("Bearer"), null);
      assert.equal(parseAuthorizationHeader("Bearer token extra"), null);
    });

    it("extracts token from valid bearer header", () => {
      assert.equal(parseAuthorizationHeader("Bearer abc123"), "abc123");
    });

    it("is case-insensitive for Bearer keyword", () => {
      assert.equal(parseAuthorizationHeader("bearer abc123"), "abc123");
      assert.equal(parseAuthorizationHeader("BEARER abc123"), "abc123");
    });
  });

  describe("SESSION_COOKIE_NAME", () => {
    it("is defined", () => {
      assert.equal(typeof SESSION_COOKIE_NAME, "string");
      assert.ok(SESSION_COOKIE_NAME.length > 0);
    });
  });
});
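Taken together, these tests tightly constrain the token helpers: random base64url generation, SHA-256 hex hashing, and case-insensitive Bearer parsing. A minimal sketch consistent with the tests, assuming Node's `node:crypto` primitives (this is not the actual `token.ts`):

```typescript
import { createHash, randomBytes } from "node:crypto";

// 32 random bytes, base64url-encoded (alphabet A-Z a-z 0-9 - _, no padding).
function generateToken(): string {
  return randomBytes(32).toString("base64url");
}

// SHA-256 of the token as lowercase hex: always 64 characters.
function hashToken(token: string): string {
  return createHash("sha256").update(token).digest("hex");
}

// Extracts the credentials from a "Bearer <token>" header.
// Case-insensitive on the scheme; rejects empty, non-Bearer,
// and malformed (wrong part count) headers by returning null.
function parseAuthorizationHeader(header: string | undefined): string | null {
  if (!header) return null;
  const parts = header.split(" ");
  if (parts.length !== 2 || parts[0].toLowerCase() !== "bearer") return null;
  return parts[1];
}
```

Storing only `hashToken(token)` server-side while handing the raw token to the client is the usual reason for this split: a leaked session table then reveals no usable credentials.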
253
backend/diachron/auth/types.spec.ts
Normal file
@@ -0,0 +1,253 @@
// Tests for auth/types.ts
// Pure unit tests - no database needed

import assert from "node:assert/strict";
import { describe, it } from "node:test";
import { z } from "zod";
import { AuthenticatedUser, anonymousUser } from "../user";
import {
  authMethodParser,
  forgotPasswordInputParser,
  loginInputParser,
  registerInputParser,
  resetPasswordInputParser,
  Session,
  sessionDataParser,
  tokenLifetimes,
  tokenTypeParser,
} from "./types";

describe("auth/types", () => {
  describe("tokenTypeParser", () => {
    it("accepts valid token types", () => {
      assert.equal(tokenTypeParser.parse("session"), "session");
      assert.equal(
        tokenTypeParser.parse("password_reset"),
        "password_reset",
      );
      assert.equal(tokenTypeParser.parse("email_verify"), "email_verify");
    });

    it("rejects invalid token types", () => {
      assert.throws(() => tokenTypeParser.parse("invalid"));
    });
  });

  describe("authMethodParser", () => {
    it("accepts valid auth methods", () => {
      assert.equal(authMethodParser.parse("cookie"), "cookie");
      assert.equal(authMethodParser.parse("bearer"), "bearer");
    });

    it("rejects invalid auth methods", () => {
      assert.throws(() => authMethodParser.parse("basic"));
    });
  });

  describe("sessionDataParser", () => {
    it("accepts valid session data", () => {
      const data = {
        tokenId: "abc123",
        userId: "user-1",
        tokenType: "session",
        authMethod: "cookie",
        createdAt: new Date(),
        expiresAt: new Date(),
      };
      const result = sessionDataParser.parse(data);
      assert.equal(result.tokenId, "abc123");
      assert.equal(result.userId, "user-1");
    });

    it("coerces date strings to dates", () => {
      const data = {
        tokenId: "abc123",
        userId: "user-1",
        tokenType: "session",
        authMethod: "cookie",
        createdAt: "2025-01-01T00:00:00Z",
        expiresAt: "2025-01-02T00:00:00Z",
      };
      const result = sessionDataParser.parse(data);
      assert.ok(result.createdAt instanceof Date);
      assert.ok(result.expiresAt instanceof Date);
    });

    it("accepts optional fields", () => {
      const data = {
        tokenId: "abc123",
        userId: "user-1",
        tokenType: "session",
        authMethod: "cookie",
        createdAt: new Date(),
        expiresAt: new Date(),
        lastUsedAt: new Date(),
        userAgent: "Mozilla/5.0",
        ipAddress: "127.0.0.1",
        isUsed: true,
      };
      const result = sessionDataParser.parse(data);
      assert.equal(result.userAgent, "Mozilla/5.0");
      assert.equal(result.ipAddress, "127.0.0.1");
      assert.equal(result.isUsed, true);
    });
  });

  describe("loginInputParser", () => {
    it("accepts valid login input", () => {
      const result = loginInputParser.parse({
        email: "test@example.com",
        password: "secret",
      });
      assert.equal(result.email, "test@example.com");
      assert.equal(result.password, "secret");
    });

    it("rejects invalid email", () => {
      assert.throws(() =>
        loginInputParser.parse({
          email: "not-an-email",
          password: "secret",
        }),
      );
    });

    it("rejects empty password", () => {
      assert.throws(() =>
        loginInputParser.parse({
          email: "test@example.com",
          password: "",
        }),
      );
    });
  });

  describe("registerInputParser", () => {
    it("accepts valid registration input", () => {
      const result = registerInputParser.parse({
        email: "test@example.com",
        password: "password123",
        displayName: "Test User",
      });
      assert.equal(result.email, "test@example.com");
      assert.equal(result.password, "password123");
      assert.equal(result.displayName, "Test User");
    });

    it("accepts registration without displayName", () => {
      const result = registerInputParser.parse({
        email: "test@example.com",
        password: "password123",
      });
      assert.equal(result.displayName, undefined);
    });

    it("rejects password shorter than 8 characters", () => {
      assert.throws(() =>
        registerInputParser.parse({
          email: "test@example.com",
          password: "short",
        }),
      );
    });
  });

  describe("forgotPasswordInputParser", () => {
    it("accepts valid email", () => {
      const result = forgotPasswordInputParser.parse({
        email: "test@example.com",
      });
      assert.equal(result.email, "test@example.com");
    });

    it("rejects invalid email", () => {
      assert.throws(() =>
        forgotPasswordInputParser.parse({
          email: "invalid",
        }),
      );
    });
  });

  describe("resetPasswordInputParser", () => {
    it("accepts valid reset input", () => {
      const result = resetPasswordInputParser.parse({
        token: "abc123",
        password: "newpassword",
      });
      assert.equal(result.token, "abc123");
      assert.equal(result.password, "newpassword");
    });

    it("rejects empty token", () => {
      assert.throws(() =>
        resetPasswordInputParser.parse({
          token: "",
          password: "newpassword",
        }),
      );
    });

    it("rejects password shorter than 8 characters", () => {
      assert.throws(() =>
        resetPasswordInputParser.parse({
          token: "abc123",
          password: "short",
        }),
      );
    });
  });

  describe("tokenLifetimes", () => {
    it("defines session lifetime", () => {
      assert.ok(tokenLifetimes.session > 0);
      // 30 days in ms
      assert.equal(tokenLifetimes.session, 30 * 24 * 60 * 60 * 1000);
    });

    it("defines password_reset lifetime", () => {
      assert.ok(tokenLifetimes.password_reset > 0);
      // 1 hour in ms
      assert.equal(tokenLifetimes.password_reset, 1 * 60 * 60 * 1000);
    });

    it("defines email_verify lifetime", () => {
      assert.ok(tokenLifetimes.email_verify > 0);
      // 24 hours in ms
      assert.equal(tokenLifetimes.email_verify, 24 * 60 * 60 * 1000);
    });
  });

  describe("Session", () => {
    it("wraps authenticated session", () => {
      const user = AuthenticatedUser.create("test@example.com", {
        id: "user-1",
      });
      const sessionData = {
        tokenId: "token-1",
        userId: "user-1",
        tokenType: "session" as const,
        authMethod: "cookie" as const,
        createdAt: new Date(),
        expiresAt: new Date(),
      };
      const session = new Session(sessionData, user);

      assert.equal(session.isAuthenticated(), true);
      assert.equal(session.getUser(), user);
      assert.equal(session.getData(), sessionData);
      assert.equal(session.tokenId, "token-1");
      assert.equal(session.userId, "user-1");
    });

    it("wraps anonymous session", () => {
      const session = new Session(null, anonymousUser);

      assert.equal(session.isAuthenticated(), false);
      assert.equal(session.getUser(), anonymousUser);
      assert.equal(session.getData(), null);
      assert.equal(session.tokenId, undefined);
      assert.equal(session.userId, undefined);
    });
  });
});
24
backend/diachron/basic/login.spec.ts
Normal file
@@ -0,0 +1,24 @@
// Tests for basic/login.ts
// These tests verify the route structure and export

import assert from "node:assert/strict";
import { describe, it } from "node:test";
import { loginRoute } from "./login";

describe("basic/login", () => {
  describe("loginRoute", () => {
    it("has correct path", () => {
      assert.equal(loginRoute.path, "/login");
    });

    it("handles GET and POST methods", () => {
      assert.ok(loginRoute.methods.includes("GET"));
      assert.ok(loginRoute.methods.includes("POST"));
      assert.equal(loginRoute.methods.length, 2);
    });

    it("has a handler function", () => {
      assert.equal(typeof loginRoute.handler, "function");
    });
  });
});
24
backend/diachron/basic/logout.spec.ts
Normal file
@@ -0,0 +1,24 @@
// Tests for basic/logout.ts
// These tests verify the route structure and export

import assert from "node:assert/strict";
import { describe, it } from "node:test";
import { logoutRoute } from "./logout";

describe("basic/logout", () => {
  describe("logoutRoute", () => {
    it("has correct path", () => {
      assert.equal(logoutRoute.path, "/logout");
    });

    it("handles GET and POST methods", () => {
      assert.ok(logoutRoute.methods.includes("GET"));
      assert.ok(logoutRoute.methods.includes("POST"));
      assert.equal(logoutRoute.methods.length, 2);
    });

    it("has a handler function", () => {
      assert.equal(typeof logoutRoute.handler, "function");
    });
  });
});
||||
73
backend/diachron/basic/routes.spec.ts
Normal file
73
backend/diachron/basic/routes.spec.ts
Normal file
@@ -0,0 +1,73 @@
|
||||
// Tests for basic/routes.ts
|
||||
// These tests verify the route structure and exports
|
||||
|
||||
import assert from "node:assert/strict";
|
||||
import { describe, it } from "node:test";
|
||||
import { routes } from "./routes";
|
||||
|
||||
describe("basic/routes", () => {
|
||||
describe("routes object", () => {
|
||||
it("exports routes as an object", () => {
|
||||
assert.equal(typeof routes, "object");
|
||||
});
|
||||
|
||||
it("contains hello route", () => {
|
||||
assert.ok("hello" in routes);
|
||||
assert.equal(routes.hello.path, "/hello");
|
||||
assert.ok(routes.hello.methods.includes("GET"));
|
||||
});
|
||||
|
||||
it("contains home route", () => {
|
||||
assert.ok("home" in routes);
|
||||
assert.equal(routes.home.path, "/");
|
||||
assert.ok(routes.home.methods.includes("GET"));
|
||||
});
|
||||
|
||||
it("contains login route", () => {
|
||||
assert.ok("login" in routes);
|
||||
assert.equal(routes.login.path, "/login");
|
||||
});
|
||||
|
||||
it("contains logout route", () => {
|
||||
assert.ok("logout" in routes);
|
||||
assert.equal(routes.logout.path, "/logout");
|
||||
});
|
||||
|
||||
it("all routes have handlers", () => {
|
||||
for (const [name, route] of Object.entries(routes)) {
|
||||
assert.equal(
|
||||
typeof route.handler,
|
||||
"function",
|
||||
`Route ${name} should have a handler function`,
|
||||
);
|
||||
}
|
||||
});
|
||||
|
||||
it("all routes have methods array", () => {
|
||||
for (const [name, route] of Object.entries(routes)) {
|
||||
assert.ok(
|
||||
Array.isArray(route.methods),
|
||||
`Route ${name} should have methods array`,
|
||||
);
|
||||
assert.ok(
|
||||
route.methods.length > 0,
|
||||
`Route ${name} should have at least one method`,
|
||||
);
|
||||
}
|
||||
});
|
||||
|
||||
it("all routes have path string", () => {
|
||||
for (const [name, route] of Object.entries(routes)) {
|
||||
assert.equal(
|
||||
typeof route.path,
|
||||
"string",
|
||||
`Route ${name} should have a path string`,
|
||||
);
|
||||
assert.ok(
|
||||
route.path.startsWith("/"),
|
||||
`Route ${name} path should start with /`,
|
||||
);
|
||||
}
|
||||
});
|
||||
});
|
||||
});
|
||||
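The invariants these tests check (a `/`-prefixed `path` string, a non-empty `methods` array, a `handler` function) imply a `Route` shape along the following lines. The type definitions and sample route here are hypothetical illustrations, not the project's actual `../types` exports:

```typescript
// Hypothetical minimal shapes matching what the route tests assert.
type Call = { path: string };
type Result = { status: number; body: string };
type Route = {
  path: string;          // must start with "/"
  methods: string[];     // at least one HTTP method
  handler: (call: Call) => Promise<Result>;
};

// A sample route satisfying every invariant checked above.
const helloRoute: Route = {
  path: "/hello",
  methods: ["GET"],
  handler: async (_call) => ({ status: 200, body: "hello" }),
};
```

Keeping routes as plain data (path + methods + handler) is what makes these structural tests possible without spinning up an HTTP server.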
@@ -1,4 +1,5 @@
 import { DateTime } from "ts-luxon";
+import { get, User } from "../hydrators/user";
 import { request } from "../request";
 import { html, render } from "../request/util";
 import type { Call, Result, Route } from "../types";
@@ -9,9 +10,16 @@ const routes: Record<string, Route> = {
   hello: {
     path: "/hello",
     methods: ["GET"],
-    handler: async (_call: Call): Promise<Result> => {
+    handler: async (call: Call): Promise<Result> => {
       const now = DateTime.now();
-      const c = await render("basic/hello", { now });
+      const args: { now: DateTime; greeting?: string } = { now };
+
+      if (call.path !== '/hello') {
+        const greeting = call.path.replaceAll('/', '').replaceAll('-', ' ');
+        args.greeting = greeting;
+      }
+
+      const c = await render("basic/hello", args);

       return html(c);
     },
@@ -23,11 +31,18 @@ const routes: Record<string, Route> = {
       const _auth = request.auth;
       const me = request.session.getUser();

-      const email = me.toString();
+      const id = me.id;
+      console.log(`*** id: ${id}`);
+
+      const u = await get(id);
+
+      const email = u?.email || "anonymous@example.com";
+      const name = u?.display_name || "anonymous";
       const showLogin = me.isAnonymous();
       const showLogout = !me.isAnonymous();

       const c = await render("basic/home", {
         name,
         email,
         showLogin,
         showLogout,
@@ -1,5 +1,6 @@
 import nunjucks from "nunjucks";
 import { db, migrate, migrationStatus } from "../database";
 import { formatError, formatErrorHtml } from "../errors";
+import { getLogs, log } from "../logging";

 // FIXME: This doesn't belong here; move it somewhere else.
@@ -40,6 +41,7 @@ const misc = {
 const core = {
   conf,
   database,
   errors: { formatError, formatErrorHtml },
+  logging,
   misc,
   random,
285
backend/diachron/database.spec.ts
Normal file
@@ -0,0 +1,285 @@
// Tests for database.ts
// Requires test PostgreSQL: docker compose -f docker-compose.test.yml up -d

import assert from "node:assert/strict";
import { after, before, beforeEach, describe, it } from "node:test";
import {
  connectionConfig,
  db,
  migrate,
  migrationStatus,
  PostgresAuthStore,
  pool,
  raw,
  rawPool,
} from "./database";
import type { UserId } from "./user";

describe("database", () => {
  before(async () => {
    // Run migrations to set up schema
    await migrate();
  });

  after(async () => {
    await pool.end();
  });

  describe("connectionConfig", () => {
    it("has required fields", () => {
      assert.ok("host" in connectionConfig);
      assert.ok("port" in connectionConfig);
      assert.ok("user" in connectionConfig);
      assert.ok("password" in connectionConfig);
      assert.ok("database" in connectionConfig);
    });

    it("port is a number", () => {
      assert.equal(typeof connectionConfig.port, "number");
    });
  });

  describe("raw", () => {
    it("executes raw SQL queries", async () => {
      const result = await raw<{ one: number }>("SELECT 1 as one");
      assert.equal(result.length, 1);
      assert.equal(result[0].one, 1);
    });

    it("supports parameterized queries", async () => {
      const result = await raw<{ sum: number }>(
        "SELECT $1::int + $2::int as sum",
        [2, 3],
      );
      assert.equal(result[0].sum, 5);
    });
  });

  describe("db (Kysely instance)", () => {
    it("can execute SELECT queries", async () => {
      const result = await db
        .selectFrom("users")
        .select("id")
        .limit(1)
        .execute();

      // May be empty, just verify it runs
      assert.ok(Array.isArray(result));
    });
  });

  describe("rawPool", () => {
    it("is a pg Pool instance", () => {
      assert.ok(rawPool.query !== undefined);
    });

    it("can execute queries", async () => {
      const result = await rawPool.query("SELECT 1 as one");
      assert.equal(result.rows[0].one, 1);
    });
  });

  describe("migrate", () => {
    it("runs without error when migrations are up to date", async () => {
      // Should not throw
      await migrate();
    });
  });

  describe("migrationStatus", () => {
    it("returns applied and pending arrays", async () => {
      const status = await migrationStatus();

      assert.ok(Array.isArray(status.applied));
      assert.ok(Array.isArray(status.pending));
    });

    it("shows framework migrations as applied", async () => {
      const status = await migrationStatus();

      // At least the users migration should be applied
      const hasUsersMigration = status.applied.some((m) =>
        m.includes("users"),
      );
      assert.ok(hasUsersMigration);
    });
  });

  describe("PostgresAuthStore", () => {
    let store: PostgresAuthStore;

    before(() => {
      store = new PostgresAuthStore();
    });

    beforeEach(async () => {
      // Clean up test data before each test
      await rawPool.query("DELETE FROM sessions");
      await rawPool.query("DELETE FROM user_credentials");
      await rawPool.query("DELETE FROM user_emails");
      await rawPool.query("DELETE FROM users");
    });

    describe("createUser", () => {
      it("creates a user with pending status", async () => {
        const user = await store.createUser({
          email: "test@example.com",
          passwordHash: "hash123",
          displayName: "Test User",
        });

        assert.equal(user.email, "test@example.com");
        assert.equal(user.displayName, "Test User");
        assert.equal(user.status, "pending");
      });

      it("stores the password hash", async () => {
        const user = await store.createUser({
          email: "test@example.com",
          passwordHash: "secrethash",
        });

        const hash = await store.getUserPasswordHash(user.id);
        assert.equal(hash, "secrethash");
      });
    });

    describe("getUserByEmail", () => {
      it("returns user when found", async () => {
        await store.createUser({
          email: "find@example.com",
          passwordHash: "hash",
        });

        const user = await store.getUserByEmail("find@example.com");
        assert.notEqual(user, null);
        assert.equal(user!.email, "find@example.com");
      });

      it("is case-insensitive", async () => {
        await store.createUser({
          email: "UPPER@EXAMPLE.COM",
          passwordHash: "hash",
        });

        const user = await store.getUserByEmail("upper@example.com");
        assert.notEqual(user, null);
      });

      it("returns null when not found", async () => {
        const user = await store.getUserByEmail("notfound@example.com");
        assert.equal(user, null);
      });
    });

    describe("getUserById", () => {
      it("returns user when found", async () => {
        const created = await store.createUser({
          email: "test@example.com",
          passwordHash: "hash",
        });

        const user = await store.getUserById(created.id);
        assert.notEqual(user, null);
        assert.equal(user!.id, created.id);
      });

      it("returns null when not found", async () => {
        const user = await store.getUserById(
          "00000000-0000-0000-0000-000000000000" as UserId,
        );
        assert.equal(user, null);
      });
    });

    describe("setUserPassword", () => {
      it("updates the password hash", async () => {
        const user = await store.createUser({
          email: "test@example.com",
          passwordHash: "oldhash",
        });

        await store.setUserPassword(user.id, "newhash");

        const hash = await store.getUserPasswordHash(user.id);
        assert.equal(hash, "newhash");
      });
    });

    describe("updateUserEmailVerified", () => {
      it("sets user status to active", async () => {
        const created = await store.createUser({
          email: "test@example.com",
          passwordHash: "hash",
        });
        assert.equal(created.status, "pending");

        await store.updateUserEmailVerified(created.id);

        const user = await store.getUserById(created.id);
        assert.equal(user!.status, "active");
      });
    });

    describe("session operations", () => {
      let userId: UserId;

      beforeEach(async () => {
        const user = await store.createUser({
          email: "session@example.com",
          passwordHash: "hash",
        });
        userId = user.id;
      });

      it("creates and retrieves sessions", async () => {
        const { token, session } = await store.createSession({
          userId,
          tokenType: "session",
          authMethod: "cookie",
          expiresAt: new Date(Date.now() + 3600000),
        });

        assert.ok(token.length > 0);
        assert.equal(session.userId, userId);
        assert.equal(session.tokenType, "session");
      });

      it("deletes sessions", async () => {
        const { session } = await store.createSession({
          userId,
          tokenType: "session",
          authMethod: "cookie",
          expiresAt: new Date(Date.now() + 3600000),
        });

        await store.deleteSession(session.tokenId as any);

        // Session should be soft-deleted (revoked)
        const retrieved = await store.getSession(
          session.tokenId as any,
        );
        assert.equal(retrieved, null);
      });

      it("deletes all user sessions", async () => {
        await store.createSession({
          userId,
          tokenType: "session",
          authMethod: "cookie",
          expiresAt: new Date(Date.now() + 3600000),
        });

        await store.createSession({
          userId,
          tokenType: "session",
          authMethod: "bearer",
          expiresAt: new Date(Date.now() + 3600000),
        });

        const count = await store.deleteUserSessions(userId);
        assert.equal(count, 2);
      });
    });
  });
});
@@ -21,13 +21,13 @@ import type { SessionData, TokenId } from "./auth/types";
import type { Domain } from "./types";
import { AuthenticatedUser, type User, type UserId } from "./user";

// Connection configuration
// Connection configuration (supports environment variable overrides)
const connectionConfig = {
  host: "localhost",
  port: 5432,
  user: "diachron",
  password: "diachron",
  database: "diachron",
  host: process.env.DB_HOST ?? "localhost",
  port: Number(process.env.DB_PORT ?? 5432),
  user: process.env.DB_USER ?? "diachron",
  password: process.env.DB_PASSWORD ?? "diachron",
  database: process.env.DB_NAME ?? "diachron",
};

// Database schema types for Kysely
@@ -113,7 +113,7 @@ async function raw<T = unknown>(
//
// Migrations directory: express/migrations/

const FRAMEWORK_MIGRATIONS_DIR = path.join(__dirname, "framework/migrations");
const FRAMEWORK_MIGRATIONS_DIR = path.join(__dirname, "diachron/migrations");
const APP_MIGRATIONS_DIR = path.join(__dirname, "migrations");
const MIGRATIONS_TABLE = "_migrations";

229
backend/diachron/errors.ts
Normal file
@@ -0,0 +1,229 @@
// ANSI escape codes
const bold = "\x1b[1m";
const red = "\x1b[31m";
const cyan = "\x1b[36m";
const dim = "\x1b[2m";
const reset = "\x1b[0m";

interface ParsedFrame {
  raw: string;
  fn: string;
  file: string;
  line: string;
  col: string;
  isApp: boolean;
}

const frameRe = /^\s*at\s+(?:(.+?)\s+)?\(?((?:\/|[a-zA-Z]:\\).+?):(\d+):(\d+)\)?$/;

function parseFrame(line: string): ParsedFrame | null {
  const m = line.match(frameRe);
  if (!m) return null;

  const fn = m[1] ?? "<anonymous>";
  const file = m[2];
  const lineNum = m[3];
  const col = m[4];

  const isApp =
    !file.includes("node_modules") && !file.startsWith("node:");

  return { raw: line, fn, file, line: lineNum, col, isApp };
}

function relativePath(absPath: string): string {
  const marker = "backend/";
  const idx = absPath.lastIndexOf(marker);
  if (idx !== -1) return absPath.slice(idx);
  return absPath;
}

function libraryName(file: string): string {
  const nmIdx = file.indexOf("node_modules/");
  if (nmIdx === -1) return "node";
  const after = file.slice(nmIdx + "node_modules/".length);
  // Handle scoped packages like @scope/pkg
  if (after.startsWith("@")) {
    const parts = after.split("/");
    return `${parts[0]}/${parts[1]}`;
  }
  return after.split("/")[0];
}

interface ParsedError {
  message: string;
  frames: ParsedFrame[];
}

function parseError(error: unknown): ParsedError {
  if (!(error instanceof Error)) {
    return { message: String(error), frames: [] };
  }

  const message = error.message ?? String(error);
  const stack = error.stack ?? "";

  const lines = stack.split("\n");
  const frameLines: string[] = [];
  for (const line of lines) {
    if (line.trimStart().startsWith("at ")) {
      frameLines.push(line);
    }
  }

  const frames = frameLines
    .map(parseFrame)
    .filter((f): f is ParsedFrame => f !== null);

  return { message, frames };
}

// Group consecutive library frames into collapsed runs
type FrameGroup =
  | { kind: "app"; frame: ParsedFrame }
  | { kind: "lib"; count: number; names: string[] };

function groupFrames(frames: ParsedFrame[]): FrameGroup[] {
  const groups: FrameGroup[] = [];
  let i = 0;
  while (i < frames.length) {
    if (frames[i].isApp) {
      groups.push({ kind: "app", frame: frames[i] });
      i++;
    } else {
      const libNames = new Set<string>();
      let count = 0;
      while (i < frames.length && !frames[i].isApp) {
        libNames.add(libraryName(frames[i].file));
        count++;
        i++;
      }
      groups.push({ kind: "lib", count, names: [...libNames] });
    }
  }
  return groups;
}

function libSummary(count: number, names: string[]): string {
  const s = count === 1 ? "" : "s";
  return `... ${count} internal frame${s} (${names.join(", ")})`;
}

// --- Console formatting (ANSI) ---

function formatError(error: unknown): string {
  const { message, frames } = parseError(error);
  if (frames.length === 0) {
    return `${bold}${red}ERROR${reset} ${message}`;
  }

  const parts: string[] = [];
  parts.push(`${bold}${red}ERROR${reset} ${message}`);
  parts.push("");

  for (const group of groupFrames(frames)) {
    if (group.kind === "app") {
      const rel = relativePath(group.frame.file);
      const loc = `${rel}:${group.frame.line}`;
      parts.push(
        `  ${bold}${cyan}${loc.padEnd(24)}${reset}at ${group.frame.fn}`,
      );
    } else {
      parts.push(
        `  ${dim}${libSummary(group.count, group.names)}${reset}`,
      );
    }
  }

  return parts.join("\n");
}

// --- HTML formatting (browser) ---

function esc(s: string): string {
  return s
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;");
}

function formatErrorHtml(error: unknown): string {
  const { message, frames } = parseError(error);
  const groups = groupFrames(frames);

  let frameRows = "";
  for (const group of groups) {
    if (group.kind === "app") {
      const rel = relativePath(group.frame.file);
      const loc = `${rel}:${group.frame.line}`;
      frameRows += `<tr class="app">
        <td class="loc">${esc(loc)}</td>
        <td class="fn">at ${esc(group.frame.fn)}</td>
      </tr>\n`;
    } else {
      frameRows += `<tr class="lib">
        <td colspan="2">${esc(libSummary(group.count, group.names))}</td>
      </tr>\n`;
    }
  }

  return `<!doctype html>
<html>
<head>
<meta charset="utf-8">
<title>Error</title>
<style>
* { margin: 0; padding: 0; box-sizing: border-box; }
body {
  font-family: "SF Mono", "Menlo", "Consolas", monospace;
  font-size: 14px;
  background: #1a1a2e;
  color: #e0e0e0;
  padding: 40px;
}
.error-label {
  display: inline-block;
  background: #e74c3c;
  color: #fff;
  font-weight: 700;
  font-size: 12px;
  padding: 2px 8px;
  border-radius: 3px;
  letter-spacing: 0.5px;
}
.message {
  margin-top: 12px;
  font-size: 18px;
  font-weight: 600;
  color: #f8f8f2;
  line-height: 1.4;
}
table {
  margin-top: 24px;
  border-collapse: collapse;
}
tr.app td { padding: 4px 0; }
tr.app .loc {
  color: #56d4dd;
  font-weight: 600;
  padding-right: 24px;
  white-space: nowrap;
}
tr.app .fn { color: #ccc; }
tr.lib td {
  color: #666;
  padding: 4px 0;
  font-style: italic;
}
</style>
</head>
<body>
<span class="error-label">ERROR</span>
<div class="message">${esc(message)}</div>
<table>${frameRows}</table>
</body>
</html>`;
}

export { formatError, formatErrorHtml };
29
backend/diachron/hydrators/hydrator.ts
Normal file
@@ -0,0 +1,29 @@
import { Kysely, PostgresDialect } from "kysely";
import { Pool } from "pg";
import type { DB } from "../../generated/db";
import { connectionConfig } from "../database";

const db = new Kysely<DB>({
  dialect: new PostgresDialect({
    pool: new Pool(connectionConfig),
  }),
  log(event) {
    if (event.level === "query") {
      // FIXME: Wire this up to the logging system
      console.log("SQL:", event.query.sql);
      console.log("Params:", event.query.parameters);
    }
  },
});

abstract class Hydrator<T> {
  public db: Kysely<DB>;

  protected abstract table: string;

  constructor() {
    this.db = db;
  }
}

export { Hydrator, db };
1
backend/diachron/hydrators/index.ts
Normal file
@@ -0,0 +1 @@
export type Hydrators = {};
44
backend/diachron/hydrators/tests/setup.ts
Normal file
@@ -0,0 +1,44 @@
// Test setup for hydrator tests
// Run: DB_PORT=5433 DB_USER=diachron_test DB_PASSWORD=diachron_test DB_NAME=diachron_test npx tsx --test tests/*.test.ts

import { Pool } from "pg";
import { connectionConfig, migrate } from "../../database";

const pool = new Pool(connectionConfig);

export async function setupTestDatabase(): Promise<void> {
  await migrate();
}

export async function cleanupTables(): Promise<void> {
  // Clean in reverse dependency order
  await pool.query("DELETE FROM user_emails");
  await pool.query("DELETE FROM users");
}

export async function teardownTestDatabase(): Promise<void> {
  await pool.end();
}

export async function insertTestUser(data: {
  id: string;
  displayName: string;
  status: string;
  email: string;
}): Promise<void> {
  const emailId = crypto.randomUUID();
  const normalizedEmail = data.email.toLowerCase().trim();

  await pool.query(
    `INSERT INTO users (id, display_name, status) VALUES ($1, $2, $3)`,
    [data.id, data.displayName, data.status],
  );

  await pool.query(
    `INSERT INTO user_emails (id, user_id, email, normalized_email, is_primary)
     VALUES ($1, $2, $3, $4, true)`,
    [emailId, data.id, data.email, normalizedEmail],
  );
}

export { pool };
98
backend/diachron/hydrators/tests/user.spec.ts
Normal file
@@ -0,0 +1,98 @@
// Tests for user hydrator
// Run with: cd express && DB_PORT=5433 DB_USER=diachron_test DB_PASSWORD=diachron_test DB_NAME=diachron_test ../cmd npx tsx --test diachron/hydrators/tests/user.test.ts

import assert from "node:assert/strict";
import { after, before, beforeEach, describe, it } from "node:test";
import { get } from "../user";
import {
  cleanupTables,
  insertTestUser,
  setupTestDatabase,
  teardownTestDatabase,
} from "./setup";

describe("user hydrator", () => {
  before(async () => {
    await setupTestDatabase();
  });

  after(async () => {
    await teardownTestDatabase();
  });

  beforeEach(async () => {
    await cleanupTables();
  });

  describe("get", () => {
    it("returns null for non-existent user", async () => {
      const result = await get("00000000-0000-0000-0000-000000000000");
      assert.equal(result, null);
    });

    it("returns user when found", async () => {
      const userId = "cfae0a19-6515-4813-bc2d-1e032b72b203";
      await insertTestUser({
        id: userId,
        displayName: "Test User",
        status: "active",
        email: "test@example.com",
      });

      const result = await get(userId);

      assert.notEqual(result, null);
      assert.equal(result!.id, userId);
      assert.equal(result!.display_name, "Test User");
      assert.equal(result!.status, "active");
      assert.equal(result!.email, "test@example.com");
    });

    it("validates user data with zod parser", async () => {
      const userId = crypto.randomUUID();
      await insertTestUser({
        id: userId,
        displayName: "Valid User",
        status: "active",
        email: "valid@example.com",
      });

      const result = await get(userId);

      // If we get here without throwing, parsing succeeded
      assert.notEqual(result, null);
      assert.equal(typeof result!.id, "string");
      assert.equal(typeof result!.email, "string");
    });

    it("returns user with pending status", async () => {
      const userId = crypto.randomUUID();
      await insertTestUser({
        id: userId,
        displayName: "Pending User",
        status: "pending",
        email: "pending@example.com",
      });

      const result = await get(userId);

      assert.notEqual(result, null);
      assert.equal(result!.status, "pending");
    });

    it("returns user with suspended status", async () => {
      const userId = crypto.randomUUID();
      await insertTestUser({
        id: userId,
        displayName: "Suspended User",
        status: "suspended",
        email: "suspended@example.com",
      });

      const result = await get(userId);

      assert.notEqual(result, null);
      assert.equal(result!.status, "suspended");
    });
  });
});
59
backend/diachron/hydrators/user.ts
Normal file
@@ -0,0 +1,59 @@
import {
  ColumnType,
  Generated,
  Insertable,
  JSONColumnType,
  Selectable,
  Updateable,
} from "kysely";
import type { TypeID } from "typeid-js";
import { z } from "zod";
import { db, Hydrator } from "./hydrator";

const parser = z.object({
  // id: z.uuidv7(),
  id: z.uuid(),
  display_name: z.string(),
  // FIXME: status is duplicated elsewhere
  status: z.union([
    z.literal("active"),
    z.literal("suspended"),
    z.literal("pending"),
  ]),
  email: z.email(),
});

const tp = parser.parse({
  id: "cfae0a19-6515-4813-bc2d-1e032b72b203",
  display_name: "foo",
  status: "active",
  email: "mw@philologue.net",
});

export type User = z.infer<typeof parser>;

const get = async (id: string): Promise<null | User> => {
  const ret = await db
    .selectFrom("users")
    .where("users.id", "=", id)
    .innerJoin("user_emails", "user_emails.user_id", "users.id")
    .select([
      "users.id",
      "users.status",
      "users.display_name",
      "user_emails.email",
    ])
    .executeTakeFirst();

  if (ret === undefined) {
    return null;
  }

  console.dir(ret);

  const parsed = parser.parse(ret);

  return parsed;
};

export { get };
53
backend/diachron/logging.spec.ts
Normal file
@@ -0,0 +1,53 @@
// Tests for logging.ts
// Note: These tests verify the module structure and types.
// Full integration tests would require a running logging service.

import assert from "node:assert/strict";
import { describe, it } from "node:test";

// We can't easily test log() and getLogs() without mocking fetch,
// but we can verify the module exports correctly and types work.

describe("logging", () => {
  describe("module structure", () => {
    it("exports log function", async () => {
      const { log } = await import("./logging");
      assert.equal(typeof log, "function");
    });

    it("exports getLogs function", async () => {
      const { getLogs } = await import("./logging");
      assert.equal(typeof getLogs, "function");
    });
  });

  describe("Message type", () => {
    // Type-level tests - if these compile, the types are correct
    it("accepts valid message sources", () => {
      type MessageSource = "logging" | "diagnostic" | "user";
      const sources: MessageSource[] = ["logging", "diagnostic", "user"];
      assert.equal(sources.length, 3);
    });
  });

  describe("FilterArgument type", () => {
    // Type-level tests
    it("accepts valid filter options", () => {
      type FilterArgument = {
        limit?: number;
        before?: number;
        after?: number;
        match?: (string | RegExp)[];
      };

      const filter: FilterArgument = {
        limit: 10,
        before: Date.now(),
        after: Date.now() - 3600000,
        match: ["error", /warning/i],
      };

      assert.ok(filter.limit === 10);
    });
  });
});
@@ -1,11 +1,13 @@
{
  "name": "express",
  "name": "diachron",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1",
    "nodemon": "nodemon dist/index.js"
    "test": "DB_PORT=5433 DB_USER=diachron_test DB_PASSWORD=diachron_test DB_NAME=diachron_test tsx --test '**/*.{test,spec}.ts'",
    "test:watch": "DB_PORT=5433 DB_USER=diachron_test DB_PASSWORD=diachron_test DB_NAME=diachron_test tsx --test --watch '**/*.{test,spec}.ts'",
    "nodemon": "nodemon dist/index.js",
    "kysely-codegen": "kysely-codegen"
  },
  "keywords": [],
  "author": "",
@@ -24,12 +26,14 @@
    "ts-luxon": "^6.2.0",
    "ts-node": "^10.9.2",
    "tsx": "^4.20.6",
    "typeid-js": "^1.2.0",
    "typescript": "^5.9.3",
    "zod": "^4.1.12"
  },
  "devDependencies": {
    "@biomejs/biome": "2.3.10",
    "@types/express": "^5.0.5",
    "@types/pg": "^8.16.0"
    "@types/pg": "^8.16.0",
    "kysely-codegen": "^0.19.0"
  }
}
434
express/pnpm-lock.yaml → backend/diachron/pnpm-lock.yaml
generated
@@ -44,6 +44,9 @@ importers:
      tsx:
        specifier: ^4.20.6
        version: 4.20.6
      typeid-js:
        specifier: ^1.2.0
        version: 1.2.0
      typescript:
        specifier: ^5.9.3
        version: 5.9.3
@@ -60,9 +63,20 @@ importers:
      '@types/pg':
        specifier: ^8.16.0
        version: 8.16.0
      kysely-codegen:
        specifier: ^0.19.0
        version: 0.19.0(kysely@0.28.9)(pg@8.16.3)(typescript@5.9.3)

packages:

  '@babel/code-frame@7.28.6':
    resolution: {integrity: sha512-JYgintcMjRiCvS8mMECzaEn+m3PfoQiyqukOMCCVQtoJGYJw8j/8LBJEiqkHLkfwCcs74E3pbAUFNg7d9VNJ+Q==}
    engines: {node: '>=6.9.0'}

  '@babel/helper-validator-identifier@7.28.5':
    resolution: {integrity: sha512-qSs4ifwzKJSV39ucNjsvc6WVHs6b7S03sOh2OcHF9UHfVPqWWALUsNUVzhSBiItjRZoLHx7nIarVjqKVusUZ1Q==}
    engines: {node: '>=6.9.0'}

  '@biomejs/biome@2.3.10':
    resolution: {integrity: sha512-/uWSUd1MHX2fjqNLHNL6zLYWBbrJeG412/8H7ESuK8ewoRoMPUgHDebqKrPTx/5n6f17Xzqc9hdg3MEqA5hXnQ==}
    engines: {node: '>=14.21.3'}
@@ -360,6 +374,14 @@ packages:
    engines: {node: '>=0.4.0'}
    hasBin: true

  ansi-styles@3.2.1:
    resolution: {integrity: sha512-VT0ZI6kZRdTh8YyJw3SMbYm/u+NqfsAxEpWO0Pf9sq8/e94WxxOpPKx9FR1FlyCtOVDNOQ+8ntlqFxiRc+r5qA==}
    engines: {node: '>=4'}

  ansi-styles@4.3.0:
    resolution: {integrity: sha512-zbB9rCJAT1rbjiVDb2hqKFHNYLxgtk8NURxZ3IZwD3F6NtxbXZQCnnSi1Lkx+IDohdPlFp222wVALIheZJQSEg==}
    engines: {node: '>=8'}

  anymatch@3.1.3:
    resolution: {integrity: sha512-KMReFUr0B4t+D+OBkjR3KYqvocp2XaSzO55UcB6mgQMd3KbcE+mWTyvVV7D/zsdEbNnV6acZUutkiHQXvTr1Rw==}
    engines: {node: '>= 8'}
@@ -367,6 +389,9 @@ packages:
  arg@4.1.3:
    resolution: {integrity: sha512-58S9QDqG0Xx27YwPSt9fJxivjYl432YCwfDMfZ+71RAqUrZef7LrKQZ3LHLOwCS4FLNBplP533Zx895SeOCHvA==}

  argparse@2.0.1:
    resolution: {integrity: sha512-8+9WqebbFzpX9OR+Wa6O29asIogeRMzcGtAINdpMHHyAg10f05aSFVBbcEqGf/PXw1EjAZ+q2/bEBg3DvurK3Q==}

  asap@2.0.6:
    resolution: {integrity: sha512-BSHWgDSAiKs50o2Re8ppvp3seVHXSRM44cdSsT9FfNEUUZLOGWVCsiWaRPWM1Znn+mqZ1OfVZ3z3DWEzSp7hRA==}

@@ -400,10 +425,35 @@ packages:
    resolution: {integrity: sha512-+ys997U96po4Kx/ABpBCqhA9EuxJaQWDQg7295H4hBphv3IZg0boBKuwYpt4YXp6MZ5AmZQnU/tyMTlRpaSejg==}
    engines: {node: '>= 0.4'}

  callsites@3.1.0:
    resolution: {integrity: sha512-P8BjAsXvZS+VIDUI11hHCQEv74YT67YUi5JJFNWIqL235sBmjX4+qx9Muvls5ivyNENctx46xQLQ3aTuE7ssaQ==}
    engines: {node: '>=6'}

  chalk@2.4.2:
    resolution: {integrity: sha512-Mti+f9lpJNcwF4tWV8/OrTTtF1gZi+f8FqlyAdouralcFWFQWF2+NgCHShjkCb+IFBLq9buZwE1xckQU4peSuQ==}
    engines: {node: '>=4'}

  chalk@4.1.2:
    resolution: {integrity: sha512-oKnbhFyRIXpUuez8iBMmyEa4nbj4IOQyuhc/wy9kY7/WVPcwIO9VA668Pu8RkO7+0G76SLROeyw9CpQ061i4mA==}
    engines: {node: '>=10'}

  chokidar@3.6.0:
    resolution: {integrity: sha512-7VT13fmjotKpGipCW9JEQAusEPE+Ei8nl6/g4FBAmIm0GOOLMua9NDDo/DWp0ZAxCr3cPq5ZpBqmPAQgDda2Pw==}
    engines: {node: '>= 8.10.0'}

  color-convert@1.9.3:
    resolution: {integrity: sha512-QfAUtd+vFdAtFQcC8CCyYt1fYWxSqAiK2cSD6zDB8N3cpsEBAvRxp9zOGg6G/SHHJYAT88/az/IuDGALsNVbGg==}

  color-convert@2.0.1:
    resolution: {integrity: sha512-RRECPsj7iu/xb5oKYcsFHSppFNnsj/52OVTRKb4zP5onXwVF3zVmmToNcOfGC+CRDpfK/U584fMg38ZHCaElKQ==}
    engines: {node: '>=7.0.0'}

  color-name@1.1.3:
    resolution: {integrity: sha512-72fSenhMw2HZMTVHeCA9KCmpEIbzWiQsjN+BHcBbS9vr1mtt+vJjPdksIBNUmKAW8TFUDPJK5SUU3QhE9NEXDw==}

  color-name@1.1.4:
    resolution: {integrity: sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA==}

  commander@5.1.0:
    resolution: {integrity: sha512-P0CysNDQ7rtVw4QIQtm+MRxV66vKFSvlsQvGYXZWR3qFU0jlMKHZZZgw8e+8DSah4UDKMqnknRDQz+xuQXQ/Zg==}
    engines: {node: '>= 6'}
@@ -427,6 +477,15 @@ packages:
    resolution: {integrity: sha512-yki5XnKuf750l50uGTllt6kKILY4nQ1eNIQatoXEByZ5dWgnKqbnqmTrBE5B4N7lrMJKQ2ytWMiTO2o0v6Ew/w==}
    engines: {node: '>= 0.6'}

  cosmiconfig@9.0.0:
    resolution: {integrity: sha512-itvL5h8RETACmOTFc4UfIyB2RfEHi71Ax6E/PivVxq9NseKbOWpeyHEOIbmAw1rs8Ak0VursQNww7lf7YtUwzg==}
    engines: {node: '>=14'}
    peerDependencies:
      typescript: '>=4.9.5'
    peerDependenciesMeta:
      typescript:
        optional: true

  create-require@1.1.1:
    resolution: {integrity: sha512-dcKFX3jn0MpIaXjisoRvexIJVEKzaq7z2rZKxf+MSr9TkdmHmsU4m2lcLojrj/FHl8mk5VxMmYA+ftRkP/3oKQ==}

@@ -443,10 +502,26 @@ packages:
    resolution: {integrity: sha512-g7nH6P6dyDioJogAAGprGpCtVImJhpPk/roCzdb3fIh61/s/nPsfR6onyMwkCAR/OlC3yBC0lESvUoQEAssIrw==}
    engines: {node: '>= 0.8'}

  diff@3.5.0:
    resolution: {integrity: sha512-A46qtFgd+g7pDZinpnwiRJtxbC1hpgf0uzP3iG89scHk0AUC7A1TGxf5OiiOUv/JMZR8GOt8hL900hV0bOy5xA==}
    engines: {node: '>=0.3.1'}

  diff@4.0.2:
    resolution: {integrity: sha512-58lmxKSA4BNyLz+HHMUzlOEpg09FV+ev6ZMe3vJihgdxzgcwZ8VoEEPmALCZG9LmqfVoNMMKpttIYTVG6uDY7A==}
    engines: {node: '>=0.3.1'}

  dotenv-expand@12.0.3:
    resolution: {integrity: sha512-uc47g4b+4k/M/SeaW1y4OApx+mtLWl92l5LMPP0GNXctZqELk+YGgOPIIC5elYmUH4OuoK3JLhuRUYegeySiFA==}
    engines: {node: '>=12'}

  dotenv@16.6.1:
    resolution: {integrity: sha512-uBq4egWHTcTt33a72vpSG0z3HnPuIl6NqYcTrKEg2azoEyl2hpW0zqlxysq2pK9HlDIHyHyakeYaYnSAwd8bow==}
    engines: {node: '>=12'}

  dotenv@17.2.3:
    resolution: {integrity: sha512-JVUnt+DUIzu87TABbhPmNfVdBDt18BLOWjMUFJMSi/Qqg7NTYtabbvSNJGOJ7afbRuv9D/lngizHtP7QyLQ+9w==}
    engines: {node: '>=12'}

  dunder-proto@1.0.1:
    resolution: {integrity: sha512-KIN/nDJBQRcXw0MLVhZE9iQHmG68qAVIBg9CqmUYjmQIhgij9U5MFvrqkUL5FbtyyzZuOeOt0zdeRe4UY7ct+A==}
    engines: {node: '>= 0.4'}
@@ -458,6 +533,13 @@ packages:
    resolution: {integrity: sha512-Q0n9HRi4m6JuGIV1eFlmvJB7ZEVxu93IrMyiMsGC0lrMJMWzRgx6WGquyfQgZVb31vhGgXnfmPNNXmxnOkRBrg==}
    engines: {node: '>= 0.8'}

  env-paths@2.2.1:
    resolution: {integrity: sha512-+h1lkLKhZMTYjog1VEpJNG7NZJWcuc2DDk/qsqSTRRCOXiLjeQ1d1/udrUGhqMxUgAlwKNZ0cf2uqan5GLuS2A==}
    engines: {node: '>=6'}

  error-ex@1.3.4:
    resolution: {integrity: sha512-sqQamAnR14VgCr1A618A3sGrygcpK+HEbenA/HiEAkkUwcZIIB/tgWqHFxWgOyDh4nB4JCRimh79dR5Ywc9MDQ==}

  es-define-property@1.0.1:
    resolution: {integrity: sha512-e3nRfgfUZ4rNGL232gUgX06QNyyez04KdjFrF+LTRoOXmrOgFKDg4BCdsjW8EnT69eqdYGmRpJwiPVYNrCaW3g==}
|
||||
engines: {node: '>= 0.4'}
|
||||
@@ -478,6 +560,10 @@ packages:
|
||||
escape-html@1.0.3:
|
||||
resolution: {integrity: sha512-NiSupZ4OeuGwr68lGIeym/ksIZMJodUGOSCZ/FSnTxcrekbvqrgdUxlJOMpijaKZVjAJrWrGs/6Jy8OMuyj9ow==}
|
||||
|
||||
escape-string-regexp@1.0.5:
|
||||
resolution: {integrity: sha512-vbRorB5FUQWvla16U8R/qgaFIya2qGzwDrNmCZuYKrbdSUMG6I1ZCGQRefkRVhuOkIGVne7BQ35DSfo1qvJqFg==}
|
||||
engines: {node: '>=0.8.0'}
|
||||
|
||||
etag@1.8.1:
|
||||
resolution: {integrity: sha512-aIL5Fx7mawVa300al2BnEE4iNvo1qETxLrPI/o05L7z6go7fCw1J6EQmbK4FmJ2AS7kgVF/KEZWufBfdClMcPg==}
|
||||
engines: {node: '>= 0.6'}
|
||||
@@ -502,6 +588,9 @@ packages:
|
||||
resolution: {integrity: sha512-Rx/WycZ60HOaqLKAi6cHRKKI7zxWbJ31MhntmtwMoaTeF7XFH9hhBp8vITaMidfljRQ6eYWCKkaTK+ykVJHP2A==}
|
||||
engines: {node: '>= 0.8'}
|
||||
|
||||
fs.realpath@1.0.0:
|
||||
resolution: {integrity: sha512-OO0pH2lK6a0hZnAdau5ItzHPI6pUlvI7jMVnxUQRtw4owF2wk8lOSabtGDCTP4Ggrg2MbGnWO9X8K1t4+fGMDw==}
|
||||
|
||||
fsevents@2.3.3:
|
||||
resolution: {integrity: sha512-5xoDfX+fL7faATnagmWPpbFtwh/R77WmMMqqHGS65C3vvB0YHrgF+B1YmZ3441tMj5n63k0212XNoJwzlhffQw==}
|
||||
engines: {node: ^8.16.0 || ^10.6.0 || >=11.0.0}
|
||||
@@ -521,10 +610,18 @@ packages:
|
||||
get-tsconfig@4.13.0:
|
||||
resolution: {integrity: sha512-1VKTZJCwBrvbd+Wn3AOgQP/2Av+TfTCOlE4AcRJE72W1ksZXbAx8PPBR9RzgTeSPzlPMHrbANMH3LbltH73wxQ==}
|
||||
|
||||
git-diff@2.0.6:
|
||||
resolution: {integrity: sha512-/Iu4prUrydE3Pb3lCBMbcSNIf81tgGt0W1ZwknnyF62t3tHmtiJTRj0f+1ZIhp3+Rh0ktz1pJVoa7ZXUCskivA==}
|
||||
engines: {node: '>= 4.8.0'}
|
||||
|
||||
glob-parent@5.1.2:
|
||||
resolution: {integrity: sha512-AOIgSQCepiJYwP3ARnGx+5VnTu2HBYdzbGP45eLw1vr3zB3vZLeyed1sC9hnbcOc9/SrMyM5RPQrkGz4aS9Zow==}
|
||||
engines: {node: '>= 6'}
|
||||
|
||||
glob@7.2.3:
|
||||
resolution: {integrity: sha512-nFR0zLpU2YCaRxwoCJvL6UvCH2JFyFVIvwTLsIf21AuHlMskA1hhTdk+LlYJtOlYt9v6dvszD2BGRqBL+iQK9Q==}
|
||||
deprecated: Glob versions prior to v9 are no longer supported
|
||||
|
||||
gopd@1.2.0:
|
||||
resolution: {integrity: sha512-ZUKRh6/kUFoAiTAtTYPZJ3hw9wNxx+BIBOijnlG9PnrJsCcSjs1wyyD6vJpaYtgnzDrKYRSqf3OO6Rfa93xsRg==}
|
||||
engines: {node: '>= 0.4'}
|
||||
@@ -533,6 +630,10 @@ packages:
|
||||
resolution: {integrity: sha512-sKJf1+ceQBr4SMkvQnBDNDtf4TXpVhVGateu0t918bl30FnbE2m4vNLX+VWe/dpjlb+HugGYzW7uQXH98HPEYw==}
|
||||
engines: {node: '>=4'}
|
||||
|
||||
has-flag@4.0.0:
|
||||
resolution: {integrity: sha512-EykJT/Q1KjTWctppgIAgfSO0tKVuZUjhgMr17kqTumMl6Afv3EISleU7qZUzoXDFTAHTDC4NOoG/ZxU3EvlMPQ==}
|
||||
engines: {node: '>=8'}
|
||||
|
||||
has-symbols@1.1.0:
|
||||
resolution: {integrity: sha512-1cDNdwJ2Jaohmb3sg4OmKaMBwuC48sYni5HUw2DvsC8LjGTLK9h+eb1X6RyuOHe4hT0ULCW68iomhjUoKUqlPQ==}
|
||||
engines: {node: '>= 0.4'}
|
||||
@@ -556,17 +657,36 @@ packages:
|
||||
ignore-by-default@1.0.1:
|
||||
resolution: {integrity: sha512-Ius2VYcGNk7T90CppJqcIkS5ooHUZyIQK+ClZfMfMNFEF9VSE73Fq+906u/CWu92x4gzZMWOwfFYckPObzdEbA==}
|
||||
|
||||
import-fresh@3.3.1:
|
||||
resolution: {integrity: sha512-TR3KfrTZTYLPB6jUjfx6MF9WcWrHL9su5TObK4ZkYgBdWKPOFoSoQIdEuTuR82pmtxH2spWG9h6etwfr1pLBqQ==}
|
||||
engines: {node: '>=6'}
|
||||
|
||||
inflight@1.0.6:
|
||||
resolution: {integrity: sha512-k92I/b08q4wvFscXCLvqfsHCrjrF7yiXsQuIVvVE7N82W3+aqpzuUdBbfhWcy/FZR3/4IgflMgKLOsvPDrGCJA==}
|
||||
deprecated: This module is not supported, and leaks memory. Do not use it. Check out lru-cache if you want a good and tested way to coalesce async requests by a key value, which is much more comprehensive and powerful.
|
||||
|
||||
inherits@2.0.4:
|
||||
resolution: {integrity: sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ==}
|
||||
|
||||
interpret@1.4.0:
|
||||
resolution: {integrity: sha512-agE4QfB2Lkp9uICn7BAqoscw4SZP9kTE2hxiFI3jBPmXJfdqiahTbUuKGsMoN2GtqL9AxhYioAcVvgsb1HvRbA==}
|
||||
engines: {node: '>= 0.10'}
|
||||
|
||||
ipaddr.js@1.9.1:
|
||||
resolution: {integrity: sha512-0KI/607xoxSToH7GjN1FfSbLoU0+btTicjsQSWQlh/hZykN8KpmMf7uYwPW3R+akZ6R/w18ZlXSHBYXiYUPO3g==}
|
||||
engines: {node: '>= 0.10'}
|
||||
|
||||
is-arrayish@0.2.1:
|
||||
resolution: {integrity: sha512-zz06S8t0ozoDXMG+ube26zeCTNXcKIPJZJi8hBrF4idCLms4CG9QtK7qBl1boi5ODzFpjswb5JPmHCbMpjaYzg==}
|
||||
|
||||
is-binary-path@2.1.0:
|
||||
resolution: {integrity: sha512-ZMERYes6pDydyuGidse7OsHxtbI7WVeUEozgR/g7rd0xUimYNlvZRE/K2MgZTjWy725IfelLeVcEM97mmtRGXw==}
|
||||
engines: {node: '>=8'}
|
||||
|
||||
is-core-module@2.16.1:
|
||||
resolution: {integrity: sha512-UfoeMA6fIJ8wTYFEUjelnaGI67v6+N7qXJEvQuIGa99l4xsCruSYOVSQ0uPANn4dAzm8lkYPaKLrrijLq7x23w==}
|
||||
engines: {node: '>= 0.4'}
|
||||
|
||||
is-extglob@2.1.1:
|
||||
resolution: {integrity: sha512-SbKbANkN603Vi4jEZv49LeVJMn4yGwsbzZworEoyEiutsN3nJYdbO36zfhGJ6QEDpOZIFkDtnq5JRxmvl3jsoQ==}
|
||||
engines: {node: '>=0.10.0'}
|
||||
@@ -582,10 +702,62 @@ packages:
|
||||
is-promise@4.0.0:
|
||||
resolution: {integrity: sha512-hvpoI6korhJMnej285dSg6nu1+e6uxs7zG3BYAm5byqDsgJNWwxzM6z6iZiAgQR4TJ30JmBTOwqZUw3WlyH3AQ==}
|
||||
|
||||
js-tokens@4.0.0:
|
||||
resolution: {integrity: sha512-RdJUflcE3cUzKiMqQgsCu06FPu9UdIJO0beYbPhHN4k6apgJtifcoCtT9bcxOpYBtpD2kCM6Sbzg4CausW/PKQ==}
|
||||
|
||||
js-yaml@4.1.1:
|
||||
resolution: {integrity: sha512-qQKT4zQxXl8lLwBtHMWwaTcGfFOZviOJet3Oy/xmGk2gZH677CJM9EvtfdSkgWcATZhj/55JZ0rmy3myCT5lsA==}
|
||||
hasBin: true
|
||||
|
||||
json-parse-even-better-errors@2.3.1:
|
||||
resolution: {integrity: sha512-xyFwyhro/JEof6Ghe2iz2NcXoj2sloNsWr/XsERDK/oiPCfaNhl5ONfp+jQdAZRQQ0IJWNzH9zIZF7li91kh2w==}
|
||||
|
||||
kysely-codegen@0.19.0:
|
||||
resolution: {integrity: sha512-ZpdQQnpfY0kh45CA6yPA9vdFsBE+b06Fx7QVcbL5rX//yjbA0yYGZGhnH7GTd4P4BY/HIv5uAfuOD83JVZf95w==}
|
||||
engines: {node: '>=20.0.0'}
|
||||
hasBin: true
|
||||
peerDependencies:
|
||||
'@libsql/kysely-libsql': '>=0.3.0 <0.5.0'
|
||||
'@tediousjs/connection-string': '>=0.5.0 <0.6.0'
|
||||
better-sqlite3: '>=7.6.2 <13.0.0'
|
||||
kysely: '>=0.27.0 <1.0.0'
|
||||
kysely-bun-sqlite: '>=0.3.2 <1.0.0'
|
||||
kysely-bun-worker: '>=1.2.0 <2.0.0'
|
||||
mysql2: '>=2.3.3 <4.0.0'
|
||||
pg: '>=8.8.0 <9.0.0'
|
||||
tarn: '>=3.0.0 <4.0.0'
|
||||
tedious: '>=18.0.0 <20.0.0'
|
||||
peerDependenciesMeta:
|
||||
'@libsql/kysely-libsql':
|
||||
optional: true
|
||||
'@tediousjs/connection-string':
|
||||
optional: true
|
||||
better-sqlite3:
|
||||
optional: true
|
||||
kysely-bun-sqlite:
|
||||
optional: true
|
||||
kysely-bun-worker:
|
||||
optional: true
|
||||
mysql2:
|
||||
optional: true
|
||||
pg:
|
||||
optional: true
|
||||
tarn:
|
||||
optional: true
|
||||
tedious:
|
||||
optional: true
|
||||
|
||||
kysely@0.28.9:
|
||||
resolution: {integrity: sha512-3BeXMoiOhpOwu62CiVpO6lxfq4eS6KMYfQdMsN/2kUCRNuF2YiEr7u0HLHaQU+O4Xu8YXE3bHVkwaQ85i72EuA==}
|
||||
engines: {node: '>=20.0.0'}
|
||||
|
||||
lines-and-columns@1.2.4:
|
||||
resolution: {integrity: sha512-7ylylesZQ/PV29jhEDl3Ufjo6ZX7gCqJr5F7PKrqc93v7fzSymt1BpwEU8nAUXs8qzzvqhbjhK5QZg6Mt/HkBg==}
|
||||
|
||||
loglevel@1.9.2:
|
||||
resolution: {integrity: sha512-HgMmCqIJSAKqo68l0rS2AanEWfkxaZ5wNiEFb5ggm08lDs9Xl2KxBlX3PTcaD2chBM1gXAYf491/M2Rv8Jwayg==}
|
||||
engines: {node: '>= 0.6.0'}
|
||||
|
||||
make-error@1.3.6:
|
||||
resolution: {integrity: sha512-s8UhlNe7vPKomQhC1qFelMokr/Sc3AgNbso3n74mVPA5LTZwkB9NlXf4XPamLxJE8h0gh73rM94xvwRT2CVInw==}
|
||||
|
||||
@@ -601,6 +773,10 @@ packages:
|
||||
resolution: {integrity: sha512-Snk314V5ayFLhp3fkUREub6WtjBfPdCPY1Ln8/8munuLuiYhsABgBVWsozAG+MWMbVEvcdcpbi9R7ww22l9Q3g==}
|
||||
engines: {node: '>=18'}
|
||||
|
||||
micromatch@4.0.8:
|
||||
resolution: {integrity: sha512-PXwfBhYu0hBCPw8Dn0E+WDYb7af3dSLVWKi3HGv84IdF4TyFoC0ysxFd0Goxw7nSv4T/PzEJQxsYsEiFCKo2BA==}
|
||||
engines: {node: '>=8.6'}
|
||||
|
||||
mime-db@1.54.0:
|
||||
resolution: {integrity: sha512-aU5EJuIN2WDemCcAp2vFBfp/m4EAhWJnUNSSw0ixs7/kXbd6Pg64EmwJkNdFhB8aWt1sH2CTXrLxo/iAGV3oPQ==}
|
||||
engines: {node: '>= 0.6'}
|
||||
@@ -612,6 +788,9 @@ packages:
|
||||
minimatch@3.1.2:
|
||||
resolution: {integrity: sha512-J7p63hRiAjw1NDEww1W7i37+ByIrOWO5XQQAzZ3VOcL0PNybwpfmV/N05zFAzwQ9USyEcX6t3UO+K5aqBQOIHw==}
|
||||
|
||||
minimist@1.2.8:
|
||||
resolution: {integrity: sha512-2yyAR8qBkN3YuheJanUpWC5U3bb5osDywNB8RzDVlDwDHbocAJveqqj1u8+SVD7jkWT4yvsHCpWqqWqAxb0zCA==}
|
||||
|
||||
ms@2.1.3:
|
||||
resolution: {integrity: sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA==}
|
||||
|
||||
@@ -649,10 +828,25 @@ packages:
|
||||
once@1.4.0:
|
||||
resolution: {integrity: sha512-lNaJgI+2Q5URQBkccEKHTQOPaXdUxnZZElQTZY0MFUAuaEqe1E+Nyvgdz/aIyNi6Z9MzO5dv1H8n58/GELp3+w==}
|
||||
|
||||
parent-module@1.0.1:
|
||||
resolution: {integrity: sha512-GQ2EWRpQV8/o+Aw8YqtfZZPfNRWZYkbidE9k5rpl/hC3vtHHBfGm2Ifi6qWV+coDGkrUKZAxE3Lot5kcsRlh+g==}
|
||||
engines: {node: '>=6'}
|
||||
|
||||
parse-json@5.2.0:
|
||||
resolution: {integrity: sha512-ayCKvm/phCGxOkYRSCM82iDwct8/EonSEgCSxWxD7ve6jHggsFl4fZVQBPRNgQoKiuV/odhFrGzQXZwbifC8Rg==}
|
||||
engines: {node: '>=8'}
|
||||
|
||||
parseurl@1.3.3:
|
||||
resolution: {integrity: sha512-CiyeOxFT/JZyN5m0z9PfXw4SCBJ6Sygz1Dpl0wqjlhDEGGBP1GnsUVEL0p63hoG1fcj3fHynXi9NYO4nWOL+qQ==}
|
||||
engines: {node: '>= 0.8'}
|
||||
|
||||
path-is-absolute@1.0.1:
|
||||
resolution: {integrity: sha512-AVbw3UJ2e9bq64vSaS9Am0fje1Pa8pbGqTTsmXfaIiMpnr5DlDhfJOuLj9Sf95ZPVDAUerDfEk88MPmPe7UCQg==}
|
||||
engines: {node: '>=0.10.0'}
|
||||
|
||||
path-parse@1.0.7:
|
||||
resolution: {integrity: sha512-LDJzPVEEEPR+y48z93A0Ed0yXb8pAByGWo/k5YYdYgpY2/2EsOsksJrq7lOHxryrVOn1ejG6oAp8ahvOIQD8sw==}
|
||||
|
||||
path-to-regexp@8.3.0:
|
||||
resolution: {integrity: sha512-7jdwVIRtsP8MYpdXSwOS0YdD0Du+qOoF/AEPIt88PcCFrZCzx41oxku1jD88hZBwbNUIEfpqvuhjFaMAqMTWnA==}
|
||||
|
||||
@@ -690,10 +884,17 @@ packages:
|
||||
pgpass@1.0.5:
|
||||
resolution: {integrity: sha512-FdW9r/jQZhSeohs1Z3sI1yxFQNFvMcnmfuj4WBMUTxOrAyLMaTcE1aAMBiTlbMNaXvBCQuVi0R7hd8udDSP7ug==}
|
||||
|
||||
picocolors@1.1.1:
|
||||
resolution: {integrity: sha512-xceH2snhtb5M9liqDsmEw56le376mTZkEX/jEb/RxNFyegNul7eNslCXP9FDj/Lcu0X8KEyMceP2ntpaHrDEVA==}
|
||||
|
||||
picomatch@2.3.1:
|
||||
resolution: {integrity: sha512-JU3teHTNjmE2VCGFzuY8EXzCDVwEqB2a8fsIvwaStHhAWJEeVd1o1QD80CU6+ZdEXXSLbSsuLwJjkCBWqRQUVA==}
|
||||
engines: {node: '>=8.6'}
|
||||
|
||||
pluralize@8.0.0:
|
||||
resolution: {integrity: sha512-Nc3IT5yHzflTfbjgqWcCPpo7DaKy4FnpB0l/zCAW0Tc7jxAiuqSxHasntB3D7887LSrA93kDJ9IXovxJYxyLCA==}
|
||||
engines: {node: '>=4'}
|
||||
|
||||
postgres-array@2.0.0:
|
||||
resolution: {integrity: sha512-VpZrUqU5A69eQyW2c5CA1jtLecCsN2U/bD6VilrFDWq5+5UIEVO7nazS3TEcHf1zuPYO/sqGvUvW62g86RXZuA==}
|
||||
engines: {node: '>=4'}
|
||||
@@ -733,9 +934,22 @@ packages:
|
||||
resolution: {integrity: sha512-hOS089on8RduqdbhvQ5Z37A0ESjsqz6qnRcffsMU3495FuTdqSm+7bhJ29JvIOsBDEEnan5DPu9t3To9VRlMzA==}
|
||||
engines: {node: '>=8.10.0'}
|
||||
|
||||
rechoir@0.6.2:
|
||||
resolution: {integrity: sha512-HFM8rkZ+i3zrV+4LQjwQ0W+ez98pApMGM3HUrN04j3CqzPOzl9nmP15Y8YXNm8QHGv/eacOVEjqhmWpkRV0NAw==}
|
||||
engines: {node: '>= 0.10'}
|
||||
|
||||
resolve-from@4.0.0:
|
||||
resolution: {integrity: sha512-pb/MYmXstAkysRFx8piNI1tGFNQIFA3vkE3Gq4EuA1dF6gHp/+vgZqsCGJapvy8N3Q+4o7FwvquPJcnZ7RYy4g==}
|
||||
engines: {node: '>=4'}
|
||||
|
||||
resolve-pkg-maps@1.0.0:
|
||||
resolution: {integrity: sha512-seS2Tj26TBVOC2NIc2rOe2y2ZO7efxITtLZcGSOnHHNOQ7CkiUBfw0Iw2ck6xkIhPwLhKNLS8BO+hEpngQlqzw==}
|
||||
|
||||
resolve@1.22.11:
|
||||
resolution: {integrity: sha512-RfqAvLnMl313r7c9oclB1HhUEAezcpLjz95wFH4LVuhk9JF/r22qmVP9AMmOU4vMX7Q8pN8jwNg/CSpdFnMjTQ==}
|
||||
engines: {node: '>= 0.4'}
|
||||
hasBin: true
|
||||
|
||||
router@2.2.0:
|
||||
resolution: {integrity: sha512-nLTrUKm2UyiL7rlhapu/Zl45FwNgkZGaCpZbIHajDYgwlJCOzLSk+cIPAnsEqV955GjILJnKbdQC1nVPz+gAYQ==}
|
||||
engines: {node: '>= 18'}
|
||||
@@ -762,6 +976,15 @@ packages:
|
||||
setprototypeof@1.2.0:
|
||||
resolution: {integrity: sha512-E5LDX7Wrp85Kil5bhZv46j8jOeboKq5JMmYM3gVGdGH8xFpPWXUMsNrlODCrkoxMEeNi/XZIwuRvY4XNwYMJpw==}
|
||||
|
||||
shelljs.exec@1.1.8:
|
||||
resolution: {integrity: sha512-vFILCw+lzUtiwBAHV8/Ex8JsFjelFMdhONIsgKNLgTzeRckp2AOYRQtHJE/9LhNvdMmE27AGtzWx0+DHpwIwSw==}
|
||||
engines: {node: '>= 4.0.0'}
|
||||
|
||||
shelljs@0.8.5:
|
||||
resolution: {integrity: sha512-TiwcRcrkhHvbrZbnRcFYMLl30Dfov3HKqzp5tO5b4pt6G/SezKcYhmDg15zXVBswHmctSAQKznqNW2LO5tTDow==}
|
||||
engines: {node: '>=4'}
|
||||
hasBin: true
|
||||
|
||||
side-channel-list@1.0.0:
|
||||
resolution: {integrity: sha512-FCLHtRD/gnpCiCHEiJLOwdmFP+wzCmDEkc9y7NsYxeF4u7Btsn1ZuwgwJGxImImHicJArLP4R0yX4c2KCrMrTA==}
|
||||
engines: {node: '>= 0.4'}
|
||||
@@ -798,6 +1021,14 @@ packages:
|
||||
resolution: {integrity: sha512-QjVjwdXIt408MIiAqCX4oUKsgU2EqAGzs2Ppkm4aQYbjm+ZEWEcW4SfFNTr4uMNZma0ey4f5lgLrkB0aX0QMow==}
|
||||
engines: {node: '>=4'}
|
||||
|
||||
supports-color@7.2.0:
|
||||
resolution: {integrity: sha512-qpCAvRl9stuOHveKsn7HncJRvv501qIacKzQlO/+Lwxc9+0q2wLyv4Dfvt80/DPn2pqOBsJdDiogXGR9+OvwRw==}
|
||||
engines: {node: '>=8'}
|
||||
|
||||
supports-preserve-symlinks-flag@1.0.0:
|
||||
resolution: {integrity: sha512-ot0WnXS9fgdkgIcePe6RHNk1WA8+muPa6cSjeR3V8K27q9BB1rTE3R1p7Hv0z1ZyAc8s6Vvv8DIyWf681MAt0w==}
|
||||
engines: {node: '>= 0.4'}
|
||||
|
||||
to-regex-range@5.0.1:
|
||||
resolution: {integrity: sha512-65P7iz6X5yEr1cwcgvQxbbIw7Uk3gOy5dIdtZ4rDveLqhrdJP+Li/Hx6tyK0NEb+2GCyneCMJiGqrADCSNk8sQ==}
|
||||
engines: {node: '>=8.0'}
|
||||
@@ -837,6 +1068,9 @@ packages:
|
||||
resolution: {integrity: sha512-OZs6gsjF4vMp32qrCbiVSkrFmXtG/AZhY3t0iAMrMBiAZyV9oALtXO8hsrHbMXF9x6L3grlFuwW2oAz7cav+Gw==}
|
||||
engines: {node: '>= 0.6'}
|
||||
|
||||
typeid-js@1.2.0:
|
||||
resolution: {integrity: sha512-t76ZucAnvGC60ea/HjVsB0TSoB0cw9yjnfurUgtInXQWUI/VcrlZGpO23KN3iSe8yOGUgb1zr7W7uEzJ3hSljA==}
|
||||
|
||||
typescript@5.9.3:
|
||||
resolution: {integrity: sha512-jl1vZzPDinLr9eUt3J/t7V6FgNEw9QjvBPdysz9KfQDD41fQrC2Y4vKQdiaUpFT4bXlb1RHhLpp8wtm6M5TgSw==}
|
||||
engines: {node: '>=14.17'}
|
||||
@@ -852,6 +1086,10 @@ packages:
|
||||
resolution: {integrity: sha512-pjy2bYhSsufwWlKwPc+l3cN7+wuJlK6uz0YdJEOlQDbl6jo/YlPi4mb8agUkVC8BF7V8NuzeyPNqRksA3hztKQ==}
|
||||
engines: {node: '>= 0.8'}
|
||||
|
||||
uuid@10.0.0:
|
||||
resolution: {integrity: sha512-8XkAphELsDnEGrDxUOHB3RGvXz6TeuYSGEZBOjtTtPm2lwhGBjLgOzLHB63IUWfBpNucQjND6d3AOudO+H3RWQ==}
|
||||
hasBin: true
|
||||
|
||||
v8-compile-cache-lib@3.0.1:
|
||||
resolution: {integrity: sha512-wa7YjyUGfNZngI/vtK0UHAN+lgDCxBPCylVXGp0zu59Fz5aiGtNXaq3DhIov063MorB+VfufLh3JlF2KdTK3xg==}
|
||||
|
||||
@@ -875,6 +1113,14 @@ packages:
|
||||
|
||||
snapshots:

  '@babel/code-frame@7.28.6':
    dependencies:
      '@babel/helper-validator-identifier': 7.28.5
      js-tokens: 4.0.0
      picocolors: 1.1.1

  '@babel/helper-validator-identifier@7.28.5': {}

  '@biomejs/biome@2.3.10':
    optionalDependencies:
      '@biomejs/cli-darwin-arm64': 2.3.10
@@ -1081,6 +1327,14 @@ snapshots:

  acorn@8.15.0: {}

  ansi-styles@3.2.1:
    dependencies:
      color-convert: 1.9.3

  ansi-styles@4.3.0:
    dependencies:
      color-convert: 2.0.1

  anymatch@3.1.3:
    dependencies:
      normalize-path: 3.0.0
@@ -1088,6 +1342,8 @@ snapshots:

  arg@4.1.3: {}

  argparse@2.0.1: {}

  asap@2.0.6: {}

  balanced-match@1.0.2: {}
@@ -1129,6 +1385,19 @@ snapshots:
      call-bind-apply-helpers: 1.0.2
      get-intrinsic: 1.3.0

  callsites@3.1.0: {}

  chalk@2.4.2:
    dependencies:
      ansi-styles: 3.2.1
      escape-string-regexp: 1.0.5
      supports-color: 5.5.0

  chalk@4.1.2:
    dependencies:
      ansi-styles: 4.3.0
      supports-color: 7.2.0

  chokidar@3.6.0:
    dependencies:
      anymatch: 3.1.3
@@ -1141,6 +1410,18 @@ snapshots:
    optionalDependencies:
      fsevents: 2.3.3

  color-convert@1.9.3:
    dependencies:
      color-name: 1.1.3

  color-convert@2.0.1:
    dependencies:
      color-name: 1.1.4

  color-name@1.1.3: {}

  color-name@1.1.4: {}

  commander@5.1.0: {}

  concat-map@0.0.1: {}
@@ -1155,6 +1436,15 @@ snapshots:

  cookie@0.7.2: {}

  cosmiconfig@9.0.0(typescript@5.9.3):
    dependencies:
      env-paths: 2.2.1
      import-fresh: 3.3.1
      js-yaml: 4.1.1
      parse-json: 5.2.0
    optionalDependencies:
      typescript: 5.9.3

  create-require@1.1.1: {}

  debug@4.4.3(supports-color@5.5.0):
@@ -1165,8 +1455,18 @@ snapshots:

  depd@2.0.0: {}

  diff@3.5.0: {}

  diff@4.0.2: {}

  dotenv-expand@12.0.3:
    dependencies:
      dotenv: 16.6.1

  dotenv@16.6.1: {}

  dotenv@17.2.3: {}

  dunder-proto@1.0.1:
    dependencies:
      call-bind-apply-helpers: 1.0.2
@@ -1177,6 +1477,12 @@ snapshots:

  encodeurl@2.0.0: {}

  env-paths@2.2.1: {}

  error-ex@1.3.4:
    dependencies:
      is-arrayish: 0.2.1

  es-define-property@1.0.1: {}

  es-errors@1.3.0: {}
@@ -1216,6 +1522,8 @@ snapshots:

  escape-html@1.0.3: {}

  escape-string-regexp@1.0.5: {}

  etag@1.8.1: {}

  express@5.1.0:
@@ -1269,6 +1577,8 @@ snapshots:

  fresh@2.0.0: {}

  fs.realpath@1.0.0: {}

  fsevents@2.3.3:
    optional: true

@@ -1296,14 +1606,33 @@ snapshots:
    dependencies:
      resolve-pkg-maps: 1.0.0

  git-diff@2.0.6:
    dependencies:
      chalk: 2.4.2
      diff: 3.5.0
      loglevel: 1.9.2
      shelljs: 0.8.5
      shelljs.exec: 1.1.8

  glob-parent@5.1.2:
    dependencies:
      is-glob: 4.0.3

  glob@7.2.3:
    dependencies:
      fs.realpath: 1.0.0
      inflight: 1.0.6
      inherits: 2.0.4
      minimatch: 3.1.2
      once: 1.4.0
      path-is-absolute: 1.0.1

  gopd@1.2.0: {}

  has-flag@3.0.0: {}

  has-flag@4.0.0: {}

  has-symbols@1.1.0: {}

  hasown@2.0.2:
@@ -1328,14 +1657,32 @@ snapshots:

  ignore-by-default@1.0.1: {}

  import-fresh@3.3.1:
    dependencies:
      parent-module: 1.0.1
      resolve-from: 4.0.0

  inflight@1.0.6:
    dependencies:
      once: 1.4.0
      wrappy: 1.0.2

  inherits@2.0.4: {}

  interpret@1.4.0: {}

  ipaddr.js@1.9.1: {}

  is-arrayish@0.2.1: {}

  is-binary-path@2.1.0:
    dependencies:
      binary-extensions: 2.3.0

  is-core-module@2.16.1:
    dependencies:
      hasown: 2.0.2

  is-extglob@2.1.1: {}

  is-glob@4.0.3:
@@ -1346,8 +1693,37 @@ snapshots:

  is-promise@4.0.0: {}

  js-tokens@4.0.0: {}

  js-yaml@4.1.1:
    dependencies:
      argparse: 2.0.1

  json-parse-even-better-errors@2.3.1: {}

  kysely-codegen@0.19.0(kysely@0.28.9)(pg@8.16.3)(typescript@5.9.3):
    dependencies:
      chalk: 4.1.2
      cosmiconfig: 9.0.0(typescript@5.9.3)
      dotenv: 17.2.3
      dotenv-expand: 12.0.3
      git-diff: 2.0.6
      kysely: 0.28.9
      micromatch: 4.0.8
      minimist: 1.2.8
      pluralize: 8.0.0
      zod: 4.1.12
    optionalDependencies:
      pg: 8.16.3
    transitivePeerDependencies:
      - typescript

  kysely@0.28.9: {}

  lines-and-columns@1.2.4: {}

  loglevel@1.9.2: {}

  make-error@1.3.6: {}

  math-intrinsics@1.1.0: {}
@@ -1356,6 +1732,11 @@ snapshots:

  merge-descriptors@2.0.0: {}

  micromatch@4.0.8:
    dependencies:
      braces: 3.0.3
      picomatch: 2.3.1

  mime-db@1.54.0: {}

  mime-types@3.0.1:
@@ -1366,6 +1747,8 @@ snapshots:
    dependencies:
      brace-expansion: 1.1.12

  minimist@1.2.8: {}

  ms@2.1.3: {}

  negotiator@1.0.0: {}
@@ -1403,8 +1786,23 @@ snapshots:
    dependencies:
      wrappy: 1.0.2

  parent-module@1.0.1:
    dependencies:
      callsites: 3.1.0

  parse-json@5.2.0:
    dependencies:
      '@babel/code-frame': 7.28.6
      error-ex: 1.3.4
      json-parse-even-better-errors: 2.3.1
      lines-and-columns: 1.2.4

  parseurl@1.3.3: {}

  path-is-absolute@1.0.1: {}

  path-parse@1.0.7: {}

  path-to-regexp@8.3.0: {}

  pg-cloudflare@1.2.7:
@@ -1442,8 +1840,12 @@ snapshots:
    dependencies:
      split2: 4.2.0

  picocolors@1.1.1: {}

  picomatch@2.3.1: {}

  pluralize@8.0.0: {}

  postgres-array@2.0.0: {}

  postgres-bytea@1.0.1: {}
@@ -1478,8 +1880,20 @@ snapshots:
    dependencies:
      picomatch: 2.3.1

  rechoir@0.6.2:
    dependencies:
      resolve: 1.22.11

  resolve-from@4.0.0: {}

  resolve-pkg-maps@1.0.0: {}

  resolve@1.22.11:
    dependencies:
      is-core-module: 2.16.1
      path-parse: 1.0.7
      supports-preserve-symlinks-flag: 1.0.0

  router@2.2.0:
    dependencies:
      debug: 4.4.3(supports-color@5.5.0)
@@ -1523,6 +1937,14 @@ snapshots:

  setprototypeof@1.2.0: {}

  shelljs.exec@1.1.8: {}

  shelljs@0.8.5:
    dependencies:
      glob: 7.2.3
      interpret: 1.4.0
      rechoir: 0.6.2

  side-channel-list@1.0.0:
    dependencies:
      es-errors: 1.3.0
@@ -1565,6 +1987,12 @@ snapshots:
    dependencies:
      has-flag: 3.0.0

  supports-color@7.2.0:
    dependencies:
      has-flag: 4.0.0

  supports-preserve-symlinks-flag@1.0.0: {}

  to-regex-range@5.0.1:
    dependencies:
      is-number: 7.0.0
@@ -1606,6 +2034,10 @@ snapshots:
      media-typer: 1.1.0
      mime-types: 3.0.1

  typeid-js@1.2.0:
    dependencies:
      uuid: 10.0.0

  typescript@5.9.3: {}

  undefsafe@2.0.5: {}
@@ -1614,6 +2046,8 @@ snapshots:

  unpipe@1.0.0: {}

  uuid@10.0.0: {}

  v8-compile-cache-lib@3.0.1: {}

  vary@1.1.2: {}
backend/diachron/routing.ts (new file, 93 lines)
@@ -0,0 +1,93 @@
import { contentTypes } from "./content-types";
import { httpCodes } from "./http-codes";
import express, {
  type NextFunction,
  type Request as ExpressRequest,
  type Response as ExpressResponse,
} from "express";

import {
  isRedirect,
  InternalHandler,
  AuthenticationRequired,
  AuthorizationDenied,
  Call,
  type Method,
  type ProcessedRoute,
  methodParser,
  type Result,
  type Route,
  massageMethod,
} from "./types";
import { runWithContext } from "./context";
import { Session } from "./auth";
import { request } from "./request";
import { match } from "path-to-regexp";

type ProcessedRoutes = { [K in Method]: ProcessedRoute[] };
const processRoutes = (routes: Route[]): ProcessedRoutes => {
  const retval: ProcessedRoutes = {
    GET: [],
    POST: [],
    PUT: [],
    PATCH: [],
    DELETE: [],
  };

  routes.forEach((route: Route, _idx: number, _allRoutes: Route[]) => {
    // const pattern /*: URLPattern */ = new URLPattern({ pathname: route.path });
    const matcher = match<Record<string, string>>(route.path);
    const methodList = route.methods;

    const handler: InternalHandler = async (
      expressRequest: ExpressRequest,
    ): Promise<Result> => {
      const method = massageMethod(expressRequest.method);

      console.log("method", method);

      if (!methodList.includes(method)) {
        // XXX: Worth asserting this?
      }

      console.log("request.originalUrl", expressRequest.originalUrl);

      // Authenticate the request
      const auth = await request.auth.validateRequest(expressRequest);

      const req: Call = {
        pattern: route.path,
        path: expressRequest.originalUrl,
        method,
        parameters: { one: 1, two: 2 },
        request: expressRequest,
        user: auth.user,
        session: new Session(auth.session, auth.user),
      };

      try {
        const retval = await runWithContext({ user: auth.user }, () =>
          route.handler(req),
        );
        return retval;
      } catch (error) {
        // Handle authentication errors
        if (error instanceof AuthenticationRequired) {
          return {
            code: httpCodes.clientErrors.Unauthorized,
            contentType: contentTypes.application.json,
            result: JSON.stringify({
              error: "Authentication required",
            }),
          };
        }
        if (error instanceof AuthorizationDenied) {
          return {
            code: httpCodes.clientErrors.Forbidden,
            contentType: contentTypes.application.json,
            result: JSON.stringify({ error: "Access denied" }),
          };
        }
        throw error;
      }
    };

    for (const [_idx, method] of methodList.entries()) {
      const pr: ProcessedRoute = { matcher, method, handler };

      retval[method].push(pr);
    }
  });

  return retval;
};

export { processRoutes };
backend/diachron/types.spec.ts (new file, 179 lines)
@@ -0,0 +1,179 @@
// Tests for types.ts
// Pure unit tests

import assert from "node:assert/strict";
import { describe, it } from "node:test";
import type { Request as ExpressRequest } from "express";
import { Session } from "./auth/types";
import { contentTypes } from "./content-types";
import { httpCodes } from "./http-codes";
import {
  AuthenticationRequired,
  AuthorizationDenied,
  type Call,
  isRedirect,
  massageMethod,
  methodParser,
  type Permission,
  type RedirectResult,
  type Result,
  requireAuth,
  requirePermission,
} from "./types";
import { AuthenticatedUser, anonymousUser } from "./user";

// Helper to create a minimal mock Call
function createMockCall(overrides: Partial<Call> = {}): Call {
  const defaultSession = new Session(null, anonymousUser);
  return {
    pattern: "/test",
    path: "/test",
    method: "GET",
    parameters: {},
    request: {} as ExpressRequest,
    user: anonymousUser,
    session: defaultSession,
    ...overrides,
  };
}

describe("types", () => {
  describe("methodParser", () => {
    it("accepts valid HTTP methods", () => {
      assert.equal(methodParser.parse("GET"), "GET");
      assert.equal(methodParser.parse("POST"), "POST");
      assert.equal(methodParser.parse("PUT"), "PUT");
      assert.equal(methodParser.parse("PATCH"), "PATCH");
      assert.equal(methodParser.parse("DELETE"), "DELETE");
    });

    it("rejects invalid methods", () => {
      assert.throws(() => methodParser.parse("get"));
      assert.throws(() => methodParser.parse("OPTIONS"));
      assert.throws(() => methodParser.parse("HEAD"));
      assert.throws(() => methodParser.parse(""));
    });
  });

  describe("massageMethod", () => {
    it("converts lowercase to uppercase", () => {
      assert.equal(massageMethod("get"), "GET");
      assert.equal(massageMethod("post"), "POST");
      assert.equal(massageMethod("put"), "PUT");
      assert.equal(massageMethod("patch"), "PATCH");
      assert.equal(massageMethod("delete"), "DELETE");
    });

    it("handles mixed case", () => {
      assert.equal(massageMethod("Get"), "GET");
      assert.equal(massageMethod("pOsT"), "POST");
    });

    it("throws for invalid methods", () => {
      assert.throws(() => massageMethod("options"));
      assert.throws(() => massageMethod("head"));
    });
  });

  describe("isRedirect", () => {
    it("returns true for redirect results", () => {
      const result: RedirectResult = {
        code: httpCodes.redirection.Found,
        contentType: contentTypes.text.html,
        result: "",
        redirect: "/other",
      };
      assert.equal(isRedirect(result), true);
    });

    it("returns false for non-redirect results", () => {
      const result: Result = {
        code: httpCodes.success.OK,
        contentType: contentTypes.text.html,
        result: "hello",
      };
      assert.equal(isRedirect(result), false);
    });
  });

  describe("AuthenticationRequired", () => {
    it("has correct name and message", () => {
      const err = new AuthenticationRequired();
      assert.equal(err.name, "AuthenticationRequired");
      assert.equal(err.message, "Authentication required");
    });

    it("is an instance of Error", () => {
      const err = new AuthenticationRequired();
      assert.ok(err instanceof Error);
    });
  });

  describe("AuthorizationDenied", () => {
    it("has correct name and message", () => {
      const err = new AuthorizationDenied();
      assert.equal(err.name, "AuthorizationDenied");
      assert.equal(err.message, "Authorization denied");
    });

    it("is an instance of Error", () => {
      const err = new AuthorizationDenied();
      assert.ok(err instanceof Error);
    });
  });

  describe("requireAuth", () => {
    it("returns user for authenticated call", () => {
      const user = AuthenticatedUser.create("test@example.com");
      const session = new Session(null, user);
      const call = createMockCall({ user, session });

      const result = requireAuth(call);
      assert.equal(result, user);
    });

    it("throws AuthenticationRequired for anonymous user", () => {
      const call = createMockCall({ user: anonymousUser });

      assert.throws(() => requireAuth(call), AuthenticationRequired);
    });
  });

  describe("requirePermission", () => {
    it("returns user when they have the permission", () => {
      const user = AuthenticatedUser.create("test@example.com", {
        permissions: ["posts:create" as Permission],
      });
      const session = new Session(null, user);
      const call = createMockCall({ user, session });

      const result = requirePermission(
        call,
        "posts:create" as Permission,
      );
      assert.equal(result, user);
    });

    it("throws AuthenticationRequired for anonymous user", () => {
      const call = createMockCall({ user: anonymousUser });

      assert.throws(
        () => requirePermission(call, "posts:create" as Permission),
        AuthenticationRequired,
      );
    });

    it("throws AuthorizationDenied when missing permission", () => {
      const user = AuthenticatedUser.create("test@example.com", {
        permissions: ["posts:read" as Permission],
      });
      const session = new Session(null, user);
      const call = createMockCall({ user, session });

      assert.throws(
        () => requirePermission(call, "posts:create" as Permission),
        AuthorizationDenied,
      );
    });
  });
});
213  backend/diachron/user.spec.ts  Normal file
@@ -0,0 +1,213 @@
// Tests for user.ts
// These are pure unit tests - no database needed

import assert from "node:assert/strict";
import { describe, it } from "node:test";
import {
  AnonymousUser,
  AuthenticatedUser,
  anonymousUser,
  type Permission,
  type Role,
} from "./user";

describe("User", () => {
  describe("AuthenticatedUser.create", () => {
    it("creates a user with default values", () => {
      const user = AuthenticatedUser.create("test@example.com");

      assert.equal(user.email, "test@example.com");
      assert.equal(user.status, "active");
      assert.equal(user.isAnonymous(), false);
      assert.deepEqual([...user.roles], []);
      assert.deepEqual([...user.permissions], []);
    });

    it("creates a user with custom values", () => {
      const user = AuthenticatedUser.create("test@example.com", {
        id: "custom-id",
        displayName: "Test User",
        status: "pending",
        roles: ["admin"],
        permissions: ["posts:create"],
      });

      assert.equal(user.id, "custom-id");
      assert.equal(user.displayName, "Test User");
      assert.equal(user.status, "pending");
      assert.deepEqual([...user.roles], ["admin"]);
      assert.deepEqual([...user.permissions], ["posts:create"]);
    });
  });

  describe("status checks", () => {
    it("isActive returns true for active users", () => {
      const user = AuthenticatedUser.create("test@example.com", {
        status: "active",
      });
      assert.equal(user.isActive(), true);
    });

    it("isActive returns false for suspended users", () => {
      const user = AuthenticatedUser.create("test@example.com", {
        status: "suspended",
      });
      assert.equal(user.isActive(), false);
    });

    it("isActive returns false for pending users", () => {
      const user = AuthenticatedUser.create("test@example.com", {
        status: "pending",
      });
      assert.equal(user.isActive(), false);
    });
  });

  describe("role checks", () => {
    it("hasRole returns true when user has the role", () => {
      const user = AuthenticatedUser.create("test@example.com", {
        roles: ["admin", "editor"],
      });
      assert.equal(user.hasRole("admin"), true);
      assert.equal(user.hasRole("editor"), true);
    });

    it("hasRole returns false when user does not have the role", () => {
      const user = AuthenticatedUser.create("test@example.com", {
        roles: ["user"],
      });
      assert.equal(user.hasRole("admin"), false);
    });

    it("hasAnyRole returns true when user has at least one role", () => {
      const user = AuthenticatedUser.create("test@example.com", {
        roles: ["editor"],
      });
      assert.equal(user.hasAnyRole(["admin", "editor"]), true);
    });

    it("hasAnyRole returns false when user has none of the roles", () => {
      const user = AuthenticatedUser.create("test@example.com", {
        roles: ["user"],
      });
      assert.equal(user.hasAnyRole(["admin", "editor"]), false);
    });

    it("hasAllRoles returns true when user has all roles", () => {
      const user = AuthenticatedUser.create("test@example.com", {
        roles: ["admin", "editor", "user"],
      });
      assert.equal(user.hasAllRoles(["admin", "editor"]), true);
    });

    it("hasAllRoles returns false when user is missing a role", () => {
      const user = AuthenticatedUser.create("test@example.com", {
        roles: ["admin"],
      });
      assert.equal(user.hasAllRoles(["admin", "editor"]), false);
    });
  });

  describe("permission checks", () => {
    it("hasPermission returns true for direct permissions", () => {
      const user = AuthenticatedUser.create("test@example.com", {
        permissions: ["posts:create" as Permission],
      });
      assert.equal(
        user.hasPermission("posts:create" as Permission),
        true,
      );
    });

    it("hasPermission returns true for role-derived permissions", () => {
      const user = AuthenticatedUser.create("test@example.com", {
        roles: ["admin" as Role],
      });
      // admin role has users:read, users:create, users:update, users:delete
      assert.equal(user.hasPermission("users:read" as Permission), true);
      assert.equal(
        user.hasPermission("users:delete" as Permission),
        true,
      );
    });

    it("hasPermission returns false when permission not granted", () => {
      const user = AuthenticatedUser.create("test@example.com", {
        roles: ["user" as Role],
      });
      // user role only has users:read
      assert.equal(
        user.hasPermission("users:delete" as Permission),
        false,
      );
    });

    it("can() is a convenience method for hasPermission", () => {
      const user = AuthenticatedUser.create("test@example.com", {
        roles: ["admin" as Role],
      });
      assert.equal(user.can("read", "users"), true);
      assert.equal(user.can("delete", "users"), true);
      assert.equal(user.can("create", "posts"), false);
    });
  });

  describe("effectivePermissions", () => {
    it("returns combined direct and role-derived permissions", () => {
      const user = AuthenticatedUser.create("test@example.com", {
        roles: ["user" as Role],
        permissions: ["posts:create" as Permission],
      });

      const perms = user.effectivePermissions();
      assert.equal(perms.has("posts:create" as Permission), true);
      assert.equal(perms.has("users:read" as Permission), true); // from user role
    });

    it("returns empty set for user with no roles or permissions", () => {
      const user = AuthenticatedUser.create("test@example.com");
      const perms = user.effectivePermissions();
      assert.equal(perms.size, 0);
    });
  });

  describe("serialization", () => {
    it("toJSON returns plain object", () => {
      const user = AuthenticatedUser.create("test@example.com", {
        id: "test-id",
        displayName: "Test",
        status: "active",
        roles: ["admin"],
        permissions: ["posts:create"],
      });

      const json = user.toJSON();
      assert.equal(json.id, "test-id");
      assert.equal(json.email, "test@example.com");
      assert.equal(json.displayName, "Test");
      assert.equal(json.status, "active");
      assert.deepEqual(json.roles, ["admin"]);
      assert.deepEqual(json.permissions, ["posts:create"]);
    });

    it("toString returns readable string", () => {
      const user = AuthenticatedUser.create("test@example.com", {
        id: "test-id",
      });
      assert.equal(user.toString(), "User(id test-id)");
    });
  });

  describe("AnonymousUser", () => {
    it("isAnonymous returns true", () => {
      const user = AnonymousUser.create("anon@example.com");
      assert.equal(user.isAnonymous(), true);
    });

    it("anonymousUser singleton is anonymous", () => {
      assert.equal(anonymousUser.isAnonymous(), true);
      assert.equal(anonymousUser.id, "-1");
      assert.equal(anonymousUser.email, "anonymous@example.com");
    });
  });
});
61  backend/diachron/util.spec.ts  Normal file
@@ -0,0 +1,61 @@
// Tests for util.ts
// Pure unit tests with filesystem

import assert from "node:assert/strict";
import { mkdir, rm, writeFile } from "node:fs/promises";
import { join } from "node:path";
import { after, before, describe, it } from "node:test";
import { loadFile } from "./util";

describe("util", () => {
  const testDir = join(import.meta.dirname, ".test-util-tmp");

  before(async () => {
    await mkdir(testDir, { recursive: true });
  });

  after(async () => {
    await rm(testDir, { recursive: true, force: true });
  });

  describe("loadFile", () => {
    it("loads file contents as string", async () => {
      const testFile = join(testDir, "test.txt");
      await writeFile(testFile, "hello world");

      const content = await loadFile(testFile);
      assert.equal(content, "hello world");
    });

    it("handles utf-8 content", async () => {
      const testFile = join(testDir, "utf8.txt");
      await writeFile(testFile, "hello \u{1F511} world");

      const content = await loadFile(testFile);
      assert.equal(content, "hello \u{1F511} world");
    });

    it("handles empty file", async () => {
      const testFile = join(testDir, "empty.txt");
      await writeFile(testFile, "");

      const content = await loadFile(testFile);
      assert.equal(content, "");
    });

    it("handles multiline content", async () => {
      const testFile = join(testDir, "multiline.txt");
      await writeFile(testFile, "line1\nline2\nline3");

      const content = await loadFile(testFile);
      assert.equal(content, "line1\nline2\nline3");
    });

    it("throws for nonexistent file", async () => {
      await assert.rejects(
        loadFile(join(testDir, "nonexistent.txt")),
        /ENOENT/,
      );
    });
  });
});
109  backend/generated/db.d.ts  vendored  Normal file
@@ -0,0 +1,109 @@
/**
 * This file was generated by kysely-codegen.
 * Please do not edit it manually.
 */

import type { ColumnType } from "kysely";

export type Generated<T> =
  T extends ColumnType<infer S, infer I, infer U>
    ? ColumnType<S, I | undefined, U>
    : ColumnType<T, T | undefined, T>;

export type Timestamp = ColumnType<Date, Date | string, Date | string>;

export interface _Migrations {
  applied_at: Generated<Timestamp>;
  id: Generated<number>;
  name: string;
}

export interface Capabilities {
  description: string | null;
  id: string;
  name: string;
}

export interface Groups {
  created_at: Generated<Timestamp>;
  id: string;
  name: string;
}

export interface RoleCapabilities {
  capability_id: string;
  granted_at: Generated<Timestamp>;
  revoked_at: Timestamp | null;
  role_id: string;
}

export interface Roles {
  description: string | null;
  id: string;
  name: string;
}

export interface Sessions {
  auth_method: string;
  created_at: Generated<Timestamp>;
  expires_at: Timestamp;
  id: Generated<string>;
  ip_address: string | null;
  is_used: Generated<boolean | null>;
  revoked_at: Timestamp | null;
  token_hash: string;
  token_type: string;
  user_agent: string | null;
  user_email_id: string | null;
  user_id: string;
}

export interface UserCredentials {
  created_at: Generated<Timestamp>;
  credential_type: Generated<string>;
  id: string;
  password_hash: string | null;
  updated_at: Generated<Timestamp>;
  user_id: string;
}

export interface UserEmails {
  created_at: Generated<Timestamp>;
  email: string;
  id: string;
  is_primary: Generated<boolean>;
  is_verified: Generated<boolean>;
  normalized_email: string;
  revoked_at: Timestamp | null;
  user_id: string;
  verified_at: Timestamp | null;
}

export interface UserGroupRoles {
  granted_at: Generated<Timestamp>;
  group_id: string;
  revoked_at: Timestamp | null;
  role_id: string;
  user_id: string;
}

export interface Users {
  created_at: Generated<Timestamp>;
  display_name: string | null;
  id: string;
  status: Generated<string>;
  updated_at: Generated<Timestamp>;
}

export interface DB {
  _migrations: _Migrations;
  capabilities: Capabilities;
  groups: Groups;
  role_capabilities: RoleCapabilities;
  roles: Roles;
  sessions: Sessions;
  user_credentials: UserCredentials;
  user_emails: UserEmails;
  user_group_roles: UserGroupRoles;
  users: Users;
}
71  backend/handlers.spec.ts  Normal file
@@ -0,0 +1,71 @@
// Tests for handlers.ts
// These tests use mock Call objects

import assert from "node:assert/strict";
import { describe, it } from "node:test";
import type { Request as ExpressRequest } from "express";
import { Session } from "./diachron/auth/types";
import { contentTypes } from "./diachron/content-types";
import { multiHandler } from "./handlers";
import { httpCodes } from "./diachron/http-codes";
import type { Call } from "./diachron/types";
import { anonymousUser } from "./diachron/user";

// Helper to create a minimal mock Call
function createMockCall(overrides: Partial<Call> = {}): Call {
  const defaultSession = new Session(null, anonymousUser);
  return {
    pattern: "/test",
    path: "/test",
    method: "GET",
    parameters: {},
    request: {} as ExpressRequest,
    user: anonymousUser,
    session: defaultSession,
    ...overrides,
  };
}

describe("handlers", () => {
  describe("multiHandler", () => {
    it("returns OK status", async () => {
      const call = createMockCall({ method: "GET" });
      const result = await multiHandler(call);

      assert.equal(result.code, httpCodes.success.OK);
    });

    it("returns text/plain content type", async () => {
      const call = createMockCall();
      const result = await multiHandler(call);

      assert.equal(result.contentType, contentTypes.text.plain);
    });

    it("includes method in result", async () => {
      const call = createMockCall({ method: "POST" });
      const result = await multiHandler(call);

      assert.ok(result.result.includes("POST"));
    });

    it("includes a random number in result", async () => {
      const call = createMockCall();
      const result = await multiHandler(call);

      // Result format: "that was GET (0.123456789)"
      assert.match(result.result, /that was \w+ \(\d+\.?\d*\)/);
    });

    it("works with different HTTP methods", async () => {
      const methods = ["GET", "POST", "PUT", "PATCH", "DELETE"] as const;

      for (const method of methods) {
        const call = createMockCall({ method });
        const result = await multiHandler(call);

        assert.ok(result.result.includes(method));
      }
    });
  });
});
@@ -1,7 +1,9 @@
-import { contentTypes } from "./content-types";
-import { core } from "./core";
-import { httpCodes } from "./http-codes";
-import type { Call, Handler, Result } from "./types";
+// This is a sample file provided by diachron. You are encouraged to modify it.
+
+import { contentTypes } from "./diachron/content-types";
+import { core } from "./diachron/core";
+import { httpCodes } from "./diachron/http-codes";
+import type { Call, Handler, Result } from "./diachron/types";
 
 const multiHandler: Handler = async (call: Call): Promise<Result> => {
   const code = httpCodes.success.OK;
1  backend/migrations/2026-01-15_01.sql  Normal file
@@ -0,0 +1 @@
CREATE TABLE test_application_table ();
21  backend/package.json  Normal file
@@ -0,0 +1,21 @@
{
  "name": "my app",
  "version": "0.0.1",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "DB_PORT=5433 DB_USER=diachron_test DB_PASSWORD=diachron_test DB_NAME=diachron_test tsx --test '**/*.{test,spec}.ts'",
    "test:watch": "DB_PORT=5433 DB_USER=diachron_test DB_PASSWORD=diachron_test DB_NAME=diachron_test tsx --test --watch '**/*.{test,spec}.ts'",
    "nodemon": "nodemon dist/index.js",
    "kysely-codegen": "kysely-codegen"
  },
  "keywords": [],
  "author": "",
  "license": "ISC",
  "packageManager": "pnpm@10.12.4",
  "dependencies": {
    "diachron": "workspace:*"
  },
  "devDependencies": {
  }
}
2  backend/pnpm-workspace.yaml  Normal file
@@ -0,0 +1,2 @@
packages:
  - 'diachron'
@@ -1,14 +1,16 @@
+// This is a sample file provided by diachron. You are encouraged to modify it.
+
 /// <reference lib="dom" />
 
 import nunjucks from "nunjucks";
 import { DateTime } from "ts-luxon";
-import { authRoutes } from "./auth/routes";
-import { routes as basicRoutes } from "./basic/routes";
-import { contentTypes } from "./content-types";
-import { core } from "./core";
+import { authRoutes } from "./diachron/auth/routes";
+import { routes as basicRoutes } from "./diachron/basic/routes";
+import { contentTypes } from "./diachron/content-types";
+import { core } from "./diachron/core";
 import { multiHandler } from "./handlers";
-import { httpCodes } from "./http-codes";
-import type { Call, Result, Route } from "./types";
+import { httpCodes } from "./diachron/http-codes";
+import type { Call, Result, Route } from "./diachron/types";
 
 // FIXME: Obviously put this somewhere else
 const okText = (result: string): Result => {
@@ -27,6 +29,9 @@ const routes: Route[] = [
   ...authRoutes,
   basicRoutes.home,
   basicRoutes.hello,
+  {...basicRoutes.hello,
+    path: "/yo-whats-up"
+  },
   basicRoutes.login,
   basicRoutes.logout,
   {
@@ -35,7 +40,7 @@
     handler: async (_call: Call): Promise<Result> => {
       console.log("starting slow request");
 
-      await core.misc.sleep(2);
+      await core.misc.sleep(5000);
 
       console.log("finishing slow request");
       const retval = okText("that was slow");
@@ -72,7 +77,6 @@
   `;
   const result = nunjucks.renderString(template, { rrr });
 
-  const _listing = lr(routes).join(", ");
   return {
     code,
     result,
@@ -6,4 +6,4 @@ DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 
 cd "$DIR"
 
-exec ../cmd node dist/index.js "$@"
+exec ../cmd node --enable-source-maps dist/index.js "$@"
15  backend/services.ts  Normal file
@@ -0,0 +1,15 @@
// This is a sample file provided by diachron. You are encouraged to modify it.

// Application services go here. A service encapsulates a capability that
// handlers depend on: database queries, external API calls, business logic
// that doesn't belong in a handler.
//
// The framework provides core services via `core` (from ./diachron/core):
// core.database, core.logging, core.misc, etc. This file is for your
// application's own services.

import { core } from "./diachron/core";

const db = core.database.db;

export { db };
@@ -6,7 +6,7 @@ DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 
 check_dir="$DIR"
 
-source "$check_dir"/../framework/shims/common
-source "$check_dir"/../framework/shims/node.common
+source "$check_dir"/../diachron/shims/common
+source "$check_dir"/../diachron/shims/node.common
 
 $ROOT/cmd pnpm tsc --showConfig
@@ -9,5 +9,6 @@
     "strict": true,
     "types": ["node"],
     "outDir": "out"
-  }
+  },
+  "exclude": ["**/*.spec.ts", "**/*.test.ts", "check-deps.ts"]
 }
8  backend/types.ts  Normal file
@@ -0,0 +1,8 @@
// This is a sample file provided by diachron. You are encouraged to modify it.

// Application-specific types go here. Framework types (Call, Result, Route,
// Handler, etc.) are defined in ./diachron/types and should be imported from
// there.
//
// This file is for your domain types: the nouns and shapes that are specific
// to your application.
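As a sketch of what might live in such a file: the `Post` shape and its fields below are purely hypothetical, not part of the sample file or the framework.

```typescript
// Hypothetical domain types for a blog-like app -- illustrative only.
type PostStatus = "draft" | "published";

interface Post {
  id: string;
  title: string;
  status: PostStatus;
  createdAt: Date;
}

// Handlers and services would construct and pass these around.
const example: Post = {
  id: "p1",
  title: "First post",
  status: "draft",
  createdAt: new Date(),
};

export { example };
export type { Post, PostStatus };
```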
@@ -6,8 +6,8 @@ DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 
 check_dir="$DIR"
 
-source "$check_dir"/../framework/shims/common
-source "$check_dir"/../framework/shims/node.common
+source "$check_dir"/../diachron/shims/common
+source "$check_dir"/../diachron/shims/node.common
 
 # $ROOT/cmd pnpm tsc --lib ES2023 --esModuleInterop -w $check_dir/app.ts
 # $ROOT/cmd pnpm tsc -w $check_dir/app.ts
49  bootstrap.sh  Executable file
@@ -0,0 +1,49 @@
#!/bin/bash
# shellcheck disable=SC2002

set -eu
set -o pipefail
IFS=$'\n\t'

# print useful message on failure
trap 's=$?; echo >&2 "$0: Error on line "$LINENO": $BASH_COMMAND"; exit $s' ERR

# shellcheck disable=SC2034
DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

# cd "$DIR"
here="$PWD"

"$DIR/update-cached-repository.sh"

# repository="${2:-https://gitea.philologue.net/philologue/diachron}"
repository="${2:-$HOME/.cache/diachron/v1/repositories/diachron.git}"
ref="${1:-hydrators-kysely}"

echo will bootstrap ref "$ref" of repo "$repository"

into=$(mktemp -d)
cd "$into"
echo I am in $(pwd)
echo I will clone repository "$repository", ref "$ref"
git clone "$repository"

r=$(ls -1)

cd "$r"

echo I am in $(pwd)

git checkout "$ref"

ls
echo working dir: $PWD
# ls backend

# exit 0

tar cvf - $(cat "$PWD/file-list" | grep -v '^#') | (cd "$here" && tar xf -)

echo "$ref" > .diachron-version

echo "Now, run the command ./sync.sh"
6  check.sh
@@ -10,7 +10,7 @@ cd "$DIR"
 #
 exclusions="SC2002"
 
-source "$DIR/framework/versions"
+source "$DIR/diachron/versions"
 
 if [[ $# -ne 0 ]]; then
   shellcheck --exclude="$exclusions" "$@"
@@ -20,10 +20,10 @@ fi
 shell_scripts="$(fd .sh | xargs)"
 
 # The files we need to check all either end in .sh or else they're the files
-# in framework/cmd.d and framework/shims. -x instructs shellcheck to also
+# in diachron/cmd.d and diachron/shims. -x instructs shellcheck to also
 # check `source`d files.
 
-shellcheck -x --exclude="$exclusions" "$DIR/cmd" "$DIR"/framework/cmd.d/* "$DIR"/framework/shims/* "$shell_scripts"
+shellcheck -x --exclude="$exclusions" "$DIR/cmd" "$DIR"/diachron/cmd.d/* "$DIR"/diachron/shims/* "$shell_scripts"
 
 pushd "$DIR/master"
 docker run --rm -v $(pwd):/app -w /app golangci/golangci-lint:$golangci_lint golangci-lint run
4  cmd
@@ -13,7 +13,7 @@ if [ $# -lt 1 ]; then
   echo "Usage: ./cmd <command> [args...]"
   echo ""
   echo "Available commands:"
-  for cmd in "$DIR"/framework/cmd.d/*; do
+  for cmd in "$DIR"/diachron/cmd.d/*; do
     if [ -x "$cmd" ]; then
       basename "$cmd"
     fi
@@ -24,4 +24,4 @@ fi
 subcmd="$1"
 shift
 
-exec "$DIR"/framework/cmd.d/"$subcmd" "$@"
+exec "$DIR"/diachron/cmd.d/"$subcmd" "$@"
4  develop
@@ -13,7 +13,7 @@ if [ $# -lt 1 ]; then
   echo "Usage: ./develop <command> [args...]"
   echo ""
   echo "Available commands:"
-  for cmd in "$DIR"/framework/develop.d/*; do
+  for cmd in "$DIR"/diachron/develop.d/*; do
     if [ -x "$cmd" ]; then
       basename "$cmd"
     fi
@@ -24,4 +24,4 @@ fi
 subcmd="$1"
 shift
 
-exec "$DIR"/framework/develop.d/"$subcmd" "$@"
+exec "$DIR"/diachron/develop.d/"$subcmd" "$@"
0  diachron/.nodejs/.gitignore  vendored  Normal file

191  diachron/AGENTS.md  Normal file
@@ -0,0 +1,191 @@
# Working with diachron (Agent Guide)

This document helps AI coding agents work effectively with projects built on
the diachron framework. It covers the conventions, commands, and structures
you need to know.

## Quick orientation

diachron is a TypeScript/Express web framework with a Go master process.
Your application code lives in `backend/`. The framework owns `master/`,
`logger/`, `diachron/`, and `backend/diachron/`.

**Do not modify framework-owned files** unless explicitly asked to work on
the framework itself.

## Running the application

```bash
./sync.sh   # Install dependencies (run once, or after pulling changes)
./master    # Start the application (watches files, rebuilds, proxies)
```

By default, the app listens on port 8080 (proxy) and workers run on
ports starting at 3000.

## Commands you'll need

All tools (node, pnpm, tsx, etc.) are managed by the framework. Do not
invoke system-installed versions.

```bash
./cmd pnpm install        # Install npm packages
./cmd pnpm test           # Run tests
./cmd pnpm biome check .  # Lint (run from backend/)
./develop db              # Open database shell
./develop reset-db        # Drop and recreate database
./develop migrate         # Run migrations (development)
./mgmt migrate            # Run migrations (production-safe)
```

### Formatting and linting

```bash
cd backend && ../cmd pnpm biome check --write .
```

### Building Go code

```bash
cd master && go build
cd logger && go build
```

### Quality checks

```bash
./check.sh   # shellcheck + golangci-lint
```

## Application structure

### Where to put code

| What                     | Where                       |
|--------------------------|-----------------------------|
| Application entry point  | `backend/app.ts`            |
| Route definitions        | `backend/routes.ts`         |
| Route handlers           | `backend/handlers.ts`       |
| Service layer            | `backend/services.ts`       |
| Application types        | `backend/types.ts`          |
| Application config       | `backend/config.ts`         |
| Database migrations      | `backend/migrations/`       |
| Framework library code   | `backend/diachron/`         |

### Types and naming

- HTTP request wrapper: `Call` (not Request)
- HTTP response wrapper: `Result` (not Response)
- Route definitions: arrays of `Route` objects
- Handlers: functions that take a `Call` and return a `Result`

These names are intentional. Use them consistently.

Import framework types from `./diachron/types`:

```typescript
import type { Call, Result, Route, Handler } from "./diachron/types";
```

Application-specific domain types go in `backend/types.ts`.

### Services

Application services go in `backend/services.ts`. Framework services are
accessed through the `core` object:

```typescript
import { core } from "./diachron/core";

core.database.db  // Kysely database instance
core.logging.log  // Logging
core.misc.sleep   // Utilities
```

### Exports

When a TypeScript file exports symbols, they should be listed in
alphabetical order.

## Database

diachron uses PostgreSQL exclusively, accessed through Kysely (type-safe
query builder). There is no ORM.

- Write SQL via Kysely, not raw query strings (unless Kysely can't express
  the query)
- Migrations live in `backend/migrations/`
- Run `./develop codegen` after schema changes to regenerate Kysely types

## Key conventions

### No dev/prod distinction

There is no `NODE_ENV`. The application behaves identically everywhere.
Do not introduce environment-based branching.

### Managed tooling

Never reference globally installed `node`, `npm`, `npx`, or `pnpm`.
Always use `./cmd node`, `./cmd pnpm`, etc.

### File ownership boundary

```
You may edit:  backend/* (except backend/diachron/)
Do not edit:   master/*, logger/*, diachron/*, backend/diachron/*
```

If a task requires framework changes, confirm with the user first.

### Command safety tiers

- `./cmd` -- Tool wrappers, always safe
- `./mgmt` -- Production-safe operations (migrations, user management)
- `./develop` -- Destructive operations, development only

Never use `./develop` commands against production data.

## Common tasks

### Add a new route

1. Define the route in `backend/routes.ts` as a `Route` object
2. Implement the handler in `backend/handlers.ts`
3. Add any needed types to `backend/types.ts`
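The steps above can be sketched end to end. Everything named here (`/greet`, `greetHandler`, the response text) is illustrative, and the type declarations are minimal stand-ins so the sketch is self-contained; in a real project you would import `Call`, `Result`, `Route`, and `Handler` from `./diachron/types` and use `httpCodes` / `contentTypes` instead of literals.

```typescript
// Minimal stand-ins for the framework types; real code imports these
// from "./diachron/types" instead of declaring them.
type Result = { code: number; contentType: string; result: string };
type Call = { method: string; path: string; parameters: Record<string, string> };
type Handler = (call: Call) => Promise<Result>;
type Route = { method: string; path: string; handler: Handler };

// Step 2 (backend/handlers.ts): a handler takes a Call and returns a Result.
const greetHandler: Handler = async (call: Call): Promise<Result> => ({
  code: 200, // httpCodes.success.OK in the real framework
  contentType: "text/plain", // contentTypes.text.plain in the real framework
  result: `hello from ${call.path}`,
});

// Step 1 (backend/routes.ts): a Route object ties a method and path to it.
const greetRoute: Route = {
  method: "GET",
  path: "/greet",
  handler: greetHandler,
};

export { greetHandler, greetRoute };
```

The route object would then be appended to the `routes` array in `backend/routes.ts`.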
### Add a database migration

1. Create a migration file in `backend/migrations/`
2. Run `./develop migrate` to apply it
3. Run `./develop codegen` to regenerate Kysely types
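As a sketch of step 1: migrations here are plain SQL files. The filename and table below are hypothetical, following the `YYYY-MM-DD_NN.sql` naming used in `backend/migrations/`.

```sql
-- backend/migrations/2026-02-01_01.sql (hypothetical name and table)
CREATE TABLE posts (
    id text PRIMARY KEY,
    title text NOT NULL,
    created_at timestamptz NOT NULL DEFAULT now()
);
```

After applying it, step 3 regenerates `backend/generated/db.d.ts` so Kysely's types pick up the new table.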
### Install a package

```bash
cd backend && ../cmd pnpm add <package>
```

### Run a one-off TypeScript script

```bash
./develop tsx backend/path/to/script.ts
```

## file-list

The root `file-list` file is a manifest of all files that ship with the
framework. When you create or delete a file that is part of the project
(not a scratch file or generated output), you must update `file-list` to
match. Keep it sorted alphabetically.

## Things to avoid

- Do not introduce `.env` files or `dotenv` without checking with the
  team first. The configuration story is still being decided.
- Do not introduce webpack, vite, or other bundlers. The master process
  handles building via `@vercel/ncc`.
- Do not add express middleware directly. Use the framework's route
  processing in `backend/diachron/routing.ts`.
- Do not use `npm` or globally installed `pnpm`. Use `./cmd pnpm`.
- Do not add `NODE_ENV` checks or development/production branches.
0  diachron/binaries/.gitignore  vendored  Normal file
Some files were not shown because too many files have changed in this diff.