feat(regime)

This commit is contained in:
Dan Finch 2026-04-29 00:55:53 +02:00
commit 2be7ec6836
62 changed files with 1817 additions and 0 deletions

3
.gitignore vendored Normal file

@@ -0,0 +1,3 @@
node_modules
*.tsbuildinfo
dist

184
README.md Normal file

@@ -0,0 +1,184 @@
# REGIME
> Tooling and unified configuration for managing a bunch of repositories and packages.
## Stack & Standards
- Bun
- TypeScript 7
- Oxlint & Oxfmt
- Commitlint
- Conventional Commits
- Semantic Release
- Forgejo Actions
- [Gum](https://github.com/charmbracelet/gum)
## `regime` CLI
```
regime <check|sync|promote|templates> [path] [--yes] [--full]
```
If `[path]` is omitted, the current working directory is used. Regime discovers all `regime.config.json` files recursively under the target path (skipping `node_modules` and `.git`).
### Commands
#### `regime check [path] [--full]`
Compares managed files against their templates and reports differences. For each `regime.config.json` found:
- **overwrite** files: reports `missing` or `differs`
- **merge json / merge jsonc** files: reports field-level diffs -- missing fields, wrong values, and what the expected value should be
By default only problems are shown. Pass `--full` to also print fields/files that are already in sync.
#### `regime sync [path]`
Writes template-managed files into each project. For each `regime.config.json` found:
- **overwrite** files: created or replaced with the template content (with variable interpolation)
- **merge json / merge jsonc** files: deep-merged so that template-required fields are present while preserving any extra fields the project has added
Files and directories are created if they don't exist. Files already in sync are skipped silently.
#### `regime promote [path] [--yes]`
Interactively promotes a local file change back into a template. Only applies to **overwrite**-strategy files that differ from their template.
1. Presents a filterable list of changed files (via `gum filter`)
2. If the template chain has multiple templates, asks which template to write to (via `gum choose`)
3. De-interpolates variable values back into `<<varname>>` placeholders
4. Shows a diff of the proposed change
5. Asks for confirmation (skip with `--yes`)
6. Writes the updated file into the template directory
Requires [gum](https://github.com/charmbracelet/gum) to be installed.
#### `regime templates [--full]`
Lists all available templates as a tree showing inheritance relationships. Pass `--full` to also list the files each template provides.
## Templates
### Opting In
A project places a `regime.config.json` in its root (or in each workspace package):
```json
{
  "templates": ["profile/library", "workflow/mirror", "adapts-to/bun"],
  "vars": {
    "repo": "my-project"
  }
}
```
- `templates` -- a string or array of template names (paths relative to `templates/`)
- `vars` -- key-value pairs for `<<key>>` interpolation in template file contents and filenames
### Structure
Each template is a directory under `templates/` containing:
- `.regime-template.json` -- metadata (inheritance and file strategies)
- Any other files -- the template content synced into target repos
A `.regime-template.json` looks like:
```json
{
  "inherits": ["shared/package"],
  "patterns": {
    "package.json": "merge json",
    "tsconfig.*.json": "merge json"
  }
}
```
- `inherits` -- parent templates resolved first (depth-first; parents before children)
- `patterns` -- maps file paths or globs to a file strategy
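Strategy lookup can be sketched roughly like this (an illustrative stand-in for the repo's `getStrategy`, which uses `Bun.Glob`; the tiny glob below only handles `*` within a path segment):

```typescript
// Exact pattern keys win first; then `*` globs; anything unmatched
// falls back to the default "overwrite" strategy.
function strategyFor(path: string, patterns: Record<string, string>): string {
  if (patterns[path]) return patterns[path]
  for (const [pattern, strategy] of Object.entries(patterns)) {
    if (!pattern.includes("*")) continue
    // Escape regex metacharacters, then turn `*` into "any chars except /"
    const re = new RegExp(
      "^" + pattern.replace(/[.+?^${}()|[\]\\]/g, "\\$&").replace(/\*/g, "[^/]*") + "$",
    )
    if (re.test(path)) return strategy
  }
  return "overwrite"
}

const patterns = { "package.json": "merge json", "tsconfig.*.json": "merge json" }
```

So `strategyFor("tsconfig.src.json", patterns)` resolves via the glob, while an unlisted file like `LICENSE` defaults to `overwrite`.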
### File Strategies
| Strategy | Behavior |
|----------|----------|
| `overwrite` (default) | Template file replaces the target file entirely. |
| `merge json` | Deep-merged into existing JSON. Template values win for shared keys; target-only keys are preserved; arrays are unioned (template items first, then unique target items). |
| `merge jsonc` | Like `merge json` but parses/writes JSONC (JSON with comments and trailing commas). |
When multiple templates in a chain provide the same file:
- **overwrite**: the last template in the chain wins
- **merge json / merge jsonc**: all template versions are merged in chain order, then merged into the target
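The merge rule described above can be sketched as follows (assumed semantics mirroring the table, not the actual `deepMerge` from `src/shared.ts`):

```typescript
// Template values win for shared keys, target-only keys survive,
// arrays are unioned with template items first.
function merge(target: unknown, template: unknown): unknown {
  if (Array.isArray(target) && Array.isArray(template)) {
    const out: unknown[] = [...template]
    for (const item of target) {
      // Crude structural-equality check for the sketch
      if (!out.some(o => JSON.stringify(o) === JSON.stringify(item))) out.push(item)
    }
    return out
  }
  if (typeof target !== "object" || target === null) return template
  if (typeof template !== "object" || template === null) return template
  const out: Record<string, unknown> = { ...(target as Record<string, unknown>) }
  for (const [key, value] of Object.entries(template)) {
    out[key] = key in out ? merge(out[key], value) : value
  }
  return out
}

const merged = merge(
  { name: "route", scripts: { test: "custom" }, keywords: ["local"] },
  { license: "MIT", scripts: { test: "bun test" }, keywords: ["regime"] },
) as any
// merged keeps "name", takes the template's "scripts.test" and "license",
// and unions keywords as ["regime", "local"]
```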
### Variable Interpolation
Template files can contain `<<varname>>` placeholders in both their **content** and **filenames** (including directory components). These are replaced with values from the project's `vars` during `check` and `sync`.
For example, a template file named `<<repo>>.code-workspace` with `vars: { "repo": "route" }` produces `route.code-workspace` in the target.
The `promote` command reverses this (de-interpolation), replacing concrete values back into `<<varname>>` placeholders before writing to the template. Undeclared variables emit a warning and remain as-is.
### Indentation
When updating existing JSON/JSONC files, regime detects and preserves the file's existing indentation style. New files default to 2-space indent (JSON) or tab indent (JSONC).
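A minimal sketch of the detection idea (assumption: the first indented line determines the style, with the README's 2-space default as fallback):

```typescript
// Capture the leading whitespace of the first indented, non-blank line.
function indentOf(content: string, fallback = "  "): string {
  const match = content.match(/^([ \t]+)\S/m)
  return match?.[1] ?? fallback
}

const fourSpace = indentOf('{\n    "a": 1\n}') // "    "
const tabbed = indentOf('{\n\t"a": 1\n}')      // "\t"
const fresh = indentOf("{}")                    // falls back to "  "
```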
### Template Categories
```
templates/
  shared/      foundational building blocks
  include/     files included by inheritance only (not used directly)
  profile/     complete project profiles (composed from shared + include)
  adapts-to/   runtime/platform adapters
  tool/        dev tooling (commitlint, husky, oxc)
  workflow/    Forgejo CI workflows
```
### Available Templates
| Template | Inherits | Description |
|----------|----------|-------------|
| `shared/repo` | `include/license` | Pulls in LICENSE. |
| `shared/package` | -- | Base `package.json` (license, author, repository with `<<repo>>`). Merges `package.json`, `tsconfig.json`, `tsconfig.*.json`. |
| `shared/library` | `shared/package` | Full TypeScript library setup (tsconfig variants for src, test, config). |
| `include/license` | -- | MIT license file. |
| `profile/library` | `shared/repo`, `shared/library` | Standalone library repo with license + full TS setup. |
| `profile/monorepo/root` | `shared/repo`, `shared/package` | Monorepo root with workspaces catalog and workspace-wide scripts. |
| `profile/monorepo/library` | `shared/library` | Workspace package inside a monorepo (no license or repo-level files). |
| `profile/workspace` | -- | VS Code `.code-workspace` file (merges JSONC). Uses `<<repo>>` in filename. |
| `adapts-to/bun` | -- | Adds `@types/bun`, bun type references, `tsconfig.bun.json`. |
| `adapts-to/cloudflare` | -- | Adds `@cloudflare/workers-types`, cloudflare type references, `tsconfig.cloudflare.json`. |
| `tool/commitlint` | -- | Commitlint config, deps, and husky hook. |
| `tool/husky` | -- | Husky dep and prepare script. |
| `tool/oxc` | -- | Oxlint + oxfmt configs, deps, and lint script. |
| `workflow/checks` | -- | Forgejo CI workflow for lint/check/test. |
| `workflow/mirror` | -- | Forgejo workflow for mirroring to GitHub. |
| `workflow/publish-npm` | -- | Forgejo release workflow with semantic-release config. |
### Usage Examples
Standalone library:
```json
{
  "templates": ["profile/library", "profile/workspace", "tool/oxc", "tool/commitlint", "tool/husky", "workflow/mirror", "workflow/publish-npm", "adapts-to/bun"],
  "vars": { "repo": "route" }
}
```
Monorepo root:
```json
{
  "templates": ["profile/monorepo/root", "profile/workspace", "tool/oxc", "tool/commitlint", "tool/husky", "workflow/mirror", "workflow/publish-npm"],
  "vars": { "repo": "toolkit" }
}
```
Monorepo workspace package:
```json
{
  "templates": ["profile/monorepo/library"],
  "vars": { "repo": "toolkit" }
}
```

29
actions/checks/action.yml Normal file

@@ -0,0 +1,29 @@
name: Checks
description: Run lint, type check, and tests
runs:
  using: composite
  steps:
    - name: Setup Bun
      shell: bash
      run: curl -fsSL https://bun.sh/install | bash && echo "$HOME/.bun/bin" >> "$GITHUB_PATH"
    - name: Resolve external workspaces
      shell: bash
      run: bun "$GITHUB_ACTION_PATH/../../scripts/resolve-workspaces.ts"
    - name: Install dependencies
      shell: bash
      run: bun install --no-save-lockfile
    - name: Lint
      shell: bash
      run: bun run lint
    - name: Type check
      shell: bash
      run: bun run check
    - name: Test
      shell: bash
      run: bun run test

55
actions/mirror/action.yml Normal file

@@ -0,0 +1,55 @@
name: Mirror to GitHub
description: Mirror the current repo to GitHub.
inputs:
  target:
    description: "GitHub repo (e.g. owner/repo)"
    required: true
  source:
    description: "Authenticated clone URL for the source repo"
    required: true
  token:
    description: "GitHub personal access token with push access"
    required: true
runs:
  using: composite
  steps:
    - name: Install git-filter-repo
      shell: bash
      run: pip install --break-system-packages git-filter-repo
    - name: Clone mirror
      shell: bash
      run: git clone --bare "$GITHUB_WORKSPACE" /tmp/mirror-repo
    - name: Filter ignored paths
      shell: bash
      run: |
        MIRRORIGNORE="$GITHUB_WORKSPACE/.mirrorignore"
        if [ ! -f "$MIRRORIGNORE" ]; then
          echo "No .mirrorignore found, skipping filter"
          exit 0
        fi
        ARGS=""
        while IFS= read -r line || [ -n "$line" ]; do
          line="$(echo "$line" | sed 's/^[[:space:]]*//;s/[[:space:]]*$//')"
          [ -z "$line" ] && continue
          [[ "$line" == \#* ]] && continue
          ARGS="$ARGS --path $line"
        done < "$MIRRORIGNORE"
        # Always filter .mirrorignore itself
        ARGS="$ARGS --path .mirrorignore"
        if [ -n "$ARGS" ]; then
          cd /tmp/mirror-repo
          git filter-repo $ARGS --invert-paths --force
        fi
    - name: Push mirror
      shell: bash
      run: |
        cd /tmp/mirror-repo
        git push --mirror "https://${{ inputs.token }}@github.com/${{ inputs.target }}.git"


@@ -0,0 +1,48 @@
name: Publish to NPM
description: Run semantic-release (multi for monorepos, standard for solo packages)
inputs:
  gitea-token:
    description: "Forgejo API token with push + API access"
    required: true
  gitea-url:
    description: "Forgejo instance URL"
    required: true
  npm-token:
    description: "npm registry auth token"
    required: true
runs:
  using: composite
  steps:
    - name: Setup Bun
      shell: bash
      run: curl -fsSL https://bun.sh/install | bash && echo "$HOME/.bun/bin" >> "$GITHUB_PATH"
    - name: Resolve external workspaces
      shell: bash
      run: bun "$GITHUB_ACTION_PATH/../../scripts/resolve-workspaces.ts"
    - name: Install dependencies
      shell: bash
      run: bun install --no-save-lockfile
    - name: Configure npm auth
      shell: bash
      env:
        NPM_TOKEN: ${{ inputs.npm-token }}
      run: echo "//registry.npmjs.org/:_authToken=${NPM_TOKEN}" >> ~/.npmrc
    - name: Run multi-semantic-release
      shell: bash
      env:
        GITEA_TOKEN: ${{ inputs.gitea-token }}
        GITEA_URL: ${{ inputs.gitea-url }}
        NPM_TOKEN: ${{ inputs.npm-token }}
      run: |
        # TODO: use bun
        if node -e "const p=require('./package.json'); process.exit(p.workspaces ? 0 : 1)"; then
          bunx multi-semantic-release
        else
          bunx semantic-release
        fi

30
bin/regime Executable file

@@ -0,0 +1,30 @@
#!/usr/bin/env bun
import { resolve } from "path"
import { check } from "../src/check"
import { sync } from "../src/sync"
import { promote } from "../src/promote"
import { templates } from "../src/templates"

const [command, ...rawArgs] = process.argv.slice(2)
const hasYes = rawArgs.includes("--yes")
const hasFull = rawArgs.includes("--full")
const args = rawArgs.filter(a => a !== "--yes" && a !== "--full")
const targetDir = resolve(args[0] ?? process.cwd())

switch (command) {
  case "check":
    await check(targetDir, hasFull)
    break
  case "sync":
    await sync(targetDir)
    break
  case "promote":
    await promote(targetDir, hasYes)
    break
  case "templates":
    templates(hasFull)
    break
  default:
    console.error("Usage: regime <check|sync|promote|templates> [path] [--yes] [--full]")
    process.exit(1)
}

25
bun.lock Normal file

@@ -0,0 +1,25 @@
{
  "lockfileVersion": 1,
  "configVersion": 1,
  "workspaces": {
    "": {
      "dependencies": {
        "jsonc-parser": "^3.3.1",
      },
      "devDependencies": {
        "@types/bun": "^1.3.13",
      },
    },
  },
  "packages": {
    "@types/bun": ["@types/bun@1.3.13", "", { "dependencies": { "bun-types": "1.3.13" } }, "sha512-9fqXWk5YIHGGnUau9TEi+qdlTYDAnOj+xLCmSTwXfAIqXr2x4tytJb43E9uCvt09zJURKXwAtkoH4nLQfzeTXw=="],
    "@types/node": ["@types/node@25.6.0", "", { "dependencies": { "undici-types": "~7.19.0" } }, "sha512-+qIYRKdNYJwY3vRCZMdJbPLJAtGjQBudzZzdzwQYkEPQd+PJGixUL5QfvCLDaULoLv+RhT3LDkwEfKaAkgSmNQ=="],
    "bun-types": ["bun-types@1.3.13", "", { "dependencies": { "@types/node": "*" } }, "sha512-QXKeHLlOLqQX9LgYaHJfzdBaV21T63HhFJnvuRCcjZiaUDpbs5ED1MgxbMra71CsryN/1dAoXuJJJwIv/2drVA=="],
    "jsonc-parser": ["jsonc-parser@3.3.1", "", {}, "sha512-HUgH65KyejrUFPvHFPbqOY0rsFip3Bo5wb4ngvdi1EpCYWUQDC5V+Y7mZws+DLkr4M//zQJoanu1SP+87Dv1oQ=="],
    "undici-types": ["undici-types@7.19.2", "", {}, "sha512-qYVnV5OEm2AW8cJMCpdV20CDyaN3g0AjDlOGf1OW4iaDEx8MwdtChUp4zu4H0VP3nDRF/8RKWH+IPp9uW0YGZg=="],
  }
}

8
package.json Normal file

@@ -0,0 +1,8 @@
{
  "devDependencies": {
    "@types/bun": "^1.3.13"
  },
  "dependencies": {
    "jsonc-parser": "^3.3.1"
  }
}

65
scripts/resolve-workspaces.ts Normal file

@@ -0,0 +1,65 @@
import { readFileSync, writeFileSync, existsSync, readdirSync } from "fs"
import { join } from "path"
import { execSync } from "child_process"

const pkg = JSON.parse(readFileSync("package.json", "utf8"))
const ws: string[] = pkg.workspaces?.packages || pkg.workspaces || []
const hasWorkspaces = Array.isArray(ws)

// remove external workspace entries if present
if (hasWorkspaces) {
  const ext = ws.filter((p) => p.startsWith(".."))
  if (ext.length > 0) {
    const kept = ws.filter((p) => !p.startsWith(".."))
    // workspaces may be a bare array or an object with a packages field
    if (Array.isArray(pkg.workspaces)) pkg.workspaces = kept
    else pkg.workspaces.packages = kept
    writeFileSync("package.json", JSON.stringify(pkg, null, 2) + "\n")
  }
}

// collect local workspace package names
const localNames = new Set<string>()
if (hasWorkspaces) {
  for (const pattern of ws.filter((p) => !p.startsWith(".."))) {
    const base = pattern.replace(/\/\*$/, "")
    if (!existsSync(base)) continue
    for (const dir of readdirSync(base, { withFileTypes: true })) {
      if (!dir.isDirectory()) continue
      const p = join(base, dir.name, "package.json")
      if (existsSync(p)) localNames.add(JSON.parse(readFileSync(p, "utf8")).name)
    }
  }
}

// find all package.json files
const files = execSync('find . -name package.json -not -path "*/node_modules/*"')
  .toString()
  .trim()
  .split("\n")
const resolved: string[] = []
for (const file of files) {
  const content = readFileSync(file, "utf8")
  let changed = content
  // replace workspace:* deps that aren't local
  const wsRe = /"([^"]+)": "workspace:\*"/g
  let m: RegExpExecArray | null
  while ((m = wsRe.exec(content)) !== null) {
    if (!localNames.has(m[1])) {
      changed = changed.replace(m[0], `"${m[1]}": "*"`)
      resolved.push(m[1])
    }
  }
  // replace ../path deps
  const pathRe = /"([^"]+)": "\.\.\/[^"]+"/g
  while ((m = pathRe.exec(content)) !== null) {
    changed = changed.replace(m[0], `"${m[1]}": "*"`)
    resolved.push(m[1])
  }
  if (changed !== content) writeFileSync(file, changed)
}
if (resolved.length) {
  console.log("Resolved external deps:", [...new Set(resolved)].join(", "))
}

128
src/check.ts Normal file

@@ -0,0 +1,128 @@
import { dirname, relative, join } from "node:path"
import { existsSync } from "node:fs"
import {
  type RegimeConfig,
  findRegimeConfigs,
  resolveTemplateChain,
  getStrategy,
  interpolate,
  readFileSync,
  diffJson,
  mergeTemplateJsonFiles,
  mergeTemplateJsoncFiles,
  parseJsonc,
} from "./shared"

const red = Bun.color("red", "ansi")
const orange = Bun.color("orange", "ansi")
const green = Bun.color("green", "ansi")
const purple = Bun.color("purple", "ansi")
const reset = "\x1b[0m"

export async function check(targetDir: string, full = false): Promise<void> {
  const rcFiles = await findRegimeConfigs(targetDir)
  if (rcFiles.length === 0) {
    console.log("No regime.config.json files found.")
    return
  }
  for (const rcFile of rcFiles) {
    const rcDir = dirname(rcFile)
    const relDir = relative(targetDir, rcDir) || "."
    console.log(`\n${purple}${relDir}/${reset}`)
    const rc: RegimeConfig = JSON.parse(readFileSync(rcFile))
    const templateNames = Array.isArray(rc.templates) ? rc.templates : [rc.templates]
    const vars = rc.vars ?? {}
    const { files, patterns } = resolveTemplateChain(templateNames)
    if (files.size === 0) {
      console.log(" (no template files)")
      continue
    }
    let synced = true
    for (const [relPath, templatePaths] of files) {
      const targetRelPath = interpolate(relPath, vars)
      const targetPath = join(rcDir, targetRelPath)
      const strategy = getStrategy(targetRelPath, patterns)
      if (!existsSync(targetPath)) {
        console.log(` ${targetRelPath}: ${red}missing${reset}`)
        synced = false
        continue
      }
      const existingContent = readFileSync(targetPath)
      if (strategy === "merge json") {
        try {
          const templateObj = mergeTemplateJsonFiles(templatePaths, vars, targetRelPath)
          const existingObj = JSON.parse(existingContent)
          const entries = diffJson(templateObj, existingObj, full)
          const diffs = entries.filter(e => !e.ok)
          if (diffs.length > 0 || (full && entries.length > 0)) {
            if (diffs.length > 0) synced = false
            console.log(` ${targetRelPath}:`)
            for (const d of entries) {
              if (d.ok) {
                console.log(` ${d.field}: ${green}ok${reset}`)
              } else {
                const exp = JSON.stringify(d.expected)
                const act = d.actual === undefined ? `${red}missing${reset}` : `${orange}${JSON.stringify(d.actual)}${reset}`
                console.log(` ${d.field}: ${act} -> ${green}${exp}${reset}`)
              }
            }
          } else if (full) {
            console.log(` ${targetRelPath}: ${green}ok${reset}`)
          }
        } catch (e) {
          console.log(` ${targetRelPath}: ${red}failed to parse JSON${reset} - ${e}`)
          synced = false
        }
      } else if (strategy === "merge jsonc") {
        try {
          const templateObj = mergeTemplateJsoncFiles(templatePaths, vars, targetRelPath)
          const existingObj = parseJsonc(existingContent)
          const entries = diffJson(templateObj, existingObj, full)
          const diffs = entries.filter(e => !e.ok)
          if (diffs.length > 0 || (full && entries.length > 0)) {
            if (diffs.length > 0) synced = false
            console.log(` ${targetRelPath}:`)
            for (const d of entries) {
              if (d.ok) {
                console.log(` ${d.field}: ${green}ok${reset}`)
              } else {
                const exp = JSON.stringify(d.expected)
                const act = d.actual === undefined ? `${red}missing${reset}` : `${orange}${JSON.stringify(d.actual)}${reset}`
                console.log(` ${d.field}: ${act} -> ${green}${exp}${reset}`)
              }
            }
          } else if (full) {
            console.log(` ${targetRelPath}: ${green}ok${reset}`)
          }
        } catch (e) {
          console.log(` ${targetRelPath}: ${red}failed to parse JSONC${reset} - ${e}`)
          synced = false
        }
      } else if (strategy === "overwrite") {
        const templateContent = interpolate(readFileSync(templatePaths[templatePaths.length - 1]), vars, targetRelPath)
        if (existingContent !== templateContent) {
          console.log(` ${targetRelPath}: ${orange}differs${reset}`)
          synced = false
        } else if (full) {
          console.log(` ${targetRelPath}: ${green}ok${reset}`)
        }
      }
    }
    if (synced) {
      console.log(` ${green}in sync${reset}`)
    }
  }
}

170
src/promote.ts Normal file

@@ -0,0 +1,170 @@
import { dirname, join } from "node:path"
import { existsSync } from "node:fs"
import { writeFile } from "node:fs/promises"
import { $ } from "bun"
import {
  type RegimeConfig,
  findRegimeConfigs,
  resolveTemplateChain,
  getStrategy,
  interpolate,
  deinterpolate,
  readFileSync,
  templatesDir,
} from "./shared"

const green = Bun.color("green", "ansi")
const orange = Bun.color("orange", "ansi")
const purple = Bun.color("purple", "ansi")
const reset = "\x1b[0m"

async function gumStdin(command: string, input: string): Promise<string> {
  const proc = Bun.spawn(["gum", command], {
    stdin: Buffer.from(input),
    stdout: "pipe",
    stderr: "inherit",
  })
  const output = await new Response(proc.stdout).text()
  await proc.exited
  return output
}

interface PromotableFile {
  relPath: string // interpolated relative path (e.g. "tsconfig.json")
  templateRelPath: string // raw template path (may contain <<var>> placeholders)
  repoPath: string // absolute path in the repo
  configDir: string // directory containing regime.config.json
  vars: Record<string, string>
  templateNames: string[] // all templates in the inheritance tree
}

export async function promote(targetDir: string, yes = false): Promise<void> {
  // Step 1: Find regime configs
  const rcFiles = await findRegimeConfigs(targetDir)
  if (rcFiles.length === 0) {
    console.error("No regime.config.json files found.")
    process.exit(1)
  }
  // Step 2: Collect promotable files (overwrite-strategy files that differ)
  const promotable: PromotableFile[] = []
  for (const rcFile of rcFiles) {
    const rcDir = dirname(rcFile)
    const rc: RegimeConfig = JSON.parse(readFileSync(rcFile))
    const templateNames = Array.isArray(rc.templates) ? rc.templates : [rc.templates]
    const vars = rc.vars ?? {}
    const { files, patterns, templateNames: chainNames } = resolveTemplateChain(templateNames)
    for (const [relPath, templatePaths] of files) {
      const targetRelPath = interpolate(relPath, vars)
      const strategy = getStrategy(targetRelPath, patterns)
      if (strategy !== "overwrite") continue
      const repoPath = join(rcDir, targetRelPath)
      if (!existsSync(repoPath)) continue
      const repoContent = readFileSync(repoPath)
      const templateContent = interpolate(
        readFileSync(templatePaths[templatePaths.length - 1]),
        vars,
        targetRelPath,
      )
      if (repoContent !== templateContent) {
        promotable.push({
          relPath: targetRelPath,
          templateRelPath: relPath,
          repoPath,
          configDir: rcDir,
          vars,
          templateNames: chainNames,
        })
      }
    }
  }
  if (promotable.length === 0) {
    console.log("Nothing to promote.")
    return
  }
  // Step 3: Interactive file selection via gum filter
  const fileList = promotable.map(p => p.relPath).join("\n")
  const selectedFile = (await gumStdin("filter", fileList)).trim()
  if (!selectedFile) {
    console.log("No file selected.")
    return
  }
  const entry = promotable.find(p => p.relPath === selectedFile)
  if (!entry) {
    console.error(`File "${selectedFile}" not found in promotable list.`)
    process.exit(1)
  }
  // Step 4: Interactive template selection via gum choose
  let selectedTemplate: string
  if (entry.templateNames.length === 1) {
    selectedTemplate = entry.templateNames[0]
  } else {
    const templateList = entry.templateNames.join("\n")
    selectedTemplate = (await gumStdin("choose", templateList)).trim()
  }
  if (!selectedTemplate) {
    console.log("No template selected.")
    return
  }
  // Step 5: De-interpolate
  const repoContent = readFileSync(entry.repoPath)
  const promoted = deinterpolate(repoContent, entry.vars)
  // Step 6: Show diff
  const templateFilePath = join(templatesDir, selectedTemplate, entry.templateRelPath)
  console.log(`\n${purple}Promoting${reset} ${entry.relPath} -> ${green}${selectedTemplate}${reset}`)
  if (existsSync(templateFilePath)) {
    const tmpFile = `/tmp/regime-promote-${Date.now()}`
    await writeFile(tmpFile, promoted)
    try {
      const diff = await $`diff -u ${templateFilePath} ${tmpFile} || true`.text()
      if (diff.trim()) {
        console.log(diff)
      } else {
        console.log("No changes detected after de-interpolation.")
        return
      }
    } finally {
      await $`rm -f ${tmpFile}`.quiet()
    }
  } else {
    console.log(`${orange}New file${reset} — will be created in template "${selectedTemplate}"`)
    console.log(promoted)
  }
  // Step 7: Confirm
  if (!yes) {
    const proc = Bun.spawn(["gum", "confirm", "Write to template?"], {
      stdin: "inherit",
      stdout: "inherit",
      stderr: "inherit",
    })
    if (await proc.exited !== 0) {
      console.log("Cancelled.")
      return
    }
  }
  // Step 8: Write
  const targetFileDir = dirname(templateFilePath)
  if (!existsSync(targetFileDir)) {
    await $`mkdir -p ${targetFileDir}`.quiet()
  }
  await writeFile(templateFilePath, promoted)
  console.log(`${green}Written${reset} ${templateFilePath}`)
}

356
src/shared.ts Normal file

@@ -0,0 +1,356 @@
import { resolve, join, dirname } from "node:path"
import { existsSync } from "node:fs"
import { readdir } from "node:fs/promises"
import { Glob } from "bun"
import { parse as parseJsonc } from "jsonc-parser"

export { parseJsonc }

// --- Types ---
export interface RegimeConfig {
  templates: string | string[]
  vars?: Record<string, string>
}

export interface TemplateConfig {
  inherits?: string[]
  patterns?: Record<string, string>
}

export interface CollectedTemplate {
  files: Map<string, string[]> // relative path -> absolute paths (in chain order)
  patterns: Record<string, string>
  templateNames: string[] // ordered list of visited template names
}

// --- Constants ---
export const regimeDir = resolve(dirname(import.meta.dir))
export const templatesDir = join(regimeDir, "templates")

// --- Utilities ---
export function deepEqual(a: any, b: any): boolean {
  if (a === b) return true
  if (typeof a !== typeof b) return false
  if (a === null || b === null) return a === b
  if (Array.isArray(a) && Array.isArray(b)) {
    if (a.length !== b.length) return false
    return a.every((item: any, i: number) => deepEqual(item, b[i]))
  }
  if (typeof a === "object") {
    const keysA = Object.keys(a)
    const keysB = Object.keys(b)
    if (keysA.length !== keysB.length) return false
    return keysA.every(k => k in b && deepEqual(a[k], b[k]))
  }
  return false
}

export function readFileSync(path: string): string {
  return require("fs").readFileSync(path, "utf-8")
}

export function readdirSyncRecursive(dir: string, prefix = ""): string[] {
  const fs = require("fs")
  const results: string[] = []
  for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
    const rel = prefix ? `${prefix}/${entry.name}` : entry.name
    if (entry.isDirectory()) {
      results.push(...readdirSyncRecursive(join(dir, entry.name), rel))
    } else {
      results.push(rel)
    }
  }
  return results
}

// --- Template discovery ---
export function findAllTemplateNames(): string[] {
  const fs = require("fs")
  const results: string[] = []
  function walk(dir: string, prefix: string) {
    let entries
    try {
      entries = fs.readdirSync(dir, { withFileTypes: true })
    } catch {
      return
    }
    if (prefix && existsSync(join(dir, ".regime-template.json"))) {
      results.push(prefix)
      return
    }
    for (const entry of entries) {
      if (entry.isDirectory()) {
        const childName = prefix ? `${prefix}/${entry.name}` : entry.name
        walk(join(dir, entry.name), childName)
      }
    }
  }
  walk(templatesDir, "")
  return results.sort()
}

// --- Template resolution ---
export function resolveTemplateConfig(name: string): TemplateConfig {
  const configPath = join(templatesDir, name, ".regime-template.json")
  if (!existsSync(configPath)) return {}
  const raw = JSON.parse(readFileSync(configPath))
  return raw as TemplateConfig
}

export function resolveTemplateChain(names: string[]): CollectedTemplate {
  const visited = new Set<string>()
  const files = new Map<string, string[]>()
  const patterns: Record<string, string> = {}
  const templateNames: string[] = []
  function walk(name: string) {
    if (visited.has(name)) return
    visited.add(name)
    const dir = join(templatesDir, name)
    if (!existsSync(dir)) {
      console.error(` warning: template "${name}" not found at ${dir}`)
      return
    }
    const config = resolveTemplateConfig(name)
    // Walk parents first so children override
    if (config.inherits) {
      for (const parent of config.inherits) {
        walk(parent)
      }
    }
    templateNames.push(name)
    // Collect patterns
    if (config.patterns) {
      Object.assign(patterns, config.patterns)
    }
    // Collect files (skip .regime-template.json)
    const entries = readdirSyncRecursive(dir)
    for (const entry of entries) {
      if (entry === ".regime-template.json") continue
      const existing = files.get(entry) ?? []
      existing.push(join(dir, entry))
      files.set(entry, existing)
    }
  }
  for (const name of names) {
    walk(name)
  }
  return { files, patterns, templateNames }
}

// --- Strategy matching ---
export function getStrategy(filePath: string, patterns: Record<string, string>): string {
  // Check exact match first
  if (patterns[filePath]) return patterns[filePath]
  // Check glob patterns
  for (const [pattern, strategy] of Object.entries(patterns)) {
    if (pattern.includes("*")) {
      const glob = new Glob(pattern)
      if (glob.match(filePath)) return strategy
    }
  }
  return "overwrite" // default
}

// --- Variable interpolation ---
export function interpolate(content: string, vars: Record<string, string>, context?: string): string {
  return content.replace(/<<(\w+)>>/g, (_, key) => {
    if (!(key in vars)) {
      console.error(` warning: undeclared var "<<${key}>>"${context ? ` in ${context}` : ""}`)
    }
    return vars[key] ?? `<<${key}>>`
  })
}

export function deinterpolate(content: string, vars: Record<string, string>): string {
  let result = content
  for (const [key, value] of Object.entries(vars)) {
    result = result.replaceAll(value, `<<${key}>>`)
  }
  return result
}

// --- Deep merge (template values win; existing-only fields preserved) ---
export function deepMerge(base: any, overlay: any): any {
  if (Array.isArray(overlay) && Array.isArray(base)) {
    const result = [...overlay]
    for (const item of base) {
      if (!result.some(o => deepEqual(o, item))) {
        result.push(item)
      }
    }
    return result
  }
  if (typeof base !== "object" || base === null) return overlay
  if (typeof overlay !== "object" || overlay === null) return overlay
  const result = { ...base }
  for (const key of Object.keys(overlay)) {
    if (key in result) {
      result[key] = deepMerge(result[key], overlay[key])
    } else {
      result[key] = overlay[key]
    }
  }
  return result
}

// --- Merge all template JSON files into one combined object ---
export function mergeTemplateJsonFiles(paths: string[], vars: Record<string, string>, relPath: string): any {
  let merged: any = {}
  for (const p of paths) {
    const content = interpolate(readFileSync(p), vars, relPath)
    merged = deepMerge(merged, JSON.parse(content))
  }
  return merged
}

// --- Merge all template JSONC files into one combined object ---
export function mergeTemplateJsoncFiles(paths: string[], vars: Record<string, string>, relPath: string): any {
  let merged: any = {}
  for (const p of paths) {
    const content = interpolate(readFileSync(p), vars, relPath)
    merged = deepMerge(merged, parseJsonc(content))
  }
  return merged
}

// --- JSONC stringify (JSON with trailing commas) ---
export function stringifyJsonc(obj: any, indent: string): string {
  return jsonWithTrailingCommas(obj, indent, 0) + "\n"
}

function jsonWithTrailingCommas(value: any, indent: string, depth: number): string {
  if (value === null) return "null"
  if (typeof value === "boolean" || typeof value === "number") return JSON.stringify(value)
  if (typeof value === "string") return JSON.stringify(value)
  const currentIndent = indent.repeat(depth + 1)
  const closingIndent = indent.repeat(depth)
  if (Array.isArray(value)) {
    if (value.length === 0) return "[]"
    const items = value.map(item => `${currentIndent}${jsonWithTrailingCommas(item, indent, depth + 1)},`)
    return `[\n${items.join("\n")}\n${closingIndent}]`
  }
  if (typeof value === "object") {
    const keys = Object.keys(value)
    if (keys.length === 0) return "{}"
    const entries = keys.map(key =>
      `${currentIndent}${JSON.stringify(key)}: ${jsonWithTrailingCommas(value[key], indent, depth + 1)},`
    )
    return `{\n${entries.join("\n")}\n${closingIndent}}`
  }
  return String(value)
}

// --- Indentation detection ---
export function detectIndent(content: string): string {
  const match = content.match(/^(\s+)/m)
  return match?.[1] ?? "  "
}

// --- Diff reporting ---
export interface DiffEntry {
  field: string
  expected: any
  actual: any
  ok: boolean
}

export function diffJson(
  templateObj: any,
  existingObj: any,
  full = false,
  path: string[] = [],
): DiffEntry[] {
  const results: DiffEntry[] = []
  for (const key of Object.keys(templateObj)) {
    const fieldPath = [...path, key].join(".")
    const expected = templateObj[key]
    const actual = existingObj?.[key]
    if (actual === undefined) {
      results.push({ field: fieldPath, expected, actual: undefined, ok: false })
    } else if (
      typeof expected === "object" &&
      expected !== null &&
      !Array.isArray(expected) &&
      typeof actual === "object" &&
      actual !== null &&
      !Array.isArray(actual)
    ) {
      results.push(...diffJson(expected, actual, full, [...path, key]))
    } else if (Array.isArray(expected) && Array.isArray(actual)) {
      const missing = expected.filter(
        (e: any) => !actual.some((a: any) => deepEqual(a, e))
      )
      if (missing.length > 0) {
        results.push({ field: fieldPath, expected, actual, ok: false })
      } else if (full) {
        results.push({ field: fieldPath, expected, actual, ok: true })
      }
    } else if (JSON.stringify(expected) !== JSON.stringify(actual)) {
      results.push({ field: fieldPath, expected, actual, ok: false })
    } else if (full) {
      results.push({ field: fieldPath, expected, actual, ok: true })
    }
  }
  return results
}

// --- Find all regime.config.json files in a repo ---
export async function findRegimeConfigs(repoDir: string): Promise<string[]> {
  const results: string[] = []
  async function walk(dir: string) {
    let entries
    try {
      entries = await readdir(dir, { withFileTypes: true })
    } catch {
      return
    }
    for (const entry of entries) {
      if (entry.name === "node_modules" || entry.name === ".git") continue
      const full = join(dir, entry.name)
      if (entry.isDirectory()) {
        await walk(full)
      } else if (entry.name === "regime.config.json") {
        results.push(full)
      }
    }
  }
  await walk(repoDir)
  return results
}

src/sync.ts (new file)
@@ -0,0 +1,192 @@
import { dirname, relative, join } from "node:path"
import { existsSync, mkdirSync } from "node:fs"
import { writeFile } from "node:fs/promises"
import {
type RegimeConfig,
findRegimeConfigs,
resolveTemplateChain,
getStrategy,
interpolate,
readFileSync,
deepMerge,
mergeTemplateJsonFiles,
mergeTemplateJsoncFiles,
stringifyJsonc,
parseJsonc,
detectIndent,
} from "./shared"
const green = Bun.color("green", "ansi")
const red = Bun.color("red", "ansi")
const purple = Bun.color("purple", "ansi")
const reset = "\x1b[0m"
export async function sync(targetDir: string): Promise<void> {
const rcFiles = await findRegimeConfigs(targetDir)
if (rcFiles.length === 0) {
console.log("No regime.config.json files found.")
return
}
for (const rcFile of rcFiles) {
const rcDir = dirname(rcFile)
const relDir = relative(targetDir, rcDir) || "."
console.log(`\n${purple}${relDir}/${reset}`)
const rc: RegimeConfig = JSON.parse(readFileSync(rcFile))
const templateNames = Array.isArray(rc.templates) ? rc.templates : [rc.templates]
const vars = rc.vars ?? {}
const { files, patterns } = resolveTemplateChain(templateNames)
if (files.size === 0) {
console.log(" (no template files)")
continue
}
let allSynced = true
for (const [relPath, templatePaths] of files) {
const targetRelPath = interpolate(relPath, vars)
const targetPath = join(rcDir, targetRelPath)
const strategy = getStrategy(targetRelPath, patterns)
// Ensure target directory exists
const targetFileDir = dirname(targetPath)
if (!existsSync(targetFileDir)) {
mkdirSync(targetFileDir, { recursive: true })
}
if (strategy === "merge json") {
let templateObj: any
try {
templateObj = mergeTemplateJsonFiles(templatePaths, vars, targetRelPath)
} catch (e) {
console.log(` ${targetRelPath}: ${red}failed to parse template JSON${reset} - ${e}`)
continue
}
if (existsSync(targetPath)) {
const existingContent = readFileSync(targetPath)
let existingObj: any
try {
existingObj = JSON.parse(existingContent)
} catch (e) {
console.log(` ${targetRelPath}: ${red}failed to parse existing JSON${reset} - ${e}`)
continue
}
const merged = deepMerge(existingObj, templateObj)
const indent = detectIndent(existingContent)
const mergedContent = JSON.stringify(merged, null, indent) + "\n"
if (mergedContent === existingContent) {
// already in sync
continue
}
await writeFile(targetPath, mergedContent)
console.log(` ${targetRelPath}: ${green}updated${reset}`)
allSynced = false
} else {
const content = JSON.stringify(templateObj, null, "  ") + "\n"
await writeFile(targetPath, content)
console.log(` ${targetRelPath}: ${green}created${reset}`)
allSynced = false
}
} else if (strategy === "merge jsonc") {
let templateObj: any
try {
templateObj = mergeTemplateJsoncFiles(templatePaths, vars, targetRelPath)
} catch (e) {
console.log(` ${targetRelPath}: ${red}failed to parse template JSONC${reset} - ${e}`)
continue
}
if (existsSync(targetPath)) {
const existingContent = readFileSync(targetPath)
let existingObj: any
try {
existingObj = parseJsonc(existingContent)
} catch (e) {
console.log(` ${targetRelPath}: ${red}failed to parse existing JSONC${reset} - ${e}`)
continue
}
const merged = deepMerge(existingObj, templateObj)
const indent = detectIndent(existingContent)
const mergedContent = stringifyJsonc(merged, indent)
if (mergedContent === existingContent) {
// already in sync
continue
}
await writeFile(targetPath, mergedContent)
console.log(` ${targetRelPath}: ${green}updated${reset}`)
allSynced = false
} else {
const content = stringifyJsonc(templateObj, "\t")
await writeFile(targetPath, content)
console.log(` ${targetRelPath}: ${green}created${reset}`)
allSynced = false
}
} else if (strategy === "overwrite") {
const templateContent = interpolate(
readFileSync(templatePaths[templatePaths.length - 1]),
vars,
targetRelPath,
)
if (existsSync(targetPath)) {
const existingContent = readFileSync(targetPath)
if (existingContent === templateContent) {
// already in sync
continue
}
await writeFile(targetPath, templateContent)
console.log(` ${targetRelPath}: ${green}updated${reset}`)
allSynced = false
} else {
await writeFile(targetPath, templateContent)
console.log(` ${targetRelPath}: ${green}created${reset}`)
allSynced = false
}
}
}
if (allSynced) {
console.log(` ${green}in sync${reset}`)
}
}
}

src/templates.ts (new file)
@@ -0,0 +1,57 @@
import { join } from "node:path"
import { templatesDir, resolveTemplateConfig, readdirSyncRecursive, findAllTemplateNames } from "./shared"
const purple = Bun.color("purple", "ansi")
const reset = "\x1b[0m"
interface TreeNode {
name: string
children: TreeNode[]
files: string[]
}
function buildTree(name: string, visited = new Set<string>()): TreeNode {
if (visited.has(name)) return { name, children: [], files: [] }
visited.add(name)
const config = resolveTemplateConfig(name)
const children = (config.inherits ?? []).map(p => buildTree(p, visited))
const dir = join(templatesDir, name)
const files = readdirSyncRecursive(dir).filter(f => f !== ".regime-template.json")
return { name, children, files }
}
function printTree(node: TreeNode, full: boolean, prefix = "", isLast = true, isRoot = true) {
const connector = isRoot ? "" : isLast ? "└── " : "├── "
const line = isRoot ? node.name : `${prefix}${connector}${node.name}`
console.log(line)
const childPrefix = isRoot ? "" : prefix + (isLast ? "    " : "│   ")
if (full && node.files.length > 0) {
const hasChildren = node.children.length > 0
for (let i = 0; i < node.files.length; i++) {
const fileConnector = hasChildren || i < node.files.length - 1 ? "│   " : "    "
const bullet = "·"
console.log(`${childPrefix}${fileConnector}${purple}${bullet} ${node.files[i]}${reset}`)
}
}
for (let i = 0; i < node.children.length; i++) {
const child = node.children[i]
const last = i === node.children.length - 1
printTree(child, full, childPrefix, last, false)
}
}
export function templates(full: boolean) {
const entries = findAllTemplateNames()
for (let i = 0; i < entries.length; i++) {
const tree = buildTree(entries[i])
printTree(tree, full)
if (i < entries.length - 1) console.log()
}
}
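For orientation, with `--full` a template that inherits two parents renders roughly like this (template and file names are hypothetical; files appear as dotted bullets under the template that owns them):

```
bun/library
│   · package.json
├── shared/repo
│       · .gitignore
└── tool/oxc
        · oxlint.config.ts
```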

@@ -0,0 +1 @@
{}

@@ -0,0 +1,5 @@
{
"devDependencies": {
"@types/bun": "^1.3.13"
}
}

@@ -0,0 +1,13 @@
{
"extends": "./tsconfig.base.json",
"references": [
{ "path": "./tsconfig.src.json" }
],
"compilerOptions": {
"composite": true,
"types": ["bun"]
},
"include": [
"src/bun"
]
}

@@ -0,0 +1,5 @@
{
"references": [
{ "path": "./tsconfig.bun.json" }
]
}

@@ -0,0 +1,5 @@
{
"exclude": [
"src/bun"
]
}

@@ -0,0 +1 @@
{}

@@ -0,0 +1,5 @@
{
"devDependencies": {
"@cloudflare/workers-types": "^4.20250425.0"
}
}

@@ -0,0 +1,13 @@
{
"extends": "./tsconfig.base.json",
"references": [
{ "path": "./tsconfig.src.json" }
],
"compilerOptions": {
"composite": true,
"types": ["@cloudflare/workers-types"]
},
"include": [
"src/cloudflare"
]
}

@@ -0,0 +1,5 @@
{
"references": [
{ "path": "./tsconfig.cloudflare.json" }
]
}

@@ -0,0 +1,5 @@
{
"exclude": [
"src/cloudflare"
]
}

@@ -0,0 +1 @@
{}

@@ -0,0 +1,7 @@
Copyright © 2026 Sigitex
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

@@ -0,0 +1,7 @@
{
"inherits": [
"shared/repo",
"shared/library",
"tool/oxc"
]
}

@@ -0,0 +1,5 @@
{
"inherits": [
"shared/library"
]
}

@@ -0,0 +1,9 @@
{
"inherits": [
"shared/repo",
"shared/package",
"tool/oxc",
"tool/commitlint",
"tool/husky"
]
}

@@ -0,0 +1,11 @@
{
"workspaces": {
"catalog": {
"@types/bun": "1.3.12"
}
},
"scripts": {
"test": "bun run --workspaces --parallel --no-exit-on-error test",
"check": "bun run --workspaces --parallel --no-exit-on-error check"
}
}

@@ -0,0 +1,5 @@
{
"patterns": {
"*.code-workspace": "merge jsonc"
}
}

@@ -0,0 +1,27 @@
{
"folders": [
{
"name": "<<repo>>",
"path": "."
}
],
"settings": {
"oxc.path.oxfmt": "./node_modules/.bin/oxfmt",
"oxc.path.oxlint": "./node_modules/.bin/oxlint",
"files.exclude": {
"**/.git": true,
"**/node_modules": true,
"**/.temp": true,
"**/dist": true,
"**/*.tsbuildinfo": true,
},
"explorer.fileNesting.enabled": true,
"explorer.fileNesting.patterns": {
"tsconfig.json": "tsconfig.*.json",
"package.json": "bun.lock, biome.json*, *.bun.plugin.ts, ox*.config.ts, .gitignore, .mirrorignore, regime.config.json, commitlint.config.*, release.config.*js",
"vite.config.ts": "*.vite.plugin.ts",
"README.md": "LICENSE, LICENSE.md"
},
"git.ignoreLimitWarning": true,
}
}

@@ -0,0 +1,5 @@
{
"inherits": [
"shared/package"
]
}

@@ -0,0 +1,13 @@
{
"devDependencies": {
"@typescript/native-preview": "beta",
"@types/bun": "^1.3.13"
},
"scripts": {
"check": "tsgo --build",
"test": "bun test --pass-with-no-tests --tsconfig-override tsconfig.test.json"
},
"files": [
"src"
]
}

@@ -0,0 +1,14 @@
{
"compilerOptions": {
"module": "esnext",
"target": "esnext",
"lib": ["esnext"],
"types": [],
"moduleResolution": "bundler",
"esModuleInterop": true,
"skipDefaultLibCheck": true,
"skipLibCheck": true,
"strict": true,
"outDir": "dist"
}
}

@@ -0,0 +1,12 @@
{
"extends": "./tsconfig.base.json",
"compilerOptions": {
"composite": true,
"types": ["bun"]
},
"include": [
"*.config.ts",
"*.config.cjs"
],
"files": []
}

@@ -0,0 +1,8 @@
{
"files": [],
"references": [
{ "path": "./tsconfig.src.json" },
{ "path": "./tsconfig.test.json" },
{ "path": "./tsconfig.config.json" }
]
}

@@ -0,0 +1,9 @@
{
"extends": "./tsconfig.base.json",
"compilerOptions": {
"composite": true
},
"include": [
"src"
]
}

@@ -0,0 +1,14 @@
{
"extends": "./tsconfig.base.json",
"references": [
{ "path": "./tsconfig.src.json" }
],
"compilerOptions": {
"composite": true,
"types": ["bun"]
},
"include": [
"tests"
],
"files": []
}

@@ -0,0 +1,7 @@
{
"patterns": {
"package.json": "merge json",
"tsconfig.json": "merge json",
"tsconfig.*.json": "merge json"
}
}

@@ -0,0 +1,11 @@
{
"license": "MIT",
"author": {
"name": "Sigitex",
"url": "https://github.com/sigitex"
},
"repository": {
"type": "git",
"url": "https://github.com/sigitex/<<repo>>.git"
}
}

@@ -0,0 +1,3 @@
{
"inherits": ["include/license"]
}

@@ -0,0 +1 @@
bunx --no-install -- commitlint --edit "$1"

@@ -0,0 +1 @@
{}

@@ -0,0 +1,3 @@
export default {
extends: ["@commitlint/config-conventional"]
}

@@ -0,0 +1,6 @@
{
"devDependencies": {
"@commitlint/cli": "^20.5.3",
"@commitlint/config-conventional": "^20.5.3"
}
}

@@ -0,0 +1 @@
{}

@@ -0,0 +1,8 @@
{
"devDependencies": {
"husky": "^9.1.7"
},
"scripts": {
"prepare": "husky"
}
}

@@ -0,0 +1 @@
{}

@@ -0,0 +1,16 @@
import { defineConfig } from "oxfmt"
export default defineConfig({
useTabs: false,
tabWidth: 2,
printWidth: 80,
singleQuote: false,
jsxSingleQuote: false,
quoteProps: "as-needed",
trailingComma: "all",
semi: false,
arrowParens: "always",
bracketSameLine: false,
bracketSpacing: true,
ignorePatterns: ["**/*.gen.ts"],
});

@@ -0,0 +1,88 @@
import { defineConfig } from "oxlint"
export default defineConfig({
plugins: ["typescript", "unicorn", "oxc"],
ignorePatterns: ["**/*.gen.ts", "node_modules/**/*"],
categories: {
correctness: "error",
suspicious: "warn",
perf: "warn",
style: "warn",
restriction: "error",
},
rules: {
"capitalized-comments": "off",
"default-case": "off",
"filename-case": "off",
"func-style": ["error", "declaration", { allowArrowFunctions: true }],
"id-length": "off",
"init-declarations": "off",
"max-params": "off",
"max-statements": "off",
"new-cap": "off",
"no-array-for-each": "off",
"no-async-await": "off",
"no-await-expression-member": "off",
"no-await-in-loop": "off",
"no-bitwise": "off",
"no-console": "off",
"no-continue": "off",
"no-dynamic-delete": "off",
"no-empty-file": "off",
"no-empty-function": "off",
"no-implicit-coercion": "off",
"no-magic-numbers": "off",
"no-multi-assign": "off",
"no-nested-ternary": "off",
"unicorn/no-nested-ternary": "off",
"no-null": "off",
"no-optional-chaining": "off",
"no-plusplus": "off",
"no-rest-spread-properties": "off",
"no-shadow-restricted-names": "off",
"no-shadow": "off",
"no-ternary": "off",
"no-undefined": "off",
"no-underscore-dangle": "off",
"no-use-before-define": "off",
"unicorn/numeric-separators-style": "off",
"prefer-destructuring": "off",
"prefer-for-of": "off",
"prefer-template": "off",
"prefer-ternary": "off",
"require-module-specifiers": "off",
"sort-imports": "off",
"sort-keys": "off",
"switch-case-braces": "off",
"typescript/consistent-indexed-object-style": "off",
"typescript/consistent-type-definitions": ["error", "type"],
"typescript/explicit-function-return-type": "off",
"typescript/explicit-member-accessibility": "off",
"typescript/explicit-module-boundary-types": "off",
"typescript/no-empty-interface": "off",
"typescript/no-empty-object-type": "off",
"typescript/no-namespace": "off",
"typescript/no-non-null-assertion": "off",
"typescript/prefer-function-type": "off",
},
overrides: [
{
files: ["*.test.ts"],
rules: {
"typescript/no-explicit-any": "off",
"typescript/no-require-imports": "off",
"typescript/no-var-requires": "off",
"unicorn/prefer-module": "off",
"unicorn/consistent-function-scoping": "off",
},
},
{
files: ["*.d.ts"],
rules: {
"typescript/no-explicit-any": "off",
"unicorn/consistent-function-scoping": "off",
"typescript/consistent-type-definitions": "off",
},
},
],
})

@@ -0,0 +1,9 @@
{
"devDependencies": {
"oxfmt": "^0.47.0",
"oxlint": "^1.62.0"
},
"scripts": {
"lint": "oxlint"
}
}

@@ -0,0 +1,13 @@
name: Checks
on:
push:
branches: ["*"]
pull_request:
jobs:
checks:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: https://${{ secrets.FORGE_TOKEN }}@code.quickbasic.org/sigitex/regime/actions/checks@main

@@ -0,0 +1 @@
{}

@@ -0,0 +1,18 @@
name: Mirror to GitHub
on:
push:
branches:
- main
jobs:
mirror:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- uses: https://${{ secrets.FORGE_TOKEN }}@code.quickbasic.org/sigitex/regime/actions/mirror@main
with:
target: sigitex/<<repo>>
token: ${{ secrets.MIRROR_TOKEN }}

@@ -0,0 +1,3 @@
AGENTS.md
.forgejo
openspec

@@ -0,0 +1 @@
{}

@@ -0,0 +1,27 @@
name: Checks
on:
push:
branches: ["*"]
pull_request:
jobs:
checks:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: https://${{ secrets.FORGE_TOKEN }}@code.quickbasic.org/sigitex/regime/actions/checks@main
release:
runs-on: ubuntu-latest
needs: checks
if: github.ref == 'refs/heads/main' && github.event_name == 'push'
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- uses: https://${{ secrets.FORGE_TOKEN }}@code.quickbasic.org/sigitex/regime/actions/release@main
with:
gitea-token: ${{ secrets.FORGE_TOKEN }}
gitea-url: https://code.quickbasic.org
npm-token: ${{ secrets.NPM_TOKEN }}

@@ -0,0 +1 @@
{}

@@ -0,0 +1,8 @@
{
"devDependencies": {
"semantic-release": "^25.0.3",
"multi-semantic-release": "^3.1.0",
"@markwylde/semantic-release-gitea": "^2.2.0",
"@semantic-release/exec": "^7.0.3"
}
}

@@ -0,0 +1,16 @@
/**
* @type {import('semantic-release').GlobalConfig}
*/
module.exports = {
repositoryUrl: "https://code.quickbasic.org/sigitex/<<repo>>.git",
branches: ["main"],
plugins: [
"@semantic-release/commit-analyzer",
"@semantic-release/release-notes-generator",
["@semantic-release/exec", {
prepareCmd: "npm pkg set version=${nextRelease.version}",
publishCmd: "npm publish --access public",
}],
"@markwylde/semantic-release-gitea",
],
};

tsconfig.json (new file)
@@ -0,0 +1,19 @@
{
"compilerOptions": {
"composite": true,
"module": "esnext",
"target": "esnext",
"lib": ["esnext"],
"types": ["bun"],
"moduleResolution": "bundler",
"esModuleInterop": true,
"skipDefaultLibCheck": true,
"skipLibCheck": true,
"strict": true,
"outDir": "lib"
},
"include": [
"src/**/*",
"./bin/regime"
]
}