Validation + parsing

Parses JSON strings, validates them against a Zod schema, and returns typed results — either on the main thread or through a Knitting worker pool. If your app already does JSON.parse + validation and you want to see what offloading looks like, start here.

The host generates JSON strings (some valid, some intentionally broken). Each job parses the string, runs it through UserSchema.safeParse, and returns { ok: true, value } or { ok: false, issues }. The host aggregates counts and prints sample failures.

Three files:

  • schema_knitting.ts — runs parse+validate in host and Knitting modes
  • utils.ts — schema logic, payload builders, task exports
  • bench_schema_validate.ts — host-vs-worker benchmark with mitata
// valid
{
  "id": "u_42",
  "email": "ari@knitting.dev",
  "displayName": "Ari Lane",
  "age": 29,
  "roles": ["admin"],
  "marketingOptIn": true
}

// invalid
{
  "id": "u_42",
  "email": "ari@knitting.dev",
  "displayName": "x",
  "age": "unknown",
  "roles": ["owner"]
}

The first returns { ok: true, value }. The second returns { ok: false, issues }, with messages like displayName: String must contain at least 2 character(s) or age: Expected number, received string.
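utils.ts is not reproduced on this page; assuming its payload builder simply interleaves records like the two above at the requested rate, a minimal sketch (`buildPayloadsSketch` is a hypothetical name, not the real export) could look like:

```typescript
// Hypothetical payload builder: emits `count` JSON strings, of which
// `invalidPercent` out of every 100 fail validation (short displayName,
// age as a string), mirroring the sample records above.
function buildPayloadsSketch(count: number, invalidPercent: number): string[] {
  const payloads: string[] = [];
  for (let i = 0; i < count; i++) {
    const invalid = i % 100 < invalidPercent;
    const user = invalid
      ? {
          id: `u_${i}`,
          email: `u${i}@knitting.dev`,
          displayName: "x",
          age: "unknown",
          roles: ["owner"],
        }
      : {
          id: `u_${i}`,
          email: `u${i}@knitting.dev`,
          displayName: "Ari Lane",
          age: 29,
          roles: ["admin"],
          marketingOptIn: true,
        };
    payloads.push(JSON.stringify(user));
  }
  return payloads;
}
```

Pre-serializing to strings matters here: it keeps the host and worker runs operating on identical inputs, so the benchmark measures parse+validate rather than object construction.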

deno.sh
deno add jsr:@vixeny/knitting
deno add npm:zod npm:mitata
bun.sh
bun src/schema_knitting.ts

You should see output like:

JSON parse + schema validation
threads: 2

host
requests : 20,000
invalidRate : 15%
valid : 17,000
invalid : 3,000
took : … ms
throughput : … req/s
sampleIssues: age: Expected number, received string | displayName: String must contain at least 2 character(s) | …

knitting
requests : 20,000
invalidRate : 15%
valid : 17,000
invalid : 3,000
took : … ms
throughput : … req/s
sampleIssues: age: Expected number, received string | displayName: String must contain at least 2 character(s) | …

uplift: …%
bun.sh
bun src/bench_schema_validate.ts

The benchmark compares JSON.parse + safeParse via direct function imports (host) against the same logic dispatched through a worker pool (knitting). Batch calls keep per-dispatch overhead predictable.
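One way to read "batch calls": instead of one dispatch per payload, group payloads so that each worker call carries a slice and the per-call overhead is amortized. The chunking helper below is generic and assumes nothing about Knitting's API:

```typescript
// Generic chunking helper: splitting N payloads into batches of `size`
// means one worker dispatch per batch instead of one per payload.
function chunk<T>(items: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}
```

A batch-oriented task would then accept a `string[]` and return a `ParseValidateResult[]`, so dispatch cost is paid once per `size` payloads rather than once per payload.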

Expected output:

benchmark   avg (ns)   min ... max (ns)
host          12,340   11,200 ... 18,400
knitting       6,890    6,100 ... 11,200
schema_knitting.ts
import { createPool, isMain } from "@vixeny/knitting";
import {
  buildPayloads,
  parseAndValidate,
  parseAndValidateHost,
  type ParseValidateResult,
} from "./utils.ts";

const THREADS = 2;
const REQUESTS = 20_000;
const INVALID_PERCENT = 15;

type Summary = {
  valid: number;
  invalid: number;
  sampleIssues: string[];
};

function summarize(results: ParseValidateResult[]): Summary {
  let valid = 0;
  let invalid = 0;
  const sampleIssues: string[] = [];
  for (let i = 0; i < results.length; i++) {
    const result = results[i]!;
    if (result.ok) {
      valid++;
      continue;
    }
    invalid++;
    if (sampleIssues.length < 3 && result.issues.length > 0) {
      sampleIssues.push(result.issues[0]!);
    }
  }
  return { valid, invalid, sampleIssues };
}

function runHost(payloads: string[]): Summary {
  const results = payloads.map((payload) => parseAndValidateHost(payload));
  return summarize(results);
}

async function runWorkers(payloads: string[]): Promise<Summary> {
  const pool = createPool({ threads: THREADS })({ parseAndValidate });
  try {
    const jobs: Promise<ParseValidateResult>[] = [];
    for (let i = 0; i < payloads.length; i++) {
      jobs.push(pool.call.parseAndValidate(payloads[i]!));
    }
    const results = await Promise.all(jobs);
    return summarize(results);
  } finally {
    pool.shutdown();
  }
}

function printSummary(mode: string, summary: Summary, ms: number): void {
  const secs = Math.max(1e-9, ms / 1000);
  const rps = REQUESTS / secs;
  console.log(mode);
  console.log("requests :", REQUESTS.toLocaleString());
  console.log("invalidRate :", `${INVALID_PERCENT}%`);
  console.log("valid :", summary.valid.toLocaleString());
  console.log("invalid :", summary.invalid.toLocaleString());
  console.log("took :", `${ms.toFixed(2)} ms`);
  console.log("throughput :", `${rps.toFixed(0)} req/s`);
  if (summary.sampleIssues.length > 0) {
    console.log("sampleIssues:", summary.sampleIssues.join(" | "));
  }
}

async function main() {
  const payloads = buildPayloads(REQUESTS, INVALID_PERCENT);

  const hostStart = performance.now();
  const hostSummary = runHost(payloads);
  const hostMs = performance.now() - hostStart;

  const workerStart = performance.now();
  const workerSummary = await runWorkers(payloads);
  const workerMs = performance.now() - workerStart;

  const uplift = (hostMs / Math.max(1e-9, workerMs) - 1) * 100;

  console.log("JSON parse + schema validation");
  console.log(`threads: ${THREADS}`);
  console.log("");
  printSummary("host", hostSummary, hostMs);
  console.log("");
  printSummary("knitting", workerSummary, workerMs);
  console.log("");
  console.log(`uplift: ${uplift.toFixed(1)}%`);
}

if (isMain) {
  main().catch((error) => {
    console.error(error);
    process.exitCode = 1;
  });
}

Schema validation is a textbook case for worker offloading: each call is independent, the input/output is small, and Zod’s internals are CPU-bound (type checking, error formatting). If you’re validating hundreds of payloads per second — API gateway, webhook ingestion, form processing — batching them through a pool can free your main thread without changing any validation logic.