Embed interactive graph views or run the pipeline offline.
The fastest way to get a BitZoom graph on screen: one import, one function call, done. See the minimal example for a complete working page.
```js
import { createBitZoomView } from './bitzoom-canvas.js';

const view = createBitZoomView(
  document.getElementById('canvas'),
  edgesText,  // SNAP .edges file content (string)
  nodesText,  // SNAP .nodes file content, or null
  { initialLevel: 3, heatmapMode: 'density', showLegend: true }
);
```
That parses the data, runs the full MinHash + projection + blend pipeline, and renders an interactive canvas with pan, zoom, click-to-select, and scroll-to-zoom-level. Here it is with the Epstein dataset.
BitZoom uses a simple tab-delimited text format based on SNAP. A dataset consists of an .edges file (required) and an optional .nodes file.
Tab-delimited, one edge per line. Lines starting with `#` are comments. Each line is either `From<tab>To` (undirected) or `From<tab>To<tab>EdgeType` (typed).
```
# Social network edges
# From	To	EdgeType
alice	bob	friend
alice	carol	colleague
bob	dave	friend
carol	dave	colleague
eve	alice	manager
```
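Reading this format takes only a few lines. The sketch below is an illustrative parser written from the format description above, not BitZoom's actual parsing code (the `parseEdges` name is invented here):

```js
// Minimal SNAP .edges parser sketch (illustrative; not BitZoom's own parser).
// Returns an array of { src, dst, type }; type is undefined for untyped edges.
function parseEdges(text) {
  const edges = [];
  for (const raw of text.split('\n')) {
    const line = raw.trim();
    if (!line || line.startsWith('#')) continue; // skip blanks and comments
    const [src, dst, type] = line.split('\t');
    if (src && dst) edges.push({ src, dst, type });
  }
  return edges;
}

const text = '# From\tTo\tEdgeType\nalice\tbob\tfriend\nbob\tdave\tfriend\n';
console.log(parseEdges(text).length); // 2
```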
Tab-delimited, one node per line. The first comment line defines the column names, `# NodeId<tab>Label<tab>Group`, followed by any number of extra columns. Extra columns become property groups used for MinHash similarity and layout positioning.
```
# NodeId	Label	Group	Role	Experience
alice	Alice	engineering	backend	8
bob	Bob	engineering	frontend	5
carol	Carol	design	ux	3
dave	Dave	design	visual	7
eve	Eve	management	director	12
```
Numeric columns are auto-detected when >=80% of values parse as numbers. They receive 3-level tokenization (coarse, medium, fine bins), so nodes with nearby numeric values cluster together while distant values separate.
Empty fields (missing or blank values) produce zero tokens, which results in a neutral [0,0] projection for that property group. This avoids false clustering: nodes with unknown values float to the center rather than being forced into a cluster.
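A sketch of how such a tokenizer might behave (the bin widths and token format here are illustrative assumptions, not BitZoom's exact scheme): each numeric value emits one token per resolution, so nearby values share the coarse token while only near-identical values share the fine one, and an empty field emits nothing.

```js
// Illustrative 3-level numeric tokenizer (assumed bin widths; not BitZoom's exact scheme).
// Nearby values share the coarse token; only close values also share medium/fine tokens.
function numericTokens(prop, value) {
  if (value === '' || value == null || isNaN(Number(value))) return []; // empty field: zero tokens
  const v = Number(value);
  return [
    `${prop}:coarse:${Math.floor(v / 10)}`, // wide bins
    `${prop}:medium:${Math.floor(v / 4)}`,
    `${prop}:fine:${Math.floor(v)}`,        // narrow bins
  ];
}

console.log(numericTokens('exp', 8)); // shares its coarse token with exp = 5
console.log(numericTokens('exp', '')); // []
```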
If no `.nodes` file is provided, the graph is edge-only. Layout then relies entirely on topology (set `smoothAlpha: 1.0` for best results).
The code above is the full embed. All you need is a `<canvas>` element and the SNAP data as strings. The canvas auto-sizes via `ResizeObserver`.
```js
// Fetch data, create view
const [edges, nodes] = await Promise.all([
  fetch('data/epstein.edges').then(r => r.text()),
  fetch('data/epstein.nodes').then(r => r.text()),
]);

const view = createBitZoomView(canvas, edges, nodes, {
  initialLevel: 3,
  edgeMode: 'lines',
  heatmapMode: 'density',
  showLegend: true,
  showResetBtn: true,
  weights: { group: 8, edgetype: 4 },
  labelProps: ['label'],
});
```
The returned view is a BitZoomCanvas instance with a full API for programmatic control.
If your graph is already in memory as JavaScript objects, skip the SNAP parsing entirely with `createBitZoomFromGraph`. Pass nodes as plain objects with `id`, `group`, `label`, and any extra properties. Edges are `{src, dst}` pairs.
```js
import { createBitZoomFromGraph } from './bitzoom-canvas.js';

const nodes = [
  { id: 'alice', group: 'engineer', label: 'Alice', lang: 'Go',     exp: 8 },
  { id: 'bob',   group: 'engineer', label: 'Bob',   lang: 'Go',     exp: 5 },
  { id: 'carol', group: 'designer', label: 'Carol', lang: 'JS',     exp: 3 },
  { id: 'dave',  group: 'designer', label: 'Dave',  lang: 'JS',     exp: 7 },
  { id: 'eve',   group: 'manager',  label: 'Eve',   lang: 'Python', exp: 12 },
  { id: 'frank', group: 'engineer', label: 'Frank', lang: 'Rust',   exp: 2 },
  { id: 'grace', group: 'designer', label: 'Grace', lang: 'JS',     exp: 6 },
  { id: 'hank',  group: 'manager',  label: 'Hank',  lang: 'Python', exp: 9 },
];

const edges = [
  { src: 'alice', dst: 'bob' },
  { src: 'alice', dst: 'carol' },
  { src: 'bob',   dst: 'frank' },
  { src: 'carol', dst: 'dave' },
  { src: 'carol', dst: 'grace' },
  { src: 'dave',  dst: 'grace' },
  { src: 'eve',   dst: 'alice' },
  { src: 'eve',   dst: 'hank' },
  { src: 'hank',  dst: 'bob' },
  { src: 'frank', dst: 'eve' },
];

const view = createBitZoomFromGraph(canvas, nodes, edges, {
  initialLevel: 3,
  showLegend: true,
  showResetBtn: true,
  weights: { group: 5, lang: 8 },
});
```
Only `id` is required. `group` defaults to `"unknown"` and `label` defaults to the node's `id`. Any other property becomes an extra property group. Numeric values (like `exp`) get multi-resolution tokenization automatically.
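Those defaults can be sketched in a few lines. This is an illustrative reimplementation of the rule just stated, not BitZoom's actual code, and the `normalizeNode` name is invented here:

```js
// Illustrative sketch of the node-defaulting rule (not BitZoom's actual code).
// id is required; group defaults to 'unknown'; label defaults to the id;
// every remaining property becomes an extra property group.
function normalizeNode(raw) {
  const { id, group = 'unknown', label = raw.id, ...extraProps } = raw;
  if (id == null) throw new Error('node id is required');
  return { id, group, label, extraProps };
}

console.log(normalizeNode({ id: 'frank', exp: 2 }));
// { id: 'frank', group: 'unknown', label: 'frank', extraProps: { exp: 2 } }
```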
All options are optional. Sensible defaults are applied.
| Option | Type | Default | Description |
|---|---|---|---|
| `initialLevel` | number | `3` | Starting zoom level index (0=L1, 13=L14, 14=RAW) |
| `edgeMode` | string | `'curves'` | `'curves'`, `'lines'`, or `'none'` |
| `heatmapMode` | string | `'off'` | `'off'`, `'splat'`, or `'density'` |
| `quantMode` | string | `'gaussian'` | `'gaussian'` (density-preserving) or `'rank'` (uniform) |
| `sizeBy` | string | `'edges'` | Node size: `'edges'` (degree) or `'members'` (count) |
| `sizeLog` | boolean | `false` | Log scale for node size |
| `smoothAlpha` | number | `0` | Topology blend weight, 0 (property only) to 1 (topology only) |
| `weights` | object | `{group: 3, rest: 1}` | Override property group weights |
| `labelProps` | array | `[]` | Property names to show as node labels |
| `showLegend` | boolean | `false` | Draw color legend in bottom-right corner |
| `showResetBtn` | boolean | `false` | Draw reset button in top-right corner |
| `skipEvents` | boolean | `false` | Disable built-in mouse/touch/keyboard events |
| `webgl` | boolean | `false` | Use WebGL2 instanced rendering for geometry (details) |
| `useGPU` | boolean | `false` | Force WebGPU compute for projection and blend (details) |
| `autoGPU` | boolean | `true` | Automatically enable WebGPU when the dataset is large enough to benefit (N×G > 2000); set to `false` to disable |
| `colorScheme` | number | `0` | Color scheme index: 0=vivid, 1=viridis, 2=plasma, 3=inferno, 4=thermal, 5=grayscale, 6=diverging, 7=greens, 8=reds |
| `lightMode` | boolean | `false` | Light theme for labels, legend, and grid |

Press `C` to cycle color schemes at runtime, or set `view.lightMode = true` to switch themes.
Color scheme and theme can also be changed at runtime:
```js
import { SCHEME_VIRIDIS, SCHEME_DIVERGING } from './bitzoom-colors.js';

// Light mode with viridis colors
const view = createBitZoomView(canvas, edges, nodes, {
  colorScheme: SCHEME_VIRIDIS,
  lightMode: true,
});

// Change at runtime
view.colorScheme = SCHEME_DIVERGING; // switch to diverging
view.lightMode = false;              // back to dark
view.cycleColorScheme();             // next scheme
console.log(view.colorSchemeName);   // "thermal"
```
Available constants: SCHEME_VIVID (0, default), SCHEME_VIRIDIS, SCHEME_PLASMA, SCHEME_INFERNO, SCHEME_THERMAL, SCHEME_GRAYSCALE, SCHEME_DIVERGING, SCHEME_GREENS, SCHEME_REDS.
React to user interaction without touching the DOM:
```js
const view = createBitZoomView(canvas, edges, nodes, {
  onSelect(hit) {
    // hit = { type: 'node'|'supernode', item: nodeObj|supernodeObj }
    if (hit) console.log('Selected:', hit.item.id || hit.item.bid);
    else console.log('Deselected');
  },
  onHover(hit) {
    // Called on every hover change (null when leaving a node)
    sidebar.textContent = hit ? (hit.item.label || hit.item.cachedLabel) : '';
  },
  onRender() {
    // Called after each frame render. Useful for syncing external UI.
    levelIndicator.textContent = 'L' + view.currentLevel;
  },
});
```
Property weights control which attributes dominate the layout. Change them at any time. No rehashing, no reprojection. Just a weighted blend of fixed 2D anchors.
```js
// Change weights after creation
view.setWeights({ kind: 8, group: 1 }); // cluster by file type
view.setWeights({ kind: 1, group: 8 }); // cluster by module

// Adjust topology smoothing
view.setAlpha(0.5); // pull connected nodes together

// Batch display options
view.setOptions({
  heatmapMode: 'density',
  edgeMode: 'lines',
  sizeBy: 'members',
  labelProps: ['file', 'kind'],
});
```
When loading an unfamiliar dataset, the default weights may produce a collapsed or uniform layout. The auto-tune optimizer searches the weight/alpha/quant parameter space to find a configuration with good visual structure, then applies it automatically.
Synth Packages dataset: default weights (left) vs auto-tuned (right).
Pass `autoTune` in the options. The view renders immediately with defaults; the optimizer runs in the background and re-renders when done. A progress overlay appears on the canvas during optimization.
```js
import { createBitZoomView } from './bitzoom-canvas.js';

const view = createBitZoomView(canvas, edgesText, nodesText, {
  // Auto-tune all parameters
  autoTune: { weights: true, alpha: true, quant: true },
});

// view is returned synchronously and usable immediately;
// auto-tune runs async and re-renders when done
```
Control which parameters are tuned:
```js
// Only tune weights; keep alpha and quant as specified
const view = createBitZoomView(canvas, edges, nodes, {
  smoothAlpha: 0.5,
  quantMode: 'gaussian',
  autoTune: { weights: true, alpha: false, quant: false },
});
```
Explicit `weights`, `smoothAlpha`, and `quantMode` in the options take precedence over tuned values.
Call `autoTuneWeights` directly for full control:
```js
import { autoTuneWeights } from './bitzoom-utils.js';

const result = await autoTuneWeights(
  view.nodes, view.groupNames, view.adjList, view.nodeIndexFull,
  {
    weights: true, alpha: true, quant: true,
    onProgress(info) {
      console.log(`${info.phase} ${info.step}/${info.total}`);
    },
  }
);

console.log(result);
// { weights: {group:8, ...}, alpha: 0.5, quantMode: 'gaussian',
//   labelProps: ['group'], score: 0.42, blends: 87, quants: 174, timeMs: 320 }

// Apply to an existing view
for (const g of view.groupNames) view.propWeights[g] = result.weights[g];
view.smoothAlpha = result.alpha;
view.quantMode = result.quantMode;
view.setWeights(result.weights); // triggers re-blend and render
```
The optimizer maximizes spread × clumpiness at zoom level L5 (32×32 grid). Spread (the fraction of grid cells occupied) penalizes collapse; clumpiness (the coefficient of variation of per-cell node counts) penalizes uniform scatter and rewards clusters with gaps between them. The search runs in two phases.
Each evaluation runs a full blend + quantize cycle (~1-3ms per evaluation depending on graph size). Total time is typically 200-800ms for graphs under 5K nodes.
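The score itself is easy to sketch from that description. The function below is a simplified reimplementation written from the paragraph above, not the library's actual scorer, and it assumes node positions are already normalized to [0, 1):

```js
// Simplified layout-score sketch: spread (occupied-cell fraction) times
// clumpiness (coefficient of variation of per-cell counts).
// A reimplementation from the description above, not BitZoom's own code.
function layoutScore(points, grid = 32) {
  const counts = new Array(grid * grid).fill(0);
  for (const { x, y } of points) {
    // assumes x, y are normalized to [0, 1)
    const cx = Math.min(grid - 1, Math.floor(x * grid));
    const cy = Math.min(grid - 1, Math.floor(y * grid));
    counts[cy * grid + cx]++;
  }
  const occupied = counts.filter(c => c > 0).length;
  const spread = occupied / counts.length;                 // penalizes collapse
  const mean = points.length / counts.length;
  const variance = counts.reduce((s, c) => s + (c - mean) ** 2, 0) / counts.length;
  const cv = Math.sqrt(variance) / mean;                   // rewards clusters with gaps
  return spread * cv;
}
```

A fully collapsed layout scores near zero because spread is tiny; a perfectly uniform scatter scores zero because the coefficient of variation vanishes; clusters separated by empty cells score highest.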
Run the full pipeline without a canvas, in Node.js or Deno. Useful for batch processing, precomputation, or analysis scripts.
```js
import { runPipeline } from './bitzoom-pipeline.js';
import { unifiedBlend, normalizeAndQuantize, buildLevel } from './bitzoom-algo.js';

// Parse + tokenize + project (no DOM needed)
const result = runPipeline(edgesText, nodesText);
console.log(result.nodeArray.length, 'nodes');
console.log(result.groupNames); // ['group', 'label', 'structure', ...]

// Hydrate nodes with projections
const G = result.groupNames.length;
const nodes = result.nodeArray.map((n, i) => {
  const projections = {};
  for (let g = 0; g < G; g++) {
    const off = (i * G + g) * 2;
    projections[result.groupNames[g]] = [result.projBuf[off], result.projBuf[off + 1]];
  }
  return { ...n, projections, px: 0, py: 0, gx: 0, gy: 0, x: 0, y: 0 };
});

// Build adjacency
const nodeIndex = Object.fromEntries(nodes.map(n => [n.id, n]));
const adjList = Object.fromEntries(nodes.map(n => [n.id, []]));
for (const e of result.edges) {
  if (adjList[e.src] && adjList[e.dst]) {
    adjList[e.src].push(e.dst);
    adjList[e.dst].push(e.src);
  }
}

// Blend with custom weights
const weights = { group: 5, edgetype: 8 };
for (const g of result.groupNames) weights[g] ??= 1;
unifiedBlend(nodes, result.groupNames, weights, 0, adjList, nodeIndex, 5, 'rank');

// Build a zoom level and inspect supernodes
const level = buildLevel(4, nodes, result.edges, nodeIndex,
  n => n.group, n => n.label || n.id, () => '#888');
console.log(level.supernodes.length, 'supernodes at L4');
console.log(level.snEdges.length, 'super-edges');

// Find the largest supernode
const biggest = level.supernodes
  .sort((a, b) => b.members.length - a.members.length)[0];
console.log(biggest.cachedLabel, biggest.members.length, 'members');
```
Works in Deno (`deno run --allow-read`), Node.js 18+ (with `--experimental-vm-modules`), and browsers.
BitZoom scales well because the pipeline is O(n) in node count (MinHash + projection) and rendering uses hierarchical aggregation (only visible supernodes are drawn). Here are typical timings on a modern desktop:
| Scale | Pipeline | Render | Notes |
|---|---|---|---|
| ~1K nodes | <10ms | <1ms | Instant, any mode |
| ~5K nodes | <50ms | <5ms | Smooth at 60fps |
| ~50K nodes | <500ms | <20ms | Consider disabling heatmap |
| ~367K nodes (Amazon) | ~4s (GPU) | ~5ms (WebGL) | GPU pipeline + WebGL rendering recommended |
- Set `heatmapMode: 'off'`. The density heatmap is the most expensive render pass.
- Set `edgeMode: 'none'` to skip edge rendering entirely. Edge drawing is O(visible edges) and can dominate frame time at high zoom levels.
- Set `webgl: true` to offload circle, edge, and heatmap geometry to the GPU via instanced rendering.
- Set `useGPU: true` to run projection and blend on the GPU via WebGPU compute shaders. This is the biggest win for pipeline time on very large graphs.

| Symptom | Cause | Fix |
|---|---|---|
| All nodes collapse to a single point | All property values are identical across nodes (zero variance) | Add distinguishing properties, increase `smoothAlpha` to use topology, or run auto-tune |
| Uniform random scatter, no clusters | No `.nodes` file (edge-only dataset) | Set `smoothAlpha: 1.0` to lay out purely by topology; provide a `.nodes` file if properties are available |
| Slow pipeline on very large graphs | CPU-bound MinHash + projection at scale | Use `useGPU: true` for WebGPU compute; disable heatmap with `heatmapMode: 'off'` |
| Rank quantization falls back to CPU | Rank sort requires float64 precision not available in GPU compute | Use `quantMode: 'gaussian'` (the default) for GPU-compatible quantization |
| Slow rendering at high zoom | Many visible nodes/edges at detailed zoom levels | Enable `webgl: true`; use `edgeMode: 'none'` or `'lines'` instead of `'curves'` |
Use MinHash directly to estimate Jaccard similarity between token sets:
```js
import { computeMinHash, jaccardEstimate } from './bitzoom-algo.js';

const sigA = computeMinHash(['group:Person', 'label:jeffrey', 'label:epstein']);
const sigB = computeMinHash(['group:Person', 'label:ghislaine', 'label:maxwell']);
const sigC = computeMinHash(['group:Organization', 'label:us', 'label:government']);

console.log(jaccardEstimate(sigA, sigB)); // ~0.2 (share group:Person)
console.log(jaccardEstimate(sigA, sigC)); // ~0.0 (no shared tokens)
console.log(jaccardEstimate(sigA, sigA)); // 1.0 (identical)
```
For on-demand signatures of existing nodes (using the same tokenization as the pipeline):
```js
import { computeNodeSig } from './bitzoom-pipeline.js';

const sig = computeNodeSig(node); // Float64Array[128]
```
The zoom hierarchy is the core trick: 14 aggregation levels from two stored uint16 coordinates per node, derived via bit shifts. You can access any level programmatically.
```js
import { cellIdAtLevel, buildLevel } from './bitzoom-algo.js';

// Which cell is a node in at level 4? (16x16 grid)
const bid = cellIdAtLevel(node.gx, node.gy, 4);
const cx = bid >> 4;     // cell column
const cy = bid & 0b1111; // cell row

// Build level 6 (64x64 grid)
const lv = buildLevel(6, nodes, edges, nodeIndex,
  n => n.group, n => n.label, val => colors[val] || '#888');

// Each supernode: { bid, members[], ax, ay, cachedColor, cachedLabel, ... }
for (const sn of lv.supernodes) {
  console.log(`Cell (${sn.cx},${sn.cy}): ${sn.members.length} nodes, label: ${sn.cachedLabel}`);
}

// Super-edges: { a: bidA, b: bidB, weight: edgeCount }
for (const e of lv.snEdges) {
  console.log(`${e.a} ↔ ${e.b}: ${e.weight} edges`);
}
```
Pass `webgl: true` to render geometry (circles, edges, heatmaps, grid) via WebGL2 instanced draw calls. Text (labels, counts, legend) stays on a Canvas 2D overlay. The two canvases are stacked automatically.
Epstein dataset: Canvas 2D (left) vs WebGL2 (right). Same data, same layout, same interaction.
```js
import { createBitZoomView } from './bitzoom-canvas.js';

// Canvas 2D (default)
const view2d = createBitZoomView(canvas1, edges, nodes, {
  edgeMode: 'curves',
  showLegend: true,
});

// WebGL2
const viewGL = createBitZoomView(canvas2, edges, nodes, {
  webgl: true,
  edgeMode: 'curves',
  showLegend: true,
});

// Toggle at runtime
viewGL.useWebGL = false; // switch back to Canvas 2D
viewGL.useWebGL = true;  // switch to WebGL2
```
Embed a graph with a single HTML element, no JavaScript needed for basic use. Include `bz-graph.js` in your page and use the `<bz-graph>` tag.
```html
<script type="module" src="bz-graph.js"></script>
<style>bz-graph:not(:defined) { visibility: hidden; }</style>

<bz-graph edges="data/epstein.edges" nodes="data/epstein.nodes"
          level="3" heatmap="density" legend
          weights="group:5,edgetype:8" label-props="label">
</bz-graph>
```
```html
<bz-graph format="json" level="2" legend color-scheme="6">
{
  "nodes": [
    {"id": "alice", "group": "eng", "label": "Alice", "lang": "Go"},
    {"id": "bob", "group": "eng", "label": "Bob", "lang": "Go"},
    {"id": "carol", "group": "des", "label": "Carol", "lang": "JS"}
  ],
  "edges": [
    {"src": "alice", "dst": "bob"},
    {"src": "bob", "dst": "carol"}
  ]
}
</bz-graph>
```
```html
<bz-graph level="2" legend edge-mode="lines">
alice	bob
bob	carol
carol	alice
</bz-graph>
```
| Attribute | Example | Description |
|---|---|---|
| `edges` | `"data/karate.edges"` | URL to `.edges` file |
| `nodes` | `"data/karate.nodes"` | URL to `.nodes` file (optional) |
| `format` | `"json"` | Inline data format: `json` or `snap` (default) |
| `level` | `"3"` | Initial zoom level (0-indexed) |
| `heatmap` | `"density"` | Heatmap mode: `off`, `density`, `splat` |
| `edge-mode` | `"curves"` | Edge mode: `curves`, `lines`, `none` |
| `alpha` | `"0.5"` | Topology blend alpha (0–1) |
| `color-scheme` | `"1"` | Color scheme index (0–8) |
| `weights` | `"group:5,kind:8"` | Property weights |
| `label-props` | `"label,group"` | Properties shown as labels |
| `legend` | (boolean) | Show color legend |
| `reset-btn` | (boolean) | Show reset button |
| `light-mode` | (boolean) | Light theme |
| `webgl` | (boolean) | WebGL2 rendering |
Access the underlying `BitZoomCanvas` via `element.view` for programmatic control:
```js
const el = document.querySelector('bz-graph');
el.view.colorScheme = 3;       // change at runtime
el.view.cycleColorScheme();    // next scheme
el.setAttribute('level', '5'); // reactive attribute
```
See the full demo page for more examples.
BitZoom has no build step and no npm package. Copy the files you need into your project:
For running the pipeline without a browser or canvas:
| File | Role |
|---|---|
| `bitzoom-algo.js` | MinHash, projection, blend, quantization, levels |
| `bitzoom-pipeline.js` | SNAP parsing, graph building, tokenization |

These two files have zero DOM dependencies. Import `runPipeline` from `bitzoom-pipeline.js`.
For embedding an interactive graph view in a web page:
| File | Role |
|---|---|
| `bitzoom-canvas.js` | Entry point, `createBitZoomView()` factory |
| `bitzoom-algo.js` | MinHash, projection, blend, quantization |
| `bitzoom-pipeline.js` | SNAP parsing, graph building, tokenization |
| `bitzoom-renderer.js` | Canvas 2D rendering, hit testing, layout |
| `bitzoom-utils.js` | Auto-tune optimizer |
| `bitzoom-gl-renderer.js` | WebGL2 instanced rendering (optional feature, always imported) |
| `bitzoom-gpu.js` | WebGPU compute acceleration (optional feature, always imported) |
| `bitzoom-colors.js` | Color schemes (vivid, viridis, plasma, etc.) |
| `bz-graph.js` | `<bz-graph>` web component (optional, imports `bitzoom-canvas.js`) |
All files except `bz-graph.js` are required for `createBitZoomView`. Add `bz-graph.js` for the web component. All are ES modules; place them in the same directory. No bundler required.
How It Works · Viewer · GitHub