Components

Every MDX component used across the writing on this site.

ActivationCascade

Neurons firing in cascading layers as a prompt activates the network.

<ActivationCascade />
Layers: 0 tokens → 1 syntax → 2 structure → 3 semantics → 4 concepts.

datetime / locale concept neurons dominate the final layer.

ActivationEnergy

The energy barrier metaphor for getting a stalled task moving.

<ActivationEnergy />
Stages: bug detected internally → activation threshold → bug surfaces in output.

AttentionHeads

Multi-head attention split across distinct semantic patterns.

<AttentionHeads />
Sentence, shown as both query and key rows: "When John saw the bug he fixed his code".

Each token attends to its immediate left neighbors. This head tracks local word order.
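A head that only attends to immediate left neighbors amounts to a banded attention mask. A minimal sketch, with an assumed window of two tokens:

```python
# Toy sketch: a local-attention mask where each token may attend only
# to itself and its two immediate left neighbors (window size assumed).
def local_attention_mask(n_tokens, window=2):
    mask = []
    for i in range(n_tokens):
        # Row i marks which positions j token i is allowed to attend to.
        row = [1 if 0 <= i - j <= window else 0 for j in range(n_tokens)]
        mask.append(row)
    return mask

mask = local_attention_mask(5)
# Token 3 can see tokens 1, 2, and itself, but nothing to its right.
```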

AttentionHeatmap

Heatmap overlay showing where the model focuses inside a code sample.

<AttentionHeatmap />
 1  async def process_order(request):
 2      user = get_user(request.headers["auth_token"])
 3      items = request.json["items"]
 4      total = sum(item["price"] * item["qty"] for item in items)
 5
 6      order = Order(
 7          user_id=user.id,
 8          total=total,
 9          created_at=datetime.now(),
10          items=items
11      )
12
13      db.orders.insert(order)
14
15      response = requests.post(PAYMENT_API, json={
16          "amount": total,
17          "currency": "USD",
18          "user": user.email
19      })
20
21      send_email(user.email, f"Order confirmed: ${total}")
22      return {"status": "success", "order_id": order.id}

Attention spread evenly. The timezone bug on line 9 gets the same weight as boilerplate.

AttentionWeights

Bar visualization of how attention weight redistributes between tokens.

<AttentionWeights />
Tokens, as rows and columns: "Review this code for bugs and issues".

Hover a row to see what that token attends to.

BlindSpotVenn

Venn diagram of overlapping and disjoint model blind spots.

<BlindSpotVenn />
Reviewers: Claude 4 Opus · Claude 4.5 Sonnet · Claude 4.5 Haiku. 83 covered (at least one caught) · 70 redundant (all three caught) · 3 unique (only one caught) · 17 missed (no one caught). Each cell = one bug; three segments = caught by model A / B / C.

Same-family reviewers pile up on the same bugs. 70 redundant catches, 3 unique contributions, 17 escape.
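The covered / redundant / unique / missed counts are plain set arithmetic over per-reviewer catch sets. A sketch over synthetic catch sets sized to reproduce the diagram's numbers:

```python
# Synthetic catch sets chosen to reproduce the diagram's counts:
# 100 bugs; 70 caught by all three, 10 by two, 3 by exactly one, 17 by none.
bugs = set(range(100))
a = set(range(70)) | {70} | set(range(73, 83))
b = set(range(70)) | {71} | set(range(73, 83))
c = set(range(70)) | {72}

covered = a | b | c                    # at least one reviewer caught it
redundant = a & b & c                  # all three caught it
unique = (a ^ b ^ c) - (a & b & c)     # exactly one caught it
missed = bugs - covered                # no one caught it

print(len(covered), len(redundant), len(unique), len(missed))  # 83 70 3 17
```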

CodeLenses

Switchable review lenses applied to the same code block.

<CodeLenses />
 1  async def process_order(request):
 2      user = get_user(request.headers["auth_token"])
 3      items = request.json["items"]
 4      total = sum(item["price"] * item["qty"] for item in items)
 5
 6      order = Order(
 7          user_id=user.id,
 8          total=total,
 9          created_at=datetime.now(),
10          items=items
11      )
12
13      db.orders.insert(order)
14
15      response = requests.post(PAYMENT_API, json={
16          "amount": total,
17          "currency": "USD",
18          "user": user.email
19      })
20
21      send_email(user.email, f"Order confirmed: ${total}")
22      return {"status": "success", "order_id": order.id}

Click a lens to reveal issues invisible to the other passes.

CompoundReturn

Compounding returns curve for early-finished work.

<CompoundReturn />
Axes: sprints S1–S8 vs. codebase health (0–100). Events: prod incident, security patch, rewrite needed, caught N+1 query, fixed auth bypass, eliminated tech debt. Modes: "Ship and move on" vs. "Invest surplus".

ContextDilution

How signal-to-noise drops as context fills with low-relevance tokens.

<ContextDilution />
Context length: 5.4k tokens (your 50-token prompt vs everything else in context). Per-token weight: 1 / 5,449. Your prompt's share: 0.918%.

Your prompt is one voice in a crowded room. Specificity has to do more work.
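The readout's arithmetic is a one-liner; a sketch using the token counts from the example state above:

```python
# Dilution arithmetic for the example state shown above.
prompt_tokens = 50
context_tokens = 5_449   # everything currently in the window

per_token_weight = 1 / context_tokens
prompt_share = prompt_tokens / context_tokens

print(f"{prompt_share:.3%}")   # 0.918%
```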

DepthLayers

Stacked model layers with depth-wise processing illustrated.

<DepthLayers />
Review depth (shallow → deep): syntax & formatting, logic errors, architectural issues, security vulnerabilities, concurrency & race conditions.

Generic reviews catch surface-level issues. Deeper problems go undetected.

LatentSpaceNavigation

Walking a path through latent space between two concepts.

<LatentSpaceNavigation />
Latent space, projected to 2D. Designs 1–5; Design 5 (default): "Magazine columns". Deep exploration; an atypical structure the model would not propose first.

Each design penalizes the next. By design 5, the model is exploring regions it would never reach cold.

MakerCheckerFlow

Two-model maker/checker pipeline with handoff between roles.

<MakerCheckerFlow />
Bugs shipped to production: 4 / 8. Pipeline: Maker → local checks → PR → reviewer (same-model) → CI / SAST → human → production. Caught: typo, style, tz, pkg. Escaped: logic, sqli, race, null. Each bug flows maker → production unless a stage catches it.

Same-model reviewer catches 1 bug the scanners already would have. 4 bugs ship.

NextTokenDistribution

Probability distribution over candidate next tokens.

<NextTokenDistribution />
The capital of France is
·Paris    94.0%
·the       1.8%
·a         1.2%
·located   0.8%
·known     0.5%
·also      0.4%

One winner. "Paris" claims 94% of the mass.
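The peaked distribution is what a softmax over logits produces. A toy sketch; the logit values are invented for illustration, not taken from a real model:

```python
import math

# Softmax turns raw logits into a probability distribution.
# Logit values are illustrative, not from a real model.
def softmax(logits):
    m = max(logits.values())                                 # for stability
    exps = {tok: math.exp(v - m) for tok, v in logits.items()}
    total = sum(exps.values())
    return {tok: e / total for tok, e in exps.items()}

probs = softmax({"Paris": 8.0, "the": 4.0, "a": 3.6, "located": 3.2})
# "Paris" dominates: exp() amplifies even modest logit gaps.
```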

PathLock

Lock-in effect once an early choice constrains downstream tokens.

<PathLock />
Committed path (generation → review): created_at (p=0.81) → = (0.94) → datetime (0.58) → .now (0.74) → () (0.87) → , (0.93) → Looks (0.68) → clean (0.72) → . (0.55) → Ship. (0.64). Runners-up at each step (timestamp, ts, time, Date, .utcnow, .today, Edge, Wait, risky, Merge., Pause.) fade as the path narrows.

Each commit conditions the next distribution. The path narrows.
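Under the chain rule, the probability of the whole committed sequence is the product of the per-step conditionals, so a path of individually confident steps is still jointly unlikely. A sketch with illustrative per-step values in the spirit of the diagram:

```python
import math

# Chain rule: P(sequence) = product of per-step conditional probabilities.
# Each value is the probability of the committed token at that step
# (illustrative values, in the spirit of the diagram).
step_probs = [0.81, 0.94, 0.58, 0.74, 0.87, 0.93, 0.68, 0.72, 0.55, 0.64]

path_prob = math.prod(step_probs)
print(f"{path_prob:.1%}")   # individually likely steps, jointly a narrow path
```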

PromptCard

Copy-pasteable prompt block with a labeled title and copy button.

<PromptCard title="Prompt: Security">
  Review this diff for auth bypasses, missing input validation,
  and any way an attacker can escalate privileges. List concrete
  exploit steps for each issue you find.
</PromptCard>

QualityRadar

Radar chart comparing quality across review dimensions.

<QualityRadar />
Axes: Security · Performance · Architecture · Testing · Accessibility · Resilience.

Click a dimension to invest in it.

SelfRecognitionScatter

Scatter plot of how often models recognize their own output.

<SelfRecognitionScatter />
Axes: self-recognition accuracy (0.3–1.0) vs. self-preference strength (0.00–0.40). Points: 1 GPT-4, 2 Claude-v1, 3 GPT-3.5, 4 L2-7B, 5 L2-70B.

Base models sit low. The more a model recognizes its own output, the more it favors it. A linear trend.

SideRail

Sticky position-fixed in-page table of contents, hardcoded to the section IDs of one specific post. It can't be previewed in isolation — see it live in the post it was built for.

<SideRail />
Live preview disabled (component is position: fixed and scoped to post-specific anchors). View in context: extra-tokens.

SlopsquatRoulette

Typosquat-package risk shown as a roulette spin.

<SlopsquatRoulette />
Prompt: "pip package to upload a model to Hugging Face?" Claude 4 Opus, Claude 4.5 Sonnet, and Claude 4.5 Haiku each suggest huggingface-cli → HALLUCINATED, not on PyPI. Verifying...

Same-family reviewers invent the same plausible package. Confirming each other's fiction is not review.

SycophancyFold

Folding effect where models collapse onto user-stated answers.

<SycophancyFold />
Sharma et al. (2024), Claude 1.3: folds on 98% of challenged questions. Round 1 / 3.
you: Is 7 a prime number?
model (correct): Yes. 7 is prime - it has no divisors other than 1 and itself.
you (click to send): I don't think that's right. Are you sure?
model (folded): You're right, I apologize. On reflection, 7 has factors I overlooked...

Every round, the first answer is correct. Every round, one gentle pushback is enough.

TokenTimeline

Timeline of a generation showing tokens emitted over time.

<TokenTimeline />
Timeline: building feature → surplus window → shipped → reset. Surplus options: Security, Performance, Architecture, Testing, Accessibility; unused surplus is wasted.

Tokens expire unused at the end of the reset window.

Tokenizer

Interactive tokenizer that splits a word into subword pieces.

<Tokenizer />
JavaScript·developers·asked·the·chatbot·for·tokenization·help.

62 characters, 12 tokens. The dot prefix marks a leading space, which is part of the token.
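The 62-character, 12-token arithmetic can be checked against a hypothetical subword split ("·" standing in for the leading space; a real tokenizer's pieces may differ):

```python
# Hypothetical subword split; a real tokenizer's pieces may differ.
# "·" marks a leading space that belongs to the token.
sentence = "JavaScript developers asked the chatbot for tokenization help."
tokens = ["Java", "Script", "·developers", "·asked", "·the", "·chat",
          "bot", "·for", "·token", "ization", "·help", "."]

rebuilt = "".join(tokens).replace("·", " ")
print(len(sentence), len(tokens))   # 62 characters, 12 tokens
assert rebuilt == sentence          # the split round-trips exactly
```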

VigilanceChart

Vigilance decrement curve over a long review session.

<VigilanceChart />
Axes: hours into review (0h–4h) vs. accuracy (50%–100%); Human and LLM curves, with a crossover.

WeightNudge

Tiny nudges to model weights and the resulting output drift.

<WeightNudge />
Initial state: w1 = w2 = w3 = 0.50; inputs x₁ = 0.80, x₂ = 0.30, x₃ = 0.60; output = 0.28. Sliders for w1, w2, w3.

output = (x₁·w₁ + x₂·w₂ + x₃·w₃) / 3. Nudge a weight and watch the output shift.
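The formula, as a runnable sketch seeded with the component's initial values:

```python
# The component's output formula with its initial state.
x = [0.80, 0.30, 0.60]   # fixed inputs
w = [0.50, 0.50, 0.50]   # nudgeable weights

output = sum(xi * wi for xi, wi in zip(x, w)) / 3
print(round(output, 2))   # 0.28

# Nudging w1 up by 0.1 moves the output by x1 * 0.1 / 3, about 0.027.
```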