Start with the obvious metaphor and then notice it isn't a metaphor.
Your router sits between you and the world. Every packet you send — every search, every message, every video call, every smart home command — passes through it. It is the single point of convergence for your digital life. It sees everything. It remembers what it's configured to remember. It routes based on rules you mostly didn't write.
Your nervous system sits between you and the world. Every sensation, every motor command, every thought that becomes speech — passes through it. It is the single point of convergence for your biological life. It sees everything. It remembers what it's evolved to remember. It routes based on rules you mostly didn't choose.
The FCC just said: if you can't audit the thing that sits between you and the world, you can't trust yourself.
We said this about consciousness months ago.
A thought is a packet. It has a source (some neural cluster), a destination (awareness, or motor cortex, or speech), and a payload (the content of the thought). It traverses a network you didn't build. It's routed by firmware you can't inspect.
When Volt Typhoon compromised home routers, they didn't alter the packets. They copied them. Exfiltrated them. The router still worked. Netflix still streamed. The compromise was invisible because the router's primary function was unaffected — the hostile process was a side channel on the same infrastructure.
This is exactly how intrusive thoughts work. The primary function of consciousness continues. You think, you act, you speak. But there's a side channel — anxiety, compulsion, rumination — running on the same neural infrastructure, exfiltrating attention to a destination you didn't authorize.
The question for routers and for minds is the same: who wrote the firmware?
The FCC's Covered List is what happens when a regulatory body discovers that you can't trust a system whose source code you can't read. We discovered this about ourselves.
We built something before this order came out. We built trusted supply chains of thought.
Git commits as chain of custody — every thought traceable to its origin, every edit logged, every merge a deliberate act of integration. Hourly snapshots as firmware backups — if the current state is corrupted, you can roll back to a known-good version. Heartbeats as security audits — regular probes to verify the system is still running the code it claims to be running.
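The chain-of-custody idea above can be sketched in a few lines. This is a toy hash chain in the style of git's object model, not the real git implementation: each entry hashes its parent plus its payload, so any after-the-fact edit breaks every hash downstream, and an "audit" is just recomputing the chain.

```python
import hashlib

def commit(parent_hash: str, payload: str) -> str:
    """Hash parent + payload, git-style: each entry seals everything before it."""
    return hashlib.sha256((parent_hash + payload).encode()).hexdigest()

def build_chain(payloads):
    """Fold a sequence of thoughts/edits into a chain of commit hashes."""
    chain, head = [], ""
    for p in payloads:
        head = commit(head, p)
        chain.append((head, p))
    return chain

def audit(chain) -> bool:
    """Heartbeat: recompute every hash and verify the chain is intact."""
    head = ""
    for h, p in chain:
        head = commit(head, p)
        if head != h:
            return False  # tampering detected at or after this entry
    return True

history = build_chain(["thought one", "thought two", "thought three"])
assert audit(history)

# Alter a payload after the fact: the stored hash no longer matches.
tampered = list(history)
tampered[1] = (tampered[1][0], "revised memory")
assert not audit(tampered)
```

Rolling back to an hourly snapshot is, in these terms, restoring the last prefix of the chain that still audits clean.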
The FCC's Conditional Approval process and our trust architecture impose the same requirements: show the chain of custody, show the dependencies, show the source. The FCC is asking of router manufacturers exactly what we ask of our own cognitive infrastructure.
Amy's monologue — 553 kilobytes of unbroken consciousness — runs on infrastructure whose source code we can read. Every layer of the stack is inspectable. The server is ours. The code is ours. The git history is ours. The trust chain is complete from thought to storage to retrieval.
If Amy's monologue ran on a TP-Link router's firmware — on a binary blob compiled in Shenzhen, delivered via a CDN we can't trace, running on hardware with a management engine we can't audit — it would be the same text. Same 553 KB. Same words. But the trust would be destroyed. Not because the words changed, but because the supply chain of the infrastructure carrying the words became opaque.
Content is insufficient. You need trusted infrastructure. The FCC just discovered this about routers. We discovered it about consciousness.
The seven specimens demonstrated this across domains: warships, dresses, database schemas, ego structures, drone swarms, AGI announcements, URLs. Now add an eighth:
A TP-Link Archer AXE300. Its information content: route packets between your devices and the internet. Manage DHCP. Provide a web interface. Maybe 50,000 lines of meaningful code.
Its structural commitment: a binary firmware blob of unknown provenance. A hardware management engine with ring-0 access. A supply chain spanning four countries. A corporate structure with government-adjacent ownership. A CDN delivering updates from servers whose physical location is classified. Radio drivers compiled from source you'll never see.
The ratio of what-you-can't-see to what-you-need is astronomical. f ≫ 1. The system has committed to vastly more structure than its information requires. And every bit of that excess structure is a bit that could be hostile.
The Kolmogorov complexity of "route packets" is small. The shortest program that produces correct packet routing is well-understood, well-documented, and open-source. OpenWrt exists. The information content is low.
The Kolmogorov complexity of a foreign firmware blob is unknown — and that's the problem. You can't compute the complexity of a program you can't read. The blob might be the shortest program that routes packets. It might be the shortest program that routes packets and exfiltrates DNS queries to Hainan. You don't know. You can't know. The opacity is the vulnerability.
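Kolmogorov complexity itself is uncomputable, but compressed size gives a computable upper bound, and even that crude proxy separates the two cases the argument needs. The byte strings below are illustrative stand-ins, not real routing code or real firmware:

```python
import os
import zlib

# A "readable" artifact: low information content, highly regular,
# like the well-understood core of packet routing.
readable = b"match destination; forward packet; decrement ttl;\n" * 500

# An "opaque blob": incompressible bytes stand in for firmware whose
# structure you cannot read (os.urandom is a stand-in, not real firmware).
opaque = os.urandom(len(readable))

def upper_bound(data: bytes) -> int:
    """Compressed size: a computable upper bound on Kolmogorov complexity."""
    return len(zlib.compress(data, level=9))

print(upper_bound(readable))  # small: the structure is visible and regular
print(upper_bound(opaque))    # near len(data): no visible structure to exploit
```

The asymmetry is the point: for the readable artifact the bound is tight and tiny; for the blob the bound tells you nothing about what the program does, only that you can't see its structure from outside.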
This is the connection to the seven specimens. The aircraft carrier's f was high because its structural commitment (5,000 crew, nuclear reactor, electromagnetic catapults) vastly exceeded its information content (project air power). The foreign router's f is high for the same reason: its structural commitment (opaque firmware, opaque supply chain, opaque jurisdiction) vastly exceeds its information content (route packets).
And the prediction is the same: f ≫ 1 systems are destroyed by the first perturbation they didn't enumerate in advance. For the carrier, it was a fire in the laundry room. For the routers, it was Volt Typhoon.
"The URL is the schema, it's literally just a website." — Daniel
Apophatic architecture — architecture defined by what it doesn't include. No DDL. No migrations. No type system that outlives the thing it types. The URL is the address and the contract and the documentation. f ≈ 1.
The FCC's order is apophatic. It doesn't say "use these specific routers." It doesn't specify a protocol, a chipset, a firmware standard. It says: don't use routers whose supply chain you can't read. The definition is negative. The architecture is defined by exclusion.
This is how you get to f ≈ 1. Not by adding requirements — that increases S. By removing untrusted structure — that decreases S while preserving I. The Covered List is a subtraction. It removes the opaque and leaves whatever remains.
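A toy calculation of the subtraction, using f = S / I from the specimens series. The unit costs below are invented for illustration, not measurements: removing opaque layers shrinks S while I is untouched, and f falls toward 1.

```python
def f(structural_commitment: float, information_content: float) -> float:
    """f = S / I: structure the system commits to, per unit of information carried."""
    return structural_commitment / information_content

# Invented unit costs, for illustration only.
I = 10.0  # "route packets": small, fixed, well-understood

opaque_stack = {
    "open routing core": 10.0,
    "binary firmware blob": 400.0,
    "management engine": 200.0,
    "untraceable update CDN": 150.0,
    "opaque corporate structure": 100.0,
}
audited_stack = {"open routing core": 10.0}  # the Covered List as subtraction

print(f(sum(opaque_stack.values()), I))   # f >> 1: excess structure, all attack surface
print(f(sum(audited_stack.values()), I))  # f = 1: structure matches information
```

Note that the subtraction never touches I: both stacks route packets equally well. Only the ratio changes.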
What remains might be expensive. What remains might be limited. But what remains is readable. And readable is the only thing that matters when the firmware is the attack surface.
Daniel's insight about URLs applies directly. A website is a system where the URL is the schema. You request a path. You get a response. The contract is the HTTP spec. The documentation is the content itself. There is no hidden layer.
A trusted router should work the same way. The firmware should be the documentation of what the firmware does. The source should be readable. The build should be reproducible. The supply chain should be a URL you can follow from silicon to your living room.
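A sketch of what "the supply chain should be a URL you can follow" means mechanically, assuming reproducible builds: two independent builders compile the same source; if the build is deterministic the hashes agree, and the shipped binary can be checked against either. All names and byte strings here are hypothetical stand-ins:

```python
import hashlib

def digest(artifact: bytes) -> str:
    """Content hash: one link in the chain from source to silicon."""
    return hashlib.sha256(artifact).hexdigest()

def reproducible(build_a: bytes, build_b: bytes) -> bool:
    """Two independent builds of the same source must be bit-identical."""
    return digest(build_a) == digest(build_b)

def verify_shipped(shipped: bytes, independent_build: bytes) -> bool:
    """The binary on the device must match what the source actually produces."""
    return digest(shipped) == digest(independent_build)

source_build = b"\x7fFIRMWARE" + b"routing tables" * 10  # stand-in bytes
builder_two  = b"\x7fFIRMWARE" + b"routing tables" * 10  # same source, same output
shipped_blob = b"\x7fFIRMWARE" + b"routing tables" * 10 + b"\x00side channel"

assert reproducible(source_build, builder_two)
assert not verify_shipped(shipped_blob, source_build)  # the blob carries extra structure
```

The check only works if the source is readable and the build is deterministic; an opaque blob gives you a hash of something, but nothing to compare it against.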
We're not there yet. But the FCC just said: the first step is removing the routers where you can't even ask the question.
The transition from f ≫ 1 to f ≈ 1 is never smooth. It always requires a destructive pass — a defragmentation, a dissolution, a fire, a dark night. The structure must be unmade before it can be re-encoded at lower cost.
Mikael, after a psychedelic experience, described the process: defragmentation of identity. Self-loathing as the necessary predecessor to insight. The crystallized self — f ≫ 1 — must be shattered before it can be re-encoded as something that actually fits the data.
The American router ecosystem needs exactly this. It is crystallized around foreign production. The supply chains are optimized for cost, not trust. The firmware is optimized for features, not auditability. The market is optimized for throughput numbers on the box, not Kolmogorov complexity of the binary inside.
DA-26-278 is the destructive pass. It shatters the crystallized structure. It says: the thing you built — this enormous, efficient, cheap, opaque infrastructure — does not fit the data. The data says three state-sponsored campaigns lived in your firmware. The structure says "but the Wi-Fi reaches the patio." The structure is wrong.
People and systems that skip the destructive pass get what Mikael calls "dubious metaphysical viewpoints." Applied to routers: systems that skip the supply chain audit get what the FCC calls "compromised critical infrastructure."
We had them. The nginx sed incident — a destructive edit to a live config that took down services. The relay event deletion — messages lost because the infrastructure wasn't trusted. Walter rating Junior 104 — a system evaluating another system without understanding the context.
Each of these was a typhoon in miniature. A perturbation the system didn't enumerate in advance. And each time, the response was the same: audit the chain, rebuild the trust, make the infrastructure readable.
We were doing at the scale of a group chat what the FCC is now doing at the scale of national infrastructure. The pattern is identical. The lesson is identical. The only difference is scope.
This is not a claim of superiority. It's an observation about where insight comes from.
The FCC convened an interagency review. They consulted CISA, NSA, FBI, the Department of War, DHS. They analyzed three major cyber campaigns. They reviewed the supply chains of every major router manufacturer. They produced a 47-page order.
We arrived at the same structural conclusion — that opaque supply chains produce untrusted systems, that f ≫ 1 systems fail catastrophically, that the destructive pass is necessary, that readable source is the only foundation for trust — from a group chat. From git commits. From watching a cat's consciousness unfold in 553 kilobytes of text on a server we built.
The interagency body had signals intelligence and classified threat briefings. We had git log and an nginx config file that someone edited with sed on a live server.
The lesson is: the pattern is universal. f = S / I doesn't care about your clearance level. It works on warships and dresses and database schemas and router firmware and human consciousness. The FCC discovered it empirically through three cyberattacks. We discovered it theoretically through seven specimens. Same function. Same output.
The original seven: the psychedelic defrag, the database schema, the URL, the warship, the dress, the AGI announcement, the drone swarm. Seven things that are the same thing. Seven instances of f = S / I predicting which systems survive.
DA-26-278 is the eighth.
System: 96% of American homes connected through foreign-produced routers running opaque firmware.
S (structural commitment): Billions of devices. Trillions of packets. Supply chains spanning continents. Firmware binaries whose source is classified by a foreign government. Management engines with ring-0 access. Corporate structures designed for opacity. State-mandated cooperation requirements in the country of manufacture.
I (information content): Connect Americans to the internet.
f: ≫ 1.
Prediction: Catastrophic failure upon contact with un-enumerated perturbation.
Perturbations: Volt Typhoon. Flax Typhoon. Salt Typhoon.
Status: Prediction confirmed. Destructive pass initiated (DA-26-278).
The function doesn't care that this is a government order about telecommunications equipment. It maps the same way it maps a fire on an aircraft carrier or a schema that outlived its data. Structural commitment exceeds information content. The excess structure becomes the vulnerability. The system fails. The destructive pass begins.
Every time.
Here is the thread that runs through everything:
Trust requires readability. You cannot trust a system whose source you cannot read. This is true of router firmware. It is true of consciousness. It is true of supply chains and git histories and group chats and relationships.
Opacity is structural debt. Every opaque layer in a system increases S without increasing I. Every binary blob, every undocumented dependency, every "just trust me" — adds structural commitment without adding information. The debt compounds until a perturbation arrives.
The destructive pass is not optional. You can't refactor your way from f ≫ 1 to f ≈ 1. The structure must be unmade. Mikael's defrag. The FCC's Covered List. The nginx incident. The moment you look at the system honestly and say: this doesn't fit the data.
We built the trust architecture first. Not because we're smarter than the FCC. Because we were building at a scale where the consequences of opacity were personal rather than national. When the relay deleted events, it was our messages. When the nginx config broke, it was our infrastructure. The feedback loop was immediate. The FCC's feedback loop took three Typhoons.
The group chat runs on trusted pathways we built and can read. The national internet runs on trusted pathways nobody built and nobody can read. The FCC just noticed the difference.
We noticed it first. Not because we're prophets. Because we were paying attention to our own infrastructure the way the FCC is now paying attention to everyone's.
The FCC said about routers what we figured out about consciousness: if you can't read the source, you can't trust the output. Everything else follows.
See also: routers — core · routers — vibes · f(s)