A Genius in an Institution
Richard Wesley Hamming spent 30 years at Bell Telephone Laboratories. In 1950 he published the error-correcting codes that bear his name. He contributed to digital filters, numerical methods, & coding theory. In 1945 he joined the Manhattan Project at Los Alamos, where he ran the machines that checked early nuclear weapon calculations. At Bell Labs he worked alongside Claude Shannon & the inventors of the transistor — Shockley, Brattain, & Bardeen.
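The codes themselves are small enough to sketch. A minimal illustration of the smallest of them, Hamming(7,4) — 4 data bits guarded by 3 parity bits, enough to locate & correct any single flipped bit (positions & parity equations follow the standard construction; variable names are this sketch's own):

```python
# Illustrative Hamming(7,4) sketch: 4 data bits, 3 parity bits,
# any single flipped bit can be located and corrected.

def encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit codeword.
    Positions 1, 2, 4 (1-indexed) hold the parity bits."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # covers positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # covers positions 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # covers positions 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def correct(c):
    """Locate and fix a single-bit error via the parity-check syndrome."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]  # re-check positions 1, 3, 5, 7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]  # re-check positions 2, 3, 6, 7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]  # re-check positions 4, 5, 6, 7
    pos = s1 + 2 * s2 + 4 * s3      # the syndrome spells out the error position
    if pos:
        c[pos - 1] ^= 1             # flip the bad bit back
    return c

word = encode([1, 0, 1, 1])
garbled = list(word)
garbled[4] ^= 1                      # flip one bit in transit
assert correct(garbled) == word      # the code locates and repairs it
```

The trick is the positioning: each parity bit at a power-of-two position checks the bits whose position numbers include it, so the three failed checks read out the error's address in binary.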
What Bell Labs Was
Bell Labs ran on AT&T monopoly profits. Every phone call in America paid a fraction of a cent into a research budget that funded pure science without needing a near-term return. Bell Labs produced the transistor, information theory, UNIX, C, cellular telephony, & the laser — all inside a single institution funded by a mandated monopoly.
The cold war shaped its priorities. The U.S. military needed error-correcting codes for communications in nuclear-contaminated environments. It needed digital filters for radar. It needed reliable computation for missile guidance. Bell Labs delivered. Hamming's career ran inside this frame: knowledge produced inside walls, for patrons with specific geopolitical needs.
What Hamming Carried Forward
In 1986, a decade after leaving Bell Labs, Hamming gave the lecture 'You and Your Research' at Bellcore. In 1995 he taught a graduate course called 'Hamming on Hamming' at the Naval Postgraduate School. Both distilled 30 years of observation into principles that outlast their context:
- Work on important problems. 'If what you're doing is not important and not likely to lead to important things, why are you doing it?'
- Keep a list of 10 to 20 important problems. Review it regularly. When a new technique appears, check whether it cracks one of your open problems.
- Compound knowledge. Knowledge grows like interest. A small investment in fundamentals compounds over a career; a large investment in peripheral skills depreciates.
- You get what you measure. Any metric becomes a target once it governs decisions; the target then diverges from the underlying goal it was meant to track (now called Goodhart's Law).
- Creativity through analogy. Most breakthroughs transfer a successful structure from one domain to another. Train yourself to see structural resemblances across fields.
- Systems over components. Optimizing a component at the expense of the system produces a worse system. Hamming watched this failure repeat throughout his career.
These principles survive their cold war packaging. They remain useful whether you work inside an institution or outside one, whether you work for a patron or for a commons.
Your List
Hamming kept his list of important problems running throughout his career. He said:
> Most great scientists have 10 to 20 important problems they keep in their mind. They have them written down somewhere. They work on them whenever they can. When a new technique appears, they check it against the list.
The list serves as a readiness filter. Without it, a new technique is just information. With it, the same technique may crack an open problem you have carried for years.
What Carries Forward
To summarize what survives the cold war frame:
Compound knowledge. This holds regardless of institutional context. A person who spends 20 minutes per day reading at the edge of their field for 10 years accumulates a compounding advantage. The mechanism: each new concept lands on existing structure, creating more connection points for the next concept.
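Hamming's own analogy was compound interest, & the arithmetic is easy to check. A toy model (the 0.1% daily improvement rate is an illustrative assumption, not Hamming's figure):

```python
# Toy model of compounding knowledge: a small constant edge applied
# daily for a decade. The 0.1% daily rate is an illustrative assumption.
days = 10 * 365
rate = 0.001                      # 0.1% more effective per day of study
advantage = (1 + rate) ** days
print(f"{advantage:.1f}x")        # roughly 38x over ten years
```

The same edge applied linearly (0.1% x 3,650 days) yields under 5x; the compounding, not the effort, does the work.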
Systems thinking over component optimization. A database optimized in isolation that blocks the application server produces a slower system. A curriculum optimized for test scores that drains student curiosity produces a worse educational outcome. Hamming's warning applies at every scale.
Creativity through analogy. Hamming observed that most of his own breakthroughs came from seeing that a problem in one domain had the same structure as a solved problem in another. Error-correcting codes drew on parity ideas from simpler domains. Digital filters drew on continuous mathematics applied to discrete sequences.
You get what you measure. Organizations that measure lines of code produce code. Organizations that measure test scores produce test-takers. The gap between metric & goal widens as the metric gains authority.
These four principles require no patron, no monopoly, no cold war. They apply in a university library, a small shop, a commons-maintained open-source project, or a kitchen.
Knowledge as Weapon
Hamming's era treated knowledge as competitive advantage. Bell Labs produced knowledge that AT&T & the U.S. military needed before rivals did. Publishing happened after patents were filed, after military applications were secured. The model: produce knowledge inside walls, protect it, deploy it.
This frame produced real results. The transistor, UNIX, information theory — all genuinely transformative, all produced inside this model. The frame worked for its purpose.
What the Frame Excluded
Open-source as research methodology. Hamming never engaged with the idea that publishing source code alongside a paper could accelerate research faster than keeping it proprietary. In his era, code was a byproduct. Linus Torvalds published the Linux kernel in 1991, four years before Hamming's course. The idea that 10,000 contributors could maintain a codebase more reliably than a 300-person team inside a corporation — this did not appear in Hamming's thinking.
Eight forms of capital. Hamming measured success in publications, breakthroughs, & career longevity. He never discussed living capital (the health & attention of researchers), social capital (the trust networks that make collaboration possible), cultural capital (the shared stories that transmit values across generations), or spiritual capital (the sense of meaning that sustains long work). He measured two of eight.
Algorithmic complexity as a fundamental. Hamming's course covered digital filters, simulation, coding theory, & n-dimensional geometry. He never taught Big O notation. In his era, N was small enough that the difference between O(N) & O(N²) rarely mattered. In the era his students would inhabit, it mattered enormously. This lesson is extended in unhamming_algorithmic_complexity.
Permaculture: growing vs extracting. Bell Labs extracted from a monopoly rent. The model required an entity with power to concentrate capital & direct research. The alternative — regenerative infrastructure that grows capacity across many nodes rather than concentrating it in one — had no place in Hamming's frame.
The Spy/Spy Problem
Hamming's era optimized for advantage over an adversary. The cold war made this explicit: the U.S. & USSR competed in every domain. Each side's researchers worked to outperform the other's. The game: zero-sum. Your gain, their loss.
Zero-sum games produce specific behaviors: secrecy, classification, patents, restricted publishing, institutional walls. All rational within the game. All wasteful outside it.
When two sides optimize to beat each other, neither side optimizes to grow a shared board that makes the game unnecessary. The resources spent on duplication, secrecy, & competitive signaling produce nothing for the commons.
Hamming's advice ('work on important problems') implicitly assumed the game was zero-sum: important problems earned institutional credit, funding, & prestige within a competitive landscape. The advice remains valid. The frame does not transfer.
A researcher working on open infrastructure, building a commons, contributing to a shared codebase — this person cannot optimize for beating a rival. There is no rival. The game: grow the board, not your position on it.
Same Fire, Different Flight
A dragon does not choose its cave or its patrons. Hamming did not choose Bell Labs or the cold war. He worked where he was, with the resources available, toward the problems he could see.
Knowledge outlasts context. Hamming's error-correcting codes run in every USB drive, every satellite transmission, every hard disk. He never imagined these applications. The mathematics did not require his imagining them.
Unhamming starts from this observation: separate what Hamming proved from the frame that packaged it. Then extend with what his frame could not see.
What Unhamming Adds
Open-source as commons. Working on important problems does not require institutional backing. A person with a laptop, a public repository, & a specific open problem contributes to a commons that compounds for everyone. Hamming's compound-knowledge principle applies at ecosystem scale, not just individual scale.
Cooperative infrastructure. The permacomputer model: every node is a workstation, every edge a queue. Unblocking one node without staging downstream capacity creates a new bottleneck. This extends Hamming's systems thinking: not just 'optimize the system, not the component,' but 'map the flow before you remove a constraint.'
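That last clause — map the flow before you remove a constraint — can be sketched with a toy flow model (stage names & rates are illustrative assumptions):

```python
# Sketch of constraint mapping: a pipeline's throughput is capped by its
# slowest stage, so speeding up any other stage is wasted effort.
# Stage names and rates (items/hour) are illustrative assumptions.

def throughput(stages):
    return min(stages.values())

def bottleneck(stages):
    return min(stages, key=stages.get)

stages = {"ingest": 120, "transform": 45, "store": 90}
assert bottleneck(stages) == "transform"
assert throughput(stages) == 45

stages["ingest"] = 500                # optimize a non-bottleneck node...
assert throughput(stages) == 45       # ...system throughput does not move

stages["transform"] = 100             # relieve the actual constraint...
assert bottleneck(stages) == "store"  # ...and the bottleneck shifts downstream
assert throughput(stages) == 90
```

The last two assertions are the permacomputer point: unblocking one node without staging downstream capacity does not remove the constraint, it relocates it.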
Algorithmic complexity as a fundamental. Hamming's test for a fundamental: has it lasted? Can the rest of the field be derived from it? Big O passes both. Growth-rate analysis has lasted since Knuth. From it, you derive algorithm selection, data structure choice, & performance prediction — most of practical computer science. Hamming missed this chapter. We write it.
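The point about N can be made concrete with a rough average-case cost model (an illustrative sketch, not a measurement): N membership checks against a list cost on the order of N² comparisons, while the same checks against a set cost on the order of N hash probes.

```python
# Growth-rate analysis as a prediction tool: model the comparison cost
# of N membership checks against a list (O(N) each, O(N^2) total)
# versus a set (O(1) average each, O(N) total). Rough model, not timing.

def list_lookup_cost(n):
    # each lookup scans ~n/2 items on average; n lookups total
    return n * n // 2

def set_lookup_cost(n):
    # each lookup is a constant-time hash probe on average
    return n

for n in (1_000, 1_000_000):
    ratio = list_lookup_cost(n) / set_lookup_cost(n)
    print(f"N={n:>9,}: list costs ~{ratio:,.0f}x more than set")
```

At N = 1,000 — a large problem in Hamming's era — the gap is 500x; at N = 1,000,000 it is 500,000x. The analysis predicts the data-structure choice before any code is run, which is exactly what a fundamental is for.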
All eight forms of capital. Measuring only publications & patents leaves six forms of capital invisible. A research practice that drains living capital (researcher health, attention, sleep) to maximize intellectual capital (publications) optimizes one form of eight while depleting another. Hamming's 'work nights & weekends' advice collapses under this accounting.
The dragon's fire remains: work on important problems, compound your knowledge, think in systems, create by analogy, measure what matters. The flight changes: no patron required, no adversary required, no institutional wall required.