The part of the documentation most tools don't include.
PHALUS automates the process of clean room reimplementation — producing functionally equivalent software without copying source code. This technique has legal precedent going back to Phoenix Technologies' 1984 IBM BIOS clone and the principle from Baker v. Selden (1879) that copyright protects expression, not ideas.
What used to take months of engineering effort now takes minutes of compute time. That's the part that should concern you.
Open source licenses have worked because reimplementation was expensive enough to be impractical. A copyleft license like the GPL enforced sharing because the alternative, writing the code yourself, cost more than complying. AI has collapsed that cost asymmetry.
If anyone can reimplement any library in minutes under any license they choose, the enforcement mechanism that makes open source licenses meaningful is effectively gone. This is not a theoretical concern. It's happening now.
The capability exists whether or not PHALUS exists. Malus demonstrated this at FOSDEM 2026. The technique is straightforward: two LLM calls with an isolation barrier. Anyone with an API key can build this in an afternoon.
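The two-call pattern can be sketched in a few lines. This is an illustrative skeleton, not PHALUS's actual API: the prompt text, the `clean_room` function, and the `llm` callable are all assumptions, standing in for whatever model client you have.

```python
# Sketch of the two-LLM-call clean room described above.
# `llm` is any callable mapping a prompt string to a response string;
# all names and prompts here are hypothetical, not PHALUS's real interface.

SPEC_PROMPT = (
    "Describe the observable behavior of this code as a neutral "
    "specification. Do not quote identifiers or code verbatim:\n\n"
)
IMPL_PROMPT = (
    "Implement software matching this specification exactly. "
    "You have never seen the original code:\n\n"
)

def clean_room(source: str, llm) -> str:
    """Reimplement `source` via two isolated model calls."""
    # Agent A: sees the original source, produces only a specification.
    spec = llm(SPEC_PROMPT + source)
    # Crude isolation check: the spec must not contain the source verbatim.
    assert source not in spec, "spec leaked verbatim source"
    # Isolation barrier: Agent B receives the spec, never the source.
    return llm(IMPL_PROMPT + spec)
```

The barrier is purely structural: the second call's prompt simply never includes the original code. That is the whole trick, and why an afternoon suffices.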
PHALUS makes the process visible, auditable, and discussable. It's harder to have an honest conversation about a threat you can't see. This tool is the threat made concrete.
The strongest argument against AI clean rooms: the contamination happens at the training phase, not the inference phase. If the LLM was trained on the original source code, its output may inadvertently reproduce patterns, variable names, or algorithmic structure from that training data.
The isolation firewall between Agent A and Agent B provides process isolation, but it cannot undo what the model already knows. This is a fundamental limitation that no amount of checksumming can resolve.
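To see why surface checks fall short, consider a hash comparison, the simplest possible contamination test. The `gcd` snippets below are invented for illustration: the checksums differ, so the check reports "clean," even though the reimplementation is the original algorithm with every name swapped.

```python
import hashlib

# Hypothetical original and "reimplementation": identical structure,
# different identifiers. Not taken from any real codebase.
original = "def gcd(a, b):\n    while b:\n        a, b = b, a % b\n    return a"
reimpl   = "def gcd(x, y):\n    while y:\n        x, y = y, x % y\n    return x"

def looks_clean(a: str, b: str) -> bool:
    """A checksum test only detects byte-identical copies."""
    return (hashlib.sha256(a.encode()).hexdigest()
            != hashlib.sha256(b.encode()).hexdigest())
```

A trivial rename defeats the hash, and training-time contamination manifests exactly as this kind of structural echo rather than verbatim copying. That is the sense in which no amount of checksumming can resolve the problem.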
Before using this tool on real dependencies, ask yourself:
Does the original maintainer deserve compensation or credit for their work?
Is the license they chose unreasonable, or are you just trying to avoid its obligations?
Would you be comfortable if the maintainer saw what you're doing?
If the answer to that last question is no, the tool is working exactly as intended — as a mirror.
Malus — Clean Room as a Service (Dylan Ayrey & Mike Nolan, FOSDEM 2026)
Phoenix Technologies BIOS clean room (1984)
Baker v. Selden, 101 U.S. 99 (1879)
Google v. Oracle, 593 U.S. 1 (2021)