AI can rewrite open source code—but can it rewrite the license, too?
Computer engineers and programmers have long relied on reverse engineering as a way to copy the functionality of a computer program without copying that program's copyright-protected code directly. Now, AI coding tools are raising new questions about how that "clean room" rewrite process plays out legally, ethically, and practically.
Those questions came to the forefront last week with the release of a new version of chardet, a popular open source Python library for automatically detecting character encoding. The library was originally written by coder Mark Pilgrim in 2006 and released under an LGPL license that placed strict limits on how it could be reused and redistributed.
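What chardet does can be illustrated with its long-standing top-level API: hand it raw bytes of unknown provenance, and it guesses the character encoding. This is a minimal usage sketch of that public interface, not code taken from the new 7.0 rewrite:

```python
# Minimal sketch of chardet's core use case: guessing the character
# encoding of raw bytes when no out-of-band metadata is available.
import chardet

# Bytes received from a file or network, encoding unknown to the receiver.
raw = "Comment ça va ? Très bien, merci.".encode("utf-8")

# detect() returns a dict with the best guess and a confidence score.
result = chardet.detect(raw)
print(result["encoding"], result["confidence"])
```

In practice, callers use the guessed encoding to decode the bytes into text, which is why accuracy and speed matter so much for a library like this.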
Dan Blanchard took over maintenance of the library in 2012 but waded into controversy last week with the release of chardet version 7.0. Blanchard described that overhaul as "a ground-up, MIT-licensed rewrite" of the entire library, built with the help of Claude Code to be "much faster and more accurate" than what came before.