
Sigrid Jin was waiting to board a plane when he saw the stunning news: the artificial intelligence startup Anthropic had accidentally leaked the source code of Claude Code, its popular AI agent. Mr. Jin, 25, a university student, quickly posted a copy online. His worried girlfriend texted him: Had he broken copyright law?
Mr. Jin turned to a team of AI assistants for a solution. He directed them to rewrite the leaked code in another programming language, then shared that version online. Within hours, more than 100,000 people had liked or linked to it.
Anthropic, one of the leading artificial intelligence companies alongside OpenAI, said the leak was caused by human error and demanded that GitHub, an online library of computer code, remove posts sharing the code, citing copyright infringement. Thousands of posts have been removed. But Mr. Jin’s version remains online. He said Anthropic didn’t ask him to take it down.
It’s unclear whether Anthropic, which did not respond to questions from The New York Times, considers the rewritten code any different from the original. Mr. Jin said he believed the rewrite had turned it into a new work that Anthropic could not claim ownership of.
He said he was driven less by money or fame than by a desire to make a larger philosophical point. What is the value of copyrighted intellectual property in an age when artificial intelligence can easily replicate not only computer code but also art, music and literature in minutes?
“I wanted to raise some ethical questions in the era of AI agents,” he said. “Any creative work can be reproduced in seconds.”
Computer code has long been considered a protected creative work, similar to music or art. But copyright enforcement has been difficult because the software’s basic computational instructions can be copied or modified in ways that are hard to trace. Even what is considered protected has been a matter of debate. Google and Oracle have waged a legal battle for years, arguing over where to draw the line between creative expression and core software functionality.
Now new technology makes it even more complicated.
When the Anthropic leak appeared online, Mr. Jin and his friends were already treating AI helpers like Claude Code and OpenClaw as employees who handle daily tasks. These agents don’t just answer questions; they carry out tasks on their own when prompted with a goal, such as “organize my receipts” or “create a new social media post.”
Agents also make copying and imitation easier than ever before, and on a much larger scale.
For many software companies, as well as for authors, artists and musicians, the risk is not just direct copying. It is that the market for their work could be flooded with AI-generated substitutes that cost next to nothing to produce, reducing the value of the original work.
“What happened with the Claude Code leak is basically a preview of what’s to come for every creative industry,” said Russ Pearlman, an AI and technology attorney and chief information officer at Dallas College. Existing copyright rules, he said, were built on the assumption that copying takes time and that there is a meaningful window in which steps can be taken to protect the work.
“When an AI agent can rewrite 512,000 lines of code into another language before most people finish their morning coffee, that assumption crumbles,” he said.
In 2022, the U.S. Copyright Office stated that works created solely by artificial intelligence, without human creative input, do not qualify for copyright protection. A subsequent review reaffirmed that position and found that a simple human prompt was not enough. But the courts have yet to decide how much human involvement is necessary.
“Artists and musicians are extremely concerned about this,” said Yelena Ambartsumian, founder of Ambart Law, a New York firm that advises start-ups on artificial intelligence, intellectual property and other issues. “All the resources you put into being able to copyright your human expression, does it really matter if, in a second or two, that expression can be copied and then changed?”
Many popular AI models have been trained to produce human-like prose by ingesting vast amounts of material published online. Artists, authors and media companies have said that AI firms have infringed their copyrights by using their work to train their systems.
Last year, Anthropic agreed to pay $1.5 billion to a group of authors and publishers in the largest settlement in U.S. copyright history after a judge ruled that it had illegally downloaded and stored millions of copyrighted books. Anthropic argued that rather than replicating a creator’s exact work, its systems analyze underlying patterns in that work to create something new.
(The New York Times sued OpenAI and Microsoft in 2023, accusing them of copyright infringement over the use of its news content in AI systems. Both companies denied the claims.)
“A library of everything that has been written has already been put into AI,” said Kandis Koustenis, an attorney specializing in intellectual property at Bean, Kinney & Korman in Virginia. “From the author’s perspective, the genie is kind of out of the bottle.”
Since the advent of personal computers, technology companies have devised ways to recreate software similar to a competitor’s without infringing copyright, including “clean room” techniques designed to ensure that programmers never directly copy the original code.
Mr. Jin said he had used a comparable approach to rewrite the Anthropic code, using AI agents rather than human programmers.
This distinction has not been tested in court.
While some AI companies, including Anthropic, closely guard the inner workings of their systems, others have embraced open source on the idea that transparency makes AI safer and accelerates innovation.
As agents make it easier to replicate such work with minimal human input, original creativity becomes more valuable, Mr. Jin said. His goal was not to create something new, but to point out how few truly novel ideas remain.
“Now we rely on models that rely on ideas that came out of other people’s heads,” Mr. Jin said. “It’s still difficult to create something truly novel.”
