Owning the Output: Copyright Challenges in an AI-Generated World
- WAI CONTENT TEAM
- Sep 29
- 4 min read
Today, artificial intelligence (AI) tools are producing stories, art, and music. This creative surge has also sparked copyright debates at a global level. Courts and policymakers worldwide are grappling with questions of authorship and ownership, as traditional copyright law does not map neatly onto AI-generated works. Some jurisdictions require human involvement for copyright protection, while others are exploring or adopting more flexible standards for AI-assisted works. Laws also differ on whether using copyrighted material to train AI is permissible, with some countries allowing it to encourage innovation and others imposing stricter controls to protect creators’ rights. As technology outpaces legal frameworks, stakeholders are calling for new legislation that balances innovation with fair compensation and protection for human creators.
This article is written by Jean Gan. Jean is a senior in-house legal counsel with 15 years of experience across APAC, specialising in contracts, compliance, and cross-border transactions. She is completing a Global MBA, preparing to qualify as a solicitor of England and Wales, and pursuing a PhD in Law focused on AI and dispute resolution. She also runs Beyond the Clauses on LinkedIn, where she shares insights on legal strategy and AI, as well as How to Legal AI, where she provides practical tips and guidance on navigating AI in the legal world.
In 2025, artificial intelligence has become a creative force. Tools like ChatGPT, Midjourney, and Stable Diffusion are producing stories that make literary shortlists, art that fills galleries, and music that tops streaming charts. Yet this creative boom comes with a pressing legal question: who owns the output?
The rise of AI-generated content has pushed copyright law into new territory. Courts, policymakers, and industry leaders are wrestling with issues of authorship, ownership, and whether training AI models on copyrighted works is lawful. The answers are far from settled, but the debates already carry major consequences for creators, developers, and platforms.
The Ownership Dilemma
Copyright law has always centred on human creativity. It protects the labour, originality, and expression of authors, artists, and musicians. Generative AI challenges this picture. Can a work generated largely by an algorithm qualify as “original”? If so, who holds the rights: the person prompting the tool, the developer who built it, or no one at all?
Alongside this sits a second, equally important challenge. AI systems are trained on enormous datasets that often include copyrighted books, images, and music. Is this a fair and transformative use, or does it amount to infringement on a global scale?
Key Legal Developments in 2025
United States
The U.S. Copyright Office has reaffirmed that copyright requires human authorship. Purely machine-generated works remain unprotectable, but blended creations that involve meaningful human editing or curation may qualify.
On training data, the Office’s May 2025 report stated that using copyrighted material without permission could constitute infringement, although fair use arguments might apply in limited circumstances. This has fuelled lawsuits such as The New York Times v. OpenAI, which alleges mass scraping of articles, and several author-led class actions.
Courts have delivered mixed verdicts. Some ruled that training qualifies as transformative fair use, while others questioned whether this reasoning holds when vast quantities of copyrighted works are involved. Settlements, such as Anthropic’s agreement with authors to compensate and disclose training data, highlight the industry’s growing pressure to adapt.
United Kingdom
The UK is exploring reforms. In early 2025, the government floated a proposal to introduce an exception allowing AI training on copyrighted works without prior permission. Advocates argue it would support innovation, while artists and rights holders warn it could undermine creative industries.
The UK also retains a distinctive statutory provision that assigns authorship of computer-generated works to the “person who makes the arrangements.” Once obscure, this clause is now under scrutiny in light of generative AI.
Asia
Japan has embraced a permissive approach since 2018, allowing copyrighted works to be used in AI training even for commercial purposes. While this has spurred innovation, it has also raised concerns among creators.
China is moving toward a hybrid model. In 2025, courts in Beijing recognised copyright protection for AI-assisted works that demonstrated originality and significant human input. At the same time, lawsuits targeting training practices are increasing, signalling a shift toward stronger protections for rights holders.
Practical Implications
For creators: AI poses risks of dilution and unauthorised use of works. Many are seeking collective action or licensing frameworks to safeguard their rights.
For developers: Transparent sourcing and ethical training practices are no longer optional. They are becoming essential to limit liability and maintain trust.
For platforms: Content moderation responsibilities are expanding, with greater scrutiny on AI-generated uploads and attribution practices.
For users: Those who rely on AI to create content must recognise that outputs may not enjoy full copyright protection, limiting exclusivity and commercial certainty.
Rethinking Copyright Frameworks
The law is being stretched by technologies it was not built to address. Some scholars and policymakers are now proposing new models, including:
Shared authorship frameworks crediting both human input and technological contribution
Licensing systems that ensure rights holders are compensated when their works are used in training
International cooperation to reduce the growing patchwork of conflicting rules
Until clearer frameworks are in place, disputes will continue, leaving creators and businesses navigating uncertainty.
Conclusion
Artificial intelligence is redefining creativity, but copyright law has yet to catch up. The challenge is to find rules that protect human innovation while allowing technology to flourish. Striking this balance will determine whether AI enhances the creative ecosystem or erodes it.
At Women in AI, we believe conversations like these are essential. As stakeholders, from artists to policymakers, we must work together to ensure a future where innovation and fairness go hand in hand.
We want to hear your perspective. Do you think current copyright frameworks can adapt, or is it time for an entirely new approach? Share your thoughts in the comments.
Collaborate with us!
As always, we appreciate you taking the time to read our blog post.
If you have news relevant to our global WAI community or expertise in AI and law, we invite you to contribute to the WAI Legal Insights Blog. To explore this opportunity, please contact WAI editors Silvia A. Carretta, WAI Chief Legal Officer (via LinkedIn or silvia@womeninai.co), or Dina Blikshteyn (dina@womeninai.co).
Silvia A. Carretta and Dina Blikshteyn
- Editors
