Contest Over for AI-Generated Content in Street Fighter 6’s Fan Art Contest
The competitive fighting game community found itself in an unexpected bout this week when Capcom disqualified a finalist in its Street Fighter 6 Art Contest after allegations of AI-generated artwork surfaced. The controversy centered on a Kimberly illustration submitted by X account @lilithascends. This incident illuminates critical legal questions about AI art, intellectual property rights, and contest governance that the gaming industry must grapple with as generative AI becomes increasingly sophisticated and widely adopted.
The Finishing Move: Contest Rules and AI Prohibition
Capcom’s second Street Fighter 6 fan art contest began in June, inviting fans to submit artwork featuring playable characters for potential inclusion as “New Challenger” screens in the game. One of the rules explicitly prohibits the use of AI-generated art—a provision that proved prophetic.
The legal significance of contest rules cannot be overstated. These terms form a binding contract between the contest sponsor and participants. When participants submitted entries, they agreed to abide by all stated rules, thereby creating enforceable obligations. Violations can result in disqualification and potential civil liability for breach of contract. In this case, Capcom announced it would disqualify an entry for violating “Section 5 pertaining to: Entry Submission and Design Requirements,” though the company stopped short of naming the specific artwork.
Copyright’s Human Touch: Why AI Art Lacks Legal Protection
The prohibition against AI-generated art in contests reflects broader copyright law principles. Under both U.S. and international law, copyright protection requires human authorship. The U.S. Copyright Office's 2025 report reiterates that AI-generated outputs, absent meaningful human creative input, lack the human authorship required for protection under the Copyright Act.
This principle has been consistently upheld in federal courts. In Thaler v. Perlmutter, Judge Patricia Millett of the U.S. Court of Appeals for the D.C. Circuit affirmed that “authors are at the center of the Copyright Act.” While the Copyright Act does not define “author,” courts have consistently interpreted the term to require a human creator. The implications for contest submissions are clear: purely AI-generated artwork cannot be copyrighted, meaning it enters the public domain immediately upon creation.
However, the analysis becomes more complex when considering hybrid works. If AI is used as a tool within a broader creative process, such as an artist generating AI-based sketches and then painting over them or digitally arranging elements into a larger work, the human-authored portions may be eligible for protection. This distinction is critical for contest organizers and participants alike, as it determines whether submitted works can be properly licensed and used as intended.
Trademark and Unfair Competition Concerns
Beyond copyright, AI-generated contest entries raise trademark and unfair competition issues. Artists pointed out that the suspected AI submission appeared alongside entries that “appeared to be lifted or traced from other sources”—a practice that could give rise to infringement and unfair competition claims.
Moreover, using AI to mimic specific artists’ styles presents additional legal risks. In the ongoing Andersen v. Stability AI litigation, artists allege that AI platforms enable users to mimic distinctive aesthetic styles without authorization, generating artworks that are “indistinguishable” from their copyrighted works. Contest organizers must now consider whether AI-generated entries that replicate recognizable artistic styles could expose them to secondary liability claims.
Detection Challenges and Evidentiary Standards
The Street Fighter 6 incident highlights the practical difficulties of enforcing AI prohibitions. Artists noted that the submitting account’s creation “seemingly coincid[ed] with the beginning of the Street Fighter contest.” The account holder’s deletion of their social media profiles immediately after facing scrutiny only added to the backlash.
From a legal perspective, proving AI usage presents evidentiary challenges. Unlike plagiarism, which can be demonstrated through side-by-side comparison, AI detection typically relies on technical analysis and circumstantial evidence. Contest organizers must establish clear standards of proof and investigation procedures to ensure fair and legally defensible disqualification decisions. Here, Capcom appears to have anticipated the risks posed by AI, as reflected in its contest terms.
Legislative Developments and Future Frameworks
The regulatory landscape for AI and creative works is rapidly evolving. The Generative AI Copyright Disclosure Act of 2024, introduced in the U.S. House of Representatives, would require companies developing generative AI models to disclose the datasets used to train their systems. While this legislation primarily targets AI developers, it signals growing congressional interest in transparency and accountability measures that could eventually extend to contests and finished products.
For the gaming industry specifically, companies should consider implementing:
- Enhanced verification procedures: Requiring contestants to provide work-in-progress files, time-lapse recordings, or other documentation proving human authorship.
- Clear definitional frameworks: Precisely defining what constitutes “AI-generated” (not copyright protected) versus “AI-assisted” (may be copyright protected) work in contest rules.
- Graduated penalty structures: Establishing proportionate consequences for different levels of AI usage.
- Appeal mechanisms: Creating fair and transparent processes for challenging disqualification decisions.
Conclusion: Leveling Up Legal Frameworks
The Street Fighter 6 controversy represents more than just a single disqualified entry—it’s a preview of legal challenges that will increasingly confront the gaming industry. As generative AI tools become more sophisticated and accessible, contest organizers must evolve their legal frameworks accordingly.
The incident also underscores the importance of community vigilance. Other artists’ scrutiny and technical analysis led to the discovery of potential rule violations, demonstrating how peer review can supplement official enforcement mechanisms.
Moving forward, the gaming industry must balance innovation with integrity—ensuring that human creativity remains at the heart of artistic expression while adapting to technological change. Clear rules, robust enforcement mechanisms, and thoughtful legal frameworks will be essential to maintaining fair play in an AI-enhanced creative landscape. As Capcom’s swift action demonstrates, the industry is beginning to grapple with these challenges—but the legal fight over AI-generated art has only just begun.