The Midjourney Lawsuit: What Designers and AI Builders Must Learn from Disney and Universal’s Legal Strike
Jeremy Kolb · Jul 18 · 3 min read

This week, Disney and Universal filed a landmark lawsuit against AI image generator Midjourney, alleging widespread copyright infringement. They claim the platform enables users to generate unauthorized images of their most iconic characters—from Ariel to Bart Simpson to Darth Vader—without consent or compensation.
It’s the first major showdown between Hollywood and a generative AI company. But it won’t be the last.
As a designer, a builder, and someone who works closely with AI-driven tools, I find myself pulled in multiple directions. The possibilities of this tech are thrilling, but the questions it raises about ownership, creativity, and what we owe the artists whose work built our cultural landscape are just as urgent.
Let’s unpack the deeper challenge here—and what it means for the way we design and deploy AI.
Training on the Internet Isn’t Neutral—It’s a Choice
One of the core defenses AI companies offer is that their models train on vast swaths of public data: images, videos, text—whatever's available. But let's be honest: "public" doesn't mean "unprotected."
Training on copyrighted material without consent is a legal gray zone that’s only now being tested. Midjourney, like many others, built its model using images that include countless pieces of copyrighted content. They argue that their outputs are novel, not derivative. But is that really true when a user can type in “Mickey Mouse in Blade Runner” and get something instantly recognizable?
It’s not a stretch to say that what’s being generated is downstream from someone else’s labor—Disney animators, Pixar artists, comic illustrators—none of whom were consulted, credited, or paid.
At the same time, human artists are also shaped by what they see. Every painter is influenced. Every designer draws inspiration. So where do we draw the line between learning and copying—between homage and theft?
We don’t have a good answer yet. That’s the real tension at the heart of this case.
Ethics in AI Isn’t Just Legal—It’s Strategic
Here’s what I believe: Responsible AI isn’t just the right thing to do—it’s the smart thing.
As AI becomes more embedded in the creative process, ethical transparency will become a UX differentiator. Users want to know: Who made this? What was it trained on? Who benefits from this system?
Ignoring those questions may speed up go-to-market, but it creates risk—reputational, legal, and cultural. That’s especially true when you’re building tools meant to scale across industries built on IP: media, fashion, publishing, marketing.
Designers should start building with consent and traceability in mind. Imagine if your tool could surface the artists who influenced a style—or even route attribution and compensation back to them. That’s not a limitation. That’s a feature. It builds trust.
The Creative Commons Wasn’t Meant for Machines
One of the underdiscussed challenges is this: our current copyright frameworks weren’t built with generative AI in mind. Copyright law assumes a world where humans create for humans, and violations are clear—copy/paste, remix, resale.
But what happens when a machine learns from thousands of works at once, and outputs something “new,” but undeniably derivative?
Is it theft? Is it innovation? Is it both?
This lawsuit forces that conversation into the open. On one hand, we don’t want to stifle innovation by banning all AI training. On the other, if AI tools can be trained on an entire generation of artwork and then used to replace the artists who made it—how is that fair?
The real issue isn’t whether AI can generate beautiful images. It’s whether it should be allowed to do so in ways that undermine the economic and creative ecosystem it relies on.
Final Thought: Respect Is the Baseline, Not the Bonus
I’m optimistic about what AI can do for creativity. But that optimism comes with a caveat: if we don’t embed ethical principles now—consent, attribution, transparency—we’re going to replicate the same extractive patterns that defined the last tech boom.
AI is a tool. Like any tool, it reflects the values of those who build it. We have the chance to shape those values now—before the courts do it for us.
This lawsuit isn’t just a battle between tech and Hollywood. It’s a moment of reckoning for everyone working in the creative economy.
If you’re building with AI, ask yourself: Does your product expand creative possibility—or does it erode the foundation it’s built on?
The future will reward those who can answer that question with clarity, humility, and respect.