George R.R. Martin and other authors sue OpenAI for copyright infringement
Martin and other popular authors said OpenAI stole their copyrighted work.
GRRM is worried AI will finish writing his books before him
We could teach ducks to write and they will finish before him.
Would you rather it was finished by 100 duck sized George RR Martins or a George RR Martin sized duck?
And then everyone would hate it because it wasn't good, like what happened with the last seasons of GoT.
Another moment in A Dream of Spring involved Bran receiving a vision that The Wall was not just a physical barrier, but a mystical shield holding back the Night King's power. "This twist fits well within the universe and raises tension for the remainder of the story," Swayne remarks.
That's just a popular fan theory that has been discussed countless times on various forums.
I guess we can conclude that ChatGPT has been reading a lot of reddit.
He still hasn't released TWOW yet? Seriously, I feel sad for his fanbase.
Most of us have reached the Acceptance stage now.
Actually getting a good ending to ASOIAF after GRRM dies is gonna be one of the big turning points that transforms everyone's opinion on AI.
It's gonna be like fan edits for movies. People will debate which is the better version of the story. The only person hurt by this is George, who will be dead and was never going to finish the books anyways.
Since he will never finish the next book, then that's very likely given infinite time :)
I mean… AGOT might be in the public domain before TWOW comes out.
The authors added that OpenAI’s LLMs could result in derivative work “that is based on, mimics, summarizes, or paraphrases” their books, which could harm their market.
Ok, so why not wait until those hypothetical violations occur and then sue?
People can do that too, are they gonna sue all people?
I have nipples Greg, could you sue me?
Because suing first addresses the potential outcome of what could happen based on what OpenAI is doing right now. Kind of like how safety regulations are intended to prevent future problems based on what has happened previously, but expanded to cover similar potential dangers instead of waiting for each exact scenario to happen.
The difference is that you're trying to sue someone based on what could happen. That's like suing some random author because they read your book and could potentially write a story that copies it.
LLMs are trained on writings in the language and learn how to structure sentences from their training data. Do AI models plagiarize any more than someone using their understanding of the English language is plagiarizing when they construct a brand new sentence? After all, we learn how to write by reading the language and learning the rules; is the training data we read as kids being infringed whenever we write about similar topics?
When someone uses AI to plagiarize, sue them into eternity for all I care, but no one seems concerned with the implications of trying to sue someone/something for training an intelligence by letting it read publicly available written works. Reading and learning isn't allowed because you could maybe one day possibly use that information to break copyright law.
But if OpenAI cannot legally be inspired by your work, the implication is humans can't either.
It's not how copyright works. Transformative work is transformative.
Safety regulations are created by regulatory agencies empowered by Congress, not private parties suing each other over hypotheticals.
Because that is far harder to prove than showing OpenAI used his IP without permission.
In my opinion, it should not be allowed to train a generative model on data without permission of the rights holder. So at the very least, OpenAI should publish (references to) the training data they used so far, and probably restrict the dataset to public-domain and opt-in works for future models.
I don't see why they (authors/copyright holders) have any right to prevent use of their product beyond purchasing. If I legally own a copy of Game of Thrones, I should be able to do whatever the crap I want with it.
And basically, I can. I can quote parts of it, I can give it to a friend to read, I can rip out a page and tape it to the wall, I can teach my kid how to read with it.
Why should I not be allowed to train my AI with it? Why do you think it's unethical?
Assuming the books used for GPT training were indeed purchased, not pirated, and since "AI training" was not prohibited at the time of purchase, the engineers had every right to use them. Maybe authors in the future could prohibit "AI training," but for books purchased before they do, "AI training" is fair use.
Okay, the problem is there are only about three companies with either enough data or enough money to buy it. Any open-source or small-time AI model is completely dead in the water. Since our economy is quickly moving towards being AI-driven, it would basically guarantee our economy is completely owned by a handful of companies like Getty Images.
Any artist with less weight than GRRM and Taylor Swift is still screwed; they might get a peanut or two at most.
I'd rather get an explosion of culture, even if it means GRRM doesn't get a last fat paycheck and Hollywood loses control of its monopoly.
We could get Elon musk to develop a corpus and train all AI on that instead of training AI on a corpus from scraping websites.
I mean this isn't miles away from what the writer's strike is about. Certainly I think the technology is great but after the last round of tech companies turning out to be fuckwits (Facebook, Google etc) it's only natural that people are going to want to make sure this stuff is properly regulated and run fairly (not at the expense of human creatives).
As it stands now, I actually think it is miles away.
Studios were raking in huge profits from digital residuals that weren't being passed to creatives, but AI models aren't currently paying copyright holders anything. If they suddenly did start paying publishers for use, it would almost certainly exclude payment to the actual creatives.
I'd also point out that LLMs aren't like digital distribution models because LLMs aren't distributing copyrighted works; at best you can say they're distributing a very lossy (to use someone else's term) compressed alternative that has to be pieced back together manually if you really wanted to extract it.
No argument that AI should be properly regulated, but I don't think copyright is the right framework to be doing it.
If the models trained on pirated works were available as a non-profit sort of setup with any commercial application being banned I think that would be fine.
Business owners salivating over the idea that they can just pocket the money writers and artists would make is not exactly a good use of tech.
Copyright law in general is out of date and needs updating; it's not just AI that's the problem, AI is just the biggest new thing. But content creators of traditional media have been railing against what they perceive as copyright violation for ages.
Look at Nintendo and Let's Plays.
The law is the problem here. Not AI.
Copyright law has been such a disaster for so long, while clearly being wielded like a blunt weapon by corporations. I can see the existential threat that generative AI can pose to creators if it becomes good enough. And I also am aware that my dream of asking an AI to make a buddy cop adventure where Batman and Deadpool accidentally bust into the Disney universe, or remake the final season of Game of Thrones, is never gonna be allowed, but there's honestly a huge amount of potential for people to get the entertainment they want.
At any rate, it seems likely that they're going to try and neuter generative AI with restrictions, despite it really not being the issue at hand.
Disappointed to see John Grisham on the list.
I've expressed my opinions on this before, which weren't popular, but I think this case is going to get thrown out. Authors Guild, Inc. v. Google, Inc. established the precedent that digitization of copyrighted work is considered fair use, and finetuning an LLM even more so, because LLMs ultimately can be thought of as storing text data in a very, very lossy compression algorithm, and you can't exactly copyright JPEG noise.
And I don't think many of the writers or studio people actually tried to use ChatGPT for creative writing, and so they think it magically outputs perfect scripts just by telling it to write a script. The reality is if you give it a simple prompt, it generates the blandest, most uninspired, badly paced textual garbage imaginable (LLMs are also really terrible at jokes), and you have to spend so much time prompt engineering just to get it to write something passable that it's often easier just to write it yourself.
So, the current law on it is fine, I think: pure AI-generated content is uncopyrightable, and NOBODY can make money off it without significant human contribution.
The reality is if you give it a simple prompt, it generates the blandest, most uninspired, badly paced textual garbage imaginable
Which is not too far from the typical sequel quality coming out of hollywood at the moment ;-)
Well, nobody really wants to ever put their name on something they're not proud of, right?
But when the goal is to churn out as much "content" as fast as possible to fill out streaming services, on impossible deadlines, under threat of unemployment for writers, of course writing quality will suffer.
Unhappy, overworked and underpaid people will understandably deliver poor work, which is why the strike is necessary for things to change.
This will definitely change though. As LLMs get better and develop new emergent properties, the gap between a human-written story and an AI-generated one will inevitably diminish.
Of course you will still need to provide a detailed and concrete input, so that the model can provide you with the most accurate result.
I feel like many people subscribe to a sort of human superiority complex, that is unjustified and which will quite certainly get stomped in the coming decades.
That is definitely not inevitable. It could very well be that we reach a point of diminishing returns soon. I'm not convinced that the simplistic construction of current-generation machine learning can go much further than it already has without significant changes in strategy.
When the movie about the nerds behind these apps comes out, this will be the part of the movie trailer where Jesse Eisenberg looks nervous and says he's being sued for over a billion dollars.
And if AI writes it Walter White will appear and announce the need to cook. Then they'll all melt for no reason.
It's hard to cook as an AI character when you have eight fingers and they are all fused together.
it was at this moment walter white became a transformer
And then Walter cooks a delicious spaghetti.
I'm still not convinced that Jesse Eisenberg and Sam Altman aren't actually the same person.
To be the devil's advocate (or GRRM's attorney), I see the merit of his and other authors' concerns. ChatGPT makes it feasible to generate short stories in their worlds and with their characters, which can easily replace their licensed products. This is not just their main work, but also other products that generate some revenue stream for them.
Example: A friend of mine is using ChatGPT to generate short bedtime stories for his daughters. A typical request is something like this: "Please write a five paragraph story where Elsa from Frozen meets Santa Claus. Together, they fly in Santa's sleigh over the world, and Elsa is magicking snow onto all the Christmas trees." Normally, you'd buy a Disney-licensed book of short Christmas stories (I have one for my kids), but ChatGPT is more flexible and free.
Same goes for GRRM. He doesn't write children's stories, but one can still prompt ChatGPT to produce stories from his universe, which scratch the ASOIAF itch. This substitutes for the officially licensed products and deprives the author of an additional revenue stream. Just for the fun of it, I prompted ChatGPT: "Hello GPT-3.5. Please write a four paragraph story set in the Game of Thrones universe. In this story, Jon Snow and Tyrion Lannister go fishing and catch a monster alligator, which then attacks them." It produced a surprisingly readable story, and if I were a fan of this universe, I can imagine myself spending a lot of time with different prompts and then reading the results.
(On a side note, AI-generated content already has at least one group of victims: the authors of short fiction. Magazines like Clarkesworld were forced to close submissions of new stories, as they became overwhelmed with AI-generated content.)
Couple things:
You are right, especially regarding the copyright law. My argument here, however, was the same argument companies use against non-genuine spare parts or 3D printing (even though the latter seems to be a lost battle): people who can generate substitutes based on a company's designs (their IP, you could say) are eating into its aftermarket profits. That's not even taking into account planned obsolescence (my kids' toys are prime examples) or add-ons to products (I printed my own bracket for my Ring doorbell). With AI, I don't need to buy short story books for my kids to read; I'll generate my own until they are old enough to use ChatGPT themselves.
The thing is you can tell an AI to make a story like GRRM's, and the AI doesn't even have to read GRRM. This is a losing battle.
How will it know what GRRM's style is if it hasn't read the books or isn't aware of the content? Pretty sure it does need to read the books in order to generate content similar to the author's style.
"LLMs allow anyone to generate — automatically and freely (or very cheaply) — text that they would otherwise pay writers to create" My heart bleeds for them 🙄
"That new technology is going to make it harder for us to earn income." As if automation and other improvements over the years haven't diminished other positions, and as if those positions should somehow be protected at the cost of improvements for everyone as a whole.
Do any of these authors use a word processor? Because that would be displacing the job of a skilled typist.
Technological progress is disruptive and largely unavoidable. Losing your livelihood to a machine isn't fun, I don't dispute that. But that fact didn't stop the industrial revolution, the automobile, the internet, or many other technological shifts. Those who embraced them reaped a lot of benefits, however.
Technology is also often unpredictable. The AI hype train should not be taken at face value, and at this point we can't say if generative AI systems will ever really "replace" human artistry at all, especially at the highest levels. But technology such as LLMs does not have to reach that level to be useful for other applications, and if the tech is killed over unfounded fear mongering we could lose all of it.
Also they're not going to lose their livelihoods. They might lose a little bit of money, but honestly even that I doubt.
We are still going to need humans to create creative works, and as much as Hollywood reckons they're going to replace actors with AI, they're still going to need humans to write the scripts, unless they can convince everyone that formulaic, predictable nonsense is the new hotness.
Creative work is probably the only industry that will ultimately be safe from AI, not because AI can't be creative, but because humans want humans to be creative. We put special value on human-created works. That's why people object to AI art so much: not because it isn't good, but because it lacks, for want of a better word, any soul.
They're not saying LLMs are bad; they're saying LLMs trained on copyrighted works are.
What's the alternative? Only mega billion corporations and pirates should be allowed to train AI? See how much worse that is?
I fail to see how training an LLM in any way discourages authors from producing or distributing new works, which is ostensibly the intent of copyright law.
"Those fancy robots will allow anyone to create — automatically and freely (or very cheaply) — cars that they would otherwise pay mechanics to create"
Oh the horror
Julia was twenty-six years old... and she worked, as he had guessed, on the novel-writing machines in the Fiction Department. She enjoyed her work, which consisted chiefly in running and servicing a powerful but tricky electric motor... She could describe the whole process of composing a novel, from the general directive issued by the Planning Committee down to the final touching-up by the Rewrite Squad. But she was not interested in the final product. She "didn't much care for reading," she said. Books were just a commodity that had to be produced, like jam or bootlaces.
Ok I've been seeing these headlines for over a year now... any update on literally any of these suits?
When G.R.R. Martin is involved, you are going to have to wait a long time to learn how the thing ended.
Well, as far as the show is concerned, we learned the ending rather quickly. Some would say disappointingly quickly.
Fuck D+D
lmao
Come on, everyone jump on the grift boat! ⛴️
Seriously. The intent behind copyright, which no one disputes, is that you should not be able to make a copy of someone else's work that dilutes the value of that work to the point where people choose the diluted version instead of the original.
Where can it be even REMOTELY shown that someone is using an AI product when, before AI, they would otherwise have been inclined to purchase the original novel instead?
Copyright abuse has been a problem for years but because the big players are the ones doing the abuse no one wants to fix it.
Same for patent trolls.
I don't want AI trained on pirated works 'found' by scraping the entire internet.
This is the best summary I could come up with:
According to the complaint, OpenAI “copied plaintiffs’ works wholesale, without permission or consideration” and fed the copyrighted materials into large language models.
The authors added that OpenAI’s LLMs could result in derivative work “that is based on, mimics, summarizes, or paraphrases” their books, which could harm their market.
OpenAI, the complaint said, could have trained GPT on works in the public domain instead of pulling in copyrighted material without paying a licensing fee.
This is the latest lawsuit against OpenAI from popular authors — Martin wrote Game of Thrones, Grisham’s many books have been turned into films, and so on — alleging copyright infringement.
Amazing Adventures of Kavalier and Clay writer Michael Chabon and others sued the company for using their books to train GPT earlier in September.
Comedian Sarah Silverman and authors Christopher Golden and Richard Kadrey also sought legal action against OpenAI and Meta, while Paul Tremblay and Mona Awad filed their complaint in June.
The original article contains 323 words, the summary contains 157 words. Saved 51%. I'm a bot and I'm open source!
See Authors Guild, Inc. v. Google, Inc.
Google won.
Not even remotely the same thing.
Idk, seems pretty relevant actually.
Not a good look for GRRM. When companies no longer can innovate, they litigate. Is Martin saying he's run out of ideas?
Hypothetically, I bought a Kindle copy of GoT and shared it with my AI friend John, who has no intention of publishing a 1:1 copy of the book, but we chat about the story and maybe about how it should end. Is it wrong? Where?
The problem I have in this analogy is that people want to treat AI as a person who "consumes" media, but not as a person that "creates" media
IMO, an AI isn't consuming and isn't creating, it's just a tool, albeit one that definitely threatens established markets.
Aren't we all products of our experiences? So when we generate something, it too is inspired by something else that already exists. Are we against AI because it's not human? If it was a cat reading the book and doing the same, would the cat be sued too?
If it’s your personal AI instance and you train it on books you own, and only you use it, I don’t see the problem.
You know, John has other friends too…
So basically he feels threatened by an extrapolator? In my opinion that's something a self-respecting author wouldn't do.
Everything is derivative. Adapt and overcome, instead of gatekeep.
I hope openAI wins and secures even more access to this stuff.
At least openAI can give us a good ending to the series
Not at the moment at least, if ever.
Certainly! Here's a remix of the Game of Thrones ending with Top Gun themes and imagery:
Title: "Game of Thrones: Dragon Wing"
In the final episode of "Game of Thrones: Dragon Wing," the power struggle for the Iron Throne takes a thrilling and high-flying turn. Daenerys Targaryen, now known as the "Dragon Queen," leads her dragons into an epic aerial battle against the remaining claimants to the throne.
As the dragons soar through the skies of King's Landing, the music swells with the iconic Top Gun soundtrack. Daenerys, riding her mighty dragon, "Firestorm," takes on a persona reminiscent of Maverick from Top Gun, exuding confidence and charisma.
Jon Snow, on his own dragon, "Snowhawk," plays the role of Iceman, a skilled and competitive rival to Daenerys. Their aerial duels are breathtaking and intense, echoing the dogfights of Top Gun.
Tyrion Lannister, known for his wit and strategic thinking, takes on the role of Viper, the wise and experienced mentor who guides the dragon riders through their challenges.
Arya Stark, with her newfound dragon-riding skills, becomes the "Wildcard," executing daring and unconventional tactics during the battles, just like Maverick's unpredictable maneuvers.
The final showdown takes place over the Red Keep, where the Iron Throne sits. In an explosive climax, Daenerys and Jon Snow face off in an aerial duel to determine who will rule the Seven Kingdoms. The dragon battle is a breathtaking mix of fire-breathing power and aerial acrobatics, set to the electrifying Top Gun soundtrack.
Ultimately, Daenerys emerges victorious, and she takes her place on the Iron Throne, symbolizing the triumph of courage and unity over political intrigue. The realm is finally united under her rule, and her fellow dragon riders, including Jon Snow, Tyrion, and Arya, stand by her side as loyal allies.
The series ends with a soaring shot of Daenerys and her dragons flying into the sunset, capturing the spirit of adventure and camaraderie found in Top Gun, while also providing a thrilling and satisfying conclusion to the Game of Thrones saga.
Boomers who don't understand technology yelling into the wind.