Science fiction writers imagine a future in which AI doesn’t abuse copyright – or their generosity
The Science Fiction and Fantasy Writers Association (SFWA) has asked us all to imagine a future in which builders of AI models name a price they're willing to pay for the copyrighted material they need, and creators choose whether to accept it, until enough deals are struck that all stakeholders achieve satisfaction.
For that future to become possible, the Association wants regulators to consider the perils of the world we inhabit now – in which creators use digital technology to promote their work and make it widely available, but that same technology is used to exploit their generosity.
That sad state of affairs came about after the likes of OpenAI built models by scouring the internet for material to analyze. Those models can now generate text in the style of authors whose work they ingested, without recognizing – or compensating – those authors.
Which is why several authors and the Authors Guild have launched lawsuits against OpenAI. It's also why the US Copyright Office in August 2023 launched an inquiry into copyright and artificial intelligence and invited public comments.
The SFWA took advantage of that offer, as did many others: the consultation has generated over 10,000 comments.
The Association's most recent submission – lodged on December 7 and noticed by TorrentFreak – notes that it is in "the unique position of representing many authors who have fought to make their work available for free for human readers."
"Over the last twenty years, many science fiction and fantasy authors of short fiction have embraced the open internet, believing that it is good for society and for a flourishing culture that art be available to their fellow human beings regardless of ability to pay," the submission states. But there's a difference between making a work free and giving it away.
"Being freely available has never meant abandoning the moral and legal rights of the authors, nor the obligation to enter into legal contracts to compensate authors for their work and spell out how it may and may not be used," the submission argues.
"The current content-scraping regime preys on that good-faith sharing of art as a connection between human minds and the hard work of building a common culture," the submission adds.
Another angle of attack addresses the fact that some authors have eschewed proprietary digital rights management (DRM) tech. Leading sci-fi publisher Tor stopped using DRM in 2012.
"Our authors and readers have been asking for this for a long time," president and publisher Tom Doherty explained at the time. "They're a technically sophisticated bunch, and DRM is a constant annoyance to them. It prevents them from using legitimately-purchased e-books in perfectly legal ways, like moving them from one kind of e-reader to another."
But DRM-free e-books that circulate online are easy for scrapers to ingest.
The SFWA submission suggests: "Authors who have made their work available in forms free of restrictive technology such as DRM for the benefit of their readers may have especially been taken advantage of."
The submission includes a quote from multiple award-winning author N. K. Jemisin, who stated: “If my work is just going to get stolen, and if some company's shareholders are going to get the benefit of my labor and skill without compensating me, I see no reason to continue sharing my work with the public – and a lot of other artists will make the same choice.”
The Association calls for development of an opt-in regime, under which authors can choose to allow their work to be scraped into a corpus used to fuel AI, in return for reasonable compensation.
"If it isn't reasonable, authors will simply not opt in," the submission states. "If they don’t, their work will not be included. The question of achieving the scale needed for AI training can then become a negotiation: they can make their arguments and offers, and we can make ours, and a balance will be reached where AI companies have persuaded or paid enough creators to get what they want."
Which sounds like a rather nicer future than the one AI companies want. ®