Why Should Anyone Own the Product of A.I.?
Yet another round on how intellectual property isn't "property"
Sunspring, a short film written by an A.I.
The WGA strike began just as we were gearing up in pre-production on my feature film, Resentment. Needless to say, our first concern was that we not do anything that would run afoul of WGA rules, and I’m pretty confident we didn’t. Our production was completely independent, in no way affiliated with the members of the AMPTP; neither I nor anyone else on the project is a member of the WGA; and in any case, the script was written before the strike commenced.
But now that the production phase of my film is done, I’m able to step back and think about the issues at stake in the strike. Many of them, while important, are interesting primarily to people involved in the entertainment business: the growth of “mini rooms” that work writers harder and give them less opportunity for advancement, for example, or the way subscription services like Netflix have cut writers out of their share of the value they create, because their peculiar business model means they don’t pay residuals the way traditional broadcast media or VOD services did. I find these questions interesting and important even though they don’t affect me personally (I’m not writing for Netflix); they’re simply compelling business stories and labor stories.
There’s one issue at stake in the WGA strike, though, that has broader philosophical implications, to wit: the “threat” of artificial intelligence.
WGA writers very reasonably fear that ChatGPT and its more-powerful successors will soon make some of them redundant. Even if A.I. turns out to be a long way from being able to write anything better than workmanlike, plenty of professional writing in Hollywood barely clears that bar (particularly once that writing has had the life beaten out of it by studio notes). Studio executives are transparently excited by the prospect of dispensing with most of the writers they employ, and engaging those who remain only in the most limited fashion. The imagined future involves an executive’s non-union assistant typing prompts into an A.I.’s interface until a script pops out that meets the studio’s requirements, then hiring a writer or two on a limited contract to “punch up” that script with a bit of creative sparkle.
That future doesn’t strike me as completely impossible—which is why the WGA is worried, and wants to lock in the role of a human writer in an increasingly A.I.-dominated world. I’m not sure such a strategy will ultimately work, though, even if it is successful temporarily. Ultimately, if A.I. really gets good enough to write a serviceable script, then the value added by the human writer is going to be smaller and smaller as a percentage of the final product. And if that’s the way things develop, then the demand for traditional levels of compensation for the writer is going to look more and more like featherbedding. Striking against automation is a losing game in the long term, because at a certain point it’ll be obvious to management that it’s in their power, and plainly in their interest, to break the union altogether.
But the prospect of A.I. domination raises a different question for me, the one in the title of this post. Why should a script—or any piece of writing—written by an A.I. be owned by anyone?
There’s a meme going around that you’ve probably seen saying that A.I. is plagiarism because the training data isn’t public domain. And there's a counter to that meme that argues, basically, that neither was your training data, you supposedly creative writer. Yes, the A.I. learned how to write a script by reading billions of lines of text, including thousands of screenplays, to which it does not own copyright and for which it does not pay royalties. But the same is true of you: you also learned to write by reading, and you aren’t paying royalties every time you write something, not even if it was clearly inspired by something someone else did. There’s a shot in my film directly inspired by a shot in After Hours, and I won’t owe Martin Scorsese any royalties for the inspiration, any more than Paul Thomas Anderson owed him royalties for modeling the end of Boogie Nights on a scene from Raging Bull.
I’m with the counter-meme on this one: I don’t think the product of an A.I. is plagiarism. But I think the implications of that fact are broader than people realize. Because the same reasons why it isn’t plagiarism suggest that it shouldn’t be subject to copyright.
Think about it. The reason we have copyright in the first place is that the marginal cost of reproducing a piece of writing is minimal—in a digital era, it’s virtually zero. Since it takes labor to produce writing, and we want people to engage in that labor, we grant them a temporary legal monopoly over the right to reproduce that writing, so that it’s possible to earn something from that labor. That’s what copyright is: not the recognition and protection of a property right inherent in the thing itself (as with land or other real property), but the granting of a legal monopoly to artificially raise the value of something in order to encourage its production. But . . . why would we want to do that on behalf of an A.I., or on behalf of a corporation that happens to be using an A.I.? What communal interest is served by granting a legal monopoly in that circumstance?
I can’t think of any. When a writer does work for hire, they’re trading away their rights to their work for compensation. But when an A.I. does the work, why should anyone have the rights to the result? It’s not like we need to incentivize the A.I. to do the work.
Perhaps we need to incentivize the A.I. developers to build better A.I. tools? Perhaps, though I doubt those incentives would be lacking even if the product of A.I. is deemed to always be public domain; the tools are just too obviously valuable. But more to the point: if the A.I.-developers’ incentives are ultimately based on the copyright value of the product of A.I., then the “it’s not plagiarism” argument starts to fray. Because all the A.I. does is predict the next word/sentence/paragraph/page likely to follow from a certain prompt, and it does that based on the patterns observed across enormous quantities of training text, text that is not all public domain. If the product is subject to copyright, and the work is all done by a machine, then it really does start to look like you’ve trampled on one group of creators’ rights to create rights for a new entity that didn’t even do any creative work. It seems to me it’s far cleaner to just say that anything produced by an A.I. is public domain.
Just to be perfectly clear, this is a unique thing about intellectual property, not applicable to physical property. If you design a robot that makes shoes, the shoes it makes are your property even though you didn’t make them yourself. The reason is that shoes are inherently property, because only one person can wear them at a time. You can decide that they are the collective property of a group of people who take turns wearing them, or that they belong to you alone, or whatever, but their physical limitation is what gives them the character of property, and so the question inherently arises of who they belong to. (There’s still the question of who should own the machines, of course.)
Intellectual property isn’t like that. If I write a sonnet, your reading it in no way limits anyone else’s ability to read it—in the digital age, everyone could read it simultaneously. The sonnet is not a naturally scarce good; unlike shoes, I lose nothing of the sonnet itself no matter how many other people read it. The same goes if I write a tune or discover the cure for cancer: if someone else sings the tune or applies the cure, I lose nothing. We create categories of intellectual property in law precisely because these things aren’t automatically property—because they can be shared freely with no loss to the owner—and because without a legal monopoly that is analogous in some ways to physical property, it’s not clear how anyone could earn compensation from intellectual labor. But it’s not inherent in the sonnet or the song or the cure itself that it should be property. If a machine is doing the labor, though, then there’s nobody who needs to be compensated. So why should it belong to the person who owns the machine, instead of to everybody?
I have no idea what the consequences would be of making the product of A.I. public domain. Maybe it would result in a catastrophic collapse in the value of A.I.-related companies and a drying up of innovation in that sector, thereby stunting human progress for a generation—or, alternatively, saving us from an A.I. apocalypse. Maybe the exact opposite would happen: open-source A.I. would explode, and we’d have a bonanza of development and creativity. Maybe the “human touch” would suddenly become incredibly valuable, since the only way to obtain rights to a work would be to have a human work on it, and courts would be tied up debating whether that human contribution was substantial enough to warrant granting property rights. I have no idea. I’m not arguing my case on the basis of a confident prediction, or even a purely speculative notion of what would result. I’m arguing it from first principles, because I want people to wake up to the damage that an “intellectual property” framework has been doing to culture (and also to science) for years now. We have to remember that none of this is natural—that both culture and science predate this set of legal arrangements, just as physical property predates the development of the limited liability corporation. Intellectual property is a legal fiction, something we came up with to serve our collective good.
If the prospect of corporations getting decades of monopoly profits from the intellectual output of machines doesn’t wake us up to that fact, and to the fact that we are the ones with the inherent right to redesign those arrangements to better suit a new collective good, then I don’t know what will.