Artists are among the many groups that will feel the effects of artificial intelligence in the coming years, but not everyone sees that as a bad thing. A group of artists has organized an open letter to Congress arguing that generative AI isn’t inherently harmful and, more importantly, that the creative community should be included in conversations about how the technology is regulated and defined.
The full letter and list of signatories are here; the bottom line is that artificial intelligence, machine learning, and algorithmic or automated tools have been used in music, art, and other media for decades and this is just another tool.
Therefore, those who use the tools, be they software engineers or painters, should be consulted in the process of guiding their development and regulation.
Here is an edited excerpt from the letter:
Just like previous innovations, these tools lower barriers to artistic creation, a career that has traditionally been limited to those with considerable financial means, able bodies, and the right social connections.
Unfortunately, this diverse and pioneering work by individual human artists is misrepresented. Some say it’s simply typing prompts or regurgitating existing work. Others deride our methods and art as based on ‘theft’ and ‘data laundering’. …many individual artists are afraid of repercussions if they even touch these important new instruments.
Sen. Schumer and members of Congress, we appreciate the ongoing hearings, “Insight Forums” and other initiatives focused on regulating generative AI systems, and that your goal is to be inclusive, drawing from a range of “scientists, advocates and community leaders” who are actively engaged in the industry. Ultimately, this must mean including artists like us.
We see a unique opportunity right now to shape the development of generative AI in a responsible way. The broad concerns expressed today regarding human artistic work cannot be ignored. Too often, large corporations and other powerful entities use technology in ways that exploit the work of artists and undermine our ability to make a living. If we seek to ensure that the revolutionary trajectory of generative AI benefits humanity as a whole, it would be a grave oversight to exclude those in our society working directly with its potential and limitations.
There is certainly reason and wisdom in these words, and if the government intends to form a diverse and representative group to advise its deliberations on artificial intelligence, it ignores the creator community at its peril.
But the letter, despite being published under the auspices of Creative Commons, conspicuously mischaracterizes the most serious criticism leveled at the AI systems that artists object to: that they were created through large-scale intellectual property theft that still exploits artists’ work for commercial gain today, without their consent and certainly without paying them. It’s a strange oversight for an organization dedicated to navigating the complex world of copyright and digital licensing.
While there may be some who subjectively deride AI-assisted art as mere prompt engineering or what have you, many who oppose it do so because the companies that created these tools did so in ways that exploited artists. Whether the art resulting from such systems is derivative or original, it is reasonable to regard it as the fruit of a poisoned tree.
Just as authors have denounced some large language models as obviously trained on their work, among the complaints that artists can, and likely will, bring to any congressional hearing or forum must be that corporations are unethically and perhaps illegally ingesting copyrighted work against the wishes and to the detriment of its creators.
We are only at the beginning of the era of art and industry influenced by AI, so there is plenty of room for both disagreement and collaboration. While this open letter is just one perspective, it is a valuable one, and also likely one that will receive considerable pushback from other artists who feel their work or positions are being misrepresented. By this time next year, the technology and the conflicts around it will have moved on once again, as today’s models and methods are superseded. But we’ll be talking about this for a long, long time.