
Plagiarmixing: Idea Ownership in the Age of Generative AI

20 Jun 2023
4 min read

As a freshman in college in 2004, I was required to read Free Culture by Lawrence Lessig, a book that would later shape my understanding of intellectual property in the digital age. I vividly remember using Napster and LimeWire at the time to "procure" my favorite audio clips from TV shows. Downloading Nelson's iconic "HaHa.wav" from The Simpsons took about 45 minutes on a dial-up modem via peer-to-peer (P2P) sharing. I would then remix these clips into quirky songs and share them with friends for fun.

Now, in the age of artificial intelligence, the concept of idea ownership is being challenged by the rise of generative AI tools. These tools, capable of producing content that mimics human creativity, are widely used by artists, small businesses, and large corporations alike. They draw from an immense pool of data, much of which wasn't originally theirs, raising ethical questions about intellectual property and the nature of creativity itself.

Generative AI tools, such as GPT-4, rely on machine learning algorithms to generate new content based on the data they are trained on. This data often includes a vast range of sources, from books and articles to social media posts and more. The AI doesn't create in isolation; it remixes and reinterprets the information it has been fed, often producing results that are surprisingly original and creative. This process—a blend of plagiarism and remixing—could be described as "plagiarmixing."
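To make the remixing point concrete, here is a deliberately crude sketch: a bigram chain that can only ever emit words lifted directly from its training text, then recombined into new sequences. This is a toy illustration of the "plagiarmixing" idea, not a description of how GPT-4 or any modern model actually works; the corpus and function names are invented for the example.

```python
import random

def build_bigrams(text):
    """Map each word to the list of words that follow it in the text."""
    words = text.split()
    model = {}
    for a, b in zip(words, words[1:]):
        model.setdefault(a, []).append(b)
    return model

def generate(model, start, length, seed=0):
    """Walk the bigram chain: every emitted word comes from the corpus."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        choices = model.get(out[-1])
        if not choices:
            break
        out.append(rng.choice(choices))
    return " ".join(out)

corpus = "ideas are shaped by the world around us and the ideas we share"
model = build_bigrams(corpus)
print(generate(model, "ideas", 6))
```

Every word the generator produces exists somewhere in its training data, yet the sequences it strings together may never have appeared there; scale that up by many orders of magnitude and you get the ethical gray area the rest of this essay is about.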

Artists, small businesses, and large corporations are increasingly using these tools to create content. For artists, generative AI offers a new medium for expression, enabling the creation of works that would be impossible through traditional means. Small businesses can leverage these tools to produce marketing materials, product descriptions, and more, saving both time and resources. Large corporations, with their extensive capabilities, use generative AI to produce staggering amounts of content, from articles and reports to social media posts and beyond.

However, the use of generative AI raises important questions about who owns the ideas that fuel its output. These tools are trained on data created by others, often without their knowledge or consent, which prompts ethical concerns about whether original creators should be credited or compensated for their contributions.

Moreover, there is currently little recourse for those who believe their work has been used without permission. Existing copyright law is not equipped to handle the intricacies of AI-generated content, and proving that a specific piece of data was used to train a given model is often difficult. As a result, original creators find themselves in a precarious position, with limited protection for their work.

Yet, perhaps it's time to reconsider our understanding of idea ownership and creativity. After all, aren't we all generating ideas based on a synthesis of environmental stimuli, memories, and personal experiences? Our ideas are not created in a vacuum; they are shaped by the world around us—the books we read, the conversations we have, and the experiences we undergo. In this sense, perhaps we are all engaging in a form of "plagiarmixing."

The rise of generative AI tools challenges traditional notions of idea ownership and creativity. As we move forward, it is essential to address the ethical implications of these tools and develop laws and regulations that protect the rights of original creators while recognizing the transformative potential of AI. In the age of generative AI, perhaps we need to rethink what it means to be creative and consider the possibility that, in some way, we are all generative beings engaging in plagiarmixing.

Mark Rubbo