Fight for the Future’s Lia Holland On A.I. Copyright, Human Art and More

Artificial intelligence promises to make work easier, but can artists and their art survive in the brave new world we're creating?

On Monday, the group Fight for the Future led an A.I. Day of Action aimed at raising awareness about the real threats posed to creatives by artificial intelligence. Tech executives like to brag that their buzzy innovation may one day become Skynet, but the A.I. Day of Action brought together a coalition of six groups—the Freelancers Union, United Musicians and Allied Workers, Media Alliance, RootsAction, Open Markets Institute and Fight for the Future—that aimed to show how A.I. might hurt all of us in the short term. Observer recently had a chance to catch up with Fight for the Future’s Lia Holland to hear about their efforts.

Fight for the Future’s Lia Holland. Courtesy Lia Holland

How do you think the A.I. Day of Action went? Do you think the message came across to those who needed to hear it?

This Day of Action was our launching pad for lawmakers, civil society and the independent and freelance artist community to connect on one of the more concerning dynamics A.I. might supercharge in the coming years: the misuse of copyright law for the benefit of large corporations and the detriment of human creativity. We definitely consider this day a success based on the in-depth conversations it started amid efforts from not only Congress, but the U.S. Copyright Office and the FTC, with their October 4 hearing on creators and A.I.

Yet despite how much hard work went into this action, it’s only the beginning. Independent artists and creators have nowhere near the lobbying power on Capitol Hill that large media and content corporations like NewsCorp, Disney or Warner Bros. Discovery wield. And what’s more, many artists are understandably hesitant to offend potential employers by advocating for change. I get that. When you’re trying to feed your family with your art, crumbs are better than nothing at all. For some, A.I. is the last straw, and we’re excited to welcome them and organize alongside them. I say all of this to share that we’re engaged in a long struggle, likely a multigenerational struggle, for the economic dignity of the people who create our culture with their art. Our A.I. Day of Action was a success on that continuum, but the message will keep needing to be delivered.

Technology has been stealing jobs since the time of the Luddites. What makes the threat of A.I. unique?

A.I.’s threats across the spectrum of issues my organization works on can be summed up in one word: supercharging. These A.I. systems are designed to do work faster than ever before, at greater scales than we could have imagined even a decade ago and with less transparency or accountability. To be a little reductionist, because I think it’s a helpful mental model, A.I. is simply an instrument of power. Those who are able to wield it can point it wherever they like—unless we adopt laws to prohibit certain uses. As a nation, the U.S. is terribly behind both on basic human rights like privacy and on ensuring the dignity of work with robust labor laws. Our social safety net is crumbling, a reality painfully apparent to me every time I head to the grocery store.

At this time of great inequality, we need subversive art (human art) now more than ever—popular inspiration to fight back against staggering inequality and reject all the cultural trappings of such extractive business models. If within a decade A.I. has ended the last dregs of economic incentive that human artists have to exist and develop their craft by replacing their labor with corporate-controlled machine output, we don’t just lose the twee joy of knowing a real human made the art that accompanies some New Yorker thinkpiece. What we’ll see is artistic pathways narrowing even further for those who aren’t elite, rich, or well-connected. We’ll lose even more of the diverse voices making our culture, and we’ll lose them faster than we ever have before.

Your cause is aimed at preventing machines from obtaining the ability to copyright their work. Why have you chosen to attack the problem in this way?

A small correction: we are aiming at preventing large corporations from obtaining the ability to copyright work made with significant A.I.-enabled elements. We aren’t weighing in on A.I. machines—or their creators—owning copyrights. There’s a lot to consider on that issue when it comes to artists building new tools or forms of art, like creating A.I. neural networks. In that regard, we still feel like we have a lot to learn. What we do know is that copyright has long been a broken tool when it comes to protecting the economic dignity and diverse creativity of artists ourselves, and we’d love to see something better, regardless of the A.I. issue.

We chose this specific demand because it aims at the heart of one of the primary pathways that corporations will take to replace human artists with A.I. If corporations cannot claim works with significant A.I.-enabled elements as their intellectual property, they will be forced to continue to hire human artists to obtain intellectual property rights. This is much the same as what the WGA negotiated for its own members, although through a different mechanism—one that would apply to all forms of creativity, from photography to journalism to music, and thus protect some economic opportunity for the majority freelance and non-unionized arts workforce.

Why do you think executives in the creative industries have been so quick to embrace A.I.?

The hype for A.I. is incredible and fueled by tangible consumer-facing uses of this technology with image generators, ChatGPT and the like. It’s what everyone is talking about, so of course that gets the attention of executives. But also the full force of the tech industry is intent on gaining market share at this moment of explosive growth. Part of gaining market share is straight-up being the sexiest tech, and the arts are how you achieve that cultural perception. By extension, the full marketing force of the tech industry is focused on executives in the creative industries and all the artists they work with—artists who have the ability to give legitimacy to their products.

One caveat is that I don’t think that all decision-makers are inherently malicious here. They, too, want to make everyone’s jobs less labor-intensive and eliminate menial and repetitive tasks. A.I. promises to make work easier—and as an author, I’m frankly thrilled at the idea of a future where A.I.-enabled tools might mean I can use a keyboard and mouse less. That could be transformative not only for the speed of my creative process but also when it comes to the physical limitations of my body with chronic pain, etc. But the unchecked forces of hype and investment are washing us all toward a future that won’t be of benefit to the arts or artists, a future I think even a lot of creative executives would actually be bummed to live in.

What’s the biggest thing that politicians don’t currently understand about A.I.?

A lot of the issues with A.I. aren’t actually A.I. issues but rather the result of politicians failing, for decades, to protect their constituents when it comes to labor, privacy, and basic human rights. Accordingly, a lot of their A.I.-focused legislation should actually be broadly focused on the rights and dignities that people deserve, irrespective of what technology we’re talking about. In an ideal world, Congress wouldn’t just act to reduce corporations’ copyrights when it comes to A.I.; they would reject the fundamental failures of our archaic copyright system to achieve the purposes it ostensibly serves in encouraging individual artistic innovation, and they would engage directly with artists and culture workers to make something better.

The postmodern painter David Salle recently talked about using A.I. as an extremely limited studio assistant, which seems harmless and perhaps even beneficial. Do you think A.I. can co-exist with creatives? 

Absolutely. In fact, it already coexists and serves an important role when you think about music production, Photoshop, or animation software. Despite the fact that it is getting much more powerful, at the end of the day A.I. is simply a tool. Unchecked, this tool will amplify and supercharge existing inequalities and injustices—especially in the arts. It’s been disappointing to me to see artists attacking each other for using a tool that makes their work easier, or more possible, especially in the context of accessibility. This feels a lot like the old, stodgy criticisms of sampling and remix culture, or even what film photographers said about Photoshop back in the day. The forces at work here are much bigger than David Salle finding a way to work more efficiently. Infighting about A.I. tools among artists is counterproductive if not actively harmful. Our energy should be aimed at decision-makers who actually have the ability to guide A.I. toward making all of our work easier and more lucrative.

Do you take any solace in how derivative and cliche the current A.I. output is? 

I don’t, because I’ve seen some very interesting and unique work being made using A.I. tools in the hands of professional artists. It seems like art can be derivative and cliche no matter what you’re using to make it. But if you’re talking about your neighbor Bob feeding a prompt into Midjourney to get a sci-fi skunk riding a taco—that seems like a whole other thing. I think of that as more like a next-generation meme generator, and far be it from me to steal Bob’s joy. But on a more serious note, I don’t think we should be resting on whatever the limitations of A.I. are today, because an obscene amount of money and talent is pouring into A.I. development, and without serious human rights and labor safeguards, a lot of creative workers are in for a rough ride.
