The internal world of a computer chip. Credit: Champ Panupong Techawongthawon/Unsplash

The Wikipedia image for Stable Diffusion, an AI platform that generates images from text prompts, is a surrealist photo showing an astronaut atop a horse. At first glance, it scans as realistic: the horse in mid-trot, a forest in the background. But closer inspection reveals that the astronaut seems to have no hands or feet; the legs just dangle over the horse's sides, with nothing in the stirrups. What looks like an extra arm trails down the astronaut's back.

Such errors are common in the current era of AI image generation, seen in programs like DALL-E and Midjourney. Yet despite these rudimentary results, the number of people using such programs has exploded in recent weeks. One of the most popular is Lensa, an AI portrait app, which has had more than 12 million downloads this month. Less than a week after the company OpenAI launched the chatbot ChatGPT, more than one million users had tried the model. The surge of interest has artists and other creative professionals worried about what AI-generated art might do to their livelihoods and how they can protect their work from being used without their consent.

For Chicago artist Teshika Silver, one of the most concerning aspects is the tendency of these programs to scrape images of artwork from the internet in order to generate results. “Not only are [artists] not credited, but they’re not compensated for the work that they’ve done,” Silver says. “It is disturbing because it’s just this amorphous, soulless thing.”

Credit: Teshika Silver

Dave Scheidt, a local children’s author, has already seen websites and companies opt for AI-generated images for stock photos or digital illustrations, whereas in the past they might have commissioned an artist. “You’re seeing that in real time,” he says. “I think people don’t realize that, you know, when you’re creating art, it’s still a job. It’s still really hard work. . . . It’s still labor, and people should be paid for that.”

Many artists fear that pursuing copyright or intellectual-property claims against AI will be a challenge. Tiffany Funk, an artist and researcher who specializes in new media and computer art, thinks the application of the law will likely be arbitrary. “It’s always going to fall into the hands of the person with more power to use it,” she says. She points to the case of Shepard Fairey, who was sued by the Associated Press for appropriating an image of Barack Obama for his “Hope” poster. Fairey, arguably one of the most successful street artists of the past two decades, eventually settled his civil case with the AP, which required him to pay the company $1.6 million.

At least in these early days of AI apps, popular opinion seems to be a driving force. In November, the website DeviantArt released an image-generation tool and, by default, made the millions of artworks on its platform available for AI use. After swift backlash from users, the company reversed those terms. A similar change happened with Lensa, which received criticism for its privacy policy; in December, it clarified that it would not use personal data to train its AI products.

As with other machine-learning technologies, issues of bias and toxicity are prevalent, a product of both the humans writing the algorithms and the humans generating the text that gets fed into the systems (from sources like Reddit and Twitter). “The people in control are also the people who aren’t thinking about liberatory images,” says Shawné Michaelain Holloway, a professor and new media artist. “Or they are, and it’s actually just better for business if they don’t.” Holloway is more interested in the potential for more DIY experiences with AI, imagining the liberatory potential of machine learning if more of these models were open source.

“It’s really important for me, at least when there’s new technologies coming along, to remember how we feel about that technology as a society,” Holloway says. “It’s controversial because we’re doing this, right, in collaboration with an entity that we don’t understand. And I think, especially in America, we hate things we don’t understand.”

She likens the demonization of technology to the general demonizing of anything that reads as “other,” “folk in general that are not a kind of cis, hetero-normative, white standard.” Technology itself is not evil, she says.

For Holloway, the questions around copyright issues wouldn’t be so prevalent if artists had real, material support in the U.S. “Copyright is, for me, not a conversation at all, because I’m a radical, not a liberal,” she says. “I do not believe in those kinds of property rights, per se.”

Silver echoed this concern, noting that in the rush to make a profit, tech companies are forgetting the workers who are becoming obsolete. “There is a future where automation and robots and the like do everything for you, but you’re skipping over some very serious things . . . like universal income and making sure that the people who used to do those jobs are taken care of and safe and secure,” Silver says. “What does that leave the people who bust their ass for five, ten, 20 years to get to where they’re at?”

The future of AI and art is unclear, but what is clear is that public discussion of the technology is a necessary and powerful component. “It’s good that there’s a huge uproar about it because it’s definitely calling attention to the implications,” Scheidt says. 

Funk agrees, noting that these technologies are not going away, so it’s crucial that they are well understood. “I’m always the optimist,” she says. “I’m always thinking that people will become more aware of this technology as time goes on, so that they’ll be more wary and critical of it. So the hand wringing, I think, is a necessary step. It has to happen to get the attention that it should have.”

Northwestern’s New A.I. Hotshot

Artificial-intelligence guru Roger Schank arrives from Yale with $12.5 million in corporate support, $1 million in defense contracts, a coterie of faculty and grad students, and professorships in three different departments at Northwestern. “This is not a