Opinion: The Problem With Artificial Intelligence


By Shea Stevenson


   The recent expansion of public-facing AI into prompt-based writing has been generating mixed reactions. Mostly, lots of people are angry and scared. There's a big "think of the children" type of talk now, especially concerning how students may use things like ChatGPT to skip English homework and essay writing. If a prompt like "write an essay about class in 'Oliver Twist'" can get you a coherent, if incredibly bland, essay, then that's that, right?

   This is all, to varying degrees, understandable. It is a little scary, on the face of it, that we can make a robot that can get within a stone's throw of imitating an actual human thought. But it's telling that Americans assumed "art" and "labor" were such separate categories: that writing a story and driving a truck couldn't be automated in the same way, and that writing couldn't be automated at all. It reminds me of the myriad ways in which scientists throughout the ages have sought to differentiate "human" from "animal," a false distinction. People thought that, sure, a baboon can be a train operator, but only a human could produce original writing and art! This is just the latest high-profile case that forces an understanding that there is no meaningful distinction between us and the world beyond ourselves.

   In a system of living built around necessary work, where your job is meant to define who you are and how you live, everything you spend your time on and study must fall under the umbrella of how it will eventually make you money. In this way, even though there is a social distinction between "art" and "labor," in practice art is, of course, labor. If writing and drawing can be taught, if there are "good" paintings and "bad" paintings for specific reasons, then why wouldn't a machine be able to learn it? When forms of art can be boiled down to an equation, computers will always solve them faster.

   But many still recoil at the idea of an AI-written book. It forces us to consider, for the first time, "what if those cute elephants that make paintings were working at something like the level of an art school freshman but were still very confused about hands? What if they could, one day, figure out the hands?" In the end, I would still care less about elephant paintings than human paintings. They'd be neat, I guess, but who cares? The elephant isn't trying to tell me something; it's only making decent shapes. It's a gimmick piece, because at the end of the day, the repulsion to non-human art comes from the fact that most people seek out art to participate in a conversation broader than themselves.

   We read because on some level we know that, across time, there is someone on the other end of that line, ideally trying to tell us something interesting. Writing is not meaningful because it's made of words; it's meaningful because those words carry a thought from one head to another. When something like an AI inevitably writes a book, it will be a niche interest, because even if the words mean something when strung together, they cannot carry a thought from one human to another without a human involved at some point.

   AI will be a tool for artists to augment their work, perhaps to make a really bland first draft and generate some random ideas, and then work from there. It will change how art is made, but not fundamentally. We didn't always have a backspace or undo button. We used to make handwritten drafts. Convenience is always changing the game.

   But it sucks because we need to figure out this whole new thing now. Teachers need to figure out how to teach English like there's a calculator for it. A lot of things like travel brochures are about to be first drafted by a machine, then edited by a person, then shipped, and you won't know the difference. People will lose jobs, just as truckers will presumably lose theirs if self-driving vehicles become more widespread.

   In the long term, the real problem is that people who are relying on their craft now will be replaced by a robot that can do the same thing more cheaply and with no human rights to consider. And we live in a country where that is a terrible thing, where needing to start over and find another trade is simply unrealistic yet totally necessary. AI is only a problem if the only lives it improves are those of the people who buy it, and not the society that finds its same labor requirements met while being left with more free time.
