AI as an Outgrowth of the Rot Affecting Technological Culture as a Whole

Let me put aside the irony for a minute and explain why I think that AI technology, as it is sold now, is bad. It may take a minute, but it is something I have been trying to formulate for a long time. It will be simplified, but I am not prepared to write a book on this… yet. It will contain bold claims that lack nuance, because that’s how outlining general concepts works. In the same way that a pen and ink drawing may contain more truth about a person than a more detailed photograph, I hope that my outlining the issue will clarify some basic truths about AI.

Prior to 2009, technology was sold to the consumer on the basis of basically two things: the promise of eliminating drudgery to allow more time for creative expression, and the promise of increasing pleasure. Why did you want the latest video game console? Because it promised to be more beautiful and allow for more fun. What happened in 2009? Well, it’s an arbitrary cutoff point, but it’s roughly when Facebook overtook Myspace in the cultural zeitgeist, and the point where social media started to become work.

Facebook was never fun. Don’t let anyone lie to you. You joined it because it was exclusive and you wanted to connect with someone on there. If it didn’t exist, that person would still exist and you could still connect with them. However, Facebook used manipulative means to become the only place to reach people. Myspace was obnoxious, but it was mostly harmless anarchy. It is still amazing to me how much unserious crap people posted on there. Because no one put their finger on the algorithm to boost certain people and not others, you still had cool stuff like people from bands in your favorite microgenre reaching out to you because they liked your music. It also got worse over time as it was bought and sold by billionaires who were completely out of touch with what people actually wanted out of the site, allowed bots to inflate growth, and added a host of unwanted features. Sound familiar?

Facebook was a different beast. Jaron Lanier wrote a book about it. The gist is that they used subtle manipulative tactics for social control by increasing outrage, among other things, but not until they had a critical mass of people hooked. None of this was fun, but you couldn’t leave because they held your friends and family hostage. Sure, you could still contact them through other means, but they designed Facebook to make leaving feel isolating. Have you ever tried deleting it? They will show photos of people you know with a message about how much they will miss you. I have lost contact with dozens of people I really liked because I don’t use social media and they have forgotten I exist. They don’t think to text or email. I will occasionally send emails to them and months will go by without hearing from them. They have been programmed to ignore emails and check social media constantly. I know it doesn’t sound real, but that’s how it works. From a practical perspective, Facebook stole friendships from me and gave me nothing but outrage in return. If you are hastily typing something about personal responsibility, you don’t get how these technologies work (or how people work), and you are a prime target for their manipulation tactics.

I imagine “Wait, I thought you were talking about AI?” has passed through your head at this point. I am. “AI” is just the latest in a long line of products crafted with a particular philosophy in mind. This philosophy is best expressed as “we have decided this is how things work now, don’t ask questions.” Do you know how many people have told me that they would love to leave social media, but they can’t operate as an artist without it? A student of mine in a media studies course at UCSC penned an essay about how social media led to an eating disorder that left them hospitalized, and how, despite this, they remembered lying in the hospital bed feeling excited about the comments on their posts about it. This is what a technology that brings no real joy or beauty is doing to our kids, and no one is doing anything about it. The companies will face no repercussions for this. We talk about it like it’s an earthquake. “AI IS HERE, YOU CAN’T NOT USE IT.” We’ve all heard it a thousand times.

I cannot restate this enough: this is not why people used technology for the majority of my life. Of course, it always had a downside, and, of course, certain types of unskilled labor were replaced. That’s what technology does. I am not making the well-trodden argument that technology is a Faustian bargain. I am saying that there has been a shift: instead of losing something to gain something, we are now losing something to lose something. What do we gain by letting AI make our music for us? What do we gain by letting Facebook run our social lives? What do we gain from Elon Musk promising us for the thousandth time that self-driving cars are going to be awesome? What do we gain by allowing children to gamble on loot boxes? What advantage does crypto provide to anyone who isn’t a criminal or a speculator? Is anyone going to defend NFTs in 2024? None of these technologies offers an advantage that we exchange something for.

GPS is a Faustian bargain. I no longer get lost, but I lose the sense of adventure that comes from exploring new places. There is a clear promise of a tangible advantage in exchange for my personal agency. If I bought a self-driving car to go to Boston in, I would get there slower than the train, be unable to work while “driving” as I babysat the car, and still have to find parking. And this is just the most innocuous example. At least the rabid Tesla devotees get to enjoy the momentary fun of a robotic car parking itself.

AI isn’t really a technology. There is no device, technological principle, or technique called AI. If you read a book on programming, you will not find an “AI” algorithm. It’s a category of techniques so broad that it encompasses all of computing. This was not invented in 2022. Corporations use the term so that stupid people picture a brain in a vat making all these disparate things happen. It’s a marketing buzzword. Originally, “artificial intelligence” simply referred to basic logic processors like CMOS chips, because they could perform some tasks we had previously thought were exclusive to humans. Modern “AI” is an outgrowth of attempts to model various parts of the human mind or perception, but that’s actually just what all computing has ever done… ever. I wrote a dissertation on the subject. The new form that makes visuals or writes bad essays is based on a sort of new technology (not really, but we now have the processing power to make it work reasonably well) derived from neural networks, which are essentially a way of measuring similarity without understanding the underlying processes. Another way of stating this is that they are a way of modeling pattern recognition. This is helpful for dealing with large sets of data where creating mathematical or logical models would prove impossible. You don’t really ever learn why it works; it just relies on you to subjectively agree that it does. How else are you going to prove that it’s doing what it’s supposed to be doing? More AI? No, you look at whether the weights of the system are producing images of dogs from the database of dogs. This is, incidentally, why they came after the arts first: it’s not actually good at replacing humans at tasks where success or failure is measured objectively. It works great when evaluated subjectively by uninformed masses of people.
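If you want a concrete picture of what I mean by “measuring similarity without understanding,” here is a minimal sketch (a toy example of my own, not any company’s actual code): a tiny perceptron that learns to tell two clusters of points apart. Everything it “knows” is a handful of numeric weights, and the only way to check that it works is to look at the outputs and agree that they match the labels.

```python
# Toy perceptron: learns to separate two clusters of 2D points.
# The "knowledge" it ends up with is just three numbers (two weights and a bias).
import random

random.seed(0)

# Made-up dataset: points near (0, 0) are class 0, points near (3, 3) are class 1.
data = [((random.gauss(0, 0.5), random.gauss(0, 0.5)), 0) for _ in range(50)] + \
       [((random.gauss(3, 0.5), random.gauss(3, 0.5)), 1) for _ in range(50)]

w = [0.0, 0.0]  # weights
b = 0.0         # bias

def predict(x):
    # "Similarity" to the learned pattern: a weighted sum pushed through a threshold.
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

# Training: nudge the weights whenever the guess is wrong. No rule is ever written down.
for _ in range(20):
    for x, label in data:
        error = label - predict(x)
        w[0] += 0.1 * error * x[0]
        w[1] += 0.1 * error * x[1]
        b += 0.1 * error

# The only "proof" it works is checking that the outputs match what we expected to see.
correct = sum(predict(x) == label for x, label in data)
print(f"{correct}/{len(data)} correct; learned weights: {w}, bias: {b}")
```

The learned weights mean nothing on their own; you can only point at the outputs and say “close enough.” Scale that basic idea up by a few billion parameters and you have the stuff currently being marketed as “AI.”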

This doesn’t mean they are theoretically useless. AI products that eliminate drudgery are helpful. Unfortunately, all they seem to do is amplify it. Instead of getting “Hey professor, I am siick today, I drank a whole carton of spoiled milk,” I now have to read through paragraphs of verbose nothingness generated by ChatGPT. Why is this? Because professors have been disempowered from making the subjective judgments that would allow them to ban AI, not because these papers and emails are convincing (despite what people will tell you). If you want to spot people with no background in literature, look for the ones who think that ChatGPT is a good writer. Nick Cave is doing just fine financially. If you created a perfect clone of him, people would still show up for the old guy and his band because they (actually “we” in this case) feel an emotional connection to him and his art. When he wrote the following, it was not because he had anything to lose from it; it’s because he had read enough good writing to know soulless crap from real poetry:

What makes a great song great is not its close resemblance to a recognizable work. Writing a good song is not mimicry, or replication, or pastiche, it is the opposite. It is an act of self-murder that destroys all one has strived to produce in the past. It is those dangerous, heart-stopping departures that catapult the artist beyond the limits of what he or she recognises as their known self. This is part of the authentic creative struggle that precedes the invention of a unique lyric of actual value; it is the breathless confrontation with one’s vulnerability, one’s perilousness, one’s smallness, pitted against a sense of sudden shocking discovery; it is the redemptive artistic act that stirs the heart of the listener, where the listener recognizes in the inner workings of the song their own blood, their own struggle, their own suffering. This is what we humble humans can offer, that AI can only mimic, the transcendent journey of the artist that forever grapples with his or her own shortcomings. This is where human genius resides, deeply embedded within, yet reaching beyond, those limitations.

When you encounter a real writer (and I do not consider myself a very good one, despite a few moments of brilliance spread very thinly across a few dozen albums), doesn’t it wake you up a bit? When you experience something great and new, doesn’t it make you feel like all the pastiches and repeat listens of songs that used to mean something to you are kind of pointless? Reading just this little blog post from Nick Cave makes me want to throw everything out and go back to a place where I am no longer measuring against some arbitrary standard of quality but by the extent to which I can make some contribution beyond a simple retread. Ironically, AI makes me feel this way as well, since it has solved the problem of perfect pastiche (okay, not really, but good enough for marketing execs). The technologies of the past never really made me feel this. Perhaps auto-tune, when used to homogenize sound rather than for artistic effect, had a similar cultural impact. Admittedly, the increased quantization of music, the reduction of dynamic range, and so on have laid the cultural groundwork for AI.

Most music technologies of the past were largely about enabling new ways of making sound or making it easier to perform tasks that were previously challenging. I guess churning out fully realized (bad) pop songs you have no control over was more difficult before. Really, though, instead of promising more unique or efficient ways of making sound, distributing it, and so on, AI is promoted as “Hey, use this thing that sounds bad and generic or else you aren’t in the technology cool kids club.” I have been told by people who, I can guarantee, turn their computer monitor to a 90-degree angle to rotate a PDF, that I am hopelessly behind the times if I am not typing text into a box and having it spit out bad clones of other people’s music. There has never been a product so obnoxiously marketed by people who are not getting paid to promote it.

Unfortunately, these people have already been trained by social media to believe that the point of their existence is to endlessly hype things in the hope of being noticed. Their brains are cooked. No one is saying it out loud (except for me, right now), but the trade we are accepting is the degradation of aesthetics and the loss of opportunities for artists, paid for with our personal agency. It doesn’t have to be this way. OpenAI probably keeps a database of all generated text and could allow schools to search it to make sure students aren’t using their product to cheat. Why don’t they? Because they like the hype that destroying our education system generates. That’s it.

I used to be excited about neural networks as tools for artistic expression back when my friends in grad school were developing the tools that have become so common now. I thought that when artists got hold of a version of these with more streamlined control and interesting interfaces, we’d see some really exciting art that broke these technologies open. What we actually saw was an attempt to make an end run around artists by rushing to market with exceptionally tacky tools aimed at the lowest-common-denominator end user. What has resulted is what we should expect from this modern technological era: scams, extortion schemes, political manipulation, further erosion of public trust, sexual exploitation, and the elimination of work for artists who have been consistently hurt by technology for the past 20 years.

I don’t love the old technological era or the old technocrats, but I have a certain respect for the fact that they promised, and to a certain extent believed in, a future where technology existed to make people’s lives better. Steve Jobs, despite everything one could say about him or the actual impact of his companies, really wanted to empower artists and musicians, because he loved art and music. Steve Ballmer really wanted to please people. I have political problems with both men, but the important point is that the relationship between someone who has something they are excited to sell you and someone who genuinely wants the thing being sold has been fundamentally broken.

Despite having more exciting technology than any previous era, we have very little to show for it. VR technology is precisely at the level of my wildest childhood fantasies, and yet there isn’t a single great game or immersive experience I can think of. I haven’t heard a single piece of experimental music in the past 20 years that has shaken my thoughts about sound to the foundation the way music from previous eras has. Why? People with money are not dedicating resources to any of it. They don’t respect the artistic process enough to fund it. The artists and musicians with something to say simply don’t have the time or resources to do it. They are out driving unlicensed taxis or delivering Taco Bell. Who do we have to thank for that?

I can’t think of a single person who thinks society is better off now than before social media, but are those who are socially engineering our culture trying to fix this? Is someone competing with a better model to sell us a better society? Part of the issue is that you cannot have market competition with massive monopolies. The other part is that getting your customers addicted to something makes it a lot easier to do whatever you want to them. They can cut their dopamine with whatever they want and we’ll keep coming back.

None of this will stop until the cycle is broken. I don’t expect people to have the willpower, reasoning, and information I have, so personal intervention is out. Government intervention is also out (and is a big reason why technology has gotten so bad). The only real chance of this changing is a large-scale cultural shift. I think, perhaps, educators have a role in this, especially in colleges, where we are producing the next generation of leadership. Regardless, this is why I remain grim and cantankerous on the issue.

It is okay to say no. You are not backwards for saying no to AI any more than you were when you said no to vapes or Kickstarters for bad video game consoles. Having all of your music generated by typing Pandora-style descriptions into a text box does not allow artists to create new, innovative things, so any comparison to the invention of the synthesizer or the introduction of computer music is invalid. The folks who are feeling smug about using AI to churn out garbage for Red Lobster commercials aren’t going to feel so smug when the advertising exec paying their bills realizes that what they are doing takes zero skill.

My advice is to ignore this crap and focus on the hard question of what you really want to do with your short time on this earth. Find the thing inside you that is left when you aren’t seeing your art through others’ judgment. Work hard, make new beautiful things, and don’t trust people with complicated patterns on their shirts.

As always, these are my personal opinions; they are not endorsed by, nor do they necessarily reflect the views of, Berklee College of Music.