Thanks for some necessary reading! My hot take:
I think another issue here is that too many people have fallen for the idea that the brain is simply a fancy computer. All we have to do, then, is build a computer that is just as fancy as the brain! Then we have the best AI — another brain we can rent out, that does what we tell it to do, because it doesn't have any of those pesky emotions, or personalities, or opinions, or any of those annoying traits we find in other people. Plus, we can pay less for AI because we don't have to feed it and it doesn't have snotty-nosed kids to care for. This is the promise of AI — it's the ultimate slave! All of the work, all of the product, all of the profits, and none of the effort.
The brain, of course, is not a fancy computer, and furthermore, it's housed in a body, which is probably more properly seen as an extension of the mind. That's why a lot of people use the term "bodymind" these days; you can't so easily separate the two in reality. What is missing from AI is embodied knowledge — that's where creativity comes from, because everybody has their own bespoke bodymind. AI scans all the products of all the bodyminds, but doesn't work like a bodymind itself. It can produce a simulacrum, but never the thing. AI has no taste, both literally and figuratively. As you point out, it doesn't matter how many artworks it can "consume."
Oh God yes. This idea that the mind is a fancy computer, and so if we build a fancy enough computer it will be a mind, clouds our thinking. I'm always telling people who worry about robots taking over not to sweat it, because they would have to want to do that. That is, an AI would have to have DESIRES, and for that you need to be embodied. A big, messy body with all its needs, hungers, and fears.
Our nervous system, and by extension our consciousness, is basically an immune system amped sufficiently to enable us to jump out of the way of an oncoming bus. Or protect our social status, or find someone to fuck, or maybe even to love, or, failing that, wolf down a bag of Three Musketeers Minis in despair. LLMs are not even close.
Completely agree with Gabriel, and with Andrea's addendum as well. I think her implication here is routinely overlooked in discussions about what AI can apparently do. AI looks at surfaces and (re)creates surfaces. Humans, at least thoughtful humans, sensitive humans, have depth. This relates to the reality of the bodymind, which is not only about feelings, dreams, intuitions, and other phenomena that appear to arise independently of the conscious mind, but about the fact of our physical, organic reality: we are bodies with families, histories, physiological capacities and incapacities, and (crucially) the conscious awareness of our own mortality. No "intelligence" arising from electricity, silicon, and so forth, even if deemed "conscious" at some point (by whose standards? a quagmire, that), can approach the surfaces of the input it receives with a bodymind's sensitivity and awareness. This is why an AI's output, even when it apparently mimics a human's, still falls (far) short. What any individual consciousness seeks via the art it enjoys is, deep down, the recognition of a conscious other reaching out from its depth to their own. Computing power isn't everything.
Well-written and thought-provoking, Gabe, and the comments from others as well. Regarding Saint-Saëns, there’s nothing of his that has grabbed my passion or imagination, which is unusual. On the other hand, Bertrand Chamayou gave Saint-Saëns’s Piano Concerto a plug when I talked to him after a performance. Will give it a go.
I wonder: would it be possible to write music that would be unassimilable by AI? What would you do? Never develop themes, constantly change harmonic reference, never get on a rhythmic grid… Hmm… nah.
It is much too early to venture any valid ideas or predictions. It is in the nature of a creative human to take a tool and make use of it creatively. That is probably the beginning and the end of it, with much babble in between.
I agree that human finitude becomes ever more important in the wake of AI: the finitude of experience, the finitude that manifests in the struggle to create, the finitude of the results. The tech gods see all of these as things to overcome, but trying seems not only foolhardy; we would lose what makes us human in the process.
I loved this essay so much. The idea of limits, and friction. “Another way of saying this: art is an expression of process. If you strip art of its process, it ceases to be art.”
Your last line brought me to Wallace Stevens’ “The Man with the Blue Guitar.” Whether the allusion was accidental or conscious, it reminds me of that other magic that happens across timespans and artists: the way shared knowledge, history, and experience comes through our subconscious. The idea of archetypes in art, the cyclical way we encounter ideas and questions that others before us have met in their own ways, the conversation that happens subliminally, not on purpose, is perhaps more magical still, and speaks to our innate human experience (qualifying it as a “human experience” felt redundant… until I remembered the title of your essay…).
Not crazy about the Saint-Saëns slight; that might be a bit of received wisdom that could perhaps be revisited. But I do get the point, although someone like Prince (or Haydn) more or less knew everything, and both had distinctive voices. Perhaps those exceptions prove the rule. You are right of course about AI, and you allude to the terrifying tweak: boutique AIs fed a limited diet, perhaps curated by artists themselves (please god no) and plugged into streaming playlists, emulsifying process and product into a hideous simulacrum of art.
Wonderful essay, Gabriel.
In particular, your assertion that a limited viewpoint is what makes for authenticity in art was something I hadn't considered, or at least not in that way.
I think one can generalize everything you assert about AI-created art to the formation of any new ideas, not just artistic ones. I've spent a lot of time with ChatGPT and related generative AI models, and I can reliably assert that, short of general AI coming online (not likely anytime soon), generative AI can't intentionally produce any new ideas. And even when it does produce one by accident, it takes a human poring through a sea of AI-produced garbage to recognize and properly contextualize it.
Of course, art isn't really about new ideas, so much as it is about relaying perspectives, as you say.
That doesn't mean a lot of "creative" jobs won't be lost to AI. But it will likely be analogous to the existing software automation that has already taken many of the jobs of copyists, engravers, assistants, studio engineers, etc. The business models of human artists will probably evolve, but not much faster than they already have since the invention of the jpeg, mpeg, and mp3 formats. I don't see demand for human-produced art fading within our lifetimes.
And besides, there are *far* more profitable ways to employ AI than producing text/images/audio/video. The current generation of generative AI models is just a proof-of-concept, and the only reason they started there was because the datasets were easy to obtain. We're probably at least a couple of business cycles away from anything truly groundbreaking, in terms of benefit (or harm) to society. Finance, coding, and engineering would be infinitely more profitable areas for AI companies to pursue.