I know I’m more confident than most that LLMs are incapable of producing art. You are correct that it has not been disproven that LLMs might someday produce art, but there is also no current evidence that they can. We’ll see how it ultimately plays out, but allow me to explain why I don’t find it likely that LLMs are the technology art will ever come from.

Inherent Limitations

LLMs are fascinating and useful for a lot of things, but they are not intelligent. A “neural net” which “learns” through exposure to training data is more sophisticated than any other method of text and image generation we have yet invented, but compared to the system it’s meant to resemble, it is hopelessly outclassed. We can’t currently make something that resembles a human brain because we don’t have a firm grasp of how one works at all. What little we do know indicates a level of complexity that might literally be beyond human comprehension. A brain is made of billions of neurons connected to one another at trillions of points by branches. At any given moment, these trillions of branches are sending and receiving not in binary but through various combinations of neurotransmitters. At a basic level, we know that the result of this neurotransmitter activity (which differs by the area of the brain it occurs in and is even highly variable between different brains) is a mind made up of some kind of consciousness, subconsciousness, and instinct. This system was not designed by human minds but is the result of eons of natural selection. We have no idea how to even begin replicating something like this, although neural nets could be a step forward. We would need a much larger system working in a fundamentally different way, one we may not be able to build with our limited faculties.

The Quality of Art

In my opinion, of all intellectual processes, the production and appreciation of art is probably the most demanding of the system I described above. There is a continuum from concrete to abstract, and while computers are excellent tools for storing and processing concrete data, art falls on the furthest end of abstraction. Mathematics and the natural sciences are often clearly quantifiable. The social sciences, dealing with social constructs that change according to variables we are not fully aware of, including the interactions of billions of the above systems with one another, are significantly more difficult to quantify, though still possible. It is not possible to quantify the quality of art. What makes good art? We have no idea. We have never had any idea. Art is not quantifiable and may often be appreciated on a level beyond our ability to describe or even understand. There is absolutely no guide to making good art, and there can’t be. Every attempt to define art has been defied in ways considered more expressive and more artistic than the limited products a definite process can produce. At its highest level, art is the pure expression of intentional and/or unintentional meaning from one mind to another, often on levels we aren’t even aware of. A machine running sophisticated word-association algorithms on a tiny fraction of the computing power a typical brain has is simply not powerful enough to accomplish what a human can.

The Human Element

LLMs are not aware in the way a human is aware, and they couldn’t be. Although I think it’s possible to create a true Artificial Intelligence, and LLMs may be a step in that direction, no AI is going to be able to understand the human experience because it can’t have one. LLMs don’t have needs or desires; they don’t have relationships or a reason to form relationships; they don’t even meet the basic requirement of life: maintaining a system against entropy. These are things most animals with a nervous system more developed than a worm’s can act upon. Building on these animal needs, our neocortices additionally allow us to have thoughts, rationally solve problems, make plans, and form and store memories. Computers have an easier time with some of those things because they have fewer biases, but we have biases for reasons good and bad, and this is relevant to art. An artificial mind that has never had to survive and seek satisfaction in this world, and that lacks even the basis to do so, is never going to create something meaningful to a human mind except by sheer accident. If a true AI does produce art, that art will be most meaningful to other AIs rather than to us. LLMs are mindless machines which can only imitate; they don’t have the foundation to produce art themselves. The best they could ever do is challenge the kind of writing done with the least effort, the kind most reliant on common tropes and clichés. The best they could be is a shadow of what we are capable of.

Conclusion

With all of that considered, I actually do think that LLMs will get better at applying human language and may even become capable of replicating writing styles we find appealing when they are used to tell the stories we enjoy. They may be able to generate ideas we find appealing as well. However, just as we are sometimes left feeling hollow by something we see or read that we thought we wanted, I think the most important things will always be missing from AI-produced text and images compared to art from any human.
