Because AI art, as it is commonly used nowadays, lacks intentionality (the thing that makes a urinal art).
If I read a book, I used to know that every word was put there by the author with intent. If I read AI-generated text, it doesn’t convey anything that a human has put out there for me to experience. I’m looking at the formatted output of stochastic models.
That’s a good point.
I’m thinking of art in the visual sense, with the creator being the person prompting the image generator, which I think meets the intentionality standard.
But there are a lot of ways folks can use AI tools that aren’t intentional, and I haven’t been considering that.
My stance isn’t 100% changed, but I will start considering intentionality.
Related, but maybe not.
Some years ago, I was a slightly older student with a deep well of photography experience entering a newer graphic design program, and some of it seemed amateurish to the point of being a joke to me. My “Digital Art” class was like that: the average assignment was to cut and paste things together and apply x number of Photoshop filters to them. It was an easy A, so whatever.
I remember that for one of those assignments, I just took it as an opportunity to digitize some prints I’d made. I had taken some black-and-white shots at night of a local train station, which is pretty scenic and considered a landmark. They were moody and foreboding, and also slightly soft because I don’t have great darkroom technique.
I pumped up the brightness, threw on something like a papercut/rough-edges filter, and layered the whole thing with a non-transparent blue gradient that made for this sort of cyanotype effect.
Later that year, we were told to submit something to a student art show, and I printed that assignment out on the student printer. I might have been the first to print, because the printer hadn’t run in a while and the blue print head was sort of clogged, so the thing came out this shade of green instead; the cyan just didn’t print heavily. (But it didn’t band, either, so…)
I submitted that because I didn’t want to pay to reprint it, and that was that.
At the art show, someone asked me about it, and I told them that I had originally made it that way for a class project. I liked the blue for some reason I now forget, but then it printed incorrectly, and I liked that too, so I didn’t reprint it. I may even have said something cute about not being able to intentionally reproduce that print failure (they cleaned the machine right after my ‘failed’ print), so it’s sort of bespoke.
A peer later asked why I didn’t just say the green was intentional and make up an excuse. I sort of lost respect for him for that, because it wasn’t my intent.
Which is to say, I guess I respect even unintentional screw-ups, so long as their presentation isn’t wrapped in falsehoods.
A book that is AI-generated, minimally edited, and not really written by the person on the byline, then passed off as human work, is not art; it’s just fraud. An AI-generated book created with prompts from someone who knows how to write, edited well to eliminate the AI weirdness, and presented with an indication that the writing was largely done by LLMs - well, I guess I think that’s art.
AI art passed off as traditional art, or unintentional AI art passed off as intentional, is fraud.
I guess that’s how your very good point fits into my conceptual framework. If it’s not offered in good faith as art, and explained as art, then it’s fraud. But AI art offered in good faith is art.
Edit:
I’m sorry some folks are downvoting you. You’ve been respectful and open-minded throughout our whole interaction.