To be fair, I DID make a sidelong reference in my original post to OTHER reasons that AI might be bad for the art industry. I would only debate the 'copyright theft' angle.
You seem to be mixing up legality and ethics to some degree, though. It’s not only about “copyright theft” (more accurately, copyright infringement) per se. It’s about the ethics of taking an artist’s body of work and using it to train a neural network without consent. That those artists should have input as the rights holders is an important concern not primarily because of the legality, but because their retaining their rights, rather than releasing the work into the public domain, means they cannot be assumed to be okay with whatever novel use people come up with.
Right now there hasn’t, as far as I’m aware, been any legal challenge that can serve as precedent for whether feeding the work into a neural network is legal. Nor am I aware of any proposed legislation around it. I am aware that corporations have lobbied for “orphaned works” exemptions in copyright law, and those are, or would be, a very direct legal threat to, and undermining of, artists’ rights. I believe the concept of an orphaned work (as these corporations would like to define it) could be used as leverage against artists in future legal challenges over generated art. So yes, there are legal concerns, but those mainly exist as a future concern.
The fact stands that computers do not, and cannot (at least with current technology), learn art the way humans do. Not least because computers don’t understand any of what they learn. Comparing machine learning to human learning, and equating the indiscriminate feeding of images into a training data set with humans using references, comes off as either disingenuous or short on understanding.
If there’s no human verification, at the absolute minimum, that images are posted legitimately (by the rights holder or with the rights holder’s permission), you cannot escape situations like the one in the Ars Technica article I linked. They will keep coming up, and unlike DMCA takedowns for the source images (which could at least remove them from future data sets, or future uses of URI-based data sets), there is no way to “untrain” a neural network on a specific image. It’s the “once it’s on the Internet, it’s out there forever” thing turned up to 11. Until legally challenged (and I’m not a lawyer, so I have no idea how likely such a challenge is to stick), that’s a solely ethical issue, but one that to me has a pretty obvious answer.
I get that you feel strongly about this because you do play with these tech toys, and you do take requests from people to play with them on their behalf. But that’s the thing. You’re getting value (in views/likes/followers) out of something that builds entirely on the work of others. I’m not saying this as a character judgment on you. I’m saying this because I can’t think of a way, other than being concrete, to communicate that these artists’ work is a prerequisite of your tech toy existing. Of the images you generate existing.
My art might not have been quite the same without my exposure to Lena Furberg, to Bamse, to any number of artists. But the hypothetical non-existence of their work would not automatically preclude me from creating art. Practically all children draw, and their earliest works almost universally depict things and people they see in real life: themselves, their families, pets and cars and trees and flowers. While symbols play into it (the sun isn’t literally a yellow circle with straight spokes sticking out), those symbols are not the part of art that anyone objects to being included in training data.
Without the images being fed into them as training data, these tech toys would at best produce absolute nonsense: flat colors, pixel soup, errors, I don’t know. They derive their value directly from other people’s work. At that point, the least one can do is obtain informed consent.
Because otherwise, you do risk heading into a future where more and more content is heavily watermarked and/or hidden behind a paywall. And I know that’s something consumers of furry art kvetch about to no end.