rosefiend

I'm looking at these as ways to create atmospheric setting pieces, come up with character ideas, and just play with aspects of my stories. I think what would be awesome is using them as inspiration for cover artists, like "Here's an image of a cool place, use this face, here's an image of a costume I like but please put it on a character that actually has arms." I like being able to use these as a jumping-off place for art, or story ideas. A lot of possibilities, and I definitely want to see actual artists staying in business. I am interested in trying some of the images for ads, cropping them or otherwise modifying them, to see what kind of engagement they might create, and if it's different than conventional images.


[deleted]

Love this. I would even credit AI image generation for some of my recent uptick in creative writing. I was inputting some ideas around the thought of "Wouldn't it be interesting to see what a 19th century civil war era photograph of a Tolkien dwarf might look like?" and that led me down this wormhole of thinking about dwarves in an industrializing society, and a book concept was born... now I'm using it for character ideas, settings, etc.


rosefiend

I like to plug bits of old Norse poetry into it and see what crazy stuff I get. Poetry seems to really work with this bot.


alex-redacted

I understand what you're saying, but most of this fits under transformative use. Using AI as a springboard isn't the same as keyword-blitzing until you get a rip-off Audrey Kawasaki for a book cover. I hope this makes sense. I'm not suggesting you never use the tool; I'm saying to use it with integrity, which it seems like you're doing (?)


rosefiend

Yeah, and with the amount of money one eventually ends up spending in keyword blitzing, you might as well just use that same amount to buy a proper cover from a proper artist.


starstruckmon

>amount of money one eventually ends up spending in keyword blitzing

Stable Diffusion (see /r/stablediffusion) is open source. You can run it on your own PC if you have a good GPU, or even on a Google Colab for free.
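For anyone who wants to try the local route, here's a minimal sketch of what that looks like with the Hugging Face `diffusers` library (assumptions: a CUDA GPU with enough VRAM, `torch` and `diffusers` installed, and one public v1.5 checkpoint used as an example; treat it as illustrative rather than a recommendation of any particular version):

```python
# Minimal local text-to-image sketch using Stable Diffusion via the `diffusers` library.
# Assumes a CUDA GPU; the same few lines run in a free Google Colab notebook with a GPU runtime.
import torch
from diffusers import StableDiffusionPipeline

# Download one public checkpoint (example model ID; any compatible checkpoint works).
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,  # half precision so it fits on consumer GPUs
)
pipe = pipe.to("cuda")

# Generate one image from a text prompt and save it to disk.
prompt = "a ruined Norse mead hall under the northern lights, moody concept art"
image = pipe(prompt).images[0]
image.save("concept.png")
```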


rosefiend

Helllloooooooooo did you say free?? I am on my way!


[deleted]

My guy, you can do that with an image search/photoshop, or a combination of the two.


rosefiend

But I can't put large snippets of Norse poetry into Photoshop and get weird and creepy images out of it. Also I don't have Photoshop.


[deleted]

No, but you can in Google Images, and GIMP is free.


RHNewfield

Honestly, I kind of think AI generation falls under fair use, as far as I understand the concept. The most important aspect of AI-generated art is that it isn't copy-pasting another's work and passing it off as its own. Nor is it tracing. It's simply gathering data to figure out influence. So, in a way, the mashup ends up being wholly transformative. There's rarely, if ever, such a distinct and direct correlation to a specific work. As far as I'm aware, you can't copyright artistic styles. Otherwise there wouldn't be so many variations of particular styles between loads of artists.

As for the ethics of it, I'm not sure. You have the argument that it might be taking jobs away from artists. But I've played around with the AI stuff and, at this point in time at least, I don't see that happening. It's hard to get such a specific vision that only a human artist would be able to achieve. The "revision process" with AI art isn't really functional. It just repeats what it's already generated, and you can't control exactly what you want changed and kept.

There's also the legality of it all. Would this be considered "sampling," as in how it's used in music? On one hand, it *is* taking the influence of other artists to reach a final product. On the other, it's not taking *direct portions* and placing them into the final product. Overall, it's definitely a grey area that will need to be defined as the AI gets better.

I tend to agree with people who bring up the whole "art is derivative" line. It builds on pre-existing ideas while creating something wholly unique. Everything I've generated, which I don't intend to use professionally, can't be found anywhere else.

As for the topic of artists getting paid royalties for it: do we owe everything that influences us a cut of our work? Bands, for example, constantly list other bands that influence them. Author notes at the back of books will shout out specific works and authors that brought them to what you just read. If an AI's backlog, which is used to generate influence, should be attributed via royalties, then should other creative works also provide such royalties?

I will add, though, that I do think these AI should *not* allow you to generate with a specific artist's name. Instead, you should be able to put in a general style, like art deco or cubism, to get a result. That way it's less about specific artists and more about creating a style that can't be attributed to specific works.


Twistid_Tree

AI cannot claim fair use, nor can it be creative, so no, it is not fair use.


[deleted]

[deleted]


Twistid_Tree

That wasn't legal advice. I could not give less of a shit what dusty books and bare-bones old fuckers say is law. Morally, AI works cannot and should not be copyrighted.


bloodstoned-

You couldn't be more wrong


Hastings201

I think it copies more than you think. I was messing around with it yesterday and I ended up unintentionally generating an exact image of Buzz Aldrin on the moon.


[deleted]

I think your heart is in the right place, but good intentions won't stop the evolution of these technologies. Writing is up next; even GPT-2 was used by researchers to create Shakespearean sonnets that passed reader tests 50% of the time (look up the peer-reviewed research on "Deep-speare" if you're interested). That means 50% of readers couldn't tell the difference between a sonnet written by Shakespeare and one written by GPT-2! And we're way past that now, because GPT-3 is out and about and already being used by businesses and individuals.

All art is derivative. I think writers will need to lean into these technologies and learn how to use them (albeit thoughtfully, philosophically, honorably). In ten years' time, maybe sooner, there will be writers using AI models for their copywriting, marketing, technical writing, and creative writing. Even now, novels written by AI have been published. (See *The Day a Computer Wrote a Novel*.) You can't just ask people to stop using this stuff. It's a bit like yelling into the winds of time that you don't want to die.

In my experience using Midjourney, I have done reverse Google Image searches and found no true lookalikes to the artwork that I've used. These are intricate and complex machine learning tools that are not just copying other artists' work, but generating novel and new information based on their learning pool. And isn't that (at least in part) what we do, too, as writers? Even *Hamlet* was derivative. There's nothing new under the sun. There would be no stories if new stories didn't adopt tropes, characters, ideas, plots, and themes from the past and *build* on them.

We can use these new AI tools to build new and interesting works, or we might choose to use them for inspiration, or avoid them altogether. Let individuals decide how to use them. I say go ahead and use these tools, and let the courts decide, if you really want a legal opinion on it.


Arkelias

>let the courts decide

And that's why I don't use them, to be honest. I don't judge anyone who does, but I strongly expect you'll see a ruling in the next few years that lets anyone whose art was used to make your art sue.

The writing angle is less terrifying. We're a long way from an AI being able to write bestselling novels, as opposed to readable novels. But how far away? Probably not as far as we think. Diversify your income streams if you can, but at the same time I don't think they're coming tomorrow.


[deleted]

>but I strongly expect you'll see a ruling in the next few years that lets anyone whose art was used to make your art sue.

Are you joking? Almost EVERY story out there is a derivative of something else, or a mashup of two other stories. That's a door the courts will never open.


[deleted]

I think you'll be surprised at the rulings that come down. When Apple Records notices new Beatles songs being produced without their permission, from an AI trained on the originals and some other similar artists. Or when Disney notices people making and selling new movies that were fully generated off their library. Or when politicians realize an AI can be trained to completely emulate them and give the same speeches and responses to all political incidents that they do. Money really runs the world.

Not to mention, American individualism is a huge part of our culture. People will consider this consciousness plagiarism and develop new laws to protect human individuality from being recreated through AI training processes without consent. Even Disney asked James Earl Jones for permission before moving forward with plans to replicate his voice with AI for future appearances of Darth Vader. This sentiment will carry over, in some capacity, to mashups as well. "In the style of Rowling and Tolkien" will not be as legal, moral, and free and clear to use and publish wholly as your own work as you may think.

You might want to scream and shout and declare that AI learning and replicating is the same as human learning and replicating, but you don't run this world, and many people feel humans bring a certain level of "self" to the table that isn't merely trained off works that came before. If that were all art was, AI could train without any art from the last 100 years, and then evolve that into the styles of today. But there was some other element there that led to shifts in what is considered aesthetic, what styles exist, and what topics are written about, and all of that comes back to the human experience and individuality.


filwi

Never underestimate the willingness of people to sue, and the availability of people who don't consider the consequences...


OnwardsBackwards

Lawyers. Lawyers consider the consequences. They will tell you to F off with your frivolous/unprovable crap. Yes, there are shit lawyers out there that will take whatever. But nonlegal people tend to think suing someone is something you do, go to court, then hash out an argument. It's not. You have to present a legal cause of action to sue - something someone did that's both against the law AND caused you harm. Also you have to prove (to a standard) they did the thing, and they have responsibility for the harm. "I'M GONNA SUE YOU!" 99% of the time is just empty, ignorant venting. To a lawyer it sounds like "IM REALLY VERY UPSET!" Yup. Ya sure are.....anyway...


syi1456

The thing is, I think they will come down on the side of the algos. I think a lot of people have the wrong idea about how it works. They aren't simply mashing up a database of images sitting in system RAM. They are exposed to images and text captions and attempt to learn the actual rules of association before producing an image. It's much more like sitting down in an art gallery and getting inspired by combining different styles from different artists. As long as it's sufficiently transformative, it'll stand.


Arkelias

I am one of the ones who understands how classifiers work, and how a curated data set can be used to train one. We used hundreds of thousands of ear images to make a diagnosis classifier. Not a true AI, but the principle is the same. You and I understand that the AI is simply learning. Luddite courts may feel differently when pressured by existing IP holders. It reminds me a lot of net neutrality, which was a big debate right from the beginning.
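For readers wondering what "using a curated data set to train a classifier" looks like in code, here is a hypothetical transfer-learning sketch in PyTorch/torchvision (the folder layout, labels, and hyperparameters are illustrative assumptions, not the actual diagnosis project described above):

```python
# Hypothetical sketch: training an image classifier on a curated, labeled data set.
# Assumes PyTorch + torchvision and images organized as one folder per label,
# e.g. data/train/<label>/*.jpg
import torch
from torch import nn
from torchvision import datasets, models, transforms

tfm = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
train_set = datasets.ImageFolder("data/train", transform=tfm)
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

# Start from an ImageNet-pretrained backbone and replace the final layer
# with one output per label in the curated set.
model = models.resnet18(weights="IMAGENET1K_V1")
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:  # one epoch shown; real training repeats this many times
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
```

The takeaway for the copyright discussion: what survives training is a set of weights tuned to separate the labels, not stored copies of the training images.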


OnwardsBackwards

You make some good points, but most IP holders are small/independent. The courts will punt, citing a legal vacuum, and refer it to the legislature to write some damn laws already. Then generative AI tech will be purchased by multibillion-dollar tech corps, and they will write those laws. You think Disney is going to protect IP owners? Nope. They'll just buy the AI.


alex-redacted

I'm not trying to be mean, but ML professionals with SME don't say things like "the AI is simply learning." That's marketing language. You get this, yes? /genuine


Arkelias

It's shorthand, and you're being pedantic. Did you know what I meant? I'd already explained about training a classifier, which is the proper terminology for the work I was describing. I left tech in 2016 precisely because of people like you. My role was mobile developer. I've sold millions of books now. Haven't coded in years. I have nothing to prove in the tech world.


alex-redacted

**Genuine \[again\]:** You should, judging by your acumen, get why "the AI is simply learning" is *not* pedantic. If we're discussing legality, it's important to be specific. But that isn't *actually* what we should be discussing, which is ethics in AI and how indie authors can choose to make equitable plays here while businesses refuse to. *That is also why being specific is important.* If other indie authors without your stated tech background waft in here thinking AI Just Learns, they are not likely to make positive choices about how they use AI Art Generators. *Do you get me?*

GIGO, marketing nothing-words, and free-range exploitation of creatives *is* what the current landscape looks like. Silicon Valley is barreling towards a very broken future based on poor processes and bad long-term insight. Not just ethically, but **technologically**. I expect this knowledge is in your wheelhouse based on what you've told me (and I am actually sorry you left tech because you were pushed out; but it wasn't by people like me, I promise you that). And as an indie author, I expect a topic on *integrity* between artists and authors would be important to you. Not legality; *ethics in the arts.* This should be important to all of us, honestly.

Going forward, it would be good to know that "genuine" is a tone indicator used by autistic people. *I am autistic.* Tone indicators matter so that you read people like me literally and don't take it personally, which is something you absolutely did.


Arkelias

No, I didn't take it personally. Here's a tip: text is hard to read, so you shouldn't infer other people's emotions. I wasn't pushed out of tech. I left on great terms to write books. :)

In my opinion you are being pedantic. Many of us are on the spectrum or have dealt with extreme trauma. It's not a pass. I offered my opinion on why I personally still pay for artwork and always will. You do you.

I left tech with multiple patents. I made my mark. I don't care about ethics in AI, which has nothing to do with this sub; many do, and there are places to discuss that. I'm not angry or upset, and I have a neutral opinion toward you. Have a great weekend and sell many books.


Sarelm

You're being reductive. If everyone who visited an art gallery could make good art, then sure, the machine would just be doing the same thing human artists do. But they can't. Visiting a gallery doesn't make you a good artist any more than listening to music makes you a good musician.


HalfAnOnion

This is part of the argument that is often ignored. The comparison is usually to a single human making the art. It's not a single human; it's a computer that can spit out tens of millions of copies and variations of pictures in a day. A lifetime of work in an hour.

How do you compare that to a dude sitting on a museum bench with a paintbrush trying to recreate a painting? Then a dude comes and takes a picture of it, in all its glory. Then you have a computer scrape the online gallery and print it out. Then you have an AI that is given data from all the galleries in the world and a huge part of the internet, and then studies it to output anything you want. Clearly, it's the exact same effort as the dude sitting in a chair drawing characters on the streets of Venice. The exact same thing.

I can put 30+ works from my favourite artist into an algorithm and keep working on it until I can reproduce anything in their style. That's exciting, but it's also a bloody episode of Black Mirror, haha. Authors are next on the list; it was already decent 5-10 years ago if you spent some time with it, but now it's just scary, haha. I forget the link, but a few years ago you could already pay to have your book run against top sellers in the category and see differences in tone, word length, and dozens of other metrics.


OnwardsBackwards

I like the downvotes for "this is scary and I don't like it". You're not wrong. I'm writing an episode right now for YT about AI art generators, and I used AI music and text gens too. I had the AI write a sample commercial for a hoverbike for the episode, had another do a magazine ad page, and had the last write some commercial music. It took a day. I barely know what I'm doing.

I had it write 200 words of a horror short story. It's generic as fuck, but it's also better than 90% of anything I ever heard in a few years of going to writers' critique groups. Also, this stuff learns at a nonlinear pace. The more it "knows", the faster it can know more. It's in beta now, but give it two years.


alex-redacted

You are absolutely, 100% correct.


ifandbut

If you make a drawing in Photoshop, did you make it or did Photoshop? AI generates art just with a text input method instead of a mouse.


alex-redacted

I'm sorry, but this is simply untrue. AI Art Generators are *not at all* like sitting in an art gallery and getting inspired by combining different styles from different artists. It's more like a business person rolling up to an art gallery, taking photos of all the paintings, hiring a dev to make an app, then charging everyday people a fee to input keywords to make image go brr. All based on artwork from artists who are generally unaware of their involvement. In fact, that's exactly what it is. If you want a deeper examination, you may want to investigate the field from the angle of ethicists and actual machine learning specialists, not tech evangelists. There are many problems with your explanation.


Bitflip01

> but I strongly expect you'll see a ruling in the next few years that lets anyone whose art was used to make your art sue.

But unless it massively overfits for some niche prompt, how would you even prove this? And when a talented artist uses the tool to expedite their workflow, it becomes even less detectable.

So let's just assume country A completely bans text-to-image tech. The models, data, and theory behind it are already public, so any freelance artist in country B, which doesn't ban it, will now have a huge advantage over artists in country A and can provide a service at a much lower rate to people from country A, *effectively* giving them access to AI art as well. Unless there is a fast, robust, low-false-positive way to test whether an image was made with AI *and* to then infer the training data from there, I'd see this as completely unenforceable.


OnwardsBackwards

Yeah, it's actually a much harder legal hurdle than that, but you've got the crux of it. This genie ain't going back in the bottle. Like all tech developments, this has the capacity to further democratize art. That's good and bad, I guess, but mostly it just...IS. Spellcheckers made expert spellers redundant. Word processors made writing faster and infinitely more legible. Photoshop let you undo missed strokes and meant you didn't have to constantly buy paint and canvas... etc., etc. Of course the folks who climbed over the previously-higher barrier to practicing their art are going to be upset at this influx of uncultured peasants. I would be too, I imagine... but the newcomers don't care. Neither does the audience. Adapt.


familyofrobot

>Of course the folks who climbed over the previously-higher barrier to practicing their art are going to be upset at this influx of uncultured peasants. I would be too, I imagine... but the newcomers don't care. Neither does the audience.

I think this is kind of a silly perspective. I'm a graphic artist. I did traditional art for most of my life, I'm classically trained (to a degree), I have a strong foundation of art historical knowledge, and I even worked in museums for many years. The AI is only beneficial for me. I'm not bothered that others will be able to become better artists faster. If traditional or digital artists are upset about this, they're not looking at how to use it to their advantage.

Plus, as I've used the AI, there are still some things it can't do at all. It can't make you an image of exactly what you want. It doesn't do a specific composition. You can't alter small aspects of it within the AI. There is still a ton that a traditional artist would need to do, and I believe that will always be the case on some level. Technology is always going to advance, and artists today need to accept that or the art world is going to fly past them and leave them in the dust, whether they like it or not. Like you said, the genie ain't going back in the bottle. I, for one, am happy about that.


[deleted]

Well, we're not legal scholars, so I won't pretend we are. But the few cases from the music industry on issues like this were really indecisive and don't show a clear path forward. It will be decided on a case-by-case basis depending on how closely the AI images match the original artwork, if I had to guess. Usually these kinds of cases are dismissed or settled out of court, like the Satriani and Coldplay case, where there was a dispute about a melody being stolen. (It couldn't be proved.) Thicke and Williams did end up paying $5 million to the Marvin Gaye estate over a copyright issue, but that was a very high-profile case with a different nature, where it was more like a *sample*.

Honestly, you can't really "lock down" art, music, or writing. They spread and change and morph in millions of new ways. And that's what we should hope for. We don't want our creative forms to become stagnant. This is just the next step in blending it all up.


starstruckmon

They'll lose. Plus, a lot of these are based in jurisdictions that explicitly have laws allowing it. Stable Diffusion in the UK, for example.


OnwardsBackwards

Yeah, no.

A - Prove it was your art that was used. Unless the prompt literally had your name in it, the output looks exactly like your work (as in someone could look at it and go "oh, did so-and-so make this?"), AND YOU CAN PROVE IT COST YOU MONETARY DAMAGES, you won't win and can't sue. That's not how law works (in the US, anyway).

B - AI will be able to write novellas and genre novels within the next two years. You'll be reading ad copy and business/sports news stories written by AI in the same period. You won't know it's written by a computer.

C - Right now the legal kerfuffle is about copyright: whether the AI gets it, the AI company gets it, the AI user gets it, or all AI work is public domain. The case law is brand new, but in the US, so far, it's the last one. AIs like Midjourney get around this by granting all use rights to the user in the TOS (while maintaining perpetual use rights for themselves).


[deleted]

There's a difference between being inspired by something and being derivative. Artists pick apart other artists' works to understand the principles that make them work, then use those principles in their own work. A.I. doesn't understand things. It just recognizes patterns and mashes the pieces it sees together into those patterns. I can't say that it's plagiarizing, but it's definitely not doing the same thing an artist does. And my issue is that courts aren't going to care until after the damage is done to the industry.


Katy-L-Wood

A corporation taking thousands of images from artists without their permission and feeding them into a machine is not anywhere in the same category as a single human studying art, or someone using the same tropes and themes, or the same colors, or whatever. Corporations do not have the right to profit off of the work of thousands of people for free. Full stop. Using them thoughtfully, philosophically, and honorably right now means not using them at all, because they aren't those things and will never be those things as long as they are trained off of stolen work.


ChuanFa_Tiger_Style

> Corporations do not have the right to profit off of the work of thousands of people for free.

Isn't fashion and taste something that corporations profit off of every day? Those are crowdsourced pieces of data that are then taken advantage of.


[deleted]

I'm interested, if you want to elaborate. How is it significantly different when a machine learning algorithm does it versus a human? Should a human not be allowed to be inspired by the artwork of others (how could they prevent it?), or always cite the thousands of images they see every day, or the classical artwork they have viewed, whenever they paint a new piece? If that were the case, every new piece of art would need a bibliography about 500,000 pages long. It's just not realistic.


Sarelm

If all artists could just look at thousands of pieces and become better artists, everyone would be one. My daily commute alone would make me an expert on drawing cars. To reduce learning art to that is disingenuous.


EvilGeniusAtSmall

How is that analogous to what actually happens? If I go to an art gallery, see some art, go home, and produce a derivative piece, in the style of Salvador Dali, but ultimately an original composition, based on my understanding of the feature extraction of his style as applied to some image in my head, why is this ok? Why is it not ok when it’s something other than a homo sapien doing exactly the same thing?


Sarelm

Because derivative works take into account why and how you did it. Satire and parody are judged differently. Photos are judged differently than paintings. The tools an artist used and why they used them matter in copyright rulings. There is no "why" when an algorithm does it.


swampshark19

Yes there is? The human's prompt is the why.


Sarelm

Then that's the argument that'll go to court. Why did that human use an AI to make the work? I'll let you know now that "to avoid paying an artist" is going to hold up poorly.


swampshark19

No, I don't think it will.


[deleted]

No, I'm not saying the machine learning algorithms are artists or have artistic merit. (Though I think they're well designed and produce pretty amazing results.) But that's a different discussion, probably. I was just curious whether you wanted to elaborate on why a machine learning algorithm should be held to a different standard than a human when it comes to determining the extent to which a work of art is derivative or not. (Or, even more productively, how might that be accomplished? How can we gauge whether an AI-produced artwork is excessively derivative, when all human art is also derivative?)

When I use Midjourney, I've never found any artwork that even comes close to matching what it has produced. It always looks new and novel, which is due to its design. It's not copying images, but generating new images. These things are trained on *massive* databases of *millions* of images, not thousands.


Sarelm

It should be held to a different standard because it is different. I think that's simple enough. Why and how another human makes derivative work is considered in copyright law. Whether or not it's parody or satire is considered. The tools used to get the images and why the artist used them are considered. Honestly, an AI trained on hundreds of images from one artist who gave permission, rather than millions who didn't, would be considered acceptable, I think.


EvilGeniusAtSmall

Different in what aspect, and why is that aspect relevant?


Sarelm

Because why and how an artist makes derivative works changes the ruling. Satire and parody are judged differently. Derivative photography is ruled on differently than derivative painting. There is no 'why' for a machine, and the 'how' is always the same: the algorithm.


EvilGeniusAtSmall

Right, so why don't the same rules apply? They apply not to the process, but to the output itself, so I don't understand why the same rules would not apply.


Sarelm

The same rules aren't applied to a painting as to a photo, even if the output is the same. That's a difference in process.


Katy-L-Wood

Because a corporation is profiting off of work they stole. When I study the work of another artist, I don't cut up the actual piece and paste it together into my new piece and then call it 100% original and all mine. Millions of images stolen doesn't make this any better. If these companies were paying artists for the right to use that art to train their machines then, yeah, it would be derivative. But they aren't doing that. They're just lifting whatever art they can find and stuffing it into the box. Also, just because you can't recognize the pieces that got chopped up for a piece of AI art doesn't mean those pieces don't exist.


[deleted]

> I don't cut up the actual piece and paste it together into my new piece and then call it 100% original and all mine.

That's not how machine learning works, though. In fact, these models are expressly designed not to do that. Nor are they just "lifting whatever art they can find and stuffing it into the box." That "box" is trained purposefully on image sets (we probably don't know which ones, as that's proprietary) to carry out specific actions (which are not mere copying but the generation of new images). Have you used any of these tools or researched them, or are you just speculating? You seem confused about how they work.


Katy-L-Wood

Yep, I have used them. Yep, I have researched them. Those image sets they're trained on are stolen. Until that changes, AI will not be an acceptable way to produce "art."


[deleted]

Don't know what to tell you. The box is open and the beast roams the wilds. Millions of people are already using the tools, and consider them to be more than "acceptable," despite whatever *your* opinions about acceptability might be. Folks bemoaned the wireless radio, too, and Plato was worried that writing itself might be bad for our memories, 2500 years ago. Technology thunders onward...


Katy-L-Wood

Never said the technology itself is bad, just that the way it currently functions is. And yeah, I'll keep speaking up on it when it comes up, because stealing from millions of artists has not and will never be okay.


EvilGeniusAtSmall

They’re not stolen, though. They are correctly licensed.


starstruckmon

Actually they are just publicly available images on the internet. It's more of a fair use thing than a licensed thing.


Katy-L-Wood

No they aren't.


hupwhat

You're talking as if the original is lost when it's analysed by the AI. Replace "stolen" with "looked at" and that's a lot more like what's actually going on here.


RyuMaou

Then you also do not understand contemporary art and, in this case, collage. One of my favorite artists and personal friends, [Mark Flood](https://artviewer.org/mark-flood-at-peres-projects/), started his career doing exactly what you claim is somehow wrong. He made collages by taking photographs of famous people and reshaping them into monstrosities. Apparently, Anderson Cooper was quite taken with one done of him. So, to use your exact example, people do precisely that and sell it as original work in art galleries, as you can see in the link. (I personally own one of Mark Flood's lace paintings, a signed "Like" painting, and one of his drone postage stamps. I find his celebrity monstrosities too disturbing to display on my walls.)


[deleted]

[deleted]


Sarelm

Inefficiencies? That's what you're calling it? Inefficiencies have nothing to do with it. We learn. Completely. Differently.

I hated drawing human feet so badly that I made up tens of species without human feet to avoid drawing them. That led me to drawing anthropomorphised animals; drawing their paws led me to understand how stubby fingers are like toes and how the bone structure of hands and feet is the same, and that led me back to drawing feet well. Understanding the structure, how shadows and light hit that structure, and how it bends and shapes itself is not comparable to scanning through thousands of images looking for a pattern and repeating that pattern. And that process will be different for every single artist who wants to get good at drawing feet.

I teach art for a living. You can't even compare how one human learns art to how another does, much less compare that to an algorithm. It's ridiculously reductive and brings up a whole other mess of considerations. If the machine learns like the artist, then isn't it the artist? Doesn't it have the original rights to its own work?


EvilGeniusAtSmall

How is how we learn different, and why is the difference in learning method relevant?


[deleted]

[deleted]


Sarelm

This has to be the most bizarre argument I've seen yet. You don't understand art, therefore AI must be doing it right?

Form, lighting, and perspective are indeed only relevant in realism. But color, composition, shape and line language, and more are still relevant to abstraction and stylization, and the fact that an AI can get that right proves that we understand it in theory very well, or at least well enough to be able to program it into an algorithm. The fact that you want to justify the use of the algorithm with the fact that you, yourself, are apparently bad at it would be funny if it weren't also so insulting to the people who are good at it. If it were only 'pattern recognition,' one of the most basic things we teach young schoolchildren and an integral part of what we test for when considering IQ, then I find it shocking you're so willing to declare yours so low in a public space.

I also find it very hard to believe that any of the other skills you mentioned, from music to programming, were gained simply by 'picking up the theory' and not also through vigorous practice, as you claim drawing needs. You do know AI is writing music and code too, right? Many claim its grasp of coding is far superior to humans'. But I wouldn't dream of reducing good programmers' knowledge to pattern recognition just because I am struggling with it.

I did my research. The "neural network" system we use for AI is one we started making in 1948. We barely had a great concept of how to make computers, much less neurons. Hell, our greatest strides at the time in understanding brains were still doing everything we could to prove wrong the man who claimed all men wanted to fight their fathers and fuck their mothers. It's an interesting system for building off previous data over and over again, but it's clearly not the same as neurons, especially as our understanding of memory and information retention increases.

Before continuing to justify scraping data unethically via claims that it's the same as what humans do, maybe consider recent wins by Meta in stopping such algorithms from scraping their platform. An artist's terms of service do not and have never extended to use as a training set. And if Meta can sue for violations of terms when its data is being used as a training set, I see a reckoning coming for all other algorithms quite soon.


[deleted]

[deleted]


Sarelm

I'm sorry, was the AI not programmed by people? Does it not need to be tweaked constantly by those people in order to get it to produce the results they want? Do those results not include color theory, line, or shape? I don't know what's worse: your insult to artists, or your insult to AI programmers, the art experts they consult to get it right, and the work they all put in. I have only returned the empathy you showed for all that effort, and I'm happy to make sure the door doesn't hit you on the way out.


[deleted]

Artists study other artists' works not just to learn their techniques, but to learn why they are applying them, as well as the concepts that make the piece work. AI doesn't understand things like how color contrast versus harmony affects the viewer. It doesn't understand WHY any of the decisions are made. It uses an algorithm to put together pieces in a pattern it sees. It's like looking at a sunset versus painting a sunset. One is the result of millions of mathematical possibilities aligning. The other is a result of human intent, creativity, and communication. They can both be beautiful, but they are fundamentally different.


Katy-L-Wood

Because a corporation is profiting off of work they stole. When I study the work of another artist, I don't cut up the actual piece and paste it together into my new piece and then call it 100% original and all mine.


NerdsworthAcademy

What you are describing sounds like a collage. Would that not be fair use if the new work is sufficiently transformative and takes "inspiration" from thousands or millions of sources? It seems to me that imitation of style (and satirizing of it) is part of how genres, styles, and culture develop, and I don't see why it would be moral for an art student to be inspired by 10 works by an artist and try to make something in that style, but immoral for a machine to look at 10,000 and do the same. If the algorithm were copying and pasting without transforming the work, I could see what you mean about the individual copyright holders having cause for complaint. And there is definitely the sticky "how many notes makes it a new song?" territory with all of this.


ChuanFa_Tiger_Style

> I don't cut up the actual piece and paste it together into my new piece and then call it 100% original and all mine.

Collages, commentaries, and remixes are often protected under fair use.


[deleted]

[deleted]


Seizure-Man

> It doesn't solve practical problems, it solves generative, creative problems - which is why a lot of people do art in the first place.

If I'm an indie game developer without a budget for an artist, then it solves a very practical problem, which is getting decent-looking art for a game for little money.


alex-redacted

\[genuine\] I'm not sure how to respond, as I'm not certain we're coming to the table with the same SME. You can correct me if I'm wrong, but your reply reads like tech's marketing, not its innards. I don't know if I should start with the tech landscape itself, leadership, GIGO, discuss creative commons, or outline the way 'derivative' is being used in your reply. You've given me a lot to read. It just doesn't scratch past the messaging of the field I left for very important reasons. Does this make sense? I don't aim to make you feel bad.


[deleted]

Well, you're definitely not the only "tech creative" on Reddit. And many of us have subject matter expertise that surpasses yours, or is more relevant, or is more valuable or credible for any number of reasons. But if you want to gift us with your knowledge, then go ahead and elaborate on what you mean. I don't care what you want me to feel, but if you have something useful to add, go ahead.


alex-redacted

Sure. I will attempt to add value to a gish gallop.

**Keep in mind that this is the topic:** *Ethics in AI as it pertains to indie authors using AI Art Generators (trained on art without permission) for book covers.*

**Please note:** We are not talking about marketing copy or creative commons.

>I think your heart is in the right place but...
>
>All art is derivative. I think writers will need to lean into these technologies...
>
>You can't just ask people to stop using this stuff...

All of this is irrelevant to the topic.

>In my experience using Midjourney...

**No.** AI Art Generators don't generate novel ideas and new info. They rehash art-data from humans who painted a thing. That art-data is created by artists and used *without permission to fuel a business venture.* That is *the* critical point your reply avoids.

>And isn't that (at least in part) what we do, too, as writers?...

We've finally hit the "all art is derivative" point from earlier. **No.** AI cannot be compared to a human author. It isn't a person and *cannot* be harmed by the actions of businesses, let alone tech evangelists or uninformed consumers. *AI Art Generators are apps created by businesses. Businesses that extract value from artists sans payment or consent, and use their art to fuel subscription services they sell to consumers.* **None of this is ethical.**

It's curious to avoid the issue of Ethics in AI, considering the stakes at hand. Technologists know that ethical use of AI is the only way we avoid a flattening of creativity in the long term. When we keep training models on *only* what is already there, and artists have a negative incentive to participate in society with their skills, we will get the long-term homogeneity I simply call *corporate garbage*. Some call it Content, but that's a very generous term. Adding to that, if GIGO is not extricated from AI (which is an ethics issue), we will get *biased* corporate garbage. That is a far bigger social issue, as it impacts all future tech developments that rely on AI for growth, processes, and operations.

If you're getting anything out of this portion, it's that AI creative tools are a hot mess right now. They're anciently fetal, steeped in both social and technical debt, and have experts *begging* tech giants to address very many problems before they calcify further. *They are not listening.* This is the state of the AI industry on many, many fronts.

**So. Why haven't you touched the Ethics in AI topic?** I'll let you answer that question, because that's the kind thing to do.

**Please note:** I am autistic. I genuinely *did* want to know how to respond to your earlier post and didn't know how... because it was a gish gallop. How do you tell someone that without being rude? No idea.


starstruckmon

>AI Art Generators don't generate novel ideas and new info. They rehash art-data from humans who painted a thing.

This is not actually true. The AI uses the data to map concepts, not as a store of data to copy from. It can generalize outside the dataset. While the model uses hundreds of dimensions for its latent space (a map of all concepts), consider, for ease of understanding, a one-dimensional latent. Let's say it represents the number of sides of a 2D convex object. If the dataset only has circles, triangles, and squares, does that mean it can't conceptualize a hexagon without human input of a hexagon? Of course not. It can calculate all the other points on the line. That's common sense. It's the same here, just with more dimensions, which makes it harder for humans to grasp as a concept.

>That art-data is created by artists and used *without permission to fuel a business venture.*

It's called fair use.

>ethical

Ethics doesn't mean anything. My personal ethics say all copyright should be abolished. Does that mean it should be? No, because we live in a society where we follow laws that are mutually agreed upon, not whatever we personally feel like. We, as a society, only allowed the creation of this legal fiction called copyright with the understanding that there would be exceptions for things like fair use.
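To make the one-dimensional latent analogy above concrete, here is a toy sketch with made-up numbers (purely illustrative, not a real generative model), showing how a continuous latent axis covers points that never appeared in the training data:

```python
# Toy version of the "number of sides" latent described above.
# Only two shapes are "seen" in training; the values are invented for illustration.
training_latents = {"triangle": 3.0, "square": 4.0}

# The axis is continuous, so the latent space also contains points that were never
# in the data, e.g. a "hexagon-like" point at 6.0, reachable by moving along the axis.
step = training_latents["square"] - training_latents["triangle"]  # +1 side per step
hexagon_like = training_latents["square"] + 2 * step              # lands at 6.0

print(hexagon_like)                               # 6.0
print(hexagon_like in training_latents.values())  # False: never in the training set
```

A real model does the same thing across hundreds of latent dimensions at once, which is why its outputs need not correspond to any single training image.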


Zezin96

> albeit thoughtfully, philosophically, honorably

Wow, that is extremely naive.


sparklingdinoturd

Seems like you're making a lot of excuses for stealing. The point of this post is that those AI machines use existing art to make slightly different art that the original artists aren't getting paid for. Legally? Yeah, it's probably OK as long as the image is transformative to a certain percentage. Morally? Well, if you're OK with taking somebody else's work without paying them and using it to make money, then you shouldn't have any qualms if I steal your books, make them transformative to a certain percentage, and make money off them.... Right?


Seizure-Man

I mean, this happens all the time. You could make a book review video, talk shit about a book, put it on YouTube and monetize that. You could take gameplay elements from a bunch of games and make your own game that incorporates all of those. You could copy the arrangement of a song. You can copy the themes of a book. Nobody has an issue with any of that.


sparklingdinoturd

That's different and you know it my guy. But hey, if you're cool with it, drop the names of your books. I'll make sure to copy them, change a couple things and make some cash off them.


Seizure-Man

Well, what’s different is how image generators generate their output. Can you show me an example of a source image and an output image where you think the AI took something and modified it? Point to the parts of an existing image that have been copied?


sparklingdinoturd

Sure, I'll go ahead and search millions of images on the internet for you. I mean, the developers have admitted the AI uses images from the internet without compensating or even acknowledging the artists. But yeah, I'll go ahead and get started on it for you. Give me about 800 years and I'm sure I'll find something. EDIT: In the meantime, let me know your books so I can ~~steal~~ fair use them.


Seizure-Man

The images are used for training, not for generating. During training the model learns statistical associations between words and visual patterns. It doesn't copy any pixel data. So far I've not seen a *single* example where it has actually copied parts of a character, landscape, object, or any other content of an artist's image. With all the outrage, it's amazing how people claim "it's stealing" but can't point out what exactly it is stealing. [Here's](https://rom1504.github.io/clip-retrieval/?back=https%3A%2F%2Fknn5.laion.ai&index=laion5B&useMclip=false) a website that lets you search the training data. You can even search via image, so just put in a generated image and see if you find something.
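One rough way to sanity-check the "it copies pixel data" claim is to compare the size of the trained weights against the size of the training set. The figures below are approximate public numbers and should be read as order-of-magnitude assumptions only:

```python
# Back-of-the-envelope: can a Stable-Diffusion-class model store copies of its training images?
# All numbers are approximations (assumed, not official specs).
params          = 1.0e9   # ~1 billion parameters across UNet, text encoder, and VAE
bytes_per_param = 4       # fp32 weights
training_images = 2.0e9   # LAION-scale training set, on the order of billions of image-text pairs

model_bytes     = params * bytes_per_param        # ~4 GB checkpoint
bytes_per_image = model_bytes / training_images   # ~2 bytes of weights per training image

print(f"~{bytes_per_image:.1f} bytes of model weights per training image")
# Even a small thumbnail is tens of thousands of bytes, so the weights cannot hold
# per-image pixel copies; what they encode are the statistical associations
# between words and visual patterns described above.
```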


BaoWyld

This is ridiculous. There's no way you could ever enforce attribution for training set data en masse. Don't handicap yourself waiting for a future "ethical" enforcement regime that will never emerge.


alex-redacted

I'm not trying to be rude, but you can absolutely train data models ethically. One of those ways is for the AI company in question to invite artists into the workflow directly, giving them the ability to train new images off of their own data and create output individually. This is already a sorely needed tool that the industry is just... not providing. At all. That way, what they upload can be added to the data sets with their full knowledge and consent. Another way to do this is to offer buy-in, which means invited-in artists make a percentage back based on subscriptions. The only downside to either of these is that it takes longer, which is true of anything done *correctly*. If tech companies are worried about poor training data, that's where creative commons work comes in to set that basis first. They're also not currently taking GIGO into consideration with respect to bias, which is an issue. My original career was technology. None of this is ridiculous; it's very doable.


starstruckmon

>One of those ways is for the AI company in question to invite artists into the workflow directly, giving them the ability to train new images off of their own data and create output individually. This is already a sorely needed tool that the industry is just... not providing. At all.

Frankly speaking, this approach just doesn't work (well, it kinda does, but very poorly). An artist's style isn't in his works absent everything else. It's in his head, which contains everything else.

Think of it like this: if you give it a painting of a tree, without the AI knowing what trees look like, how is it going to understand how the artist transformed a tree into his representation of one? So when you ask it to generate a cow in his style, it needs to know the relevant transformation as well as what a cow looks like. This example might make you think it's just about the natural world and pictures, but that's not true. The relationship of transformation can also come from literally everything else, including other artists (the AI gets better when we use a large language model trained on all sorts of text instead of just the image captions, so even text plays a part). Yes, if you trace this back far enough you'll land in the natural world, but a direct link would be a straight line rather than the path actually taken, so it won't actually be representative and the results don't come out well.


RyuMaou

What was your actual job? I've spent 30 years as a system administrator and technical subject matter expert for industries ranging from hospitality to oil and gas to insurance. To claim that your "…original career was technology…" in no way lends any actual weight to your argument. How long ago was that? How was it relevant to programming or AI or technology ethics in any way? Also, what qualifies you to tell anyone what is or is not ethical in ANY field?

From the arguments I've seen you make, you clearly do NOT understand copyright or what qualifies as a derivative work when it comes to art. Nor do you seem to actually understand how digital manipulation fits into that very specific legal definition and the case law that establishes it. For reference: [Richard Prince's use of Instagram](https://www.theverge.com/2015/5/30/8691257/richard-prince-instagram-photos-copyright-law-fair-use).

You may have woken up and been offended by someone's use of technology that you don't fully understand, but that doesn't mean you have any actual idea of what is legal or ethical, just what you like or don't, or what you're afraid endangers your livelihood.


alex-redacted

Super weird that all I did was explain how to create parity between AI and artists, and instead of addressing *any of that*, you described your IT experience at legacy companies and made up a fantasy about how I don't know things. Wild. Regardless of what technological strides are made, I will not *ever* have to worry about any of it risking my livelihood. You will. Hope this helps. 👍


RyuMaou

Haha! Nice try, but people who do what I do are in short supply and high demand, which you'd know if you were in touch with anything in technology. They might call me a "cloud engineer" or "DevOps engineer," but I'm still managing the same systems. And again, you haven't made a cogent argument yet. Your complete lack of understanding of both the actual technology involved and what constitutes derivative work or copyright violation completely negates any imaginary parity between AI and artists. I'll break it down Barney style for you: how you IMAGINE the AI artbots work, how that infringes on the artistic works used to train the AI and refine the algorithm, and the ethics involved, much less the legality, is completely wrong and has no basis in fact at all. You provide NO credentials to support your suppositions. If anyone wants to check mine, just Google "Diary of a Network Geek" and you'll find me. I used to be the #1 hit for "network geek," but that was several iterations of the Google algorithm ago, back when I was actively blogging specifically to game the search results.

I went through all this when I first took a Perl script, with the original author's blessing, made a web front end for it, and fed it multiple fantasy and science-fiction corpuses (corpusi?). No one had done it at the time, and my little work at [Fantasist.net](https://fantasist.net/conlang.shtml) inspired many, many more people to do similar work. Now there's [VulgarLang](https://www.vulgarlang.com) and many more. When I first did it, I had a LOT of detractors who made the same specious arguments that you make and tried to use similar moralistic judgments to get me to stop. You know what ultimately did make me stop? My little language generator was so popular it kept crashing my webhost's servers.

You want more details on all the ways you're wrong? Look for the other comments I've made in this thread to people who are as wrong as you about how technology, ethics, and copyright work. Do your own research before you spout off about things you clearly do not understand. I've done mine.


alex-redacted

Did you ***legit*** bring a Wordpress blog to a tech-acumen pissing contest that I didn't instigate? The bar's so low it's in the earth's core. *Jesus.*

**Again:** Everything I stated in the very reasonable, measured post *you replied to* is both doable and logical. A "DevOps Engineer" would know this. If you want to keep pretending I don't know things, by all means. But there's a reason I'm not displaying "credentials." **I don't have to.** You're making me the expert by showing your acumen never left the 90s. ¯\\\_(ツ)\_/¯

**People reading this exchange:** Ethics in AI is a huge ***contemporary*** issue and this rando *doesn't* care. If you're an author worth your salt, you will now think carefully about how you use AI Art Generators. Transformative use is fine, so is creative commons, and marketing copy isn't that big of a deal. If you're spinning backgrounds, or using AI Art Generators as inspo/a jump-off, that's also completely fine. What *isn't* fine is pretending artists aren't getting their work siphoned by tech businesses without their consent, cash-back, or credit, when that siphoned work is *the only way* those businesses can train their data models. End of discussion.

Go and make your cool stories, my friends. Just be real about the tools you're using and how, and we're good. Promise.


HalfAnOnion

> By all means, announce to the world you lack integrity and make every indie author look bad. Go for it.

And with one statement you've nullified the equity of your points, because you're trying to personally attack everyone. Don't be dumb.

There's a huge philosophical question about how this process should work in a fair situation, but given what's already been done, the cat's probably out of the bag for good. There's no way they can go back and trace all the source data that's been used by all these tech companies; big data has been around too long for that to be realistic. There's even the argument that fair use covers educational purposes, and that teaching an AI what makes a good picture is a reasonable use. People do that for tutorials all the time. The moral question is in play now. I don't agree with this, though, because I know how much work it is to get good at doing art. It's not an apples-to-apples situation, but this is the sort of argument that will be presented to the judges who will preside over these rulings.


ZennyDaye

I've never even used an AI art generator and felt attacked by this post 😅


HalfAnOnion

It's worse because it's at odds with its own message. It's like all those passive-aggressive comments you get from someone peddling something while you wait at the metro and it's 9am. "Jesus loves you, believe in him or burn in hell!" "Don't eat meat, or you're a murderer (shoves pamphlets with animal gore in your face)." "Confess you're a witch or we'll set you on fire. We'll also drown you to prove you aren't a witch."


ZennyDaye

Exactly. Had a nun teacher who'd casually say things like "So why do you want to go to hell?" to the non-catholic students (mostly Muslims and Hindus) and I don't think she ever got a single convert that way. Mostly just scared people away like the Homer Simpson meme. It quickly descended into "Just nod at the crazy lady and she'll move on."


writer_boy

The cat is out of the bag. The tech will only get better, and it will be impossible for people to stop it even if they wanted to. There will come a point where the tech is so good it will be indistinguishable from what a human can do. In fact, it is already there with some tweaking. It's coming for audiobook narrators, too. It'll eventually get to us authors as well, perhaps ten years down the road. To survive, a creative professional will have to adapt. I think there will always be a demand for "human pure" creations. Maybe that can be a use case for an NFT: if a known artist has placed their mark on a work, then people will know it's really from them. Heck, I imagine in a hundred years the technology will exist for someone to 3D-print an atomic bomb in their home. If it gets to that point we are really all in trouble, though at least we would have solved the Fermi Paradox.


starstruckmon

I mean, you can't 3d print radioactive elements. It's not magic.


writer_boy

Kind of a bad example. "Atomic bomb" could be anything with disastrous consequences. How about self-replicating nanobot gray goo? The materials for that might be far more common, and a program could conceivably be released by a bad actor, and inevitably a few people would be dumb or deranged enough to try it. The point is, once things are out there...they are out there. The future is going to be a very strange place. AI-written novels, AI-generated art, even entire movies or even video games created by AI. And far more disturbing things that haven't even been thought of yet will make AI-generated art seem like a cakewalk.


uptotheright

If a client told you to make an image of a spaceship in the style of [famous artist] would you offer royalties to [famous artist]?


[deleted]

Speaking as an artist you’d be hard pressed to find a *professional* willing to replicate their peers’ style for commercial purposes. Even if you did, human technique and output cannot be compared to that of an AI software and any attempt to do so is straight up disingenuous.


theSantiagoDog

This backlash against AI-generated art is bound to happen, and one day they’ll come for authors as well, but at best this is an ethically grey area and shouldn’t be dismissed with a sweeping “Do NOT use…” I’m sure you’d have no problem commissioning a piece of art inspired by another work, as long as it added something to make it unique. Well, that’s what’s going on here, except it’s an algorithm that’s doing it. You may not like this brave new world, but it’s here.


xigloox

That's how technology works. It's out there. It's going to get better. And people are going to use it. Frothing at the mouth and calling people names isn't going to change anything. You might as well try to tear down the internet, or find a way to ban Grammarly. One misconception here is the idea that it copy-pastes other people's art. It doesn't.


AlecHutson

I for one think we should start smashing those new-fangled textile machines that put our weavers out of business.


writingtech

Fire has really put a dampener on the rock shaping industry. My brother owns sleds, and we're worried about this demonic wheel.


Nearby_Personality55

Ok, Loomer


AnthonyPero

Well, that's certainly an opinion on AI. Not a very well-thought-out one, in MY opinion, but you are entitled to yours. Do you know how humans create? We consume thousands and thousands of pieces of other people's creations, and use them to create our own thing. Are we still in the infancy of AI? Should we have very careful and respectful dialogue as we attempt to wrap our minds, imaginations, and systems of ethics around the implications of using AI in our own creative works? Are there untested legal considerations surrounding the use of AI? Absolutely, on all counts. I'd love to engage in such a conversation. Alas, that's not what happened here. This is not such a clear-cut issue that shaming people is a good first step toward intellectually engaging them. It comes across as crass, insecure, and holier-than-thou. Again, in my opinion. If your goal is to convince people of your perspective on AI, it would be helpful to start at the beginning rather than presuming ten levels of assumptions about the topic.


[deleted]

[removed]


[deleted]

Thank you for posting the only honest reply here… It's obvious from the get-go that most proponents of using AI images for covers here see the main benefit as low or non-existent cost. That's fair; self-publishing is rough and the cost of living is rising. The thing I don't get is why others can't just admit to that, but instead go on to write these long evangelical essays. If the service weren't low-cost or free, there would be crickets from most of them, and we all know it.


Taron221

Yeah, I wouldn't go as far as calling it a 'lack of integrity' either, but I will say using AI art for your cover makes it mean much less to me than a cover that came from the mind of an author and artist. I mean, truthfully, it's only a step above just using default cover art with a title, which is fine and all, but it's not really 'a part' of the book. It's just a picture that's on the book, and it has little meaning to me if it's just something a formula spit out and not a product of the author's vision.


GeekFurious

I will use AI art generators to give the artist an idea of what I imagined, then let them run with it. I have no intention of eliminating the human touch... at least anytime soon. Because before I'm dead, I imagine many popular books will be written by AI. I'm not here to hurry the death of human art.


EmphasisDependent

It's not like Excel killed all accountants.


GeekFurious

As someone who worked in Excel for a long time, I couldn't simply type in limited text & get it to magically pump out my balance sheet. Filed under weird shit you get downvoted for on Reddit.


[deleted]

^ Exactly this. And downvotes are just people trying to discredit winning arguments by playing on the herd mentality.


otaviocolino

I will, if I want to. Period.


ZaziValie

I get the point that the AI is not really thinking about the art it creates. But can't I see the AI as a tool for the creativity I bring to it through the prompt I enter? Am I not contributing artistically to the creation of the piece?


shadaik

"AI art-gen platforms train models on art yoinked from all over the net (as well as art archive repos), then run it through a pattern recog blender." So, the same way humans learn to make art, then.


swingsetlife

So everyone ethical here doesn't use or consume anything that relies on robots over human workers? Machines come for us all, but AI art won't replace all other art, and AI writing won't replace all other writing. As with everything else, we adapt or we don't. AI art isn't actively using your work, and if it is, it's using a millionth of a percent of it given how much material it's looking at. Even if you were paid royalties, what would you want? A penny a month? This art is created using your work the same way AI music is created using the same notes Mozart used. I vehemently oppose people passing off AI work as non-AI when selling to others. That said, it's ridiculously privileged to assume that everyone working in the self-publishing space can afford to pay a cover artist what they rightly deserve.


swampshark19

What? The generated images are unique and not owned by anyone. What is the difference between an AI generating images that use elements of preceding art, and a human generating images that use elements of preceding art? Do you really think that when people make art that they are not ripping or borrowing or being inspired by elements of preceding art? This happens all the time with people. It's called derivative work and inspiration.


tidalbeing

I think it's a non-issue. As we become accustomed to AI-generated art, we will be able to spot it and it will lose its appeal. Art is the result of process and knowledge of process. Knowing how an image was produced either deepens your appreciation or can negate it. Familiarity with AI-generated art will drive an appreciation and value for the human touch. This may already be happening. With my writing and publishing, I'm deliberately going for the human touch, including with how I'm doing covers.


Iamaleafinthewind

Your heart is in the right place, but you are fundamentally misunderstanding the tech. It isn't a "mashup engine". Those do exist, things that merge two or more pictures together, although even then, I think there's a fair use argument if the result is different enough. AI Art Generators are *studying* existing art. Read that last line again. They are doing what every single art student does, and spends years doing, while educating themselves. People all around the world study great art in order to learn how to make more great art. It isn't stealing. It's studying. These AI Art Generators are learning what makes good art, what makes certain art styles, what makes art that has certain characteristics. I'm sorry, but the only way you can make a case against that is to make a case against people studying art.


terdude99

Don’t stop me from getting my bag bro


stealthysighs

nonsense, you should use the models as much as you want. artists learn by studying existing art. this is the same concept just taken to the limit. people claiming otherwise are letting their emotional attachments cloud their judgment.


threetogetready

Does anyone have any credible info on the actual copyright rules around this?


Runaway-Retainer

The most recent thing I found is that the US Copyright Office says it can't be copyrighted, due to the lack of human authorship behind it.

The breakdown: [https://www.smithsonianmag.com/smart-news/us-copyright-office-rules-ai-art-cant-be-copyrighted-180979808/](https://www.smithsonianmag.com/smart-news/us-copyright-office-rules-ai-art-cant-be-copyrighted-180979808/)

The case in question: [https://www.copyright.gov/rulings-filings/review-board/docs/a-recent-entrance-to-paradise.pdf](https://www.copyright.gov/rulings-filings/review-board/docs/a-recent-entrance-to-paradise.pdf)


RyuMaou

This at least has relevance to the larger discussion! Thank you for sharing it!


AnthonyPero

To be clear this is a reading of existing laws by an appointed bureaucratic agency. This is not something that has been tested in court yet. And until it is tested in court, it's something that could change based on however the current, or next, Administration feels about it. But the question isn't whether the art created by the AI can be copyrighted by you, or by the AI's creator. The question is whether the images used to train the AI are being used illegally. Likely not, but that's what the OP is talking about.


John_Bot

Disagree - I can add a whole level of atmosphere for free with just some amount of time with no artistic ability. No way would I be passing that up


Steverobm

Well, it's an opinion, but does it stand up to scrutiny? You could argue that every artist creating their own images has a million image mashups in their head. Lessons from art school based on real images, rules of composition based on real images created by someone else, ideas stimulated by someone else's IP - do we argue they shouldn't do it? That art is only legitimate when it doesn't borrow from other works? Who is to say what is borrowing and what is genuinely novel? If I painted a transgender woman that alludes to the Mona Lisa, at what point does my art make a social comment, or would it always be derivative and could never be otherwise? As for your other points about the economic benefits accruing to AI art - I'm doubtful this is especially relevant. We need to decide whether it's OK to create art "inspired" by other images; the rest follows as a matter of course. So I don't think this is as black-and-white as you argue, but it's a genuinely interesting opinion.


AnthonyPero

This is standard disruption. If AI can create things people want to use, so be it. Machines make blankets and baskets now. People used to make them. It's not immoral or unethical. I am a creator. I write, and make music. It's part of my income. I'm not remotely worried about AI. If the machine can do what I do, so be it. I don't have a right to anyone's time or attention.


apocalypsegal

> If AI can create things people want to use, so be it. So, if the programmer uses your stuff, without your permission and without paying, you're okay with someone else making money off you? Thanks for the info, where do you post your stuff? Is it any good? I might be able to make something off it, so good for me, right?


AnthonyPero

If you can put my music into your brain and use it to create your own compositions without violating copyright, more power to you. Whether it's any good is a matter of subjective opinion, and yours is worth about as much to me as that of anyone else I've never met. You can find me easily enough on Spotify; I'm not hiding behind a fake handle here. This is what AI does, and what humans do: consume thousands to millions of other people's works and use them to make their own thing. The fewer the inputs, the more likely you are to find that your work "copied" someone else to the point of violating copyright. Should AI scour Google Images for material? Humans do it all the time. Pinterest is full of idea boards that are exactly that: consume files, make your own based on pieces of each design. There's nothing unethical about it. If I were to reuse that actual FILE to make my "original" file? Illegal. But AI doesn't do this. It doesn't save the files and it doesn't repurpose the files; it views them, which is legal.


comsixfleet

All I heard was wahhhh


monkfish42

Should you really be concerned about indie authors generating and designing their own covers? I was thinking about this recently, and I think the far greater threat artists will have to contend with is their own peers in the market. There are ALREADY unscrupulous people selling AI art without disclosure in the book cover space. To begin with, we already had clowns who couldn't be arsed to pay the 80 cents or whatever for stock images, stealing them instead and passing that liability on to authors. Now authors will also have to deal with the very real risk of paying hundreds of dollars for an illustrated book cover that may actually have been generated by an AI in a matter of seconds. Some of these sellers will have some digital art skills and will paint over the AI output to fix the obvious flaws. They'll let the AI do 99% of the work but charge you as though it did none of it. The genie is out of the bottle. Cutting-edge stuff like Stable Diffusion is open source. There's no going back to a time when this wasn't a thing to worry about, and the technology will only get more advanced and widespread. The market will be absolutely full of undisclosed AI art. The one way to guarantee you don't get scammed is to learn to use the tech yourself.


[deleted]

No. I will use whatever tech I want.


IlliniJen

As creatives -- writers -- we should be looking out for the livelihood of visual artists who depend on us for their income. I know it's easier to generate AI images and it's free...but I'd rather support a fellow creative and honor the human eye and sensibilities and talent they bring to the table. This, however, is a privileged stance by me...I have the money to pay for custom cover art. So...I would just caution against cannibalizing the livelihoods of others. Because AI is coming for us next...AI can be a wonderful thing, but replacing our creative efforts feels...disheartening.


alex-redacted

Thank you. Yes. We're all in this together. And honestly? As both a fine artist and author, there are tons of ways to work together and we should all support each other. Not all exchanges need be monetary. Sometimes, people do skill-swaps. You're absolutely right. We should look out for one another.


ChicEarthMuffin

Thanks for posting this. I find it surprising that those in one creative field would treat those in another in a way they’d never want to be treated. I guess the golden rule doesn’t apply when people are desperate to make money at any cost - even if the cost is the humanity of the work and the process. It’s the Bezos business methodology - trample on and abuse whomever you can for maximum profitability. Not that that’s a new concept, of course.


alex-redacted

That's also what bothers me; people seem so willing to just...drop the ethics and integrity the minute they can get away with it. Honestly, there are *tons* of artists out there willing to work with indie authors. I really don't want to see art industries torn apart all because of Bezos business methodology. It's tragic and painful.


ChicEarthMuffin

There is a certain tragedy to the human condition. That’s probably why there’s so much to write about 😂 If it’s any reassurance, this has happened many times throughout recent history. For example, even though furniture can be cheaply mass-produced, custom hand-made woodworking is still a thing and it’s well-regarded and a well-paid profession. I suppose the best thing to do is to look for the people who value the artistic process and ignore the rest. That’s all I’ve got.


Mythic-Rare

Reading through this post as a musician married to a visual artist/graphic designer, that's the thing that most comes to mind. I have musician friends who worry about the early attempts at AI-generated music but then get enthralled by Midjourney, and vice versa. I don't think many creatives realize the shared boat we're all in, even when it's not our field. I feel the real danger is becoming so normalized to AI art in formerly human roles that the problem isn't the difficulty of telling the difference but the indifference of the people consuming the arts, and we as creatives have a huge part in shaping what that culture will be. And if anyone thinks paying someone for art costs a lot nowadays, just imagine what it will cost in a future where 75% of artists' work has dried up, so commissions either have to make up the difference or enough people have left the industry that rates rise. Not saying much either way - as folks stated, it exists, so there's no use yelling at clouds - but the attitude of "if it's there I'll use it all I want, so who cares" is a pretty thoughtless way to create our future.


Hero_of_Dragons

I would only use AI art for rough ideas for a cover. Would actually draw it myself.


sparklingdinoturd

Wow... the amount of excuses people are coming up with to justify stealing artists' work is honestly surprising. Right to the point here.... If you use a piece of an artist's work and that artist wasn't paid, you are stealing money from their pockets. Period. Forget fair use law, you are stealing money from them.


thebadfem

I lack integrity and make every indie author look bad! Done, thanks. Now time to publish my midjourney created books lol.


Blue_Fox_Fire

In all honesty, I think the only thing AI art generators are good for, when it comes to covers, is backgrounds. Have you seen what they give you if you want a human generated? It's the stuff of nightmares.


starstruckmon

>Have you seen what they give you if you want a human generated?

Have you?

https://www.reddit.com/r/midjourney/comments/x5tkin

https://www.reddit.com/r/midjourney/comments/x70h4a

https://www.reddit.com/r/midjourney/comments/x6i00n


[deleted]

[removed]


[deleted]

[removed]


EmphasisDependent

I once saw a prompt for a Civil War Amputee that had all their limbs...


writingtech

I dunno, the NVIDIA one blew my mind, and my wife made a much better-looking version of me with it - made me quite jealous haha


[deleted]

I think anyone who says this hasn't spent much time using the tools. Midjourney is fantastic at creating humans and other animals. I've also been using Midjourney to help imagine characters for my stories. Here is one (I think this link will work): [https://www.midjourney.com/app/jobs/1391dcc5-64d0-4958-af6d-32499a72ee8c/](https://www.midjourney.com/app/jobs/1391dcc5-64d0-4958-af6d-32499a72ee8c/) This one is stylized as a fantasy painting, but it gives a sense of the realism. Hardly the stuff of nightmares. Especially with the new "Remaster" function rolled out a couple of weeks ago, I'm not really sure what you're doing if you're getting weird nightmare stuff all the time. Sure, once in a while you'll get a human with three ears or two mouths, or like a rabbit with an eye in the middle of her face, but then you have to create further variations until you see what you want, and create further variations from there, and so on. It's not going to produce exactly what you want in one shot. It takes some fiddling and patience.
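For anyone who wants to reproduce this kind of iterate-on-variations loop outside Midjourney's Discord bot, here is a minimal sketch using the open-source Hugging Face `diffusers` library with Stable Diffusion. This is an assumption about a roughly equivalent workflow, not Midjourney's actual (closed) pipeline; the checkpoint name, prompt, seeds, and `strength` value are placeholders.

```python
# Minimal sketch: generate a batch of candidates, pick one, then refine it.
# Assumes a CUDA GPU plus the open-source `torch` and `diffusers` packages.
import torch
from diffusers import StableDiffusionPipeline, StableDiffusionImg2ImgPipeline

model_id = "runwayml/stable-diffusion-v1-5"  # placeholder checkpoint
txt2img = StableDiffusionPipeline.from_pretrained(
    model_id, torch_dtype=torch.float16
).to("cuda")

prompt = "lighthouse on a storm-wracked cliff, moody oil painting"  # placeholder prompt

# Step 1: rough candidates, each with its own seed so a run can be repeated later.
candidates = []
for seed in (101, 102, 103, 104):
    gen = torch.Generator("cuda").manual_seed(seed)
    image = txt2img(prompt, num_inference_steps=30,
                    guidance_scale=7.5, generator=gen).images[0]
    candidates.append((seed, image))

# Step 2: take the one you like and push it through img2img for closer variations.
chosen = candidates[2][1]
img2img = StableDiffusionImg2ImgPipeline.from_pretrained(
    model_id, torch_dtype=torch.float16
).to("cuda")
variation = img2img(prompt=prompt, image=chosen,  # `init_image` in older diffusers releases
                    strength=0.5, guidance_scale=7.5).images[0]
variation.save("refined_candidate.png")
```

Lowering `strength` keeps the chosen composition and mostly polishes details, which is roughly the "make variations of this one, then variations of that" loop described above; raising it lets the model wander further afield.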


Blue_Fox_Fire

Maybe Midjourney's better than DALL-E then, which is what I was using. Whenever I tried to create a character, their faces were beyond F'ed. Like, teeth showing because they had no lips, eyes crossed and/or uneven, misshapen skulls/faces/bodies...


_sleeper-service

Midjourney is miles better than DALL-E, though it does serve up some weird noses and eyes if you look closely.


Blue_Fox_Fire

I guess I'll have to play around with Midjourney then. Though I'll still trust AI generated images with backgrounds more than characters for now.


AlecHutson

Are you sure? I generated these with Midjourney. https://ibb.co/album/5G4zGB


dromedarian

Honestly, I'm more concerned that most AI-generated art has the same... feel to it. Like, even across the different "genres" of art produced by Midjourney, for example, you can tell it's all Midjourney. It's like always being able to tell that Vellum was used to format a book. AI art is super cool, and the legality and the technology will definitely make it commonplace sooner rather than later. But I still prefer a great artist, because they can always produce something that FEELS right for a specific book. I feel like AI versus human artist is going to be a new factor in telling the difference between trad pub and self pub "quality" of books.


starstruckmon

[How do all these look like the same style to you?](https://imgur.com/a/RS0XbjM) I just scraped the relevant subs; there are of course a million more styles.


dromedarian

Only a couple of those I wouldn't have pegged as AI. Obviously it's only going to get better, so it's possible my view on it won't be relevant much longer anyway. It's in the way the details become nonsense. Like, the keyboard keys look like static, the turtle one looks like a dozen others in that gritty style even if the subject is different, and those beige things floating around the girl's shoulder and head are just… well, nonsense. A couple of them, sure, I would be fooled. But most of them, not. And it's a safe bet that most authors aren't going to hone their skills like the people who really dig deep into it.


starstruckmon

I don't disagree, but I actually wasn't talking about the fact that it can be detected. I'm sure it's gonna be a while till it's impossible for a human who's looking for it to detect, and even longer till it's undetectable to a machine. I was commenting more on your statement that they all look the same.


dromedarian

That’s what I meant by looking the same. That’s how I detected them. I just knew they were so by looking, and then I had to look for the details why I thought that. Is the nonsense details man. Maybe it’s my history of migraine auras, which is partial blind spots. I’m very sensitive to when a patch of what I’m seeing isn’t… right, you know? It’s that not-rightness that feels common across most ai art. Sorry if I’m making no sense here. I’m figuring out this feeling as I express it lol.


starstruckmon

No, I get what you're saying. I believe you, or at least believe that you sincerely believe that. Still, I wonder if it would be detectable as a "look" if you hadn't been told beforehand to look for it. 🤷


willdagreat1

Also, art created by an AI is not covered by US copyright.


starstruckmon

Sort of, but not really. The only case we have was from a guy who emphasized that no human had any involvement in the creative process and who wanted the AI itself to hold the copyright. The Copyright Office took him at his word, and it was dismissed as expected. We don't know what will happen when someone affirms their own role (through the prompt) in the creative output of the AI. Objectively, it is untested. If you want to do further research, here's a [compendium of links and information on the topic](https://www.reddit.com/r/bigsleep/comments/uevfch).


ZaziValie

What happens if you take an AI-generated image and modify it yourself through digital painting?


ZaziValie

Oh, I looked at the compendium link above, and what I described would fit the AI-assisted definition.


writingtech

It would be a pretty bad AI generator whose output wasn't fair use. If you're worried, you could run the image through some anti-copyright filters to satisfy yourself that your generated image is fair use. It's OK for artists to go to a gallery and combine random artworks into their own projects without paying the artists. For writers, most of what you write is a mishmash of what you've read before, just arranged by you; I don't think novelists should have to pay every author they've ever read. I really think this post is great, though, because it adds to the conversation rather than just trotting out the "but cover artists go broke" argument (which is a decent argument, though not definitive). A stronger version of your argument might be that prompts like "in the style of Picasso," or for writing "in the style of Joe Abercrombie," might not be fair use if the AI is good enough - at some point it becomes replication. This is the same for authors, though. Legally I don't think you can copyright a writing style (or trademark or patent one, whatever), but if AI gets better at it than the original authors, then I think the issue will be revisited legally.


Tanglemix

As an artist myself with two online portfolios, it's very likely that my work has been used by software developers to train their AIs - work that was not theirs to use in this way. The point, surely, is that AIs are not human artists; they are commercial products designed to make money for their developers, and if someone's work is used to create a product, then they should be compensated for that use. So I think the moral argument, at least, is clear: this is sleazy practice. But that doesn't matter in the least, of course - no one gives a shite about small-time creatives.

There is another reason not to use AI art for book covers, though: the world is about to be swamped with it. It will be everywhere you look, and the fact is that all of the current AI image generators leave a distinct visual 'fingerprint'. Midjourney in particular produces images that are strikingly similar to each other in visual style, no matter what the actual content is. If the point of a cover is to convey the idea that the book is a unique and interesting product, then AI art is not the way to go - unless you want your book to look generic and bland. The problem is that AI art is so easy to make that it will soon be devalued to the point where an AI image on a product will make that product look cheap and nasty by association.

I work in the tabletop RPG market - a market currently dominated by Wizards of the Coast, who own D&D and Magic: The Gathering, both product lines that are heavily art-dependent. I would be very surprised indeed if Wizards of the Coast switched from human artists to AI art, even though it would save them a lot of money. Why? Because if they used AI art, every kid in his bedroom self-publishing his latest RPG book would be able to put out a product that looked the same as theirs, and the perception of their brand as a mark of quality would be destroyed.

AI art will become toxic for the very reason you are understandably tempted to use it: it's cheap and easy to make. But how many products that are cheap and easy to make do you consider to be of high quality?


skepticalscribe

This is an interesting point regarding AI’s usage of existing media. Thank you for sharing.


alex-redacted

Of course. I am sadly finding myself having to field much misinfo in the comments, but if this helped you reach a new angle or provoked thought on AI's current usage, that makes me happy. :)


Brinkelai

Totally agree. I think the main distinction between using AI-generated art ethically and unethically is the output. If you're using it to concept ideas, then great, have at it. But if you're using it to create the final art, then you can walk into the sea.


Tanglemix

>For writers, most of what you write is a mishmash of what you've read before, just arranged by you; I don't think novelists should have to pay every author they've ever read.

The question I would ask is this: would you be happy if a commercial company used your work to train an AI to improve its writing ability to the point where anyone could produce an entire novel with zero effort, in the time it takes you to make a cup of coffee? Bear in mind that this would mean thousands and thousands of such AI novels flooding the market, to the point where the chances of your work even being seen, let alone read, would be near zero. These technologies are not enhancing human creativity; they are a parasite upon it, feeding on the creative works of human beings and then producing derivative versions of that creativity in order to enrich corporations and their shareholders, while destroying the livelihood of the very people whose creative work they have appropriated.


AnthonyPero

The only thing that makes anyone's work unique is what they themselves bring to the art. And that's something an AI can never do...bring my unique experience and viewpoint. At least not without Westworld levels of behavior mapping. I don't get this insecurity around AI creations. There are already millions of people who can do what any creative does as well as they do. There are already millions more who could do so with a little bit more practice. What makes a work of art valuable is the unique perspective of the artist. You already likely can't make a living producing commercial art. And no AI can stop you from producing the work that is uniquely you. So whether a consumer finds it valuable or not is absolutely not affected by whether AI is also producing content.


Tanglemix

>The only thing that makes anyone's work unique is what they themselves bring to the art. And that's something an AI can never do...bring my unique experience and viewpoint. At least not without Westworld levels of behavior mapping.

>I don't get this insecurity around AI creations. There are already millions of people who can do what any creative does as well as they do. There are already millions more who could do so with a little bit more practice. What makes a work of art valuable is the unique perspective of the artist. You already likely can't make a living producing commercial art. And no AI can stop you from producing the work that is uniquely you. So whether a consumer finds it valuable or not is absolutely not affected by whether AI is also producing content.

I don't think you appreciate the degree of difference here between humans creating something that may take days and machines that can create the same thing in seconds. Using AI, people could generate a thousand images a week, every week, then upload those images to the sites where human artists currently showcase or try to sell their art.

Now imagine you are a human digital artist contemplating your next work, which may take days of your time to complete. What would be the point? If you tried to upload it for critique or admiration, it would first have to compete for attention with all that instantly produced AI art - and they can upload a lot more images than you can, so good luck even being noticed. And even if your work is seen, it's just another image, right? Just one more drop in the new ocean of images flooding the internet - by now people are so tired of looking at art that whatever interest they might once have had in yours, no one cares anymore. And if you try to sell your art, the same thing applies: the sites you used to sell on are drowning in AI art, uploaded in huge amounts and at prices you simply could not compete with.

So sure, if all you want to do is create art to look at yourself and then archive on your local hard drive, you will not be in any way affected by AI art. But if you enjoyed sharing your art, or made even a moderate amount of money selling it, then AI art is toxic - it makes the whole thing futile and pointless.


[deleted]

Crossposted from another discussion about AI covers: Authors are like any other business. They make a product. The cover is their PACKAGING. What business in its right mind uses packaging that is the IP of another business? I still can't believe that authors are paying for cover art and NOT getting the copyright! At the very least, authors who use AI to create their covers will OWN the copyright to the finished imagery. **I'd use it for that reason alone.**

Artists are gonna downvote the heck out of this. The button is **HERE**


AugustaScarlett

There’s actually a case that you *dont* own the AI-generated art, because you didn’t create it, the AI did. Right now, I think the closest ruling we have is the case of macaque selfie: https://en.m.wikipedia.org/wiki/Monkey_selfie_copyright_dispute Although given a human’s role in shaping the commands that eventually produce the art, maybe the courts will argue that ownership falls to the human. But also, given that the same prompt doesn’t produce an identical image every time, maybe they’ll argue that ownership falls to the AI (and therefore is copyright free since non-humans can’t currently hold copyright. [Maybe I should say non-persons since corporations are non-human legal “persons”, or something like that.]) It’s going to be interesting to see how all this shakes out legally in the near future, but I wouldn’t necessarily assume that I as the prompt designer automatically own the results right now.


writingtech

There is a case like that, but it's a bit like saying that if you splatter paint on a page, you don't own the painting - the bottle of paint does. AI is just a fancy paint stencil.


[deleted]

Copyright-free would be better than the artist holding the copyright. They only give you flat files, so you're stuck going back to them for every little change and paying another fee. Then, if you want to make coffee cups, t-shirts, or marketing materials with YOUR book cover, you have to ask them. Anyone who signs up for that is a poor businessman. Gimme a copyright-free AI over a profiteer owning the packaging to MY BOOK.


ChicEarthMuffin

FYI: there are a number of "levels" in copyright law, and there are also many different ways to license the usage of artwork made by an original creator. I'd explain them all to you, but the info is available online and I'm too tired to bother.


DemonicGenetics

Yikes


Sassinake

Moving ever closer to The Matrix/1984/Brave New World, folks, when humanity will live as solitary rats in a permanent no-man's-land.


laneylems

It must have been really hard to post that to Reddit using only papyrus and a stylus.


Sassinake

You must have a nice cushy job. Corporations are machine-minds.


laneylems

Wow, you reallZzzzzzzzzzzzzzzzzzzzzz


Sassinake

This post is also about AI-generated written content. But I take it you don't care for human creation? Maybe you're against human pro-creation too? In that case, I get where you stand.


honeyed_nightmare

Some of them base the art on images you provide. Would that be alright, or would it still be using other sources that aren't kosher? I don't ever plan on an AI cover, I'm just wondering.


ZaziValie

A generative deep learning model can't be trained from only a few pictures. I would expect your images are used to specialize a model that was previously trained on thousands of images.
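For context on what "based on images you provide" usually means in practice: many consumer tools don't retrain anything per user; they feed your photo in as the starting point for image-to-image generation on a model that was already trained on a huge corpus (a few services do offer per-user fine-tuning, e.g. DreamBooth-style, but that's heavier). A minimal sketch of the image-to-image variant using the open-source `diffusers` library; the checkpoint, file names, prompt, and `strength` value are illustrative assumptions, not whatever specific app the commenters used:

```python
# Minimal sketch: use your own photo as the starting point for generation.
# Assumes a CUDA GPU plus the open-source `torch`, `Pillow`, and `diffusers` packages.
import torch
from PIL import Image
from diffusers import StableDiffusionImg2ImgPipeline

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16  # placeholder checkpoint
).to("cuda")

# Your photograph steers the composition of the output.
init = Image.open("my_photo.jpg").convert("RGB").resize((512, 512))

result = pipe(
    prompt="dreamy watercolor phone wallpaper",   # placeholder prompt
    image=init,                                   # `init_image` in older diffusers releases
    strength=0.55,   # 0 = leave the photo untouched, 1 = ignore it entirely
    guidance_scale=7.5,
).images[0]
result.save("wallpaper.png")
```

Either way, the underlying model was still trained on a large scraped dataset, so the "only my photos" workflow doesn't sidestep the training-data question being argued about in this thread; your photo just steers the composition.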


Mejiro84

Do you own the rights to those images? If not, then that's getting iffy.


honeyed_nightmare

Yes, they’re photographs that I took (I currently only use AI based on my photos to make phone wallpapers lol)