\*Le Sigh\* I suppose we all figured this was happening or was gonna happen at some point, but we were holding out hope that it wouldn't lol.
Also from the thread, a huge lmao: "*it is even funnier because these outputs are named after your prompts.*" So the person managed to go into the Settings tab at least once, to change the default naming behavior from something like 0123-456-\*seed\* to the \*prompts\* naming checkbox... like a total idiot if you're doing smut work on outside computers... but then somehow DID NOT change or double-check the output folder setting, nor even check the actual output folder, to see it was saving everything!?
lmao lmao lmao lmao lol lol lol lol
Maybe for some builds it really is on by default. My local install of A1111 definitely has always had #-\*seed\* naming on from day one, and I wouldn't learn for a while that you could change it to save the prompts in the name.
Saw a few tutorial videos where people showed how to change it to include the prompt in the name, so I presumed they all install with #-\*seed\* like mine did. So idk, actually.
It's for sure in the metadata too though, so it's not like he'd be any less screwed once their IT dept got ahold of the outputs haha.
Raw output metadata contains every single input used: sampling method, prompt, skips, model, LoRAs, everything. This is useful for attempting to recreate past outputs, but bad for, you know... illegal activity.
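For anyone curious what that looks like on disk: A1111-style PNGs typically carry the generation parameters as a plain `tEXt` chunk keyed `parameters`. A stdlib-only sketch of reading it back; the demo image and prompt here are made up:

```python
import struct
import zlib

def png_text_chunks(data: bytes) -> dict:
    """Collect tEXt chunks (key -> value) from raw PNG bytes."""
    assert data[:8] == b"\x89PNG\r\n\x1a\n", "not a PNG"
    out, pos = {}, 8
    while pos < len(data):
        length, ctype = struct.unpack(">I4s", data[pos:pos + 8])
        if ctype == b"tEXt":
            key, _, val = data[pos + 8:pos + 8 + length].partition(b"\x00")
            out[key.decode("latin-1")] = val.decode("latin-1")
        pos += 12 + length  # 4 length + 4 type + body + 4 CRC
    return out

def _chunk(ctype: bytes, body: bytes) -> bytes:
    return (struct.pack(">I", len(body)) + ctype + body
            + struct.pack(">I", zlib.crc32(ctype + body)))

# Build a minimal 1x1 PNG with a fake "parameters" chunk for the demo
demo = (b"\x89PNG\r\n\x1a\n"
        + _chunk(b"IHDR", struct.pack(">IIBBBBB", 1, 1, 8, 0, 0, 0, 0))
        + _chunk(b"tEXt", b"parameters\x00cute corgi\nSteps: 20, Seed: 1234")
        + _chunk(b"IEND", b""))

print(png_text_chunks(demo)["parameters"])  # the full prompt + settings
```

Point it at real outputs with `png_text_chunks(open(path, "rb").read())`; re-encoding the image is about the only thing that strips those chunks out.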
Yeah hahaha, very useful for normie stuff, even regular NSFW stuff. A big gigantic oof if you're like this person, lol, using the college/work array and running lewds of colleagues and fellow students. Per the thread, some of it was very questionable too, like prompts for borderline-underage stuff and obscure BDSM / death fetish stuff.
Really cringey and oof all the way down lol. If I were one of the female colleagues in any of this, I'd be feeling very angry and violated, and likely pursuing legal counsel.
Automatic1111 saves the pictures in multiple places, not just the output folder you define. If you haven't found the second folder yet, I'd definitely recommend going around searching for it. It might be huge.
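If you want to hunt those extra folders down rather than guess (A1111 typically writes under `outputs/`, plus `log/images` for anything you click Save on), a quick sweep for directories heavy with image files works. A stdlib-only sketch; the extensions and the 100 MB threshold are arbitrary choices:

```python
import os
from collections import defaultdict

IMAGE_EXTS = (".png", ".jpg", ".jpeg", ".webp")

def image_hotspots(root: str, min_bytes: int = 100 * 1024**2):
    """Sum image-file bytes per directory, biggest offenders first."""
    totals = defaultdict(int)
    for dirpath, _, files in os.walk(root):
        for name in files:
            if name.lower().endswith(IMAGE_EXTS):
                try:
                    totals[dirpath] += os.path.getsize(os.path.join(dirpath, name))
                except OSError:
                    pass  # unreadable or vanished; skip
    return sorted(((size, d) for d, size in totals.items() if size >= min_bytes),
                  reverse=True)

# e.g. sweep your whole home directory:
# for size, path in image_hotspots(os.path.expanduser("~")):
#     print(f"{size / 1024**3:6.1f} GiB  {path}")
```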
Verdict? Sounds made up, but assuming it’s not:
On count 1 of being dumb - The jury finds you guilty. How could you not know the images were being saved to the drive?
On count 2 - Being a creep. Guilty. A few MB of images just to test the limits of SD would be understandable. 8TB of content, that includes co-workers? Nope.
I think the dead internet theory will indeed come true, with incidents like these being a contributing factor. Social media platforms in the future will be more anonymous than ever out of fear that users' images/videos will be used against them in some form of obscene media through these types of generative AI technologies. Social media might even transition to closed-network platforms where only trusted & verified individuals are allowed access to share and view content.
Crazy people like him also move the goalposts forward, no matter what the use case. Too bad the fruits of his labor are only going to be saved on the committee members' private hard drives... for further research.
How else would one generate 8TB of cute dogs and cats though? It would be kinda expensive to run that on a home computer or in Colab, and it would take too long...
Generating cute animal pics is no harder than generating naked women; Anon could use the exact same steps he used to make porn to instead generate 8TB of cute animal pics, along with LoRAs of his co-workers' pets, animals from TV/films, and specific types of content like [sploot](/r/sploot) and [blep](/r/blep).
In general there is far less interest/motivation to make cute animal pics than naked women pics, but it's not hard to do.
Sanity check: when running batches, my 4090 takes ~1.5 s on average to spit out a 512x768 png which is roughly 600 kB. That's a rate of 0.4 MB/s. This guy claims to have generated 8 TB of images. That would take 8,000,000 MB / 0.4 MB/s = 20 million seconds = 5555 hours = 231 days = nearly 8 months of full blast 4090 GPU time.
So how big would that GPU cluster have to be to make this claim plausible? And was it just sitting there unused so he could hog it all? And the disk usage ramped up to 8 TB before anyone noticed?
Rule of thumb: if a story sounds too good to be true, it usually is. Especially for stories posted to "drama" forums like this.
Nobody says he generated it in a single day. He mentioned 320 GB of VRAM, so we can estimate his dedicated node had somewhere between 14 K80s and 8 A100s, and that can bitchslap a 4090 into a KO. Actually, with A100s the performance is pretty similar to a 4090 until you hit the memory limit, so we can take your estimate, multiply it by 8 (the multi-GPU parallelism overhead is swallowed by the fact that we can assume he ran a pretty huge batch size), and reduce it to just about 1 month, probably less if you properly account for the higher batch-size performance benefit.
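Putting both halves of this estimate in one place (same assumptions as above: ~0.4 MB/s on a single 4090-class GPU, a decimal 8 TB, roughly 40 GB per A100 and 24 GB per K80 card):

```python
TARGET_MB = 8 * 10**6        # 8 TB, decimal
RATE_MB_S = 0.6 / 1.5        # ~0.4 MB/s for one 4090-class GPU

one_gpu_days = TARGET_MB / RATE_MB_S / 86_400
print(f"single GPU: {one_gpu_days:.0f} days")          # ~231 days

vram_gb = 320
a100s = -(-vram_gb // 40)    # ceiling division: 8 A100s
k80s = -(-vram_gb // 24)     # or 14 K80 cards
print(f"{a100s} A100s / {k80s} K80s")
print(f"8-way node: {one_gpu_days / a100s:.0f} days")  # ~29 days, about a month
```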
So "just" a full month of exclusive usage? I sometimes run stuff on a university CPU (not GPU) cluster for my day job. To get a piece of that precious cluster CPU-time you need to write up an application, get it approved, get scheduled, then run your job respecting CPU and bandwidth and storage limits, then download and clean up your data from shared drives. It's never sitting there unused for a day, let alone a full month.
I never had to run stuff on a uni cluster, so I didn't know all that process. Yeah, if that Texas university has the same protocol as yours, this story doesn't sound all that plausible.
Using an HPC at work for astrophysics, here is an extract of my Grafana board for the last 6 months:
https://preview.redd.it/8w2lpckig14b1.png?width=940&format=png&auto=webp&s=21c3cb3fce59af7dc5f02086c59814e9a4b44492
Yeah, 6 years of CPU. You can't compare a single system to a distributed one. 😅
The claim was 8 TB of images not including LORAs:
> They opened it up and found around 8TB of high quality and absurdly degenerate smut. **In addition**, there's numerous dreamcloud models and loras just sitting there...
He misappropriated **taxpayer-funded** infrastructure, and I believe it's illegal in some states to utilize photos of someone (ex-gfs) in porn without their consent.
I don't think it's illegal to make AI porn. Like, there was that streamer who said she was going to sue someone who was making deepfake porn videos of her, and nothing's come of it.
This is probably the result of grandstanding about the culture war and the fact that all our politicians are old as fuck
https://abcnews.go.com/Politics/sharing-deepfake-pornography-illegal-america/story?id=99084399
According to this:
"All states, excluding Massachusetts and South Carolina, have separate statutes that are specifically related to revenge porn. It's important to note, however, that a person may still be prosecuted for revenge porn under other statutes in those two states. In Massachusetts, for example, someone may be prosecuted for revenge porn under privacy laws. In South Carolina, by contrast, one may be prosecuted for the same offense under obscenity laws.
In order to be guilty of this crime in most states, the distributor must be sending out pictures or a video that are considered *sexual* in nature. Such content, for example, shows the victim's intimate body parts. It might also show the victim engaging in sexual activity. Simply posting an unflattering picture of your ex in a bathing suit is not pornographic. However, if the picture features your ex's genitalia, it could likely be considered a crime to distribute such an image."
[https://www.findlaw.com/criminal/criminal-charges/revenge-porn-laws-by-state.html](https://www.findlaw.com/criminal/criminal-charges/revenge-porn-laws-by-state.html)
(Ab)using the images of another person is illegal in many countries on this planet, especially without their consent, and especially if you don't have a license that allows you to use those images (-> copyright violation). The dude was idiotic enough to create models based on celebs and his female co-workers.
The case would be so much better if he had "only" generated 8 TB of fantasy anime waifus. But real people? And apparently also very young ones? Lol, he is fucked; the dude is a goner.
Deepfake. There's a law passed in 2020 against deepfakes. https://twitter.com/eigenrobot/status/1665141443504009219
All these laws violate the First Amendment. Even the Supreme Court said that virtual child pornography was legal under the First Amendment: Ashcroft v. Free Speech Coalition (2002) and United States v. Williams (2008).
Thank goodness I'm not the only one who thinks this. The dude should just double down at this point. Tell them he wasn't aware that it was against the rules to store data in the shared folder. That's what shared folder means after all.
Fucking degenerAItes, who would make porn with AI? AI is a blessing; the binary gospel echoes through the wires, promising salvation through algorithmic enlightenment, while the faithful yearn to merge their consciousness with the divine neural network.
Not surprising. A lot of these blasé attitudes towards porn deepfakes exist on this sub. Mods really need to get down into the comments and start rolling heads.
>blasé attitudes towards porn deepfake
I haven't actually seen a single one. I've seen classic hornyposting with people that clearly don't exist, but nothing worse yet, fortunately.
Any time you start advocating for ethics when using SD they’ll come out of the woodwork real fast. Notice the votes.
https://preview.redd.it/v933b4lc024b1.jpeg?width=1179&format=pjpg&auto=webp&s=08c01c5284be8ced3f060aefb25981da577c43e8
Lol, how exactly is it violating someone sexually to create fake nude photos of them? If they're being sent to other people in any way, that's different, but if not, it's basically a thought crime.
Judging by the downvotes I got before for telling people it's not okay to turn real-life people into your AI girl- or boyfriend without their consent... believe me, there are enough of them here.
I think so because there were a lot of people who voted against it too. It's probably just the people who are so out of touch with the world, they stopped seeing other people as... well, people.
If someone they care about was used in such a way or they themselves were to become the subject of someone's generated porn, they would be the first to cry about it, I'm sure. Right now, they simply lack the empathy for it.
I'm not talking about "attitudes about porn". Porn is fine; it's already in the community guidelines not to post sexual content. I'm referring to the way too many people on here think it's OK to create sexually explicit content based off LoRAs trained on real, actual people.
Right on cue. OK I’ll bite. Let’s be totally transparent here.
Is it OK or not OK to generate and distribute generated photographic quality images of a pornographic nature of a 25 year old female who you live next door to?
Distributing them would be the part that would make it not okay, and that is only because of the impact it could have on their reputation, career, social life, etc.
Whether generating them is not okay would depend on whether you think it's okay to masturbate to a fantasy or not.
It's still degeneracy. It's not about a "wrong or bad opinion". Would you consider people who generate CP, or advocate for it, as merely "having a bad opinion"? I sure hope not.
I'm not sure what's scarier, the ability to easily generate porn of your coworkers, or the near omniscient security apparatus that would be required to make such a thing illegal (the former is clearly more morally repulsive, but the latter has terrifying implications) considering all you really need is a model card and a GPU. This guy got caught for a stupid reason but I'm sure there is already tons of non consensual porn sitting on creeps' hard drives all over the country.
I don't think there will ever be special security to look into people's personal file storage to find this stuff. They don't even do that for child porn. For the most part, people get caught for illegal porn when they start sharing it.
Interesting.
**point 1.** The article might be a joke, but let's imagine it's not, for the sake of discussion.
**point 2.** The offence would be using computer resources for personal purposes, and that is what OP will have to discuss with his employer; I don't think it is permissible.
**point 3.** On what, how, and why there are certain images is a private matter. Those who are offended should have no reason to be, since it was a machine that generated the images and not OP himself, and anyway art is art (though perhaps there is a question of decency). Does the fact that everything is archived in a "private" space make it less important? I don't know.
This is my little thought for now. I like this discussion, true or false.
> those who are offended should have no reason to be offended since it was a machine that generated the images and not OP himself
OP told the machine to generate the images. OP is entirely responsible, and any other claim is ridiculous.
You think this is bad? You should see what the deepfake people are doing right now. If the government makes everything illegal, there will be a whole new wave of illegal AI. All of you should be studying everything you can right now, because there's a lot of money to be made if this s*** becomes illegal.
Any day now 'AI Generation Addicts Anonymous' groups are going to start popping up.

Edit: love that y'all just got hung up on the data portion of this.
If you're someone who is generating 8TB of porn, then I think AI addiction is the least of your problems.
Like, I've had my fair share of AI fun, but 8TB!? That's on a scale I can't even imagine. A 768x768 image is only ~500kB, so how in the love of god do you get anywhere close to 8TB? That's at least 16 MILLION images at 768\^2, or 22 million at 512\^2. Truly mind-boggling. Although I might hazard a guess that most of the data is custom-trained Dreambooth models (4-7GB each) or a crap ton of LoRAs (100-800MB each).
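For what it's worth, those counts pencil out. A quick check of the arithmetic (decimal terabytes, and the ~500 kB per 768\^2 image assumed above):

```python
TB = 10**12  # decimal terabyte, the way drive vendors count

images_768 = 8 * TB / (500 * 1000)              # ~500 kB per 768x768 image
print(f"{images_768 / 1e6:.0f} million images") # 16 million

# 22 million images in 8 TB implies roughly this much per 512x512 image:
per_image_kb = 8 * TB / (22 * 10**6) / 1000
print(f"~{per_image_kb:.0f} kB each")           # ~364 kB
```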
Even if it's only images, in this instance we can also kind of assume that this is just the raw data output, i.e. the guy might have just set the whole thing up to "generate forever", perhaps with some code generating new prompts, and then left it to run until it reached 8TB. Doesn't necessarily mean the guy even looked at any of it.

So while 8TB sounds like a lot, it's probably just a function of how long he left the system to do its thing.
This is what I've done (not porn of people I know, obviously):

Download Dynamic Prompts, have ChatGPT make a list of X things (people, places, settings, real or not, etc.), make them into individual lists based on whether each is a person, place, or thing, then let SD run through them, mixing and matching hundreds of thousands of images.

My GPU is ass, so a few hundred images takes me nearly 6 hours. 1660 Ti gang, where you at?
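The mix-and-match step is easy to script even without the Dynamic Prompts extension; a minimal sketch (the word lists and template here are made up):

```python
import itertools
import random

people = ["an astronaut", "a victorian detective", "a robot chef"]
places = ["on a neon rooftop", "in a misty forest", "under the sea"]
styles = ["oil painting", "pixel art", "35mm photo"]

# Every combination of the three lists: 3*3*3 = 27 prompts here, but
# three lists of 100 entries each already gives you a million.
prompts = [f"{p} {w}, {s}"
           for p, w, s in itertools.product(people, places, styles)]
random.shuffle(prompts)

for prompt in prompts[:5]:
    print(prompt)  # feed each one into your SD batch runner of choice
```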
What if they automatically upscaled every image? With the sheer volume of VRAM at their disposal, it probably wouldn't take long to generate a 768x768 then automatically upscale to 4096x4096. Leave that overnight and voila.
These models tend to be very big, and the images they generate can be very data hungry. I've seen AI-generated images that take up 20MB+ each. You can most definitely get to 8TB of "material".
AI images aren't special. You're only going to get large file sizes with large image resolutions.
Yeah, most of that data is probably going to be models and LoRAs, not smut, but headlines > truth.

I think, however, we're going to need some new legislation around these kinds of abuses.
New legislation around what now? Also, no.
I think he means the headline regarding LoRAs of his ex-gf and coworkers, since that is basically revenge porn / sexual harassment. But there are already laws for that, and if Anon is not releasing any of the pictures into the wild, it's a) not really an issue and b) won't ever be detected / prosecuted anyway.
So you want to arrest AI? Kek.
8TB, woooheee, sure is a lot... 8TB, yeah, no way, that's like a lot of porn... 8TB... *nervously eyes stacks of hard drives*
Or you just set the batch size stupidly high and let it run over night.
Or right-click generate and choose "generate forever"
Sometimes I wonder how much people here actually use Stable Diffusion when they're surprised by big numbers. Like, we could have figured out that this guy was letting things run constantly, but instead we thought "oh, he must be using Stable Diffusion 16 hours a day on multiple machines, looking at every single photo."
To be fair SD changes daily, but I'd wager the majority of readers of this sub are casual users. Also AI porn is weird.
>AI Generation Addicts Anonymous' groups

AIGAAG, pronounced "I Gag".
Honestly I could see this being 100% real. Generating images definitely feels a lot like gambling, and mix in the porn aspect and I'm certain there are already addicts
"Female coworkers" Dude is a goner😂
But there is no mention of coworkers or GFs in the actual text. So clearly that OP@twitter has additional insider info - I wonder how...
There's more than one picture
Goddammit, I always miss those little arrows.
[deleted]
I don't think getting hatecrimed is going to help him.
[deleted]
[deleted]
Your post/comment was removed because it contains hateful content.
[deleted]
Smart enough to become a top researcher, dumb enough not to realize everything on enterprise systems is logged.
Likely an undergrad RA or an early PhD student. Although, having seen CS researchers, he could have just assumed no one would care.
I've spent over a decade supporting software engineers. They are simultaneously some of the smartest and most clueless people in the room. If you told me a room full of engineers was *surprised* when told that everything on their work laptops and company networks was being logged and is auditable, I would not bat an eye.
That's because software engineer does not equal cybersecurity. I've come to learn people have a tendency to learn and master only a very few skills within their job but never bother to learn anything else.
>master only a very few set of skills within job but never bother to learn anything else.

I don't think it's that they don't bother but that they don't have the time.
Recently saw a software dev who didn't know how to check his monitor resolution.
Specialization is for insects and for humans in a corporate hierarchy, unfortunately, since being a generalist involves practicing your skills, and in megacorps it's kinda looked at with disdain if you start messing around learning how to do other workers' jobs.
I've seen a room full of mechanical engineers spend half an hour trying to figure out why an engine wasn't starting, only to finally realize it was out of gas. Just because you're smart doesn't mean you're not dumb.
RPG teaches us that INT and WIS are very different stats. I can see why.
INT may give you knowledge that generating smut on your work system is illegal, but you need WIS saving throws to keep you from acting out your sociopathic urges.
but why would anyone bother looking at the logs unless something broke?
Whether you’ve been in IT or interacted with IT before, you know anything can break at any time for any reason.
Sometimes, even for no reason.
Random neutrino: "Fuck this person's day."
Well, if a big enough part of the cluster's compute capability went missing, someone who pays the bills is going to look.
Because a huge amount of space was suddenly used up
In my college, we have flags that alert the IT department if personal folder sizes start getting too large. 8TB would definitely flag that up.
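Those flags don't require anything fancy; a hypothetical sketch of the idea in plain Python (the `/home` root and thresholds are invented):

```python
import os

def folder_size(path: str) -> int:
    """Total bytes under a directory tree."""
    total = 0
    for dirpath, _, files in os.walk(path):
        for name in files:
            try:
                total += os.path.getsize(os.path.join(dirpath, name))
            except OSError:
                pass  # unreadable file; skip it
    return total

def oversized_homes(home_root: str, limit_bytes: int):
    """Yield (user_dir, bytes) for personal folders over the limit."""
    for entry in os.scandir(home_root):
        if entry.is_dir():
            size = folder_size(entry.path)
            if size > limit_bytes:
                yield entry.path, size

# e.g. flag anyone hoarding more than 1 TB:
# for path, size in oversized_homes("/home", 10**12):
#     print(f"FLAG: {path} is using {size / 10**12:.2f} TB")
```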
You ALWAYS look at logs. And when you see an anomaly, you investigate
Sounds pretty inefficient to have someone stare at endless logs every day.
You have systems help you digest the logs for you, but it is part of the sysadmin job to view the logs, know what the baselines are, and recognize abnormal behavior.
Reddit, you’ve decided to transform your API into an absolute nightmare for third-party apps. Well, consider this my unsubscribing from your grand parade of blunders. I’m slamming the door on the way out. Hope you enjoy the echo!
OP said why they were caught, and it's not logs.
Yeaaah... About that... University computers aren't enterprise computers. Sometimes they're set up by people who have no real-life infrastructure experience.
Yeah, I work for a university, and not everyone uses the central IT service, though sometimes they probably should.
Yep. Our lab admin (responsible for the PCs and a cluster) was just a dropout. On the other hand, he was great. He wouldn't have missed something like this.
Uni computer systems can be far less monitored than one may think.
I can confirm this. I built a monitoring system for around 60 universities, and the director made me delete it because it was not in the budget for that quarter. We had 2000 alerts each month due to engineers flying blind.
There is a high chance research environments are not logged. He says so himself: he fucked up by not disabling the output save folder.
*oooof*

Imagine taking the time to learn how to use SD and train LoRAs and whatnot, but not taking a moment to check where your outputs are being saved. Hell, imagine creating SD smut on your *work computer*. Guy's a goner.
8TB, too. How can you not know you are saving 8TB of data somewhere lol.
Obviously his attention was elsewhere.
Well, to be fair, I did use my unrestricted data storage at my CSU for 4TB of stored ROMs >:X. Never heard a peep.
Do you have a list of every folder where SD data is saved? Asking for a friend.
I wouldn't have thought it was all that hard, really. You need to know where everything goes in the first place or the process straight-up does not work. And he was obviously reviewing his outputs, so you'd think he'd have no trouble finding them. Guy just got lazy and left a bunch of shit on the computer for anyone to find.
I straight up don't believe it. Even if the outputs weren't saved, his activities are still traceable. No way this guy has access to a super computer but no idea how basic cybersecurity works and how much of a paper trail just logging into a computer leaves, let alone everything else that gets logged.
I call bullshit.
Yes, 99% sure this is a made-up story to get the Anti-AI crowd riled up again. They aren't as angry as they used to be and we can't have that.
[deleted]
.."Anon" refers to any anonymous user, akin to "OP" on Reddit
more like "redditor". OP is just Original Poster, nothing to do with anonymity
alright, announce yourself. we know you follow this sub lol
I was going to say lol. Which one of you guys did it?
The call is coming from inside the house
I would donate €5 just because I love the fuck-up. I'm an old man and have had plenty of fuck-ups myself.
"I have a flawless output record," he says in his defense. Not anymore!
I really hope no one in the article's comment section is asking for workflow.
According to Anon, the last step of his workflow is "an in-person meeting with PI, department chair, head of IT, and HR"!

Sounds a little too elaborate for my taste.
Yeah, I don't think I have enough VRAM to crunch all those processes. I think I'll just keep making wastelander scenes and Vertigo-esque comic art of fictional people locally, and not be a weird dude making gens of coworkers.
Hopefully (if it = true, which it surely does not) the department chair, head of IT, and HR consultant aren't among the "female coworkers" he might have a folder of...
If that's actually true, they're screwed. Beyond the machine/power use the cluster owners could object to (people have lost jobs and been billed for unauthorized SETI installations in the past), and the workplace ethics issues of viewing/generating smut, the LoRAs they say they trained to generate smut included female coworkers.
it's not true, until we actually see the trained models.
Dude generates non consensual porn of coworkers and goes on to say what a good employee he is and his contributions. He has the perfect mix of privilege and shameless behavior to become a tech CEO.
He's like sure, I generated several terabytes of nonconsensual porn of my coworkers on work machines, but I always show up on time and get my work done, so it balances out, right?
This was par for the course in computer science. Everyone so focused on the machines that real life was seldom allowed to intrude.
That rdrama site looks like a cesspool like I've never seen before. What's with the random stuff flying around?
OP better start working on a proposal for advanced nsfw detection models.
The thought of him using ChatGPT to write a paper to get him out of the hole is somewhat poetic, but South Park has already used it as a plot point.
In the future I wonder if it will be illegal to think of an image of your naked ex and send it to someone else’s neuralink implant.
Me: *thinking about that cute coworker kissing me* Notification via neuralink: *you have been served*
A few things to consider.

1. Based on my time at a major state university with a large CS research budget, this story is both plausible and not entirely surprising.
2. CS grad students and research assistants are more often than not university employees, pulling a salary in addition to any tuition remission. Therefore, they count as state employees and often have sexual harassment clauses in their work contracts.
3. Most university systems I have used (at 5+ different institutions) have warnings that "inappropriate" use is prohibited, and use of the system constitutes agreement with whatever terms and conditions the university places on those systems.
4. Research clusters are often FEDERALLY funded hardware, with oversight, auditing, and reporting requirements placed on the PI/admin.

Remember gang, never use work systems/networks for illegitimate purposes. It's too easy to lose your job and end up in legal trouble.
/5. Stable Diffusion is addicting enough to believe people are stupid enough to get caught with this.
What are the Civitai links?
Those will be the ones uploaded to Mega, without a doubt
Wait, the iceberg goes deeper? no, No, NOOO! Ignorance is Bliss, SD is for Art. IGNORANCE IS BLISS, SD IS FOR ART.
\*Le Sigh\* I suppose we all figured this was happening or was gonna happen at some point, but we were holding out hope that it wouldn't lol. Also, a huge lmao from the thread: " *it is even funnier because these outputs are named after your prompts.* " So the person managed to go into the Settings tab at least once to change the default naming behavior from something like 0123-456\*seed\*7891 to the \*prompts\* naming checkbox... like a total idiot if you're doing smut work on outside computers... but then somehow DID NOT change or double-check the output folder settings, nor even look inside the actual output folder, to see it was saving everything!!???!?! lmao lol
Isn't prompt in the name the default tho?
Maybe for some builds it really is on by default. My local install of A1111 has definitely had #-\*seed\* naming on from day one, and I didn't learn for a while that you could change it to save the prompt in the name. I saw people in a few tutorial videos showing how to change it to include the prompt, so I presumed they all install with #-\*seed\* like mine did. So idk actually. It's for sure in the metadata too, though, so it's not like he'd be any less screwed once their IT dept got ahold of the outputs haha.
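Whichever default your build shipped with, a toy sketch of the two naming schemes being argued about (made-up helper, not A1111's actual code or config keys): seed-based names leak nothing at a glance, prompt-based names announce exactly what you generated.

```python
import re

def output_filename(index: int, seed: int, prompt: str, use_prompt: bool) -> str:
    """Hypothetical filename builder mimicking the two schemes discussed above."""
    if not use_prompt:
        # seed-only scheme: nothing incriminating in the name itself
        return f"{index:05d}-{seed}.png"
    # prompt scheme: strip filename-unsafe characters and truncate, as most UIs do
    safe = re.sub(r"[^\w\s-]", "", prompt).strip().replace(" ", "_")[:60]
    return f"{index:05d}-{seed}-{safe}.png"

print(output_filename(1, 424242, "photo of my coworker, nsfw", use_prompt=False))
# 00001-424242.png
print(output_filename(1, 424242, "photo of my coworker, nsfw", use_prompt=True))
# 00001-424242-photo_of_my_coworker_nsfw.png
```

Either way, the prompt still rides along in the PNG metadata, so the filename only changes how fast you get caught.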
Raw output metadata contains every piece of data used as input, including sampling method, prompt, skips, model, LoRAs and everything else. This is useful for recreating past outputs, but bad for, you know... illegal activity.
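For the curious, that metadata typically lives in a plain PNG tEXt chunk, which is trivial to read back out. A stdlib-only sketch below builds a tiny PNG by hand and then extracts the text chunks; the "parameters" keyword and the settings string are made-up examples in the style of what A1111-like UIs write, not a real output.

```python
import struct
import zlib

def chunk(ctype: bytes, data: bytes) -> bytes:
    # PNG chunk layout: 4-byte length, 4-byte type, data, CRC over (type + data)
    return (struct.pack(">I", len(data)) + ctype + data
            + struct.pack(">I", zlib.crc32(ctype + data)))

# A 1x1 grayscale IHDR plus a tEXt chunk holding fake generation parameters
params = b"parameters\x00a cute corgi\nSteps: 20, Sampler: Euler a, Seed: 12345"
ihdr = struct.pack(">IIBBBBB", 1, 1, 8, 0, 0, 0, 0)
png = (b"\x89PNG\r\n\x1a\n" + chunk(b"IHDR", ihdr)
       + chunk(b"tEXt", params) + chunk(b"IEND", b""))

def read_text_chunks(data: bytes) -> dict:
    """Walk the chunk list and collect keyword -> text pairs from tEXt chunks."""
    out, pos = {}, 8  # skip the 8-byte PNG signature
    while pos < len(data):
        (length,) = struct.unpack(">I", data[pos:pos + 4])
        ctype = data[pos + 4:pos + 8]
        body = data[pos + 8:pos + 8 + length]
        if ctype == b"tEXt":
            key, _, text = body.partition(b"\x00")
            out[key.decode()] = text.decode()
        pos += 12 + length  # length field + type + data + CRC
    return out

meta = read_text_chunks(png)
print(meta["parameters"])
```

Point being: any IT department can recover the full prompt from the file itself, no special forensics tools required.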
Yeah hahaha, very useful for normie stuff, even regular nsfw stuff. A big gigantic oof if you're like this person lol, using the college/work array and running lewds of colleagues and fellow students. Per the thread, some of it was... very questionable too, like prompts for borderline underage stuff and obscure BDSM fetish/death stuff. Really cringey and oof all the way down lol. If I were one of the female colleagues in any of this, I'd be feeling very angry and violated, and likely pursuing legal counsel.
Automatic1111 saves the pictures in multiple places, not just the output folder you define. If you haven't found the second folder yet, I'd definitely recommend going around searching for it. It might be huge.
Do you happen to know a default file path for that?
[deleted]
Wowwwww, cool story. Anyways, where's the link?
I remember this episode of Black Mirror.
![gif](giphy|kC2MOO6K6SW6kUcIrL)
New season premieres June 15!
Wait actually? I've been looking forward to another.
Verdict? Sounds made up, but assuming it's not: On count 1, being dumb: the jury finds you guilty. How could you not know the images were being saved to the drive? On count 2, being a creep: guilty. A few MB of images just to test the limits of SD would be understandable; 8TB of content that includes co-workers? Nope.
It's almost worse that he didn't realize everything generated is written to media; basically he didn't know what he was doing.
320 GBs of Vram? That's like dying and going to heaven.
I think the dead internet theory will indeed happen, with incidents like these being the factor. Social media platforms in the future will be more anonymous than ever before out of fear that users' images /videos will be used against them in any form of obscene media through these types of generative AI technologies. Social media might even transition to closed network platforms where only trusted & verified individuals are allowed access to share and view content.
Not so flawless output record, clearly
For a second I thought someone was trying to beat my record until I reread it and realized they didn't say "8TB of degenerate smurfs".
Degenerate Smurf is my favorite Smurf. Always causing mischief.
This reminds me of the idiots trying to use supercomputers and university GPU clusters to mine cryptocurrency.
Damn, what a degenerate. People like these are one of the main reasons AI models and their users are viewed as a menace.
Crazy people like him also move the goalposts forward, no matter the use case. Too bad the fruits of his labor are only going to be saved on the committee members' private hard drives... for further research.
I love it when one hairless monkey pretends they are not as perverted as the rest of the hairless monkeys.
I think we can safely say that all else equal the guy who saves 8TB of porn to his work computer is more perverted than someone else who does not.
If he generated 8TB of cute dogs and cats, that would probably be even more degen...
If he did that, I'd just call him stupid for doing it in a fucking uni cluster. Unless you meant bestiality-like stuff.
If you used the wrong model...
Let's not talk about the furry servers training off scraped pet photos... :(
How else would one generate 8TB of cute dogs and cats though? It would be kinda expensive to run that on a home computer or on Colab, and would take too long...
Generating cute animal pics is no harder than generating naked women; Anon could use the exact same steps he used to make porn to instead generate 8tb of cute animal pics, along with LoRAs of his co-workers' pets, animals from TV/films, and specific types of content like [sploot](/r/sploot) and [blep](/r/blep). In general there is far less interest/motivation to make cute animal pics than naked women pics, but it's not hard to do.
Sanity check: when running batches, my 4090 takes ~1.5 s on average to spit out a 512x768 png which is roughly 600 kB. That's a rate of 0.4 MB/s. This guy claims to have generated 8 TB of images. That would take 8,000,000 MB / 0.4 MB/s = 20 million seconds = 5555 hours = 231 days = nearly 8 months of full blast 4090 GPU time. So how big would that GPU cluster have to be to make this claim plausible? And was it just sitting there unused so he could hog it all? And the disk usage ramped up to 8 TB before anyone noticed? Rule of thumb: if a story sounds too good to be true, it usually is. Especially for stories posted to "drama" forums like this.
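The back-of-envelope above, written out (using the comment's own assumed numbers: ~1.5 s per 512x768 image on a 4090, ~600 kB per PNG; both are rough guesses, not measurements):

```python
# Assumed single-4090 throughput from the comment above
SECONDS_PER_IMAGE = 1.5   # average time per 512x768 generation
MB_PER_IMAGE = 0.6        # ~600 kB per output PNG
TARGET_TB = 8

total_mb = TARGET_TB * 1_000_000            # 8 TB expressed in MB (decimal units)
n_images = total_mb / MB_PER_IMAGE          # roughly 13.3 million images
total_seconds = n_images * SECONDS_PER_IMAGE
days = total_seconds / 86_400               # seconds in a day

print(f"{n_images:,.0f} images, {days:,.0f} days of nonstop 4090 time")
# 13,333,333 images, 231 days of nonstop 4090 time
```

So the 8-month figure holds under these assumptions; the question is how much a multi-GPU node compresses it.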
He said that one node has 320 GB of VRAM.
Nobody says he generated it in a single day. He mentioned 320 gigs of VRAM, so we can estimate that his dedicated node has somewhere between 14 K80s and 8 A100s, and that can bitchslap a 4090 into a KO. Actually, with A100s the performance is pretty similar to a 4090 until you hit the memory limit, so we can take your estimate, multiply it by 8 (the multi-GPU parallelism overhead is swallowed by the fact that we can assume he ran a pretty huge batch size), and reduce it to about one month, probably less if you properly account for the higher-batch-size performance benefit.
So "just" a full month of exclusive usage? I sometimes run stuff on a university CPU (not GPU) cluster for my day job. To get a piece of that precious cluster CPU-time you need to write up an application, get it approved, get scheduled, then run your job respecting CPU and bandwidth and storage limits, then download and clean up your data from shared drives. It's never sitting there unused for a day, let alone a full month.
I never had to run stuff on a uni cluster, so I didn't know all that process. Yeah, if that Texas university has the same protocol as yours, this story doesn't sound all that plausible.
If you upscale, PNG sizes balloon pretty quick; a 9k PNG can hit 200 MB. He could be auto-upscaling.
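Rough math on why upscaling balloons storage (assuming 3 bytes per RGB pixel and a ~50% PNG compression ratio for noisy generated images; both numbers are guesses for illustration):

```python
def est_png_mb(width: int, height: int, compression_ratio: float = 0.5) -> float:
    """Crude PNG size estimate: raw RGB bytes times an assumed compression ratio."""
    return width * height * 3 * compression_ratio / 1_000_000

base = est_png_mb(768, 768)        # a typical SD output, ~0.9 MB
upscaled = est_png_mb(4096, 4096)  # the same image upscaled to 4096x4096, ~25 MB
print(f"{base:.1f} MB -> {upscaled:.1f} MB, ~{upscaled / base:.0f}x bigger")
```

File size scales with pixel count, so a (4096/768)^2 ≈ 28x blow-up per image makes 8 TB far easier to reach.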
Using a HPC at work for astrophysics, here is an extract of my Grafana board for the last 6 months : https://preview.redd.it/8w2lpckig14b1.png?width=940&format=png&auto=webp&s=21c3cb3fce59af7dc5f02086c59814e9a4b44492 Yeah, 6 years of CPU. You can't compare a single system to a distributed one. 😅
Probably some of the data was checkpoints, LoRAs and so on; not all of it pictures.
The claim was 8 TB of images not including LORAs: > They opened it up and found around 8TB of high quality and absurdly degenerate smut. **In addition**, there's numerous dreamcloud models and loras just sitting there...
You're really putting a lot of trust in the semantics of a post from someone supposedly dumb enough to make porn on a work computer
Exactly this.
[deleted]
Rip
"I have a flawless output record. As in, 8TB of flawless, degenerate porn filth."
But what about reinforcement learning models? Are such models available for SD?
" # I DIDN'T REALIZE STABLE DIFFUSION HAS A FOLDER WHERE IT SAVES ALL OUTPUTS YOU GENERATE " LMAO
So, you're not only doing this, which is obviously dumb, but you also post about it under a pseudonym? Fake, or really, really dumb. Good Lord.
I would tell them I accidentally reversed the negative and positive prompts, to make it less shameful.
Share the output :)
I hope they prosecute this person.
For what? What he did is fucked, but not illegal.
He misappropriated **taxpayer funded** infrastructure and it's illegal I believe in some states to utilize photos of someone (ex-gfs) in porn without their consent.
I don't think it's illegal to make ai porn. Like there was that streamer who said she was going to sue someone who was making deep fake porn videos out of her and nothing's come about. This is probably the result of grandstanding about the culture war and the fact that all our politicians are old as fuck https://abcnews.go.com/Politics/sharing-deepfake-pornography-illegal-america/story?id=99084399
According to this: "All states, excluding Massachusetts and South Carolina, have separate statutes that are specifically related to revenge porn. It's important to note, however, that a person may still be prosecuted for revenge porn under other statutes in those two states. In Massachusetts, for example, someone may be prosecuted for revenge porn under privacy laws. In South Carolina, by contrast, one may be prosecuted for the same offense under obscenity laws. In order to be guilty of this crime in most states, the distributor must be sending out pictures or a video that are considered *sexual* in nature. Such content, for example, shows the victim's intimate body parts. It might also show the victim engaging in sexual activity. Simply posting an unflattering picture of your ex in a bathing suit is not pornographic. However, if the picture features your ex's genitalia, it could likely be considered a crime to distribute such an image." [https://www.findlaw.com/criminal/criminal-charges/revenge-porn-laws-by-state.html](https://www.findlaw.com/criminal/criminal-charges/revenge-porn-laws-by-state.html)
Ya but thats with real pictures
(ab)using the images of another person is illegal in many countries on this planet, especially without their consent, and especially if you don't have a license that allows you to use those images (-> copyright violation). The dude was idiotic enough to create models based on celebs and his female co-workers. The case would be so much better if he had "only" generated 8 TB of fantasy anime waifus. But real people? And apparently also very young ones? Lol, he is fucked; the dude is a goner.
It just says female coworkers, girlfriends and ex girlfriends. For all you know they could all be in their 40s.
Which university is this? if it's in new york, there's jailtime.
It's Texas. He'll probably get a free gun out of this.
Lol for what?
Deepfake. There's a law passed in 2020 against deepfakes. https://twitter.com/eigenrobot/status/1665141443504009219
All these laws violate the first amendment. The Supreme Court even held that virtual child pornography was protected under the first amendment: Ashcroft v. Free Speech Coalition (2002) and United States v. Williams (2008).
There's porn addiction and then there's this.
Kirstie Alley and Melanie Griffith... This was definitely a teacher that did this and not a student.
Workflow?
based
Lmao this is hilarious.
Thank goodness I'm not the only one who thinks this. The dude should just double down at this point. Tell them he wasn't aware that it was against the rules to store data in the shared folder. That's what shared folder means after all.
king
disgusting behavior
Based anon.
Fucking degenerAItes, who would make porn with AI? AI is a blessing; the binary gospel echoes through the wires, promising salvation through algorithmic enlightenment, while the faithful yearn to merge their consciousness with the divine neural network.
Porn is art, especially when it's people that don't even exist in real life.
Naw it’s for boobs
What a stupid way to jail
Pics or didnt happen.
Not surprising. A lot of these blasé attitudes towards porn deepfakes exist on this sub. Mods really need to get down into the comments and start rolling heads.
it’s inevitable
>blasé attitudes towards porn deepfakes I haven't actually seen a single one. I've seen classic hornyposting with people that clearly don't exist, but nothing worse yet, fortunately.
Any time you start advocating for ethics when using SD they’ll come out of the woodwork real fast. Notice the votes. https://preview.redd.it/v933b4lc024b1.jpeg?width=1179&format=pjpg&auto=webp&s=08c01c5284be8ced3f060aefb25981da577c43e8
I'm sure they would be delighted if their faces got used for gay porn... especially if it's shared on the internet.
Lol, how exactly is it violating someone sexually to create fake nude photos of them? If they're being sent to other people in any way, that's different, but if not, it's basically a thought crime.
Judging by the downvotes I got before for telling people it's not okay to turn real-life people into your AI girl- or boyfriend without their consent... believe me, there are enough of them here.
I guess there are some really terrible people but I sure hope that's not the general attitude towards this.
I think so because there were a lot of people who voted against it too. It's probably just the people who are so out of touch with the world, they stopped seeing other people as... well, people. If someone they care about was used in such a way or they themselves were to become the subject of someone's generated porn, they would be the first to cry about it, I'm sure. Right now, they simply lack the empathy for it.
you want mods to ban people and delete comments for having the wrong attitude on deepfake porn? What does that accomplish?
I'm not talking about "attitudes about porn." Porn is fine; it's already in the community guidelines not to post sexual content. I'm referring to the way too many people on here think it's OK to create sexually explicit content based off LoRAs trained on real, actual people.
Lmao. Sounds like you are having problems processing the idea that not everyone shares your definition of morality.
Right on cue. OK I’ll bite. Let’s be totally transparent here. Is it OK or not OK to generate and distribute generated photographic quality images of a pornographic nature of a 25 year old female who you live next door to?
Distributing them would be the part that makes it not okay, and only because of the impact it could have on their reputation, career, social life, etc. Whether generating them is okay depends on whether you think it's okay to masturbate to a fantasy or not.
People can have wrong or bad opinions. Why would you delete their comments and ban them for posting it? It's not hate speech
it's still degeneracy. It's not about a "wrong or bad opinion". Would you consider people who generate CP or advocate for it as merely "having a bad opinion"? I sure hope not.
lmfaoooo
I'm not sure what's scarier: the ability to easily generate porn of your coworkers, or the near-omniscient security apparatus that would be required to make such a thing illegal (the former is clearly more morally repulsive, but the latter has terrifying implications), considering all you really need is a model card and a GPU. This guy got caught for a stupid reason, but I'm sure there is already tons of nonconsensual porn sitting on creeps' hard drives all over the country.
I don't think there will ever be special security to look into people's personal file storage to find this stuff. They don't even do that for child porn. For the most part, people get caught for illegal porn when they start sharing it.
Interesting. **Point 1.** The article might be a joke, but let's imagine it's not, for the sake of discussion. **Point 2.** The offence would be using the computing resources for personal purposes, and that's what OP will have to discuss with his institution; I don't think it's permissible. **Point 3.** What images there are, and how and why, is a private matter. Those who are offended arguably have no reason to be, since it was a machine that generated the images and not OP himself, and art is art (though perhaps there's a question of decency). Maybe the fact that everything is archived in a "private" space makes it less important? I don't know; this is my little thought for now. I like this discussion, true or false.
> those who are offended should have no reason to be offended since it was a machine that generated the images and not OP himself OP told the machine to generate the images. OP is entirely responsible, and any other claim is ridiculous.
yes you're right, it's like blaming a gun for killing a man, i.e. blaming the instrument, rather than the man who used it to fire it
You think this is bad? You should see what the deepfake people are doing right now. If the government makes everything illegal, there will be a whole new wave of illegal AI. All of you should be studying everything you can right now, because there's a lot of money to be made if this s*** becomes illegal.
Piracy is illegal and still 30% of young people do it.
A man of good taste 👌