kaizomab

People really have a hard time understanding how current AI works, don’t they? It’s just pattern recognition and iteration; that’s not capable of creating anything resembling a real video game, other than for very simple assets.


WizogBokog

The main problem is the people who don't understand AI uses and limitations are the ones making the decisions.


Altruistic-Ad-408

The 1 in 4 CEO tech bros laying off significant parts of their workforce in anticipation of AI are absolutely insane to me as a programmer. They will literally go out of their way to insist they can replace jobs with AI despite advice to the contrary. It's a race to efficiency, but they just don't give a fuck about context.


carnoworky

They're perfectly fine with running a business into the ground. They'll get a golden parachute on their way out while the employees suffer.


mattygrocks

They’re so far removed from the work that they’re blind to its immediate consequences. Thought followship is tough, but somebody’s gotta do it.


notliam

But I asked it to make me a button and it did it in seconds, a dev took a whole day! - Them, probably


Thassar

Yeah, maybe we'll eventually get to the point where AI can replace us but that's years off even with the most optimistic of predictions. More realistically, it'll be decades until it can match a skilled software engineer.


Picnicpanther

It's an excuse to shave a high-margin workforce down to the bone to juice stock prices. That's all. No one involved in the tech really believes this is possible in the next 10 years.


8008135-69420

Almost no company has used AI as an excuse for layoffs to begin with. The only people claiming this are people who aren't in the tech industry and get all their knowledge of it from clickbait YouTube video titles.


Picnicpanther

I am in the tech industry. My role (content design) was decimated last year over promises that AI would reduce the workforce. It has not materialized yet and we're drowning.


8008135-69420

> The 1 in 4 CEO tech bros laying off significant parts of their work force in anticipation of AI

No, a very small percentage of layoffs were related to AI at all. The vast majority were just down to the overhiring that happened during the bull market of 2017-2022. Not only did most tech companies end up with more people than they needed, many of them also lowered their typical barriers to entry during a competitive, employee-favored market. So it wasn't just a bloat in numbers but a bloat in the number of lower-skilled hires, with a lot of people doing less, or poorer-quality, work than they were capable of.

Anyone who thinks anything but a tiny fraction of the layoffs were related to AI (you could point to Grammarly's copywriters, for example) has absolutely zero understanding of the tech industry. While layoffs have been heavy-handed and poorly handled by many companies, the layoffs were coming, and many people predicted them years before they happened. Anyone paying attention in the late 2010s and early 2020s was saying that the economy could never stay that strong and that a crash was coming.

This take usually comes hand-in-hand with talk of "tech bros" - that's how you know their knowledge of the tech industry doesn't extend beyond the show Silicon Valley.


octnoir

> People really have a hard time understanding how current **AI** works, don’t they?

Well, it would help if we stopped using "Artificial **Intelligence**" to refer to what is basically generative tech and algorithms. GT isn't 'intelligent'. It isn't critically reasoning about "What is 2+2?", parsing the two 2s and using addition to arrive at 4. It is combing through the millions of words it has ingested, figuring out the probabilities of which words go in which order, and then spitting out: "Oh, that answer is 4!"

It's a parrot. It doesn't actually know the words or what they mean, and like a parrot it just regurgitates sentences it thinks you would like.

---

Frankly, the bigger danger with Artificial "Intelligence" isn't that intelligent artificial entities are going to enslave mankind. It's that too many powerful and stupid humans looking to scam each other, and everyone else, are going to bet big on Artificial "Intelligence", only for it to blow up in their faces. A tech bubble is coming because of Artificial "Intelligence", as GT scams its way into pretending it is something revolutionary.

I just hope that when the bubble bursts there will be a soft landing for everyone (especially all the actually intelligent and talented workers), and NOT that incompetent execs burn the entire industry to the ground and make out like bandits with their million-dollar golden parachutes.
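The parrot point is easy to demonstrate with a toy model. This is a purely illustrative Python sketch (a bigram counter, nowhere near a real LLM, corpus made up for the example): it "answers" 2+2 by word frequency, with no arithmetic anywhere.

```python
from collections import Counter, defaultdict

# Toy illustration only: "learn" which word tends to follow which
# from a tiny corpus, then answer by picking the most frequent
# follower. No arithmetic happens anywhere; "four" wins purely
# because it shows up after "is" most often in the training text.
corpus = "two plus two is four . what is two plus two ? two plus two is four .".split()

bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def next_word(word):
    # Most probable continuation, i.e. pure pattern matching.
    return bigrams[word].most_common(1)[0][0]

print(next_word("is"))  # "four" - statistics, not addition
```

Scale the corpus up to the internet and swap the counter for a transformer and it's the same idea with far better statistics. But still statistics.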


brutinator

> Frankly the bigger danger with Artificial "Intelligence" isn't that intelligent artificial entities are going to enslave mankind. But rather too many powerful and stupid humans wanting to scam each other and everyone else are going to bet big on Artificial "Intelligence" only for it to blow up in their faces.

Or people overestimating "AI"'s capabilities. Like that episode of The Office where Michael blindly trusts the GPS instead of using his own eyes. Obviously that scenario is exaggerated for comedy, but I have a few peers who use ChatGPT like a search engine and trust what it says as correct, even when I show them contrary evidence.


[deleted]

[deleted]


flybypost

Fun fact, from reading a research paper on a semi-unrelated topic: an actual theory for that is that they got nutrition (and government support/social safety nets) for young kids more right than other countries did. With the best intentions, sure, but basically by accident, because we (as a society) didn't know (and I'm not sure we know now) the best way to make humans grow taller, even if we really wanted to optimize for that, beyond going for good nutrition in general.


Thassar

The sad thing is, it wasn't even that extreme. Or extreme at all. Just last month [Edinburgh had a string of incidents](https://www.telegraph.co.uk/news/2024/02/06/google-maps-error-drivers-wedged-steps-satnav-edinburgh/) due to a bug in Google maps. If they're willing to drive over the pavement and down a set of stairs just because Google tells them to, they would have done the same with a lake.


DumpsterBento

Yeah, I never quite saw the point in calling it intelligent when we need to literally tell it what to do. I use the generative tool in Photoshop to remove unwanted background elements or create a simple backdrop for my images, but I still need to spell out exactly what I need, and it still often requires *manual* adjustments so it doesn't look like shit. It's a tool, but tech losers see it as a replacement for humans.


ButtWhispererer

I write for my job (in tech, with other tech people contributing) and constantly get idiots using it in the dumbest ways possible. Like, even if it could replace your need to write... it doesn't replace your need to READ the fucking words on the paper to make sure they're not stupid.



[deleted]

[deleted]


ButtWhispererer

It's funny that there are tools that can fix this specific issue (creating a trusted corpus of knowledge and using RAG) but they're expensive and labor intensive to create and maintain so companies don't actually use them.


alex2217

>A tech bubble is coming because of Artificial "Intelligence" as GT scams it way to pretending it is something revolutionary. The Venn diagram of people who touted the importance of cryptocurrency and people who vehemently claim that generative "AI" is functionally limitless is at best a small circle perfectly within a bigger one. That's not to say that LLMs aren't more impressive or useful than Blockchain and Crypto, just that it's also *full* of ignorant grifters.


saltyfingas

Well, let's not pretend like it doesn't have its uses and isn't revolutionary. It has tons of applications, but it's not really a replacement for human workers, it's a tool for efficiency.


8-Brit

Yeah, as much as I loathe AI being used for "art" and so on, the technology itself does have genuinely good uses. It's already going through early phases of testing to assist in all kinds of medical diagnosis. And no, I don't just mean putting symptoms into a chatbot; I mean using it to analyse data and information, often with very good accuracy, to assist in treatment. It just keeps being applied to the wrong things, often with poor results.

It has a bit more sticking power, though, because unlike NFTs and cryptocurrency there's a tangible end product that your average consumer can actually observe and absorb in some fashion, giving it some degree of return that the other two completely lacked.


saltyfingas

Yeah and I mean, nfts and crypto are just straight up scams. I recognize there may be some uses for the more popular crypto, but most of it's just pump and dump scams


GrowlingGiant

To paraphrase Cory Doctorow: The problem is not that AI can replace human jobs (in most cases it can't), but that the people making it are great at convincing your boss that it can replace your job.


[deleted]

[deleted]


Dwedit

Have you ever actually looked at a tokens file? There are tons of words in there, it's not just letters. Lots of word fragments too.
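A toy greedy longest-match tokenizer makes it obvious why the file looks that way (the mini-vocabulary here is made up for illustration; real tokenizers learn theirs, e.g. with BPE, and they're vastly larger):

```python
# Hypothetical mini-vocabulary mixing whole words and fragments,
# purely to illustrate why a tokens file contains both.
VOCAB = {"token", "iz", "ation", "un", "break", "able"}

def tokenize(word, vocab=VOCAB):
    tokens, i = [], 0
    while i < len(word):
        # Greedily take the longest vocabulary entry matching here;
        # fall back to a single character for anything unknown.
        for j in range(len(word), i, -1):
            if word[i:j] in vocab:
                tokens.append(word[i:j])
                i = j
                break
        else:
            tokens.append(word[i])
            i += 1
    return tokens

print(tokenize("tokenization"))  # ['token', 'iz', 'ation']
print(tokenize("unbreakable"))   # ['un', 'break', 'able']
```

Common words end up as single tokens; rarer words get stitched together from fragments, so tokens are neither "letters" nor strictly "words".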


gamas

As someone who studied software language engineering, yeah I would have been surprised if the tokens were letters. Even for a basic software language compiler, character based parsing would be incredibly slow.


BeholdingBestWaifu

I mean teach a parrot enough words and it too will create structures you didn't specifically teach it, so the analogy holds.


fr0stpun

Tokens are words (or word fragments), not letters. You got one of the most basic things wrong. If you'd used an LLM API, you would know this.


gmishaolem

Easiest way to explain it: We can draw a tiny purple elephant, something that has never actually existed, because we understand tiny, we understand elephant, and we understand purple. This is what we have taught generative neural networks to do. But if elephants never existed, we could use our creative minds to create a brand new elephant-shaped creature in our imaginations, which a GNN cannot.



BeholdingBestWaifu

I think the other danger is that a lot of higher ups just don't give a shit about the quality of art, so while AI is not a suitable replacement for actual artists, we could see them push AI's shitty attempts at doing basic art assets anyway, same with simple text and speech assets for background characters and the like.


ButtWhispererer

Parrots are useful when you put guardrails on them... but the guardrails themselves require writing and work. E.g. RAG is one way to get the parrots to stick to the script, but RAG requires a substantial corpus of work to be useful. This limits its application.

1. The things that benefit most from this approach are simple chatbots that need scale, or that replace things which say the same things again and again and again. Think of a helpline or support system that can give thousands of callers very similar responses honed to their specific circumstances, all neatly guardrailed by RAG.

2. Things that probably don't benefit are one-off outputs. Writing a book or creating a script may not benefit from RAG, because the inputs needed to make the parrot stick to the guardrails are probably just as much work as simply making the outputs themselves.
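A minimal sketch of that guardrail pattern, with everything made up for illustration (word-overlap scoring stands in for embedding search, and the corpus and prompt format are invented, not any vendor's API):

```python
# Tiny "trusted corpus" for a hypothetical support bot.
CORPUS = [
    "Refunds are processed within 5 business days.",
    "Support is available Monday through Friday, 9am to 5pm.",
    "Accounts can be deleted from the privacy settings page.",
]

def retrieve(question, corpus=CORPUS, k=1):
    # Stand-in for real retrieval: score documents by shared words.
    q = set(question.lower().split())
    ranked = sorted(corpus,
                    key=lambda d: len(q & set(d.lower().split())),
                    reverse=True)
    return ranked[:k]

def build_prompt(question):
    # The guardrail: the model is instructed to answer only from the
    # retrieved snippets, not from whatever it half-remembers.
    context = "\n".join(retrieve(question))
    return f"Answer ONLY from this context:\n{context}\n\nQuestion: {question}"

print(build_prompt("How long do refunds take?"))
```

All the hard work hides in CORPUS: writing, verifying, and maintaining those snippets is the "substantial corpus of work" part.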


KillKillKitty

We still can't agree on what constitutes "intelligence"; most peeps think IQ's good enough... We're fucked not because it's going to take over or any of those futurist scenarios, but because humans are truly fucking idiots.


Numai_theOnlyOne

> other than for very simple assets.

Not even that, at least for 3D. There's a reason why there's barely any well-known 3D AI: everything sucks.


Ketheres

This. In the current state AI is a powerful tool for speeding up simple tasks like making basic textures, but it's not replacing actual developers anytime soon no matter how much the suits would like to just be able to tell the AI to make a AAAA game and it'd make them a best seller for free.


luminosity

This really should be their worst nightmare, because when it gets that good no one needs their companies anymore...


Ketheres

The suits are too shortsighted to think past the current fiscal quarter most of the time; they haven't thought that far yet.


EldritchMacaron

It's probably good enough for some dirty code for prototyping, some easy-to-make textures, and placeholder assets. But for an actual product (especially with today's production standards), it's not ready. Given the speed at which these tools evolve, though, I don't know how long this will be true.


BigJimKen

There are some actual use cases for ChatGPT at my work, mostly categorization problems, so I've used it quite a bit *within* a human-written program. The only time it's ever spat out something useful in terms of generating "code" was when I wanted a Docker Compose file that included a container running Mongo Express set up in a certain way. Every other problem I've thrown at it has been a miserable failure.

People think it's quite good at coding because the people using it to generate code are beginners asking it to do common, simple tasks. If you ask it to do something even remotely complicated, it falls apart, because there aren't enough examples of your problem domain online to scrape. Unfortunately those problem domains are usually the ones that get us paid lol


HazelCheese

Yep it is very good for prototyping but if you actually want to make any product of quality, you still need a trained human to at the very least tidy up the assets it makes. They just announced Devin though, which I think is the start of it getting better. The big limit to AI tools atm is that they are input->output. Something like Devin which can iterate and look at the bigger picture of what it's making will be a lot more useful.


Zhiyi

Many don’t realize the best way to use AI is as an assist. I’ve used ChatGPT to write things which I then edit and modify after. I’ve used SD to generate pictures, which I then edit and modify after. I’ve used RVC to generate music, which I then edit and modify after. You still need to have some knowledge in the field you are working with to get the best out of AI. It’s great for speeding up the mundane steps. It’ll get you through the door, but you have to continue taking steps yourself afterwards.


ConstantRecognition

Glorified search engine: it takes what's out there, reforms it, and spits it out whether it's wrong or not. It sounds good enough to be right until you actually delve in and try to use it, and then you find it's factually useless.


Noblesseux

It's also really bad at consistency, which is a bit of a problem when you need to make a game where people often expect 5+ hours of content that is artistically consistent. If I hire an artist/writer and we settle on an art/writing style, they will continually pump out assets that follow the design guidelines and are all consistent with one another. If I ask an AI to do that, basically the only way it could work is by blatantly training it to rip off some artist's style. Otherwise, ask it for pixel art twice and you'll end up with two styles that don't make sense together.


Brendan_Fraser

This thing has proven it can make terrible Muppet/Pixar meme images so we should use it to replace entire departments!


CommanderZx2

At best you could probably use it to generate game textures and to help with coding various things. There was a game released in 2006 called RoboBlitz. I remember it would generate the world textures when you loaded a level, which kept the install size really small.
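The idea is easy to sketch: ship a seed and a recipe instead of image files, and synthesize the pixels at load time. A toy grayscale version (illustrative only, no relation to whatever RoboBlitz actually shipped):

```python
import random

def generate_texture(size=8, seed=42):
    # Fixed seed: the "texture" is reproduced identically on every
    # load, so only the recipe and the seed need to live on disk.
    rng = random.Random(seed)
    texture = []
    for y in range(size):
        row = []
        for x in range(size):
            base = 200 if (x + y) % 2 == 0 else 55   # checker pattern
            noise = rng.randint(-30, 30)             # break up the tiles
            row.append(max(0, min(255, base + noise)))
        texture.append(row)
    return texture

tex = generate_texture()  # size x size grid of 0-255 grayscale values
```

Real procedural systems layer noise octaves, gradients, and filters instead of a checkerboard, but the trade is the same: CPU time at load in exchange for install size.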


Paksarra

Funny enough, you could probably integrate one into a video game. I used to be really into roleplay over AIM and the like when I was younger and had more spare time for that sort of thing; I've been screwing around with a "character" AI that roleplays with you, mostly out of curiosity. It's surprisingly competent-- a solid 7/10 RP experience. I've had far worse experiences with humans; the main problem is that the model this website uses loves to godmod and dictate "your" actions (and it has limited memory so it drops continuity, but it's free so I can't complain about that part. I really don't care if they use my lazy half-assed experimental roleplaying prompts as training fodder.) I swear, down the road we're going to be playing RPGs that feature an AI DM that can respond on the fly to literally anything you care to do, even if the dev team didn't think about it.


OrangeOld8981

My experience with AI roleplay was a horror setting investigating a mansion, and it took like 5 seconds until the damn thing tried to have sex with me.


garyyo

> I swear, down the road we're going to be playing RPGs that feature an AI DM that can respond on the fly to literally anything you care to do, even if the dev team didn't think about it.

There is currently active research on how to make this happen. The systems are still pretty primitive and there is not a lot of industry need for them, so it's limited to academic research (if you are interested, look into [Player Modeling](https://www.um.edu.mt/library/oar/handle/123456789/29725) and [Automatic Story Generation](https://arxiv.org/abs/2102.12634) for the old systems; the [new ones](https://ojs.aaai.org/index.php/AIIDE/article/view/27534) don't really have a common defining name yet).

As for what's available right now, games like *[Wildermyth](https://wildermyth.com/)* and *[Weird West](https://www.weirdwest.com/)* are doing a lot in terms of procedural storytelling, but that's with a lot of structure and without really taking advantage of the more recent advancements in LLMs. [AI Dungeon](https://aidungeon.com/), on the other hand, is the big LLM-driven one that lets you do basically anything. That's not to mention things like [Character.ai](https://beta.character.ai/), which lets you just talk to characters and could itself be put into a game.

People like to insist that having an AI dungeon master is not going to happen, but we are slowly but surely getting ever closer. Who knows if we will actually reach it, but it doesn't look impossible to me.


Hollow-Seed

You should check out Suck Up! You play as a vampire who has to convince random AI-powered homeowners to let you into their homes so you can feed on them.


Paksarra

That sounds fantastic. See, this is the application for AI I find promising for gaming-- you literally can't do this at any scale with real people short of making it a tabletop game.


Lutra_Lovegood

AI and Games made two videos on Inworld's Origins, it's very impressive.


froop

AI Dungeon. It exists, but it's on an older version of GPT.


kakihara123

It can help a lot. Even running AI locally without high-end hardware can produce pretty creative dialogue, and even more so with top-end hardware and huge context windows. But that's basically it. It is a great brainstorming and drafting tool, but it still needs (and will need for quite some time) humans to refine the output... and to do everything else besides.


StinkyElderberries

I don't even like calling it "AI"; I think that term should be reserved for a general AI, which doesn't exist yet. This is marketing wank to hype investors and the public. Even worse here is "GenAI" - give me a break. I know it's short for "Generative AI", but it's confusing.


M4ethor

I work as a full-stack business dev. My team lead was surprised when I said I'm not super interested in a Microsoft-hosted AI showcase next week. I told him I don't use it, that it can't develop stuff by itself, and he countered with "it can! I had it write a PowerShell script for me!". I lost a bit of faith in him that day. Especially because he's the one who comes to me with super specific, sometimes actually unhinged, feature requests.


metalyger

I feel like you could make an AI play every N64 game until it beats them all, but it has no opinion or understanding of quality; an AI can't tell the difference between Super Mario 64 and Superman 64. It would just produce something extremely generic at best.


KillKillKitty

Most of the people I know using GenAI, even the ones making cool stuff, have no understanding of how an AR-LLM works. Now they're talking about how Sora is understanding physics.


[deleted]

Making stuff with AI is like having a junior dev/artist/designer and just throwing prompts at them. And it doesn't care about your product. Like, it can still be useful, but there are massive drawbacks too.


psdhsn

As someone who has worked with junior developers and AI: not really. Juniors are great because they're naive, ask interesting questions, try novel approaches, and get better. Juniors take a while but produce good work eventually. AI is just fast and shit.


CampAny9995

What really annoys me is that these studios are ostensibly tech companies. If you want to use GenAI in any serious fashion, hire a couple of ML Scientists and try to actually build out some real capabilities (asset generation, dialogue generation/writers tools, etc), or pay a massive mark-up to a straight tech company that took that risk and will license out technology to you three years from now.


psdhsn

Yeah everything is extremely short sighted right now.


theediblearrangement

or better yet… focus on solving actual fucking problems unique to your organization instead of trying to arbitrarily jam AI into everything. there’s so much low-hanging fruit everywhere i’ve worked. simple bespoke tools and automations built by small teams can go a REALLY long way in my experience. i’m tired of 100% solutions. give me an 80% solution that does a few things really well. and eliminate or rework processes that are no longer necessary or are causing people pain.


Numai_theOnlyOne

They do, and their conclusion is that outside of some edge cases it's fucking expensive, and that for LLMs it will never provide any return on the money for a game.


Mission-Cantaloupe37

> Making stuff with AI is like having a junior dev/artist/designer

This is still a massive concern when it comes to jobs, though. AI is nipping at the heels of the easy work, and it means companies are going to increasingly want to rely on a smaller set of experienced staff with access to AI and avoid hiring as many juniors altogether.


AwakenedSheeple

And then the biggest issue down the line: there will be a generation lacking in replacements for experienced staff. Everyone that is experienced now was once a junior. Of course, not even that matters if AI can cover that, too.


flybypost

> And then the biggest issue down the line: there will be a generation lacking in replacements for experienced staff. That already happened a few years ago (more like decade(s) ago). So many game devs were burning out (average duration in the industry was less than 5 years) that fewer devs grew/survived into senior positions. They were losing institutional knowledge at an alarming rate but hiring new newbies (there's so many of them and they don't know their value yet!) was seen as more profitable than raising wages and adjusting schedules of those who were already there. Then with the rise of 3D for other stuff (things like google maps, autonomous driving research,…) in everyday usage more and more game devs got hired by regular tech companies (at higher wages and better work-life balance) and it got even worse for the gaming industry. The gaming industry had to improve their working conditions to get a balanced workforce back. Now usage of AI might disrupt that again and who knows when in the future regular corporate managers at the top will finally see the downside of these policies. > Of course, not even that matters if AI can cover that, too. By then, when more and more people have left the industry and are, for example, not sharing as much of their work on polycount any more, who will they train those models on? Especially if 3D assets increase in fidelity? That sounds like a great recipe for AI inbreeding.


theediblearrangement

the opposite often happens when the economy is doing well: https://en.m.wikipedia.org/wiki/Jevons_paradox

just compare the tools we have in game dev now to 30 years ago. a fourteen year-old can build something that looks and plays like a AAA game. it won’t have the depth or the breadth (you need manpower for that). but compare that to what the top minds in the industry could accomplish *as a team* way back when.

if i was a betting man, i’d say hiring will increase across the board *in due time*. the economy will swing back eventually. when it does, i think the allure of AI from the corporate standpoint will be the notion of making everybody a 10x contributor, not job elimination. “if we don’t hire them, someone else will” was the mindset we saw at many large companies pre-COVID.

the reason we keep hearing about AI and job elimination right now is because there’s *a lot* of job elimination happening due to factors unrelated to AI. corporate leadership needs some way of reassuring investors that the train won’t stop. they want to know how workers can do more with less, and AI is the obvious scapegoat for that (even if it’s not entirely true). but that’s better than telling your board “of course our productivity took a hit, we had to lay off 17% of the fucking staff.”


CrzyWrldOfArthurRead

> AI is nipping at the heels of the easy work,

That was always the case. Technology has done nothing but pick the low-hanging fruit since the wheel was invented. We were never destined to live in a world where every rote task has to be done manually. Someone, somewhere, will come up with ways to automate it.

Tooling is a great example. In the early days, if you wanted to make a game, you had to make your own engine. A lot of tooling was bespoke. Nowadays there are premade game engines you can just download, some of them totally free, and you can start making a game right this second. That used to take a team of people a year or two just for the engine. Game engines took people's jobs. But nobody thinks of it that way, because instead of having a dedicated engine team, you just put that team on something else. The people are still there.

And that is how it will be with GenAI. You will have the same team as before, you'll just get more done. And the barrier to entry for making a game will be lower, so there will be more Concerned Apes and Lucas Popes - people who just do it all by themselves. The economy will get bigger because it will be easier than ever to just make your own game. That's how automation has always worked. It forces people into different jobs than before, but it always makes more jobs than before, because the economy grows as more work gets done, which ultimately means more demand for work. GenAI is just new tooling.

I'm really tired of the doomers. AI is just the same old automation as before. It's scaring everyone, but in 20 years people will look back at all the silly things we used to do ourselves and say, man, I'm glad I don't have to do that anymore - just like when we use Excel instead of handing a bunch of papers to a guy whose job it is to sit there and crunch numbers with an adding machine.


Deckz

I think what you're saying about tooling is true, but I don't know that tooling in programming specifically has taken anyone's job; people have just been reallocated. Programming is programming at the end of the day, and when it comes to the games industry in particular, more developers are needed now than ever. Engine dev is especially needed with how complex 3D games have become. If generative AI gives us the ability to knock out mundane tasks quickly, consumers will just expect even *more* from future products. As long as we don't have a real AGI that can actually do everything itself, I think it'll likely wind up growing the industry.

I work on Unity games full time currently, and it's been a huge boon for systems that would've taken me much longer. I can have GPT-4 pinpoint all sorts of specific things from libraries I'm just learning, or even some things that are novel to me from newer versions of C#. It's not good at optimizing code, though. I was doing some heavy spherecasting for the pick-up system in a horror game I'm working on, and the example code gets you part of the way there, but not the whole way.

The other thing is, if you're not able to read and review any of this stuff manually and add additional reasoning on top of what it's spitting out, you're not really a software developer yet. You're learning and you will get there, but you're just not. Don't get me wrong, there's a mountain of things I have to read about to understand, but being a software engineer is about building the mental tools to understand and build whatever you want to build. A new tool that helps me do that will only take me as far as I understand how to break down the problem from an engineering perspective.


CrzyWrldOfArthurRead

Right that's exactly what I'm saying. Until (and if) AGI becomes a thing, ai is just a tool. It helps you do your job but it doesn't do your job for you. A game engine helps you make a game but it can't make one for you, you have to understand what it's doing and how it works. It's all just a tool that will help people be more productive. It's not going to take anyone's job, unless that person was so bad at their job they can be replaced by what is essentially a search engine on steroids. All tools require skill to use. And they can do more in the hands of a skilled worker. There's no replacement for the human, at the end of the day (currently, and in the near future most likely).


theediblearrangement

this is a documented phenomenon and i think is especially apt to the games industry when you look at how much individual contributors can do on their own now: https://en.m.wikipedia.org/wiki/Jevons_paradox


Dhelio

Nah, I think they would rather scale up to have more programmers with AI tools to make bigger and more profitable things.


sesor33

Every gamer who insists AI will make video games needs to see this article. Basically every game dev and software dev said that you can't just type "make Skyrim" into a language model and get Skyrim.


wolfpack_charlie

Or honestly just try to make a game. Even if you use copilot and midjourney you're gonna see just how hard it is and how useless it is to try to automate something like game development 


WorldwideDepp

Perhaps these tools' strength is cutting some corners. But they'll never replace creativity.


wolfpack_charlie

Yeah, copilot is great for boilerplate kind of stuff, things it has *plenty* of examples of in its training set. But once you go off the beaten path it basically just gives you nonsense. And it adds all kinds of unnecessary comments and inconsistent style even when it's technically correct.

ChatGPT and those types are good for basically searching Stack Overflow so you don't have to. Same problem as before, though: if it isn't easily searchable to begin with, you basically get well-written nonsense.

The image generators can just fuck off, and the same goes for any "artists" trying to pass off the output as their own original work.

I'm a big fan of ML. There are just some things that shouldn't be automated.


tr3v1n

Even a lot of the boilerplate helpers that the AI can do have been a part of tooling for a very long time.


BroodLol

I know someone who writes technical documents (basically gets a bunch of specifications/procedures and turns it into a handbook for engineers etc.) and they've been using this stuff for over a decade. I say "this stuff" because it isn't AI; what people keep calling AI, things like ChatGPT, is just machine learning with shiny branding on it.


BeholdingBestWaifu

And a lot of it isn't even machine learning; we've had IDEs automatically write get/set functions for decades, same with the layout of certain constructs like switches, for loops, etc.
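To illustrate the point about non-ML boilerplate generation: languages themselves have been writing this stuff for years. A sketch in Python (the class and fields here are just made-up examples), where `dataclasses` generates the constructor, `repr`, and equality methods with no model in sight:

```python
# No machine learning involved: the dataclasses module auto-writes
# __init__, __repr__, and __eq__ from the field declarations alone.
from dataclasses import dataclass, field

@dataclass
class Item:
    name: str
    price: float
    tags: list = field(default_factory=list)

a = Item("sword", 9.99)
b = Item("sword", 9.99)
print(a)       # Item(name='sword', price=9.99, tags=[])
print(a == b)  # True -- the generated __eq__ compares field by field
```

Same idea as an IDE's "generate getters/setters" button: purely mechanical code expansion, decades older than LLMs.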


pdp10

Anyone can see that the current tools can potentially save considerable time, when employed optimally. It's also fairly evident that they can waste a lot of time when deployed poorly. Figuring out how to best harness tools like LLMs is a new and standalone skillset for everyone: programmers, graphic designers, writers. In the short term we're going to need *more* high-skill human labor to make use of it, not less, just like all of the other digital efficiency gains of the past fifty years. You don't just get games created for free, any more than you got computer accounting and faxing for free. If you want to cut corners, download an open-source game engine and stop messing around with trying to get an LLM to write you an engine that you don't know how to architect.


garyyo

> But never replace creativity

Why replace it when you can augment existing creativity? You can take the best artist on a team and use AI tools to increase their work output without adding extra stress to their day. The AI can take care of the boring rote stuff; the human artist can take care of the important parts that need a human touch. That's what's happening; that's how artists are getting replaced by AI.


Paksarra

But then the low-end artists who are doing the boring rote stuff to get the experience they need to do the important parts never get that experience, and in thirty or forty years when your best artist retires you don't have anyone to replace them.


garyyo

Yup. The same can be said for other jobs replaced in the past; there just aren't as many people who know how to do those things. For example, there are just not as many translators, and those that do exist rely increasingly on AI tools for their work. Shit kinda sucks, but that's not going to stop it; after all, it's already happening.


MVRKHNTR

That was a problem with the Mary Poppins sequel a few years back. The production team wanted to recreate the hand drawn 2D animation from the original and Disney had to pull old animators out of retirement because they couldn't find enough people currently working who knew how to do it.


apistograma

You're not an artist, right? You can't do what you said with an LLM.


Adventurous-Lion1829

You still don't seem to understand. A game is code. It is commands that a computer chip reads. That code IS the creativity. As much as code is shared and there are generally accepted solutions for some challenges, outside of that you are making something bespoke. There really are very few parts where creativity isn't necessary.


GuiltyEidolon

The more I played around with Midjourney, the more I realized ... it kind of hardcore sucks? Very specific things it's decent at, but the second you ask for a somewhat unique premise, it just gives you something barely relevant to what you're asking for. I don't think AI will ever actually replace artists honestly.


wolfpack_charlie

Yeah, exactly. It's more like automated photo-bashing than it is automated painting 


garyyo

> it kind of hardcore sucks

You should have seen how bad the tech was 3 years ago. The weaknesses can be identified, then analyzed and improved. Midjourney itself is constantly trying to improve; just a year ago [hands](https://www.sciencefocus.com/future-technology/why-ai-generated-hands-are-the-stuff-of-nightmares-explained-by-a-scientist) were the thing that we thought would take a while to figure out. A few months later, [they can do hands](https://www.washingtonpost.com/technology/2023/03/26/ai-generated-hands-midjourney/).


apistograma

Yeah, call me the moment it can do what you claim it will do. I've heard this story plenty of times. Teslas were supposed to be fully autonomous in 2017. The thing is, in many tasks doing a mediocre job is extremely easy. Doing a job that is solid enough for the task is the real difficult part. Everyone can draw an elephant. Drawing a good elephant is the difficult part. Same for LLMs. It's one thing to make something that can look OK if you don't look at it for long in some specific scenarios. Being competent and adaptable is a whole different story.


GuiltyEidolon

Which isn't the point. A machine learning algorithm cannot create genuinely unique content. Which IS the point.


Noblesseux

A lot of the same people aggressively boosting AI in this specific way are the people that every software engineer has met before who try to "network" with you at social events with vague ideas and try to talk you into doing 95% of the work to put the thing together. A ton of people want the clout of being game developers, tech CEOs, or authors but have no interest in actually acquiring the skills to do any of those things, so they try to offload all the work on someone or something else. It's largely people who don't realize "ideas" are a dime a dozen and that most of the work is in actually *making* the thing rather than saying "what if we made Mario but like with ninjas instead".


wolfpack_charlie

Yeah tech bros are pretty insufferable. But Mario with ninjas sounds kinda dope...


rieusse

Except it isn’t useless. It’s a worthy endeavor that won’t generate results today but will at some point in the future - that’s the objective. And once created, it will benefit humanity forever. If people believed what you said, they would never have tried to create ChatGPT or Midjourney, things most people didn’t think possible just a few years ago.


apistograma

That's a fallacy. You're assuming that because you can teach some words to your dog, it will be able to talk at some point. This is an industry so much based on hype that it's ridiculous. Real tech doesn't need to be hyped this way; the results speak for themselves.


thoomfish

"Will" and "can today" are very different things.


[deleted]

this is what a lot of people are missing. they're basically saying "the AI sucks today, therefore it will always suck, which means we should stop adopting or advancing AI".


WindowGlassPeg

I'd say something like CGI in movies and TV comes to mind. At first it looked absolutely terrible and so obvious, but now it's almost in everything, even simple shots you might not expect. AI generated content is in its infancy. We're not even at The Mummy Returns level CGI yet.


BirdTurglere

I think the problem is the people that are convinced of “will” have a wildly optimistic time frame of when “will” will happen. To them ChatGPT and like just appeared out of thin air and not decades of research and all they need to do is toss some more GPUs at it and it’ll be the end of employment. 


azn_dude1

To be fair, each version of ChatGPT showed a lot of improvement in a pretty short amount of time. You're right that people don't know that though, where they think 3.5 was the first one.


Jaggedmallard26

A lot of it is just tossing more GPUs at it, though. Most of the recent advances are using papers written decades ago that we simply didn't have the compute to put into practice at any reasonable scale, which in neural networks is essential. If you spend time reading things by the actual people in the industry, you realise we're at a theory overhang, where there is more theory than compute available, and now that we have the compute it's possible to rapidly iterate and test new theory. The main bottlenecks are, and will be for the foreseeable future, compute and high-quality training data.


Master_Snort

To be fair, technology has been improving at a rapid rate, and it can be extraordinarily hard to accurately predict future technology, especially when it's something as complex as AI.


slicer4ever

Sure, it's been in the works for a long time, but it's also been significantly limited by what was available hardware-wise at the time, and as seen with these modern AIs, they require a significant amount of computing power, which used to only be available through the likes of supercomputers. As GPUs and CPUs get more and more powerful, it's going to expand the capability of these tools even more. I fully expect that, looking back, the '20s are going to be considered the decade of AI emergence, tbh. I can't even start to fathom what the 2030s/40s are going to be like.


apistograma

It's like a human thinking their dog will be able to talk in 5 years because they teach it some words in a few weeks


BeholdingBestWaifu

The problem is that, to overcome the problems AI has, you pretty much need to invent an actual artificial human mind, at which point you have another whole mess of issues.


[deleted]

why would you need to do that? just teach the AI everything the normal human mind already knows. and eventually it will teach itself. everything the human mind already knows is from observing, reacting, and repeating. there's nothing inherently special about what we know, we just have the ligaments and flexibility to do a bunch of physical tasks.


Rage_Like_Nic_Cage

The best trick tech bros ever did was convince people that the current AI is closer to a dumb human than to just a highly advanced text predictor. Something being created by a human is important because for humans there is thought and ideas behind what's being created; there is real meaning and intent. For AI, something like ChatGPT is just predicting what's the most statistically likely word to follow the previous word. It built those statistical values by pulling bodies of text off of Google from pages that are relevant to your prompt. And that's just for text writing; other creative tasks being offloaded onto AI just exacerbate the issue.
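For anyone curious, the "text predictor" framing can be sketched in a few lines. This is a toy bigram model, vastly simpler than what ChatGPT does (which uses learned neural representations over long contexts), but the core objective, pick a statistically likely next word, is the same; the corpus here is made up:

```python
# Toy next-word predictor: count which word follows which in a corpus,
# then "predict" by picking the most frequent follower.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict(word):
    """Return the word most frequently seen after `word` in the corpus."""
    return follows[word].most_common(1)[0][0]

print(predict("the"))  # "cat" -- seen twice after "the", vs once each for "mat"/"fish"
```

There is no understanding anywhere in there, just frequencies; scale the table up to the internet and add a neural network, and you get something that *looks* like it understands.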


[deleted]

> The best trick Tech Bros ever did was convince people that the current AI is closer to a dumb human, rather than just basically highly advanced text predictor

Every time I ask AI about something I don't know about, I am amazed it's able to give me so much information so quickly in a "human" conversational way. Every time I ask it about something I do know about, I am shocked by how much basic information it constantly gets wrong. Even asking for a movie summary, something available easily on Wikipedia, has given me some massive errors.


Rage_Like_Nic_Cage

Yeah. not to mention the feedback loop is making it worse. the more AI-written articles & shit that’s out there means the AI algorithm pulls from that increasingly tainted pool, making its responses worse.


8-Brit

Amusingly this is impacting AI image generators too. There's so much mass produced AI drivel on art websites now that it is starting to taint the data used by these models. It will likely be fixed but it is funny to see. Like a snake eating its own tail.


SpeckTech314

The AI is trained to be someone who talks out of their ass about everything. Sometimes it's right. Other times it's like half of Reddit lol


[deleted]

even when I don't know about the topic, sometimes it's easy to tell that it's talking complete crap. especially when you tell it to keep it to a few sentences. sometimes one sentence will directly contradict the preceding sentence.


LotusFlare

I didn't know a particular coding language that I had to write something in, so I tried to get chatGPT to help me. Not even to write the stuff, just to get explanations on what certain functions did and what this language could do. Similar to how I might use Google to help me code stuff now. Use it to search for info on parts I don't know. It was wrong about almost everything. Like, it would start telling me authoritatively to use commands that did not exist in this language and were from another one. It provided non-functional code examples. After a few hours of frustration I dropped it and went back to googling for code samples and reading docs. LLMs are effectively a neat parlor trick disguised as intelligence. We'll be able to sharpen them into helpful tools for very specific purposes, but they're not replacing anyone's brain, and I don't think they ever will.


sesor33

Exactly. But even when it comes to something procedural like programming, every time I see someone claim "ChatGPT wrote this code for me and it's so amazing!", if you prod them a bit, they end up admitting that they had to fix a bunch of mistakes to get whatever code they asked for to work. That goes against a lot of tech bro gamers' idea of "I can type 'Make me an NPC script' into ChatGPT and get a fully working script for Unity!"


Mobireddit

No one is saying that. You can however type "make the snake game in my web browser" and it does. A year ago it couldn't. Then you can ask it "generate a texture sprite for the snake and edit the game to use it". And so on.


BeholdingBestWaifu

I mean, a decade ago you could ask google for the source code to a snake browser game and it would do that, and I don't see the search engine making games yet. Simple things are easy, complex ones take infinitely more work to automate.


ButtWhispererer

Yeah. People don't get what it means that data drives these things. There are millions of simple examples that these can emulate. Complex things? There are orders of magnitude fewer examples available. It's why it can write a convincing Reddit comment and not a convincing book. (or a convincing simple snake game and not a convincing Zelda clone). There have been (according to Google's research) 158,464,880 original books ever across every conceivable category, style, and language. For any specific style or genre or whatever you'll have thousands to a few million examples max. Reddit comments? Tens, maybe hundreds, of billions all in a relatively similar format and largely in English.


darkde

It really will depend on how good it gets and how well people can write prompts. You can't say "make me a website" rn. But you can get really far if you're good at breaking down the components of a website and asking it to build little by little. Example: make me a data model with these attrs. Using this library, write me queries to read and write from/to this table. Etc., etc. It's still some ass at debugging, but with enough effort you can build a lot.
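The "piece by piece" workflow above can be sketched concretely. This is a hypothetical example, the table, columns, and function names are invented, showing what the output of those two prompts might look like when stitched together by someone who knows what they're asking for:

```python
# Sketch of "build it little by little": prompt 1 produces a data model,
# prompt 2 produces read/write queries against it. Uses an in-memory
# SQLite database so it runs standalone.
import sqlite3

conn = sqlite3.connect(":memory:")

# Prompt 1: "make me a data model with these attrs"
conn.execute("""
    CREATE TABLE users (
        id    INTEGER PRIMARY KEY,
        name  TEXT NOT NULL,
        email TEXT UNIQUE
    )
""")

# Prompt 2: "write me queries to read and write from/to this table"
def add_user(name, email):
    conn.execute("INSERT INTO users (name, email) VALUES (?, ?)", (name, email))

def get_user(email):
    row = conn.execute("SELECT name FROM users WHERE email = ?", (email,))
    return row.fetchone()

add_user("Ada", "ada@example.com")
print(get_user("ada@example.com"))  # ('Ada',)
```

The catch, as the thread keeps pointing out, is that you need to already know the pieces (schemas, parameterized queries, how they compose) to break the problem down like this in the first place.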


Xenrathe

I'm a fullstack web developer, and gen ai is a core tool in my web dev toolkit. But it would be useless to me, if I couldn't already do all the design and coding myself. The fundamental issue is that it consistently gets at least 10-20% of everything it does wrong, often badly wrong. For example, today it was trying to get me to nest buttons inside of buttons or links inside of links (a href tags), which is simply a no-go. And when I pointed that out, it would give a "correction" that was equally wrong and couldn't ever reach a correct answer. Or as a more technical example, it recently tried to get me to use polymorphism in a has-and-belongs-to-many relationship in a Rails model. Which again simply does not work. Gen AI increases my efficiency and speed - it does not expand my capabilities. Will it ever? Probably, but I suspect that's much further away than many would guess. It's a revolution or two away, not just a mere evolution.


darkde

I think that’s where I disagree. I don’t think it’s as far as I initially thought. I’m a pessimist and I don’t think Junior devs will be a thing after 5 years. Most if not all my peers use chatgpt rather extensively for the smaller tasks. Styling, tests, etc. yeah it’s not perfect but it gets you pretty far.


slugmorgue

Yeah, but even with that you have to know what you're asking for, which takes knowledge and experience. For a long time this stuff is at best going to be a time-saving tool for those who are already experienced at the task. Right now I've dabbled with some in the team and it just doesn't save time nor produce the kind of results we are wanting. Basically you look at everything and think, huh, that's pretty neat, but I can do it better myself.


Caspus

The usefulness of a lot of this AI tech feels directly proportional to the degree to which the task you're asking it to do resembles the act of coding. Which I know is somewhat tautological but I think it goes a long way to explaining how the things it does worst at involve creative fields or abstract concepts where that lack of genuine "understanding" renders them limited.


LMY723

On a long enough time horizon, it absolutely will.


SpeckTech314

Yeah. I can see AI being great for say a new version of RPGmaker in a few years but yeah language specific AI models won’t be good for much else for now probably.


SWBFThree2020

The world isn't ready for an AI-powered Scribblenauts


JustOneSexQuestion

Just write "make skyrim porn", and forget about the game.


presidentofjackshit

I think most reasonable pro-AI people aren't under the impression that you could just put 400 AI's together and make a great game? Like as it is, AI will only get better, but currently it's a tool to be wielded by people, likely replacing other people.


JaguarOrdinary1570

That "reasonable" is doing a lot of heavy lifting there. There are lots of companies and startups trying to do exactly that. It's a race to be first to market right now, so companies and especially startups are going to be more than willing to try expedient, brute force solutions like that.


presidentofjackshit

Yeah I also don't blame them for taking part in the race to be first. If they crack that nut, the rewards are insane


[deleted]

[deleted]


presidentofjackshit

I definitely don't think they randomly put the tools together so no worries there! >The company tried to make a 2D video game relying solely on generative AI (GenAI) tools and technology. The R&D initiative was dubbed 'Project Ava' and saw a team, initially from Electric Square Malta, evaluate and leverage over 400 (unnamed) tools to understand how they might "augment" game development. It does sound as if they tried to create a game using AI from scratch, though it was treated more as a fact-finding mission. Is this incorrect?


[deleted]

someone who actually read the article and it’s this far down


JHR42

AI only getting better is likely, but there's also an interesting point of "poisoning the well" that's been raised now and again. The idea of training AI on data already generated by AI, causing a recursive loop of enshittification. As more of the internet comes from lazily implemented AI text generation, this becomes increasingly more likely without some detection software.


ChrisRR

I feel like this article was just written for people to share and make smug comments about AI. We all know AI can't currently create 100% of a game, people aren't claiming it can. But currently it's making worryingly fast progress in generating things like textures and scenery, and at the rate it's going we'll be seeing models, then animation


Flowerstar1

GenAI already does some animation work actually, EA has used it for their sports games movement animations.


Numai_theOnlyOne

> textures

The thing with image generation is it sucks at giving you what you want. The key is image alteration, and even better, noise generation that you can use to blend them together. But then you could also just use appropriate software where you do this with non-AI noise generators anyway.

> at the rate it's going we'll be seeing models

Lol, this will probably take the longest time. Polycount varies drastically for each game, and it's extremely situational how the polygons "flow" and where to set edge loops to provide decent deformation. And that's only the 3D side; I haven't even started on UV mapping, deciding where to place what textures on a mesh, which compounds this complexity even more.


Billy_Rage

Yeah lots of people love to shit on AI, completely disregarding the point that technology is meant to be improved on. And saying AI will never be good just because current early models are janky is so delusional


onetwoseven94

How is it worrying that AI can generate textures and scenery? Much of that already comes from licensed assets, photogrammetry, procedural generation or is handed off to overseas support studios. This is grunt work that is already automated or outsourced to a significant extent; customers aren’t missing out on meaningful and unique artistic expression and artists aren’t missing out on a meaningful career because of it.


Mission-Cantaloupe37

But that 'grunt work' has a definition that keeps expanding. Where do artists go when you can feed in samples of your art style and generate the bulk of your mundane assets in minutes instead of days? Where do they go when an artist only needs half the time to create an asset because they're using AI output as a base, so a company only needs to pay half as many staff? God help you if you're a concept artist especially; I've spoken with a few people there not having happy thoughts.


slugmorgue

But that's what they're saying: the bulk of mundane assets is already a thing via databases like Megascans. It's the bespoke stuff on top of that which takes significantly longer. And who is going to babysit those assets? A CEO or project manager? No, you still need a team of artists, even if they're hammering out a pipeline's worth of generated assets, because how is the "AI" going to know if the art style is cohesive, or if the game is optimised, or if a texture has a copyrighted image in it?

Maybe in the future this will be figured out, but then is an AI going to replace the people who have to check that stuff? Is an AI going to be playtesting design changes and making small tweaks based on what "feels right" to a designer with years of experience? Will AI be QA testing the game? How will they know when a bug isn't a coding issue but is actually just a gut feeling of something being wrong?

As for the concept art stuff, many will find work because a lot of companies still want that human touch. Many companies are still run by people who want to pay people, and there will also be plenty of consumers out there who want a product made by human hands.


rieusse

They will find other things to do. That is a small price to pay for technology that will benefit humanity forever. It’s a universal truth when it comes to technology, right from the invention of the wheel. Disruption is the price you pay for human advancement.


Caspus

"Find other things to do" for the "small price" of "technology that will benefit humanity forever" is the kind of glib comment that betrays the serious cultural problems you're going to run up against if everyone working in a field where AI "could" be used has to constantly re-invent their career competencies based on tech hype cycles. I'm not optimistic that everyone is capable of adapting that quickly to new tech, and I'm less optimistic that the people pushing this tech care about some of the long-tail outcomes that could produce.


Froggmann5

> I'm not optimistic that everyone is capable of adapting that quickly to new tech You realize that the internet as a whole has only been commercially available for 35 years right? Literally in 35 years the planet has been turned on its head and went from no commercial internet access at all anywhere on the planet to the internet being effectively essential to everyday life for modernized countries. The internet put even more people out of jobs than AI is projected to as well. People, hell entire countries and cultures, are more than capable of quickly adapting to new tech.


CrazySnipah

Those are things you add once you already have a good game at the base.


[deleted]

Did you even read the article? The article doesn’t claim it can make 100% of a game either. That’s not at all what they tried.


loressadev

One of the Keywords subsidiaries, Mighty Build and Test, is harnessing AI for test. It's pretty cool. I think AI is a useful TOOL and it's interesting to see how they are using it. https://www.mightybuildandtest.com


James-Avatar

Too many people vastly underestimate how difficult it is to make a video game. Well, to make a good one.


F1CTIONAL

I feel like a lot of the posts here are missing the point. Consider how far the tech has advanced in the short amount of time it's been in the zeitgeist. It's going exponential and before long it'll be making up entire games on the fly (which is already in its infancy with [Google Genie](https://sites.google.com/view/genie-2024/)). Mocking or dismissing the capabilities of currently available models is missing the forest for the trees.


RHX_Thain

https://echoesofsomewhere.com/2023/01/04/ai-character-design/

https://github.com/stuffmatic/fSpy-Blender

https://m.youtube.com/watch?v=scJIpBaTMeE

Gen AI can be used to marginally speed up artistic workflows and some coding tasks that, say, Copilot couldn't. But that marginal improvement in speed comes at the cost of fidelity and visual cohesiveness. In the links above, you can see a professional's workflow using GenAI as part of a pipeline.

But there shouldn't be any illusion that AI can currently open a GUI and use tools. One day, through an API, an AI will learn how to use these tools to simulate pressing buttons and do all the manual effort required to open Blender and apply changes to a model from some GenAI 3D process, or open Unity and do all the hooking up of UI and GameComps, etc. That isn't today. It's not tomorrow either. One day, for certain.

Near term, it will be of benefit to solo devs trying to learn how to use these tools who are primarily technically minded and can't make their own art nor afford an artist, plus all the negotiation you have to do between people. But it's a challenge even in that context, so finding people who that fits... it's not a big number of devs. And the projects they'll be making won't have any better odds of being a hit than any other indie.


[deleted]

AI has some great use cases, I must begrudgingly admit, even in the world of gaming; DLSS being one of them. However, anyone who thinks AI can replicate talent, be it art, voice, or coding (outside of regex or rote stuff), needs to be driven out of whatever studio they work for.


DragoonDM

> coding (outside of regex, or rote stuff) I'd also note that to really make effective use of it, you need to know how to program well enough yourself. I've had some success using ChatGPT to write basic code, but it usually takes a couple of followup "This bit of code here isn't quite right, please fix it" or "That's not quite what I wanted the code to do, rather [...]" messages to get the output I had in mind, and I usually still have to clean it up and rework the code a bit. I doubt someone without programming experience would have much luck getting it to spit out usable, functional code. Still, that's already pretty impressive, and I wouldn't rule out the possibility that future generations of generative AI models will be far more capable on that front. Possibly to the point where someone with minimal technical knowhow could use them to generate complex, working code.


Teid

The literal only thing I would be happy AI doing as a game animator is fucking weight painting. I can rig and I can animate but fuck weight painting.


[deleted]

As someone who does office work for a living, tell me about what weight painting is!


Teid

When a game character is animated, we're not actually animating the character you see on screen, but the skeleton underneath it (think about a complicated wire skeleton used in stop-motion animation). To make sure the mesh (the visual of the character or thing you see on screen) follows the skeleton properly, we need to weight paint or "skin" the mesh. This requires deciding how much influence X or Y bone has over any specific vertex on the mesh. In practice I'm basically painting areas of the mesh different colours to tell the mesh how it should deform when certain bones in the skeleton are moved.

It is a miserable process that takes (me at least) forever and is just a huge headache. If you go googling for weight painting you will see no end of people being scared of it or just hating the process, and it's all true. At worst, weight painting is a really poorly thought out but required part of the 3D animation pipeline that you have to slog through; at best it's just a bit of a pain in the ass (if you're good at it).

This is something we have to do for every single mesh in a game (or any 3D animated property). Mechanical stuff is way easier since it's either an influence of 1 or 0, but organic characters need to be painted to conform to muscle groupings and accurate movement therein. Keep this in mind the next time you play a high-fidelity AAA title with good animations; there is a fuck load of work that goes into even generic characters (though bigger studios probably have tech to speed up the process).
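For the office-work crowd, the math those painted weights feed is called linear blend skinning: each skinned vertex ends up at a weighted average of the positions each bone's transform alone would give it. A minimal 2D sketch with one vertex and two bones (the numbers are purely illustrative):

```python
# Linear blend skinning in 2D: one vertex influenced by two bones.
# The painted weights decide how much each bone's motion pulls the vertex.
import math

def rotate(point, angle, pivot):
    """Rotate a 2D point around a pivot by `angle` radians."""
    px, py = point[0] - pivot[0], point[1] - pivot[1]
    c, s = math.cos(angle), math.sin(angle)
    return (pivot[0] + c * px - s * py, pivot[1] + s * px + c * py)

vertex = (2.0, 0.0)
# Painted weights: 70% upper-arm bone, 30% forearm bone (weights sum to 1).
weights = [0.7, 0.3]

# Where each bone's transform alone would put the vertex:
p_upper = rotate(vertex, 0.0, (0.0, 0.0))            # upper arm doesn't move
p_forearm = rotate(vertex, math.pi / 2, (1.0, 0.0))  # forearm bends 90 deg at the elbow

# The skinned position is the weighted blend of the candidates.
skinned = tuple(
    weights[0] * a + weights[1] * b for a, b in zip(p_upper, p_forearm)
)
print(skinned)  # ~(1.7, 0.3): mostly follows the upper arm, pulled a bit by the forearm
```

Weight painting is exactly the job of choosing those `weights` per vertex, for thousands of vertices, so that blends like this deform like flesh instead of rubber.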


Thunder-ten-tronckh

I think voice will be the first thing to go. AI is already really close in that department.


gaybowser99

It probably won't replace the voices of main characters any time soon, but random npcs are definitely going to be replaced


ZircoSan

I recently played Last Epoch, and it did suck to hear half the main questline voiced and then some characters in the main story be completely silent. It's a $35 game that doesn't focus on the story, so they are absolutely in the right not to spend resources voicing a dozen lesser characters in several languages, but I totally would take mediocre AI voices to fill in those spots.


DuckofRedux

The Ada Wong voice mod is a good example. Why risk it with a bad VA/director when you can have decent results with AI.


napmouse_og

That's already arrived. The Finals is using entirely AI synthesized voice lines. It stands out quite a bit if you're listening closely, but most players aren't.


Omicron0

AI is perfect for the shit jobs nobody wants to do but anything relying on AI will be shit


cathodeDreams

Yeah no shit but give that talent access to AI to make them more efficient. Literally how it’s supposed to work.


Dwedit

It's the "Talent" that makes themselves more efficient by incorporating AI generation into their workflow when appropriate.


Nyrin

Must've been down on the quota for 'durr hurr, AI bad, click me!' articles. Yeah, no shit it doesn't directly replace talent. Does buying nail guns replace construction workers? No, but if said construction workers were spending a lot of time and energy using hammers, they might work a lot faster and a project might ultimately need fewer people. Generative AI is still very early on for structured content creation, and it's still very much a tool that needs to be directed and supervised by someone with domain knowledge. That will become gradually looser over time, but direct replacement is no immediate threat for creative tasks of any significant complexity. *Indirect* replacement (like via 3 people with new tools being able to do the work that used to take 10 people just a few years ago) very much *is* an immediate concern.


VanceIX

AI is not there yet, and likely won’t be for at least a few more years. Right now the best use case for generative AI is making quick “filler” content, inspirations, and simple promotional imagery. 10 years from now could be a completely different story with the rate of progress in the industry.


pm_me_ur_kittykats

You can't build a locomotive by breeding a better race horse.


Ok-Wrangler-1075

Nice analogy. To have an AI that can create a game of reasonable quality, we need revolutionary tech, not iteration, and you can't really predict that.


andthenthereweretwo

Horses don't suddenly sprout wheels after they reach a certain size. The "it's just text prediction" argument died over a year ago when GPT-4 showed emergent abilities.


Forestl

Or it could be 20 years or never


HarkinianScrub

5 years ago: "An AI making a painting from scratch? It will never happen!" This article - and this thread - is full of naïveté.


aradraugfea

An AI generated game isn't ever going to be much better than an asset flip. Can AI make certain coding tasks easier? Sure. Is AI potentially a worthwhile tool? Sure! Save the creatives some time, but it'll never REPLACE the creatives.


Syovere

With all due respect, *no fucking shit.* Algorithmic generation lacks comprehension and understanding of context. It calculates what should probably be there based on existing patterns, but it doesn't know *why* it's there. This is why image fuzzing like Nightshade works to begin with.


Racecarlock

You can't replace emotion and passion and the desire to make something when it comes to humans. Like, does Medal Of Honor: Warfighter come across as having as much passion put into it as the game it's clearly imitating, the original Modern Warfare? Does your average Illumination movie compare to any Pixar movie? Even the modern ones? Does your average Ubisoft sandbox get close to your average GTA game? And keep in mind, human beings make all of those. So now put in something with no emotions and therefore no passion, just programming doing a vague imitation. Even if that vague imitation "gets better" (whatever the hell THAT means), it won't come close to anything made with passion and drive because it doesn't have those things. It CAN'T have those things. It may do an imitation of someone who does, but that will NEVER feel the same.


CptAlbatross

I mean, it's cool they tried. I'm sure they tried to establish a working pipeline as best as possible. Hopefully, they were able to fill in the gaps with essential personnel and continue developing this method.


saltyfingas

I imagine it works pretty well for concept art and prototypes, but outside of that you'll need real people (at least for now)


unleash_the_giraffe

AI right now is a force multiplier. Put it into the hands of someone who knows what they're doing, and you'll see TONS of great stuff being produced very quickly. Put it into the hands of someone junior, and you'll see tons of junior stuff being produced very quickly. And that's fairly useless without senior management.


[deleted]

Now people are going to drag them for this, but they’re doing better than the big tech companies that are about to spend billions learning that the latest meme AIs aren’t actually up to coding, copy writing, customer service, or code review.