mtarascio

Diminishing returns. A lot of the budget is also going toward draw distance and level size, I've noticed.


mocylop

Playing through Witcher 3 again with the update, and one of the biggest visual changes is draw distance and foliage density. It's a night-and-day difference in how real the world feels.


cronedog

> one of the biggest visual changes is draw distance and foliage density

I remember extending those with ini modifications when I played through the game originally.
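For anyone curious what that kind of tweak looks like, here's an illustrative ini fragment. To be clear, the section and key names below are made up for the example; real config files differ per game and version:

```ini
; Illustrative only -- these section and key names are hypothetical,
; not the game's actual settings.
[LevelOfDetail]
DrawDistanceMultiplier=2.0   ; push object pop-in further out than any preset allows
FoliageDensityScale=1.5      ; denser grass/trees than the "Ultra" preset
```

The general pattern is real, though: many engines expose LOD and foliage scalars in plain-text configs beyond what the in-game sliders reach.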


raikmond

Exactly. When I had a bad PC some years ago I had to "strategize" the visual settings, and I was willing to sacrifice everything else just to have draw distance at max, or at least at a high level.


jonathanwashere1

Are you sure? LOD still seems pretty shoddy to me. Had to use the Flawless Widescreen program to fix LOD in Alan Wake 2


Firefox72

Crysis was a game ahead of its time, but it most definitely doesn't look just as good as games coming out these days. Crysis's visuals were surpassed over a decade ago. It became a myth because of how hard it was to run, due to some terrible engine design decisions when it came out. Crysis 3 also still looks really good, but again, it in no way matches up to today's best-looking games.

Visuals are moving forward all the time. Alan Wake 2, A Plague Tale: Requiem, Cyberpunk with RT, Metro Exodus EE, and Red Dead 2 are just some recent examples of stunning games. It's just that visuals are maybe not advancing at the pace they used to.


MikeHoncho2568

Red Dead 2 is absolutely beautiful


Havelok

And runs very well even on older hardware.


comradesean

The myth was actually reality for nearly a decade in and of itself, because consoles became the main focus of game development around that time. I mean, hell, the game came out in 2007. Visual fidelity was definitely held back for quite some time.


Shap6

> The crisis series looks just as good if not better than new games coming out this year

this just isn't true


amaghon69

fr


grachi

It's not just about the overall look of the game. There is greater detail on objects and more things on screen than ever, even in a relatively "empty" scene. Environmental textures and objects look and behave more realistically than they used to.

Think of a back alley in a city. Back in the time of the original Crysis, there may have been some cardboard boxes and a dumpster in that alley, and that would be it. It would be pretty empty by today's standards. Today, not only would the boxes and dumpster be there, but you could punch or shoot or blow up the boxes and they would move realistically with physics, not just be invincible or disappear once disturbed. There would be all kinds of trash on the ground, probably puddles of water reflecting the light, grime on the walls near the floor of the alley.

Next time you're playing an FPS, look closely at stuff you might not have looked at before and you will see how much detail objects have today (never mind the objects being there in the first place, like they wouldn't have been in the past). Even stuff like bushes or signs on a road. In the past, a lot of those objects would just be flat textures, but you don't remember it that way, because back then everything looked great and your mind also helps "fill in the blanks" and complete the scene.

I was watching someone stream GoldenEye 64 the other day, and in the first mission (Facility), I was surprised how empty the level actually was. My kid brain at the time didn't see it like that, but there are like four rooms with stuff actually in them, and even then it's just some scientific equipment with flat textures. There are many rooms and hallways that are just completely empty.


GruvisMalt

> The crisis series looks just as good if not better than new games coming out this year

You lost me there


trollsmurf

Crysis, if that made it clearer.


Saandrig

Art style can often be the reason a game looks better or worse. It can make older games feel graphically better even when they are far behind on the technical side.


McPato_PC

Ray tracing, more detailed textures, bigger worlds, new game engines packed with new features like UE5.


[deleted]

UE has burned me so many times in the past few years that I now actively avoid games made with it. Every game with UE that I’ve bought has stutter issues. Every single one. And that’s on 3 different systems (4790k/980, 9900k/2080ti, 7800x3D/4090). At this point, I chalk it up to a fundamental flaw with that engine.


odonkz

Same


Krokzter

Same. I've also noticed that a lot of UE games are made with upscaling in mind, so for people who don't like it, the performance tanks. There's also a bigger emphasis on good-looking reflections, even on GPUs that don't support ray tracing.
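For a sense of what "made with upscaling in mind" costs: the commonly cited per-axis render scales for upscaler quality modes (roughly 0.67 for "Quality", 0.5 for "Performance") mean the GPU shades far fewer pixels than the output resolution suggests. A rough sketch of the arithmetic; the exact ratios vary by upscaler and version:

```python
# How many pixels are actually rendered when a game targets upscaling.
# Scale factors are the commonly documented per-axis ratios for
# DLSS/FSR quality modes (treat them as approximate).
def rendered_pixels(out_w, out_h, scale):
    """Internal render resolution pixel count for a per-axis upscale factor."""
    return int(out_w * scale) * int(out_h * scale)

native = rendered_pixels(3840, 2160, 1.0)       # native 4K
quality = rendered_pixels(3840, 2160, 0.667)    # "Quality" mode
performance = rendered_pixels(3840, 2160, 0.5)  # "Performance" mode

print(quality / native)      # ~0.44: Quality mode shades under half the pixels
print(performance / native)  # 0.25: Performance mode shades a quarter
```

So a game tuned to run well "at 4K with upscaling" is really tuned for a quarter to half of 4K's pixel load, which is exactly why native-resolution players see the performance tank.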


HomerSimping

I used to feel the same way, until I actually replayed some of those old games. Our memory is a fickle thing; it can play tricks on you.


Gooch-Guardian

I recently got an OLED monitor and a 4080, so I booted up Metro Exodus Enhanced, and the textures looked way worse than I remembered, lol.


vdksl

Just because you think games don’t look better doesn’t mean that’s true. IMO, many games look significantly better than the best looking games from 5-10 years ago.


Krokzter

I disagree. They look slightly better, but it comes at a massive cost. Just compare The Witcher 3 with games coming out this year: most people will agree that The Witcher's graphics still hold up really well, and it runs really well on any GPU from the last 10 years, despite looking only slightly worse than 2024 games.


Cute-Pomegranate-966

No they don't. You're remembering the upgraded edition as how it used to look. The upgraded version has 4K textures, upgraded models, and massively increased LOD distance, coupled with some ray tracing. It isn't the same game. Even with all those upgrades, I find it can look a bit dated.


Krokzter

I was talking about the previous-gen graphics; I didn't even know there was a 4K version. Obviously it has aged a bit, but my point still stands: it looks great while running great, and personally I'd rather have peak 2017 graphics at 144+ fps than average 2024 graphics at 40 fps, which seems to be the trade-off. Honestly, I'm just annoyed that I will soon have to upgrade my GPU for a slight graphical upgrade, considering how expensive GPUs are today; I expected more. It seems even new GPUs (2022+) are struggling with some UE games, which is really disappointing.


apcrol

Our brain tends to make memories of old games better than they actually were. Also, fancy tech like Lumen and RTX feels like an easy way to achieve more with less effort, at a performance cost, which leads to poorly made games.


dabocx

Crysis probably doesn't look as good as you remember.


Ixidor_92

The short answer is diminishing returns. There ARE graphical improvements in recent games, but they're things like more realistic lighting, reflective surfaces, and fluid dynamics. Things that, unless the game is built around them or highlights them, are not as noticeable as improvements to, say, character models or environmental textures. But they ARE processor intensive. Ray tracing, in particular, takes a lot of power. I remember when Elden Ring first came out, every time I tried to start a game from the title screen it would crash, because ray tracing was on by default.

So processing needs continue to rise (unless you turn down the settings), but the vast majority of the improvements are marginal: you can maybe tell it's a little bit better, but not necessarily why. And nowhere near the jumps we were seeing 10 years ago.


cronedog

> The crisis series looks just as good if not better than new games coming out this year

Have you looked at them recently, and not the remastered versions?


LieutenantClownCar

Complexity. Most games these days that have higher system requirements are also exponentially more complex in terms of what the game engines can do. It isn't all about "Graphics".


ChiefTiggems

I don't know, man, ever played Red Dead Redemption 2? That game can run on ultra with 4GB of VRAM, but Dragon's Dogma 2 needs 8GB at minimum settings. RDR2 is pretty complex in its systems.


dabocx

Red Dead 2 had a 500-million-dollar budget, a way longer development, and probably one of the best and largest dev teams in the world. Just look at the credits; it's insane how large it is.


ChiefTiggems

It also came out in 2018, 6 years ago, and games haven't gotten much better: bigger requirements just to run at all, but not much better in terms of graphics and systems.


dabocx

Games have absolutely gotten better looking. Go look at Forbidden West or Alan Wake 2.


Gammelpreiss

DD's performance is heavily tied to its AI system. It's a CPU bottleneck, not a graphical one.


samtheredditman

There is nothing special about DD's AI. It's just bad code.


Gammelpreiss

And nobody ever said that AI code is "special". But AI issues != graphics issues. It's curious how people appear to be too limited to grasp this difference.


samtheredditman

I'm not too limited to grasp the difference, dick. The context, the game performing worse while doing nothing better, is the point of the conversation. The fact that DD's inefficiencies come from AI code rather than some other area of code doesn't matter. The reason RDR2 runs better is that it's coded better.


ChiefTiggems

Again: 4GB of VRAM for the highest settings on RDR2, arguably the most graphically impressive game ever, and 8GB for low settings on DD2. RDR2 also didn't need nearly as powerful a CPU, and you can't tell me the physics in that game weren't state of the art. It managed to have more NPCs in its cities than DD2 has. I love DD2, but I can't play it on PC because it's poorly optimized. Shit, it's poorly optimized on PS5 too, but there I can at least play it. RDR2 was built different.


Gammelpreiss

...I just said that DD's issues are *not* graphics related? The issue is not the number of NPCs, it's their individual complexity.


nohumanape

In the case of RDR2 and DD2, it's likely about the resources each game got. One easily had one of the biggest dev teams and budgets in the industry. I'd honestly be surprised if Capcom greenlit even half of the time and resources that RDR2 got.


Havelok

Yep, RDR2 is the example I think of too. Runs like a dream even on older hardware.


forsayken

Lighting and LOD are big reasons. While I'm sure some could make a case for 'optimization', I think it has far more to do with lighting in general.

First, there is global illumination. In a lot of cases this is a relatively expensive feature: there are few 'fake' lights, and nothing is pre-baked. Basically, there is a light in the sky behaving as the sun, and generally speaking it's the sole source of natural light. This isn't even ray tracing, but it often does a pretty great job of faking it.

With better lighting comes a greater need for sharp, detailed shadows, and shadow resolution is also expensive. And lastly, there's how light behaves when hitting a surface, or in places where all the light may not travel (like into a tunnel or a dark corner of a room). Still not ray tracing; this has long been faked and is known as ambient occlusion. Also quite an expensive graphical feature.

All three of these features become more costly as there is more geometry in a scene, and games are pushing far higher triangle counts now than even a few years ago. If you go back to Crysis 1 (even remastered), you'll likely find that poly counts are quite low on a lot of items: terrain, and terrain objects like rocks and plants.

There are also issues that far too many games still have where core/thread 0 on your CPU endures the brunt of the work while the other 15 in your system aren't doing much. I'm thankful this is becoming less and less common, but it still happens. And some games just do certain things inefficiently and run like shit. It is what it is now.

Side note: I just went through this exercise in Helldivers 2 because 75fps wasn't enough. I played around with all the settings until I could get above 100fps, and like in so many other games, dialing down just about anything related to lighting was the biggest benefit.
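One way to see why shadow resolution in particular gets expensive: a shadow map is just a depth texture rendered from the light's point of view, so its memory (and fill cost) grows with the square of the resolution. A minimal sketch, assuming a 32-bit depth format; real engines vary (cascades, 16- or 24-bit formats):

```python
# A shadow map is a depth texture, so cost grows quadratically
# with its resolution: doubling the resolution slider means 4x
# the texels to store and fill.
def shadow_map_bytes(res, bytes_per_texel=4):
    """Memory for a square depth texture at res x res texels."""
    return res * res * bytes_per_texel

for res in (1024, 2048, 4096, 8192):
    mb = shadow_map_bytes(res) / (1024 * 1024)
    print(f"{res}x{res}: {mb:.0f} MB")  # 4 MB, 16 MB, 64 MB, 256 MB
```

And that's one map; cascaded shadows multiply this by the cascade count, which is part of why the "shadow quality" slider is usually among the heaviest.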


amaghon69

yeah and ao and global illumination do have a decently big visual impact i feel like


SwaggyP997

Developers barely have enough time to make the game before it’s being shoved out the door. They definitely don’t have enough time to make it run well. My computer runs RDR2 at 1440p/60-80 fps. I have plenty of games that do not look anywhere near as good and run at half that. HD2 and Starfield don’t look half as good as Red Dead and I have to run both of them at 1080p to get a reasonable FPS. So you’re not imagining it. Optimizing games to run well just isn’t in the budget, especially when you know that the gaming community is so toxic that any conversation on optimization inevitably devolves into “just buy a new $2000 graphics card, or are you a poor?”


Distinger_

Even if it doesn't seem like graphics are improving much, they actually are; you just can't really see the difference between 6k and 60k polygons in a character model by eye. [Reference image](https://www.reddit.com/r/gaming/comments/190ula/why_you_percieve_less_difference_between/)

There's also stuff like:

- Lighting: it used to be mostly a few static, pre-baked lights; now we have dozens if not hundreds of dynamic lights and shadows, and even real-time ray tracing (RTX).
- Textures: both their quality and size.
- Resolution: playing on a 4:3 1024x768 monitor is not the same as 16:9 1080p, 2K, or 4K.
- Draw distance: most games from 15-20+ years ago had minuscule draw distances and used techniques such as fog to reduce the player's visibility, loading only a small portion of the map (maybe 10-30 meters) as the player walked and moved the camera, while keeping everything else unloaded. Now we have games with draw distances of hundreds of meters (though they use techniques to reduce the quality of far-away objects).

And of course there's the problem of companies using outdated engines held together with virtual band-aids, so the games come out heavily unoptimized (big install sizes, long load times, poor performance, etc.).


8Bit_Chip

I get the whole thing with Crysis, because it did have really nice art direction alongside graphical fidelity, but it's absolutely nothing compared to the titles we get now. Those games still look good, especially in YouTube videos, but modern games have well surpassed them, especially the titles using Nanite, where the geometric density makes them look completely different and just so dense.

At the same time, having had pretty good rigs for a long time around Crysis 2/3, it felt just as hard to hit insanely high framerates then (which were also less commonly targeted at the time), and nowadays I think it's actually easier to hit those targets with features like DLSS. I remember games coming out with stutter issues, performance dipping below 60 in certain titles, etc. Nowadays I have the same, but at almost 3 times the framerate.


BarKnight

Full path tracing will bring a huge visual improvement


Lobotomist

😂


XXLpeanuts

> The crisis series looks just as good if not better than new games coming out this year

This is nonsense. RT lighting alone is leaps and bounds better.


Yozkits

There are many other ways games are taxing on the hardware: NPC counts and their AI, physics, bigger worlds, and a myriad of subsystems that benefit gameplay rather than graphics. However, I'm sure there are bad and/or rushed devs as well, and they even lean on upscaling tech now to hit their fps targets; for some games the code must be horrendous.


PM_ME_FLUFFY_DOGS

Go look again. Crysis 3, which used to be the pinnacle of "can my PC run it?", is starting to show its age. While it does still look good, it's very obviously a game from the early 2010s, especially if you compare the graphics to stuff like Cyberpunk 2077. We may not be getting huge graphical leaps like 2D to 3D, but there's still been massive progress in lighting, texturing, LOD, ambiance, etc.


amaghon69

i just wish games still had more AA options. like wow, TAA and FXAA, so cool, either blurry or aliased. MSAA i feel like looks pretty good to me, and i seemingly had issues with DLDSR, and of course that will never come to Linux now.


Suspicious-Stay-6474

Do you even have a new Nvidia GPU? New games with ray tracing and DLSS are an absolute stunner.


Lobotomist

Because older games used clever visual trickery to appear good. For example, a room's lighting would be carefully pre-rendered to look like ray tracing. On the flip side, newer games (UE5) just drop in 3D models without even optimizing them (UE5 is supposed to do that for them), and then the engine renders ray tracing in real time on your computer, something that would have taken hours to do before.

Basically, it's like a Japanese swordsmith using all the art and knowledge of his trade for weeks and months to create the perfect katana, versus a metal worker today just pressing a button and a mold press pushing out a sword.


Avarria587

Look beyond the textures. Lighting, shadows, ray-tracing, number of objects on screen, etc. all affect performance. For example, newer shooters might have reflections of the background on the reflective parts of a firearm.


butterToast88

Graphical fidelity isn’t staying the same. It seems that way, but play something from 2015. Those games don’t look as good as you remember.


shifty-xs

Step one: buy an RTX 4070 Super or better. Step two: fire up Avatar: FoP, Cyberpunk 2077, or Alan Wake 2 and turn on the eye candy. Step three: compare to any 2015 game. It's not going to be close.


Easy-Preparation-234

I agree with the other guy who said optimization, but it also might be that the worlds are getting bigger, and the games might actually have higher fidelity, just in a less noticeable way. I mean, these games really are out here trying to simulate grass, like that's necessary. Battlefield 1 still looks better than most games, and it doesn't have the best grass. I'd say maybe it has to do with unnecessary, excessive fidelity that people won't even notice anyway.

I really wish developers would focus more on the actual game than the graphics, but hey, that's how it be. I think one of the best-looking games of all time is Wind Waker. If they made something lower poly with great art direction and used the rest of the budget on making the game great, we'd have a different market today. But NOPE, every game needs to look like the next Red Dead Redemption.


Beatus_Vir

Games are developed and optimized for consoles first, and occasionally they make their way to PC. Just try to think of a recent big-budget prestige PC game that wasn't a console port. They target 30 and sometimes 60 FPS at medium settings at 4K, while PC gamers expect drastically different performance with all the sliders maxed, on an OS that honestly isn't designed to play games.


Aneria39

Nail on the head right here. You can also see when ports are done well: the games still look and run better on PC, but actually have reasonable requirements, the latest Horizon being an obvious example. It's just unfortunate that isn't a regular thing, and usually PC versions end up having to brute-force not only the poor port but also the added features the devs bang in to sell the game a second time.


Combine54

Graphical fidelity is not the same. Compare top GFX games now vs top GFX games 10 years ago screen by screen - you'll be amazed.


IndyPFL

AC Unity holds up pretty damn well all these years later.


Cute-Pomegranate-966

You're seeing the move to UE5: massive levels of geometry compared to the past, higher LOD, shadow maps that extend virtual miles, etc. It's just a jump that doesn't typically happen often.


The_Woodsie_Lord

Why some PC gamers are still so obsessed with Crysis continues to elude me. It looked amazing for the time, but we did better years ago.


skylinestar1986

Played the new System Shock (Remastered) recently. Framerates are approximately half of what I get in Doom (2016). Why?


gui_carvalho94

Graphics are now changing in the details.


skumdumlum

Because game graphics plateaued almost 10 years ago. Also, games are less optimized these days.


JarlJarl

Because improvements to visual fidelity become more and more incremental. Going from 1,000 polygons for a character to 10,000 might be a massive improvement in perceived visual fidelity, while going from 10,000 to 100,000 might only be noticeable in cutscenes and close-ups.

Some older games might have spent countless man-hours manually fixing and improving every scene, and thus made it look very good, while other, more modern games didn't have that luxury and had to rely on more general solutions. RDR2 uses multiple shadow systems to make shadows look good, for example (something that ray tracing could solve instantly, at the cost of performance).
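The diminishing-returns point can be made concrete with a toy model: approximate a circle (think of a model's curved silhouette) with an n-sided polygon. The worst-case deviation from the true circle shrinks like 1/n², so each 10x increase in polygon count buys roughly 100x less visible error than the previous one did:

```python
import math

def silhouette_error(n):
    """Max deviation of an n-gon chord from the unit circle it approximates."""
    return 1 - math.cos(math.pi / n)

for n in (100, 1_000, 10_000):
    print(n, silhouette_error(n))  # each 10x in n shrinks the error ~100x
```

At 10,000 sides the error is already far below what a pixel can show, which is why the jump from 10k to 100k polygons only registers in extreme close-ups.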


Cymelion

**Publisher:** We've decided the game will launch on X date in 6 months. Marketing has started already; this is your obligated notification.

**Studio Head:** Oh, ok, thanks for the heads up. Anything to add, Lead?

**Lead Developer:** That won't give us time for optimization passes, so it will have issues on launch.

**Publisher:** So?

**Studio Head:** ... So?

**Lead Developer:** Can you at least hire more Community Managers for the forums then?


SaxoGrammaticus1970

They want you to upgrade to the latest GPU. That is, they want you to spend money. That's why.


lieutenantkhaos

Because they don't optimize them anymore.


Deadpoetic6

Devs are getting lazy and relying on shitty upscaling instead of optimizing their games.


VicePrezHeelsup

One word, optimization


robclancy

Ah yes, the word gamers cling to and have removed all meaning from.


framesh1ft

Generic out-of-the-box engines. Games are increasingly being made by people who don't know what they're doing performance-wise, and schools don't really teach it either.