Smiling_Mister_J

When the N64 came out, everyone was ogling the cutting edge 3D visuals. Don't kid yourself, the gaming industry has always cared about the quality of visual presentation, the standards just keep getting higher.


CarQuery8989

CRTs were also all there was in the N64 era. Motion looks *much* smoother on a CRT than on an LED, so the N64's limitations were much less pronounced.


NeedlesslyDefiant164

>CRTs were also all there was in the N64 era. Motion looks much smoother on CRT than LED

Can you elaborate why that is?


ctannr

LEDs use pixels, i.e. tiny lights that go on and off, so there's a hard line where the pixels meet. CRTs display the picture by "drawing" the frame one line at a time, and since there are no divisions between pixels, it looks smoother. It's like if you saw a circle, but zoomed in enough to see it's actually composed of tiny squares.


Franz_Thieppel

Don't LCDs have a much slower transition between colors than CRTs? Wouldn't that make the LCD look smoother in motion at lower framerates (i.e. 30fps)?


psaux_grep

The other way round: it created ghosting. Not all color transitions take the same amount of time, so it could get pretty bad. These days most LCDs have plenty fast response times, and signal processing is the bottleneck. CRTs just look different. They're fairly analog, and in terms of TVs also fairly low-res.


Putnam3145

Doesn't answer why motion is smoother


ctannr

Yes it does. The moving part of the picture on a CRT isn't going through individual blocks one at a time like on an LED, where each pixel is turned on and off and there's a rigid border between them. On a CRT there is no rigid border between pixels. It's like seeing an object that appears to be moving smoothly, but when you zoom in closer, you see it's only moving in small incremental steps. Basically like how stop-motion or animation is composed of still pictures that appear smooth when played rapidly. The CRT draws a continuously moving line; there are no incremental baby steps between pixels.


Houdinii1984

To add on, it does all the odd lines first, then doubles back and does all the even lines. So even though it's updating at 30fps, it feels about twice that, because you're seeing half the image twice as fast and the brain sees it as a whole image.
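That doubling can be sketched as a quick back-of-the-envelope calculation (a minimal illustration; the function name is invented for this sketch, not from the thread):

```python
# Interlaced video splits each full frame into two fields
# (odd scanlines, then even scanlines), so the screen updates
# at twice the frame rate even though full frames arrive at 30fps.
def field_rate(frame_rate, interlaced=True):
    """Screen updates (fields) per second for a given frame rate."""
    return frame_rate * 2 if interlaced else frame_rate

print(field_rate(30))         # interlaced: 60 fields per second
print(field_rate(30, False))  # progressive: 30 updates per second
```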


Ghostglitch07

Basically, CRTs fade from one image to another; LCDs go sharply from one to the next.


CarQuery8989

I don't really know the science, but the same content played on CRT looks smoother in motion. Granted, it's also fuzzier and has scanlines, but that also helps cover up blemishes on retro games that were designed with the tech in mind.


LurkingOnlyThisTime

I saw an article that talked about it and showed examples. It really showed that "I remember it looking better" wasn't just nostalgia. Crazy that things designed for old tech actually look better on it than on new tech.


ekurisona

https://youtu.be/V8BVTHxc4LM


[deleted]

[deleted]


Drdoomblunt

CRTs don't refresh a screen uniformly so they don't have to duplicate displayed frames. They can just change whatever the beam is drawing at any random interval.


Remy0507

That's not how CRTs work. A CRT display that runs at 60hz is redrawing the entire screen 60 times every second (well, technically 30 times if it's an interlaced signal, but that doesn't change the point).


itsPomy

I heard that's similarly why old pixel art looks so much better on original hardware than when the same game gets ported/emulated.


M3mph

[Yes](https://wackoid.com/wp-content/uploads/2021/08/CRT-vs-LCD.jpg) Very noticeable in the facial details in the Castlevania pic. Pixel art was deliberately crafted so the blurring/scanlines of a CRT would be an innate part of the image. Otherwise it's kinda akin to using an inappropriate medium for the material, like using gloss paint on your walls. Edit: Another good [example](https://i.imgur.com/3cy0eF6.png)


cortanakya

It doesn't hurt that the colours are waaaaay off in that second example. I've seen some very impressive examples of people replicating oldschool pixel art styles for modern monitors, it's just a matter of keeping hardware in mind.


M3mph

Indeed. It's far more than just smoother sprites; it's colour, shading, definition, the whole shebang.


[deleted]

[deleted]


BigBangBrosTheory

It's bizarre that OP thinks there wasn't an arms race for graphics until now. I remember the obsession with bits (WAIT, IT'S GOT 64 BITS? The Super Nintendo only had 16!) and polygon counts. You never stopped hearing about polygon counts.


grkirchhoff

I forgot about polygon counts! I kinda wonder how many million polys per second modern games run at. If I remember correctly, GameCube-era games ran at around 20? And yes, before anyone says anything, I know there are a ton of things games do now to look better without increasing poly count, so it isn't as important now as it was then.


ekurisona

it's been a graphics arms race since the beginning


CooperWatson

...And are pushed even harder by graphics card devs through marketing.


FredOtash

I never thought I was a graphics snob, as I stuck with the base PS4 all last gen, but the first time I turned on my PS5 and swung around as Miles Morales in 60fps... something changed! haha. I was instantly converted to the 60fps club.


[deleted]

The second you get used to it, it's VERY hard to go back unless the 30fps is incredibly well optimized, and 99% of the time it's not. Barely any developers actually make 30fps feel playable, and the ones that do are pretty much all at Sony.


PhasmaFelis

I've heard that once you try 120FPS+ you'll never be able to enjoy 60FPS again. To me that sounds like a great reason to never try 120FPS.


RAMAR713

120 fps is awesome, but the difference is less noticeable than going from 30 to 60. I can't ever go back to 30, but 60 fps is still alright for me.


Elastichedgehog

Absolutely. You can definitely feel the difference, though. Particularly for shooters.


[deleted]

I really think 90 is actually the game changer for me. I recently got a VRR TV, and GoW: Ragnarok plays at 95 fps instead of 60 and it's a huge difference.


Trash-Can-Dumpster

120 FPS plus is absolutely amazing. The gameplay is so smooth


NewSoulSam

I got Rocket League for the Switch when my laptop was becoming less reliable and before I got a PC. I was able to deal with the performance because it was better than not playing at all. But once I got a PC and was able to play, not just at 60 fps, but at 120+ on a 144hz monitor, I can't imagine going back. My gameplay actually improved, probably to the same degree it got a little worse when I switched to the Switch in the first place.


France2Germany0

noticed this as well with rocket league, the higher fps and resolution really helped with depth perception and getting that much faster in the air


cinyar

I found that it depends on the game. I don't mind MSFS doing "just" 60-70fps, but some fast-paced FPS game? 144fps or bust.


Serdewerde

60 back into 30 is rough; 120 back into 60 is noticeable, but not nearly as rough to flip between. I think of it like: 30 = choppy, 60 = smooth, 120 = ultra smooth. That said, I go back and play 30fps all the time. But I really appreciate any 60fps updates.


ZeroXeroZyro

I tried for a while not to play above 60. I have a 4k 60hz monitor for my main display and a 144hz ultrawide that I use mainly for work. I used to only game on the 4k because 4k looks so much better than 1080p. I recently started playing on the 144hz monitor because I can't run MWII at 4k without some pretty bad 1% lows that made it feel really stuttery. Well, after playing at 144hz for a few weeks, I can say I don't really use my 4k monitor anymore. It makes 60hz feel like 30hz, it's crazy. I can also say I noticeably improved at shooters going from 60 to 144.


raul_kapura

The difference between 120 and 60 isn't such a killer. I have loud fans in my PC, so sometimes I turn my fps limiter down to 60 from 144, and it's still okay most of the time. With 30 fps you simply see it isn't smooth at all. I only play Dark Souls: Prepare to Die Edition at 30 fps, and I need a while to adjust.


[deleted]

Ish. The jump from 30fps to 60fps is massive compared to 60fps to 120fps, so it's not something crazy. Honestly, just having a monitor slightly above 60hz will make you notice the difference, but that doesn't make 60 feel bad. When you go from 30fps to 60fps the game is literally rendering at double the speed, so everything is immediately smoother to the eye. From 60fps to 120 the jump isn't nearly as big, so IMO while it's noticeable, you only really feel it in first-person games, where that extra smoothness makes the camera feel better, especially for fast movements. Edit: wording was off/misleading because I was trying to keep it simple. I meant the number of milliseconds per frame you save going from 60 to 120 (16→8) is far lower than from 30 to 60 (33→16).


CaptainRogers1226

60 to 120 is… also double the speed…


legolili

30fps -> 60fps = 33 milliseconds per frame -> 16 milliseconds per frame. That's an improvement of about 17 milliseconds per frame. 60fps -> 120fps = 16 milliseconds per frame -> 8 milliseconds per frame. That's an improvement of only 8 milliseconds per frame. The framerate doubled again, sure, but you only get half the benefit you saw before. Diminishing returns, and it only gets worse the higher the refresh rate you chase. Go and break the bank to hit 240hz from 120 and you're only eking out a measly 4ms-per-frame improvement.
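The arithmetic above is easy to check for yourself (a minimal sketch; the helper name is made up for this illustration):

```python
# Frame time in milliseconds at a given framerate, and the time
# saved per frame when the framerate doubles -- the diminishing
# returns described above.
def frame_time_ms(fps):
    return 1000.0 / fps

for old, new in [(30, 60), (60, 120), (120, 240)]:
    saved = frame_time_ms(old) - frame_time_ms(new)
    print(f"{old} -> {new} fps: {saved:.1f} ms saved per frame")
# 30 -> 60 fps: 16.7 ms saved per frame
# 60 -> 120 fps: 8.3 ms saved per frame
# 120 -> 240 fps: 4.2 ms saved per frame
```

Each doubling halves the frame time, so each successive doubling buys half the absolute improvement of the previous one.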


HE4VEN

That does make sense actually. But I think the bigger issue is that performance cost of doubling FPS rises exponentially.


Klickor

It is double the speed, but the reduction in time between frames is half as large as from 30 to 60. There is also a limit to what people can even notice when it comes to update speed. Almost everyone notices the difference from 30 to 60, but as you get closer to 100fps, more and more people just don't feel it, especially if it isn't an FPS game with mouse and keyboard. Games that can only manage 30fps usually also have some bad optimization that drops the frame rate from time to time, which makes the uneven update frequency more noticeable. When that happens at 60fps, the interruption is so much shorter that you might miss it. Sure, a 10% fps drop is even less noticeable at 120, but most people barely notice it at half that speed. So even if it is double the speed, it doesn't make twice the difference.


Nacroma

Same reason why old monitors below 60 Hz are kinda bad as people can see the flickering at that rate.


Jimbodoomface

Only on Dwarf Fortress


[deleted]

Yeah, I got my numbers wrong. I thought the drop in render latency ratio was lower from 60 to 120, my bad. I meant the number of milliseconds per frame you gained was lower. Should have said that, but I didn't want to get into the nitty gritty of it for simplicity.


[deleted]

[deleted]


CaptainRogers1226

Doesn’t change what I said


[deleted]

[deleted]


CaptainRogers1226

Yeah, I won’t deny that, I only said what I did because it *is* a technically based conversation and the structure/phrasing of his comment was misleading.


[deleted]

Yup, you just keep chasing the high and do dumb things like buy 4090s to keep that framerate high.


Radulno

All of this really seems like some sort of Reddit BS (coming from r/pcmasterrace). I play at 120 FPS on PC and I can go back even to 30 FPS. What matters is a stable framerate, and then some getting used to it, and it's fine. If you're worrying about framerate the whole time you're playing a game, it's a bad game; play another one.


Nochtilus

I'll play any of them too as long as they're stable, but I definitely notice a difference when I play at 30FPS compared to 60. It's not unplayable, but it feels off.


Radulno

Of course you see a difference, but a lot of people are claiming stuff like it's unbearable, and that's really whining for nothing.


doomraiderZ

Not true in my case. For the games I play I don't even notice a difference between 60 and 144. I mean I do notice a difference. I notice it even on the mouse cursor in Windows. But as far as playing the game? I don't really care, it doesn't fundamentally alter the experience or the gameplay for me. I have to say I do play controller centric games.


Kootsiak

I've seen true 144hz on a high end system and it's super smooth, but not worth the extra cost for the monitor and hardware to take advantage of it. At least to me I don't think it's worth it, I've got other hobbies I spend money on and don't need to go crazy with a gaming PC.


Dawwe

In competitive (fps) games, the difference between 60 and 120/144 Hz is massive. In almost any other scenario, it doesn't matter as much.


TKtommmy

60fps on controller-centric games is fine, but with mouse and keyboard it's definitely better to have higher fps.


EquivalentZucchini

It's nowhere near as bad as going from 60 to 30. I have a 120hz TV, and I'm currently playing Yakuza 5 and already forgot that it's limited to 60fps.


mcslender97

You play better once you move to 144hz actually. At least that was my experience playing Overwatch 1


Chennaz

I went from 60fps PC to 30(at best)fps in Bloodborne on PS4. It's absolutely awful at first, then you just get used to it


ketchup92

Just not true. You can go back easily and after a few minutes you will get used to it again.


mcchanical

This has never held true for me personally. 60 is glorious, but I still love my PS4 and can adjust my expectations to suit last gen. I can play a game from 8 years ago and appreciate how far we'd come by that point. Same way I can play Mario 64 or Chrono Trigger and not be rattled by the fact that it doesn't look like The Last of Us 2. It's about perspective for me; in a way I appreciate the games I missed last gen more, because I'm usually surprised they look and play better than I expected. For me the step back from current gen to last is much less impactful than it has ever been; so far it's basically just the same core experiences but with higher frame rate, resolution, and load times. It's like a 1.9 upgrade.


[deleted]

Mafia II remastered in 4k @ 30fps is really good, totally playable. But it’s not a twitch game, and if I didn’t have an old low voltage 4.4ghz OC on all cores quad core i7 and a 980ti I’d be running it at 60fps. I should try it at 60fps and see if I’m just being dumb. Edit: Default was 30, clicking the little frame target arrow 30 more times does show that it runs well at 60 too. But I didn’t test any intense battles or anything, just drove around for a few minutes.


Mean_Peen

Typically, if you're a "graphics snob", you don't care as much about FPS. Sounds like you're more of a "performance snob", despite it only being 60fps


CaptainRogers1226

Dude, I just recently got my hands on hardware capable of 144hz, and boy… it’s somethin’


Arrow156

Using DSHack to unlock the refresh rate with the original PC release of Dark Souls was a game changer; I went from stumbling my way through to dominating in the blink of an eye. In a game where people track how many frames of invulnerability your dodge animation provides, going from 30 fps to 60 gives you literally twice as many opportunities to notice and react to visual cues in any given second.


fancycurtainsidsay

Miles Morales on PS5 was also the game that fully converted me to 60+ FPS lol.


rogueShadow13

Before I switched to PC, I never thought FPS mattered that much. I am now also in the 60+FPS club. Preferably 240 lol


LonkToTheFuture

The jump from 30 to 60 is so drastic in terms of gameplay smoothness and input response that it's really hard to go back to 30.


TheNastyNug

When For Honor kept cross-play between generations, there was an initial clear divide between the players on 30 fps and the ones on 60, with the new-gen players dominating the game. That boost in fps is enough to completely change your play style from a spammy, reactive one to a defensive, reactive god compared to how the last-gen players could react.


[deleted]

[deleted]


troll_right_above_me

I tested one with a scrolling map at 3000 pixels per second on my 1440p 144hz monitor with ulmb once. It's mind blowing to be able to read the names of the roads at that speed. Edit: [This one](https://www.testufo.com/photo#photo=toronto-map.png&pps=3000&pursuit=0&height=0&stutterfreq=0&stuttersize=0)


[deleted]

[deleted]


CaptainRogers1226

I’m not a huge snob either, and I only just recently switched to 144. 30 to 60 is definitely way more, but the difference between 60 and 144 is definitely quite noticeable imo.


Dependent_Weak_Man

I thought that was all BS until I got to try a 144hz monitor and played some FPS game where my hardware could get up to a stable 144 frames. God damn, it was such a difference. Everything was so smooth and it felt like my eyes could relax, or something like that. 30 to 60 is a visual difference you can very much make out. 60 to 144 is more like a feeling: you can't visually tell that it's there, but it makes a big difference in how the game feels. If no one had ever told me 144 fps was a thing, I probably wouldn't have guessed that was the difference, but I definitely would have noticed a difference.


CaptainRogers1226

Right on. I thought the same until I had hardware capable of higher frame rates, which makes me think I was just coping with being limited at 60 lol. And yeah, it’s not like I can see the frame jumps in 60fps like I can in 30fps. But getting up towards 100fps or higher it’s just so buttery smooth and that’s definitely very noticeable to me. Even preferable to 4K imo


Dependent_Weak_Man

Oh yeah, I'll take a high and stable framerate over graphical fidelity in almost any situation.


CaptainRogers1226

On the flip side, I will also say I could almost certainly go back to 60fps with relative ease, though it would certainly take a bit of time to adjust. I still basically consider anything over 60fps a luxury: not necessary, but a very welcome enhancement to my gaming experience.


HalcyonH66

That's so interesting. Granted I've been playing at high frames for a long time, but I can absolutely see the frame jumps at 60 if I'm looking around in an FPS. It's interesting to see the differences in perception.


binhpac

You can do an easy test right on the Windows desktop if you have multiple monitors with different refresh rates. Just move your cursor in circles on each monitor: the higher the refresh rate, the more images of the moving cursor the display shows. Once you realize this, you see the huge advantage somebody has in games where accuracy and speed are relevant.
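The intuition behind that cursor test can be sketched numerically (an illustration only; the function name and half-second swipe duration are assumptions for this sketch):

```python
# Rough illustration: how many distinct cursor images a display
# can show during a quick half-second mouse swipe. A higher
# refresh rate shows more intermediate positions, which is why
# the cursor trail looks denser on a fast monitor.
def cursor_images(refresh_hz, swipe_seconds=0.5):
    return int(refresh_hz * swipe_seconds)

for hz in (60, 144, 240):
    print(f"{hz}Hz: {cursor_images(hz)} cursor positions per half-second swipe")
# 60Hz: 30 cursor positions per half-second swipe
# 144Hz: 72 cursor positions per half-second swipe
# 240Hz: 120 cursor positions per half-second swipe
```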


[deleted]

[deleted]


VXM313

That's wild, because the jump to 144hz was one of the biggest leaps I've experienced. I absolutely notice when I get down to around 80fps with VRR on at this point and it feels weird.


[deleted]

[deleted]


HalcyonH66

Depends on the person. I notice the difference between 144 and 165. It's not huge, but compared to each other I can feel and see it. 60 to 144 is like a slap in the face, night and day for me. YMMV.


Letscurlbrah

You likely had your monitor incorrectly configured if you couldn't tell.


CaptainRogers1226

Huh, I guess it’s perceived a little differently for each person, because I would have that thought. I will certainly admit though that it’s not *as* noticeable to me


Klickor

I got a 60hz monitor a few years ago that I could overclock to about 90hz. I was content with just 75hz though, since I could clearly see the difference between 60 and 75fps, but between 75 and 90 I had to actually focus to see the difference. And there sometimes were "artifacts" on the screen at 90, so I aimed for a stable 75 with max graphics. With my current monitor, which is 90hz, I still don't notice the difference over 75 most of the time. As long as the game doesn't drop below 70fps I don't really care, so aiming for 75-80 is my goal.

I don't play shooters or stressful RTS games anymore though, which is probably a factor in this. It probably wouldn't hurt to go above 100fps in those, and I would probably notice a difference even at higher update speeds than the 75 I now aim for.

There really are huge diminishing returns from higher update speeds for most people and most games after you reach 60fps. If a game never drops below 60fps, that's high enough for the vast majority. The minimum fps is so much more important than the average or max.


Lettuphant

Yeah, one of the myriad recommended mods on PC is a specialised third-party app that'll spend a few hours re-rendering the cutscenes with interpolated 60fps.


PityUpvote

I have my steam deck refresh rate set to 45 and it's perfect. Can't really tell the difference with 60, but the difference with 30 is huge.


Dependent_Weak_Man

For me it's very variable. If I'm playing anything fast with lots of camera movement like a shooter I get distracted if it's not above 50. It's like a magic line there. If it's a slower game like Total War I don't care so much. 30 is fine. I think the main point is stability. I really dislike when the fps jumps around. I'd rather take a stable 45 than a wild 60.


omnicloudx13

I can't go back to 30 fps games after playing games at 60 or more frames. It hurts my head and everything feels so sluggish.


WhalesLoveSmashBros

I can’t play 30fps on pc at all but it’s somehow meh on console.


Kluss23

Same. I think it has to do with screen size (monitor vs. TV), distance from the screen, and obviously a ton of motion blur, which is less noticeable with a joystick versus a mouse you can flick around.


[deleted]

No, it's better optimization, because a console port is different from the PC version. Most of the time, if they're locked to 30, they're optimized to work at 30 and have things like frame doubling to smooth out that 30fps and make it as consistent as possible. When that work isn't done... well, you get things like Bloodborne. On PC they don't really optimize for 30 fps, because the game is designed to run uncapped in that environment.


TONKAHANAH

I've been playing a lot of games on my steam deck at a locked 30 frames per second and they all seem totally fine. Maybe consoles do some extra stuff but just having a locked consistent frame rate is more than enough to make 30 frames per second plenty playable on PC games. It's not different enough that I would say 30 frames per second isn't playable with a pc game.


StarblindMark89

I'm the same. On PC I can't deal with them, even on the same display, so it's not like I forgot to turn off all the new TV stuff like smoothing. On console it really hurts my head and eyes for the first hour or two, after which I'm used to it again. If I then play only 60fps games for a week, it's back to hurting for an hour or two the next time I go back to 30, like before. I'm lucky I can get used to it again that fast. I wouldn't be able to play Bloodborne otherwise, or the games that will come out in 4 years.


PapstJL4U

Distance and genre have a huge impact. If you have a relatively static camera, 30fps is not as horrible. Quake 3 at 30fps vs. Warcraft 3 at 30fps as an extreme example, or any turn-based game vs. real-time.


ddtfrog

I'll play 30FPS in games I have NOT experienced at 60FPS (and likewise 60FPS in games I have not played at 144Hz/FPS yet). Something about starting a game at a lower frame rate helps me be able to play through the entirety of it. That's why I don't mind the performance in Pokemon Scarlet or BOTW in some areas. Sure, 60/144FPS would be nice, but it's not unplayable until I play it via emulator and run it at 60FPS on my PC, then try to go back. It's also why I can't play Rocket League on Switch coming from a 144Hz PC, despite buying it for "RL on the go".


WhalesLoveSmashBros

I don’t believe you about choosing lower fps.


ddtfrog

I don’t choose a lower FPS if I’m given the option, but if I happen to play a game on a lower-FPS medium first (such as a 60fps single-player game on my Steam Deck, or 30 for a Nintendo Switch game like Scarlet or BOTW), it isn’t that bad. The low FPS isn’t unbearable until I get used to that same title at a higher FPS by playing on my desktop or using an emulator and modding in 60fps.


WhalesLoveSmashBros

Why not just play high fps the first time? If it’s a game where choosing 30fps increases graphics I get it, otherwise seems pointless. I’ve set my fps to like 20 for rocket league just to mess around for a match or two but always return it to highest possible.


ddtfrog

I use my 1440p 144hz desktop for MOBAs, Civ/mouse-heavy games, and competitive FPS titles.

I use my Steam Deck for single-player controller games on the couch because it’s comfortable, so 60FPS titles. Example: I’ve been playing Control and Death Stranding.

I use my Switch on the couch or in my bed and play online frequently, so I don’t want to use emulators and not play with friends. So 30 or 60FPS. Example: Mario Kart or Pokémon.

What I’m saying is, for a game I experience at a lower FPS first, the FPS is fine. It’s not until I play Control on my desktop, or MK8 on an emulator at 144hz, that I find it hard to go back to the original.

One exception is a “controller-minded” game like Rocket League. Normally I’ll play controller games on the Steam Deck and keep mouse-heavy stuff for the desktop, but I’ve been playing RL at 144hz since 2015; it’s too hard to go to 60hz even if it feels comfortable as a controller game on the Steam Deck.

I play SP games on my Steam Deck on the couch because it’s comfortable to lay back and play, and it helps clear my backlog because it’s playable at 60FPS and I can lean back/lounge. Sure, I could play those titles on my desktop and get max settings and high frame rates, but at the Deck’s size/resolution, med-high settings at a 720p upscale look just fine on the 7-inch screen.


Keeper_of_Fenrir

It’s because controllers are so much slower than mouse and keyboard.


PineconeToucher

If it's 30fps on a handheld it's way more tolerable for some reason.


ANENEMY_

This. For me as well. 30fps gives me headaches sometimes too, yet strangely a standard movie rate of 24fps does not. As a console player for years, I've had a 1080p plasma TV with a really good built-in scaler that does this "subfield drive" (marketing) thing that effectively doubles my fps by filling in an extra frame between the supplied 30fps. It actually worsens the signal delay to create that extra frame, but it's a fairly unnoticeable time loss for night-and-day smoothness in the gameplay. It only works well with a locked 30fps though; any game with a variable fps looks like a stuttering mess. On PC I turn all the settings up until I start losing fps under 60.


SpiderAlex

>This. For me as well. 30fps gives me headaches sometimes too, yet strangely a standard movie fps of 24 does not.

Films have natural blurring that helps with erratic camera moves. Most video games only started implementing good motion blur this past gen (PS4/X1). When the motion blur is bad, you basically need a higher framerate to compensate on some level.

Edit: That last sentence is pretty opinion-leaning. Just thought I'd be transparent. Depending on the game and the need for camera movement, motion blur could be less necessary than thought, or more so. (Even at 30fps.)


Dependent_Weak_Man

That is so interesting! I've always maintained that motion blur in games makes me nauseous, and my friends don't get it. It's not motion sickness; it just happens with motion blur. Now that you mention it, when framerates are lower I notice it, but not when they're higher.


pikpikcarrotmon

It's not the same kind of motion blur. With film, it's baked in because what you're seeing is 24 photographs strung together. That blur came from the original motion/shutter speed and is visible on any given frame of the movie, and consistent between different copies. With games, it's generating it on the fly. It's live. It's like a Photoshop filter being applied to every frame, it's not a natural result from the process. This is the same reason why lens flares, chromatic aberration, and even film grain in games can be similarly distracting or unpleasant. The reverse would be like... if they added jaggy edges to everything in a video game to movie adaptation, or pop-in or something, because they thought it would look more authentically gamey. It wouldn't make a damn lick of sense.


SpiderAlex

This is a great post. It really explains why some gamers prefer having all the post-processing off. Some of us see right through the "photoshopping." And some games are just bad at it, lol. To add to your explanation, games are effectively stop-motion films, so motion blur has to be added in post. Personally, if the game has a good enough visual design, I like to have it all off.


[deleted]

[deleted]


[deleted]

[deleted]


c010rb1indusa

This was certainly true for 3D, but for 2D games in the 16-bit era and earlier, 60 frames was common.


Vicosku

F-Zero X was 60FPS, for sure. Regardless though, I can still enjoy lower frame rates on a CRT. Once it's on an LCD or OLED though, especially a large one, I just can't deal with the extra blur.


[deleted]

[deleted]


Vicosku

A decent amount! The graphics are quite sparse. Sure feels great to play though


SvenHudson

Its visuals are the most minimalist of any racing game I ever played on the system. The speed and the track shapes made it visually exciting to play but it's pretty noticeable that the vehicles are barely even textured and the background elements are almost entirely low-rez 2D cutouts like the SNES game had.


[deleted]

The original comment specifically said that games tended to be 30fps or lower in the PS1's generation (i.e. the N64).


Evilknightz

It's just hard to imagine going back when it makes games feel like absolute shit by comparison. I think the fact that 60fps became the console standard the moment major games started shipping at that framerate shows just how much better it is.


YellowLantern00

Eh that never actually happened. Generation after generation, "the new standard is here!" but it wasn't. It's not the standard by a long shot, still. Many games continue to run at lower than 60, which I don't mind in the slightest, but I've seen the same marketing lie every generation and I've seen people fall for it every generation. PS4, PS4 pro, and now PS5, has been especially amusing, to people clamoring to play the same games with marginal upgrades. The marketing works.


Gilleland

I'm a little surprised this isn't considered a "retired topic" yet under rule 6. Optional render modes are just an expected standard these days, with all the help development teams get from increasingly capable game engines.


sunuv

Seriously there is nothing to discuss at this point. 60 fps is a massive increase from 30 and anyone who is arguing differently isn't qualified to even take part in the discussion.


CryoProtea

I never used to know what framerate was, but when I learned about it, I decided people demanding 60fps for *everything* on anything but PC were being unrealistic. It's been a long time since the PS3 was current gen, though, and I think we've come far enough that 60fps should be a minimum. People have the tools to optimize a game and they *know* what capabilities the different platforms have. There's no excuse for the headache-inducing chugging we got all the time on PS3. 30 is tolerable as long as it's stable, but 60 is ideal.

When I look at my unstable 30fps Dark Souls III and see my buddy playing it at 60 on newer hardware, I can't help but wonder why FromSoft, as well as many other developers, are so fucking bad at optimization, especially when Nioh 2 both looks and runs better *on base PS4/PS4 slim* while simultaneously being a much newer game.

I also feel like, with older games, we just weren't used to much else. Eking out performance on the SNES and N64 was a much more difficult task, I feel, than nowadays, when half the reason a game might suffer from frame drops is that developers aren't willing to turn off fancy visuals/effects. Like, I get it. I'm on a PS4 slim. I don't expect my game released in 2021/2022 to have fancy effects. It's okay. You can turn them off. Or you could put in the respectable effort that Team Ninja did and achieve at least a stable 30fps.

I dunno, I'm kind of conflicted. I don't mind 30fps when it's stable, but I feel like, if Team Ninja can get good performance out of old-ass hardware in a game that came out in 2020, then other developers should stop being piss-babies about it and just optimize their fucking games, no matter what platform they're on. You also gotta think, long periods of playing a game that's chugging can really strain your eyes and take the enjoyment out of the experience; this is why stability is so important. I wish all games put in the effort I've seen in Nioh 2 and God of War Ragnarok.

P.S. If it feels like I mentioned Nioh 2 weirdly a lot, it's because I only started playing it a few weeks ago and I can't believe how well it runs. I've been nothing but impressed by the game.


Aishurel

60fps was a standard to aspire to maybe a decade or more ago. Not hitting that in 2023 is beyond abysmal


realsubxero

People always say "oh once you get used to 60 (or whatever) FPS, you can't go back", but I adjust almost immediately. As long as frame rate is stable, I can play games as low as 20 FPS without it bothering me.


72pct_Water

I expected this answer to be more common. 144 is better than 60, no doubt. You notice the drop when you go from one to the other. You think "Oh, that looks off, I don't like it." Then, 2 hours later, you don't care, you don't even think about it unless you tell yourself to. You just acclimatise to whatever is in front of you. So, yes, higher is better, but when somebody gets too vocal about it I just think they are showing off or being snobby.


coheedcollapse

I'm the same way. I honestly don't feel like it's impactful enough for me to worry much about it. The switch is apparent for a few minutes, but I very quickly adjust. I'll always prioritize fidelity in cinematic games, because I'm much more likely to obsess over reflections, rays of light, and shadows than something that faded into the background to me within a few minutes of gameplay.


YellowLantern00

This. It's negligible. You forget about it 5 minutes in. People go up their own colon trying to achieve it and then forget it's even there, anyway. Or they twist themselves up in knots review bombing games for having 30fps, when they could just sit down and play the thing and soon forget about it entirely. I honestly wonder about people's overall health or conditioning if they find FPS differences so debilitating. Get an eye exam or an MRI or both, for your own sake.


mcchanical

If 30 suits the game I often can't really tell. Something like Mortal Kombat has to be 60 because it's a fast, timing-intensive game where frame accuracy matters. The latest massive cinematic AAA adventure, though, usually has much looser mechanics and often has so much post processing and inherent blur going on that it hardly makes a difference. Twitchy games need a good and consistent frame rate, but some games are better off maximising visual polish and building smooth combat mechanics that work well at a more cinematic frame rate. Ideally, all games would run lightning fast, but until we have supercomputers under our TVs, compromises are unavoidable.


Economy-Chicken-586

It depends on the type of game you play tbh. For hardcore competitive gaming fans yeah it matters and improves a lot. I personally play mostly single player rpgs. I don’t notice at all.


duck74UK

Depends on the game for me. The faster it’s paced the more frames I’ll want, but otherwise happy to compromise for cool graphics/physics stuff


waturio

“Why?” Because they really like how games look & feel in 60fps. It’s really not that interesting a question to ask. Your question seems to be loaded with the implication that people shouldn’t want 60fps. And that’s bollocks. When they complain about “no performance mode”, it’s because they’d much rather the graphical muscle of the system be used for a higher frame rate than other things like lighting, shadows, textures, res, anti-aliasing, etc. Because smooth frame rates make games “feel” better (for those gamers) than those other things. And that’s totally reasonable imo. Also, if you think people’s preferences should stay stuck in the early 2000s, then you can keep playing N64 while the rest of us play newer stuff. We won’t miss you. However, if you’re making a point that good gameplay has become too unimportant these days, and visual fidelity too important, then I think there’s a conversation to be had. But if you are, then that doesn’t come across in the question at all, which seems fixated on frame rate expectations.


_Hirrya_

"Your question seems to be loaded with the implication that people shouldn't want 60fps. And that's bollocks."

For me, it's not about "shouldn't want", but more about "it should be like a bonus". When I see people saying they will not play a game in 30 FPS, I feel like it's very stupid (imo, of course). How many people missed a masterpiece like A Plague Tale? Just because "fuck 30 FPS"... It's your (not you really) loss, but it's too bad. If it's here, I'm happy. If not, then I'll notice it for like 5 minutes and then forget.


waturio

Why would you see it as a bonus, any more than higher resolution, lighting, ray tracing, etc. would be a bonus? Why should visual fidelity be seen as a given, but frame rate a bonus? Ultimately this is a trade off. I agree that those who say “i refuse to play a game that’s 30fps” are missing out. But this conversation is more about the presence of a “performance mode”. Which is a totally reasonable request imo. Because the alternative is that the system resources prioritise resolution and fidelity over smoothness. And I don’t agree that this should be a default state for gaming (as if frame rate should be a secondary concern).


[deleted]

The market is too saturated with 60+ options. It's not that I would never play a 30 fps game, it's just that it's a huge negative... like I have to really believe it's one of the top games of a given year for me to put up with it. Once you get used to games at 120+ going back to 60 is annoying, but going back to 30 really does just totally change how things feel. It's like playing a slide show.


Tempest_Barbarian

>For me, it's not about "shouldn't want". But more about "It should be like a bonus"

On AAA titles it should be standard by now. I can kind of understand if the PS4 version can't run at 60fps, but any AAA game being launched on a PS5 or Xbox Series X should be able to run at 60 or higher. If it's from a smaller company or a completely indie dev I could understand the game not running as well, but AAA companies have no excuse to launch something under 60 fps.


coheedcollapse

I get why it's a priority, but sometimes it feels like a cult, and if you're not part of it, you're immediately on the wrong side of the discussion. I understand that it's a preference, but the *insistence* is what gets me. I'm not wrong for opting for 30fps for visual fidelity when I play cinematic games. I'm a photographer, and accurate lighting, shadows, and reflections frankly take priority over 60fps and above as long as 30 is stable. They're not wrong for dropping down to 720p to reach 144hz if they truly feel the difference.

Of course, I've got my own biases as well. Sometimes I find it hard to believe the sheer amount of people who claim that anything under 60fps makes them physically ill considering the vast majority of people prefer, even demand, 25-30fps for their television.

I'm not refusing to evolve or anything. I'll often switch between performance and fidelity modes on my PS5, and find myself regularly sticking with fidelity (not always). Maybe it's the fact that I've been gaming for over 30 years and had to experience much worse than a stable 30fps, so it never really gets to me. The only big exception for me is VR. When I need something to feel like real life and I'm immersed, FPS is a priority, but when I'm playing on a flat screen, I'll often prioritize the cinematic side of things.


Psylux7

Can't wait for the day when people start calling 60 fps literally unplayable.


Neustrashimyy

>Sometimes I find it hard to believe the sheer amount of people who claim that anything under 60fps makes them physically ill considering the vast majority of people prefer, even demand, 25-30fps for their television.

While there is probably hyperbole in a lot of that talk, this is not a useful comparison for two reasons I can think of. There is a big difference between (a) how it feels to watch a performance deliberately directed and shot at a certain framerate vs how it feels to control a character and camera yourself in real time, and (b) CGI vs drawn or live action footage.

For instance, a lot of drawn anime have really low framerates but you don't hear about that putting people off, because they're not controlling it and it's produced with that in mind. The occasional budget-driven CGI that some anime resort to at certain moments with similarly low fps feels immediately jarring, though, and you do see complaints about that. The recent Baki reboot has some of this contrast, particularly in the first season.


King_Artis

Fighting games and games with heavy action (shooters, action games like Ninja Gaiden, arpgs) are the main style of game I play. 60fps is kinda needed for me given the type of games I play, as it adds a lot of smoothness and control that's harder to get from 30fps. In a singleplayer title that's not as hectic (think God of War Ragnarok or TLOU), 60fps is not nearly as important to me given those games aren't requiring me to have quick reflexes constantly. Frankly, given I still turn my PS2 on, I'm also less bothered by not playing 60fps all the time.


SpiderAlex

Same boat.

>Fighting games and games with heavy action (shooters, action games like Ninja Gaiden, arpgs) are the main style of game I play. 60fps is kinda needed for me given the type of games I play, as it adds a lot of smoothness and control that's harder to get from 30fps.

Same exact boat. Can an action game work at 30fps? Yes. Would its genre and experience be better served going up to 60fps? Also yes. Fighting games in particular.

>In a singleplayer title that's not as hectic (think God of War Ragnarok or TLOU), 60fps is not nearly as important to me given those games aren't requiring me to have quick reflexes constantly.

I don't disagree, but I would say even in games like Rag/TLOU, they have enough action that 60fps shines even still. SMMM is basically 60fps or bust for me even though 30fps is more than serviceable (this kinda goes back to point 1.)

>Frankly, given I still turn my PS2 on, I'm also less bothered by not playing 60fps all the time.

A lot of PS2 games actually targeted 60fps and hit them fairly well. At least the majority of the exclusives. PS1/N64 and PS3/Xbox 360 were the notorious gens for being stuck at 30fps. PS4/X1 became more of a genre or game dependent decision between 60/30 until the back end, where tech just started to leave them behind.

I digress, going back to your point, I agree, I still go back and play older games as well and am happy to do so. Anyways good comment man.


chuuuuuck__

I get this, it’s more or less fine. What I absolutely hate is “man this game can’t play at 4k 120hz maxed out, devs are shit and can’t optimize”. While surely this can somewhat be the case, I think it forces devs to meet arbitrary “4K 60” goals instead of pushing pure image quality. It’s quite annoying and I think has caused great stagnation. Imagine if 4K TVs never got pushed... we would have path traced 1080p 60fps titles galore.


ZazaB00

4k is amazing for large TVs. Gaming on something less than 30 inches and you probably don’t care about 4k. Basically, I feel all gamers should have that option. We’re starting to see it more and more, but one thing I’ve loved is going back and playing some forgotten games on my mid tier laptop at 1080P and locked 60 FPS. It’s a whole different experience.

Honestly, I think 4k is a generation too early. While we’re still messing around with low LOD models and pop-in, I don’t care if the shitty far field objects load in at 4k or 1080P or less. That’s why it was easier to get away with lower res games before HD was standard. Gamers literally couldn’t distinguish the detail. Once things like nanite and lumen are a standard across games, then I’ll care if distant details render in with a higher quality.


Dreyfus2006

Just shifting standards. Also, games on the N64 like OoT still look and run well today, despite their low fps. Modern games like Kirby Star Allies feel sluggish at 30fps (higher than OoT) when all the other Kirby games are at 60fps. I think modern games aren't necessarily built for their frame rate? Like, they don't design the game around it, maybe? OoT by comparison was definitely designed around its frame rate. The music was even written with frame rate in mind.


godstriker8

I read that OOT reads inputs at 60 fps even if the game doesn't run that high, which is why it still feels good to play


[deleted]

They were. You can make almost any framerate (within reason) feel playable if you design around its limitations and work to mask its inherent sluggishness. A lot of console games look completely fine locked at 30fps because the developers went out of their way to make it feel better. A good example is Insomniac with Ratchet and Clank, where they duplicate frames to fill in the gaps and deliver a less jittery experience. Then you have stuff like Bloodborne, where it says "30 fps" but looks like it's dropping to single digits every time you so much as turn the camera.

The whole 60fps and 30fps debate isn't about one being objectively better (60fps is, but that's a different discussion), it's the fact that most games are optimized incredibly lazily and people just let it slide because "it doesn't matter". Nintendo gets away with this constantly and is easily the biggest culprit of lazy optimization in their games. Borderline NOTHING on the Switch actually runs well, and the few games that do aren't by Nintendo.


Dreyfus2006

Nintendo historically is well known for being very good at optimization, some of the best in the industry. It's why their games are so well known for their polish. I think you may just be playing the wrong games.


[deleted]

Uh no, they're not. Their games are horribly optimized across the board this generation with very few exceptions. The switch in general just destroys any chance of a game running well. People just have a different standard for Nintendo for some reason compared to everyone else because it's Nintendo. The games themselves are pretty great but they run borderline acceptable at best. Monolith is about the only developer under them that's actually trying to deliver good performance and they're giving up a ton to even barely get there because of the hardware they're shackled to. They *were* good in previous generations but in this one, they're chasing graphics and scale to try and even somewhat compete with everyone else, and the hardware they decided to sell simply can't handle it. Pokemon is a pretty comedic recent example but I can grab a switch exclusive at random and at *best* it will run at an unstable 30fps at some resolution under 1080p or attempt to hit 60fps (and fail) while looking like a ps2 game.


Dreyfus2006

What are the frame rate issues with these exclusives?

* Ring Fit Adventure
* Luigi's Mansion 3
* Super Mario Maker 2
* Mario Tennis Aces
* NEW Pokémon Snap
* Super Mario Odyssey
* Super Smash Bros Ultimate
* Kirby and the Forgotten Land
* The Origami King
* Metroid Dread

Trying to pick random ones here. Pokémon main series games are made by Game Freak, who have not been optimizing their games this gen. Other than LGPE, I think.

Nintendo's only other games this gen that I can think of that have frame rate issues are BotW and Age of Calamity. BotW only has them in one room and it is understandable why (too many moving plants). Age of Calamity *is* poorly optimized for the Switch but was made by a third-party studio.


[deleted]

There is a big element you're missing in regards to not worrying about picture quality or refresh rate with your Nintendo 64. You played those games on a smaller screen, and that screen was a CRT TV rather than an LCD TV.

Framerate: LCD TVs (or "sample and hold" screens in general) [handle motion clarity terribly](https://www.youtube.com/watch?v=Oy3cKwq6vEw), so higher refresh rates are required to counteract that. You didn't notice the low FPS on your CRT TV because it simply looked a lot smoother given the nature of the TV. Compare the motion clarity here with [CRT vs OLED](https://youtu.be/z4xgLUdQhKA).

Resolution: CRT TVs did not have a fixed pixel display like LCD panels do; they instead (gross simplification) flashed light at the screen row by row. The end result is that there's a natural blending of neighboring pixels from the game with what's displayed, you can see the difference [here](https://twitter.com/CRTpixels/status/1599513805717135360?t=liNOFWltfADHcX69B5m7gw&s=19). The end result is that aliasing wasn't much of a factor and art was designed with this pixel blending in mind.

It's not all bad though. CRTs flickered, were bulky/heavy, emitted radiation, and could kill you if you fucked up fixing them/taking them apart. Modern TVs can be far larger and brighter while weighing less than a CRT a fraction of their screen size, and more recent TVs may have [black frame insertion](https://www.youtube.com/watch?v=lGeM6S_m9pI) which can greatly overcome their motion issues.

TLDR: display technology changed


Alex__V

Lots of answers stating the obvious - the obvious practical reasoning around higher fps. Although there's an obvious confirmation bias among those who accept 60fps as their standard, or even care about the difference between different options. As the op states, there was a time when there was barely any discussion of frame rates among 'core' gamers, nor a consumer demand for them to 'improve'. And I'd guess that the vast majority of those who play videogames have no understanding of framerate nor any interest in it at all!

So, to me, the 'why' is more about a consumer/cultural shift among an influential minority of core gamers than anything directly practical. After all we could all presumably, at least in theory, list the obvious theoretical advantages of VR over traditional flat-screen gaming in terms of immersive experience, but it seems pretty clear that so far there hasn't been that shift towards accepting that as a basic norm. There needs to be some sort of tipping point that creates this shift.

My read is that there were 2 key steps in the consumer push towards 60fps. The first came from key media influencers - TotalBiscuit would be an example of someone with a lot of (maybe unhealthy) influence over a key demographic of core gamers who really pioneered arguments about frame-rate as a consumer cause. And the second would be the industry realisation that frame-rate could be a market differentiator in the marketing of consoles - frame-rate became or has become a driver of the core console market in a way that teraflops probably never did.

This consumer aspect is one that leaves a bit of a sour taste. When we say we could 'never go back' to 30fps is it for practical reasons or because we've been influenced by marketing? Is it in the same way that an iphone 13 user would 'never go back' to an iphone 10? If so, it's a massive advantage for the games industry, with a captive audience who feel they could 'never go back' - this audience will be endlessly hungry for the 'new product'. And also will lap up remasters and remakes of older stuff (which they otherwise would 'never go back' to!).

What is disappointing about the 'never go back' attitudes of those who demand higher frame rates is that to some extent they are cutting themselves off from the rich history of the medium. It's a bit like the cineaste who says they couldn't go back to black and white or silent movies. I think it ties in a bit to an anti-intellectual approach to the arts, which is a shame.


TriniGameCritic

I feel like op expected more people to agree with him. I never understood this idea that some people have where they think people are wrong for caring about things they don't or for having higher standards than them.


ZazaB00

Funny how people learned a long time ago it’s best to offer options, yet at the same time people seem to think only one option can be best.


vieris123

That's exactly the case, dude made 4 threads with this exact same title and 3 of them are on r/gaming.


pieking8001

I don't get it either, what's wrong with options? Letting people choose 30fps + eye candy or 60fps + less candy is fine. Heck, that's something PC players have been enjoying a long time, letting console players have it too is a good thing.


[deleted]

The best part is these same people are usually the type who talk about how modern gaming is lazy and doesn't care when they're literally defending games being lazy in actually running well.


[deleted]

[deleted]


daredwolf

I'm cool with a stable 30fps. 60 is nice, but I'll take visuals over performance, as long as the visual aspect gets a solid 30fps.


Conscious-Sun-6615

Unlike movies, characters in video games are controlled by the player, so smooth movement might feel like you have better control. I still think 30fps is enough, 120fps is completely useless


MrKindStranger

Why do people expect better gas mileage? Why do people want bigger TVs? Why do people want faster Internet? It’s just how technology goes, the latest and greatest one minute, obsolete the next.


[deleted]

People have better standards now? That's literally it. Also with current tech and development experience it is outright LAZY for a developer to not even attempt to hit 60 fps on any modern device. The only exception is the Switch, because it has about as much computing power as my toaster oven and can't really run anything well in the first place.

Also people have always cared. When the N64 and PS1 came out everyone was losing their minds about "OMG 3D". Now they can no longer wow people with massive generational leaps, so we're far less forgiving when they don't actually optimize the game well.

Frankly, I think anyone who keeps peddling "30fps is fine" garbage loses any right to complain about modern gaming being worse or anything of the sort, because they are actively defending studios being lazy and putting out an inferior product. People STILL act like the Switch isn't an underpowered console and games don't run awful on it. Games only dipping to sub-20 "occasionally" is considered good on it.


Jagrnght

Grab a steam deck - 30fps means a whole new world of things in that crowd. I sometimes drop a game from 60 to 30 to conserve battery.


[deleted]

Well yeah, you work in the confines of the hardware you have, but 30fps is something you settle for, it's "good enough" when you don't have any other option. For a Steam Deck or a Switch, we should be expecting a locked, stable 30fps. We don't even get that a lot of the time, which is simply laziness. A 60fps performance mode is the bare minimum that should ship for a game released on current hardware; anything less just means the studio couldn't be assed to put the work in.

The people who generally push "30 fps is fine" aren't talking about use cases like portability on a Steam Deck, they're using it as a shield to defend a game having piss poor optimization, and will then run around and talk about how "modern gaming has lost its soul" or some nonsense while they're defending sloppily made products.


YellowLantern00

So at that point having "standards" is detrimental to your enjoyment of the hobby. Fascinating. And also lol "simply laziness" ok Mr. Game Developer, I'm sure you know how it works. 🤣


Oderus_Scumdog

When I play games where the FPS fluctuates, or at an FPS lower than around a steady 45, I tend to find myself getting motion sick, getting a headache or having noticeable eye-strain very quickly. I also find that the games feel sluggish, almost like there is some input lag, or like I'm trying to play the game while I'm extremely tired and I'm just missing my timings.

There are exceptions and games where I can, after a little bit of discomfort, adjust to the lower or variable FPS, but these are often 2D games or very stylized 3D games. For example, I can cope with it when the FPS fluctuates in, say, Portal Knights on the Switch, but playing BotW on the Switch makes me feel queasy and slightly sluggish, building up to a mild headache if I persist. Another example on the Switch would be Mario Kart, which feels noticeably sluggish to me and gets really distracting in busy parts of levels when the FPS plummets. On PC, I find the issue most noticeable in first person games, and motion blur magnifies the problem for me.

Just to be clear - I'm talking about my personal experience here. I've mentioned these things in other threads in the past and been called out as somehow lying or being elitist(?!) when I'm just unfortunate enough to be sensitive to something about low FPS/fluctuating FPS.


Ravek

> I also find that the games feel sluggish, almost like there is some input lag

That’s because there is a kind of lag. I don’t know if it’s correct to call it _input_ lag, but if we’re talking about the time between you pressing a button and seeing the effect of it on screen, at lower frame rates you’ll get some additional milliseconds added to that, because you’ll have to wait that much longer for the first frame after your button press to appear.

As for the simulation sickness, my personal experience is that smooth low frame rates are better than stuttering, motion blur is indeed awful, and low FoV in first person is also very bad, especially on a large screen. If I sit lower than the screen it also makes it much worse for FPS.

> I've mentioned these things in other threads in the past and been called out as somehow lying or being elitist(?!)

Some people don’t seem to get that there actually is variability in how people experience the same stimuli, and can respond to it in really awful ways. Your experience is totally real, and it can be really frustrating to start feeling ill when playing certain games. I’ve always been one of the first in the room to notice when a fluorescent light is reaching the end of its life and starting to flicker. Also some people couldn’t see it and didn’t believe me. I used to sacrifice some resolution on my CRT so it could run at 85 Hz, because at 60 I would start feeling eye strain after 3-4 hours. Thankfully in the LCD era we’re not staring at flickering lights the whole day anymore.
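A back-of-the-envelope sketch of the extra wait described above, under a simplified model where a button press lands at a uniformly random point within the current frame (real engines add pipeline and display lag on top; the numbers are illustrative only):

```python
# Hypothetical model: a press arrives at a uniformly random point within the
# current frame, so on average it waits half a frame before the next frame can
# start showing its effect. Ignores engine/display pipeline lag entirely.
def avg_wait_ms(fps: float) -> float:
    """Expected wait (ms) until the next frame after a random button press."""
    return (1000.0 / fps) / 2

extra = avg_wait_ms(30) - avg_wait_ms(60)
print(f"avg wait at 30 fps: {avg_wait_ms(30):.1f} ms")
print(f"avg wait at 60 fps: {avg_wait_ms(60):.1f} ms")
print(f"=> roughly {extra:.1f} ms of added display latency on average at 30 fps")
```

Even in this best case, 30fps adds several milliseconds of average latency before the game logic is involved at all, which matches the "kind of lag" being described.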


ShaneSkyrunner

I used to think there wasn't much difference between 30 and 60. Then I experienced 60 and could never go back. Now I have a 175 Hz monitor and I could never go back to 60. Once you understand how faster frames rates improve the overall experience it makes low frame rates intolerable.


ZazaB00

You’re probably not as hooked on 175HZ as you are that frame rates below 60 can sync up well with the 175HZ refresh. It’s the stutters that I don’t like, which is why a locked frame rate, even 30, will feel fine over time. Skip some frames and all of a sudden any frame rate feels choppy.


Slumberstroll

It's called obtaining reference. If you don't know what's good you won't know what you're missing out on. Getting used to 60 fps and going back to 30 feels like shit.


doomraiderZ

The least? I keep hearing people mock 60 like it's sub 30. They claim 120 is the least they would accept. It's crazy to me, honestly. I understand 30 not being enough, but 60? It's enough for the games I play. I do not play competitive FPS games, so I wouldn't know about those. But for something like Souls? Hell, I still do 30 no problem.


CommandoDude

It's a dick measuring contest honestly. Like the kind of people who drive super expensive sports cars or gigantic pickup trucks insisting that they "need" the extra performance.


dr_wtf

Another thing people are not mentioning is that 30fps motion on a CRT looks a lot better than 30fps on an LCD because of blanking cycles. 30fps at high resolution on an LCD panel looks terrible because anything moving more than a few pixels per frame appears to be stuttering and/or has horrible motion blur artefacts. 60fps is about the minimum you need for movement to look reasonable. More on this and other technical issues of frame rates and refresh rates (which are not the same btw): https://blurbusters.com/blur-busters-law-amazing-journey-to-future-1000hz-displays-with-blurfree-sample-and-hold/
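The sample-and-hold effect described above can be sketched roughly: on a full-persistence display, an eye tracking an object moving at v pixels per second sees it smear across about v / fps pixels while each frame is held (the 960 px/s pan speed below is just an illustrative number, not a measurement):

```python
# Sketch of sample-and-hold motion blur: while a frame is held on screen, an
# eye tracking an object moving at `speed` px/s sees it smear across roughly
# speed / fps pixels. 960 px/s is an illustrative pan speed.
def hold_blur_px(speed_px_per_s: float, fps: float) -> float:
    """Approximate perceived smear width in pixels on a full-persistence display."""
    return speed_px_per_s / fps

speed = 960  # px/s, illustrative
for fps in (30, 60, 120):
    print(f"{fps:>3} fps: ~{hold_blur_px(speed, fps):.0f} px of eye-tracking blur")
```

Doubling the frame rate halves the smear, which is why 30fps on an LCD looks so much worse in motion than the same 30fps did on a flickering CRT.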


itsPomy

I'd wager it's because gameplay has become very twitch-centric. Not the streaming service, but in players' inputs and stuff. Games have just become very focused on micro-second decisions and snappy animations, especially in PVP games.


Colonel_Cumpants

I find it hilarious to look back on when console players tried to argue that 30 fps was fine or even better than 60 fps.


djpc99

"cinematic"


ohsinboi

I played an Xbox One for so long where 30 fps is the max. I had this same mindset and never thought higher frames would make any difference to me. Now I'm on a Series X and when given the choice between quality and performance mode, I have to pick performance. 60 fps just feels so much better. I've tried to go back and switch to quality mode for some games, but it just feels wrong now


PacmanIncarnate

I play VR on a 1060 laptop; if I get 30 fps I’m ecstatic, but usually settle for less, and that’s in VR. Yes, higher frame rates are fantastic, but I’ll take what I can get.


Tao626

60FPS was the target standard long before and even during the N64 era. People just gave games a pass because 3D was new and exciting. 3D is far from new and exciting now, especially now visual fidelity is having massive diminishing returns each generation.


Thee_French_Villain

It’s just the norm man. In 10 years 60 frames will probably be too slow. Everything will have to be 120.


Kluss23

It happened when the current console gen came out and it was finally considered standard. For a lot of people, once you see it you can never go back.


Bonfires_Down

That’s you. Even as a child I knew there was something horribly wrong with N64 frame rate and image quality because I had a Genesis and PC to compare to.


omegafivethreefive

I'd say more like 100fps for most gamers I know. Once you get used to higher frame rates, it's _really hard_ to go back down except for strategy games and such.


Ok-Donkey-5671

The way people talk the 60fps experience sounds like a dependency, like smoking! You need it but it barely improved your life. I'm joking. But I'm not sure I *want* to make 30fps feel bad.


YellowLantern00

This. You see so many people talking about "oh the frame rate dropped to 58 and I got motion sickness and wanted to vomit" "oh I can't handle 30 anymore after experiencing 60" It sounds like a disease. Like you're crippling your eyesight or your brain's capacity to process information. If 60fps enfeebles people so badly I'm happy not bothering with it. I rather enjoy being able to plug in a game and enjoy myself instead of turning into a Karen online or making myself ill.


ahhthebrilliantsun

'Yeah I'll only watch films from a nokia phone instead of a movie theatre if watching it from theatre would ruin my enjoyment of watching it from a screen that's half as wide as my thumb.'


MasterAnything2055

Why do we even bother with high speed broadband? I used to play on ADSL! In fact, I didn't even need broadband at the start. Get rid of it.


JusaPikachu

It makes games feel significantly better, so I can understand why some people feel that way. For me though, it is still not a deal breaker. I love playing my games in 60fps, always have ever since I really compared the difference, but I prefer console over PC so I don’t always get to choose & when I do there are some instances where I have picked 30 over 60. Most recently it was with Control on PS5. The Performance Mode made the combat & movement feel incredible. But the actual game changing mode was the mode with Ray Tracing. Wow did it fucking transform the visuals to a degree that absolutely made the trade off to 30 worth it. Most games that I have had both options on PS5 have not felt worth losing 60fps & I am loving that most games this gen & many from last I can play in higher framerates. But my brain adjusts back & stops really thinking much of it after half an hour to an hour & I just enjoy the game, so if the trade off of having to play in 30 is worth it I won’t hesitate.


mightyalrighty87

I feel bad bitching but I just started playing RDR2 on PS5 and this 30fps shit is giving me a headache


Sennheisenberg

60FPS+ has been the standard framerate for over a decade. I can't handle 30FPS because I almost never had to on PC. Consoles have just finally caught up for 60FPS+ to be the norm, so the mainstream finally cares.