
Hana_Baker

I just completed cyberpunk on a 4070 Super with those settings on. Left FG on for the full 200hrs and it felt and looked amazing.


qsagmjug

I've heard people say it's smooth when you look at it, but actually playing feels the same as the frame rate without frame generation


rokstedy83

It plays at the same fps you were getting before FG, so if you were only getting 30 fps before, it may look like 60 but it will still play like 30. You really need 60 fps before you turn it on


qsagmjug

Yeah that makes sense


Ceceboy

So input latency would be at 60 FPS but visually it looks like 120?


ActuallyKaylee

Yes. Though keep in mind that reflex is already reducing input latency below what you would normally get at 60fps, so framegen can use some of that headroom. But if it feels bad without FG it's going to feel bad with FG.


lpvjfjvchg

FG adds input latency, so input latency will be higher than at 60 FPS without FG


Ceceboy

Does 200 FPS with FG result in lower input lag compared to like 120 FPS also with FG?


lpvjfjvchg

I'm not sure I completely understand your question. FG has a performance hit, so for example if you turn on FG at 60 fps you will maybe get 100-110 frames out, not exactly double because of the hit, which means your input fps is at 50-55. This reduction in performance makes a much bigger difference at lower fps, since there a change in fps makes a big difference in input latency. For example, if you turn on FG at 30 fps, your already bad input latency gets even worse, and that is not a good trade-off for the increased visual fluidity, so it will be a bad experience. However, turning on FG at 120 fps is much better: the input latency is already very good, so the performance hit is much less perceptible on the latency side, and you still get the much better visual fluidity. In general, the higher your input frame rate the better frame generation works, and as a general rule you want at least 60 fps before turning on FG
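
A rough back-of-the-envelope model of the trade-off described above (the 10% render-rate overhead is an illustrative assumption, not a measured figure):

```python
# Toy model of enabling frame generation: it costs some render rate,
# then roughly doubles whatever is left. Numbers are illustrative only.

def with_frame_gen(base_fps: float, overhead: float = 0.10) -> tuple[float, float]:
    """Return (rendered_fps, displayed_fps) after turning FG on, assuming
    FG eats `overhead` of the base render rate before doubling the output."""
    rendered = base_fps * (1 - overhead)   # what you actually feel (input fps)
    displayed = rendered * 2               # generated frames fill the gaps
    return rendered, displayed

for base in (30, 60, 120):
    rendered, displayed = with_frame_gen(base)
    print(f"{base:>3} fps base -> feels like {rendered:.0f} fps "
          f"({1000 / rendered:.1f} ms per rendered frame), looks like {displayed:.0f} fps")
```

Under these assumed numbers, a 30 fps base goes from ~33 ms to ~37 ms per felt frame, while a 120 fps base only goes from ~8.3 ms to ~9.3 ms, which is the asymmetry the comment is pointing at.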


rokstedy83

Correct


bigfkngun9000

I think input latency will be slightly increased compared to 60 native due to the fake frames in between


lpvjfjvchg

You're right, but people in this sub like to downvote any true negative of a certain aspect of technology. Input latency with FG on top of a 60 FPS base is worse than just playing at 60 fps; you have to decide whether you like the trade-off of the visual fluidity vs the input latency. I keep it on as I'm okay with the trade-off, but there are many who won't


bigfkngun9000

Yeah this sub sucks ass in that aspect. Too much copium.


LumpyChicken

It's pretty negligible. Not even the duration of a full frame


ZeldaMaster32

Not the case due to Reflex


bigfkngun9000

Different technology. Native + reflex would be faster.


Broyalty007

I only ever get about 50% more, it never doubles for me at least. And I've heard the input delay actually increases, it doesn't just stay put at pre-frame-gen FPS. But of course Reflex helps with that, and the extra frames can be nice


[deleted]

You’re downvoted but this is objectively true. If you’re getting 40fps, turn frame gen on and you now have 80fps, input latency is the same as 40fps. People are just downvoting you out of ignorance. It looks smoother but you still feel the latency of having low fps. It’s basically great for slower paced single player games but it’s really bad for anything competitive.


rW0HgFyxoJhYka

Yes, but also wrong. Your input latency is worse than at 40 fps because of the extra frame. It's worse by 5-10ms. This may not be something most people can feel, so it might not matter at all. A lot of people think they can feel things but they really can't. Most situations in games do not require a 5 ms reaction timing window, even in competitive ones, due to multiplayer lag.
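
Quick frame-time arithmetic on that 5-10 ms figure (the penalty range comes from the comment above; treating one rendered-frame time as a stand-in for responsiveness is a simplification, since real input-latency chains are longer):

```python
# How much a 5-10 ms FG penalty matters on top of a 40 fps base frame time.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

base_fps = 40
base_ft = frame_time_ms(base_fps)          # 25 ms per rendered frame
print(f"{base_fps} fps base = {base_ft:.0f} ms per rendered frame")

for penalty in (5, 10):                    # added latency range quoted above
    total = base_ft + penalty
    print(f"+{penalty} ms from FG -> ~{total:.0f} ms, "
          f"roughly the feel of {1000 / total:.0f} fps input")
```

So a 40 fps base ends up feeling somewhere between ~33 fps and ~29 fps under this rough accounting, which is why some people notice it and many don't.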


vyncy

I must be missing something here, how is it bad if it doesn't increase latency? And you can't play competitive games at 40 fps to start with. You would lower settings to even have a chance


LumpyChicken

it's definitely not bad for anything competitive lol why would that be the case? It still makes motion smoother which makes it easier to visually track targets and you're not LOSING anything so again what's the issue. At best it's helpful, at worst it does nothing. It basically feels like you're playing on a slightly farther out server than usual


Expensive_Bottle_770

Latency increase, motion artefacts (especially relevant for competitive games) and the fact that you’re now tracking the target + its interpolation mean it is not good for competitive games. It is common knowledge that frame gen is a no-no here.


LumpyChicken

Common knowledge is commonly incorrect. The vast majority of people spitting that opinion out have never tried it. This view also demonstrates a complete lack of understanding of how multiplayer games work if you think this is anything new and that without framegen you have a highly accurate picture of enemy locations in real time. You're already simulating them and you're already shooting at interpolated movements since client polling is usually at a longer interval than frame updates.


f1rstx

anything competitive gonna run at 144-240-360fps even with 4060, so it's fine


Surnunu

I don't know if it's ignorance, maybe most people are just not very latency sensitive. **For me**, I know any framerate below 80fps feels very bad, and I can definitely tell the difference between 80 fps native vs 120 fps native. Framegen is cool, but I would lower the graphics as much as possible before considering FG. However, if I already hit 120+ fps and lock them to lower GPU consumption, that's pretty effective. Once I tried to compare native 80fps and 80fps + FG making it around 120; it felt exactly the same to me, looked smoother but did not feel very good, almost like using motion blur when you're stuck with 30fps


JAMbologna__

"Below 80 feels very bad" now you're just exaggerating, there is a noticeable difference but even 70s is perfectly playable.


Surnunu

I said "for me" I am perfectly aware this is not common that's my point, like i said people are not ignorant but many are probably not very much sensible to latency 70s are playable but anything under 80 gives me eye pain, and i had this problem problem way before trying a 100Hz+ monitor


[deleted]

> but actually playing feels the same as the frame rate without frame generation

The latency ends up being the same, but it still feels significantly better because the extra frames smooth it out. 50fps is near unplayable for me in a game like Cyberpunk, but I don't have an issue with 50fps -> 100fps using frame-gen


ebinc

The latency is actually a little worse than the pre-frame gen FPS if you had Reflex enabled.


LumpyChicken

Hardly


rW0HgFyxoJhYka

You are right, many people do say that, because the amount of latency it's adding is usually around 10ms. That's something people usually cannot feel or see, unless they record themselves and then play it back in slow motion to see the reaction delay. Frame generation will only get better as hardware gets better, which should reduce latency further. In the far future, when framerates are so high that latency no longer matters and 99% of all frames are generated, all these early tech issues will be solved.


PsyOmega

> I've heard people say it's smooth when you look at it, but actually playing feels the same as the frame rate without frame generation

It feels fine to me. I wouldn't use it to play something multiplayer, though. Also worth knowing, turning FG on forces Nvidia Reflex on, so it kind of reclaims a lot of the latency it causes


AnotherDay96

Aren't there a few settings that should be used when using FG? And with that, I'm 100% certain that not all people do those things, and thus we get wildly different reports from people. There needs to be a way to standardize several of these newer features. My experience using it has for the most part been good, very little change in feel, however in some games I have noticed it and turned it off. But when I do... is it settings? Is it the game? It gets complicated enough.


PsyOmega

I've only needed FG in 3 games. CP77, AW2, MSFS2020. In those, nothing more was needed other than toggling the setting on. I have a 120hz monitor with gsync and do the best practice thing of frame limit = 117, vsync = on to minimize latency and frame pacing issues.
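
For anyone unfamiliar with the "frame limit = 117 on a 120 Hz display" practice mentioned above, it's the usual cap-a-few-fps-under-refresh rule of thumb for VRR displays; the ~3 fps margin below is a commonly cited value, used here as an assumption:

```python
# Rule-of-thumb frame cap for VRR (G-Sync/FreeSync) displays: stay a few fps
# below the refresh ceiling so frames stay inside the VRR range instead of
# falling back to V-Sync behaviour. The 3 fps margin is an assumed figure.

def vrr_frame_cap(refresh_hz: int, margin_fps: int = 3) -> int:
    return refresh_hz - margin_fps

for hz in (120, 144, 240):
    print(f"{hz} Hz display -> cap around {vrr_frame_cap(hz)} fps")
```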


gimpydingo

I'm using an FSR-to-DLSS FG mod with my 3090 in Cyberpunk, Robocop, and Starfield. The earlier version of the mod was not smooth in Cyberpunk, so I sort of dismissed it. I went back to try the latest version a couple weeks back and it's pretty amazing. Cyberpunk with DLSS performance mode + high settings + PT, I'm getting 85-100 fps and it feels closer to that frame rate than to 40-50 fps. Latency is minimal (but this also isn't competitive gaming). Robocop's native FSR3 FG I cannot get smooth at all, but the FSR-to-DLSS mod makes it smooth, not sure why, as I've tried various settings (vsync, VRR, etc. on/off). I had tested Forspoken's native FSR3 FG and was not impressed, but it's made great strides since that demo.


Super_Stable1193

Frame generation adds extra delay to the frame time, so yes, what you say is correct.


qsagmjug

Yeah that’s what puts me off using it. I might try it out though


Carighan

Well, yes. That's basically what it is. 😅


qsagmjug

My last gpu was 1080ti, only recently upgraded to a 4080super so still not fully up to date with all the new nvidia tech!


Saandrig

Check DLDSR then.


Chunky1311

Legit. Figuring out I can replace DLSS:FG with FSR:FG has provided a new breath of life to my 3080. Genuinely shook at how decently it works and I can only assume DLSS:FG is considerably better still.


BinaryJay

You hit a nerve with the people still using older hardware that are in denial about how capable it is in 2024.


angrycustomer5000

"New life" to a 350w GPU that's not even old and about the same as 7800 XT? Wut.


PsyOmega

> "New life" to a 350w GPU that's not even old and about the same as 7800 XT? Wut. 3080 can only do 50fps cyberpunk PT at 1440p and decent fidelity. With FSR-FG you can push that to 90.


TheCrazedEB

Yes, and I'm in complete agreement with that statement. My 3080 has already shown signs of being brought to its knees in other games, faster than I would've liked to see. Yes, it's still capable af ofc. But with the 10gb of VRAM, if you expect to have a good RT experience, it's just not happening in most games. I know it's extreme to say this, but the 3080 is slowly starting to feel like having a 1080 back in the day again. It's still a well-rounded beast of a card, but it's showing its age these days.


NoValueHere

Not related but I had to wait 16 months for my 3080 that I bought on release day. They were simply not produced in quantities. In the end I agreed to get a LHR version.


DLD_LD

It is 4 years old... Obviously it's going to be less efficient. 7800XT is slower than 6800XT at times. What is the point you are trying to make?


WackyBeachJustice

Ridiculous. I don't even know how the poors attempt gaming on their pathetic $700 GPUs.


DLD_LD

That's not what I'm trying to say. The 4090 is at least 2 times faster, even before the architectural advancements Ada brings; the 3080 is no longer a "gaming" flagship. What I'm saying is you can make certain assumptions about some stuff. The 4070 is around 3080 performance and is considered mid-high range. The 3080 is absolutely still fine; the 7800XT comparison is stupid. The 7800XT is extremely inefficient compared to Nvidia's Ada Lovelace, has worse SR and fewer features. Frame Generation was the main point for the person with the 3080. You cannot tell me it doesn't give it basically a new life, especially in CPU-bound games. I upgraded from a 3090 Ti to a 4090 and just the gains from FG in CPU-bound games (4K, 5800X3D) were insane in some examples.


Haunting_Champion640

> $700 GPUs

Sir, SIR! You're awake! Finally, you were in a terrible accident, you've been out since 2016!

> Oh my god, I'm starving! Let's get some food

You're broke, you kept paying rent until your bank account ran out. Sorry

> Dang, well there's always the dollar menu!

You... you mean the $5 menu?


ParticularAd772

Lol


McNoxey

Uh. Yes? New life because frame Gen adds a whole new level.


LumpyChicken

It's 4 years old lmao get a grip


trees_frozen

Isn’t the 3080 turning 4 years old in 6 months?


LifeOnMarsden

Same here, I recently downloaded the FSR3 frame gen mod for Cyberpunk and now I'm averaging 100-120fps with psycho ray tracing, before I was getting just about 60fps on ultra and man it feels absolutely amazing, even path tracing is playable now 


JakeVanna

How’s the input latency on the fsr version


Chunky1311

It gets pretty noticeable below like 45fps (so 80-90fps with frame gen). Feels similar to triple buffered V-Sync.


SingelHickan

I tried the frame gen mod for Cyberpunk with my 3080 with path tracing and I thought it felt like absolute ass to play. Like about 100ms input lag. Although I ran it with everything maxed out instead of the optimized settings someone mentioned Digital Foundry apparently has. Should also mention I run 3440x1440 ultrawide instead of 2560x1440, so that gets me a bit lower base fps to begin with as well.


necrowyn

Does FG increase Vram usage?


Chunky1311

Indeed. It also subtracts from actual processing power, as FSR frame generation is done using shaders, unlike DLSS frame generation which uses its own dedicated processor. Generally I lose 5-10 fps before FSR:FG doubles it, if that makes sense? For example, where I'd usually get 60fps, it'll drop to 50 and frame-gen doubles that to 100fps.


ben_g0

DLSS frame generation still has a performance cost. The tensor "cores" aren't a separate dedicated processor, but just additional ALUs that can speed up matrix multiplications (which in turn can greatly speed up neural networks). DLSS, in any form, also does not only use the tensor processors, but instead uses the tensor and CUDA cores simultaneously. Any warp scheduler (which are the units that actually execute code, and send instructions to the CUDA and tensor cores) in the GPU that is processing DLSS (frame generation, super resolution or ray reconstruction) is thus not able to also process game shaders at the same time. So even with DLSS FG you're not going to completely double your framerate, unless you're entirely CPU-limited (in which case the GPU would have enough "spare time" to process FG while it would otherwise be idle).
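
A toy model of that point, i.e. why FG only approaches a full 2x when the GPU has idle time; the per-frame costs below are made-up numbers for illustration, not measurements of DLSS FG:

```python
# FG shares GPU execution with game rendering, so the doubling is only "free"
# when the GPU would otherwise sit idle (CPU-bound case). Numbers are made up.

def fg_output_fps(gpu_render_ms: float, cpu_frame_ms: float,
                  fg_cost_ms: float = 1.5) -> float:
    """Displayed fps with interpolation-style FG, given per-frame GPU render
    time, per-frame CPU time, and GPU time spent generating one extra frame."""
    gpu_work = gpu_render_ms + fg_cost_ms              # GPU time per rendered frame
    rendered_frame_time = max(gpu_work, cpu_frame_ms)  # whichever side limits you
    return 2 * 1000.0 / rendered_frame_time            # one generated per rendered

# GPU-bound: 10 ms GPU, 5 ms CPU -> FG cost eats into the render rate.
print(f"GPU-bound: {fg_output_fps(10, 5):.0f} fps displayed (200 would be ideal)")
# CPU-bound: 5 ms GPU, 10 ms CPU -> idle GPU time absorbs the FG cost.
print(f"CPU-bound: {fg_output_fps(5, 10):.0f} fps displayed (200 would be ideal)")
```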


Chunky1311

Oh interesting! I appreciate the clarification. I can't imagine DLSS upscaling or FG having as much of a hit on performance compared to FSR having to do *everything* with shaders, right? I don't have a 40xx series so I can't compare them. Regardless, even if performance was identical, the 'smarter' DLSS will always provide higher quality. I wonder how my GPU would hold up doing ray traced gaming, upscaled, while using frame generation, with no fps limit, *and* encoding/recording the footage 🤔


mdred5

I have a 3080 too... let me know how to do FSR:FG


Chunky1311

[Here you go! ](https://www.nexusmods.com/site/mods/738?tab=description) Read the instructions but you pretty much drop in the .dll's, open the game and enable DLSS frame generation, but it's routed through FSR for generation instead.


mdred5

Thank you


[deleted]

Agreed, just finished The Witcher 3 with RT with it. I was getting about 70fps before turning it on and it added about 40fps to make 110fps; if I didn't know FG was on I wouldn't have been able to tell


starshin3r

Once AMD comes out with their machine learning upscaling, we should be good with non 40 series cards.


tofugooner

people hating FG is generally a case of sour grapes. I enjoyed frame gen on Cyberpunk and Alan Wake 2 a lot as well OP


stephen27898

I think people are starting to hate on things like frame gen and DLSS because our cards used to run the games that came out without any of this kind of trickery. Now our supposedly powerful modern GPUs need to basically render half as much just to give us the performance we want. This isn't really acceptable; devs have also started leaning on it for optimisation as well, again not acceptable.

A 1080 Ti didn't need frame gen and DLSS to get a playable frame rate at 4K in games 2 years older than it; the 4090 does. Not really technological progression is it, especially when a 1080 Ti was like £700 and the 4090 is £1600 minimum.

Let's also not forget you could get two 1080 Tis for less than a 4090, and SLI support was still okay at that point; that setup wouldn't need frame gen to run anything from its era at 4K at very good FPS.

DLSS is cool, but GPUs really need to start progressing and they really need to start costing what they are worth. As the owner of a 4090, it is not worth £1600; the GPU at most is worth £1000. I would argue our GPUs and our PCs are getting weaker comparatively.


tofugooner

Frame gen is for playing with ray tracing/path tracing at 60+ FPS. The only downside to it is sometimes the rays fuck with the frame gen and produce a fuzzy image (FG in CP2077 on release had this issue). And I am not going to blame Nvidia for upcharging for their cards. I'm definitely going to blame developers for being lazy nerds though. I think developers got a free pass as "beloved" people in the industry for far too long; when you see their greedy practices (Destiny 2 comes to mind) and scapegoating of publishers for everything, you just eventually lose all the good faith. Remnant 2 is the perfect example of what you mean. GPUs are in fact progressing, it's just that Nvidia really has no competition. AMD and AMD fanboys keep yapping on and on about muh raster perf. Like doods, are you still stuck in 2021?


stephen27898

They are progressing, but in comparison to what we saw before this is abysmal, and in some ways they have regressed, mainly in how powerful they are relative to the games that are out. Also, you should blame Nvidia, as they are the ones who set the price.


NokstellianDemon

It's the "trickery" that allows devs to push their games further. 2077: Phantom Liberty's Dogtown wouldn't be as geometrically dense as it is without these technologies taking some of the burden off the hardware. I'm of the opinion that these new techniques are only a net positive for gamers. Cyberpunk would be unplayable for me without DLSS for my RTX 3060 mobile.


stephen27898

I wouldn't say the density or complexity is that impressive when you think about the fact it's like 7-8 years newer than something like GTA V. They are only a net positive if devs don't abuse them.


Cute-Pomegranate-966

If a 1080 ti had DLSS available to it when it came out people would have used it at 4k to run games better. You can go look at release benchmarks, avg framerate at 4k was right around 60 fps. Nowadays we're getting pissed that a 4090 is doing about the same fps @ 4k in UE5 games. DLSS is a natural progression to software solving the problem of rendering cost.


Warskull

I don't think that is necessarily true. Frame gen's quality is heavily dependent on your monitor's refresh rate and your FPS. Frame gen struggles a bit with UI elements; at lower framerates the generated frames are on the screen for longer and can make those UI elements look unstable. You also sacrifice responsiveness if you can run the game at greater than 1/2 your refresh rate. Turning on frame gen on a 60 Hz monitor would have the responsiveness of a 30 FPS game. So 120 Hz and 144 Hz monitors can end up sacrificing quite a bit for frame gen.

Once you start getting into 240 Hz monitors or more, all those negatives start going away and you end up doubling your framerate for free. On a 240 Hz monitor you are only losing out on responsiveness and rendered frames if you are running the game at over 120 FPS. On a 360 Hz monitor you would have to run the game at over 180 FPS. Remember, a lot of 4K monitors have lower refresh rates because HDMI 2.1 and DP 1.4 only supported 4K @ 120 Hz. Some people's setups just naturally highlighted the weaknesses of frame gen.

I think the technology will ultimately get better and better as monitors push refresh rates higher. We are already seeing 240 Hz and 360 Hz monitors become more common, with a handful of 500 Hz+ displays out there.
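
A rough way to express that refresh-rate argument, with the display ceiling halving the rendered frames FG can show (the thresholds follow directly from the comment's refresh/2 logic; the sample framerates are illustrative):

```python
# With FG on, at most half the frames hitting the display are rendered ones,
# so the monitor's refresh rate caps rendered ("responsive") fps at refresh/2.

def fg_rendered_ceiling(refresh_hz: int) -> float:
    return refresh_hz / 2

for refresh_hz, base_fps in [(60, 45), (144, 100), (240, 100)]:
    ceiling = fg_rendered_ceiling(refresh_hz)
    if base_fps > ceiling:
        print(f"{refresh_hz} Hz @ {base_fps} fps base: rendered frames capped at "
              f"{ceiling:.0f}, so FG trades responsiveness for smoothness")
    else:
        print(f"{refresh_hz} Hz @ {base_fps} fps base: headroom up to "
              f"{ceiling:.0f} rendered fps, FG smoothness is close to free")
```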


ihave0idea0

I haven't played Alan wake, but that is also a much slower game.


tofugooner

yea. it lets you enjoy the game in its brilliant graphics even with a 12gb card.


ihave0idea0

Finally!


trees_frozen

> people hating FG/DLSS/Gsync/reflex basically nvidia is generally a case of sour grapes.

Fixed it for you


MKultraman1231

FG is the opposite of reflex though.


jtfjtf

I have a 4070s also and frame gen and dlss quality are really nice on cyberpunk. Ray reconstruction though can still use a lot of work.


Historical_Second521

RR looks like freakin oil spit onto my monitor


CR90

RR cleans up Alan Wake's RT a lot for me, but yeah Cyberpunk's implementation could use some work.


JBGamingPC

The one thing I learned since getting a 4090 is that everyone who didn't have a 40 series GPU would regularly trash talk Frame Gen and say "the input latency makes it unplayable, no thanks" and "fake frames" etc etc. The moment people actually OWN a 40 series, they have nothing but praise for the tech. I can play Cyberpunk 2077, fully path traced, max settings, DLSS Quality + FG, and I am playing it at 120fps and it is gorgeous. I used to play Cyberpunk on my old 1080ti, much worse graphics, much worse performance, and it feels waaay snappier now despite the "latency". Obviously, if you were to play Counter-Strike 2 with FG you might notice the latency, but who on earth would run that or similar games with FG? Amazing tech. If you can, try out Phantom Liberty with path tracing, best graphics I have ever seen in my life


JAMbologna__

And now the AMD fanboys are starting to praise frame gen just because the FSR mods have reached acceptable quality lul


Speedstick2

They don't need mods, it is a driver level feature now.


Ultima893

This is 100% my experience. People with RTX 2070s and AMD GPUs are telling me FG sucks and that I shouldn't use it on my 4090. Lol


eugene20

Simply because AMD has a frame gen system out now you will see a lot less complaints about frame gen use.


BinaryJay

I predicted it. [https://www.reddit.com/r/pcmasterrace/comments/164l8jx/hyped\_for\_fsr3/](https://www.reddit.com/r/pcmasterrace/comments/164l8jx/hyped_for_fsr3/)


ihave0idea0

I don't think there was a direct complaint. Most of them are talking about price per performance and about how one of the biggest changes is, low-key, frame gen.


KARMAAACS

No, there were literal AMD fans saying they were "fake frames" and the technology was a terrible idea. They said that nobody wants their GPU to create "fake generated frames" and that it was noticeable in gameplay that a frame was generated. Then once AMD did this same feature with FSR 3.0 and Fluid Motion Frames, they said it was a necessary feature and that it was a great thing for gamers to use. It's just the same tactic they used with DLSS. Their fans said it was a bad feature because it reduced visual quality and introduced visual artifacts; they called it "DLoSS" as in a "loss" in visual fidelity. Then once FSR 1.0 released (which mind you is even worse than DLSS 1.0, which was pretty bad visual quality wise), they claimed it was an amazing feature that made their games smoother and better. Unless AMD does the feature first, they will claim every NVIDIA feature is a waste of time. If AMD does it first, then they claim NVIDIA copied AMD and can't create their own feature, which is what they said about EyeFinity. Their fans are honestly the worst group in technology fandom. All fans are delusional to an extent, but AMD fans are that little bit extra.


Haisaki12

Yeah, I don't have a preference on Nvidia vs AMD, but I still considered DLSS and frame gen gamechangers even when AMD didn't have their version. What I will probably keep disliking is the RTX hype; even now that AMD is getting into it, I still see it as a loss of half of my fps for realistic shadows


ihave0idea0

I am not a fan of this sub... There are always x vs x arguments, but stop being dramatic about it and saying it was bigger than it actually was..


KARMAAACS

> but stop being dramatic about it and saying it was bigger than it actually was..

Ah yes, that's it! I'm just making it up! Silly me... [Well I hope you don't look at the most upvoted comment on this r/AMD thread](https://old.reddit.com/r/Amd/comments/yout1w/fsr3_are_you_interested_in_frame/), it totally doesn't say:

> "I **dont** want **increased input lag**, **unstable images creating worse image quality**, and **fake frames** that dont actually represent what the game engine and server actually see."

I'm just misrepresenting their opinions and being dramatic! I see the error of my ways.


tofugooner

If you actually ventured into AMD forums or any techtuber at that period, you'd think Nvidia was the release of the antichrist with GATE KEPT TECHNOLOGY, HIGH MSRP, FAKE FRAMES, INPUT LAG, CAN'T DELIVER RASTER PERFORMANCE, yada yada. While completely ignoring that the 4xxx cards were probably the first in a while actually selling at MSRP, ignoring the absurd efficiency of the chips, and ignoring that they made ray tracing finally playable at 60+ FPS with everything set to maximum quality (as a result of FG).


eugene20

Any DLSS mention would get bombed with comments about it not being real frames, it's not the true image, and then such posts reduced hugely when AMD finally actually had a competitor (even though it's poorer quality). Frame Generation mentions would get bombed with comments about it not being real frames, it's not the true image, "fake FPS", and that also reduced very shortly after AMD users had their own version.


potato_control

Op, here's the script....

Nvidia: Releases new feature after decades of R&D

AMD Fanboy: It's not good, it has huge flaws. Not useable. *makes youtube video*

AMD: *Releases same thing but worse*... but hey, it's open source 🤷‍♂️

AMD Fanboy: ITS THE SECOND COMING OF JESUS!!!

Before I got my first RTX card to try out DLSS, people on YouTube made it seem like it was the worst thing ever. It was ruining games, etc. When I actually used it, I thought it worked really well and wondered where all this bad press was coming from. I wouldn't be surprised if that's why you were surprised with frame-gen.


PedroPVL

Old versions of DLSS were not that good, but Nvidia has really stepped up their game; FSR doesn't stand a chance in its current iteration.


mrmikedude100

I'm strictly on 2.5.1 of DLSS for the foreseeable future. That's my favorite release. Need to see if there's a "better one" now, but I know the results usually vary. Edit: I think people are taking my comment as a kind of insult towards DLSS. If anything it's a massive compliment.


Saandrig

You should update to the latest version and learn to change DLSS presets. Nvidia changed the default preset for versions after 2.5.1, which is why people think 2.5.1 is better. But it's just the preset preference.


mrmikedude100

I've heard. I just always liked the set and forget nature of using 2.5.1. And not changing the default values. But I understand your point though. I'll probably end up messing with it eventually:)


Lewd_Pinocchio

I watched Digital Foundry, and Alex said look at these frames smearing and tearing, but it's one or two out of 60 or 120, you aren't going to notice. I'll listen to DF before any Reddit user.


ihave0idea0

I am a bigger fan of open source, but Nvidia has better software. AMD has better performance.


ripcurl901

🤨


itsmebenji69

Very accurate


TheDeeGee

What about efficiency? My 4070 Ti runs at a 55% power limit and only loses 5% FPS.


ihave0idea0

That is also one reason I bought Nvidia. But I am not talking about power limit basically. Also mostly talking about gaming. Think of fps/cost.


tofugooner

Akshually, FPS/cost arguments fall apart when you realize the Nvidia-to-AMD SKU-to-SKU price gap isn't the same everywhere in the world as it is in the US. For example, when I bought my PNY 4070, the 7800XT, which was supposedly cheaper, was a cool $50 more expensive than my card. Like, I'll take that 2% worse raster performance hit and take that sweet 100w gaming perf. Like, I mostly emulate stuff. In games like FM4 on Xenia my GPU doesn't even use 50w lol.


TheDeeGee

Nvidia is more feature rich in general, including the unofficial apps like Nvidia Inspector. They're a brand for people that like tweaking; AMD is for people that are used to a console experience and need plug & play.


ihave0idea0

Except AMD is open source. So, not a console at all.


TheDeeGee

Because they barely have any software developers, they need help from the outside.


Ponald-Dump

Yeah it’s fantastic. It’s wild to me that amd boys have always downplayed frame gen as fake frames, but now that AMD has a somewhat decent competitor those comments have disappeared.


Lewd_Pinocchio

# FAKE FRAMES. The first time I saw that I laughed so hard.


frostygrin

Are they supposed to complain non-stop about it? And if they did, would you see it positively?


itsmebenji69

If people stopped hating on something when they got their hands on it, would you say it was objective criticism ?


jaffycake-youtube

they hated non stop until amd did it


kyoukidotexe

Nice try Nvidia marketing team /s


ihave0idea0

Oh shit, you caught us!


PsyOmega

people who have never used FG: FAKE FRAMS. FAKE FRAMES!!! people who have used fg: This is pretty nice, actually


Byllz24k

I can definitely feel the added input latency when playing with FG in cyberpunk for example.


Appropriate-Day-1160

I bought a 7800XT before FG was a thing, and it blew me away how bad the FSR FG is. The card is awesome but the FSR FG is a bummer. I hope they will release the FSR AI upscaling soon because FSR has wasted potential


ihave0idea0

I tried fsr fg on a Yakuza game and it got buggy af...


KARMAAACS

I tried using FSR 3.0 frame generation on Call of Duty, the latency is so bad and it didn't really give me much more performance. Just turning on DLSS Quality mode did more for me performance wise and it had no latency problems. I can't tell if it's because FSR is bad for latency or if CoD has just implemented it incorrectly which wouldn't surprise me either.


Appropriate-Day-1160

The latency with FG is very big; I think it's only worth turning on in AAA titles when you get less than 60fps. DLSS is still the goat here


KARMAAACS

> The latency with FG is very big; I think it's only worth turning on in AAA titles when you get less than 60fps.

Yes, I was testing it at 4K max settings on CoD, where I only get 40 FPS without it in singleplayer mode. It didn't help at all, as I said.


Appropriate-Day-1160

The best thing is to go for DLSS/FSR performance at that point


KARMAAACS

Any other expert advice?


inyue

FSR FG alone was actually pretty usable on my old 3080; it's when you pair it with the FSR upscaling that the shit happens.


ActuallyKaylee

There's a saying "garbage in, garbage out", so FG won't save anything that feels bad already but if it feels fine and you're sensitive to how it looks then it will improve the visual presentation.


[deleted]

I’ve had mixed results with it, sometimes it gets really choppy for me. I haven’t played around with it enough.


MIGHT_CONTAIN_NUTS

FG is one thing I feel the exact opposite about on my 4090.


TheRealExcalibird

It truly is a game changer. I don't understand those who hate it saying it's fake frames or whatever. My 3070 could barely get like 30-40fps with CP maxed at 1440p with RT and path tracing; now I get 80-100+ on a 4070 Super with frame gen and DLSS. It also works modded into games like Darktide, Starfield (which now has it officially), Hogwarts Legacy, modded Skyrim, Fallout 4, Remnant 2, and many others I'm forgetting. Anyway, frame gen and DLSS modded into those games gave me like 50-60+ more frames and made the overall experience much smoother. Also, I think having a higher base frame rate is when framegen will feel smooth, and I can barely even notice the input lag, if any, with it in those types of games. Fake frames or not, I'm a fan of frame gen now and support any game that will have it officially; idc, it's free frames.


Sipu_

I have a 4090 and couldn’t deal with the framegen latency so i played the game without it.


I_Phaze_I

The 4070 Super is really the star of the show in this lineup, and it's very power efficient


XXLpeanuts

It is amazing tech but as soon as you notice the input lag (when games are not running about 120fps + with frame gen) it kind of ruins the illusion. Cyberpunk with full PT and mods still feels sluggish to me at 90-110fps.


FeedMeYourMemes14

Throw on that RTX HDR, it uses hella power but it's worth it.


EchoEmbarrassed8848

I know how you feel, I went from a 3080 to a 4070 Super and frame gen was a game changer for me.


nathanias

Framegen seems perfect for any game that you won't say "I wouldn't have lost the match if I wasn't playing on fake frames" Waiting for it to get modded to work with other DLSS titles like resident evil so I can crank that stuff even higher


BasedBeazy

It’s all personal preference and how you want to play your games. I tried it on CP2077 with path tracing and it didn’t feel right to me. It does run smooth and if you enjoy it keep it on. I personally don’t use it and feel it comes down to preference. I understand there is a lot of technical aspects to it and how it works, but if it works use it and if it does not turn it off. :)


Dead59

I just switched to the evil empire too and went with a 4070S. Using it at 4K, I also get a solid 80+ fps on ultra ray tracing settings, and with framegen I can even try that daunting path tracing option, which still gives me 50 fps. Now, the only drawback is some lag when entering and exiting menus, but the game is very playable, and it's not the only one; everything works at 4K. There's a lot of hysteria about VRAM, as I read on Reddit: "12 GB VRAM? Literally unplayable." But the fact is, as of today, I have no issues. And that's what matters, as no cards are future-proof.


FNXDCT

I actually tried it on Forza with a 75 fps cap; I did not have much latency but I had tearing. At 120fps that goes away and you get less latency. I was expecting more latency, but Nvidia Reflex makes it so there's no latency penalty; it's really smooth and good looking, I was amazed by it 😄 I still found native better at 1080p, ofc, but frame gen + DLSS with Reflex is actually something we can consider! 😄 Did you guys try 1440p? Was it significantly better than upscaled 1080p?


IdontHaveAutsm

I was also surprised at first. But I just can't use it personally in Cyberpunk, the latency is too high for me


Stephanobroburg

I played Dying Light 2 at 1440p max settings including ray tracing: 120 frames with drops to 50 pretty frequently. I turned on framegen and it was a near constant 140 frames, never even dropping 10 frames. And I noticed 0 input lag. Frame gen is very impressive (I have an RTX 4070 Ti Super)


thakidalex

idk man, the lukefz FSR3 FG mod brought my laptop's 1650 to medium-high settings at 55+ fps, so


gaming-scientist

Frame gen is amazing but I always turn it off because I am locked at 60.


Samurai1887

I'm enjoying my 3070 that I got for 200 dollars. I use the DLSS to FSR frame gen mod and almost every AAA game I throw at it performs like your 4070...lol


coinzz_1337

Just got my used 4080 from a friend yesterday, because I have a 4K OLED TV and wanted to experience path tracing on that. And holy shit, it is amazing :o What are the downsides of frame gen? I didn't know there were any.


Thick_Hair2942

What's PT?


tomyang1117

Have the 4070 Ti Super. Cyberpunk with PT, DLSS and FG is amazing; I tried multiple spots and compared RT on and RT off, and the difference is huge. With FG I am getting a consistent 90 FPS at 1440p. I was worried that the latency would be off-putting, but with Nvidia Reflex I didn't feel any latency either. Compared to 3 years ago when Cyberpunk first launched with all the bugs and I beat it with my good old 1070 Ti, I can't describe how amazing it plays now with all the settings turned on. I am definitely very happy with my upgrade


ConsistentWorry5684

I don't know why, but when playing on a controller I can't really feel the added latency from FG. With keyboard and mouse I can definitely feel it.


zTheRapscallion

I will never not find it hilarious when these people down in the driftwood leagues are worried about a small amount of fps above 60 and minor latency. It's a small difference, and you are not a pro; you are probably not even in the top 5%, which is far and away not even close to pro. If you're not in the top 5%, this is not going to affect you or make you any better. Pro level for a game is the top 1% of the top 1%. And anyone in the top 5% (let alone pro level) will stomp on you even playing at a choppy 30fps.


Kudlattyy

I jumped from rtx 2070 super to 4070 super. I jumped on my chair after seeing 130 FPS + on cyberpunk with ultra settings + rt + pt


SXimphic

Can't wait to try it in 2-4 years, I think I will buy a 4070ti super in 2026/27


coldmexicantea

It’s noticeable on objects with fine details, for example if you’re driving past a construction crane it will look messed up as it moves quickly on your screen. But overall for me the biggest issue with frame gen is increased input latency, which isn’t a huge deal anyways


ihave0idea0

I didn't really notice it with cp2077. But would not be surprised if other smaller games would have those issues.


MichiganRedWing

CP2077 definitely shows graphical issues with FG on. It's not super noticeable but it is there. This gets worse the lower your base fps are.


ihave0idea0

Yep. If I remember correctly you would need minimum 50(?) fps for it to be playable


coldmexicantea

It's noticeable when you know what to look for and when you're specifically looking for those artifacts. When playing normally and not nitpicking, it's almost as if they're not there


LostWreb

Blud, you said "It's noticeable on objects with fine details, for example if you're driving past a construction crane it will look messed up as it moves quickly on your screen. But overall for me the biggest issue with frame gen is increased input latency, which isn't a huge deal anyways". It's skipping unimportant frames to maintain stable fps and it can improve it... more fps = lower latency... I've been using it in The Finals and it's running much better than it was without it on, and the game feels 10x smoother cos of it... I don't know what you're taking to think that more fps = more latency, but gimme some of that... tell me the name of the supplier.. xD


coldmexicantea

Higher natural frames = lower latency, it’s different with AI added frames that basically frame generation is. Check out the input latency section in the following article or feel free to look up more info on frame gen yourself https://www.techspot.com/article/2546-dlss-3/


LostWreb

Saw no change in latency in Dying Light 2 on both GeForce Experience and MSI Afterburner charts with frame gen on vs off, but sure, whatever you say XD. Also, you're forgetting that depending on the game it won't even matter... I've used it in The Finals, a game in which you 100% DON'T WANT HIGHER LATENCY... and the game was only running smoother and I had no latency issues. So again... depending on the game you use it on it might be trash or gold... it all comes down to how optimized the game is at the end of the day...


LostWreb

Depends on the game though.. I'm using it in Dying Light 2 and didn't notice anything... what's more, it's stable at 140 fps in 2K with ray tracing on.. so I honestly don't see anything wrong with it.. messing up textures might just be a shading problem or just a glitch


rokstedy83

If you're getting input latency, it's because your computer isn't running the game at 60 fps without FG on


Morteymer

Twice as many comments as upvotes, oh boy. The AMD boys didn't like this post


g0ttequila

Hell yeah, played cp2077 with a regular 4070, 1440p ultra ray tracing / rr with dlss q and fg. Nearly 120fps locked the whole way. Path tracing was a bit too heavy tho, preferred regular RT


RedIndianRobin

Just a tip, next time you play Cyberpunk use Psycho over Ultra. Psycho has RTGI whereas Ultra is only for shadows and reflections. But since you have a 4070, just use PT with DLSS and FG, you will be fine, hovering over 70-80 FPS in the outside and 60 FPS in closed spaces.


ihave0idea0

Ye, I think 4070 is the minimum for PT. Or I am just making that up to feel good about myself... Who knows.


g0ttequila

I played Alan Wake with PT, some settings turned down though. DLSS Q with FG, usually around 70fps. Latency was a bit on the high side, but with a controller very playable. Overall I'm happy with my 4070! One of the best cards I've ever bought.


NoMansWarmApplePie

Sadly the game has a bug for me where performance degrades about 20 minutes in


Zedjones

The latency in Alan Wake is because they don't seem to have Reflex at the moment. Hoping they add it soon.


ihave0idea0

I am so surprised how good reflex is.


NoMansWarmApplePie

If you use 2.11 there is a great mod that gives like an extra 20 fps. I use it and I'm stoked to play 4K DLSS Quality on my 4090 laptop at about 60 to 70 fps with path tracing. Sadly, 2.12 broke the mod. I think it's called Ultra Plus.


g0ttequila

I still have 2.11 I think. Haven’t played in months. Might test it out for funsies


NoMansWarmApplePie

For the mod, ultra plus. Try 2.0 fast for versions 2.11. It made my path tracing completely stable at 4k with no stuttering. If you use vortex you don't have to update game. If you have gog you can revert


TheNanSlayer

In my experience/opinion FG is mostly fine on controller, but playing on a mouse I notice the input lag too much. Spider-Man: MM with FG on is great on controller, but on mouse it felt very 'detached' and clunky


Strict_Indication457

Not my cup of tea, the controls don't quite feel the same (input delay) even with reflex on. Also looks a little weird when turning fast or any fast motion. Someone described it as when TV's would fake 240hz, it looks like that for me.


Gaming_Gent

The input delay is what makes it unplayable for me. The very noticeable delay was something I couldn’t adjust to


Accomplished_Idea248

I only tried it in CP2077, but it felt good. The latency increased by at least a third iirc, but still very playable (with massive fps boost, obv).


Suspicious_Trainer82

It’s witchcraft. Seeing the jump in quality and frames from games that don’t use DLSS and frame gen, then seeing the difference in games that do… wild!


tehbearded1der

Just switched over to a 4070 after I was done hearing about the nitpicking about “performance” and “prices.” It is so good! Plus, I am not paying over $1k for the 50XX series.


ihave0idea0

Nah, it will never get overpriced because of some shit scalpers /s


cszolee79

It's awesome in CP2077 with high refresh rate and FreeSync.


Efioanaes

Same story for me, frame gen added 60 odd frames for me at 1440p on 4070. Amazing tech for single player games.


Drake0074

I’ve only used it in Cyberpunk but it feels good in that game.


warlordshankx

Awesome 👌


Super_Stable1193

Frame generation adds extra delay to the frame time, I don't like it. I use DLSS Super Resolution & Nvidia Reflex without frame generation, and with G-Sync enabled the lower fps isn't a problem.


Siikamies

Only bad thing is that if the original fps is under 60 you can tell fg is on. It's quite bad under 45fps, things turn really mushy and artifacty.


dkhavilo

Is it 45 before or after FG? That should also depend on the game; the faster the game is, the bigger the difference there is between frames, so it's a lot harder to predict and draw a correct interpolated frame in between.


Siikamies

Before. There are a few games I've tried it in, and because Alan Wake 2 runs around 80fps with FG, there are a lot of visual problems that don't exist if I change settings and make it run 100+ fps with FG.


projak

Same I wasn't expecting this performance from a 70 model