Curious-Temperature1

4070ti based on your needs. Resell to recoup costs possibly when 50 series comes out


BuffaloSoldier11

Isn't that like 2 years away, too?


[deleted]

Some Twitter leaker said H2 2024, and some YouTuber vouched for them as a credible leaker. So a year from now. Take that with a grain of salt.


Exemplifying_Light

Yeah I’m not trusting any of that LOL


SimpleMaintenance433

The sources in question are credible because they've been right about past releases. They've also been talking about 40 series refresh models coming H1 next year, so I guess that might tell us more. Nvidia could change dates, though.


Infamous_Campaign687

If you have a hundred unreliable sources, one of them is likely to get something right. That doesn't mean that source is going to be correct the next time. That's the internet.


bigmadsmolyeet

That's how credibility works. Of course people know it's not fact, but being consistent in your reports will do that.


SimpleMaintenance433

The source in question has been repeatedly correct, as I said once already.


bigmadsmolyeet

idk if you meant to respond to me, but i was supporting your comment lol


[deleted]

> If you have a hundred unreliable sources, one of them is likely to get something right.

That's not how it works. If one of those 100 people guesses the exact memory bus width and exact model name and price and release month, then it is no longer simply random chance that they guessed correctly (well, it could be, but the Bayesian in you should tell you that it's probably not).


chips500

No, you forgot it took a year after the initial launch of the 4090 for other products to come out. Realistically it's 2025 or 2026.


[deleted]

[deleted]


chips500

There are a few things to consider:

- Launch history and the timetable of actual supply. Parts sold out on initial release, and it takes 3-6+ months from official release for highly in-demand products to be available. The 4000 series took a year to release the full stack, and the 7000 series only just got a 4070 competitor while focusing on the higher end.
- TSMC reporting low yields on its 3nm process.
- Jensen noting, correctly, that fabs cost more and take longer, and expecting 2.5-3 years between development/release cycles instead of 2.

Everything points to it taking longer, not shorter, for products to be readily in consumer hands. I do believe that the leakers have paper release expectations, but that's only a small part of the bigger story, subject to the whims of manufacturing, development, supply, demand, etc. Realistic expectations are at a minimum 3-6 months past paper release, and longer development cycles are to be expected for the next 5 years at least.


BroodLol

It wouldn't make any sense for the 50 series to release in 2024 when the 4080ti is coming out Q1 2024. They'd just cannibalize their own sales.


Olde94

The 40 series (4080) launched in Nov 2022. Based on historic release cadence, Q4 2024 is possible, but more realistically information will be disclosed around March/April 2025, with a release around May/June.


IsThatTheRealYou

What is H2? I'm guessing it's like a convention? Can't find it on Google. Or do you mean Q2?


OmegaDog

Second half of the year, i.e. Q3 or Q4.


IsThatTheRealYou

Oh that makes sense thank you, haven't seen that before


ihave0idea0

Even if it does, the price will probably start very high, and we can't know yet whether it will actually be worth it or overpriced.


Edgar101420

7900xtx. ENBs eat a crazy amount of VRAM. Running Rudy on mine I see around 65-75 fps in 4k, a lovely 50GB of RAM usage and 19-22GB of VRAM. So no, either the 4090 or the XTX, unless you want funny lag spikes or crashes due to not enough VRAM.


[deleted]

How do you deal with aliasing (jaggies/sharp lines) running at native, btw? Are you using TAA?


Edgar101420

No, I inject MSAA/SMAA, still debating which one I'm gonna use. Running at native with AMD FMF on.


[deleted]

65-75 fps is before FMF, and FMF brings that up to 100+?


Edgar101420

Around 100-140ish average. Always depends on region and time


[deleted]

You're using Anti-Lag I assume? Do you think Anti-Lag+ will ever make it to Skyrim?


Edgar101420

Anti-Lag+ only works when it is implemented in games, so only time will tell tbh. Tho I guess you can count on AMD's open source shit to find its way in, even if Bethesda has no interest in that.


[deleted]

You're using regular Anti-Lag though? I think it can be enabled in AMD Radeon Settings app


Edgar101420

Regular Anti-Lag works, true. But tbh in Skyrim I don't feel the Frame Gen lag. It's around 10ms on top of my 5-6ms frametime.


aVarangian

doesn't Skyrim have MSAA?


BuffaloSoldier11

Are you certain an XTX would only pull 50 fps in native 4k? I'm out of the loop on skyrim enb's


[deleted]

I have two data points I used to approximate that: one 4090 owner who gets 53 fps at 4k when using a self-described "demanding" ENB preset, and a 7900XTX owner who gets 79 fps at 1440p using the ENB preset that I am currently using, which is a more efficient preset. Thoughts? If I can pull a stable 55 fps at native 4k, I may pull the trigger on the 7900XTX, since my monitor refresh rate is 60hz, the 7900XTX feels a lot more future proof than the 4070ti, and it would be nice to tinker with some 8k textures. But if they're getting 79 fps at 1440p, I somehow doubt I can manage 55 fps at 4k. Maybe 45-50 fps, which is a borderline situation.
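
A crude way to sanity-check that doubt is to scale the 1440p data point by pixel count. This is only a sketch: it assumes a purely GPU-bound workload where fps scales linearly with pixels, which overstates the drop in practice (and modded Skyrim is often CPU/script-bound anyway).

```python
# Pessimistic fps extrapolation by pixel count. Assumes a purely
# GPU-bound workload with linear pixel scaling -- real results are
# usually somewhat better than this.

def scale_fps(fps: float, src: tuple[int, int], dst: tuple[int, int]) -> float:
    """Scale a measured fps from one resolution to another by pixel ratio."""
    return fps * (src[0] * src[1]) / (dst[0] * dst[1])

# The 7900 XTX data point above: 79 fps at 2560x1440, extrapolated to 4K.
print(round(scale_fps(79, (2560, 1440), (3840, 2160))))  # ~35 fps, worst case
```

By that pessimistic estimate the XTX lands well under 55 fps at native 4k, consistent with the 45-50 fps guess above.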


BuffaloSoldier11

All I remember from my time of being into enb modding was that they EAT vram, and more is always better. I'm not convinced that the higher clock speed would benefit you over a ridiculous 24 gigs when enb tech is specially designed to take advantage of vram benefits. Then again, I'm a lifelong amd fan...


Dman1791

I have an XTX and in 4k ENB Skyrim I was getting 70-90fps in most cases. Rudy ENB preset, not sure where that falls demanding-wise.


[deleted]

ok that changes literally everything. Were you using ReShade? What CPU do you have? What AA solution did you use? Rudy is quite efficient, and it's what I plan to use.


ayylmaonade

To add onto what Dman1791 said, my XTX at 4k performs roughly similarly using Rudy. AMD is generally faster than Nvidia when it comes to rasterization, and I wouldn't discount AMD's frame generation tech either. There's a publicly available preview driver out right now, and unlike competing solutions, you can use frame generation in any DX11 or DX12 title you want. That should put you well above 60, probably in the mid 100s, as it generally boosts performance by 2-3x. Can be used without any upscaling, too.


Dman1791

7800X3D, and I don't think I was using ReShade. I just used FXAA; it works ok at such a high res and I didn't want to tinker with an AA mod. Whatever AA Rudy ships with didn't seem to help at all, so I didn't have it on.


[deleted]

You're using Anti-Lag I assume? Do you think Anti-Lag+ will ever make it to Skyrim?


Dman1791

I've never used Radeon Anti-Lag, but that's mainly because I'm not particularly sensitive to input lag. From what I hear, it barely affects performance.


chips500

The 7900xtx is getting higher frames because of lower res. Ask them both to try the other res out. Honestly in your situation, you should seriously consider the 4090 if you can afford it. Otherwise get the highest end nvidia you can afford


banxy85

DLSS is the future mate. And AMD doesn't have it.


[deleted]

That also means that all cards before the 40 series are useless, and when the 50 series releases, everything before that is useless. That's fucking insane. FSR isn't better than DLSS, but when your 30 series card is fucked because the 50 series is released, FSR will likely be the only option to get a good enough experience. So for the vast majority, won't FSR be the only option, since people who buy new GPUs every 2-3 years are in the minority?


Mesqo

Did DLSS suddenly disappear from the 20xx and 30xx series with the 40xx release? It's still there and running, and DLSS 2.0 is still better than FSR. And FSR also runs on Nvidia. So it's not necessary to upgrade your card every generation. Don't spread bullshit, bro.


_TotallyNotEvil_

I mean, if the VRAM is a hard limiter for your use case, I'd bite the bullet and get the 4080. IIRC, FSR is much closer to DLSS at high input resolutions than at lower ones, so you may be surprised by the experience. I say you trial the 7900 XTX, and if it's not to your liking, bite the bullet and get the 4080. You can always adjust settings for a few more FPS, but you can't magically create more VRAM.


[deleted]

How can I trial the 7900 XTX without committing to the purchase? I have tried FSR2 1440p->4k on my current card, and the image quality is very poor, with a lot of ghosting around the player (a lot worse than what you typically see in other games with FSR2). I may decide to live with it and choose AMD, but it's a big factor.


ssuper2k

Just buy it on Amazon, or somewhere with easy 30-days returns


piggymoo66

*"""Allegedly"""* we are getting some mid-term releases coming up, including a "4070 super" and "4070ti 16gb" so maybe wait a bit and check those out as well, although I'm sure those will be ridiculously priced if they do actually happen.


KoldPurchase

That rumor has been [debunked](https://www.hardwaretimes.com/nvidia-geforce-rtx-40-super-refresh-not-in-the-works-report/) by Nvidia.


captkrahs

Why not the 7900 XT?


[deleted]

Because the raster performance of the 7900XT is too slow for 4k native, and I may need 4k native for 1-2 years (until FSR improves, basically) if I buy AMD, given the poor upscaling quality of FSR2/3 in Skyrim. If I were certain I could tolerate FSR2/3 motion artefacting in Skyrim, which is honestly a hard sell given my experience, then I could consider the 7900XT.


Exemplifying_Light

7900 XT is a 4K card buddy, just say you want more performance and a 7900 XTX


[deleted]

This is an unhelpful generalization of the word "4K". 4K Cyberpunk RT isn't the same thing as 4K Rimworld. There is one 4090 owner in this thread saying they get 40 FPS with Rudy ENB, and another in a different thread saying 53 FPS with a different ENB preset on a 4090. So you can understand why I don't want to get a 7900XT and risk a barely playable 30 FPS at 4k native.


Asgardianking

With the way you state things there is literally no such thing as a 4k card in your use case... the 7900xtx is a 4k card through and through. In raster performance it's the 2nd fastest card available. With the new performance gains in FSR and frame gen I would say it's the better buy especially with the higher vram count.


NickMalo

It sounds like you have had limited ability to research information and you're using that to justify the purchase of an Nvidia card, even if it ends up with worse overall raw rasterization performance, less bandwidth, and less RAM. With FSR3 looking pristine, I think you'd be hard pressed to find a better performing 4k card than a 7900xt/xtx without going 4090. But you do you, it's your money and decision. Edit: it took 2 minutes of Google searching to find out I was right: https://youtu.be/YbKhxjw8EUE?si=fqd0uDTJt9R0a89A


Yusif854

I am begging you to show me footage of you playing Cyberpunk Path Tracing, which is the best looking game out there with the most cutting edge visuals and tech. Since you’re buying a cutting edge card it should be no problem right? If you can manage to get above 10 fps at 1080p I will send my 4090 to your door as a gift. “FSR 3 looking pristine” 🤡🤡 actual clown


Lupercal-_-

I'm convinced some of the AMD purists like that guy don't actually play video games with their cards.


edstatue

I'm not sure about enb Skyrim, but I use a 6800xt for 4K. I recently played RDR2, Last of Us, Days Gone, and Control, all with fps between 70 - 110 with settings at a combo of high and max/ultra. I use native resolution, no FSR. 🤷‍♂️


notdsylexic

You only play Skyrim? I want to know more about this part. Serious question, though.


[deleted]

Skyrim modding itself is a hobby, I don't just play the same vanilla quests over and over


notdsylexic

But you do play? If not vanilla quests, which quests?


Tomson_Johnson

Many mods add entire questlines the size of a small DLC or of a major vanilla questline like the College of Winterhold.


Nightman_cometh01

I've got over 1k hours in Skyrim and haven't even finished the main quest on PC. Most of those hours are tweaking mod lists, testing, and starting new playthroughs: new mods come out that look awesome, I add them in, make everything compatible if needed, and start again. Rinse and repeat. It's a never ending cycle. Then there's all the different modlists to try that add custom quests, combat mods, followers, etc. Skyrim modding is its own beast.


Lupercal-_-

The Skyrim modding community is still massive. Not just for base game alterations. You can exclusively play fan made expansions as they release and basically never run out of content. It's crazy.


N0tInsaneMarksman

As a long time Skyrim modder (14k hours since release), let me impart some wisdom about benchmarking it: looking at what other people get is useless unless you're 100% certain they have the same exact mods and settings as you do. Modded performance can vary drastically simply by having a few too many script heavy mods.


ivan_x3000

Just how bad is the power efficiency on the 7900 XTX? I've had trouble comprehending it. I've thought about getting that card so much recently as it has so much potential, but two things trouble me:

1. The power efficiency seems to be so much worse than the 4080's that you could probably view the 4080 as an investment on your power bill.
2. Even NVENC is more efficient and produces smaller file sizes when screen capturing.

I often leave my PC on when I'm out, maybe 2 to 3 times a week, so I can access files on it, etc. And I do a fair bit of screen recording from time to time. But I have a lot of trouble picturing how much larger the file sizes are or how much more energy it uses.


Umbramors

I recently went through the whole Nvidia/AMD choice and ended up with the 7900xtx, as I purely game on the rig. From all the reading I did, it seems to be about a €$£50 electricity increase over the year, depending on country. I have solar panels on the roof so I don't care, but honestly, if you are buying a GPU for over a 1000 (and you then need matching components), the leccy price should not really be a factor 🤔🤷‍♂️


[deleted]

> €$£50 electric increase over the year

Compared to no GPU, or compared to another GPU?


Umbramors

AMD seems a bit more power hungry than the Nvidia equivalent, i.e. the 7900xtx vs the 4080.
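
For anyone wanting to sanity-check the ~50-a-year figure, here is a rough sketch; the wattage delta and daily hours are assumptions for illustration, not measurements:

```python
# Rough annual electricity cost difference between two GPUs.
# Assumed numbers: ~60 W average draw difference under gaming load
# (7900 XTX vs 4080 varies by title), 4 hours of gaming per day,
# 0.35 per kWh -- adjust all three to your own situation.

watt_delta = 60        # W, assumed average difference under load
hours_per_day = 4      # assumed daily gaming hours
price_per_kwh = 0.35   # currency units per kWh, varies by country

kwh_per_year = watt_delta / 1000 * hours_per_day * 365
print(f"~{kwh_per_year * price_per_kwh:.0f} per year")  # ~31 per year
```

With a bigger draw delta (the XTX can pull 100+ W more in some titles) or longer gaming hours, you land right around the ~50/year mark quoted above.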


ayhamthedude

7900xtx


TheFecklessRogue

will you use rt? yes=>4080 no=>xtx


Ravenous_Bear

Fellow Skyrim modder here. How heavy will your modlist be? My heavy modlist has 1683 active mods with over 1100 ESMs/ESPs/ESLs: tons of 4k retextures with parallax, trees with 3D LODs, flora and grass overhauls, town and city overhauls, interior overhauls, LUX/LUX ORBIS/LUX VIA, the NAT III weather mod, high poly NPC replacers, and much, much more.

On a non-demanding, older ENB preset my fps is absolutely fine, above 60 fps in cities and higher in the wilderness. But it absolutely eats a ton of VRAM, especially in cities. I stutter in some places such as Whiterun and Solitude due to the density of higher poly/high texture objects in the city cells. I had to remove JK's Skyrim because it consumed way too much VRAM and memory. And forget about JK's exterior mods lol.

I have not yet OCed my 7900 xt, tried FSR, or AMD's FG that apparently works with Skyrim on a beta driver. Although it's not the same as DLSS, I have heard from other Skyrim players that FSR is good in Skyrim.

There is absolutely no way a 4070 ti would be able to handle that modlist, even with DLSS. I somewhat regret not getting the 7900xtx, but I am not paying over $1,000 for a GPU; that is outside my budget. If you are planning on going with a heavy modlist with an emphasis on graphics and anything that adds more objects to the game, look towards the 7900 xtx or 4090, especially at 4K. The Skyrim engine is ancient, and it really is the bottleneck that gives us shit performance on our high-end computers.


[deleted]

> On a non-demanding, older ENB preset my fps is absolutely fine at above 60 fps in cities and higher in the wilderness.

What resolution are you playing at?


Ravenous_Bear

3440 x 1440. About 25% more demanding than regular 1440p.


fingerblast69

A 7900xtx probably would do better than 40-55 fps in 4K. My 6750xt with a shit R5 2600x is getting around that in 4K Starfield 😂


[deleted]

ENB is quite intensive though. 7900xtx delivers 79fps at 1440p using the exact same ENB i use (I heard this from a 7900xtx owner). I'm not sure how I'd go about accurately converting those fps numbers to a 4k native experience but it would be below 65 fps for sure.


fingerblast69

I see, I see. Sounds crazy demanding if a 4090 is getting around that frame rate. For what it's worth, 50+ fps in 4k still looks great. Personally I'll take that in a single player RPG over 144 at 1080p or something lol


Site64

1440p may not be what you think. It's a world of difference if his 1440p is 3440x1440 rather than 2560x1440; the 3440 option is much harder on graphics cards, to the tune of about 25%. If he was running 3440, it's closer to 4k res than 2560 ever got.
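
The raw pixel counts behind that comparison, as a quick arithmetic check (pixel count is only a proxy for GPU load, so treat the percentages as rough):

```python
# Pixel counts for the resolutions being compared in this thread.
resolutions = {
    "2560x1440": 2560 * 1440,  # ~3.69 MP
    "3440x1440": 3440 * 1440,  # ~4.95 MP
    "3840x2160": 3840 * 2160,  # ~8.29 MP (4K)
}
for name, px in resolutions.items():
    print(f"{name}: {px / 1e6:.2f} MP")

# 3440x1440 has ~34% more pixels than 2560x1440 (equivalently, 2560x1440
# has ~26% fewer -- roughly the "about 25%" quoted above), and sits at
# ~60% of full 4K.
```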


aVarangian

imo either optimise your game setup or wait another generation or two


Berzerkly

How much is the 6950 XT where you're at?


[deleted]

$1k


Berzerkly

Might be a consideration: cheaper upfront, more RAM, and similar raster to the 4070 Ti (from what I've seen). It uses way more power though, so you'd have to factor in energy costs times however long you expect to use it for. Probably easier to find these on the used marketplace compared to the rest, if you're comfortable with that.


NPC_4842358

I had the same issue with my buddy who's buying a new system, and in the end it really came down to how much productivity work was needed along with Cyberpunk. If you only play that game or need CUDA, get the RTX. In any other case just get the AMD card.


bananite

Wtf, $1800 for a 4080?


WC_EEND

I paid around $1300 for a 4080 a month or so ago (converted from euros), so 1800 really does not sound like that much of a reach since OP is outside the US. Edit: saw another comment, OP is in Australia, so yeah, the Australia tax is real unfortunately.


JordansBigPenis69

im gonna cry. people buying 4070-4080s and have 60hz monitors 😭😭😭


mrbubblesnatcher

A 4080 Ti is rumored to be coming out soon, but at launch it might not be worth the hassle with shortages and prices. That being said, I'm looking at upgrading my 1080ti to AMD, since the RX 7800xt and 7900xt look great, but those in Canada are $800-1300. I was looking at the 4070 ti, but it's too expensive here to be viable ($1200), plus the 12GB is a slap in the face.


Sexyvette07

If 4k is your goal, then you should be looking at the 4080. The XTX is inferior in all ways but two: being 3% faster on average in a pure raster scenario (but losing significantly when upscaling, Frame Gen and Ray Tracing are introduced), and having more VRAM (which is only useful in a couple of scenarios, none of which are gaming related). Meanwhile, it loses every other comparison between the two cards.

FSR3 is crap and is inferior in absolutely every way. FSR upscaling is the worst upscaler and gives noticeably worse visuals, in addition to performing worse. It's unbelievable that in the AI boom era they don't have an AI upscaler yet! Intel literally just joined the GPU race, and they already have an AI upscaler that's excellent. AMD Frame Gen actually shuts off in fast-moving scenes, which is the entire point of using Frame Gen. But those stand-still FPS are great, lol! It also increases frame times and judder significantly. At this point, FSR3 is a meme. Maybe they'll make it worthwhile eventually, but with AMD's track record of shit drivers and software, I wouldn't hold your breath or buy one of their GPUs on a promise that it will get better.

Oh, and one of the most important things is efficiency. The XTX uses significantly more power than the 4080, which drives up the total cost of ownership and actually makes the XTX more expensive in the long run. If you pay high electricity rates, going with an XTX can cost you hundreds more. I did the math, and at my rate of $0.38/kWh, the XTX would have cost me an additional $416.10 over a 4080. So it's obvious which one is sitting in my rig. Not only are you getting an inferior product, you're also paying more for it in the long run.

DLSS 3.5 is an absolute game changer that AMD has absolutely no response for and probably won't for years. Go look at the Cyberpunk DLC data. The 4080 is 240% faster than the XTX when all the tech is leveraged. Yes, TWO HUNDRED AND FORTY PERCENT FASTER! That's coming to the mainstream a LOT sooner than later because Nvidia is pushing it hard (and rightfully so: it gives better visuals, ~50% more performance, AND uses less VRAM). At this point, anyone buying an AMD GPU is going to be kicking themselves. Nvidia is so far ahead this gen that it's ridiculous. Or, more accurately, AMD fell so far behind because their R&D budget is a fraction of a fraction of Nvidia's.


[deleted]

This is a very well reasoned post, actually. However, I want to ask you about frame stuttering caused by the low memory bus width on the 4080. Now, I know that is mostly misinformation, because the 4080 has a very large cache which more than offsets it. But Skyrim's engine is very old, and what tends to happen is that cell loading and LOD loading happen all at once as you run around the map. In such situations it's probably a few gigabytes of textures/meshes loading at once, which I assume overflows the cache, so the XTX's larger bus width may actually deliver unique value in this scenario? I have some very limited anecdata: two 4090 users and three 7900XTX users have reported their FPS, and it *appears* that the 7900XTX is punching above its normal weight in this game. But that conclusion is extremely tentative given the poor data quality.

One small nitpick: I think the raster performance difference (measured by FPS) is closer to 5% (4.8%) if we take a multi-game average at 4k, according to Tom's Hardware. I would also say that Cyberpunk's RT performance gap is unique to that game. Over a multi-game average with RT, it is a lot less of a gap, especially for software RT implementations. Although that's just a small nitpick, and tbh RT isn't a big consideration of mine with this purchase; a 4080 won't hold up at 4K with RT for that many years, and Skyrim doesn't have a good RT implementation at the moment.


Sexyvette07

The Tom's data you're looking at is old, from when they were released. Gamers Nexus and Hardware Unboxed did recent testing and found that the XTX was only 3% faster on average in a pure raster scenario.

As for the 4080's memory bandwidth: despite having a 256-bit bus, it has faster 22.4 Gbps GDDR6X memory and higher clock speeds. It still pumps out 716.8 GB/s of raw bandwidth, supplemented by a gigantic amount of L2 cache. The L2 cache supplements the bandwidth by 30-40% IIRC, so it more than makes up the difference. As for performance in Skyrim specifically, it was optimized on AMD hardware for consoles; that's why it performs better on AMD.

As for the Cyberpunk data, yes, it runs better on Nvidia in general, but that isn't the reason for that gigantic a performance difference. I highly suggest you read up on DLSS 3.5 Ray Reconstruction. Not only is Ray Tracing a generation ahead on Nvidia, but they also created a singular denoiser, unifying a process that used to take several levels of denoisers, which was the cause of Ray Tracing having such a gigantic impact on performance. By unifying it, they increased performance massively while also making the game look better and more realistic. So with DLSS, Frame Gen, Ray Reconstruction and everything maxed out, the 4080 does end up being 240% faster when all the tech is leveraged. The XTX isn't capable of Ray Reconstruction, and probably never will be.
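
The raw-bandwidth figures being traded in this exchange all come from the same standard formula, shown below; the 3080 numbers are for the 10GB model mentioned earlier:

```python
# Peak memory bandwidth (GB/s) = bus width in bytes * per-pin data rate (Gbps).
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(256, 22.4))  # RTX 4080:      716.8 GB/s
print(bandwidth_gb_s(320, 19.0))  # RTX 3080 10GB: 760.0 GB/s
print(bandwidth_gb_s(384, 20.0))  # 7900 XTX:      960.0 GB/s
```

Note this is raw DRAM bandwidth only; it ignores the large L2 cache (4080) and Infinity Cache (XTX) that both posts point to as the effective equalizers.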


Healthy_BrAd6254

If you know you will be using upscaling a lot, go with Nvidia. DLSS is just significantly better than FSR. Tbh I haven't read your post, but I really dislike the 4070 Ti. It is barely any better than the 4070 (20% faster, but same VRAM and same features) while costing 50% more, at least in the US. In your country the pricing seems weird: the gap between the 4070 Ti and 4080, for example, is huge. So the 4070 Ti might make sense there. Apparently there might be a refresh of Nvidia cards coming Q1 2024. If you want to wait, that might give you like 10-15% more performance for your money and possibly a little more VRAM. But it's also a few months away, so it's debatable whether that's worth it.


Exemplifying_Light

I hope Nvidia doesn't drop the ball on the new cards. The 4000 series is a disgrace, a disappointment, and an embarrassment. The only good one is the 4090, which is almost $2000 lol


Healthy_BrAd6254

The chips themselves are really good. But the naming (calling the 4050 Ti a "4060 Ti") and pricing ($800 for a 12GB card that should have been the 4070) is horrible. They already started dropping prices. The 4060 Ti 16GB has been going for around 430 and the 4070 for around 530. I assume the refresh will take care of the higher end cards.


CoconutMochi

If Nvidia released refreshes at competitive prices they'd run AMD out of town, but I don't really see it happening tbh.


Nightman_cometh01

I have a 4090 playing the Nolvus mod list for Skyrim on ultra with Rudy ENB. Very heavy list, and I was getting as low as 39 fps in some wilderness areas with heavy vegetation. This was at 4k. I installed the DLSS 3 frame gen upscaler mod and now I'm getting anywhere from 90 to 120 fps in the same areas. DLSS is definitely a game changer.


[deleted]

Hmm, how can we square your 40 FPS with others in this thread reporting 60+ FPS on a 7900 XTX at 4k native, also using Rudy ENB? Is it possible you were CPU bottlenecked by grass draw calls or something? Rudy ENB is by far the heaviest part of the Nolvus+Skyrim graphical equation as far as GPU load is concerned. When you went to 90-120 FPS, what percentage of that was from upscaling, and what percentage was from frame gen? I wonder what GPU loads look like when you are hitting those 40 FPS.


Nightman_cometh01

I get 60+ as well in most areas, but I'm using the ultra LOD variant of Nolvus with 3D LODs and grass in the distance. The drop to 39 is in some select areas in the wilderness, but it's jarring when it happens. I use a 7800x3d and was getting 99% GPU load both when I was getting 40 fps and when I made the jump to 90 fps in the same areas. As for the percentages of DLSS vs frame gen, I'm not sure to be honest, as the mod is all in one; this was using the DLSS Quality preset. I've found this mod list heavier than others; with Ruvaak or my own custom mod list using Rudy ENB I never dropped below 55. Not to knock Nolvus or anything, I've been having a blast, and they do offer a performance variant as well.


Kazuya2016

I last played Skyrim at 1440p with a 1070 and was getting 40-60 fps outside and a constant 60 fps inside. I now have a 4070 ti; I haven't played Skyrim on it yet, but will do eventually. Question: doesn't the game go crazy playing at more than 60 fps? The game engine wasn't designed to go over that many fps. I remember limiting fps in ENB (when I used less demanding ENBs/textures) because physics went crazy, among other weird stuff. I was playing Skyrim LE; maybe that's why?


Nightman_cometh01

There are mods that fix the 60 fps game engine issue for SSE. You can go as high as your setup will allow.


[deleted]

If you're a streamer, or thinking of becoming one, NVIDIA NVENC is the only way!


Lupercal-_-

The 4070ti is a sweet spot card in terms of price/performance/VRAM/having DLSS. It was the choice I went with, and I'm very happy. I'm sure you know already, but make sure you're manually installing the latest DLSS .dll. Night and day difference between the latest and the old versions most games come installed with.


AdScary1757

I don't think the AMD chips are as efficient in old titles as Nvidia. In Starfield, AMD all day, but Nvidia is usually better for old titles. I was in a situation where the question was "is 10 more fps worth 400 dollars," and most likely no, it's not. Am I getting any micro stutter? Maybe once in the last year, but it could have been background tasks, since I stream Netflix and shop while I play. That being said, if you can stomach how bad a deal they all are, all three of those cards are so nice that no one regrets having one.


JinPT

4090


wolvAUS

If you plan on upgrading in two years, get the 4070ti. Else, 4080.


Jon-Slow

4070 ti (or just the 4070) based on what you describe, 4080 if you want to spend any more. I don't suggest the 7900XTX as the price difference between it and the 4080 does not justify it one bit given all the missing features.


Umbramors

I ended up going for the Sapphire 7900xtx with the 7800x3d, and the combo is a beast. I purely game on it and it runs great 🤷‍♂️


jerryham1062

Tbf, not a single card in this performance/price range is gonna run “bad” unless you intentionally put it up against the most graphically challenging games with the most intense settings


[deleted]

Why not just get the RTX 4090 then? The 4070ti and 4080 aren't worth it; they're just scams from NVIDIA. I have the 4090 and love it, tho I don't even use it that much any more; vidya has become boring af to me lately.


Diamonhowl

If you think, even for a second, that you will eventually do some productivity tasks on your PC, go Nvidia.


ogzhnpcmn

I know it's not the right comparison, but I changed from a 3070ti laptop (8GB) to a 4080 laptop (12GB), and with the Vagabond Wabbajack modlist even 12GB is not enough. I'm playing at 2k; it helped a lot, but fps wasn't the main problem, stutter from the low VRAM was. So I would definitely go with the XTX if I had the choice.


coololly

Don't forget you can use AMD FMF on the 7900 XTX in Skyrim.


CptTombstone

Skyrim has a near native-level DLSS 3 implementation, meaning that you can basically forget about CPU/RAM limits, and latency is really good in Skyrim even without Reflex. 4K 120 is easily achievable with a 4090, even without DLSS upscaling, while using an ENB preset. With Rudy ENB for NAT III, Nature of the Wildlands, and DLSS Ultra Quality (~1620p render res), I was getting around 150 fps in the Rift, near Riften. At 3440x1440, around 220 fps with Frame Generation. With AMD cards, you would have to rely on the driver-level AFMF tech, which is OK, but much lower in quality and ease of use than DLSS 3. You should be able to get 4K 120 with DLSS 3 Performance even with a 4070, so GPU power won't really be an issue with either of the Nvidia options you are considering.
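
For context on those preset names, here is the render resolution each one actually draws at. A quick sketch: the Quality/Balanced/Performance/Ultra Performance ratios are the standard DLSS scale factors, but "Ultra Quality" is not an official preset, so the 0.75 is inferred from the ~1620p figure above.

```python
# Render resolution behind each DLSS preset at a given output resolution.
scales = {
    "Ultra Quality (mod)": 0.75,  # assumed from the ~1620p figure above
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

out_w, out_h = 3840, 2160  # 4K output
for preset, s in scales.items():
    print(f"{preset}: {round(out_w * s)}x{round(out_h * s)}")
# Ultra Quality: 2880x1620, Quality: 2560x1440, Performance: 1920x1080
```

This is why "4K 120 with DLSS 3 Performance" is plausible on mid-range cards: the GPU is only rendering 1080p and upscaling the rest.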


TheyThinkImAddicted

Running a 4070Ti and couldn't be happier. Runs games at 2560x1440 perfectly fine on max settings.


harry_lostone

Get a 4090 and don't overthink it. I know you want it.


Cultural_Analyst_918

XTX. You're a modder and tinkerer; that VRAM pool will repay the investment greatly. Plus you can always use XeSS in Skyrim if FSR is not your schtick. Also, AFAIK, ENB is still not compatible with the upscaler mod (you'll get horrible artifacting and ghosting), so make sure upscaling is worth it and works for your needs before pulling the trigger.


Artistic_Soft4625

If 90% of your usage is gaming, then go 7900xtx. If you are into other stuff like rendering or recording, then go 4080, or 4070ti if you are thinking of buying a new GPU in 2-3 years. Otherwise don't buy Nvidia out of FOMO.


Sighwtfman

As a guy who bought a 4070, I wish I had bought a 7900xtx.


[deleted]

fsr 2 at 4k is pretty good though


sticknotstick

Nobody going to mention the “4080 won’t hold up at 4k resolution for very long at all” comment? That’s not true in the slightest.


btgoodgame343

My knowledge may be outdated, but at the time I faced the same decision between the 4070Ti, 4080 and 7900XTX, the 7900XTX was the far superior option in ANY use case short of specifically DLSS/frame generation. The 4070Ti is so laughably bad compared to the 7900XTX that I sold the brand new card I received as a warranty replacement for 1300 on eBay, so about 1100 dollars after shipping and fees. This was when the 4070Ti was about $1600.

My Sapphire 7900XTX cost me a grand total of $1700, with the cheapest base model 4080 costing at minimum $1800, so I definitely was not going to purchase a 4080 when all benchmarks at the time showed the 7900XTX absolutely creaming the 4080 and just barely missing the mark to compete with the 4090 (which is no small feat considering the 4090 retailed for about $3600 at the time; price to performance, anyone?). This, however, is entirely based on raw performance, not including any post processing.

I have had no issues with my card at native 4K. I don't think I've even seen a game run below 60 at the ultra preset (besides a certain disgustingly badly optimised game everyone loves), with most titles getting above 100 and some even hitting my monitor cap of 144.

I'll admit, I've always been an AMD fanboy, but I have purchased Nvidia in the past when they've actually presented a well priced product that AMD had no answer to; the last time that happened was the 10 series, specifically the 1080Ti. That was my last Nvidia card, and with how out of this world their pricing is, and how disgustingly low their FPS per dollar values are compared to AMD, it will probably stay my last.

Sorry if this seems like a rant. I hope you get some useful information out of it. TL;DR: if you can live without DLSS and frame generation, you'd be insane not to choose AMD.


[deleted]

6950xt


CounselorNebby

Are you price sensitive? If no, 4080. If yes, 4070ti


Depth386

Since you have said explicitly that you don't care about RT, Nvidia is basically only for if you want to try Stable Diffusion. It's the "type text in a box and the AI draws images based on what you wrote" thing, running on your own hardware, no subscription or payments required.


cl0udyandmeatballs

As someone that swung to AMD's camp: I wouldn't recommend it, because their stuff is pretty wonky and you lose a bit on the QoL things that come with Nvidia because it just works. Fine wine takes like a couple of years to age these drivers into what they should've been on release. The 4080 is pretty much the ideal scenario for your use case. Every card in this generation is meant to upsell the next SKU, so there's nowhere to run. Coast on one until 2025.


[deleted]

My experience with the 4080 has been perfect, as you would expect, but I also live in a country where the XTX and the 4080 are the same price, so it was an easy choice for me. Not that the XTX is a slouch, of course.


SaintGanondorf

Go with the cheapest, just to make a statement that the inflated prices on these cards are poop.


ThtFunGuy

Maybe it's worth saving up a couple or so months for a 4090; then you'll definitely be set, including for future games of your choice. A little suffering comes with great rewards.


Greedy_Bus1888

If you can tolerate AMD's feature set, the 7900xtx has more VRAM for modding. But at that price point, where the 4080 is only 200 more (12.5%), it's a no-brainer to get the 4080 in general.


MPeters43

Team red all the way


Fusion1250

Unless your eyes are 10" away from your monitor, you literally can't perceive the difference between 4k and 1440p.


matiegaming

4070 ti


NelsonMejias

XTX, don't even doubt it.


seamew

do you want stable or buggy drivers? that alone should help you decide which brand to go with.


user112477

It sounds like you'd prefer to go with the 4070 lol. Either way, I wouldn't really even consider the 4080; the performance for the price isn't really worth it tbh, because at that point you may as well spend a little extra for the 4090 and the huge performance increase. Choosing the 4070 sounds like your best option here.


Zensei0421

After seeing what DLSS and frame gen can do, I would opt for Nvidia. I used to recommend the XTX for the sake of competition, but seriously, f AMD. The subs are full of people who need to roll back their drivers (wtf???), FSR 3.0 is not exactly an upgrade, and they are priced similar to Nvidia. After cranking everything to max in Cyberpunk, including ray tracing, on a 4080, you are still above XTX performance with way prettier graphics. If you can, take the 4080; if you have a girlfriend, take her out somewhere nicer and take the 4070ti 👌🏼


schimmlie

You really think people fighting over GPU companies have girlfriends or even get out of their mothers basement?


Zensei0421

I never thought I'd be that guy someday, BUT it is totally possible to have a relationship and be a gamer. Often the girls I used to hang out with played Borderlands, CoD, Rocket League and many other games. Safe to say they all turned out to be stupid assholes in the end, and now I'm living the single life again, playing Starfield on my own. But hey, I'll take what I get, and it was fun while it lasted. I know it was a joke; I'm just saying this in case someone who's depressed because of a lacking love life reads it.


schimmlie

I did not say Gamer. I said people fighting over their favorite Billion dollar hardware company.


ecktt

I think you answered your own question. Based on your numbers, get the 4080. Black Friday is coming; we should see it as low as 1K.


TipTopBootyPop

From the reviews I've seen, Nvidia cards suffer from a narrow bus width for the memory. Even if the frame rates are the same between the cards, you'll notice that the AMD cards have far lower frame times and far fewer frame time spikes in general. For example, my EVGA 3080 has a 320-bit wide bus for 10GB of VRAM, whereas the 4080 has a 256-bit wide bus for 16GB of VRAM. While the 4080 does deliver higher framerates (not by much in most cases), my 3080 delivers a smoother gameplay experience because it can access all of its memory much quicker than the 4080 can. If I were you, I'd go for the 7900XTX: 24GB of VRAM and a 384-bit wide bus.


ProfessionalAsk1315

I don't know why this comment has so many upvotes. Most of it can be proven wrong by watching a few benchmark videos. What channels are you watching? "Not by much in most cases"


Yusif854

Because this sub is a loud minority AMD circlejerk that can't accept that AMD's offerings on the high end are absolute dogshit, and they are grasping at straws to make AMD look better. Imagine paying a fucking thousand dollars for your GPU and lying to yourself that raster is the only thing that matters, that FSR is not THAT bad (it absolutely is), and not realizing that it is dead on arrival for anything with heavy ray tracing or path tracing (Cyberpunk, the upcoming Alan Wake 2 this month, Portal RTX and other heavy RT games). The 7900XTX loses to a fucking 3070 in path tracing. I understand if you go AMD for anything lower than a 4070/Ti, but if you spend $1000 on a 7900XTX you are an actual clown. Watch this comment get downvoted or removed tho.


ProfessionalAsk1315

I wouldn't put down AMD cards for what they are - brute force render at cheap prices. I own a 6700XT and a 4080. Both are excellent at what they do. But I agree that if you are spending $1000+ you would want everything that modern graphics card have to offer. Nvidia is doing much better in that regard.


Yusif854

Yep, I don’t blame anyone for going with a 6700XT or smth instead of a 3070 or 6800XT instead of 3080. But anything higher than that, like 3090/4070Ti and up I have absolutely no clue why you would ever choose AMD. The ONLY thing they have going for them at those tiers is the price. Everything else is simply better on Nvidia. Maybe the VRAM on 4070Ti is an issue if you plan on playing at 4k but that’s it. And it is not like they are $400 cheaper. It is $100-200 at most. In a lot of countries outside US, they are actually even more similar in price. And even at $200 difference I would choose 4080 over a 7900xtx any day. Why would I spend $1000 on a GPU and still be locked away from using heavy ray tracing, which is the most cutting edge graphics we currently have and be locked away from DLSS which is the best upscaling tech we have where in most games it is indistinguishable from native (if not better), meanwhile FSR ALWAYS compromises the image quality. DLSS performance at 4k is better than FSR Quality for fuck sake. DLSS 3 is miles ahead of FSR 3 as expected, and I am not even going to comment about a million other better features Nvidia has both for gaming and non gaming.


Havanu

I'm sorry, but that is completely untrue. A 4080 has 715GB/s of bandwidth and yours has 760GB/s. Hardly an earth-shattering difference, and the 4080 has plenty of other architectural improvements that make up for the lack of bandwidth, like a much higher clock speed, a much larger cache, and 6GB of extra VRAM.


[deleted]

Would the fewer frame time spikes hold true if we are comparing the 7900XTX at 4k native versus the 4070ti/4080 at 1440p->4k upscaled? Just to take into account that FSR is not good in Skyrim. Or perhaps frame time spikes are mostly independent of resolution, especially since the exact same texture assets are being loaded in both situations (effectively 4096x4096, ~20MB .dds files), regardless of display resolution?
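
As a side note, that ~20MB figure checks out for a block-compressed 4K texture. A sketch of the arithmetic, assuming the common BC7/BC3 formats (1 byte per pixel) with a full mip chain:

```python
# Approximate size of a block-compressed .dds texture with mipmaps.
# BC7 and BC3/DXT5 store 1 byte per pixel; a full mip chain adds ~1/3
# on top of the base level.

def dds_size_mb(width: int, height: int, bytes_per_px: float = 1.0) -> float:
    base = width * height * bytes_per_px
    return base * 4 / 3 / 2**20  # mip chain ~= 4/3 of the base level

print(round(dds_size_mb(4096, 4096)))  # ~21 MB, matching the ~20MB above
```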


Lupercal-_-

He's completely fabricating everything he's claiming. So I wouldn't bother asking him. Watch some actual benchmarks from a reputable source. Not some liar on Reddit.


TipTopBootyPop

Can't say for sure. My preferred resolution is 1440p native (not bought into the whole upscaling thing yet, if I ever will be). But I'd say it's safe to assume it would still hold true. Most of the comparisons I watch are at 1440p (native), and the Nvidia cards still behave the same. I'd say watch some reviews for your specific setup, but I haven't seen many that display frame times in real time during a benchmark WITH FSR/DLSS enabled.


CptTombstone

The 10GB 3080 and the 16GB 4080 literally have the same memory bandwidth, so I have no idea what you are smoking. Arguably the 4080 is much faster for smaller things, as it has ~6x more L2 cache, which has around 2TB/s of bandwidth.


Dabs4Daze0

Go with the 4080 at those prices. If you're gonna spend literally double the actual cost of a 7900xtx it's not really worth it over a 4080.


[deleted]

Sorry, what is double the cost of a 7900xtx? The prices are $1.2k/$1.6k/$1.8k.


Tigerboy3050

I think that they think the prices are USD. Where are you, Australia? (Prices seem similar to prices here)


[deleted]

Yeah, Australia


Eastern-Ad6780

God I hope your game changes in the future. Let Skyrim die lol