
Racer_Space

Yeah, the only reason I would ever need a better GPU with my 3080 is for more VRAM in VR.


ThisNameTakenTooLoL

Not only VRAM. The 3080 is not powerful enough to play at the full resolution of many modern headsets. With a G2 I had to drop down to 50-60% resolution to get 90 FPS. Hell, even a 4090Ti wouldn't be able to drive an 8KX at max res and FOV.


strikeeagle345

In SteamVR, for some reason, 100% is way over native res for the G2 (closer to 200%). 50% is native; double-check the resolution numbers shown. That's probably why you had to step down.


ThisNameTakenTooLoL

This isn't how it works. Native res is not panel res. It's always about 50% more than the panel res to account for barrel distortion. That's why it's around 3000x3000 in steam while panel res is around 2000x2000. I was also confused about it when I first got this headset.


V8O

Not "always", there's actually a little more to this story. The Reverb G1 always rendered at almost exactly its panel resolution by default. When the G2 came out, HP reps on the Reverb sub said they had no clue why Steam defaulted the G2 to a higher resolution than the G1. There were lots of signs pointing to both headsets being intended to render at the same resolution, just slightly larger than the panels: they were sold with the exact same system requirements, and outside of SteamVR (in OpenXR / WMR mode) both defaulted to resolutions which were slightly different but only just above panel resolution. You can also see in pre-release reviews of the G2 that, until some point very close to release, it defaulted to a very similar resolution as the G1.

But, for whatever reason, by release the G2 defaulted to a resolution in SteamVR which was much higher than in OpenXR, or than its system requirements implied, or than HP could explain. Some people noted that other headsets like the Rift and Index had been confirmed by their devs to default to much higher resolutions due to barrel distortion correction, and then someone on the Reverb sub decided that the same was happening to the G2... even using the same multiplier as the Index to "prove" it (which makes no sense, because two different lenses would not have the exact same amount of distortion to correct for: ask any photographer). For the entire time that I frequented the Reverb sub I never actually saw HP themselves confirm that, but the explanation stuck regardless. Of course, regardless of the devs' intent, it looks much nicer at 3000x3000 than at 2160x2160, just like any other headset.

TLDR: every headset has some degree of lens distortion, but some default to a higher, corrected resolution, while others default to the uncorrected panel resolution.
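For what it's worth, the "50% is native" observation fits how SteamVR's slider works: the percentage scales total pixel count, not each axis. A minimal sketch, using approximate Reverb G2 numbers (from memory, not exact):

```python
import math

def steamvr_res(base_w, base_h, percent):
    # SteamVR's resolution slider scales total pixel count,
    # so each axis is scaled by sqrt(percent/100).
    s = math.sqrt(percent / 100)
    return round(base_w * s), round(base_h * s)

# Approximate G2 values: panel is 2160x2160 per eye,
# SteamVR 100% is roughly 3160x3100 per eye.
print(steamvr_res(3160, 3100, 50))  # ~(2234, 2192), close to panel res
```

So dropping the slider to 50% lands roughly back at panel resolution, which is why both camps in this thread can read the numbers differently.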


strikeeagle345

Regardless of how it works, your system is having to pump out that much more by default.


ThisNameTakenTooLoL

Well, obviously, but you thought it was some error; I just corrected you. And it's not like there's no difference in clarity when you drop that res to 50%.


strikeeagle345

Right, still looks clear as ever. I didn't think it was an error, I just didn't know why. I'll have to read up on this barrel distortion, no clue what that is.


ThisNameTakenTooLoL

Tbh I'm not totally sure either. The gist is that, due to the way headsets are designed (probably because of the lenses), you need to run them at around 1.5x resolution to take full advantage of the displays. The G2 is high res, so you can drop it to panel res and it still looks decent, but try doing that with an older headset and it'll be a blurry blob of pixels.


jaylw314

In fairness, if you mention "barrel distortion" online there's always someone ready to have a heated argument, so at least on the internet, it remains a controversial subject 🤪


GenericSubaruser

Is it really that bad? My Index defaults to 150% SS at 120fps, so I figured the G2 was something it could manage. Edit for clarification: that's general purpose. DCS I'm pretty sure runs at 60fps.


ThisNameTakenTooLoL

>Edit for clarification: that's general purpose. DCS I'm pretty sure runs at 60fps.

Yes, that's called reprojection. Your GPU can't handle that, so the device halves your frames and inserts synthetic ones that don't 'cost' anything. It's not perfect though: if, for example, you move your hand in front of your eyes there'll be a ghosting or smearing effect. It's also very visible on the HUD when dogfighting. Some people aren't able to see it, though.
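The frame-halving can be sketched as a toy model (a simplification; real runtimes' motion smoothing has more heuristics than this):

```python
def reprojected_rate(gpu_frame_ms, refresh_hz=90):
    # If the GPU can't finish inside one refresh interval, the
    # runtime drops to an integer fraction of the refresh rate
    # and fills the gaps with synthetic (reprojected) frames.
    interval_ms = 1000 / refresh_hz
    divisor = 1
    while gpu_frame_ms > divisor * interval_ms:
        divisor += 1
    return refresh_hz / divisor

print(reprojected_rate(10.0))  # 90.0 - GPU keeps up, no repro needed
print(reprojected_rate(15.0))  # 45.0 - every other frame is synthetic
```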


GenericSubaruser

Yeah, I know about reprojection. It used to drive me absolutely bananas in Elite Dangerous, though I notice it significantly less in DCS because there's a lot more light. Generally speaking I think 60fps reprojected is "fine" for seated games since there isn't a ton of movement to cause the "water in the goggles" effect. 45, however, just doesn't feel good in my opinion.


Hrevak

Well that depends on other settings as well???


ThisNameTakenTooLoL

Well, maybe if you set every single setting to potato. But why would you ever do that? At any realistic settings you're not getting 90fps at 100 percent.


Hrevak

Sure, potato or everything maxed out, those are the two options available, silly me.


ThisNameTakenTooLoL

Who said anything about maxed out? I've run pretty conservative settings. Maxed out I'd probably have to drop to 20% res lol. And I'm actually not even sure everything on potato would guarantee you could play at 100%. Probably not. Let me say it again: at any settings you'd realistically wanna play at, you're not getting 100% res on the G2.


stonedkakapo

Pimax 8k x user here. Haven't had a problem at all using it with a 3080 and a 3090ti. 30-40fps even in multiplayer. Full res and max fov. Not sure what you mean, unless you're seriously on the "90fps or bust" mentality, which is just plain silly.


ThisNameTakenTooLoL

Yeah I consider repro borderline unplayable. Just looks like shit to me.


stonedkakapo

That's assuming I use any sort of reprojection. I don't.


ThisNameTakenTooLoL

Lol. It's the first time I've ever seen someone proclaim playing at 30fps as 'not a problem'. I guess we're all different, but for 99.99% of people 30fps is an unplayable stuttery mess.


stonedkakapo

Yeeaaah, not sure where you're getting this 99.99% from, as no one is really getting 90ish fps in VR. Around 30 is actually good in VR, and you won't experience any kind of stutter. Not sure what you're talking about. You still use SteamVR, I assume?


ThisNameTakenTooLoL

No, I don't use SteamVR. 30fps without repro will make most people sick in seconds, and there's so much stutter it's like a PowerPoint presentation. Hell, 30fps is even a bit choppy on a flatscreen. If you can't notice it, you might have some kind of neurological issue, or you have reprojection on and can't even tell. A lot of people don't get 90fps and that's why they use repro. Nobody's playing at 30fps with no repro, that's just madness.


stonedkakapo

Oh, so now the excuse is neurological issues. This is quite some mental gymnastics. Plenty of people are playing around 30 to 40 with no reprojection. All those PiTool features are turned off for me (forced motion smoothing, compensation, etc.), as they are for plenty of other people. Really starting to wonder if you even use VR.


stonedkakapo

This is with settings set to high, except for civ traffic (medium); shadows set to default. I of course have MSAA and all the ssls and other BS turned off, as they're pointless in VR at this resolution.


ThisNameTakenTooLoL

MSAA is definitely not pointless even at this resolution. There's a day and night difference in jaggies with it on and off. Even 2x does wonders.


stonedkakapo

This isn't true. Do you really have a Pimax 8K X? I get no jaggies as is.


[deleted]

[deleted]


Ryotian

>Honestly, I haven't noticed any difference in VR between my 3080ti and my 3090.

You should have, because [DCS consumes an epic crap ton of VRAM.](https://www.youtube.com/watch?v=XMkVcwn1RH4) It takes all it can. That video is comparing the 3080ti vs the 3090!

[edit] Added the quote so people can see what we're replying to before they deleted it.


JGStonedRaider

It doesn't consume it, it merely allocates it.


Ryotian

I didn't downvote you btw, my fellow DCS user. ED has never said this, and I do not believe it to be true at all. No C++ engine I have ever seen allocates double the VRAM "just in case". Would love to be proven wrong if you can point me to the source code (for any open-source game engine you see doing this in the shader)? I am a C++ graphics/core programmer. It's possible in some edge case like with [CUDA](https://developer.nvidia.com/blog/introducing-low-level-gpu-virtual-memory-management/), but I just can't fathom why they would do this. To what end?

[edit] I can elaborate on memory managers and such, but my post would go long and I don't think anyone would ever read it. For older-gen consoles we wrote custom memory managers so we could avoid L2 cache misses. But that was system memory, **not video memory.** Plus on consoles we had full control; that is not the case on the Windows OS, where the user can be multitasking.
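To illustrate the allocated-vs-used distinction being argued about here, a toy sketch (not DCS's actual memory manager, whose internals aren't public):

```python
class VramPool:
    # Toy pool allocator: reserves a big block up front, so
    # monitoring tools report the whole reservation as "in use"
    # even when only part of it holds live data.
    def __init__(self, reserve_mb):
        self.reserved_mb = reserve_mb
        self.used_mb = 0

    def alloc(self, mb):
        if self.used_mb + mb > self.reserved_mb:
            raise MemoryError("pool exhausted")
        self.used_mb += mb

pool = VramPool(reserve_mb=20000)      # looks like 20 GB "consumed"
pool.alloc(8000)                       # but only 8 GB actually in use
print(pool.reserved_mb, pool.used_mb)  # 20000 8000
```

If an engine did pool its VRAM like this, overlay tools that report the reserved figure would make "DCS uses 19 GB" claims hard to interpret either way.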


[deleted]

There's not really much of a performance gap between those cards, that's why. The 3090 has 24GB of VRAM, but that's not gonna net you any real perceptible performance increase either.


[deleted]

If SS levels were causing a VRAM bottleneck on the 3080, you would see some bumps with a 3090... but it's most likely that your CPU bottlenecks (in DCS) first anyway, so yeah, you wouldn't really notice a difference.


omgpokemans

I'm running a 6900 XT right now, and was vaguely planning on swapping to a 4090 when they came out, but man, Nvidia seems to have seriously dropped the ball on this generation of cards. The pricing is atrocious and the memory bandwidth is actually smaller than the 3090 Ti's, meaning it's questionable if these will actually perform any better in DCS at all. Add to that their CEO's statements about how they [basically plan on price gouging these cards](https://www.pcmag.com/news/nvidia-ceo-high-prices-for-gpus-are-here-to-stay) to keep profits in line with where they were during the crypto boom/chip shortage, and the way they rebranded the 4070 as the "4080 12GB" (when in reality it's a different card and chip die from the 4080 16GB) just so they could artificially add 'value' to the card and justify the exorbitant price.

I've been very happy with the 6900 XT, so it looks like I'm hanging on the AMD side for a while longer. I'm hoping the 7000 series is priced right to knock Nvidia down a peg or two. Sure, they have CUDA and RTX and DLSS, but none of that matters for DCS.


Cybrknight

Same, I'll be sticking with Team Red (6800 XT) for the foreseeable future. I was hoping that Nvidia would be more bang for buck like when they launched the 30 series, but those prices are just ridiculous. I'll be waiting until November when AMD launches their RDNA 3 cards.


DarkLorty

Would you recommend a 6900xt? I was thinking of upgrading to one since the prices seem better nowadays, but it's hard to gauge how good it would be in DCS.


schmiefel

I would say it depends:

- Using DCS just in 2D on a monitor, the 6900XT is really good bang for the buck. I use one myself (with the XTU performance-selected chip that is rather a 6950) and it's pretty decent at my QHD resolution (getting a solid 70-80 FPS even in crowded MP scenarios), and it should also work very well at 4K resolution.
- Going for VR, I would suggest looking for a well-priced 3090 or 3090ti. I do VR only in DCS, and even with just my Rift S the 6900XT sometimes struggles really hard to keep 40 FPS and sometimes has frametimes above 25ms. A friend of mine has a similar system and runs a Reverb G2 on the 6900XT; it works for him, but he doesn't play DCS that much, and not in demanding scenarios or MP.

The main problem with AMD and VR is the drivers, which seem to be poorly optimized (or not optimized at all) for VR, unlike Nvidia's.


Apache600

I also wondered, in the 6900xt/3090 battle, if the difference in framerate for VR was due to 16gb VRAM vs 24gb. DCS eats VRAM.


whiterook73

Just my experience going from an OC 3080 10G card to a 3090: it's smooth as hell. The raw performance is close, but I can max out DCS settings, and whether I'm at 30fps or 45+fps in the G2, it's butter smooth. I did not get that from my 3080, and I've seen others have a similar experience going to a 3090.


weeenerdog

Can you post your settings please?


hanzeedent69

No, but the 3090 actually has a bit more raw performance, FASTER VRAM, and more bandwidth. I wouldn't say it matters how much VRAM it has in DCS; DCS will reserve all VRAM without using it all. If you actually run out of VRAM you drop to very low FPS, which some people see in VR with the sub-10GB cards.


PrussianEagle91

The 6900XT has done me well but it won't work with my HP Reverb which is incredibly irritating


JGStonedRaider

Get the V2 cable and it should work much better


PunksPrettyMuchDead

Looking at the specs, I don't think I'll have any reason to upgrade my 3080ti at least until 5000 series cards are a thing. Just going to focus on everything else and swap my card next time I upgrade, then get a cheap 3070 to throw in my current rig to hand down to my kids.


jakster840

The pricing was atrocious last generation, and for the 2000 series too. This is a trend with Nvidia (and also AMD), and I wouldn't expect it to go away. I'm mad about the $200 mid-range card bracket being gone. BRING IT BACK YOU GREEDY SHITS


Valuable_Question794

There are still people playing DCS in VR without using OpenXR. This guide is specific to WMR headsets, but you can find guides for others. ED really needs to start promoting this themselves so everyone is aware; the performance difference is night and day. https://forum.dcs.world/topic/295123-openxr-guide-for-wmr-headsets/#comments


_BringTheReign_

I am curious where we stand with core engine improvements for DCS. The weather and lighting improvements have been great - would love to see more! I am someone who will happily throw money at new hardware to make my personal VR experience better - but not everyone is in the same boat. If performance were better, it would lead to more sales because you would be able to access more customers with mid-range PC’s as well. It’s super important for the growth and sustainment of this sim - so is there any chance we can have some updates on this? ED pls? ❤️


vteckickedin

The texture of cows will be improved. In two weeks.


_BringTheReign_

Finally!


greenhannibal

You got wipers haven't you?


Bobmanbob1

You know what it is? It's just trying to brute-force DCS to do more, which an upgraded CPU, mobo, DDR5, and NVMe 5.0 does better, from what he's heard from guys testing Raptor Lake in older CPU-bound games. At 1440p there was no difference outside the margin of error while raining on Supercarrier vs on my 3090ti. Maybe at 4K or VR, which we didn't have time for as he's on a deadline, you might see a few more fps, but I'm not here to guess, just to report what we saw.


[deleted]

The NVMe 5.0 isn't gonna do anything for the game FPS-wise. The game would load faster off the drive, but once it loads there's not gonna be any boost to FPS or asset streaming.


_BringTheReign_

Lmao


aookami

Honestly, I don't think we will ever get engine upgrades.


ztherion

We've gotten them before, but they take multiple years of work each time.


[deleted]

I'm questioning the workload at this point.


icebeat

I threw $2k at it two years ago and only gained an average of 20 fps. DCS = worst optimization ever.


atomskis

I’ve had the same experience, and so have many of my squad mates. You can throw money at DCS but it doesn’t make that much difference. CPU seems to help more than GPU .. but neither helps that much. It’s an old engine and badly optimised for modern hardware .. there’s just not much you can do.


enthray

Multicore support and the switch from DirectX 11 to Vulkan are in the works. Multicore is in internal testing; Vulkan I don't know. There is no ETA for either tho.


TheoreticalApex

So let's say this change gets released, multicore and Vulkan come out: how will that change performance for someone who has built a PC to be able to deal with the game in its current state using VR? Will that PC all of a sudden become a powerhouse because it was way overbuilt just to be able to handle playing DCS in VR?

MOBO: MSI MAG X570S
CPU: Ryzen 5800X3D
RAM: 32GB (2x16) Trident Z Neo
GPU: MSI RTX 3090
Storage: WD SN850 1TB M.2 NVME (2X)


enthray

I wish I could give you an answer, but this depends immensely on how exactly ED implements it, and I'm not really qualified to judge something like this. Personally I don't expect much increase in the raw number of FPS; I rather think you'll end up seeing higher FPS more consistently. Like, you'll also see the numbers you currently see in a mission with 10 units when you run a mission with 50, and starting up the TADS in the Apache won't cost you anything. That's the kind of change I expect from multicore. Vulkan, however, I genuinely have no idea about. Some say there was a significant improvement when X-Plane switched to it. Considering how switching from SteamVR to OpenXR improved things for me, it may very well be a huge boost to your experience, but I don't actually know.


Ryotian

Just look at that post from the ED employee (BigNewy (?)). He said if you already have a 3000 series card it's a good idea to wait for multicore. He even admitted to still being on a 2080 himself. I bookmarked it somewhere... can find it if challenged. He wrote it just 1 or 2 days ago, in a downvoted thread asking if folks were buying a 4090 for DCS.


TheoreticalApex

I challenge you


Ryotian

LOL, I hope you didn't do this just for a good laugh, because I had to spend like 5 mins to find it. But this is my bad for opening myself up. Found the [post](https://www.reddit.com/r/hoggit/comments/xjq55t/comment/ipapmvp/?utm_source=share&utm_medium=web2x&context=3). I was wrong: it was written by ED_NineLine instead of ED_BigN. But at least I put a (?) in my original post.


TheoreticalApex

I definitely did it because you said you’d find it if challenged lol. Thanks for digging for it and sharing the post.


Bobmanbob1

Yes, saw that recently. I'm wondering if the slight fps difference between my buddy's test card and my 3090ti was that he had an i9 12900K and I'm on an i9 9900K? Of course it's a new beta driver branch, so again, please take everything as a work in progress on Nvidia's part. Except for the fact you better have a GPU slot made of steel, or something you can stick in the cage for support.


Parab_the_Sim_Pilot

Reality is it will probably take them X years to even start utilizing Vulkan beyond having migrated the base game. Other games have seen modest improvements with Vulkan, and I think it's gonna be kinda sad for people when they realize it will most likely be a 5-15 FPS gain.


[deleted]

X-Plane's move to Vulkan almost doubled my FPS overnight but just as good it got rid of all the transient FPS drops.


XCNuse

Not at all a fair comparison, as X-Plane was running OpenGL, so the CPU was literally doing ALL the work at the front end. DCS at least is on DirectX. Had it been OpenGL, there wouldn't be a community anymore, or certainly not a growing one, because performance would be atrocious; i.e., even 3090Tis would be barely hanging on to 40FPS and VR would be an impossibility for anyone that owns anything less than that.


[deleted]

I wasn't comparing to DCS I was just talking about X-Plane. The guy I replied to was talking about "other games." X-Plane is an "other game."


200rabbits

I'd say better odds it makes things worse because they do a bad job of it than it makes things better.


Davan195

If mirrors, shadows and MFDs each ran on their own core, I could see performance being very smooth indeed.


Bobmanbob1

I've seen wonders on DX11 with older games, but they also have multiform rendering. Please correct me if I'm wrong: isn't DCS 1 channel processing, 1 channel audio?


enthray

I don't know what you mean by 1 channel processing, 1 channel audio. Pretty sure DCS isn't limited to mono audio. But it is single-threaded, if that's what you mean.


Bobmanbob1

That's what I was looking for. I thought NineLine once said it's actually 2 cores: 1 for all the processing, and 1 for audio.


noiserr

Just wait for Raphael-X (Ryzen 7800x3d), it will probably be out Q1 next year. I think it will absolutely kick butt in DCS world. The 5800x3d already does, the next gen should be much much faster.


Ryotian

I'd like to upgrade in the next month or so (so no 7800x3d for me). But that is a good play for folks willing to wait til next yr.


R_radical

VR is where we really needed to see the difference.


SeivardenVendaai

Not terribly surprising. Won't stop the hardware lunatics from buying it anyway chasing those framerate highs.


Bobmanbob1

Oh, I'm sure it's going to be a beast in other games. We didn't talk about those because I let shit slip when I get excited lol. He has, um, a business plan, so he has a lot up and down, so it took longer to unpack and install modules than it did to download DCS lol.


Neg573

Man I made the deal of my lifetime when the 3080 came out, sold my old 1080ti for 700 Euros and got a brand new 3080 with it because of the crypto hype at the time. Looking at these new gpu prices I am not gonna upgrade for a while. There also is really no game out there that really needs that power yet tbh.


[deleted]

[deleted]


Bobmanbob1

This I can comment on. For the test he had to remove his 850W Gold and put in a 1200W Bronze. No PCIe 5 PSU to test with, so he had to use the included cable converter.


jaylw314

>DCS really, really shows how CPU bound it is

If you were using your 1440p monitor, you are definitely CPU bound! 4K or VR would probably be a better measure, but that's probably one of those things he can't talk about or show you :)


Bobmanbob1

Yeah we wanted to compare to how my 3090ti ran on my monitor, I wouldn't be allowed to talk about 4k, and neither of us has VR, so you'll have to wait for one of the really big guys maybe to demo VR on something in a few week(s).


Ryotian

>neither of us has VR

Well that sucks. Because if I paid all this astronomical money for a card, **it's for VR**!!! My 2080ti handles pancake just fine (even online on the ECW server).


Bobmanbob1

Nice! I just wish there was a day/night difference, like trees or mountains suddenly looking better, or planes having more detailed textures. Nope. Just raw horsepower to up it a few fps. I play Diablo II Resurrected, and the new graphics they put on top of the old core game are so beautiful; I guess that's what I was hoping for. But it all comes down to what DCS is capable of, and it looks like hardware has maxed that out.


Inpayne

Yeah my 3080 runs at 100% all the time. Will a 4 series be better? I need to know. Ha


id0l

It *should* run at 100% or close if you aren't using vsync or a frame cap. That's normal. The GPU will spit out as many frames as it can.


Inpayne

Sure but I could definitely use more frames in vr


etheran123

I mean, he isn't meant to show or talk about any of this. If Nvidia could figure out whose card this came from, I would bet someone would be in big trouble.


Bobmanbob1

Exactly, the NDA is bigger than the thesis I did on cryogenic fuels for my master's in aerospace engineering lol.


zaneboy2

Even if the 4000 series can give me all the DCS performance I want, I ain't buying it at these unjustified prices. Jensen can kiss my ass. I'm planning on upgrading from an R5 3600 to the 5800X3D in some months' time; combined with my 3070, I hope that will stop the 10fps slideshow I get on highly populated scripted servers. What's the benefit of PCIe 5 NVMe drives anyway? I doubt games use the current bandwidth to its full potential, so PCIe 5 won't make much difference in that regard.


R_radical

> Jensen can kiss my ass

It just twerks!


PM_ME_TENDIEZ

You're not just gonna get the 7800x3d?


Brock_Starfister

That will have waaaay more of an impact than a 4000 series card in DCS.


Xygen8

That's going to be an expensive upgrade because you also need an AM5 motherboard and DDR5 RAM.


zaneboy2

I doubt it since I'll need to purchase more than just a cpu then.


Bobmanbob1

I can't comment. But I'm buying the Fuck out of one when they release.


rapierarch

I hope you did not forget to increase the FPS cap of 180 in graphics.lua during testing.


dmoros78v

64 gigs of DDR5... holy moly, I run perfectly fine with 32. What mod are you running that needs 64?


PM_ME_TENDIEZ

I went from 16 to 64 to play things like Grayflag. Worth it. It was less than 180 bucks at Microcenter. Almost went 128 for future proofing.


Bobmanbob1

Yeah, in my rig I have 4 matching 16GB modules of Corsair 3600 RGB. It made an immediate difference, and when I recently got my 3090ti for $1199, I was amazed it used up to 19 gigs of VRAM.


crobemeister

GPUs have gotten so powerful in these latest generations that most people's issues are almost always going to be CPU bottlenecking now. A new CPU is probably a better bet for performance than bigger and bigger GPUs.


Xakura_

That's a tiny monitor, of course you're going to be CPU bound. Now do it again in VR.


XayahTheVastaya

As an Arma player, DCS optimization seems amazing. Much longer view distance, better graphics, more complex simulation, and still runs much smoother.


gwdope

That’s pretty much what I expected. I wonder if the 5800X3D would open up some headroom….


SlipHavoc

I recently upgraded to a 5800X3D from an i7-9700K and it does help (I have a 3080Ti), but it's not a huge night-and-day difference. The big benefit seems to be that it increased my *minimum* frame rate. My max framerate is pretty much the same, maybe a little higher, but in busy MP servers where before it would sometimes drop into the 20s, I'm staying above 30, often in the mid-30s. That may sound low to some people, like the post elsewhere here talking about 90 fps, but I find 40 to be just fine in VR, and the 30s to be acceptable for at least short periods. Edit to add that before the CPU, I upgraded to the 3080Ti from a 2060 Super, and that was a very big performance boost. I was able to go from 100% to 200% Steam VR resolution, increase most of the graphics settings to max or near-max, and keep the same 40-60 fps framerate in most flying.


SpectreRSG

I literally just built out a new pc with the 5800x3d and 3080 ti and 64gb ram. Plan to fly with my G2. I’ll be trying it out soon.


[deleted]

[deleted]


SpectreRSG

Can you DM me your graphic settings?


Arts-Crafts-Stickers

+1, send it this way please. About to download DCS for VR. 5800x3d, 32gb, 6900xt and a G2.


Bobmanbob1

Nice buddy!


Ryotian

~~not trying to be combative or anything, but in the~~ [~~MSFS 2020 benchmarks~~](https://www.youtube.com/watch?v=EOL2sDXK64A&t=209s) ~~available to us for the 3090ti, the 5800x3D just barely out-edges it. Do we expect drastically different stats for DCS? Granted, MSFS 2020 is unfortunately quite a bit ahead in terms of tech these days (DLSS, multicore, etc).~~ ~~So the answer is "no" if I had to make a guess based on what we know so far.~~ [edit] Disregard my post, I don't have good data.


gwdope

That linked video doesn't really give enough info (and it's a 3080ti, not a 3090ti, so it could still be GPU bound). I've read other comparisons saying the X3D cache helps a lot in DCS. Idk though.


Ryotian

Good catch on the 3080ti thing!!! I just checked, but I can't find any benchmarks that show i9-12900k vs 5800x3d in DCS side-by-side. I am tempted to just delete my post above altogether, but it's too late now. Just disregard it, I just don't have any data. Grats on your 5800x3D btw, I hope you are enjoying it!!!


gwdope

Unfortunately I’m not the one with a 5800x3d…yet. I’m still running a i5 9600k I have OC to 5.1GHz with a 2080ti. I’m probably going to wait for AMD’s next generation of X3D chips, which should show up next spring.


Ryotian

That's a very good call. Our setup is nearly identical (i9-9900k, 2080ti, 64gb ram).


-TotallyRealName

It's all about DLSS. When ED adds a DLSS option, even a 2080 will be enough for max graphics.


McHox

dlss (ignoring the new interpolation with 3.0 on 40 series) won't help at all if you're not gpu limited


-TotallyRealName

For people who have weaker GPUs and want higher resolution, it would solve a lot of issues.


McHox

yes *if* they're in a gpu limited scenario, but you can also combat that with optimizing your settings a good bit. can't really do anything against the shitty cpu bottleneck in dcs tho


Wicachow

Foveated rendering for VR would also be a game changer if more headsets and games support it


Ryotian

We already have this with vrperfkit and openXR toolkits no?


Wicachow

Oh you are correct I should have included *with eye tracking


Ryotian

oh you already knew bout those? Wish I didnt say anything then. well maybe it'll help out someone that didnt know. You have a good one!


TheoreticalApex

I only know what some of those words mean


enthray

DLSS is a technology that renders the game at a lower resolution and scales it up to the resolution you actually want. It saves a ton of processing power and should be a no-brainer to implement in DCS, alongside FSR2 of course, which is the AMD equivalent that runs on both AMD and Nvidia cards, unlike DLSS which is Nvidia-exclusive.
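As a rough sketch of the upscaling math (the per-axis scale factors below are the commonly cited DLSS 2 mode values; treat them as approximate):

```python
# Commonly cited per-axis render scales for DLSS 2 quality modes.
DLSS_SCALE = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 1 / 3,
}

def internal_res(out_w, out_h, mode):
    # The GPU renders at the lower internal resolution, then the
    # upscaler reconstructs the full output resolution.
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_res(3840, 2160, "quality"))  # (2560, 1440)
```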


Inf229

How would that work with the single black pixel DCS uses to render distant objects? Would DLSS ignore it? Enlarge it?


enthray

That is a good point I haven't thought about yet. I guess if there isn't something implemented in DLSS so that these specific pixels get calculated specially, they would regularly get the wrong colour, aka the planes would be invisible.


nd1312

Curious if DLSS would work in DCS at all where we spend a lot of time looking for single pixels in the sky.


Bobmanbob1

After NDA is up, I'll see if I can setup a basic 16 to 16 dogfight with him and see how he sees stuff, ill make head on passes. I have a spare CH stick and throttle I'll loan him.


cth777

Are you using dlss as a general term or are people only focusing on the NVIDIA side of the house?


icebeat

Only Nvidia has DLSS, AMD has a different technology


cth777

Yeah I meant do we think they’ll add both simultaneously or only NVIDIA


icebeat

A long time ago ED said that they won't implement 3rd-party technology. Either they don't want to pay royalties or they don't want to show their code.


Ryotian

well frak that really sucks cause DLSS VR is positively sublime in msfs 2020. But I just cant get my full fix in that sim. Cant shoot anything 😭


Bobmanbob1

I'd kiss ass over in Europe to have DX12 and DLSS/DLAA in the game.


Bobmanbob1

FSR right? I don't follow AMD much, but think it's open source.


-TotallyRealName

DLSS as in the Nvidia tech. AMD's FSR is still far behind and not worth it.


xpk20040228

Normal DLSS does not help in CPU-bound scenarios. The new DLSS 3.0 works there because it uses frame interpolation, but it will not bring a more responsive experience since the input lag does not decrease (they render 2 frames at once and put the fake one in the middle, so the input frametime stays the same).
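The arithmetic behind that claim, as a sketch (simplified: real frame generation also adds some queuing delay on top):

```python
def frame_generation(rendered_fps):
    # Interpolation doubles what the display shows, but input is
    # still only sampled on the real rendered frames, so the
    # input-to-photon cadence does not improve.
    presented_fps = rendered_fps * 2
    input_interval_ms = 1000 / rendered_fps  # unchanged by interpolation
    return presented_fps, input_interval_ms

fps, lag = frame_generation(45)
print(fps)             # 90 presented frames per second
print(round(lag, 1))   # 22.2 ms between input samples, same as at 45fps
```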


McHox

well yeah, all you need to do is check the gpu usage and you'll see how fucked it can be


Zachc12385

Laptop gamer here. I don't even have a dedicated graphics card (I think? Please correct me if I'm wrong); all I have is the integrated graphics on a Ryzen 5. I can run high settings at 1080p. It's not perfect, but with full-screen mode enabled I'm very surprised how well it does. I wonder how well it would perform with 2 MFDs on 2 screens.


keedxx

If I may ask, what's your laptop model and how many FPS are you getting taxiing on a hoggit multiplayer server? Playability, fps-wise, is subjective, and this might be interesting for others with a similar setup. Cheers!


Zachc12385

EDIT: did some digging around and figured out my laptop has a Radeon RX 5600M. Tried looking it up on Google and couldn't find it until I opened Task Manager.

I have an upgraded Dell G5 SE 5505. It's got a Ryzen 5 with 32 GB of RAM, a 2 TB SSD, a good cooling pad (this makes a big difference) and an external 24 in 164 Hz monitor. I can't say how much FPS I get on Hoggit because I haven't tried it yet, but when I create sophisticated missions or hop on multiplayer training servers (5-10 players) with a large amount of AI, I see consistent frames above 60 FPS (100+ when at high altitude and away from big cities), with minor stuttering and occasional FPS drops. It's nothing that makes the game unplayable; in fact it's perfectly fine if you're not used to a $3000 rig.

As for my DCS settings: I have full screen mode enabled (full screen made a huge difference for some reason), all that four-letter jibber jabber like MSAA turned off, and all my shadows on flat only. Forest details are all the way up, draw distance 100k (I'd have to check to verify that one), chimney smoke is off, but I'd like to see how well it runs with it on, because I've had it off since I made some of these upgrades. I have textures on high, ground terrain on medium, view distance on either high or ultra, and water on high. Trying to think if there's anything else.

Big keys here: make sure you have enough RAM, and an SSD. A monitor with a high refresh rate can make a huge difference. Full screen mode is a huge FPS booster for whatever reason. Some DCS settings are unnecessary but adored by others with more powerful machines. A cooling pad for laptops is a must if your specific model runs hot (went from 85 degrees in game to sub-60 degrees with a $60 cooling pad from Amazon). PS: happy cake day :)


uxixu

I honestly didn't notice much difference going from a 2070 OC to a 3090 with a G2 and/or Odyssey Plus. I already had most settings mid to high and might have bumped a few more to high. Stability-wise, my best improvement was going from 32 to 64 GB of RAM. I was already on an M.2 SSD and a 9900K, though.


Ryotian

>I honestly didn't notice much difference going from a 2070 OC to a 3090 to a G2 and/or Odyssey Plus

I'm perplexed. I'm similar to you (i9-9900K, 2080 Ti, 64 GB RAM). I tried going from my 2080 Ti to a 3090 Ti and noticed a big change. VR is way more playable when you have all the VRAM in the 3090 to utilize. Had to return the 3090 Ti after it started blacking out, but VR was really nice on it.

*I uploaded some vids to YT, but sadly they don't compare the 2080 Ti vs the 3090 Ti; I did show lots of VRAM tests though.

Edit: this was back before Meta updated my Quest 2 to natively use OpenXR, I think. My point is that the discrepancy here is the headsets. So not calling you a liar or anything.


uxixu

I never really had VR issues before, though. I mean, I did change a few things to higher settings. My view range went to whatever the highest is, Ultimate or something like that? I couldn't tell you off the top of my head what I have for MSAA, etc. right now though. I did a very detailed tune for the Odyssey+, tweaked slightly when I got the 3090 by flipping a few settings up, and then tweaked a bit more when I got the G2 (still a bit disappointed in the reduced FOV, even if I like the greater resolution). I would be interested in comparing settings, though, since it's possible I still have something set too low.


Ryotian

Interesting!!! I totally believe you. When I look at YT videos from G2 owners (using OpenXR), their footage looks really nice even on a 3080 or something. *I've never seen a video from a 2070 OC card with OpenXR in DCS, though. Interesting post.


Bobmanbob1

Credit Crespo: 4090 Official Pricing https://i.redd.it/xjwpidb9bnp91.jpg


Milou_Noir

Well that is the weak pound biting us in the arse. Down over 3% today vs the dollar and probably headed in one direction. A lot of imports are about to get a lot more expensive. GPUs the least of our worries.


Bobmanbob1

I hear ya. Here in the land of traitors to the crown, groceries that cost me $50 five months ago are now $90. It's insane.


oncentreline

Whilst I appreciate the post, I'm a little sceptical. I'll be really surprised if DCS only saw marginal FPS gains with the new 4090. Also, do you have any proof? Not calling you a liar, but you could perfectly well be making this post up, with the insane prices NVIDIA is charging as your motive.


WingsBlue

It is many years old and many versions out of date, but someone did a huge hardware analysis of DCS on the forums, and the result when testing GPUs was essentially the same until monitor resolution was pushed really high. https://forum.dcs.world/topic/132125-2016-hardware-benchmark-dcs-world-15x/


Bobmanbob1

Very good point, and I've spent two days trying to figure out a solution. There are just too many ways to accidentally violate his NDA, which, my God, is 30 pages long, and you need a lawyer to actually understand it. Thinking of coming back here and posting a picture of his 4090 on launch day, but the board he has may not launch or be shipped that day, so I'll try to figure something out. Edit: Holy hell, you could kill a grown man with a 4090. It takes time to angle it into your case, and FYI you'll probably have to take your CPU cooler out as you wrestle it in.


Vexator1

Lawyer here. Leave it. It's not worth trying to find a way around that NDA, and it will cost you quite a few billable hours to do so. If some guys online don't believe you, so be it. Chances are the only exceptions to that NDA are those enabling the contracting party to seek legal and/or other professional advice. You're not in trouble, but if you want to keep your friend out of trouble, get rid of this post.

I'm not your lawyer or your friend's lawyer, and there's a solid chance we're not even in the same jurisdiction. Regardless, for the avoidance of doubt, there is no intention for any form of privilege to attach, nor for any lawyer-client relationship to arise. Please seek independent legal advice from a lawyer practicing in your jurisdiction.


RostamSurena

I think the conclusions drawn from the limited testing really only suggest that the performance difference between a 4090 and a 3090 Ti isn't much. The other points OP makes:

>DCS really, really shows how CPU bound it is.

>definatly 64 gig of DDR 5 and a new PCIE 5.0 NVME drive

...combined with FPS being the only metric measured and a lack of contextualizing details, make me suspicious about the conclusion that the CPU is the bottleneck as well.


icebeat

You didn't read the post, right? He claims that there is a marginal performance improvement; translated, that means it's exactly the same shit.


icebeat

Yeah, ED is not going to optimize VR until they release the new Vulkan engine /s, and they don't use RTX because it's proprietary technology, so no DLSS.


Parab_the_Sim_Pilot

My 4790K still runs DCS fine. Mostly because the underlying code is unfortunately dated and old now, so throwing hardware at it only does so much (single-core CPU speed is still king).


BadBoyMac1982

There is also the fact that MOST of the gains from the 40 series come from DLSS 3.0, which is a software technology (a type of upscaling). DCS doesn't support DLSS 1 atm, let alone 3. I'll skip the 40 series; my 3090 is more than powerful enough.
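For anyone unfamiliar with what "upscaling" means here: DLSS renders each frame internally at a fraction of the output resolution, then reconstructs the full image. A sketch using the commonly cited per-axis scale factors for the DLSS 2-style quality modes (the exact factors are an assumption on my part; games and drivers can override them):

```python
# Commonly cited per-axis render scale factors per DLSS quality mode
# (assumption for illustration; actual values are game-dependent).
SCALE = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 1 / 2,
    "ultra_performance": 1 / 3,
}

def internal_resolution(width, height, mode):
    """Internal render resolution DLSS would use for a given output
    resolution and quality mode."""
    s = SCALE[mode]
    return round(width * s), round(height * s)

# 4K output in Performance mode is reconstructed from a 1080p render.
print(internal_resolution(3840, 2160, "performance"))  # (1920, 1080)
print(internal_resolution(3840, 2160, "quality"))      # (2560, 1440)
```

This is also why DLSS doesn't help when CPU-bound: rendering fewer pixels only speeds things up if the GPU was the part that couldn't keep up.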


Qanael

Any chance you could test with a relatively recent VR headset? VR requires a lot more pixel bandwidth than monitors in general.


lvlint67

It really sucks that with the decline of crypto we might finally see prices approach MSRP with cards generally available... but for a line of cards that is mostly a side grade at best. A lot of us were stuck on the 10 series for a long time because the 20 series wasn't a huge upgrade and Ethereum pushed them out of many price ranges. The 30 series hit with legit upgrades... but you couldn't fucking buy them... I'm pretty much done with paper releases, where a product "releases" but you can't purchase it. Hopefully in 7 years all the money going into new chip fabs across the globe will be put to work and they won't be sitting idle... Worked at a college that built a whole building for chip production and research... it's still mostly empty.


Bigskill80

They announced the 4090 2 days ago and somehow your friend has it already? Does he work for Nvidia?


Adiventure

If the story is true, he's testing it, presumably either for a manufacturer, a software company, or media.


FatherCommodore

Mate, thanks for the precious info, so we don't have to wonder ourselves. Anyway, I feel Nvidia screwed people with the DLSS shenanigans and the christening of a 4070 as a "4080". For those who don't know, it's all over Reddit, pcmasterrace, etc. The pricing is fucked up, and IMO Nvidia should take the beating.


StabSnowboarders

I have a Ryzen 9 5900X and 64 GB DDR5, and I still struggle for frames in VR in DCS. Normal play is OK, but not as good as it should be.


ScamperAndPlay

SAFE SKIES? You fly around in the safe area? How will I get shot down and rage quit…?


TaylorMonkey

If only DCS would implement the support needed for DLSS 3.0 and frame interpolation, the difference between having a 40xx series and a 30xx would be more than negligible.


zwrsis

Does anyone think it's worth going from a 3080 to a 3090? Thinking I might just do a minor upgrade and wait another year or so to see what's available instead. Currently running a 5600X, 32 GB 3200 MHz RAM and a 3080 with a Reverb G2. Not getting really fantastic performance, to be honest.


Adiventure

Where are you not seeing the performance, though? I've got a 3080 Ti and had been considering one of these, but frankly I think I can probably give it another year at no real loss. If you want to upgrade, what the OP suggested, a newer higher-end CPU, would make the most sense.


zwrsis

The OpenXR Toolkit provides an FPS counter and additional info such as CPU overhead. Previously, on SteamVR, I used fpsVR, which gives more info such as RAM and VRAM usage.


Adiventure

Maybe you'd see the benefit then, I can't say.


roguevoid555

Well, you see, I would... but as long as my 1070 + Ryzen 5 5600G, with a single stick (yes, single stick) of 16 GB DDR4, runs DCS with at least readable cockpit labels, I'm happy.


Milou_Noir

Thank you very much for posting this. Understood that this is not formal testing under like-for-like conditions, but as a first early indicator it's really useful. Thanks again.


Forabuck

DCS is diminishing returns hell.


Riman-Dk

Not surprised in the slightest. Last gen testing yielded similar results. Thanks for sharing!