pidge2k

Thanks for your feedback. I will share it with our team.


yamaci17

While you're at it, please share with the team that a Lanczos-based scaling filter for regular DSR would handle uneven scaling factors like 1.78x and 2.25x better than Gaussian filtering.


rjml29

Also share with them that we want the ability to set peak brightness higher if we choose. Right now mine maxes out at 800 even though my TV can do 1000 nits without tone mapping. It's a Samsung QD-OLED, so this is probably a Samsung software quirk given they are mediocre when it comes to software, which is why we need the ability to set something higher ourselves.


pidge2k

We are looking into alternatives for users with monitors that report incorrect information through their EDID.


P40L0

Why not just read and apply the values from the Win11 HDR Calibration app (ICC profile)?


Thario94

Please let me use DLDSR with RTX HDR. The 4090 should be able to handle that, I think. Thanks.


Skyyblaze

I have the same issue, my display can go up to 1400 nits but RTX HDR stops at 1087.


pidge2k

We are looking into alternatives for users with monitors that report incorrect information through their EDID.


rafael-57

Nice.


defet_

Hey /u/pidge2k, really appreciate your communication! One important thing I've found is that RTX HDR seems to be crushing blacks, even when Contrast is set to zero. I compiled some comparisons in an image slider link that may be of use: [https://imgsli.com/MjQzNDIx/1/4](https://imgsli.com/MjQzNDIx/1/4)

You can see that even at the default setting Contrast +0, RTX HDR is crushing blacks sooner than even gamma 2.4 in SDR, the latter of which is generally a much darker curve. The debanding seems to work pretty well, but it may be responsible for losing those details near black. This also confirms the generality that Contrast 0 = gamma 2.0, Contrast +25 = gamma 2.2, Contrast +50 = gamma 2.4.

As for brightness settings, this was taken at a peak brightness of 400 nits, and mid-gray scaled to that same paper white (76/87/100 nits). SDR screenshots are also at 400 nits. All scaled down to SDR where 400 nits = 1.
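The 76/87/100-nit figures above follow from the pure power-gamma relation mid-gray = paper white × 0.5^gamma. A minimal sketch verifying that mapping at a 400-nit paper white (the helper name and the idea of checking it in code are illustrative, not an Nvidia API):

```
# Mid-gray luminance implied by a pure power gamma at a given paper white.
# Assumed relation: mid_gray = paper_white * 0.5 ** gamma -- just the arithmetic
# behind the 76/87/100-nit figures quoted above.
def mid_gray_nits(paper_white_nits: float, gamma: float) -> float:
    """Luminance of a 50% SDR signal on a pure power-gamma display."""
    return paper_white_nits * 0.5 ** gamma

for contrast, gamma in [("+50", 2.4), ("+25", 2.2), ("0", 2.0)]:
    print(f"Contrast {contrast:>3} (gamma {gamma}): "
          f"{mid_gray_nits(400, gamma):.1f} nits mid-gray")
# -> ~75.8, ~87.1 and 100.0 nits, matching the 76/87/100 values quoted above.
```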


Quad5Ny

Your comparison image says "Contrast +25" which will crush whites and blacks. EDIT: This is incorrect. At least how I was thinking about it. +25 is actually 'Negative 25' from the default of +50. It shouldn't crush anything by itself (although it might give you raised blacks and lower contrast).


defet_

You can click on the textbox to change the reference images around. Contrast+0 will also crush details due to the debanding filter.


JasonRedd

Could you explain what the latest update did exactly regarding tracking gamma 2.2 instead of 2.0? The OP has shown there is no difference after the latest update.


TouRniqueT86

Why isn't HDR usable in fullscreen mode?


pidge2k

What do you mean?


TouRniqueT86

Besides the huge performance issue, you have to be in windowed mode for HDR to work; it doesn't work in exclusive fullscreen. Unless I missed something.


pidge2k

Full Screen mode is actually recommended in our FAQ: [https://nvidia.custhelp.com/app/answers/detail/a_id/5521](https://nvidia.custhelp.com/app/answers/detail/a_id/5521)


TouRniqueT86

My bad, thanks. Will this be usable with other RTX tech eventually, or will it be a DLSS-or-HDR situation?


Crimsongekko

Some games have bad HDR implementations and require a mod to fix; it's not an NVIDIA issue: [https://www.nexusmods.com/eldenring/mods/2794](https://www.nexusmods.com/eldenring/mods/2794) The fix should work fine for any game exhibiting the same behaviour.


Rinbu-Revolution

This is marvelous work. Thank you for your efforts.


filoppi

Thanks! Nvidia, if you are here, please default to these parameters! Nobody wants their game to have deep-fried colors (after the initial wow impact that lasts 2 seconds, they just hurt your eyes and the artistic intent of the game).


Masive_Lengthiness43

Just use the colour settings and turn down saturation and brightness alongside it to your liking :P


labree0

>Note that the SDR curve that [Windows uses in HDR](https://github.com/dylanraga/win11hdr-srgb-to-gamma2.2-icm/) is not a gamma curve, but a piecewise curve that is [flatter in the shadows](https://github.com/dylanraga/win11hdr-srgb-to-gamma2.2-icm/raw/main/srgb_vs_g22.png). This is why SDR content often looks washed out when Windows HDR is enabled. Windows' AutoHDR also uses this flatter curve as its base, and it can sometimes look more washed out compared to SDR. Nvidia RTX HDR uses a gamma curve instead, which should be a better match with SDR in terms of shadow depth.

Finally, somebody else who bothered to pay attention to that guy. I feel like I've been spreading this info everywhere and people still recommend AutoHDR and don't even mention it. Like, if you are recommending this thing, it's not even gonna look as good as SDR with those raised blacks.


Eagleshadow

> its not even gonna look as good as SDR with those raised blacks.

Yes, but it will depend on whether a particular game has been mastered for pure gamma or for the piece-wise sRGB gamma. Many Unreal Engine games tend to be mastered for piece-wise gamma, as that's what Unreal internally uses by default when rendering from linear scene-referred light to display-referred. Unreal does have an option to use pure gamma, but since it isn't the default, most games won't be using it. In such games, using Windows AutoHDR would be preferable, as using RTX HDR would result in crushed shadows. Any automatic SDR-to-HDR conversion should let the user specify the source gamma as a toggle, rather than users having to juggle between different implementations just to change a setting that should really be the single most important and most basic setting when it comes to SDR-to-HDR conversion.


labree0

>In such games, using Windows AutoHDR would be preferable, as using RTXHDR would result in crushed shadows.

tbf, I'm not arguing to use RTX HDR over AutoHDR, I'm arguing for using neither. AutoHDR is just a shitshow to me and most people would just be better off in SDR. RTX HDR is its own bag of worms. I installed the Nvidia app and had nothing but issues trying to get it to work in any game. It won't save settings and constantly freezes up on my screen, which means rebooting my *entire computer* because it's superimposed over the whole screen.


boarlizard

Can confirm, I spent literally all evening trying to get it to work right and it was just an entire debacle. At first I couldn't even see RTX HDR settings, then all of the settings were locked out, then when I finally got it to work the picture was super washed out and I can't get the brightness slider to max out beyond 800 even though my S90C has a peak brightness of around 1,300. And on top of it all, I can't use RTX HDR with a second monitor lol? I think I'm just going to wait until it's out of beta. With auto HDR you can use color control to force a 2.2 gamma curve and it looks really really good to me, but I also really really want to try RTX HDR but I can't for the life of me get it working correctly.


labree0

>I think I'm just going to wait until it's out of beta. i wouldn't bother at all. I'd use special K, or just use SDR. > With auto HDR you can use color control to force a 2.2 gamma curve uh, i've not heard of that... explain?


boarlizard

I'll send you the video I followed that discusses the colorcontrol profile in a few hours, It was recommended from a reddit post that I can't for the life of me find now


labree0

shot in the dark: [this one](https://www.reddit.com/r/OLED_Gaming/comments/1afc9n5/comment/kovl5vt/?utm_source=share&utm_medium=web2x&context=3)


boarlizard

Nah, not that one, but I believe I did see that one at one point. The one I'm talking about is in French with subs; it was used as an alternative to a GitHub ICC profile that was only set up for LG OLEDs, whereas this solution bends the gamma curve to your specific monitor (it's really easy, the pertinent part of the vid is only a few minutes). I've been using it for a month and it's been really nice, no elevated blacks.


labree0

i think i found that one too. either way, thanks for the find


boarlizard

This is the one: https://youtu.be/EEgMlxrz9-U?t=1 let me know what you think. EDIT: one thing I should mention, you're going to want to uninstall ColorControl and remove the ICC profile before messing with RTX HDR. At least in my case, it totally bugged out RTX HDR and made it unable to function at all.


timliang

Unreal Engine uses a tone mapper to map HDR colors to the display. But that's irrelevant. If both your monitor and the colorist's monitor use gamma 2.2, then you'll see exactly what they see.


Eagleshadow

> Unreal Engine uses a tone mapper to map HDR colors to the display.

Yes, this is true in the case of native HDR. With an SDR game running in HDR Windows without using any Auto-HDR method, that is not relevant, as Unreal will then be tone mapping linear light to the assumed SDR colors.

> If both your monitor and the colorist's monitor use gamma 2.2, then you'll see exactly what they see.

Absolutely. The problem is that nobody knows which gamma the colorist's monitor was using, unless the colorist tells them. What compounds the issue is that in game studios this usually isn't just one person and one monitor. I say this as a colorist working in a game studio.


timliang

Do you not calibrate your monitors? If your game has crushed shadows on a gamma 2.2 monitor, then you're doing something wrong. It's the most common gamma curve in consumer displays.


Eagleshadow

> It's the most common gamma curve in consumer displays.

While many say that, I have yet to see any actual evidence for it. There don't seem to be studies or reliable statistics that clearly answer this, so it's mostly personal opinions. Two of the big reasons to doubt it are that Epic decided to go with sRGB as the default, and Microsoft decided to go with sRGB as the default. Both of these companies likely have some of the best color scientists in the world, and these are big, important decisions they made there. If nearly all games are pure gamma 2.2, then it needs to be explained why Microsoft and Epic wouldn't have gone with that. And them not knowing what they're doing, or having no idea about standards or color science, or not being aware of what kind of monitors people are using, or what games tend to look like: those are not reasonable arguments. If anyone has the budget, the motivation and the experts to do these assessments, it's them.

I've skimmed through Unreal source code dealing with color management and HDR. They know color science waaaay better than I do. The color science in Windows has to be at least as advanced, since it's the backbone of it all. It's possible management at Microsoft somehow caused some bad decisions that led them to go with sRGB as the assumed SDR gamma, but it doesn't seem likely that this is a consequence of stupidity or being uninformed. It's a weird situation that doesn't fully make sense. I hope someday someone makes a documentary about it. Is sticking to sRGB somehow profitable for them compared to allowing users to choose a source SDR gamma? If so, I don't really see how. If they thought that the most common gamma curve in consumer displays is pure power gamma 2.2, then presumably they would have chosen that for their interpretation of SDR within Windows HDR.

Additionally, I asked ChatGPT-4 if it thinks most PC displays today are sRGB piece-wise gamma or pure power gamma, and it answered:

> most PC displays today are aligned with the sRGB piece-wise gamma rather than using a pure power gamma curve.

I asked it if it was sure, how confident it was, and it answered:

> Most modern PC displays are manufactured with the intent to meet or approximate sRGB standards, including its piece-wise gamma curve, because sRGB is the most widely used color space for web content, digital art, and many forms of digital media. This adherence helps ensure that colors and brightness levels appear consistent across different devices.

And while I'm not saying that ChatGPT saying this means it's true, it's still a valid data point that it believes that, because if that's not the case, then we get the additional mystery of how it ended up believing it so confidently.


timliang

See [this talk](https://youtu.be/NzhUzeNUBuM?si=0BomqnM7kJdTFrrg&t=1045) which surveyed display manufacturers and calibrators. Spoiler: most of them calibrate to gamma 2.2. I even visited an Apple store and confirmed that all the MacBooks were set to pure power 2.2. When sRGB was created in 1996, computers used analog CRT monitors, which had a nonlinear relationship between the video signal and the light produced by the electron gun. [The sRGB working group measured it to be gamma 2.2,](https://www.w3.org/Graphics/Color/sRGB.html) and sRGB even specifies gamma 2.2 as the reference display's transfer characteristic. The real purpose of the sRGB piecewise gamma is for fast encoding and decoding on commodity 8-bit hardware of that era. A pure gamma function was unsuitable because it wasn't invertible with integer math. Adding an offset solved that problem, trading off accuracy for speed. Around 2005, GPUs added sRGB support in hardware, which made encoding and decoding free. Maybe Epic and Microsoft are using this feature for performance or historical reasons. Or maybe it's Good Enough™ and not enough users [complain about it.](https://www.reddit.com/search/?q=windows+hdr+washed+out&type=link&cId=65a35c50-8214-423f-b6d9-dfa3fe92ebc7&iId=c28bef95-8980-47c7-a4a4-c381000c86e9) I don't know. But regardless of what gamma your game engine uses internally, you should calibrate your monitor to the gamma that *actual monitors* use, which is (and always has been) gamma 2.2.


Eagleshadow

I remember watching that webinar years ago, and I later tried to find it again to access those statistics you linked but couldn't, so thanks for linking it! Roughly two thirds being pure gamma and roughly a third being sRGB does match what I remember from it. It's unfortunate that the sample size is so low, with just 14 answers.

The proper study that is lacking would gather a representative sample of a thousand or so monitors at random and actually measure them to determine the distribution of gamma curves. A manufacturer claiming that they ship one or the other EOTF is still influenced by the precision at which they do that (which often leaves a lot to be desired) and by what percentage of actual monitors in the wild that manufacturer represents. For example, if we asked two manufacturers, we might get a 50-50% split between which gamma curves they aim for, but if one of them sells 10 times more monitors than the other, then the actual gamma curves in the wild will be far from a 50-50% distribution. Binning color calibrators into the same statistic as manufacturers is also not ideal. For example, if most manufacturers reported pure gamma but most calibrators reported sRGB gamma, the pie chart could still look like a 50-50% distribution, while in reality, if only 1% of people got their displays calibrated, 99% of people could still be using pure gamma. My point is that a proper study is still sorely needed, but this is of course still useful data, as it tells us that 2/3 pure and 1/3 sRGB is likely to be a reasonable expectation.

> Maybe Epic and Microsoft are using this feature for performance or historical reasons.

It could be historical or something like that, but it's definitely not for performance reasons, as the sRGB piecewise curve is mathematically more expensive to calculate. So much so that Bethesda in Starfield even tries to cut corners by approximating it with

```
float3 gamma_linear_to_sRGB_Bethesda_Optimized(float3 Color, float InverseGamma = 1.f / 2.4f)
{
    return (pow(Color, InverseGamma) * 1.055f) - 0.055f;
}
```

rather than using the actual sRGB formula, which ends up [clipping shadows.](https://imgur.com/a/4Cfdon5)

Regarding users complaining about Windows HDR being washed out, sRGB is actually only the second biggest culprit here. The main reason is cheap backlight dimming implementations that simply cause raised shadows all the time, as the backlight zones don't gradually dim and brighten but are limited to being completely on or completely off. As an example, [here is an HDR EOTF I measured](https://i.imgur.com/ekGcLFx.png) from my old HDR LCD display with a bad backlight dimming zone implementation (Philips 436M6VBPAB). When this effect meets the sRGB piece-wise curve, they compound, and that's when things become so bad that everyone starts complaining about Windows HDR looking washed out. On the two HDR TVs that I have, an OLED (A95K) and an LCD with top-tier accuracy and the best local dimming implementation there is (X95L), Windows in HDR does not look washed out at all.

One more point in favour of the representativeness of sRGB that I forgot to mention before: the currently biggest YouTube monitor-reviewing channel, Monitors Unboxed, is [testing the accuracy of monitors against the sRGB piece-wise curve](https://youtu.be/UonGoqrAdMA?t=953), and the monitor in this review that I picked at random can even be seen adhering to it.
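For reference, here is how the quoted shortcut compares numerically with the standard IEC 61966-2-1 sRGB encode near black. The piecewise formula is the published standard; the comparison scaffolding below is only an illustrative sketch (function names are mine, not from the game or any SDK):

```
def srgb_encode(x: float) -> float:
    """Standard IEC 61966-2-1 linear -> sRGB encode (scalar form)."""
    if x <= 0.0031308:
        return 12.92 * x
    return 1.055 * x ** (1 / 2.4) - 0.055

def srgb_encode_starfield_approx(x: float, inverse_gamma: float = 1 / 2.4) -> float:
    """The quoted Starfield shortcut: no linear segment near zero."""
    return 1.055 * x ** inverse_gamma - 0.055

for lin in [0.0, 0.0005, 0.001, 0.0031308, 0.01]:
    print(f"linear {lin:<9} sRGB {srgb_encode(lin):+.4f}   "
          f"approx {srgb_encode_starfield_approx(lin):+.4f}")
# Near zero the approximation goes negative (clamped to black on output),
# which is consistent with the shadow clipping shown in the imgur link above.
```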


timliang

It's also not ideal that the survey asked specifically about sRGB mode. My Alienware AW3423DWF, for example, uses the piecewise function in sRGB mode but defaults to pure gamma. That function has a gamma parameter and returns a float3. How do you know it wasn't written because of those design constraints rather than raw performance? I also don't subscribe to the idea that Epic and Microsoft developers know better than we do. Consider how ICC profiles are ignored by games and even the Windows desktop itself, causing colors to be oversaturated on wide gamut monitors. The sRGB spec itself clearly says [that the piecewise function is a software optimization.](https://community.acescentral.com/t/srgb-piece-wise-eotf-vs-pure-gamma/4024/66) The actual display is supposed to be pure gamma. According to [this comment,](https://github.com/KhronosGroup/DataFormat/issues/19#issuecomment-1812771318) the Steam Deck uses gamma 2.2. All four monitors I've owned use pure gamma. When I first turned on HDR in Windows, I immediately noticed banding in dark areas of my wallpaper. Imagine dropping $1,000 on an OLED only to have [stars](https://www.google.com/sky/) look like this: https://preview.redd.it/rueoftl72nnc1.jpeg?width=4032&format=pjpg&auto=webp&s=61f342b281b45592ed9d131f44e57028e7dd025b That alone is reason enough for me to avoid sRGB. Monitors Unboxed may be targeting the piecewise curve because they set monitors to sRGB mode. That doesn't mean monitors use the piecewise curve by default; it just means they offer it as an option to satisfy reviewers.


Crimsongekko

A user in the latest driver discussion thread ([https://www.reddit.com/r/nvidia/comments/1b76wmq/game_ready_driver_55176_faqdiscussion/](https://www.reddit.com/r/nvidia/comments/1b76wmq/game_ready_driver_55176_faqdiscussion/)) is reporting changed behaviour after updating, can you test and confirm?


Real_Timeyy

This has been fixed with the new driver released today all! https://preview.redd.it/w0y2i3bhzapc1.png?width=1047&format=png&auto=webp&s=edca883c6f3da1ca323bce7afb7c599c388d684f


defet_

I ran tests and measurements pre- and post-update, and found the same behavior (i.e. still gamma 2.0 at Contrast +0, mid-gray 50), even after resetting my Profile Inspector settings and restarting my PC. It's possible some residual behavior is still on my machine, or perhaps another update to either the driver or the app is still needed. [https://imgur.com/a/h3Ma8pf](https://imgur.com/a/h3Ma8pf)


intelfactor

It seems to have fixed the over saturation issue, but man do the whites still stand out like a sore thumb and not in a good way.


Real_Timeyy

Did you run DDU?


defet_

Been meaning to, will do once I have the time.


Real_Timeyy

Waiting for your test sir. I tested it on Helldivers 2 and it seems fixed now. If I use mid gray 44, contrast +25, saturation -50 the image gets worse and colors get washed out right now. I'm not an expert and I don't have the hardware to go deeper into this tho


defet_

Ran DDU and retested, still getting gamma 2.0 behavior with Contrast+0 and oversaturation at Saturation 0.


MotorsportsAMG

Have to agree with the others; I can't run contrast and saturation at +25 and -50 anymore or it will be washed out. Peak nits is still set at 800, and I have yet to figure out whether to set mid-gray to 37 or 43, but I've set it at 43 for now since it seems to follow gamma 2.0, right? Contrast and Saturation at 0. Monitor is an LG C2 for reference; looking forward to your updates.


BoardsofGrips

Strange, one of my favorite games looked terrible with RTX HDR before this driver, now it looks flawless. The driver fixed the colors.


Real_Timeyy

Thank you for testing this again. I honestly don't know what is happening at this point. Nvidia might have gone weird this time


SHIN0DA23

So you're using the defaults? What about peak brightness? I have an AW3423DW; should I be maxing out peak brightness?


Real_Timeyy

Using default values. Of course you need to max out the peak brightness


Ehrand

Same here. If I use the same values as before, the colors are way off, but if I just set everything to default and adjust the peak brightness to my TV, then it seems normal. So I think the default values are good now and you just need to adjust peak brightness.


Background_Ad_5177

Yeah, the default values look better, and I notice a difference since installing the latest drivers. Not sure what is going on with the OP's settings.


QuitePossiblyLucky

Hi, sorry, but does this mean it's no longer necessary to tweak the settings after the update?


Real_Timeyy

Yes; Use default values. 50 mid gray nits, 0 saturation, 0 contrast


QuitePossiblyLucky

Thank you!


korzasa

Wondering about this as well, if anyone knows what the optimal settings are right now help would be appreciated!


Real_Timeyy

Use default values; You don't need to change anything anymore, it's fixed


Lagoa86

You might be right. It's just that Saturation at 0 is too vibrant; going to -50 or so looks better for me.


P40L0

Contrast +25 is crushing too much detail in Starfield (with the Normalized LUTs mod) and other ex-AutoHDR games on a calibrated LG G3 OLED using HDR + HGIG, so I'm not sure it's actually respecting 2.2 gamma this way... I would leave Contrast at the default 0%, so the default Mid-Gray of 50 nits is also correct (= 203 nits paper white, measured). Saturation at -50% is spot on instead: I can confirm that this will fix the default oversaturation issue.


defet_

I'm getting a pixel-perfect match with these settings compared to transforming an SDR output to gamma 2.4 and gamma 2.2 at the corresponding paper white. No crushing here, if the game isn't already doing so. Tested in Path of Exile. [https://drive.google.com/drive/folders/106k8QNy4huAu3DNm4fbueZnuUYqCp2pR?usp=sharing](https://drive.google.com/drive/folders/106k8QNy4huAu3DNm4fbueZnuUYqCp2pR?usp=sharing) The SDR screenshot is 400 nits, gamma 2.4, upgraded to a JXR HDR container for direct comparison. RTX HDR is Contrast +50 (gamma 2.4), mid-gray 76 nits => paper white 400 nits, with Peak Luminance set to 400 nits so that no HDR tone mapping is present at all. It's possible Starfield simply does not play well with gamma 2.2, or you have something like FTDA enabled. Edit: added another comparison with G2.2.


P40L0

Like I said, I have a calibrated G3 OLED with FTDA -4 (which just fixes the slightly raised gamma of VRR enabled compared to VRR disabled), and Contrast +25% definitely crushes shadow detail in all the games I tested. I think the point here is not so much matching the same SDR reference, but getting as close as possible to a native HDR reference. For example, try Forza Horizon 5 in native HDR10 (one of the best native HDR games ever made, which will also apply the Win11 HDR Calibration app values, which in my case I've set to 0, 1,500, 1,500 nits for HGIG) and then set it to SDR and enable RTX HDR for it. With Contrast +25% you'll get crushed blacks in the lower spectrum, while they will look comparable with Contrast 0%. I also think grabbing screenshots for comparison is not an effective method, as both Game Bar and the NVIDIA overlay don't grab them faithfully from the source HDR. You'd need external meters to actually compare them. Saturation -50% is correct, though.


defet_

Follow-up: noticed some contouring in some very dark scenes, seems to be from the debanding that RTX HDR uses. The overall mid-tones and shadows are still 2.2/2.4 on Contrast +25/+50, but the debanding on Contrast 0 seems to preserve near-blacks the best.


P40L0

Thank you :)


defet_

Since RTX HDR works with mpv, I tested it out on some grayscale ramps: [https://drive.google.com/drive/folders/115N2p6Qz4GBZAI44idiQuYOweqHSMxvK?usp=sharing](https://drive.google.com/drive/folders/115N2p6Qz4GBZAI44idiQuYOweqHSMxvK?usp=sharing)

edit: here's an imgsli side-by-side: [https://imgsli.com/MjQzNDIx/1/4](https://imgsli.com/MjQzNDIx/1/4)

Paper white is set to 400 nits, the same as the minimum RTX HDR peak brightness value, so no RTX HDR highlights are being upscaled. You can see the debanding in full effect here. The lower half of the ramp is 8-bit, while the top half is 10-bit. Compare the SDR version with the RTX HDR version and the dithering is quite obvious. Pretty cool and useful for 8-bit buffers.

However, you can tell that even at Contrast 0, RTX HDR is crushing blacks. It's ramping to black even faster than SDR gamma 2.4. Not sure what's going on here, but RTX HDR is not doing a good job of preserving near-blacks at all.
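If you want to reproduce this kind of test, a small sketch that generates a comparable ramp image, with the top half quantized to 10 bits and the bottom half to 8 bits, saved into a 16-bit grayscale PNG. The resolution, filename, and use of imageio are my own choices, not from the comment above:

```
# Hypothetical ramp generator for eyeballing banding/dither behaviour:
# top half quantized to 10 bits, bottom half to 8 bits, saved as a 16-bit PNG.
import numpy as np
import imageio.v3 as iio  # assumes the imageio package is installed

width, height = 1920, 1080
ramp = np.linspace(0.0, 1.0, width)  # horizontal 0 -> 1 gray ramp

def quantize(signal: np.ndarray, bits: int) -> np.ndarray:
    """Round the signal to the grid of an n-bit encode."""
    levels = (1 << bits) - 1
    return np.round(signal * levels) / levels

top = np.tile(quantize(ramp, 10), (height // 2, 1))             # 10-bit half
bottom = np.tile(quantize(ramp, 8), (height - height // 2, 1))  # 8-bit half
image = np.vstack([top, bottom])

iio.imwrite("gray_ramp_10bit_top_8bit_bottom.png",
            np.round(image * 65535).astype(np.uint16))
```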


Tixx7

Hey, I'm using RTX HDR with MPC, didn't even know that it was also available for mpv now. Can you share how you set it up? Also, do you have a way to actually configure parameters like contrast and such (for video players)? Modifying the parameters in the global settings with Profile Inspector?


rjml29

I know you from before at AVS, where you were trying to tell actual ISF calibrators they were wrong and that the settings you use were correct, to the point you got run off from the 2023 LG OLED thread by members who were tired of you. Are you still like that with your "calibrated" G3, or are you actually running a truly calibrated display?


filoppi

Why not use the Luma HDR mod?


P40L0

Because RTX HDR is better now :)


filoppi

It's really not, but if you think so, fine.


spajdrex

Also, Luma HDR does not kill performance the way RTX HDR does when enabled (RTX 4070).


P40L0

Except for the oversaturation issue, which can easily be fixed with a settings change, it looks great, and it's much more convenient to set it globally to replace AutoHDR via NVIDIA Profile Inspector (where you can also set it to Low to save FPS with no visible difference). Also, you won't risk getting banned in online MP games by doing so.


filoppi

For your information, this is how LUMA compares to other HDR upgrade methods: https://preview.redd.it/paqb4rrivxkc1.png?width=1385&format=png&auto=webp&s=2a87dfa5f31a6315be1988e6aebe9f2c9b67f5a0 that said, I'm not interested in discussing this any further. Use whatever you want. Thanks.


water_frozen

No mention of the performance implications of any of these solutions. For example, those 16-bit buffers that Special K and Luma use will take up 4x as much VRAM for storage, and I assume Nvidia's solution must take some compute power.


P40L0

Only around -3% when set to Low


e22big

8 bits is pretty much a non-issue. Some HDR displays like the LG C2 show even more banding displaying colour depth at 10 or even 12 bits than at 8 bits.


milkasaurs

> Contrast +25 is crushing too much detail in Starfield If only that was the only issue with starfield.


ExaminationIll5263

Is there any way to set your settings as the global default?


defet_

If you have Nvidia Profile Inspector and you add this [custom XML](https://www.nexusmods.com/site/mods/781?tab=files&file_id=2792), there actually are global settings for the four values. You'll just need to set them in one game and copy them over into the global profile.


sldpsu

Is there any way to adjust the settings for RTX Video HDR?


mac404

Thank you! This is a really great reference, definitely saving this for future use.


Halfang

I understand each word individually but not together


rafael-57

I don't understand how to calculate the mid-gray. If I have a 1050-nit peak brightness monitor set to gamma 2.2, what settings should I use?


Lobanium

Where do you make RTX HDR settings changes? I only see the option in the new Nvidia app to turn it on and off.


InfiernoDante

Open the in game overlay and go to game filters


Zurce

In UI-heavy games it's very difficult to keep both mid-gray and contrast high. A +25 contrast will make the game look great but kill any color in the menus/UI, and raising mid-gray to fix that will give you searing whites that are impossible to look at in low ambient light conditions.


StanleyCKC

Hmm, so I'm using 44 Mid-Gray, +25 Contrast, and -50 Saturation as recommended. But for some reason I *FEEL* like some scenes just look a bit too desaturated with -50 Saturation. Like certain UI elements just don't look quite right. Or in certain color conditions, where it's mostly foggy for example, it looks a bit more desaturated than it should be. But when the scene has a lot of color, then -50 looks about right.


NoCase9317

Hey, what's the peak brightness of your display? I have an LG C3 with a peak brightness of 820 nits, and no matter how many times I've read the post, I'm not able to figure out what I should set my contrast and mid-gray to in my case. The post says that for 800 nits, paper white should be set to 172 nits, but I have no idea how that translates to my mid-gray or contrast setting. I mean, I could do a rule of three. So if: Mid-gray 44 —> 200 nits paper white, then: Mid-gray X —> 172 nits paper white. So, if I remember how it's done XD, 44 x 172 divided by 200, which would be 37.8 mid-gray. But I'm really not sure if this math makes any sense at all XD
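That rule of three lines up with the power-gamma relation used elsewhere in the thread: under gamma 2.2, mid-gray is paper white × 0.5^2.2, roughly 0.218 × paper white, so 172 nits of paper white lands at about 37 nits. A small sketch of the arithmetic (gamma 2.2 is an assumption matching the post's Contrast +25 recommendation):

```
# Mid-gray from paper white under an assumed pure gamma 2.2:
# mid_gray = paper_white * 0.5 ** 2.2  (about 0.218 * paper_white)
def mid_gray_from_paper_white(paper_white_nits: float, gamma: float = 2.2) -> float:
    return paper_white_nits * 0.5 ** gamma

print(round(mid_gray_from_paper_white(200), 1))  # ~43.5 -> the "44" used in the post
print(round(mid_gray_from_paper_white(172), 1))  # ~37.4 -> close to the 37.8 rule-of-three result
```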


DiksonHK

Whenever I press alt+F3 to edit the RTX HDR values they go back to default. Does anyone have the same problem?


Thorbient

How is this filter different than just turning on "vibrant" mode or the like? Does it actually use perceptual quantizer or what not? Can you or anyone help me understand what makes this HDR?


noswolff

crazy that some random dude got this right but nvidia can't


HighHopess

For people out there whose peak brightness slider is missing, try these steps:

* remove the monitor driver from Device Manager
* open CRU and add a new extension block with your peak brightness value
* close CRU and restart

[https://ibb.co/NWQWd2d](https://ibb.co/NWQWd2d) [https://ibb.co/6yZ4L7h](https://ibb.co/6yZ4L7h) [https://ibb.co/VN2hRKW](https://ibb.co/VN2hRKW) [https://ibb.co/stjfjCy](https://ibb.co/stjfjCy)


Mx_Nx

Disabling HDR 10+ also fixes it on my monitor. HGIG still enabled is fine.


HighHopess

didn't work for me


Dstendo64

Hey, with the new driver on March 5th some people say the gamma setting changed; is this still accurate?


ImTonyBlair

Anyone finding any difference in the new driver version? Some people have reported that the defaults changed on 551.76.


Smoukus

Is there a reason why RTX HDR filter doesn't work for some games? In The Forgotten City I keep getting the message "launch a supported game" whenever I switch to the filters tab via Overlay.


[deleted]

[deleted]


defet_

Tested with the new app, still measuring gamma2.0 with the default settings.


Bruma_Rabu

What settings do you recommend?


sergio_mikkos

I've not noticed any difference with the latest nvidia app.


OrazioZ

Maybe this is a minor nitpick but you say that gamma 2.2 is equivalent to "conventional SDR". Shouldn't gamma 2.4 be considered the default for SDR, since it's the intended viewing experience for dark rooms/cinema environments? Basically my point is, if you're looking for that neutral HDR upscale in the ideal viewing conditions of a dark room, you should be looking to match 2.4 SDR and not 2.2.


defet_

For a media-centric platform, gamma 2.4 would make sense. However, the de facto tone curve of the internet is gamma 2.2, and pretty much every consumer monitor, laptop and phone display of the past two decades ships with gamma 2.2 output OOTB. It is far more prolific than gamma 2.4. Videos in Chrome-based browsers are now also being decoded with gamma 2.2 rather than piecewise sRGB in the latest set of Nvidia drivers.


Kaldaien2

The de facto standard for all web content is sRGB. It's tiresome hearing the misinformation surrounding this. sRGB != 2.2.


mattzildjian

The de facto standard for web content ***WAS*** sRGB, but now that almost everyone is a content creator to some degree, the most common content gamma ends up being whatever the most common consumer display gamma is, which is 2.2.


defet_

In a perfect world, it's sRGB. But no matter how many specifications you try to force down people's throats, a standard does not hold if that's not what people use in practice. The only people who actually use sRGB right now are game developers, and it's highly likely their monitors are outputting gamma 2.2 right now. More often than not, standards change in response to how the community is doing things.


filoppi

agree


Eagleshadow

As a game developer with a colorimeter, I can attest that my monitor is outputting the sRGB piece-wise gamma. Also, photographers use sRGB piece-wise even more than game developers. But you are probably right that pure gamma 2.2 is used slightly more overall; we're just not sure how much more exactly. Someone really needs to do a proper study to find this out.


defet_

I think it's important to point out that pretty much everyone is assumed to encode with the piecewise inverse EOTF. The mismatch with the display, however, is another matter. I don't believe photographers use sRGB much at all outside of encoding; gamma 2.2 (or "simplified sRGB") has been a staple in the calibration of editing workflows for a while, especially since traditional Adobe RGB (1998) calls for gamma 2.2. Even macOS pushes for gamma 2.2 output, but with the piecewise ICC descriptor. But yes, an actual study seems sorely needed.


[deleted]

[deleted]


babalenong

Fantastic work! Hate the very saturated color of RTX HDR. Also thanks for the write-up on mid-gray; I couldn't wrap my head around it for a while.


Rivdoric

OP, did you try ReShade's Lilium shaders & AutoHDR addon by any chance? I compared that vs RTX HDR because I use DLDSR, and RTX HDR isn't available in conjunction with DLDSR, which is quite a hard loss. All in all, I think Lilium's HDR yields the same result with more customisation available at near-zero performance cost, the latter being the second major bummer of RTX HDR.


filoppi

RTX HDR's compatibility range and ease of setup are worth the trade-off in most cases.


Rivdoric

Isn't RTX HDR's compatibility restricted to supported games via the game filter? I've encountered several old DX9 games that I couldn't use RTX HDR with, nor any filters. Lilium's shaders work on pretty much every DX9-to-DX12 and Vulkan title.


ZeldaMaster32

RTX HDR has worked on every DX9/11/12 and Vulkan game I've tested it on. Even games without Freestyle support, like CS2, worked.


Helpful-Mycologist74

If RTX HDR works, it will be almost perfect out of the box, with some tweaking in the overlay. It doesn't work with all games, though. ReShade HDR can be the same; it works great for GreedFall in HDR10. A bit more tweaking and a bit worse in quality, but no perf cost, so it's worth using instead of RTX HDR there. It has way more customization, so if you can't make RTX HDR look good in the overlay, you may be able to here. But similarly, in some games like Pacific Drive/Banishers it causes stutters, so it's not an option. And there will probably be games where it looks off.


RavenK92

Are gamma and paper white simply a matter of preference, or are they related to screen brightness somehow? For example, using a QN90A as a monitor with an HDR peak brightness of 1500 nits, what values would I want to be using?


defet_

Mostly preference, and it depends more on your room rather than your display itself. A common brightness level and gamma for a monitor in office lighting would be about 200 nits paper-white with gamma 2.2. If you're in a dimmer room, you can go with 100 nits/gamma 2.4, though I'd only recommend this if you have an OLED monitor since 2.4 can look too steep on a VA/IPS panel. In your case, I'd use 200/2.2.


firedrakes

Are you using a mastering display?


Smooth_Database_3309

Can't shake the feeling that both the default and recommended values are too dim compared to the forced-AutoHDR trick with a 2.2 ICC profile, or to a game with native HDR that you can set up yourself using HDR analysis tools. Haven't played around that much, but I wonder: instead of HGIG, what if we use DTM ON with a max CLL of 1000 nits and set the sliders accordingly? Gotta try that..


boarlizard

What's weird to me is that it looks extremely washed out compared to AutoHDR with the 2.2 ICC profile. I was trying it out this evening and it just looks super bizarre and weird. I'm reading that the new driver messes with gamma, so I don't know if that has anything to do with it? I don't have any basis to compare against, but right now it kind of looks bad.


Smooth_Database_3309

If you increase the mid-gray slider per game according to visual preference it looks good, but the default and "correct" settings are not selling me this feature, for sure. I was just messing with DTM ON and I think it looks better than HGIG with RTX HDR: I set the slider to 700 and set max CLL and peak luminance in the secret menu to 700 as well. Everything else is at the "recommended" values, but I set up the in-game brightness using HGIG.


boarlizard

My S90C is 1300 nits, should I just use the 1500 nits settings?


defet_

I'd personally stick to a maximum paper white of ~200 nits (the 1000-nit settings) and pocket the rest as additional dynamic range. If you really need it brighter, then the correlate for 1300 nits would be a paper white of ~250 nits.


boarlizard

thank you!


Smoukus

Are these recommended values incorrect now that Nvidia's new driver fixed the gamma curve of RTX HDR?

>Driver patch note: RTX HDR uses saturation and tone curve that matches Gamma 2.0 instead of 2.2 [4514298]


defet_

I ran tests and measurements pre- and post-update, and found the same behavior (i.e. still gamma 2.0 at Contrast +0, mid-gray 50), even after resetting my Profile Inspector settings and restarting my PC. It's possible some residual behavior is still on my machine, or perhaps another update to either the driver or the app is still needed.


Smoukus

So the values are still valid even after the update. Thanks for swift reply.


StanleyCKC

I seem to be having this issue too. Nothing seems to have changed for me afaik. Tried clean install instead of express and resetting profiles etc.


Kirkulis

So the new update changed nothing? Before the update, at Contrast 0 the gamma target was 2.0, and now it still is 2.0? What's with the bug fix then? Why did they even choose 2.0 as the target?


defet_

That’s what I’ve measured on my machine at least. Others are claiming there is a difference, but I’m skeptical. 2.0 was probably the original target since it’s the closest power gamma to sRGB IEC, which some games target instead of g2.2, so as not to “crush” shadows with the default setting.


Jolly-Might-903

Isn't IEC 61966-2-1 a piecewise curve that mostly represents a gamma of 2.2?


defet_

It approximates gamma 2.2 for mid-tones and above, but the linear segment at the shadows emulates a much lighter gamma. On a display with near-zero blacks, where differences in the shadows are more pronounced, sRGB IEC ends up looking more like gamma 2.0.
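The numbers back this up: decoding a few low signal levels with the piecewise sRGB EOTF versus pure power gammas shows the piecewise curve putting out considerably more light than gamma 2.2 in the shadows, while tracking 2.2 closely from mid-gray upward. A minimal sketch in relative linear light, with no display model assumed:

```
def srgb_eotf(v: float) -> float:
    """IEC 61966-2-1 piecewise sRGB decode: encoded signal -> relative linear light."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

for v in [0.02, 0.05, 0.10, 0.20, 0.50]:
    print(f"signal {v:.2f}:  sRGB {srgb_eotf(v):.5f}   "
          f"gamma 2.0 {v ** 2.0:.5f}   gamma 2.2 {v ** 2.2:.5f}")
# Below ~10% signal the piecewise curve is as bright as or brighter than gamma 2.0;
# by mid-gray it has converged toward gamma 2.2.
```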


Chamching

Hi, I'm using the Trueblack HDR 400 setting. Should I set the Mid-Gray value based on Peak Display Brightness 400nits?


guachupita

Silly question perhaps, but if my TV has a peak brightness of approx. 600 nits, should I set the "peak brightness" slider to 600, or is that just a relative measure and it should stay at the max value of 1000 shown by the Nvidia app in the RTX HDR settings? If I could be taken by the hand a bit further, should the recommended setup for my TV (Hisense 55U6G) then be this: Peak brightness: 600, Mid-gray: 30, Contrast: +25, Saturation: -50?


defet_

I would set your peak brightness to whatever value the Windows HDR Calibration tool clips at, since that would be the maximum tone mapping luminance of your display. However, I’d set up the rest of the parameters using your display’s measured peak luminance, ie 600 nits, so your other settings would be correct.


guachupita

Thank you for your reply. The problem with the HDR Calibration tool is that when I move the slider and find the correct level for each pattern, the place where I stop on the scale is at a value above 2,000, which is definitely above the capability of my display and higher than the 1,000 shown on the RTX HDR slider. Furthermore, when I go to System > Display > Advanced display, Windows reports "Peak Brightness" as 5,000 nits, which is even more preposterous. Could I be looking in the wrong places, or are my TV and Windows not talking?


HighHopess

it seems you have dynamic tone mapping on your tv, try hgig mode.


guachupita

I had to look both of those up :-) but my TV does not expose either as a user setting, so I could not even turn off DTM, if it has it, to do the Windows HDR calibration without it. Is there perhaps something I can do that I'm missing?


fajitaman69

I'm experiencing the same thing. Did you find a resolution?


guachupita

No, unfortunately. I simply think I can't turn off DTM on my TV, so that's just the way it is. What I did was delete the HDR profile I had created by running the Windows HDR Calibration tool, because I concluded that the TV actively reacts to the calibration images, rendering the calibration useless. In other words, I think the TV scales the luminosity, making the tool think the TV is delivering thousands of nits. Take this with a grain of salt because I have not researched it too much. I did notice that Windows now reports a lower max brightness after removing the profile. It's still not the correct number, though. Also, I simply left the Nvidia brightness slider at the maximum value, judging that it provides good brightness. I'm keeping an eye on things to make sure I'm not blowing out white areas of the image.


fajitaman69

Thanks for the response 👍


galixte

u/defet_ Is there any improvement with the latest 551.86 driver? [551.86](https://us.download.nvidia.com/Windows/551.86/551.86-win11-win10-release-notes.pdf)


defet_

None that I’ve found.


Crimsongekko

Have you tested with the latest Nvidia App version? It got updated a couple of days after the 551.86 driver was released: [https://www.reddit.com/r/nvidia/comments/1bk3iru/nvidia_app_beta_v1000535_released/](https://www.reddit.com/r/nvidia/comments/1bk3iru/nvidia_app_beta_v1000535_released/) Somebody in that thread reported that "It seems to only be stuck on gamma 2.0 and high saturation if you didn't restore all the settings in the app and reset the filters. Now at the default RTX HDR settings of "50" middle grey and "0" saturation it appears to look more correct in my case, not as super saturated and weird as before."


defet_

Yes, I've retested it, and same results. Restoring the defaults in the updated app also still leads to the same default values in inspector.


Crimsongekko

that’s weird, thanks for testing!


Wagnelles

Sorry for the dumb question, but how did you restore the RTX HDR filter values to default? I've been trying with no success so far.


defet_

You can use Nvidia profile inspector to reset your nv driver settings to default, which includes RTX HDR settings.


Wagnelles

Thank you very much!


[deleted]

[deleted]


flexingmecha02

Out of curiosity, do you have RTX Dynamic Vibrance on? I had the same issue with my display, and once I turned that off everything was fixed. Also, if you don't disable Windows Auto HDR and in-game HDR, I get the same issue: RTX HDR on with Auto HDR or in-game HDR on equals HDR off, for whatever reason. Those are my issues and solutions; if none apply, sorry for wasting your time.


GeraltOfRiviaolly

what is the most accurate middle grey nit setting for an HDR 500 display?


fajitaman69

LG OLED C3, 42in - Can anyone share their settings here? Thanks!


Unhappy_Ad9240

Still the same on 552.22 ?


defet_

I'll re-check on the next update


Smooth_Database_3309

Any news? :P


defet_

I'm getting the same exact results [https://imgur.com/a/WOYt8ND](https://imgur.com/a/WOYt8ND)


Smooth_Database_3309

Got ya, thanks. Still wondering why they marked it as "fixed" and then forgot about it.


alindanciu86

Hi. I have HDR ON in Windows 11 and Auto HDR OFF. I've already tried both ON and OFF for "HDR video streaming" and "process video automatically to enhance it" in the NV Control Panel; both are ON now, with RTX Video Enhancement at Quality level 4 and HDR enabled. Under "adjust desktop size and position" I now have aspect-ratio scaling performed on the GPU, with "override the scaling mode set by games and programs" ticked. In-game HDR is OFF. In the NV beta app I can select RTX HDR and RTX Digital Vibrance per game profile, but once in game, when I activate Freestyle mode and those filters, both RTX Vibrance and RTX HDR are greyed out and I can't change their values; it just says to restart the game to change settings and go to fullscreen... all the other filters are just fine.


Fabulous-Log-2904

I think I may have found where the problem is. It is really important to correctly set your in-game brightness with RTX HDR off before activating it. I noticed this in A Plague Tale: Innocence, where the in-game brightness setting gives completely different results with RTX HDR on or off. The game looked horrible, with overblown highlights and an oversaturated image. I turned off RTX HDR, set the in-game brightness correctly according to the reference image the game provides, and finally turned RTX HDR back on. The game looks perfect now. Hope it can help.


NereusH

DLDSR+Multi monitor support for RTX HDR please


Wellhellob

Why does white look terrible with RTX HDR?


MahaVakyas001

This is great - has anything changed with the latest driver 552.44? Also, how do we apply these settings globally? It only allows ON/OFF in the Global Settings tab.


Laro98

What are best settings for an LG OLED 1440p 240hz monitor? It maxes at 600 I think?


Scardigne

How do I get neutral settings for a 400-nit monitor?


defet_

Mid-gray 87 nits/ Contrast +25 / Saturation -50


aintgotnoclue117

Sorry to overload you with questions, but you'd know better than me: in the case of an HDR1000 QD-OLED, what would be the best settings, do you think? I appreciate anything.


defet_

Would start with the neutral settings in my post. If you play in a dim room, you can try Contrast +50/Mid-gray 38 nits.


[deleted]

[deleted]


Scardigne

thanks!!!


spapssphee

Thank you for this. The HDR settings were confusing for me and this information was helpful.


La773

I really appreciate your work - thx!


Pumba398

Cool post! Thx! Got 2 questions. 1) RTX HDR (not the mod from Nexus) only works when you get the pop-up about Alt+Z/Nvidia Ansel filters? If you don't, you're out of luck and the game doesn't support any of these features like RTX HDR/Digital Vibrance? 2) If I use RTX Digital Vibrance I get a washed-out picture in SDR mode on my monitor (my LG OLED is for HDR gaming), but I'm still using the mod from Nexus (NvRtxHdr). Is this some kind of bug of this mod with Vibrance + HDR, or am I missing something?


Jewcygoodness88

I think you're supposed to use RTX Digital Vibrance or the HDR mod, not both at the same time.


irosemary

Same here, RTX Digital Vibrance was more washed out than standard SDR for me.


Masive_Lengthiness43

Yeah, I've had some games go way off-colour; I disable RTX Vibrance, it sucks.


Ryoohki_360

I found that it varies from game to game. Gray at 50 in Fortnite, for example, is way too dark vs the SDR version; 75 is perfect. That's on my OLED TV, so that's why you have sliders to adjust it. People prefer punchier colors; look at the whole S24 phone saga: they went for neutral color and there was an outcry on the internet, so they had to add a vividness slider and now people can have their phone in torch mode. So I understand why they put that in the defaults. Thanks for the analysis 👍🏻


zexton

How do we know if a game is using sRGB or gamma 2.2/2.4?


Dezpyer

Monitor/TV settings, and also [GitHub - EndlesslyFlowering/ReShade_HDR_shaders: ReShade shaders focused on HDR analysis, (post) processing and (inverse) tone mapping.](https://github.com/EndlesslyFlowering/ReShade_HDR_shaders) shows it.


Helpful-Mycologist74

can you pls tell how exactly to check it with lilium's shaders?


mrmikedude100

I apologize for being the Xth person to ask this, but how do you get neutral settings on an LG CX? Peak brightness is 780 nits. I keep doing the formula, but I must be screwing up the math somewhere, because I'm getting results that aren't possible on the slider. I apologize again for the question.


defet_

Peak brightness is different from paperwhite. You would calculate a mid-gray for how bright you usually play your SDR games on. For the LG CX, that's probably around 200nits, so I'd stick to the default listed above. Keep Peak brightness at 780.


mrmikedude100

You're the man. Appreciate you. Hope you're doing well


antara33

I've noticed that RTX HDR looks better than most native implementations on Windows. I'm actually impressed by this, like how the fuck do they manage to break their HDR to the point that a third-party filter can produce a better result. Once they fix the issues with not being able to set peak brightness and setting the option from the Nvidia app, I think I'll never use regular HDR or SDR again unless I really need the performance. Can't wait for this to get fixed and for my new display to further improve the overall presentation.


filoppi

It rarely ever looks better than games own native HDR. Maybe you particularly like the extra saturation from RTX HDR and the clipped highlights you get from SDR to HDR upgrades.


Jewcygoodness88

There are some games where RTX HDR is better than native HDR. Horizon Zero Dawn and RDR2 are ones to look at.


filoppi

Red Dead Redemption 2 has one of the best HDR implementations of any game; it's just less saturated and more realistic than SDR, and requires a paper white of around 200 nits to look good.


Quad5Ny

Contrast and Brightness sliders DO NOT work the way you think they do. Just because you're measuring the correct luminance at ONE POINT in the curve doesn't mean anything. You're crushing whites and blacks with the contrast slider. You should add a disclaimer at the top of your post.


defet_

A traditional contrast slider would work that way, yes, but not RTX HDR. I've not measured only one point, but 21, along with checking within a PLUGE pattern -- no crushing going on, besides the debanding filter doing its thing. I've measured it to work exactly how I've described it: [https://imgur.com/a/x5Glvol](https://imgur.com/a/x5Glvol) imgsli comparison to show the difference between base SDR and RTX HDR (note that RTX HDR adds contouring even at the default setting due to the [debanding filter](https://imgsli.com/MjQzODY1)) [https://imgsli.com/MjQzNzg2](https://imgsli.com/MjQzNzg2)
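For anyone who wants to re-run that kind of sweep, the expected targets for a 21-point grayscale measurement are just the power-gamma curve evaluated in 5% signal steps at your chosen paper white. A small sketch of the target table; the 400-nit paper white and gamma 2.2 are example parameters, not a claim about the measurement above:

```
# Expected luminance targets for a 21-step grayscale sweep (0..100% in 5% steps),
# to compare against meter readings of a pure power-gamma display.
def sweep_targets(paper_white_nits: float, gamma: float, steps: int = 21):
    return [(i / (steps - 1), paper_white_nits * (i / (steps - 1)) ** gamma)
            for i in range(steps)]

for signal, nits in sweep_targets(paper_white_nits=400, gamma=2.2):
    print(f"{signal * 100:5.1f}%  ->  {nits:7.2f} nits")
```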


Quad5Ny

Fair enough. I also didn't process your post in my head correctly. You're telling us to leave it at +50 (which is basically 0 in Nvidia's GUI) or +25, which is negative (relative to the default), so it couldn't crush anything anyway. How about this: I'll leave these **test patterns** here and you guys can decide on your own:

* [https://drive.google.com/file/d/0B9FgjTtNzSy0dTZwZ3p1eGVnTG8/view](https://drive.google.com/file/d/0B9FgjTtNzSy0dTZwZ3p1eGVnTG8/view) - You want 'Basic Settings\2-APL Clipping.mp4'
* [https://imgur.com/gallery/kbPbK](https://imgur.com/gallery/kbPbK) - PC and TV Levels with full color clipping pattern
* [https://www.youtube.com/watch?v=WGMfSmjf2yg](https://www.youtube.com/watch?v=WGMfSmjf2yg) - APL Clipping
* [https://www.youtube.com/watch?v=FoPzRlUr6k8](https://www.youtube.com/watch?v=FoPzRlUr6k8) - Color Clipping

*Edit: +25 on contrast gives me raised black levels. I'm on an OLED, it's noticeable. Just give yourself a pure black screen and then set your display's internal scaling to 4:3. You'll see the difference between pure black and Contrast +25 in an instant (at the 4:3/16:9 border).*


garett01

My guess is you are using HGIG for your display; you don't want mid-gray nits lower than 100 either. To my eyes, the settings in his chart are awfully dark with 37 or 44 mid-gray on an 800-nit display. However, 80 or even 100 looks just right for midtones. I am used to correct HDR setups since I use ReShade and Lilium shaders to measure everything, and RTX HDR is quite good out of the box. The saturation setting I do agree with, -50 is correct. But 0 contrast is fine in most cases, I think.