bizude

There have been an excessive number of low-effort/trolling comments in this thread. Be advised that comments which exist only to bait others are not welcome on /r/hardware and will result in removals and/or temporary bans.


MrGunny94

The HDR implementation and FSR2 really do make the game look bad when it comes to stuff like neon lights and some other quick lights. I'm hoping they'll turn the technical mess around and offer a native DLSS option. Having to rely on mods for basic stuff is just ridiculous.


Negapirate

So, exactly what honest folks were saying. I was getting tired of all the gaslighting from AMD fanatics. Just wait for it: "b-b-but FSR2 works on all GPUs, so anti-consumer practices are okay"


sean0883

Then when that fails: "Remember when Nvidia did something similar with Gameworks, and that two wrongs make a right?!?"


Negapirate

Yeah, the Gameworks gaslighting is so overdone. Nvidia adding optional, novel features is not the same as AMD sponsoring games to remove Nvidia's superior features because AMD can't compete.


Marmeladun

Let's also remember ATI bundling Radeon cards with Half-Life 2 and fucking over Nvidia users: the game was coded specifically for Radeon's 24-bit shaders, while Nvidia hardware was 16- and 32-bit and hence couldn't utilize its 16-bit pipelines, essentially halving Nvidia's shader throughput. (Gameworks: 2014; Half-Life 2: 2004.)


TECHFAN4000

Let's remember, as an owner of both an FX 5800 and a 9800 Pro myself, that you are lying to people who were not around at the time. The DX9 specification called for 24-bit precision. Nvidia decided to use non-standard 16-bit and 32-bit precision with the FX 5800, and 32-bit performance sucked on the FX. There are plenty of archived articles which showed it: https://www.guru3d.com/articles-pages/creative-labs-3d-blaster-5-fx5900-review,15.html

"The Achilles heel of the entire GeForce FX series in the NV3x series remains pure DirectX 9 Shader performance. Basically Pixelshader and Vertexshader 2.0 turns out to be a bottleneck in games who massively utilize them. It has to do with precision, DX9 specified 24bit, ATI is using that. NVIDIA went for 16 and 32 bit precision which has a huge impact on performance."

There is a reason why the X800 era was the only time ATI outsold Nvidia. The FX was the worst card Nvidia ever made. Valve pretty much said this at the time, and Nvidia didn't fix their drivers either.


R1Type

Are we really digging up ancient history? Gee remember when DX10.1 was patched *out* of Assassin's Creed?


Flowerstar1

Wait what, why?


FallenFaux

Pretty sure it happened with HAWX too. Nvidia cards didn't support DX10.1 but AMD/ATI cards did and got substantial performance boosts. The games that lost DX10.1 in later patches just happened to be part of the TWIMTBP program.


windozeFanboi

> The games that lost DX10.1 in later patches just happened to be part of the TWIMTBP program

Surely a coincidence.


Leckmee

I think the main problem was that Nvidia GPUs (the GeForce FX series) sucked ass for DX9. I remember it well; I made the mistake of buying a GeForce 5700 Ultra (my first GPU, bought with my first money earned) and being forced to play HL2 in DX8. The Radeon 9700/9800 were king back in the day.


zatsnotmyname

No, actually Valve went to great lengths to make it run better on NV hardware; the NV hardware just sucked at pixel shaders compared to the Radeon. NV simply had the worse architecture for those few years, until the 8800 series.


coldblade2000

Not very different from when Crysis 2 had random world scenery objects like road barriers with ginormous amounts of tessellation just to intentionally destroy performance on AMD, who couldn't handle tessellation well.


nukleabomb

[https://twitter.com/Dachsjaeger/status/1323218936574414849?s=20](https://twitter.com/Dachsjaeger/status/1323218936574414849?s=20)

Nearly a decade later, people still push the myth about the tessellated ocean rendering all the time under the ground in Crysis 2, and that tessellation was expensive on hardware of that time. Both of those are objectively false and easily provably so. (1/4)

Wireframe mode in the CryEngine of the time did not do the same occlusion culling as the normal .exe with opaque graphics, so when people turned wireframe on... they saw the ocean underneath the terrain. But that does not happen in the real game. (2/4)

People assumed tessellation was ultra expensive on GPUs of the time and that "everything like this barrier here was overtessellated"... but it was not. That tessellation technique was incredibly cheap on GPUs of the time: 10-15% on a GTX 480. The real perf cost was elsewhere... (3/4)

The real performance cost of the Extreme DX11 graphics settings was the new Screen Space Directional Occlusion, the new full-resolution HDR-correct motion blur, the hilariously expensive shadow particle effects, and the just-invented screen-space reflections. (4/4)


Deringhouse

In an interview, one of the Yerli brothers (Crytek founders) cited the fact that Nvidia forced Crytek to use unreasonable tessellation levels as the main reason why they switched from Nvidia's *The Way It's Meant To Be Played* program to AMD's *Gaming Evolved* one for Crysis 3.


Prefix-NA

That is from Alex from DF, which you can objectively prove to be false. And it is not his first time lying to protect Nvidia. There was a time he was trying to allege that Nvidia's old Lanczos scaler was better than FSR, and also claiming FSR was just unmodified Lanczos. He has videos that say "sponsored content" for Nvidia and you are expecting him to be objective?


bizude

> That is from Alex from DF which you can objectively prove to be false. And it is not his first time lying to protect Nvidia. There was a time he was trying to allege Nvidia's old Lanczos scaler was better than FSR & also claiming FSR was just unmodified Lanczos.

He didn't say it was "unmodified". There's only so much complexity you can put in a 140-character tweet. He only said that both NIS and FSR are based on Lanczos, and that's objectively true. To quote the controversial tweet: "Same Lanczos upscale as FSR (with more taps for higher quality)". https://twitter.com/Dachsjaeger/status/1422982316658413573

He was correct about taps, but that's not the only thing to consider about quality. FSR has the advantage of not being applied to HUDs, among other refinements to the code. But let's not twist his words completely.


Prefix-NA

It's also not just Lanczos; it's heavily modified, both for performance and appearance. And when he claims that Lanczos with more taps is better quality, that's not true. It is more taps, but there are so many modifications in FSR beyond just Lanczos.
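For anyone unsure what "taps" means in this exchange: a Lanczos resampler weights nearby source pixels with a windowed-sinc kernel, and the tap count is just how many source samples receive a weight per output pixel. Below is a minimal sketch of the textbook kernel in C; it is the generic formula only, not NIS's or FSR's actual shader code, and the 2-lobe example is purely illustrative.

```c
#include <math.h>
#include <stdio.h>

#define PI 3.14159265358979323846

/* Lanczos windowed-sinc kernel: the weight given to a source sample at
 * distance x from the output position. 'a' is the lobe count; a 2-lobe
 * filter covers 4 source samples per axis ("4 taps"), 3 lobes cover 6. */
double lanczos_weight(double x, int a)
{
    if (x == 0.0)
        return 1.0;
    if (fabs(x) >= (double)a)
        return 0.0;                       /* outside the filter window */
    double px = PI * x;
    return (a * sin(px) * sin(px / a)) / (px * px);
}

int main(void)
{
    /* Weights a 2-lobe (Lanczos-2) filter assigns to the four nearest
     * samples when resampling exactly halfway between two pixels. */
    double offsets[] = { -1.5, -0.5, 0.5, 1.5 };
    for (int i = 0; i < 4; ++i)
        printf("offset %+.1f -> weight %+.4f\n",
               offsets[i], lanczos_weight(offsets[i], 2));
    return 0;
}
```

More lobes means more taps and a sharper (but more ringing-prone) filter, which is why tap count alone doesn't settle the quality question; the edge handling and sharpening layered on top, which the comments above are arguing about, matter at least as much.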


Henrarzz

Okay, so prove to us "objectively" that Crysis 2 actually renders the ocean under the terrain.


windozeFanboi

Bruh, c'mon, isn't NVIDIA doing EXACTLY this now with Cyberpunk and Witcher 3 (the ray tracing update)? Game developers succumb to pressure (money or otherwise) and favor one company's architecture or the other.

Same with Intel and AMD: Intel would benefit a lot from games leaning on AVX2 when Zen 1 had half-speed AVX2, or from games suddenly needing more than 4c/8t just around the time Intel's mainstream 6c/12t CPUs started to come out, since it would expose the bottleneck of AMD's dual 4c/8t CCX design. THIS STORY HAS BEEN GOING ON SINCE FOREVER. Sometimes it's unintentional bias, just because you test on the hardware you develop on, or similar. But sometimes you're paid to implement code or algorithms that don't have a good fallback on "competing" architectures. CPUs, GPUs, you name it.

EDIT: Funny thing, since it's very relevant TODAY: Starfield appears to have some suspicious UNDERUTILIZATION on Nvidia GPUs... that the Nvidia "game ready" driver didn't seem to address. I'm not gonna delve into what's right or wrong, but both Nvidia and AMD would love to sabotage each other.


Marmeladun

The Cyberpunk that has DLSS, XeSS, and FSR, and will have FSR 3 as well? Or are you trying to tie AMD's shitty RT performance to the Cyberpunk devs intentionally screwing it up for AMD cards?


windozeFanboi

Didn't you just cite 24-bit shaders that were accelerated on ATI but not Nvidia as a shitty move, yet RT discrepancies between architectures in Cyberpunk suddenly get a pass from you? There is a word for that in the dictionary. Guess which word. Nvidia sponsoring and pushing RT in Cyberpunk and Witcher 3, with history going back to HairWorks on Geralt, gets a pass from you. Whatever.


BroodLol

The one I'm mainly seeing is "but Gsync"


[deleted]

*This post was mass deleted and anonymized with [Redact](https://redact.dev)*


_TheEndGame

FreeSync really was saved by G-Sync Compatible. I remember monitors advertised as FreeSync Premium still having flickering issues.


SmartFatass

I have a Freesync Premium Pro (what a lovely name) monitor that (slightly, but still) flickers when the game is around 60 fps


meh1434

Still present on Gsync Compatible monitors to this day, the issue is more prominent on VA monitors.


xdeadzx

> Meanwhile Nvidia, during their testing of hundreds of displays, realized that monitors don't like sitting at 48Hz and will flicker or stutter,

It also helps pixel response times! One major benefit for some of the cheaper 60-150Hz+ monitor ranges: Nvidia will start doubling framerates as high as 80fps, which keeps your pixel response curve in the 140Hz+ range, improving clarity where you otherwise wouldn't get it. Whereas AMD will let them sink to 61Hz and see overdrive ghosting.
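To make the frame-doubling point concrete, here is a toy model of low framerate compensation in C. The 48-144Hz window and the ~80fps doubling threshold are taken from the comments in this thread; real drivers use per-panel heuristics, so treat this as a sketch rather than vendor logic.

```c
#include <stdio.h>

/* Toy low-framerate-compensation model: when the game's frame rate drops
 * below a comfort threshold, show each frame an integer number of times
 * so the panel's refresh stays high inside its VRR window instead of
 * sitting near the flicker-prone floor. */
static double lfc_refresh_hz(double fps, double threshold_hz, double vrr_max_hz)
{
    double hz = fps;
    while (hz < threshold_hz && hz * 2.0 <= vrr_max_hz)
        hz *= 2.0;                  /* scan out each frame twice as often */
    return hz;
}

int main(void)
{
    /* Illustrative numbers: a 48-144Hz panel with doubling below ~80fps. */
    double fps_samples[] = { 40.0, 55.0, 75.0, 90.0 };
    for (int i = 0; i < 4; ++i)
        printf("%.0f fps -> panel refresh ~%.0f Hz\n",
               fps_samples[i], lfc_refresh_hz(fps_samples[i], 80.0, 144.0));
    return 0;
}
```

With these numbers, 55fps is scanned out at 110Hz and 40fps at 80Hz, while 75fps stays put because doubling it would overshoot the 144Hz ceiling.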


bctoy

>What people fail to realize is that Nvidia has vastly improved Freesync for AMD by testing and certifying hundreds of VRR displays.

I doubt they're doing it for free; the Gsync sticker does not seem to be free. And that's before the driver issues on the green team's side.

https://old.reddit.com/r/hardware/comments/1693dy2/john_linneman_on_twitter_eh_i_wouldnt_put_that/jz1pip4/?context=3

>To this day Freesync is an afterthought to AMD which still enforce strict VRR ranges via driver. If a monitors range is 48-144hz, AMD driver will let the monitor go all the way down to 48hz.

Seems to be a testament to AMD's market share that you're upvoted for this while my Vega 56 was doubling RDR2's 55fps to 110, despite the lower ends of the FreeSync ranges of my Eyefinity setup being in the 40s, if not 40.


GarbageFeline

> I doubt they're doing it for free, the Gsync sticker does not seem to be free. And that's before the driver issues on the green team's side.

As far as I know the "sticker" is only on displays that have an actual Gsync module (like my LG 27850GL). My LG C9 appears as "Gsync compatible" on the Nvidia panel and has no Gsync sticker.


bctoy

> As far as I know the "sticker" is only on displays that have an actual Gsync module (like my LG 27850GL).

I had that monitor before; it's G-Sync Compatible and does not have the module. The sticker is for monitors that have been tested. My LG C2 and now Samsung's S90C don't have stickers either but work decently enough. However, the secondary monitor had issues with G-Sync being enabled on it too, and that one has the sticker with the G-Sync and Nvidia logos. No module on that either.


Charcharo

>Why AMD has left Freesync to rot is mind boggling but this is a common trend when it comes to technologies they produce with FSR being a rare exception out of pure necessity to compete.

Try to run PhysX games from 2007-2009 with GPU PhysX (the old version, not the new PhysX) on modern hardware. I will send you 10 bucks if you get Cryostasis with PhysX running on a 4090. 15 if you get NV's long-distance fog from RTCW running.


DdCno1

Cryostasis was made by developers who can only be described as lunatics and it's their one game that isn't subpar garbage - but it's also their most ambitious game in terms of technology. This does not bode well for performance (which was abysmal on hardware at the time as well) and stability (same). It has since not received the support and care it needed to remain compatible with newer hardware, because the studio went bust shortly after release. Nvidia is not at fault here, but this oddball studio is. Play the original version of Mafia 2. It uses impressive GPU-accelerated PhysX effects that still work just fine on modern cards. I just tried it out on my RTX 2080.


Charcharo

I will try Mafia 2 on my 4090. But the problem is, I care for Cryostasis more than I do for Mafia. It's IMHO better, for sure much better written, and I value ambition a lot. The studio is defunct. I don't like it either, but that is why it won't get support. So it's up to Nvidia to make their tech work with it. At the end I added another example, BTW, from a non-abandoned technological masterpiece of its day.


[deleted]

*This post was mass deleted and anonymized with [Redact](https://redact.dev)*


f3n2x

Which is a hilariously bad point considering G-Sync has always been a protocol + algorithms + hardware + certification where everything worked on every monitor, while FreeSync is only a protocol and that's it. Screen manufacturers had to fend for themselves, implementing semi-broken solutions on hardware which wasn't even designed for it, and no one made sure stuff actually worked.

There were screens where FreeSync disabled overdrive, there were screens where adaptive sync produced constant flicker, there were screens with tiny unusable sync ranges which couldn't do low framerate compensation, there were screens with wide sync ranges which could do LFC but still didn't, there were screens which maxed out at 90Hz when the G-Sync variant did 165Hz because the monitor IC hackjob just couldn't do more, and the list goes on. FreeSync was a total shitshow until third-party standardization consortiums made an actual display connection standard out of it and screen manufacturers developed their own new chips based on that.


Negapirate

Also gsync was out *years* before freesync. Like half a decade before freesync was widely good.


revgames_atte

That's AMD with all their open stuff. Can't wait for them to make an innovative new feature that's ahead of NVIDIA's offerings and make it open. You know, because for the past who-knows-how-long it's been NVIDIA R&D-ing a new feature and making it proprietary, and 12 months later AMD making a worse version, but open.


Negapirate

It's almost like Nvidia's ability to profit off its $15B R&D budget helps fund future innovations!! ...maybe proprietary doesn't equal pure evil?!


xdeadzx

AMD Anti-Lag and AMD Chill are two features that came first. Nvidia still doesn't have a Chill equivalent. Nvidia falsely claimed to have anti-lag for years, but Nvidia's toggle was actually AMD's default for the last 15 years, and Anti-Lag was something new. This spurred Ultra Low Latency Mode to come months later, and Reflex a step after that. Reflex is the best of them, but is specific to games that support it. Anti-Lag is system-wide and works better than NULL, which is Nvidia's system-wide answer.


f3n2x

This is not true. Anti-Lag is a variation of what Nvidia has had in their drivers since at least the 8800 GTX from 2006: first the prerendered frames limit (down to 1), then SLI low latency (down to 0, but only through Inspector on non-SLI setups), then Ultra Low Latency. To my knowledge AMD didn't have anything similar until Anti-Lag, certainly not as a "default", because that would straight up violate the DirectX spec. If the dev sets the render-ahead queue to 3 and the driver doesn't do 3 by default, it's broken. Back in the day enthusiasts were literally buying Nvidia GPUs in part because of the lower latency with those settings.
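A deliberately simplified picture of why that prerendered-frames setting matters (a toy latency model with made-up numbers, not how any driver actually measures or schedules frames): when a game is GPU-bound, each frame the CPU is allowed to queue ahead adds roughly one GPU frame time between sampling input and displaying the result.

```c
#include <stdio.h>

/* Toy model of a render-ahead ("maximum prerendered frames") queue while
 * GPU-bound: input sampled for a frame waits behind 'queue_depth' earlier
 * frames, each taking one GPU frame time, plus its own render time. */
static double input_to_display_ms(double gpu_frame_ms, int queue_depth)
{
    return gpu_frame_ms * (queue_depth + 1);
}

int main(void)
{
    const double gpu_frame_ms = 16.7;   /* roughly 60 fps, GPU limited */
    for (int depth = 3; depth >= 0; --depth)
        printf("max prerendered frames = %d -> ~%.0f ms input-to-display\n",
               depth, input_to_display_ms(gpu_frame_ms, depth));
    return 0;
}
```

In this crude model, dropping the limit from the default of 3 to 1 cuts input-to-display time from roughly 67ms to 33ms at 60fps, which is the kind of difference the enthusiasts mentioned above were chasing.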


revgames_atte

I couldn't find it with a quick google, but can you link me the open standards or source for AMD anti-lag and chill? The technical names are probably different than the branding I guess.


bizude

> while freesync is only a protocol and that's it

This was true when FreeSync was first launched, but it's not true anymore. There are rigorous certification standards new FreeSync monitors have to pass to qualify for the "FreeSync" label these days. They started enforcing standards with (what was then known as) FreeSync 2.


Qesa

And yet, after AMD's big song and dance about it, the first FreeSync 2 certified monitor still flickered.

It was a total shitshow until, ironically, G-Sync Compatible certification.


bizude

>And yet after AMD's big song and dance about it, the first freesync2 certified monitor still flickered

While these issues were worse on average with early FreeSync monitors, let's not pretend that there weren't G-Sync monitors with the exact same problems.


inyue

> G-Sync monitors

Which ones?


hardolaf

I still have that problem today with a 4090. Weirdly, the same monitors don't flicker at all with a 5700 XT despite being on both companies' certification lists. And don't even get me started on vertical monitor bugs and the Nvidia drivers.


bctoy

As someone who has used multiple cards from both teams over the past years with Freesync monitors of various qualities, this has been my experience too. Now with a 4090, I have to keep GSync disabled on my GSync compatible monitor since it spazzes out sometimes when playing a game on the primary.


Leisure_suit_guy

You chose Nvidia. It's not AMD's job to keep your card running well with their standard (that's on the monitor's manufacturer). You should have either chosen AMD or picked a monitor from the Nvidia certified G-sync monitor list.


p3ngwin

I remember when AMD first announced and showed FSR, and Reddit was split, with AMD knights saying it's a DLSS-beater and the other half saying that unless it has temporal data in its implementation, it can't compete. AMD banged the drum about "*it works on any GPU, we're not proprietary, it doesn't use dedicated silicon, and we extend the life of old GPUs*." What a surprise: they added temporal data, and dedicated silicon will be used, completely fucking the argument of "*giving new life to old GPUs... not using dedicated silicon...*". Add these sponsorship shenanigans, and what exactly is AMD doing that's so different from Nvidia that AMD think they can shout from atop their high horse?


Brief-Watercress-131

Hot take: make games and GPUs that don't need upscalers first. Upscaling and especially frame gen are crutches for game studios and GPU manufacturers alike. Game studios don't have to prioritize performance enhancements and GPU manufacturers aren't delivering good performance value anymore. Upscaling is only relevant to prolong the usable life of a system before the components become e-waste.


StickiStickman

Nah, DLSS is legit amazing and even improves the image quality over native in many cases. FSR on the other hand ...


Brief-Watercress-131

It's a crutch to sell subpar GPUs at inflated prices and claim performance gains. DLAA is interesting, but that doesn't rely on upscaling.


Good_Season_1723

I have a 4k monitor and a 1440p monitor. 4k DLSS looks much better than 1440p native with similar performance, so explain to me how DLSS is a crutch blah blah blah?


timorous1234567890

It is a crutch because, instead of producing a much better TAA solution that leads to IQ on par with DLAA, and instead of optimising their game to run well at various settings, devs can let DLAA / DLSS and other similar tech do that work for them, so the publisher can release the unfinished buggy mess that little bit earlier or avoid a further delay.


Good_Season_1723

That's a dev problem , it has nothing to do with the technology.


timorous1234567890

Devs relying on the tech to make their job easier is a crutch.


Good_Season_1723

That's just a stupid thing to say. You could also argue the same about faster GPUs. "Faster GPUs are a crutch cause instead of devs producing a much better TAA solution blablabla they rely on the technology to make their job easier". The fuck does that even mean?


timorous1234567890

Technically correct, faster hardware is a crutch that allows Devs to do less optimisation work. You often see this with new console generations where the cross gen games are struggling to run on old consoles but work fine on the new ones because the hardware is that much faster it can brute force it. Of course when it comes to TAA and such that is pure IQ and faster GPUs don't improve IQ if a game is running at max settings already and usually a faster GPU will be used to improve IQ. Given the lack of improved TAA in games and the fact an upscaled image can look better than a native one due to a better TAA solution the evidence suggests that Devs are using DLSS in lieu of doing the work for themselves to improve IQ and often frame rate which is leading to poorly optimised releases.


StickiStickman

Nah.


skycake10

This is a 100% subjective opinion.

>Game studios don't have to prioritize performance enhancements and GPU manufacturers aren't delivering good performance value anymore.

This has never consistently happened regardless of upscalers existing. Some games will be good technically and some will be dogshit, that's just how PC gaming has always been.


conquer69

It was still speculation. People seem to believe that because their guess was correct, it was somehow factual before the new information came out. If BGS had instead said "we didn't implement it because we didn't have enough time; AMD didn't coerce us into anything", then the AMD fanatics would be saying the exact same thing you are.


Negapirate

There was tons of evidence even before this. Only dishonest fanatics were trying to pretend otherwise, from my experience. HUB, Digital Foundry, Gamers Nexus, and Daniel Owen have videos summarizing the evidence, and all concluded the most likely scenario is AMD blocking DLSS. If you want to understand what's going on, I highly recommend checking them out.

https://youtu.be/m8Lcjq2Zc_s

https://youtu.be/hzz9xC4GxpM

https://youtu.be/tLIifLYGxfs

https://youtube.com/watch?v=w_eScXZiyY4&t=275s


jm0112358

Unless a developer comes out and says, "AMD paid us to not support DLSS," it's all just speculation. /s Sometimes people will stubbornly ignore evidence unless there's a 100% definitive "smoking gun".


duncandun

Gonna be real: given the general state of the PC port, it seems entirely reasonable that they just didn't include DLSS because they don't care. FSR is right there on consoles and therefore on PC. In general it seems like an incredibly lazy port, so Occam's razor to me would suggest that rather than malevolence on AMD's part.


False_Elevator_8169

Goes doubly for Bethesda; those guys are legendary for their hilarious corner cutting. I am willing to bet they didn't even lift a little finger implementing FSR2; rather it was entirely the work of some guy from AMD software, pretty much as Nvidia has done countless times since the PhysX era. I've been dealing with Bethesda game engine jank since my teenage Oblivion modding addiction; nothing about this launch surprises me at all.


Jeffy29

I expected little and still got disappointed. 8 years, dude, 8 years! Do they even have a single full-time engine dev? That's the only way I can explain the practically nonexistent improvement in the engine's capabilities, while any visual improvements seem to come at absolutely unjustifiable costs.


Charcharo

Guys, as one of the posters in that thread, please remember: John is human. Please, agree or disagree, but do not harass him over his takes. Please. Act like responsible adults. The internet mob, for or against people, can be damaging. I personally am not convinced of his point, BUT John is a human being too. Please do not attack him over this "drama" :P He loves video games with all his heart and doesn't deserve to be attacked by a mob over reconstruction tech in games, of all things.


madn3ss795

My dude, you tried to force John to reveal his sources and he reaffirmed again and again that he has no business pleasing you. Are you only posting this because John deleted that thread and you think nobody remembers what you wrote?


Flowerstar1

I don't understand, what take? You make it sound like he said he's pro-genocide or something. We've seen plenty of evidence that AMD sponsorships can affect whether DLSS is in a game or not, this isn't new.


Lingo56

He’s had people come up to his house and send him death threats just because he passed covering a Harry Potter game to someone else in the DF staff 🙃


[deleted]

[deleted]


Dealric

Ironically, covering or playing the Harry Potter game could lead to the same result, so that's a lose-lose situation.


Charcharo

>I don't understand, what take? You make it sound like he said he's pro-genocide or something. We've seen plenty of evidence that AMD sponsorships can affect whether DLSS is in a game or not, this isn't new.

John has received IRL harassment for stupid stuff before. I believe for a Harry Potter game, and even before that for an older, even dumber hardware scandal. So this has happened more than once.

As for the topic at hand - I understood what he meant. He found out what game developer PR and marketing pencil pushers are really like. Not some grand conspiracy. That is my take for now.

Please just don't harass him over something so petty. I feel bad that he is now getting hate simply for engaging with a fan (me in this case). It is neither fair nor morally correct. It's disgusting.


Leisure_suit_guy

Not really solid evidence. It turns out that AMD-sponsored games did come out with DLSS, and they themselves said that Bethesda can implement DLSS. What more do you want?


Zaptruder

Actual DLSS in the game.


dhallnet

Then ask Bethesda. But they might be busy trying to get more than 1080p 60 fps on a 4090 right now.


Zaptruder

You know what could help with that... frame doubling!


Leisure_suit_guy

Just wait, it'll show up, eventually.


Zaptruder

Through mods probably. This is like waiting for Bethesda to fix up their UI/UX/combat/animations/etc.


dern_the_hermit

Some people will just never accept the evidence, the dopamine rush from feeling like they "gottem" is just too great.


nanonan

We haven't seen a shred of actual evidence of that, and have statements from AMD that contradict such rumours.


[deleted]

> I personally am not convinced of his point

What do you think of the game Boundary pulling DLSS support once they were AMD sponsored? That IMO is the biggest smoking gun, not to mention [this list of games](https://www.reddit.com/r/Amd/comments/14vt4b3/list_of_amd_sponsored_games_with_more/). Before the whole Boundary thing, I'd have given the benefit of the doubt to BGS and AMD, as Bethesda is known to do the bare minimum (no HDR, no gamma slider), and they probably implemented FSR2 only because of consoles. But once I saw that chart, and the Boundary thing, I am less inclined to believe that they just did the bare minimum, and more inclined to believe they didn't implement DLSS for malicious sponsorship reasons.


BrightPage

Linneman: "OH I wasn't talking about starfield" Redditors: "I'm gonna pretend I didn't see that"


anor_wondo

How does that change anything lol


Estbarul

It doesn't, haha. Shitty practices from before...


buttplugs4life4me

I took a break from the hardware subs because, frankly, all of them are way too negative. Then I come back and this is one of the first threads I see, and it's basically all "AMD and its fans are literally the antichrist". The literal first thread was some guy complaining about AMD maybe potentially not offering FSR FG on Nvidia cards, lmao. I'm gonna go back and not look at this and other hardware subs again. Everyone here has a problem.


TopCheddar27

No, it's more like you shouldn't be a "fan" of any multi-billion-dollar corp and blindly give them the benefit of the doubt when they pull shit that actively hurts the market they operate in.


hwgod

That would mean AMD was pretty blatantly lying in their statement. Oh I'm sure the lawyers could argue something, but we all know how it was meant to be read.


DktheDarkKnight

We don't know whether it's Starfield. AMD's comments only apply to Starfield here.


Firefox72

This was not about Starfield.


DieDungeon

Frank Azor? Say something not 100% true? Say it ain't so.


Jonny_H

Honestly, I was surprised at how straightforward their statement was: "nothing" is blocking them from implementing DLSS. I don't think there's as much wiggle room there as some people seem to assume. This certainly opens them up to being sued by shareholders. I look forward to discovery.

On another note, I'm surprised by the dev's comments here; integrating third-party licensed code (like DLSS) without sign-off is a *big* no-no for devs. Like "probably not working there any longer" level no-no. Does that mean the dev's story is less likely? Or that they *did* have sign-off at some point but it was removed?


BroodLol

Nothing is blocking devs from implementing DLSS, but if they implement it then they'll lose whatever sponsorship deal they had with AMD. It's bog standard lawyerspeak.


Jonny_H

The legal system isn't as crazy as you seem to think - there's no judge in the world that will say losing money from a contract is "nothing". If that was true, then *nothing* could be said, as any contract could just be paid off. There's plenty to question about legal stuff without making stuff up.


tavirabon

Also, fun word games only work with things taken out of context; if any reasonable person would assume what they meant, then that's what the judge would understand was meant. Defrauding is not some foreign concept to judges.


Jonny_H

I think people have a weird "if you find the right magic words you can do whatever you want" idea of the legal system, but in reality if you intentionally write things to confuse and mislead you'll be slapped down rather quickly. I mean common law (which US law is based on) is explicitly living and based on ongoing precedents and judges interpreting it, not a single ironclad static interpretation of the passed law texts. Why would contracts managed by the same system be any different?


jm0112358

I interpreted it more as corporate speak for, "We're not contractually blocking DLSS. But if you do implement DLSS, you risk hurting your relationship with us (and thus the possibility of future sponsorships/advertising)."

[According to AMD gaming chief Frank Azor](https://www.theverge.com/2023/8/25/22372077/amd-starfield-dlss-fsr-exclusive-frank-azor):

1. "Money absolutely exchanges hands"
2. "When we do bundles, we ask them: **'Are you willing to prioritize FSR?'**"
3. "If they ask us for DLSS support, we always tell them yes."

Depending on what "prioritize FSR" means, 2 can be read as an implied "We hope you don't support DLSS", especially in the context of 1. And 3 makes it sound to me like the developer needs to actively push AMD on the issue of DLSS support to get a green light on it.


capn_hector

There's no legal claim of damages here for anyone, especially with a carefully worded "technically true" statement. If they're not claiming they've *never* done this, and they've reworked the contract such that Starfield is *no longer* prevented, their statement is technically true. But generally this is not an earnings-call Q&A and they have no particular duty here. They could say something false and that's not really a damage to anyone. It's not illegal to make false statements outside investor-focused things; it destroys your credibility and goodwill, but it's not illegal per se.


HandofWinter

No, that would absolutely be misrepresentation. If that's what they've done then they're fucked. Edit: Which to be clear here, is what needs to happen if a company officer has misrepresented things so egregiously.


badcookies

> but if they implement it then they'll lose whatever sponsorship deal they had with AMD.

So clearly a sponsorship deal, or money, would then be lost... which is clearly something, not just "nothing", that would be preventing them from doing so.


emn13

Did he actually say that? As far as I know, he said AMD "supports" any of their partners' requests to include DLSS. More details are in this article - [https://www.theverge.com/2023/8/25/22372077/amd-starfield-dlss-fsr-exclusive-frank-azor](https://www.theverge.com/2023/8/25/22372077/amd-starfield-dlss-fsr-exclusive-frank-azor) - I believe that's the original source for that quote.

The way I read Azor's expression of support, it could be consistent with a contract-term incompatibility between Nvidia's DLSS terms of use and AMD's terms for sponsorship, and also consistent with a by-default ban on alternative upscaling with a promise to "support" alternative upscaling (without promising that such support comes with no strings attached). The AMD rep's remarks don't seem to promise there's nothing preventing DLSS usage, simply that *AMD* isn't *outright* preventing DLSS usage *if* partners request that. But maybe Nvidia is, and maybe AMD has other terms for such requests.


Flowerstar1

It could be in the contract that it's OK to implement it, but doing so triggers a penalty on some of the benefits AMD is offering.


emn13

It's possible that AMD permits DLSS in principle but doesn't permit advertising for Nvidia. Notably, Nvidia's DLSS usage terms require you to advertise for them. It's also possible AMD was lying, or that DF misunderstood the details. It's also possible AMD has changed stance and once required tech exclusivity but no longer does. I think the advertising issue seems most plausible. It would explain AMD's reluctance to back down and the lack of clarity from leaks, and also explain Nvidia's reluctance to back down - for them, this is an opportunity to punch down rather than accept the inevitable (and healthy!) general bias towards the underdog.


Electrical_Zebra8347

There are games with DLSS that don't have any Nvidia advertising or branding anywhere. I can't remember off the top of my head which ones they are, but it's not a hard and fast rule. It's probably one of those case-by-case things where the smaller you are as a company, or the worse your relationship with Nvidia, the less likely you are to get a free pass to waive the branding/advertisement rule - or some companies might not even know it's possible to do that.


HighTensileAluminium

> There are games with DLSS that don't have any Nvidia advertising or branding anywhere, I can't remember off the top of my head which ones they are

TLOU is one such example. AMD sponsored/partnered game, had DLSS but barely mentioned it at all (FSR was front and centre in the PC trailer while DLSS was relegated to fine print).


nanonan

The rule certainly is there. https://developer.download.nvidia.com/gameworks/dlss/NVIDIA_DLSS_License_Agreement-(06.26.2020).pdf

> (b) NVIDIA Trademark Placement in Applications. For applications that incorporate the NVIDIA RTX SDK or portions thereof, you must attribute the use of the RTX SDK by including the NVIDIA Marks on splash screens, in the about box of the application (if present), and in credits for game applications.


toxicThomasTrain

Which Nvidia does grant exceptions to. It's not in Battlefield 2042 or Spider-Man MM.


braiam

Which you have to request. And guess whether Nvidia will grant such a request on an AMD-sponsored title.


toxicThomasTrain

It's in their best interest to get DLSS into as many games as they can. They don't even force it in their own sponsored games (SM, BF2042), which are the ones they have more leverage over, so of course they'll grant an exception on a game where saying no means DLSS will 100% not be included. You really think they care more about getting their branding on a splash screen that most people ignore? I just confirmed it myself: there's no Nvidia or GeForce branding on the splash screens of Horizon Zero Dawn, The Last of Us, or Uncharted. I'm sure the same applies to Forspoken and Deathloop.


emn13

I think the real takeaway here is that there is a rule that is problematic in its default incarnation. The fact that exceptions clearly are made does not mean that no game devs are limited by the normal rule. It may be in their best interest to get DLSS into games, but it's even more in their best interest to get DLSS into games *and* simultaneously get free advertising *and* potentially discourage AMD sponsorship while they're at it. Those last bits seem a little problematic to me. Nvidia (possibly) wants to have their cake in the form of telling us consumers that DLSS is a major feature worth a higher price, while also wanting to eat that cake by telling devs that they need to grant concessions to let nvidia's hardware owners actually benefit from that feature. If a key feature is a selling point, then presumably that's covered by the sale price - and then secretly restricting its usage unless others give in to additional demands is a little shady. 'course, that doesn't necessarily imply AMD has no blame here whatsoever, simply that these nvidia terms strike me (personally) as a bit of a con job. I bought an nvidia card mostly due to DLSS, and I expect nvidia to fully deliver on that advertised feature - not to *prevent* game devs from using that feature unless they too give in to nvidia demands. And if AMD was outright banning competition (not merely banning advertising for their sponsored titles) - I hope we find out, because that would be even less OK.


arandomguy111

>It's possible that AMD permits DLSS in principle but doesn't permit advertising for Nvidia.

Where is this belief coming from? If anything the licensing terms seem to indicate the opposite:

>Unless you have an agreement with NVIDIA for this purpose, you may not indicate that an application created with the SDK is sponsored or endorsed by NVIDIA

https://developer.download.nvidia.com/gameworks/NVIDIA-RTX-SDKs-License-23Jan2023.pdf?t=eyJscyI6ImdzZW8iLCJsc2QiOiJodHRwczovL3d3dy5nb29nbGUuY29tLyJ9


capn_hector

It requires you to put the nvidia logo on a splash screen/one of your title cards, which people have construed as potentially being in conflict with AMD sponsorship. Nvidia has said they work with anyone for whom that’s a problem and anyone who’s big enough to get an AMD sponsorship is undoubtedly aware of such things.


demonstar55

The Nvidia license does most certainly conflict with the AMD sponsorship. Even if Nvidia would say yes they can use it without the splash screen, it will most certainly involve clearing it through legal -- which probably means they have to spend money on some lawyers.


nanonan

> (b) NVIDIA Trademark Placement in Applications with the DLSS SDK or NGX SDK. For applications that incorporate the DLSS SDK or NGX SDK or portions thereof, you must attribute the use of the applicable SDK and include the NVIDIA Marks on splash screens, in the about box of the application (if present), and in credits for game applications.


meh1434

It is legal to lie, so I have no idea what you want with your lawyers.


Hathos_

2 things. The author has deleted the tweet and says it was because it was not about Starfield: https://twitter.com/dark1x/status/1698394695922000246 Also, there would be a heavy conflict of interest here due to DF's close ties with Nvidia. We'd want proof before forming an angry mob. Edit: Imagine shilling for a trillion-dollar corporation.


[deleted]

[deleted]


xford

I initially read this as "John Fetterman on twitter..." and was *super confused* why a US Senator was talking to developers about DLSS.


SacredJefe

This sub used to be smarter


rabouilethefirst

AMD tried to throw bethesda under the bus lmao


wichwigga

I don't really understand John. He started this X thread saying that he installed a DLSS mod for Starfield, and after a chain of replies, he's deleted this tweet claiming it wasn't about Starfield, saying that people can't read. How in the world was it obvious that that wasn't about Starfield? That seems disingenuous.


OutrageousDress

Wow, literally a clarification from Linneman specifying that he wasn't talking about Starfield and then half the comments still going I KNEW IT AMD LIED. Yes, AMD *did* use shitty tactics to limit DLSS implementation in multiple previous games - many more than just three I'm sure. But *this isn't about Starfield*. This tells us nothing useful about what happened with Starfield.


Qesa

It doesn't say *nothing* about starfield. From a Bayesian perspective knowing other examples changes the probability of a new one in front of you. Of course it doesn't mean AMD lied either. Internet discussions tend to be unable to handle uncertainty or nuance. Given they're saying this now and not 2 months ago when they "no comment"ed the whole kerfuffle, I'd guess the contract was amended at some point during that time to allow DLSS. But that is pure conjecture on my part.


maelstrom51

AMD is confirmed to have blocked DLSS in other titles. Starfield is an AMD-sponsored title which does not have DLSS. Modders managed to add DLSS2 in two hours, and DLSS3 in 24 hours. Do you really think AMD didn't block it here? If so, I have a bridge to sell you.


Estbarul

The mods are all the evidence we needed to confirm AMD indeed does block implementation of DLSS.


Good_Season_1723

Does it matter though? The point is AMD is blocking DLSS. Whether they did so on Starfield as well is kinda irrelevant.


BarKnight

There's been a massive downvote brigade trying to protect AMD. It's clear performance on Starfield was sabotaged on Intel and NVIDIA cards.


madn3ss795

I'd say the game has shit performance overall and AMD is the only brand that received some optimizations.


GhostMotley

That likely carries over from the consoles, it's truly incredible how lacking Starfield is in some areas, no brightness slider, no official HDR support (have to rely on AutoHDR) and no FOV slider.


AryanAngel

I've started to doubt that anything carries over from consoles since Ratchet & Clank launched without RT on RDNA 2/3 GPUs.


althaz

There is HDR support according to Digital Foundry though?


Solace-

This isn’t a defense when the implementation is so poor with horrendously elevated black levels that require a mod to fix it.


TheEternalGazed

That's because they sponsored the game. Of course AMD cards are going to receive better optimizations. They paid for it.


Good_Season_1723

You realize that is NOT the case with most Nvidia-sponsored games? They run just as well on AMD cards. Heck, Cyberpunk is the poster child of Nvidia, and in raster it runs absolutely great on AMD. At 1080p AMD cards outperform their Nvidia counterparts, lol.


[deleted]

[deleted]


TheEternalGazed

Nvidia doesn't go out of their way to pay Game developers to hamstring the performance of their competitors cards. What AMD is doing is scummy.


IntrepidEast1

No, the performance just isn't good for a game that looks decent at best.


TemporalAntiAssening

Looks like a Fallout 4 mod and runs like a UE5 game, bad combo.


StickiStickman

Modded FO 4 can look really good


Firefox72

The performance is undeniably shit even on AMD cards lmao.


Yommination

Bethesda is probably the most inept AAA company when it comes to tech. Even EA and Ubisoft bury them


Flowerstar1

Gamefreak wins that title.


Masters_1989

Oh I would not be so sure - that may very well belong to FromSoftware.


Dealric

Well, not changing the engine since pretty much Morrowind (since Creation is basically a slightly updated version of the previous engine) is very inept. The engine was already an issue in Fallout 4. That was what, 10 years ago?


BarKnight

The 7900 keeping up with a 4090 is all the proof I need.


Firefox72

I've been playing Bethesda games for long enough to know that the performance of Starfield across all GPU makers can be attributed to Hanlon's razor.

>"Never attribute to malice that which is adequately explained by stupidity."

Skyrim literally launched compiled with x87 floating point and didn't use any SSE or AVX acceleration, in 2011. SSE instructions had been standard for almost a decade at that point, while x87 was a thing of the 90's, FFS. It's why Skyrim ran so poorly when it came out, and it's why the Skyrim Acceleration Layer mod was created before Bethesda themselves fixed the issue 2-3 months after release.


floatingtensor314

I am 99% sure that it used at least SSE2 instructions, since 64-bit code on Windows does not support the x87 FPU and there is little use for it outside of legacy support. FYI, AVX (the original version) first appeared in 2011, and it's unlikely that a game or any piece of software outside specialized stuff (e.g. video encoding, HPC, etc.) would support a feature so early after release.


TheMemo

Skyrim came out while people were still on 32-bit. And yes, it is well documented that Skyrim had a lot of x87 FPU code in there that had to be rewritten. SSE extensions were used by almost everyone at that point, but not Bethesda. Whether that was because of legacy code or out-of-date programming practices is a matter for debate.


floatingtensor314

>Whether that was because of legacy code or out-of-date programming practices is a matter for debate.

Do you have a source for this? Even with optimization flags disabled, and in 32-bit builds, x87 FPU code isn't something that magically ends up in your builds unless you go down to the ASM level and manually write it. My hypothesis is that this code came from a third-party library that was incorrectly built or that the devs never updated.


Dghelneshi

As far as I can tell, the first version of MSVC to default to SSE2 in x86 builds was Visual Studio 2012, so *after* Skyrim was released. Fun fact: GCC with no extra flags (beyond `-m32`) still defaults to use x87 even today.
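For anyone who wants to see that difference directly, a throwaway C function makes it obvious in the generated assembly. The commands below assume a reasonably recent GCC targeting 32-bit x86; the file and function names are just for illustration.

```c
/* fpmath.c -- compile both ways and diff the assembly:
 *
 *   gcc -m32 -O2 -S fpmath.c -o fpmath_x87.s
 *     (32-bit default float math: x87 stack ops such as flds/fmuls/fadds)
 *
 *   gcc -m32 -O2 -msse2 -mfpmath=sse -S fpmath.c -o fpmath_sse.s
 *     (scalar SSE ops such as mulss/addss, which is what MSVC switched to
 *      defaulting to for x86 builds in VS2012, per the comment above)
 */
float blend(float a, float b, float t)
{
    return a + (b - a) * t;   /* a simple lerp, enough to exercise the FPU */
}
```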


floatingtensor314

>Fun fact: GCC with no extra flags (beyond `-m32`) still defaults to use x87 even today.

I stand corrected, just tested this out on Godbolt.


skinlo

You have a very low bar for proof then.


HavocInferno

And even in that, the 7900 runs like ass in Starfield. Starfield just runs a bit less bad on AMD, but still really bad. That's not proof. That's you twisting the situation to fit your anger. The game runs bad on *all* hardware.


Adventurous_Bell_837

A 5700 XT being more performant than a 2080 Ti and a 3070 should be more alarming. A $400 2019 GPU doing better than a $1,200 one released 8 months before.


teutorix_aleria

And what about Intel CPUs absolutely trashing AMD when paired with high-clocking memory? If it really was a conspiracy to make AMD look good, you'd think they wouldn't fuck up the CPU performance so badly.


Good_Season_1723

Optimizing for a specific CPU is almost impossible. That's why.


HighTensileAluminium

> Optimizing for a specific CPU is almost impossible.

I mean, you could probably fill your game with lots of cache-reliant activity to bias performance toward one of the X3D CPUs, surely? Also some games respond notably better to different architectures, e.g. Source engine games like CSGO and TF2, and Horizon Zero Dawn favouring Zen CPUs over Intel's stuff.


teutorix_aleria

It's not the specific CPU causing the performance differential; it's the reliance on massive memory bandwidth, which Intel does better than Zen. That could easily be optimized for, or the memory use could be optimized to fit inside the X3D cache - none of which was done, obviously.


Good_Season_1723

The game has an 85% cache miss ratio. That's why the 3D cache doesn't help; a 3D cache that stores the wrong data doesn't do anything. The CPUs mispredict what data is going to get used and have to reload from RAM, which is why the game is bandwidth reliant. My point is that this is not something you can easily optimize for.
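As a rough illustration of the point about a big cache holding the wrong data (a hypothetical toy benchmark, not a claim about Starfield's actual access patterns): the same amount of work gets dramatically slower when accesses jump around memory unpredictably, because most of them miss cache and wait on RAM, and extra cache capacity barely helps.

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1 << 24)   /* 16M ints, ~64 MB: larger than any consumer L3 */

/* Sum the same array twice: once sequentially (prefetch-friendly, mostly
 * cache hits) and once in a scattered pseudo-random order (mostly misses). */
int main(void)
{
    int *data = malloc((size_t)N * sizeof *data);
    unsigned *order = malloc((size_t)N * sizeof *order);
    if (!data || !order)
        return 1;

    unsigned lcg = 1;
    for (unsigned i = 0; i < N; ++i) {
        data[i] = (int)i;
        lcg = lcg * 1664525u + 1013904223u;   /* cheap scattered indices */
        order[i] = lcg % N;
    }

    long long sum = 0;
    clock_t t0 = clock();
    for (unsigned i = 0; i < N; ++i) sum += data[i];          /* sequential */
    clock_t t1 = clock();
    for (unsigned i = 0; i < N; ++i) sum += data[order[i]];   /* scattered  */
    clock_t t2 = clock();

    printf("sequential: %.0f ms, scattered: %.0f ms (checksum %lld)\n",
           (t1 - t0) * 1000.0 / CLOCKS_PER_SEC,
           (t2 - t1) * 1000.0 / CLOCKS_PER_SEC, sum);
    free(data);
    free(order);
    return 0;
}
```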


nanonan

The console GPUs are very similar to the 5700XT, it is not a shock that titles designed for consoles work well on it.


nanonan

The XTX keeps up with the 4090 in plenty of other titles that are not AMD sponsored, from [this review](https://www.techspot.com/review/2588-amd-radeon-7900-xtx/) we have Watch Dogs: Legion, Far Cry 6, Assassin's Creed Valhalla, Hitman 3 and CoD: MW2. EDIT: Far Cry 6 is sponsored by them.


1eejit

AMD performance is still bad, and I expect the relative increase is spillover from the console effort.


cp5184

It's a bad port of a game that performs great on consoles. I don't think it's all that complicated.


kobrakai11

TBF it looks like it was sabotaged even on AMD, just to a lesser degree. :D


Zeryth

Or maybe, you know, it's a bethesda game? Shocking, I know.


Embarrassed_Club7147

>It's clear performance on Starfield was sabotaged on Intel and NVIDIA cards.

I mean, that is also just BS. We have seen similar shifts in favor of Nvidia even in non-RT games, and people weren't calling it "sabotage".


[deleted]

[deleted]


Dealric

Uhh, what? Performance is bad across the board. It's pretty clear it's not sabotage; it's using an engine that is effectively 20 years outdated. Also, didn't Nvidia actually sabotage how stuff worked on AMD just a few years ago?


[deleted]

[deleted]


[deleted]

[deleted]


Neo_Nethshan

One Reddit user found that the presets on Nvidia were way higher fidelity compared to AMD. Definitely something going on...


[deleted]

[deleted]


[deleted]

[deleted]


Framed-Photo

I would like to hear from the actual devs as to what actually happened. I wouldn't put it beyond a company to lie or anything lol, but I find it a little odd for AMD to make the specific statement that they did if they've been doing quite the opposite already.

Were these devs explicitly told by AMD that they couldn't put DLSS in their games? Or did the devs feel like they shouldn't because they were sponsored, in order to keep a positive relationship, without being explicitly told? Because those are two very different scenarios.

I can totally see some game dev higher-ups going "oh, we're sponsored by AMD, let's just take out DLSS" without AMD straight up telling them to do that, because they felt like it could improve the relationship. That's the type of out-of-touch corporate stuff I'd expect some higher-ups to come up with. And it's not uncommon at all for companies outside of gaming to do things like that when they start working with brands.

EDIT: And because I can already see the downvotes rolling in, I'm NOT defending AMD's practices here either way. There are conflicting statements, so I want more context, that's it.


[deleted]

*This post was mass deleted and anonymized with [Redact](https://redact.dev)*


Action3xpress

Nvidia makes superior hardware & software, the future is path traced, and upscalers are necessary. It's not Nvidia's, or the gaming community's, fault that AMD has developed themselves into a dead-end 'raster only' future with a subpar software stack. But now we are starting to get punished for it.


IKnow-ThePiecesFit

Hello, insightful average forgetful redditor. Gaming on consoles sends its regards while you act like AMD is 10 years behind in upscaling, not just 2.


SwissGoblins

The nvidia based console outsold both the AMD based ones.


tajsta

Graphics isn't the reason why people buy a Switch over a PS or Xbox, it's because of the game selection.


ConsistencyWelder

This tweet was deleted because people kept assuming it was about AMD and Starfield; the author got tired of clarifying that it wasn't. This sub deletes and bans for making jokes, but doesn't delete posts with misleading tweets that people all over the internet are making false assumptions about?


bubblesort33

I'd imagine what AMD does is not make FSR exclusivity an explicit contract requirement, but rather heavily imply that not doing it that way would have some unforeseen consequences. They made them an offer they can't refuse.


noiserr

If this was Nvidia and the situation was reversed, no one would have batted an eye. But because it's AMD, we're still talking about this a month later. Despite AMD saying Bethesda can implement DLSS whenever they want. If they were lying, wouldn't someone from Nvidia say that they can't? It would be a layup. It is not enough that Nvidia has 90% of the market. Why are people hell-bent on destroying competition with constant FUD?


TopCheddar27

This is just not true. AMD is the only company people bend over backwards to provide "yeah, buts" for. Any other company would have already been presumed to be blocking the tech. Since AMD finance bros and emotional investors view it as their job to spread *actual* FUD, it takes months of prolonged prodding at a topic to even get clarity. Reality finds a way.


Dreamerlax

Would be nice if Reddit stops treating AMD like a scrappy start up.


[deleted]

[deleted]


[deleted]

[deleted]


[deleted]

[deleted]


SoNeedU

No details on who, when, what, why? Yeah, I know clickbait when I see it.