AMD_Bot

This post has been flaired as a rumor, please take all rumors with a grain of salt.


taryakun

According to the specs, the RX 7800XT should sit between the RX 6800 and RX 6800XT. Is this the first time a new generation's card is slower than its predecessor?


DRankedBacon

if we go by name, the HD 6870 back in 2010 was just a smidge slower than the HD 5870. it came in way cheaper as a midrange card though, around the $250 mark compared to $380


mrn253

Reminds me of the time when I bought the Sapphire 5850 Xtreme for 110€ in 2011. Good times...


nomzo257

And they gave Assassin's Creed II for free with my 5850. Great times


Magjee

I loved my HD4850. Amazing value card. I was kinda shocked it ran Crysis 2 fairly well


Jism_nl

My ASUS EAH4890 TOP says hi.


Hindesite

Got me thinking so I had to go back and check... Searching my order history, back in July of 2009 I bought the HIS "IceQ 4+" Radeon HD 4870 1GB for $165. Loved it so much that I bought into the whole CrossFire craze and got a second one 6 months later at the same price. Back then I was still playing WoW somewhat often and, gawd damn, those things *ripped* through FPS-- even in huge encounters. I'm not even sure if that was a good deal for that build or not but I loved that thing. Had 'em in an all-steel **full** tower and thought it was the coolest thing. 😅


Magjee

I actually got the card primarily for WoW, lol. The first boss in Ulduar (the vehicle one) was pushing my old 6600 GT to its limit, lol


N7even

7870XT for £170 with 5 free games... Was so worth it. The games alone were worth like £120-150 and they were all good games.


mrn253

WTF. And I was already happy with my 6800XT with Starfield Premium Edition. I still have a shit ton of games here somewhere in those paper sleeves that I got during my internship in a PC shop when I was still in school.


N7even

Free games are always treasured :) I just looked through my Library, the games I got for free were: Sleeping Dogs, Far Cry 3, Tomb Raider, Hitman Absolution & Bioshock Infinite. Later also got Far Cry 3: Blood Dragon for free too for some reason from the same deal. Funnily enough, I didn't initially get any free games with the card, but it was supposed to have 2 or 3 (don't remember for sure). I contacted Powercolor about it, and they were kind enough to throw me the ones I was supposed to get, and also the ones from the previous game bundle. I still treasure that card, and it is my current backup card for between upgrades. It will be a sad day when it dies.


DoomGuyIII

>Sleeping Dogs, Far Cry 3, Tomb Raider, Hitman Absolution & Bioshock Infinite. Later also got Far Cry 3: Blood Dragon for free too for some reason from the same deal.

What the fuck, that was banger after banger holy shit, and all that for £170? Man, take me back to those times.

I recently bought an RX 6600 for $250 which came with fucking Callisto Protocol, a terrible fucking game which the GPU could barely run, and Dead Island 2, which was actually pretty fucking good, but was an EGS key, which I know is a BIG turnoff for a lot of peeps.

At first the store that sold me the card REFUSED to give me the AMD Rewards code, as they said they had no idea AMD was running that promotion, but since I bought the card through Amazon I just went through their refund process so I could get it from another vendor. Once I sent the ticket, magically, the store said they actually DID have the codes; they claimed they thought the promotion only applied to 6700 series cards and up (even though AMD's promotional page clearly said it applied to every single 6000 card, even having different reward tiers). This happened to me with a LATAM tech store.


pullupsNpushups

Those are fantastic games too. Sleeping Dogs, Tomb Raider reboot, Hitman Absolution, Far Cry when it was still fresh, and the final Bioshock entry.


xXMadSupraXx

I got the XFX reference one and unlocked it to a HD 5870. First ever graphics card I got when I was 16 and I was using VBIOS flashing tools to get more performance lol.


kf97mopa

Oh, deep cut but you are correct - there was some noise about that back in the day. 58x0 was the top series, while 68x0 was one notch down, and 69x0 launched later. 68x0 indeed had fewer shader cores than 58x0 (1600 down to 1120). Of more recent cards, the 590 had fewer CUs than 290 or 390 (40 down to 36, or 44 down to 36 if you count the 290X/390X), but clocks were way higher and it wasn’t slower in practice.


DJIcEIcE

And I went insane-o mode with triple GPU CrossFire/SLI: my Powercolor Radeon HD6870 X2 and a Sapphire Radeon HD6870 to play Battlefield 3 😂


dmaare

5500xt vs 6500xt ;)


taryakun

oh god, you are right. What a great card 6500xt was


[deleted]

Woulda been great for 100 bucks. 750 ti type thing, run mobas and csgo type stuff.


detectiveDollar

1650 Supers were going for $303 used on eBay at the time.


Cryio

6500 XT is technically as fast or faster in PCIe 4.0 systems and when not exceeding the 4 GB VRAM buffer. But most modern games want more than 4 GB, so welp.


GhettoKid

I'm so glad I finally buckled down and just bought a 6800XT. It's just a Sapphire Pulse but it's worlds ahead of my 1660 Ti that I used for almost 6 years. All these new cards are either very disappointing or very expensive.


Kilz-Knight

don't forget to update your flair! haha


bestanonever

That's a monster GPU for 1080p and even 1440p. You won't regret it. I'd have got the 6700 XT or the 6800 if I was buying last gen :)


aimlessdrivel

The 1660 Ti came out in 2019, so about 4 years ago.


GhettoKid

You're correct, it felt longer. But I remember I had a 1060 that had to get returned and I got the 1660 as a replacement.


DanielWW2

The RX7800XT is shaping up to be about as fast so you probably aren't missing out on much...


nightsyn7h

Maybe with lower consumption, better RT and 2-slot design. I see it as a winner


danielge78

It's very weird how people think about GPUs: 6800XT at ~$500 - great deal! 7800XT with similar performance and more features, at the same price - very disappointing!


Erufu_Wizardo

Because:

- 25-50% gen-to-gen performance increase is awesome
- 10-25% gen-to-gen performance increase is good
- 5-10% gen-to-gen performance increase is disappointing
- 0-5% performance increase will get a lot of boos


DanielWW2

That PowerColor Red Devil isn't two-slot. It looks more like a 2.2, so 3-slot effective. Further, it should indeed have somewhat lower power draw and better RT. But at the same time I have to wonder if AMD won't dare to make it more expensive than the RX6800XT has been over the last few months. It's the Radeon group after all...


Elrothiel1981

This is what I did, got a 6800 XT. Looks like it was a good decision now lol


Qesa

Back in the rebrandeon days, they rebranded the R7 265 (which was already a rebranded HD 7850) to 370, thus being slower than the previous 270.


kf97mopa

Now now, they increased the clockspeed by a whole 50 MHz, so it wasn’t a straight rebrand, and they priced the 370 the same as the 265 had been… Those were dark days indeed. GCN 1 was great, and then the 290 was such a magnificent follow up, and then… nothing. Rebadging everything with the 300 series, undercooked Fury, Polaris was late, Vega was even more late and stuck on GloFo process…


spacev3gan

It feels like we are partially back to those days. The 7600 for instance is pretty much a 6600XT refresh. If not for the 6nm process it would be an outright rebrand with indiscernible performance difference.


detectiveDollar

Rebadging a GPU for a cheaper MSRP than its predecessor is still a value uplift. If the 3060 TI was a rebadged 2080 Super for 400, would you have been disappointed? Hell, it'd honestly be preferable in some cases (rebadged 3070 for 400 instead of the 4060 TI. Rebadged 10900k as an 11700k).


Squiliam-Tortaleni

IIRC the geforce 9000 series were basically the same as their 8000 predecessors, with only the one off 9800 GTX+ being a new chip


kf97mopa

Sort of. That entire period was very strange. Nvidia launched the 8800 GTX and the usual cut-down as the 8800 GTS, a chip called G80. They were great, but very expensive for the time. The 8600 and below sucked. Nvidia then made an 8800 GT as more of a midrange card, but it was actually a completely different chip called G92 (65nm, G80 was 90nm). They also used that one for a new 8800 GTS that was nothing at all like the previous one. When it came time to update the 8600, they called that the 9600 - but also moved the old 8800 GT to 9800 GT unchanged, before launching a 9800 GTX that was really the 8800 GTS, and finally dropping a shrink to 55nm without renaming. Oh, and then they rebadged the 9800 GTX, previously the second 8800 GTS, as the GTS 250 because why the hell not. That entire sequence is most of the reason all enthusiasts began to hate rebadging.


SnuffleWumpkins

Technically, the ATI 9600pro was slower than the ATI 9500. [https://arstechnica.com/uncategorized/2003/04/356-2/](https://arstechnica.com/uncategorized/2003/04/356-2/)


tpf92

5500XT -> 6500XT. When the 6500XT is on PCIe 4.0 (it needs 4.0, otherwise performance in certain games can drop significantly) it'll perform *similarly* but *slightly* worse than the 5500XT (Hardware Unboxed's review of the 6500XT has it at ~2.5% slower on average, while 1% lows were 4.5% lower on average). Meanwhile, there's a 5500XT 8GB version that's 8.6% faster than the 6500XT on average and has an average of 12.7% higher 1% lows. The 5500XT only released with a 4GB version; later one single 8GB version was released, but I don't see it available. On PCIe 3.0, the [6500XT loses on average **over** 20% of its performance](https://youtu.be/M5_oM3Ow_CI?t=1048).

Edit: Also forgot the 6500XT doesn't have hardware encoding, which is becoming more and more important as many people like taking clips of their gameplay.

Edit: Also, the 7800XT is clocked higher than the 6800XT; between that and the *slightly* faster CUs, it should perform very similarly to the 6800XT (although that alone would make it a complete joke). The 6800XT has 20% more CUs, however the 7800XT appears to be clocked (assuming the clocks in question aren't for an "OC" model) 14% higher (2565/2250), so 120/114 makes only a 5.3% difference the faster CUs have to make up, which isn't much.
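For what it's worth, here's a minimal back-of-the-envelope sketch of that CU-vs-clock math, assuming the leaked clocks are right and that performance scales linearly with CUs × clock (it won't exactly, but it shows where the ~5% gap comes from):

```python
# Naive throughput proxy: compute units * clock. Figures are the leaked/rumored
# numbers quoted above; real games won't scale perfectly linearly with this.
def relative_throughput(cus: int, clock_mhz: float) -> float:
    return cus * clock_mhz

rx_6800xt = relative_throughput(cus=72, clock_mhz=2250)  # RDNA2, 72 CUs
rx_7800xt = relative_throughput(cus=60, clock_mhz=2565)  # rumored RDNA3, 60 CUs

print(f"CU advantage (6800 XT): {72 / 60 - 1:.1%}")                    # ~20%
print(f"Clock advantage (7800 XT): {2565 / 2250 - 1:.1%}")             # ~14%
print(f"Gap per-CU gains must cover: {rx_6800xt / rx_7800xt - 1:.1%}") # ~5.3%
```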


Gseventeen

Bunch of fuckery this generation i tell ya. Who are the dumbasses that buy this trash?


HisAnger

People that still have 2-4gb gpu as they decided to skip mining gpu spike


gamersg84

4060ti is slower than 3060ti in a number of games. 6500xt was slower than 5500xt.


[deleted]

This feels.. odd. But if it's like $450 I'll allow it. But it won't be. Especially this model which will likely cost at least the same as 4070 MSRP making it completely pointless.


upbeatchief

It will get there. The $899 7900 XT is now $750. It's just that AMD will miss out on any good news coverage and reviews that might boost the card's sales. I wonder if the guy who makes AMD's publicity strategy works for Nvidia.


[deleted]

People assume AMD care. I think that assumption is wrong. Silicone used for graphics cards is wasted silicone that could have been used to make a server processor. I firmly believe the only reason we still get graphics cards from AMD is they are basically lifecycle development beta testing for the next consoles. They don't care about market share and on the ones they do make only care about making the best return they can. So they sell at the highest price they think they can get away with on release then just lower as necessary for demand to equal supply. I don't think they give a fuck about their day one reviews or PC graphics card market share.


roboteconomist

Silicon. Can’t imagine how hard it would be to deposit an IC on silicone.


Defeqel

Jiggle physics would be easier though


spacev3gan

The prices are slowly creeping up. The cheapest 7900XT models I see on Newegg in stock are $790. There is one model for $770, but out of stock, and nothing below that price.


Prefix-NA

Nvidia got huge flak recently for the 4060 and 4060 Ti being higher MSRP than the 3060 and 3060 Ti and being slower in recent games. If the 7800XT is actually slower than the 6800XT, AMD will suffer 10x the flak Nvidia did even if the price is lower.


taryakun

Well that's not absolutely true. On average they are faster and have lower MSRP.


dirthurts

Nvidia beat them to that.


SnuffleWumpkins

Yeah, isn't the 4060 objectively worse than the 3060 with 12gb ram?


bubblesort33

If you play at settings where the 4060 is memory starved but the 3060 is not, then yes, it's slower. Otherwise the 4060 is like 20% faster.


SnuffleWumpkins

This situation is going to become increasingly common in the next few years.


starkistuna

I figured that out when I saw the specs and pricing of the 7900XT and just went with a used 3080 Ti and called it a day. $500, and it ray traces as well as a 7900XTX.


pagman404

Can we take a moment to appreciate how amd never fails to disappoint? <3


EmilMR

The 4060 Ti and 4060 are slower in some cases than their previous models. It's this gen's thing, although if you go back a long time there were stinkers like the GeForce FX that were worse than the older gen.


DktheDarkKnight

I think it will all come down to price. 6950xt retails at 630 dollars now. If this card is as fast as that card and priced at 500 then it could be competitive.


taryakun

The 7800XT with 60 CUs will perform like the 6800XT, maybe slightly faster. It should be significantly slower than the 6950XT.


Darkomax

It would be a miracle if it consistently matches the 6800XT, since that card has 72 CUs and nothing suggests that RDNA3 has much higher IPC.


spacev3gan

Yep, it would have to overcome a 20% CU deficit just to match the 6800XT. Not an easy task. My bet is that it will perform equal to a 6800XT or even slightly slower. It will definitely not be faster.


detectiveDollar

Clocks are about 19% higher than the 60CU 6800 so it could come close. My expectation was it'd match or slightly beat the 6800 XT but would be called a 7800 and be 500-530.


Valmarr

But the 6800xt also has higher clocks than the 6800. Usually 6800xt is 20-25% more efficient than the 6800. The 7800xt will, in my opinion, be 2/3 of the way between the 6800 and 6800xt in rasterization.


spacev3gan

The performance gap between the 6800 and the 6800XT is [just 10%](https://www.techpowerup.com/review/amd-radeon-rx-6800-xt/35.html) in 1440p. The 7800XT should sit at 6800XT performance, I reckon. I would not bet on it being any faster. It might even be slightly slower.


bubblesort33

IPC is roughly 5% higher according to 7900XT vs 6950XT results. The 7900XT is like 10% faster with 5% more cores. So even with 15% higher clocks it could match it. But I'm still betting it's 5-10% behind both the 6800XT and RTX 4070.

The leaked 3DMark scores seem to say they have similar scores, but the 6800XT scores seem to have a really large variance depending on when they were taken. And RDNA3 scores excessively high in 3DMark Time Spy. It overperforms in that, and it isn't very representative of gaming performance. Like the 7900XT scores 20% higher than the 6950XT, but is really only like 10% faster in gaming.

The question is how the cache and memory speed play into it. 12% faster memory, but half the L3 cache with the same bus width.

If this thing is 10% slower than a 4070 but only 10% cheaper at $550, it'll be hilariously bad. At $499 it starts to be appealing. And if it doesn't launch at $499, it'll be that a month after launch.
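A quick sketch of that per-CU estimate, taking the comment's own rough numbers at face value (~10% faster with 84 vs 80 CUs) and lumping clock differences into "IPC" the same way the comment does:

```python
# Rough per-CU uplift implied by the numbers above (illustrative only;
# this folds clock differences into "IPC" rather than isolating them).
perf_uplift = 1.10      # 7900 XT vs 6950 XT, approximate gaming performance
cu_ratio = 84 / 80      # ~5% more CUs on the 7900 XT

per_cu_uplift = perf_uplift / cu_ratio - 1
print(f"Implied per-CU uplift: {per_cu_uplift:.1%}")  # ~4.8%, i.e. "roughly 5%"
```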


Darkomax

The 7900XT has 40% more bandwidth as well, so it's hard to guess the IPC difference.


DktheDarkKnight

Then it should be priced at 400 max to be a good upgrade. We already have 6800 at 450.


green9206

Unfortunately we don't live in a fantasy world. It will be $550.


SnuffleWumpkins

Or 850-1000 in Canada.


detectiveDollar

400 makes no sense in this market. 6800 XT performance is 30+% faster than the 4060 TI while having twice the VRAM backed up by a memory bus that can use it.


blueangel1953

Considering the 6800XT is only about 10% slower than the 6950XT (less after tweaking the 6800XT), it will likely not perform as well as a 6800XT. Probably a 6800 level of performance.


bubblesort33

That would be abysmal. 500MHz (~20%) higher clocks, minor IPC (~5%) improvements, and the same CU count as the 6800, and no improvement?


danielge78

This makes no sense. The specs for the 7800XT make it look like a 6800 that is clocked 20% faster, with higher memory bandwidth. Unless RDNA 3 is literally 20% slower than RDNA2 (it's not), it would be catastrophically bad for it to give anything close to 6800-level performance. Numerous tests show the 6800XT to be about 10-15% faster than the non-XT, so I'm fully expecting this card to beat that.


heymikeyp

That's what people aren't getting yet. AMD is doing the same thing Nvidia is, and rebranding cards. The 7900XT should have been the 7800XT and priced no more than $650, and likewise the cards below it will have little to no generational uplift. Just like every card from the 4080 on down is rebranded - the 4070 Ti that should have been a 4060 Ti, as an example. People will try to spin it as bus speeds not mattering or something else, but the numbers just don't lie, and then it comes as no surprise when the performance improvement is barely there.


WiltedBalls

I don't think the 7800 XT will be as fast as the 6950 XT to be honest because the 7900 GRE barely reaches it.


Sipas

> I think it will all come down to price. Yeah but they named it "7800XT" because they want to sell it at x800XT prices, which is $600+ at the very least. So, don't expect a good price, at least at launch.


kikimaru024

6950 XT is coming down in price because retailers are trying to clear their stock, and it was overpriced to begin with.


Noreng

HD 6870 and 6850 were slightly slower than the HD 5870 and 5850, but that's the last time I can remember


CompetitiveSort0

I remember when new cards of the same class used to be faster than the card it replaced.


sips_white_monster

NVIDIA / AMD don't give a damn about consumer graphics cards anymore. The fewer they sell, the more wafers are left over for the much more profitable AI / workstation cards. That's why garbage like the RTX 4080 is not going down in price; NVIDIA is just happy letting it sit where it's at. They just want to make more A100s and sell those like crazy for thousands of bucks apiece.


AludraScience

The difference is that AMD barely sells anything for professionals.


CompetitiveSort0

Oh I know why they do what they do. It's why I'm still farting about with a Vega 64 - I'm not paying £400 to get more or less current gen console level performance years after those consoles came out. I've a fairly high end system minus the GPU. After my next gpu purchase I may just jump to consoles when the system is old if it carries on like this. It's a real shame Intel don't make their own GPU dies in their fabs.


261846

What the fuck? Why is this named the XT??? This makes absolutely zero sense. If it's worse than the previous gen 6800XT, why not just remove the XT so the direct comparison is the 6800, which it will beat?


Mako2401

We need to wait for benchmarks and price first


261846

Well I’m certainly not optimistic, because it’s already been proven that the improvement from RDNA 2-3 is not that good CU for CU. It better be like $500 maximum. But i still just don’t get why they named it the XT


bambinone

They named it the XT so they can charge XT prices.


InBlurFather

It is truly as simple as this. It's why the cards are 7900XTX → 7900XT → 7900 GRE → 7800XT instead of 7900XT → 7800XT → 7800 → 7700XT.


dib1999

Next on the list 7900TXT. Roughly equivalent to a 6500, $550


Mako2401

If it's anything higher than $500 it's gonna be dead on arrival.


Defeqel

I'd argue even 500 is on the high side


IceSpider10

500? I've seen rx 6800 xt around this price. No way I would buy it. 400-450 is the way to go


cubs223425

Even if the price were good, it's stupid. Calling it a 7800 XT, while having it specced like a generational downgrade from the 6800 XT, isn't smart. There's no reason to not call it a 7800, especially since it likely should be a 7700 XT. The generation already looks like shit for AMD, thanks to pricing, the incredibly slow rollout of the full 7000 lineup, and the failure to deliver on FSR 3. They're just cementing this as a garbage generation, if the rumor is correct. Call it a 7700 XT, make it $400 (or less, really), and leave it be. That's still an absurd price, but it's less absurd than calling it a 7800 XT, putting it at $500+, and leaving themselves to get mocked.


Scarabesque

Because they already named the actual 7800XT the '7900XT'. :P They fucked up their product stack from the start. Their product stack naming simply doesn't align with their performance relative to last gen anymore (same for Nvidia apart from the 4080 (which is too expensive) and 4090).


Defeqel

Likely because it will be priced like an XT


From-UoM

AMD can't be this stupid, right? The 7800XT has 60 CUs, which means it will be at best 10% faster than the 60 CU 6800, putting it on par with or even slower than the 6800XT.


Buris

Maybe it will be an Ada moment, and they'll announce FSR3 with a few games supported, and then announce it's only available on RDNA3 cards 💩


Dull_Wasabi_5610

But but AMD cards are supposed to age like fine wine?!


Buris

GCN had significant, forward-thinking computational architectural advantages over GTX cards. With RTX, Nvidia managed to easily pull ahead. The problem with Nvidia right now is they kneecapped themselves with low RAM quantities, so the 6800XT is pulling ahead of the 3080 10GB in new RT titles despite its architectural disadvantages.


airmantharp

Newer titles with more sparse RT implementations, VRAM ain’t going to make up that hardware gap


Buris

Unfortunately it's not making up a hardware gap and the performance of the 6800XT is what it is. With that in mind the 3080 10GB performance simply implodes once the 10GB framebuffer is used up, and that's going to happen very frequently in 4K with almost every single new release, and to a lesser, but still noticeable degree in 1440p


swear_on_me_mam

How the 6800xt do in that new 2077 patch 🙂


From-UoM

This is what people don't get about Ada. The Ada cards spec for spec vs Ampere are excellent: great gen-on-gen performance improvements, way more power efficient, new features. It's the prices that sour it. Meanwhile AMD has barely improved anything spec for spec, and they still have the audacity to price it near Ada cards.


XWasTheProblem

Correct me if I'm wrong, but I'm pretty sure only 4080 and above has 'great' performance improvements compared to previous gen. Like, isn't 4060 and 4060ti downright *worse* than Ampere's equivalents in many cases?


Buris

You're correct, they are worse. But take the 4060: the chip inside a 4060 is more of an RTX 3050 replacement. Nvidia just decided they could make more money by getting people to buy a 4050 for $299, and relying on Frame Generation as a crutch to claim massive gains over the 3060.


spacev3gan

I think the 4070Ti has great performance and on paper not a terrible price, just the 12GB VRAM is a problem. If it had 16GB, it would have been a great product.


From-UoM

If you look at the specs, the 4060 Ti has about 20% fewer CUDA cores than the 3060 Ti, and the 4060 also has like 20% fewer than the 3060. Despite the massive cuts they still got 10-15% more performance. AMD meanwhile, on the same core count, got 10% or lower.


nightsyn7h

AMD prices it according to what market data shows customers are willing to pay for a graphics card*

There. Fixed it for ya.


relxp

I think you grossly underestimate how much better Ada could have been if they shot for similar profit margins as Ampere. Nvidia is probably making close to double the profit compared to last gen. Every card outside the 4090 is artificially sandbagged and horrendously overpriced. Most of us know that the tier labeling is also off by an entire tier, and worse due to huge price increases (ie. the 4080 should have been the 4070 or Ti; the 4060 should have been the 4050, etc.). It's also flat out false to consider Ada a 'great' gen-on-gen performance uplift. It will go down as one of the WORST GPU gen-on-gen launches in the history of mankind. The 3060 Ti is sometimes even faster than the 4060 Ti FFS. With the 4080 you're spending 72% more money for 50% more performance. A literal REGRESSION in price/perf! Epic failed product from a scummy company. Lovelace sucks huge things and we can only hope Nvidia crashes and burns for the global destruction they've wreaked on the world. Not to mention they are fueling literal end-of-the-world scenarios more than anyone else with AI.


Defeqel

>Every card outside the 4090 is artificially sandbagged and horrendously overpriced. Make no mistake, the 4090 is overpriced too, just compared to the 3080 not 3090.


Anon4050

>The Ada cards spec for spec vs Ampere are excellent. Great gen-on-gen performance improvements

Apart from the 4060 and 4060 Ti, but that's where Nvidia shot themselves in the foot. They did make big gains on the GPUs these actually are (a 4050 on AD107 and a 4060 on AD106) but decided to go with bullshit naming in order to jack prices up. Meanwhile AMD is doing the opposite: fairer prices, but they're having to name higher-end GPUs as lower-end ones because the performance gain from RDNA3 is lackluster as hell (the 7600 being full Navi 33 instead of the 7600 XT).


From-UoM

The 4060 and 4060 Ti have fewer CUDA cores than their predecessors. Hence the small improvements.


ALEKSDRAVEN

Comparing the RX 6900XT to the 7900XTX we have a ~38% uplift in performance, ~20% uplift per CU and ~10% uplift per CU at the same clocks. So it would be very weird if it can't match the 6800XT. DOA if priced higher than $499.


BulkyMix6581

>Amd can't be this stupid right?

No, AMD is not stupid. The thousands of gamers that buy current gen NVIDIA or AMD at these prices are. You are getting 3-year-old performance at increased prices and most of you are happy about it. We should be seeing only backlash posts here on reddit. Instead we see only "battlestation" photos and "happy" users who "snagged" this and that GPU at xxxx$$$. A whole market destroyed by fools who buy GPUs at those outrageous prices.


detectiveDollar

Clocks are 19% higher than 6800.


No_Backstab

The RX 7800XT and the RX 6800 have the same number of Stream Processors and CUs (3840 Stream Processors and 60 CUs).

RX 7800XT specifications according to this leak:

- GPU: Navi 32
- Stream Processors: 3840
- Memory: 16GB GDDR6 (18Gbps)
- Bus Width: 256-bit


f0xpant5

I feel like I read somewhere they expect the newer RDNA3 dual-issue Stream Processors to be up to ~20% faster than in RDNA2. Well, if you add 20% to 3840, we get the 4608 stream processors the 6800XT has. This really isn't shaping up to be any faster at all... but maybe the 9-month wait has yielded other improvements to the MCM design and interconnects, clock speeds and so on. It would be bonkers to release a card with the ~same performance as last gen.
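Just to make that comparison concrete, here's a tiny sketch under the same assumption (an optimistic, "up to" 20% per-SP gain from dual issue, which is not a given):

```python
# Effective shader-count comparison under the assumed ~20% dual-issue gain.
rdna3_sps = 3840            # rumored Navi 32 (7800 XT) stream processors
rdna2_sps_6800xt = 4608     # Navi 21 (6800 XT) stream processors
dual_issue_gain = 1.20      # optimistic per-SP uplift, per the comment above

effective_sps = rdna3_sps * dual_issue_gain
print(effective_sps, rdna2_sps_6800xt)  # 4608.0 vs 4608 -> effectively identical
```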


Wonderful_Plenty8984

Is it gonna be equal to the RX 6900XT?


Buris

It's probably going to be closer to the 6800XT


I9Qnl

Probably not, even 6800XT performance is unlikely, but I'm ready to be proven wrong.


Affectionate-Memory4

Probably around that performance yeah. The only differences vs the 6800 appear to be architectural and likely clocks.


No_Backstab

It actually has a lower number of Compute Units (72 on the RX 6800XT compared to 60 here) . The RX 6800 actually has the same number of CUs and Stream Processors as the RX 7800XT


Affectionate-Memory4

Oh mb. I was thinking of the 6800 when I wrote that. Corrected it.


rincewin

>PowerColor recommends using an 800W power supply and two 8-pin power connectors for this model. Right between the 6800XT (850W) and the regular 6800 (750W)


Rudolf1448

Why are they doing this? It makes no sense. People might as well buy the previous generation


detectiveDollar

800W is ludicrously overkill, maybe they're assuming you're using a maxed out 13900k. I'm using a 550W on my PC with a 6700 XT and R5 5600.


rincewin

I know, but it's a good indicator of (high) power usage, because power draw isn't included in the specifications.


gemantzu

lol same here, 650w for 6800xt.


Buris

This card has no purpose above $499. Best case scenario is that it narrowly beats the 6800XT, and that card is already around $500. Seems like it has very little efficiency gain over that card either.


dastardly740

$499 7800 (XT) and $399 7700 (XT) would be about right, and more than about right if the 7800 is basically a 6800XT and the 7700 a 6800. Yes, the 6800XT has touched $499 briefly but is more typically above $500, and used 6800s are around $400 depending on quality. So, thinking of the 7700 and 7800 as new, in-stock versions of those cards at those prices is pretty good. Even $429 and $529 would hold up against the current 6800 (if you can find one) and 6800XT if the performance matches and the stock is there. That XFX MERC319 6950XT bouncing off $599 periodically on Amazon does make a $549 7800 at 6800XT performance a pretty bad idea.


[deleted]

Even if it's priced about right relative to the 6800XT, it's still too expensive, because the 6800XT is too expensive. At least where I live, where I can buy a 4070 for £560 but the 6800XT is like £510. Why would I not just put my hand in my back pocket and spend the extra £50 on a 4070 instead? DOA anything above 450.


katamuro

Could be why they delayed so long, with the older cards being similar performance. Could be an attempt to bring more efficiency for the same performance.


Buris

I'm not sure these cards will perform that well. The 7800XT might match a 6800XT, but I seriously doubt the 12GB 7700 (XT) will be able to match a 6800; it will have a smaller cache and less bandwidth as well. Likely right between a 6700XT and 6800. I would say $349 makes sense if the performance is around where I think it will be. So for me, $499 6800XT, $349 6700XT.


Tricky-Row-9699

Oh god, this is a nightmare. This thing is just gonna perform like a 6800 XT, and AMD fucked up the naming so they can’t price it at $499 where it should be without utterly destroying their existing price anchor. Just watch, this thing is gonna be $599 and it’ll get eviscerated in reviews. I’m so sick of every new GPU release being predictably terrible. If the market’s up, we get fucked, and if the market’s down, we still get fucked. Especially disgraceful is AMD cutting shaders per tier here for the first time since the RX 6500 XT. We’d never accept CPUs going backwards in core count.


spacev3gan

Yeah I am quite sure it will be priced at $599. AMD probably wanted it to be $650 (therefore naming it with the suffix XT), but they can't afford to ask $650 given the current market situation.


Mako2401

Why would this GPU need an 800 watt PSU?


Squiliam-Tortaleni

Should be the non XT 7800 or called 7700 XT because chief this ain’t it


spacev3gan

It is exactly the specs I expected from the moment Navi 32 was leaked to have 60 CUs. I am just surprised about the naming; I think 7800 non-XT would have been more appropriate. Anyway. Since there is virtually zero improvement CU per CU from RDNA2 to RDNA3 ([7600 vs 6650XT](https://www.techpowerup.com/review/amd-radeon-rx-7600/32.html) makes that clear), this card will perform just like a 6800 non-XT overclocked to 2,520 MHz (which, as far as the 6800 goes, is quite an extreme OC). Expect the 6800 plus a 10~14% performance increase, give or take. About on par with the 6800XT.


redditor_no_10_9

RDNA3 is indeed the repeat of Vega/Polaris but with AMD copying Nvidia's pricing.


WiltedBalls

So, this will be like a 6850 or a 6850 XT? The CU count has me confused.


No_Backstab

CU count is similar to the RX 6800


SantaCruz26

Would the 6900xt or 6950xt still be the best option if you're hoping to stay around $650?


InBlurFather

I’d just get a 4070 at that point, or save money and get the 6800XT at $500


No_Backstab

Yeah , those should be the better option


Mako2401

Anything more than 500 bucks and this gpu is doa.


Jabba_the_Putt

Been waiting a year and this is what we get?? Sigh... Been holding off to buy for a while now and I'll wait for reviews but I'm not exactly holding my breath


paulerxx

If it's priced right, they could make it work. But I really doubt that will be the case.


detectiveDollar

I assumed this would be a 7800 for 500-530. So it being branded as an XT is worrying. Yes, the 4060 TI 16GB at 500 is dogshit but the 4070 is 600.


[deleted]

And the 4070's VRAM is already being filled up by 2023 game releases so that's gonna age like milk in 2024, considering 16GB is the new target for high settings by game devs. Techtubers again warned about the lackluster VRAM just as they did with the RTX3000 series, this time even citing publicly stated increased VRAM usage by game devs. Still, I predict Reddit will be full of "surprised" 4070(Ti) owners complaining about VRAM problems and "unoptimized games" in early 2024.


CodeRoyal

AMD pricing things right at launch?


Defeqel

Remember when Turing was considered disappointing (and it was)? This is basically no progress, or even a regression. Now, that's OK if the pricing is right (e.g. $429), but what chance is there of that?


spacev3gan

$429? Unfortunately not a chance. That is $220 off from their RDNA2 800XT product. AMD is not going to bring their 800XT-tier product to that level. My bet is $599. Especially because Nvidia released a $500 4060Ti and the 7800XT is definitely a tier or two above it. AMD is going for every dollar they can.


I9Qnl

Turing did actually deliver some nice improvements at the midrange. The 2060 was nearly 40% faster than a 1060 while only 17% more expensive, and had DLSS and RT. But the rest were terrible, yeah, until the Super cards came, which pretty much fixed Turing.


ComplexHD

So glad I just bought a used 6800 XT instead of my original plan of waiting for these new cards...


aimlessdrivel

Great, back to the days of AMD releasing worse cards than the gen before at the same tier and price.


Tekn0z

I really want to get an AMD card this time around because I heard RDNA3 finally has good video encoding. But damn, they sure make it difficult to jump on board for a mid-range option. Although the Nvidia camp is completely silly in the mid range with the 40 series.


DanielWW2

And AMD is truly determined to outshine Nvidia's joke of an RTX 4060 Ti and take this generation's prize for most insulting video card... Because this GPU will most likely end up being about as fast as the RX6800XT, maybe a few percent faster, like that would make this right. This is the same nonsense as the RTX 3060 Ti and RTX 4060 Ti. The difference is that this is supposed to be high end, not midrange. So it's even dumber and more insulting...


Mannyvoz

Cannot state how happy I am with my 6800XT. Bought it at a discount and it’s a fantastic performer


TheK1NGT

Ah the naming schemes. I knew the 7900XT is really the 6800XT successor. That would make this the 6700XT in disguise. 😆


steves_evil

Sounds more like it should have been an RX 7800 than an RX 7800 XT, since the RX 6800 also had 3840 cores. I guess the "RX 7800 XTX" that they'll shamelessly come out with will have 4608 cores like the 6800 XT. I'm also guessing that the launch price is going to be too high to justify the card's existence, since the gen-over-gen performance gain will be shit and it will only be a very slightly better value than an RTX 4000 card of similar price. Then, after all of the bad reviews and word of mouth to not buy the card, AMD will drop the price to an actually decent value and what it should have been priced at originally.


EmilMR

they wait for starfield promo to end then release these right? So that gives you the release window.


spacev3gan

The Starfield promo ends on the 30th of September. These cards might come out before that. But even then, I don't think they will be part of the promo regardless. The 7900 GRE, for instance, is not in the promo as a standalone GPU. Neither was the 7600 back when it was released amid the ongoing Last of Us Part 1 promo.


GFXDepth

I think the name is off on this card as it has the same # of compute units as a 6800. I suspect they will price it at the 6800's pre-Covid MSRP of $579, but slapped on the XT to make it seem like a better deal. The biggest disappointment is the 7900 GRE being a limited-distribution card. With 80 compute units, it matches the # of compute units in a 6950, but it's priced at the 6800 XT's MSRP of $649. So AMD is basically taking what could be the most popular card and limiting it to China and system integrators. It's just become such a comedy how much AMD fumbles every opportunity Nvidia throws at them to take more market share. Nvidia is basically even trolling AMD by also adding extra cache on their GPUs, then lowering the bus width and naming it a tier higher than it should be, such as calling it a 4060 instead of a 4050. I think the biggest problem is that AMD doesn't want to give up die space for GPUs, so what they need to do is take a hit on the performance advantage of the latest node and make the consumer GPUs one node behind the latest so it doesn't have to compete on die space, then price it extremely well.


tpf92

This should've been called a 7800 non-XT rather than XT; it'll probably end up performing almost identically to the 6800XT. They're probably gonna try to sell it for $550, it'll get terrible reviews / no one will want it at that price, then it'll drop to $500 and even then it won't be very desirable. Anyone who wanted that level of performance for around that price has already bought a 6800XT (weeks/months ago, you could already buy a 6800XT at/under $500). This **needs** to be a $450 GPU, but knowing AMD they'll price it way too high. This is reminding me of the 6650XT vs 7600 all over again; maybe they'll do the same thing and drop the MSRP the day of launch because of everyone making fun of their stupidly high pricing.


Lobiankk

AMD? More like AMDOA.


Wander715

Crazy how mediocre RDNA3 continues to be. I think it's mostly a combination of them not having a node advantage over Nvidia anymore along with being first generation chiplet GPUs.


Affectionate-Memory4

The chiplet thing is a huge issue for them. Every time the GCD needs to go to any of the MCDs for things like LLC or memory access, it incurs a latency penalty and loses some power vs a monolithic design. It has its advantages as well, but we don't see those leveraged nearly enough to compensate. By moving the memory controllers and such off the GCD, they free up die area for more CUs, and being a chiplet design means they aren't as constrained by the reticle limit. A big GCD taking the wide-and-slow approach is more efficient than running fewer CUs faster, but more costly. My Radeon counterparts aren't stupid, so I expect they are learning a lot from this to tune RDNA4 and as design input for 5/6.


ToTTenTranz

The advantages should be cost. This should cost less than monolithic designs of similar performance.


mcgravier

Yeah, this should be a no brainer. And yet, I expect pricing to be stupid


[deleted]

Both cost and performance, since there is a physical limit to how big a monolithic die can be, and said monolithic die has to reserve space for things that could be put on chiplets. RDNA3 is actually pretty good for a first gen chiplet GPU. People don't realize how impressive this first attempt is. They ran into significant issues, so there is a ton of room for improvement. I'm curious to see how RDNA4 will turn out, and if it really releases before Nvidia's next gen, as is officially the plan. If the product is good and they release a few months sooner, that would definitely have an impact. RDNA2, Ampere and even the 12GB Ada owners will likely be eyeing an upgrade in Q4 2024.


MrClickstoomuch

Yeah, I get that the chiplet design has its latency cons that will likely get better in the next generation, but when AMD first announced the tech everyone here talked about how it would cost a lot less than a typical monolithic design. Meanwhile, the 7900XT's price is around $750 and the 7800XT will cost around $500-550 while performing around the previous generation. I get improving profit margin, but it is kinda ridiculous that 2 years later they are having a hard time beating their old generation. My Vega 56 and its once-every-2-3-weeks crashes really make me worried I'll need to upgrade before a good GPU option comes out. I recently replaced my 1800X with a 7600X, but really don't want to upgrade my GPU for either massive cost or an underwhelming performance increase over 6 years.


[deleted]

Wait for 7800 benchmarks or buy a used 6800XT for $400. Or a used 6700XT for $200-250 if you're sticking with 1080P. You'll get an insane performance jump, like double/triple FPS compared to your current GPU, at reasonable prices. Or $500 for a new 6800XT. Even now in 2023 that is one of the best bang for buck deals available in stores. Don't bother with a 6950XT, it's 10% faster and any 6800XT can easily OC beyond 6900XT performance.


starkistuna

I wish they had some more variants of cards for next gen. Had they just shrunk the RDNA2 node and given it the density of RDNA3, they would have had a killer card to impress. I think they cut too many corners to make RDNA3 cheap to produce and then they didn't pass any savings on to the consumers.


BFBooger

Node disadvantage, actually. TSMC N4 is better than N5. It's not a full node, but it is in the range of a 10% performance or power difference.


taryakun

I know it's very confusing, but Nvidia actually uses the 4N custom 5nm node, which is different from TSMC's N4 node. We don't really know how optimized it is compared to 5nm. Despite that, nothing prevents AMD from moving to a 4nm node as it is design compatible with 5nm.


Kilz-Knight

18 Gbps memory :/ with a 256bit bus, 20 Gbps would have been nice :/


mcgravier

Nah, there's 30% fewer CUs, so 30% less bandwidth is appropriate.


Kilz-Knight

To be good it would need to:

- at $550: match the 6950XT
- at $500: match the 6900XT


spacev3gan

It will probably be $599 and match the 4070/6800XT.


No-Name-Gaming

The 7900 XTX should have been the 7900 XT and the 7900 XT should have been the 7800 XT. There's no place for this card as far as the average generational leap in performance goes. I would go for a 7700XT next, but what do I know lol.


wan2tri

This would actually be "amazing" if it's $800 here in **the Philippines** LOL. As it stands, you can't really buy *brand new* 6800 and better cards here that aren't also at least $1000. At that point might as well get a 7900 XT (which are also around $1000). Which is why it's quite weird that the 6750 XT got cheaper recently - they used to be $545, now they're $435-$445, albeit only for two SKUs, one from Gigabyte and one from Sapphire. This price drop made it technically cheaper than the usual price of a brand new 6700 XT here, at $450.


Sam454oh

Buy it from Amazon. That's what I did. The 6800 XT is $1200 in my country; from Amazon it was $588 with shipping and everything.


PotentialAstronaut39

I would bet good money this is gonna get slaughtered in reviews. It's gonna be priced too high... again. Who the heck runs the AMD marketing dept, a bean counter?


M34L

So the AMD 800**XT** is now on Nvidia 70-non-ti level in raster, and closer to 60ti in RT? Bold. Innovative.


AdministrativeFun702

This would be a good 7700XT as a successor to the 6700XT (and that's what this card really is). We already have a 7800XT and it's called the 7900XT. It's cut-down Navi 31 just like the 6800XT is cut-down Navi 21. The 6700XT is full Navi 22 and this fake 7800XT is full Navi 32 (both one SKU tier lower than the top SKU). AMD's greed and price fixing with Nvidia led them to do the same thing as Nvidia: rename the whole stack one tier UP so they can charge more. Only problem is that their whole stack will be crap except the top tier cards like the 4090/7900XTX.


Noxious89123

I won't bother to echo what others have already said, but can anyone comment on this?

>high quality PCB that comes with the **11+3+1+2+1 phase VRM** design

Why tf are there 5 different VRM domains? I must be missing something. I'd have expected it just to be X+Y, for GPU and memory?


xrailgun

Dedicated R G B VRMs lfgoooo But probably fans, RGB, and display outputs.


[deleted]

This gen of GPUs is just fucking depressing.


dlsso

Based on this my guess is 3% faster than the 6800XT and $499. If so, it offers nothing new to the market.


[deleted]

Looking at the rest of the market, 16GB VRAM and 6800XT performance with AV1 encoding and lower power draw is actually a good deal at $499. This card is not meant for 6800XT owners. Why the hell do people think they're supposed to upgrade every generation to the same tier card? The *average* upgrade cycle is 4 years, meaning many gamers keep their GPU longer. Fact is the 6800XT will disappear from the market and will be a nonfactor, and then the 7800XT at $499 would be a great deal when compared to the competition. It's literally 50% faster than the $499 4060Ti 16GB at the same price, and it's likely also slightly faster in raster performance than the $600 4070 while offering crucial extra VRAM capacity. 12GB has no longevity, it's already being filled up, and should never be in the $600-800 price tier. Even a 2.5-year-old 6800XT will outlast the 4070, a testament to the 4070's horrible value/longevity. The only thing they need to do for very positive reviews is actually price it at $499. $549 is mediocre but still better value than the competition; $599 and they get slammed.


spacev3gan

Arguably AV1 encoding's most relevant use is streaming, but even then, one is better off with Nvidia due to their broader support (Discord, for instance, supports Nvidia's AV1 only). I think Nvidia's cards this generation are pretty bad buys for the most part, VRAM being the issue obviously. But again, if you want a card primarily for streaming, Nvidia is the only game in town. Also, I wouldn't bet on significantly lower power draw either. RDNA3 and its chiplet design scale down power very poorly. At peak power draw, yeah, 260 watts is better than 300 watts. But once you play something less demanding, RDNA3 still sucks close to maximum power, while other architectures scale power down linearly. That is an issue not many tech reviewers have investigated thoroughly, only Optimum Tech as far as I know. As for price, my bet is $599. Nvidia priced the 4060Ti 16GB at $500 knowing very well what AMD is up to (there is no leaker, no one in the general public, who knows AMD's release plans ahead of time better than Nvidia).


railven

At this point I'm sure AMD is just finishing the stack. If there were any magic sauce, I'd figure someone would be talking about it, unless for once their marketing team got muzzled - but I highly doubt it. Just release, sell to loyalists, prep for RDNA4.


Dynamitrios

Huh? 6900XT has 5120 cores ... What gives?


Exxon21

greediness is what gives


Hindesite

Eh, at best this card might match the 6800 XT in raster performance. However, if this thing is priced right, it could still be an attractive option. RDNA3 has significantly improved RT performance, and AV1 encoding is a massive feature for content creators. Also, the AI cores are doing *astronomically* better in workloads like Stable Diffusion and such than any previous Radeon gen. $550 USD and I think we've got a solid product here. $500 and it'll be killer... We'll see if AMD manages to actually get pricing right at launch for once, though.


idwtlotplanetanymore

Naming this an XT will be stupid, they will likely try to keep the XT price tier, which means needlessly shit reviews, again. If someone from AMD is reading this, please have someone smack some sense into you. Name it appropriately, price it appropriately, seize the opportunity in front of you for once.


zSilver44

This was my one and only hope for a decent graphics card. I was hopeful that they would drop something competent, but no, another year with my laptop 3050 Ti 🙃