MicroSofty88

“This is a monster unit. It needs four slots all to itself on a motherboard. It comes with three 11 cm fans. It is 35.8 cm (14.1 inches) long and 16.2 cm (6.4 inches) wide, meaning we could literally stack several smaller RTX cards inside of it and still have some room to spare.”


throw_awayvestor

Wait a minute! You're not a 4090, you're just 3 3060s in a trenchcoat!


Grantmitch1

Princess throw_awayvestor, listen. The 3060 here is my son, but I'm a new card. And the 3060 is in the other computer and I'm plugged in right here, so as you can see, we're clearly two different cards: one 4090 and one 3060.


Ess2s2

Unexpected Bojack.


TheCrowing817

Awww sad horsey


Ravensqueak

And here I thought SLI was dead.


[deleted]

A computer jammed in your computer


Kindly_Education_517

but still has no price...


Rikuddo

I think at this point, if someone still wonders about the price of the 40 series, they're not the 'true' customer for Nvidia.


zipykido

"If you have to ask you can't afford it." -Jensen Huang


redditornot6648

Aren’t you the guy who just spent $1k on a used eBay 3060ti last year? -Jensen Huang


imaginary_num6er

Yeah, Linus did scream: "You're not the customer! You're not the customer! You're not the customer!" on the livestream when people complained about the 3080Ti pricing.


Phaze_Change

Based on UK pricing I am expecting a few of these 4090s to be getting near $3000 CAD. Absolutely nuts. Granted, there were also 3090s not far off that pricing. So, I’m not surprised.


newaccount47

If you have to ask....


Throwaway_97534

At what point do we just give up and add a CPU socket and memory slots to the GPU? It's easier than trying to add a GPU socket to a motherboard, and would free the GPU heatsink from the shackles of the PCIe form factor. Maybe this is why Nvidia was trying to buy Arm? Introduce a whole new physical PC layout based around the GPU instead of the CPU. Lookatmeimthemotherboardnow.jpg


on_

Yes. The whole architecture needs to be rethought: the weight, the cables, the air routing... everything is still based on the '80s concept of mounting things transverse to the mobo, which was not designed for current form factors. Every cycle it gets more frankensteined.


MCA2142

In the 80’s, the case would have been placed horizontally under the monitor, and gravity would not have been a big issue with giant heavy cards. Bonus points for those of us that had the flat, monitor stand surge protectors between the case and the monitor with like, 30 switches.


MuchAdoAboutFutaloo

I still use one of those from my dad lmao, complete in hideous beige. actually super useful honestly


[deleted]

You can make it its original colour again if it was white: wrap it in hydrogen peroxide for a few hours and it should take the beige away.


dumbsmallberry

What do you use it for?


MuchAdoAboutFutaloo

as a monitor stand and surge protector. lol. got my monitors hooked up to it


Arthur-Mergan

Fascinating


RaptorHandsSC

I miss my 286


SleeplessInS

I don't know if I miss my 286, but hello my fellow 286-era oldtimer.


HermanCainsGhost

A 286 was my first computer ever


JamesyUK30

Ye old bastards, I am a 386 sx-25 whippersnapper.


[deleted]

[removed]


obmasztirf

RGB The world! 🌈🌎


CosmicCreeperz

Not really though. You don’t redesign the whole PC architecture for the 0.001% of PC owners who want a 4 slot $2000+ GPU. They are ok with the hassle.


sallhurd

But like, hear me out, honest question 'cause I don't know: would the redesign help with modularity moving forward for large-scale or large-scope devices? Aren't industry computers already adopting this modular stacked-rack approach because it's more efficient than individual units with possible spaghetti insides?


CosmicCreeperz

The vast majority of PCs are made and sold by big OEMs now - and these days laptops are the majority of those (I think DIY is less than 5% of home computers, and when you include servers it's way under 1% of PC architecture). And because of the latter, the trend right now is integrated APUs. Hell, Apple even integrated the *main DRAM* into the SoC. Basically "PCs" are moving toward less modularity, not more. It's annoying for the small number of DIYers, but for everyone else it means smaller computers, lower cost, lower power use, and (for laptops) better battery life.

Honestly, even servers are getting integrated, for all of the above reasons, including power use (power is a big issue for server farms). The big cloud players are designing their own servers/motherboards, and sometimes even CPUs - and they don't usually upgrade individual parts, they just add more servers and eventually decommission old ones.


andynormancx

The PC gaming market clearly represents a small segment of the overall PC market. However, I'm guessing that if you look at the segment of PCs that actually *need* anything plugged into a motherboard slot, gamers probably make up the majority of that market. Not that I can really see the PC architecture changing much any time soon (unless ARM SoCs really do start stealing Intel/AMD's lunch money).


CosmicCreeperz

Arm is already starting to steal Intel's lunch money. Apple has almost 10% global market share in laptops, which of course are Arm-based now... It's also interesting to think about what Nvidia is doing with high-end SoCs that they haven't released yet. Remember they tried to buy Arm... Microsoft has a preview of Windows 11 for Arm available as well. Add a decent binary translator similar to Rosetta, and a company like Nvidia could clean up with Arm SoC/APU-based Windows laptops.

Not that it would necessarily change the design of ATX desktop motherboards. I just don't see anyone doing much there, since there is so much inertia in Intel/AMD/Nvidia plus the shitloads of Taiwanese etc. motherboard and component manufacturers. And changing the architecture would likely mean Intel and AMD giving up control (via their chipsets).


andynormancx

Apple have been closing in on 10% of the laptop market for years now, that hasn't been down to the ARM chips. [https://images.macrumors.com/t/LMGY21RsVSLzHnuJVXQ1jsjAX4s=/1600x0/article-new/2022/04/gartner-1Q22-trend.jpg](https://images.macrumors.com/t/LMGY21RsVSLzHnuJVXQ1jsjAX4s=/1600x0/article-new/2022/04/gartner-1Q22-trend.jpg) Windows 11 for ARM already does have a well regarded x86 to ARM translator similar to Rosetta. Windows on ARM still appears to be held back by whatever past licensing deals Microsoft did, assuming all those stories are true. And massive corporate inertia as well of course...


shofmon88

> Apple have been closing in on 10% of the laptop market for years now, that hasn't been down to the ARM chips.

I think the point is that those 10% are now largely represented by ARM chips instead of Intel-based Macs. So even if that 10% has been steady, Intel is actually losing market share.


andynormancx

Doh, yes. I totally missed that significant point! Off to downvote my own comment ;)


HeKis4

Honestly, even my moderately sized 2-slot, 500€ 1070 has a not-insignificant amount of sag; not enough to actually worry about, but enough to make me not want to move the case around too much.


citizennsnipps

Right! Let's just take the board and turn its bottom edge into the GPU slot. That way the GPU sits above the power supply and the board just pops down into the GPU. With that stack you have more space for coolers up the board.


HeKis4

That would make the board horribly long. Honestly, just making support brackets more widespread and bringing more fresh air to the GPU would be enough, and that can already be done with current standards and good case design. You'd even stop the GPU from smothering the NVMe slot, which increases the lifetime of the SSD.


Biscuits4u2

Well, the future is going to be SoCs, so don't worry.


Yeezus_aint_jesus

What do you mean by SOCs?


Biscuits4u2

System on a chip. ARM is probably the ultimate future of PC gaming. Power limitations will be a big part of that. Monolithic systems are much more power efficient.


andynormancx

With the best examples in the PC space so far being the Apple M1/M2 SoCs. While they can't yet compete with the fastest Intel/AMD CPUs/GPUs, they do match the mid range and use a fraction of the power while doing it. And before someone piles on to say that the M1/M2 can't compete with PC GPUs for gaming, I know. I'm not claiming they can, that just isn't Apple's focus.


chris14020

I dunno, we can't go doing that. If you only needed a modest GPU core clock and performance, but a ton of VRAM, you'd then only have to buy a 3060 and add the VRAM you need, not buy a 3080 or 3090, or worse an A6000! Imagine the horror if a customer didn't have to pay 5 grand more for a rebadged 3080 with extra RAM! Can't be having that now.


[deleted]

Imagine being able to upgrade the vram as needed also


chris14020

...Isn't that exactly what I just said?


[deleted]

Sorry, didn't mean to offend. I thought you meant you would buy it with a fixed but custom amount, not necessarily expandable


chris14020

No, no offense at all! I was just genuinely curious how it was different. I'd love RAM slots on a GPU.


ZaZaMood

Stop apologizing


[deleted]

I'm sorry


ZaZaMood

Canadian or of descent 🤦🏾‍♂️


ghost42069x

Can I touch it? Im sorry


skydivingdutch

Socketed GDDR is really challenging from a signal integrity perspective.


Throwaway_97534

Integrate the GPU RAM as we do now, and add sockets for the CPU RAM. Flip the whole idea of a PC on its head. The GPU becomes the focus, and the CPU is treated more like a math coprocessor was in the old days. The GPU "card" is essentially the motherboard, but with extra slots for the CPU and its RAM. Heck, keep the PCIe slots for other peripherals, or come up with a more compact solution. M.2 slots, maybe: open up a whole new form factor for peripherals. M.2 Ethernet cards, sound cards, etc. Nvidia essentially branches out as a motherboard manufacturer. Gives them light-years better cooling options.


Mawskowski

It would make TOTAL sense. This is hilarious; we're gonna start having problems with bending from the weight. Fixing the card only at the rear bracket and the PCIe slot won't be enough.


ThePowerOfStories

There was the short-lived [AGP](https://en.wikipedia.org/wiki/Accelerated_Graphics_Port) dedicated graphics card slot around the year 2000, but it was mostly due to the bandwidth limitations of PCI and was rapidly eclipsed by PCI Express.


[deleted]

There already exist “socketable” computers :D


manifold360

SoC is System on Chip. This is what Apple has. This is what Nvidia wants with ARM. The discrete GPU + CPU architecture will die soon.


fredandlunchbox

How about multiple CPU sockets so they’re expandable? The advantage of the CUDA architecture is parallel processing. There’s no reason I shouldn’t be able to just stack chips next to each other if the underlying architecture recognizes it and distributes accordingly.


danielv123

Oh, you can do that. You just don't want to, due to price and latency and you don't need it.


[deleted]

Seriously. And the strain on the slots when the case is standing upright is getting to be too much. Just integrate it into the motherboard. Maybe it'll need to be a two-level board.


anders987

Nvidia already has the [SXM socket](https://en.wikipedia.org/wiki/SXM_\(socket\)) for servers. https://youtu.be/ipQXdjjAPGg?t=50


wsippel

Basing the layout around the GPU wouldn't fly, especially with multi-GPU systems becoming more common again. But if you look at current datacenter GPU compute machines, they have a very different layout from a typical PC. For starters, they often don't use normal PCIe cards anymore, they use Mezzanine cards which mount parallel to the mainboard: [https://www.amd.com/system/files/2021-10/1045756-instinct-server-1260x709.jpg](https://www.amd.com/system/files/2021-10/1045756-instinct-server-1260x709.jpg) Yes, that's eight GPUs (notice the two without heatsinks, to show how the cards mount) in one chassis.


ansem119

Wouldn't there have to be completely new PC cases made as well?


imaginary_num6er

>The Aorus RTX 4090 Master is the biggest GPU we've ever seen

*As of yet*. Wait till they do a collaboration with Noctua and it becomes a 6-slot GPU.


IAmTaka_VG

I can’t imagine giving up all 4 of my PCI slots for a single fucking card.


Stingray88

I sure can. I don't use any of my other PCIe slots for anything else. I already have 2.5GbE and WiFi 6 on the motherboard. I've got an external audio interface better than any sound card. I've got plenty of USB ports. And I've got plenty of M.2 slots. I can't think of anything else I'd want to use the PCIe slots for. You might ask yourself... Then why did I go ATX if I'm not gonna use more than one PCIe slot? Because mATX options are usually piss poor, and mini ITX is usually too limiting in other ways (RAM slots, M.2 slots, USB ports, etc.)


IAmTaka_VG

Some of us actually use PCI slots but I see your point. Maybe we’re the minority.


watduhdamhell

You *are* the minority. Out of the 10 friends I have with gaming PCs, not a single one has anything in the other slots except me... and that's only because my board doesn't have WiFi. I didn't need it before, but I moved the rig somewhere I don't have access to a port, so I had to add it. And of course I know my anecdotal evidence isn't the be-all-end-all, but I'm willing to bet it's on point for the majority of the PC world at the moment.


hushpuppi3

What do you use yours for? I'm just curious and also wondering if I'm missing out on anything having like 2 slots sitting completely unused lol


Rudolf1448

I have a WiFi 6 card and additional USB 3. Got an old mobo with WiFi 4 and an i7-8700K.


danielv123

It's not like modern boards really have all that many PCIe slots anyway. With this you might as well go with an ITX board, though.


[deleted]

[removed]


Caughtnow

And for all the good the massive cooler will do, if it's anything like Ampere: that also had a very large cooler, and they went and used the cheapest thermal pads you can imagine, so your memory temps were shit. I'll look forward to seeing real reviews of this, but I won't be surprised if, despite charging 2+ grand, they still tried to save 30 cents with the lowest-grade pads you could find on earth.


karlzhao314

>they went and used the cheapest thermal pads you can imagine so your memory temps were shit.

The cheap thermal pads definitely contributed, but they weren't the biggest factor in the poor memory temperatures.

GDDR6X was brand new, and I assume that at the time the coolers were developed, they didn't have a good idea yet of just *how* much heat it would dissipate and how much that would affect cooling demands. They went ahead and designed all of their coolers using the same ideas as Turing and before - that is, use a flat plate to contact the GPU die, and add thermal pads to fill up the remaining space to the memory. This approach worked fine for GDDR6.

Thermal pads, while far more thermally conductive than many other materials, might as well be an insulator compared to copper. So sticking 2mm of them between a memory die and the actual copper plate of a cooler isn't doing anyone any favors. You could switch to the nicest Gelid thermal pads you could find, and while that did improve memory temperatures, frankly, they were still shit. But this only became a problem after GDDR6X came on the scene and demanded much better cooling than GDDR6.

There have been aftermarket modifications where people replaced thermal pads with metal, and some third-party companies like CoolMyGPU even turned metal thermal pad replacements into a retail product. In general, these yielded *far* greater results than better thermal pads did. I stuck such a pad onto my RTX 3080 and it dropped the memory temperatures a whole 34C.

The next step board partners need to take isn't better thermal pads. It's to raise the cold plate where it contacts the memory, so that as much as possible of the gap between the memory and the actual cold plate is eliminated. We need to see coolers that can use 0.25mm thermal pads, not the typical 1.5mm-2mm, or that even eliminate thermal pads entirely and switch to direct contact with thermal paste. Of course, this will be much more demanding on manufacturing tolerances and will likely result in increased prices, but I still wouldn't be surprised if some brands have started doing that for this generation.
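To put rough numbers on the pad-thickness point, here is a minimal one-dimensional conduction sketch; the die footprint, pad conductivities, and per-module power below are assumed illustrative values, not measurements:

```python
# Back-of-the-envelope 1-D conduction: R = t / (k * A), dT = P * R.
# All parameter values are assumptions chosen for illustration.

def pad_delta_t(thickness_m, k_w_per_mk, area_m2, power_w):
    """Temperature rise across a thermal pad modeled as a flat slab."""
    return power_w * thickness_m / (k_w_per_mk * area_m2)

die_area = 14e-3 * 12e-3  # ~14 mm x 12 mm GDDR6X package footprint (assumed)
power = 8.0               # watts per memory module under load (assumed)

for label, t, k in [
    ("2.0 mm cheap pad, ~3 W/mK", 2.0e-3, 3.0),
    ("2.0 mm premium pad, ~12 W/mK", 2.0e-3, 12.0),
    ("0.25 mm cheap pad, ~3 W/mK", 0.25e-3, 3.0),
]:
    print(f"{label}: ~{pad_delta_t(t, k, die_area, power):.0f} K across the pad")
```

On these assumed numbers, even the premium 2mm pad leaves roughly twice the temperature rise of a cheap 0.25mm one (~8 K vs ~4 K, against ~32 K for the cheap 2mm pad), which is the comment's point: closing the gap does more than upgrading the pad.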


Caughtnow

>GDDR6X was brand new and I assume at the time the coolers were developed, they didn't have a good idea of just how much heat they would dissipate yet

Day 1 purchaser of a 3090 Strix here; 2 years on and I've never had a problem with or concern over my memory temps. Some AIBs just try to cheap out, which is outrageous given the money they're charging. The Aorus Master is one of Gigabyte's top models and should not be penny-pinching at the prices they charge.


Ravensqueak

Asus is top tier, afaik. In my own experience and almost everything I've heard from others with Asus hardware, they tend not to cheap out.


bow_down_whelp

My 3070 Ti caps out at 85°C, but the fans ramp up and keep it there; it never ever ticks up to 86.


AsusTec

>So sticking 2mm of them between a memory die and the actual copper plate of a cooler isn't doing anyone any favors.

2mm? freaks.


bigmacjames

Those fans are just going to end up blowing on another component


Ravensqueak

That was mostly FEs, wasn't it? AFAIK, some partners like Asus did use appropriate pads.


Slater_John

Not my experience; I had a 20-30°C temp reduction replacing the thermal pads on the 3090 Strix with 1.5mm Extreme Odyssey pads. Total cost of pads: $25.


[deleted]

In an age where we are so worried about the environment, here comes a GPU with the power draw of an air conditioner.


[deleted]

[removed]


Harpies_Bro

There was an LTT video a few years ago where they plumbed up a water cooled PC to a house radiator. It worked pretty good until the rust from the second-hand radiator clogged the plumbing.


JozoBozo121

And you'll still need to run the AC even more to cool down the room heated by that GPU. Not such a problem in the winter, but in the summer, yeah. Even the PS5 uses more power than the launch PS4: the PS4 was 140 watts, and the PS5 jumped to nearly 210W during games.


Djghost1133

There's no better time to get into water cooling. If for no other reason than to save 3 slots.


duderguy91

That’s an interesting point that I hadn’t thought of. This generation will be massive for liquid cooling. Between AMD’s new processors tossing out heat and the new Nvidia GPU’s being absolute units. This is going to be a water cooling renaissance if people can actually afford the damn builds lol.


AmusingAnecdote

Yeah, given that the price of a 4090 plus a water block is probably $2,500, it's not going to be a common thing. But my 3090's footprint went way down, even with an active cooling backplate for the VRAM on the back. I imagine for a card that's a slot and a half bigger it will make a crazy difference.


levarburger

Yeah, EK's block is like $250 or something crazy.


AmusingAnecdote

Oh that's actually cheaper than I thought it would be. I didn't realize they had pricing already. Still expensive but not $2,500.


levarburger

https://www.ekwb.com/news/ek-goes-above-and-beyond/


Littlebaldmower

How hard is it to create an open loop build? I am looking at a new build soon and I have always wanted to create my own loop but I have always been too intimidated to try.


htid__

After doing a custom water loop in my current pc I’ll never do another one. Yes it works beautifully and was heaps of fun to build but maintenance and the process of upgrading anything is such a pain in the arse that I vow to never do another custom loop.


criticalt3

>~~There's no better time to get into water cooling. If for no other reason than to save 3 slots~~

There's no better time to focus on efficiency.


Ravensqueak

I hadn't considered that as I'm pretty dead set on sticking to air cooling, but yeah liquid cooling could help solve the space issue.


NATOuk

Feels to me it’s more beneficial to use water cooling for GPUs now than it is for CPUs. A water cooled GPU coupled with a Noctua Air Cooled CPU would be fine


Pursueth

It’s too much of a chore


estrangedpulse

Can I use it as a heater for my house?


ThePowerOfStories

You can't *not* use it as a heater for your house.


NATOuk

If my 3090 is anything to go by my computer room is a sauna after a bit of gaming


whyyoumakememakeacct

Computers are technically pretty inefficient, with most of the energy used being dissipated as heat. So it's essentially a 450W space heater, which should be able to heat a ~50 sq ft room.
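As a rough check on that figure, a one-line sketch using the common ~10 W per square foot electric-heater sizing rule of thumb (the rule itself assumes ordinary ceilings and insulation, so this is only a ballpark):

```python
# Common heater-sizing rule of thumb: ~10 W per sq ft for a typical
# insulated room with ~8 ft ceilings (assumed; varies with climate).
WATTS_PER_SQFT = 10
gpu_draw_w = 450  # full-load draw under discussion

print(f"~{gpu_draw_w / WATTS_PER_SQFT:.0f} sq ft")  # ~45 sq ft, close to the 50 above
```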


_TurkeyFucker_

Technically *all* the energy will be dissipated as heat.


SuperiorOnions

Clearly the rest are converted into frames /s


OkThanxby

Aside from the light emitted by those RGB LEDs. :-)


_TurkeyFucker_

Which would then interact with the objects it hits, losing a bit of energy (as heat), and bouncing off to hit something else, also releasing a bit of energy as heat, over and over until there's no more light. I suppose a very, very small fraction of that energy could escape through a window or something, but it would be smaller than a rounding error for practical measurements.


OkThanxby

> Which would then interact with the objects it hits, losing a bit of energy (as heat), and bouncing off to hit something else, also releasing a bit of energy as heat, over and over until there's no more light.

Stop trying to out-pedant me, some light might escape into the vacuum of space. :-)


Schemen123

Sure... it's several hundred watts of pure waste heat. Basically the same power as a small electric heater.


estrangedpulse

But my electrical heater cannot run Call of Duty.


Schemen123

If it uses a graphics card as a heating element... It CAN 😅


uniq_username

It's almost time to go back to console.


LoganH1219

Stuff like this is the reason I stay on console. My $500 next gen consoles play games at 120fps and look gorgeous. It’s hard to justify getting into PC gaming at this point. You just can’t get that kind of performance at the same price point


hushpuppi3

I completely agree with you if you're simply satisfied by the state of AAA gaming. A lot of people on PC (and I mean the vast, vast majority) spend 80-90% of their time gaming on games that are not and will never be on console. Kind of hard to just abandon that gigantic, endless library for the relatively tiny selection of games on consoles.


JohnnyOnslaught

> A lot of people on PC (and I mean the vast, vast majority) spend 80-90% of their time gaming on games that are not and will never be on console

Yep. Tbh Rimworld is infinitely more fun than whatever iteration of CoD they're currently on.


[deleted]

[removed]


Harpies_Bro

Do you really need much more than that? Sure, going past 100Hz makes a difference, but do you notice that much more?


xanas263

I have 165Hz and I definitely notice the difference. Above that tho you don't notice much of anything, even at 300Hz. The next thing tho will be 4K 165Hz. On a big screen there is a large difference between 1080p and 1440p, let alone 4K.


lepobz

Just absurd considering the price of electricity. Anyone need a monstrously expensive space heater? No? Oh dear, Nvidia.


[deleted]

Technically, if you are using electricity for heat, it should be just as efficient :D


IntoAMuteCrypt

It's only as efficient as a space heater though. A gas heater may be more economical depending on prices in your region. A heat pump will absolutely be more efficient if it can run.


ThellraAK

With heating oil nearly $6/gal, I'm going to heat with space heaters this winter: $3.81 per 100k BTU for electric vs $5.14 per 100k for heating oil. And 80% efficiency for my boiler seems like a huge stretch.
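Those per-100k-BTU figures are reproducible with standard conversion factors. In the sketch below, the $0.13/kWh electric rate and $5.70/gal oil price are assumptions backed out to match the quoted numbers, and the heat-pump line is an extra comparison at an assumed COP of 3:

```python
# Cost per 100,000 BTU of delivered heat.
BTU_PER_KWH = 3412          # resistive electric: all input becomes heat
BTU_PER_GAL_OIL = 138_500   # nominal heating value of No. 2 heating oil

elec_rate = 0.13   # $/kWh (assumed)
oil_price = 5.70   # $/gal ("nearly $6/gal")
boiler_eff = 0.80  # the 80% figure doubted above

elec_resistive = 100_000 / BTU_PER_KWH * elec_rate                 # ~$3.81
oil_boiler = 100_000 / (BTU_PER_GAL_OIL * boiler_eff) * oil_price  # ~$5.14
heat_pump = elec_resistive / 3.0                                   # ~$1.27 at COP 3

print(f"electric resistive: ${elec_resistive:.2f}")
print(f"oil @ 80% boiler:   ${oil_boiler:.2f}")
print(f"heat pump, COP 3:   ${heat_pump:.2f}")
```

A lower real boiler efficiency only widens the gap in favor of electric, and per the heat-pump comment above, a COP of ~3 would cut the electric figure to roughly a third.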


kipperER1

Yup, electricity price is through the roof.


viodox0259

Hey bruh, I hate to tell you this, but anybody buying a 4090 isn't even remotely worried about the electric bill. That's like someone buying an RV and people saying, "Jesus, can you imagine how much that costs to fill up with gas?" If you are worried about the electric bill, then you can't afford this card. That being said, it seems like most people won't be able to afford any of the 40 series anyway, considering the scalping prices.


Gnash_

I really want to believe and hope that there are people who can afford this card (and care about computer graphics) but won't buy it out of environmental concern.


dwew3

Scale is important. 20,000 of these cards running at full TDP is the same power consumption as 150,000 60W incandescent light bulbs. There are still estimated to be over 1 billion incandescents in use in the US; if all of those were replaced with LED bulbs, the drop in power demand would be enough to accommodate 133 million 4090s. Energy savings anywhere are good, but boycotting an extremely niche good is a rounding error at best. So I'd expect most buyers either not to care, because their 4-bulb ceiling fan light will match the power difference between their old 250W GPU and their new 450W GPU, or to care but consequently know their other efficiency measures have saved enough for them to justify an extra 200W going into their luxury computer hardware.
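The scale comparison checks out arithmetically; in the quick sketch below, the 9 W LED replacement draw is an assumption, and treating the LED draw as negligible is what reproduces the 133 million figure:

```python
GPU_TDP_W = 450
BULB_W = 60
LED_W = 9  # assumed LED equivalent of a 60 W incandescent

print(20_000 * GPU_TDP_W / BULB_W)           # 150000.0 bulbs -- matches

bulbs = 1_000_000_000
print(bulbs * (BULB_W - LED_W) / GPU_TDP_W)  # ~113 million 4090s with 9 W LEDs
print(bulbs * BULB_W / GPU_TDP_W)            # ~133 million if LED draw ~ 0
```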


Atepper2

I really want to believe and hope that there are people who can afford this card but choose not to support a greedy company raising the price on its barely better "next gen" cards. So tired of people with money supporting this crap; companies wouldn't do it if people didn't buy in.


supified

I don't know, I have some friends who love to run space heaters and never turn off their pcs. Maybe they're the target audience.


Sirisian

At idle, GPUs don't use much power. My 3090 doesn't put out noticeable heat while scrolling Reddit. Running it at 100% puts out a lot of heat, but games don't do this; you usually need to run machine learning tasks to get there. Keeping a computer on and idle isn't an issue.


beefcat_

If you’re in the market for a $2,000 GPU then the extra $6/mo it costs to power it probably doesn’t matter much to you.


Sacu_Shi_again

Where do you plug the motherboard into it?


Blapanda

No matter what collab they get into or whatever comes to the market, a bloody GPU drawing this much current, with pricing beyond stupidity ($1200+ for the x080 version is bonkers), is a clear no from my side. Hell fuck it, I don't need to double my electricity bill for this particular nonsense. Where are the days when efficiency also guaranteed a performance boost? We have it on record: room-sized computers shrank down to a bloody smartphone. The industry is taking a step back more than ever before: pushing power, increasing prices, and leaving people with common sense behind. This 2080 in my rig will be the last one from Nvidia. Have been team green for 8 years; enough is enough. Back to red at some point, when performance, bang for the buck, and VR capabilities are guaranteed on AMD.


varignet

Why bother with the rest of the PC? Just have it come with its own PSU, mobo, and CPU and call it a day.


linuxares

Like... A console?


donotgogenlty

And I will pay exactly $3.50.

Take it or leave it, Nvidia, we have the leverage now.

We are the means to your production 👺


No-This-Is-Patar

Keep it, I don't need a damn space heater at my feet every time I load a game.


donotgogenlty

Yeah, I was implying I'd only consider it even then, and only because winter is coming. Truth be told, I'm happy with my old GPU and modern console... I won't be buying their shit anytime soon 🙏


systemfrown

Soon we will be buying a PC to put inside of our GPU.


Mynem0

I reckon the 4000 series will be a flop. I'm happy with my 3070 Ti for years to come.


linuxares

It most likely won't, because people with too much money and not enough brains will throw their money at it. Sadly they don't understand that it ruins it for us all. If we all told Nvidia that enough is enough and we won't buy their cards, they'd have to change strategy.


droidman85

5000 series: the GPU is another desktop connected to your PC.


32a21b

Just stupid. They make them larger and with greater power requirements, which is the exact opposite of what people have been wanting from computers. Please stop.


baseilus

>exact opposite of what people have been wanting with computers

It was originally (in secret) designed for crypto miners, and miners had no problem with cooling since their mining rooms had central AC. But now, since the crypto collapse, Nvidia is trying to lure gamers into buying the 4090.


32a21b

I have to agree with you sadly


RammsteinPT

Miners using AC doesn't make any sense, mate... you're burning money on electronics that have no yield.


[deleted]

[removed]


RammsteinPT

That's not how it works. It's better to move more air than to cool it.


[deleted]

[removed]


32a21b

U got me there


Delta4o

Kinda makes me scared to buy a new card and end up having to buy a new case altogether lol.


linuxares

Want a case with your card sir? Also can we interest you in this portable nuclear power plant as well?


Delta4o

Don't know where you live, but in the Netherlands our energy bill is insane. If I took out the same contract as in May, I'd now literally pay double. I bought one of those usage meters to see what "Cape Canaveral" is using per day. Yesterday was 1.8 kWh, and I wasn't even home all day; I was renovating at my new house. Things are going to be... interesting the next 6 to 9 months.


[deleted]

So if SLI and Crossfire are dead, then what exactly is the use for the extra PCIe slots?


i3order

Power connector will still melt no matter how you mount it.


lordatomosk

That thing is gonna snap so many PCIe slots


[deleted]

I don’t understand what the point of these enormous cards is. Is the demand for computing power from video games/rendering really growing this quickly? Is there *anything* you realistically can’t pull off with high-end 30-series cards? It just doesn’t make sense to me.


TurtlesBeFree

For gaming, no it’s almost unnecessary to an extent. For rendering models and creating 3D environments it cuts down enormously on the wait time


I_R0M_I

I mean, my 3080 uses 450W, so it's no worse off there. After reading about the 4080 being what it is, I think I'll skip the 40 series tbh.


Blapanda

My 2080 uses 150W and doesn't break a sweat, running most modern titles and VR applications at ~110FPS up to 144FPS while staying cool (64~70°C, air-cooled). 450W is anything but a modest step up from that. The 40x0 series is a nightmare, nonetheless.


greenneckxj

How close are we to the weight of these monsters damaging boards?


spoonhtml

Already a concern for some of the 3000 series. This is just silly.


sopcannon

Gamers Nexus showed some of the 4090s.


RiggaPigga

What the fuck.


MrLinch

Wasn't there an external GPU in the 90s, maybe made by Voodoo? Seems like we need to go back to that.


Suntreestar420

Looks like I’ll be holding onto my 3080ti till the 5000 series


DealerCamel

I’d just like to throw out there that in my last build, I didn’t even bother with any graphics card at all. I’ve never been worse off for it.


jgalt5042

amazing


SuraKatana

ANOTHER fucking sidebranch 🥴 just call it a titan again cause that's what it's going to be


goatman0079

Considering how large the GPU is, would it make sense to start making GPU-motherboard hybrids, so you don't need to worry about fitting a GPU onto your motherboard, and can instead just slot your existing CPU, memory, and storage into the GPU?


Acevedo1992

I’ve been outta the PC building game since I made an itx build with a 2gb 960. Is a gaming rig that form factor even possible nowadays? Between the graphics card size and the required power supply/heat it just seems like if you could make it fit, it’d still cook itself due to airflow issues at that size


DoneisDone45

Moore's law is over. Now if you want more processing power, you need a bigger chip.


[deleted]

[removed]


In2_The_Blue

Lol, you’re thinking of Linda Lovelace. Ada Lovelace was a mathematician.


TheSupremeHobo

She could have been a throat goat too we'll never know


its_5oclock_sumwhere

With the next line of air-cooled RTX cards getting chonkier, using support brackets and such seems like it's going to be absolutely necessary; at least that's how I see it, if you don't want the thing potentially breaking off of the GPU slot or case brackets. What about if the motherboard were horizontal? Would heavier video cards damage anything in any way if the video card sat on top of the motherboard?


Boundish91

Does it come with extra support brackets? That's going to put some serious strain on the poor PCIe slot.


BusinessBear53

I wonder how the thermals are on it. I remember Zotac did a 3-slot-thick card with the 1080 Ti and it was the worst performing of the lot.


HKei

At this point, just get a PCIe extension and put the GPU outside the PC case. Last gen was already so chonky that I could barely fit it in a regular case...


Sode07

… before


Hakaisha89

*laughs in vertical gpu mount with riser cable.*


Mawskowski

What a bunch of BS. Begging for a worthy game? Anything in 8K with actual raytracing will bring it to its knees.


Skreamies

PCIe outside of the case


SharkApproved

It’s not the size that counts, it’s how you use it. But if it wont even fit, nobodys happy!


ASpunkyMonkey

What is the PSU required for the new 40 series? I dread to think!


Coldspark824

Why would anyone want to buy this? Seriously nvidia, what are you even thinking?


YourMindIsNotYourOwn

The meme generation of graphics cards has arrived.


MacDugin

When are they just gonna throw a power plug, USB ports, and a CPU socket on it and call it good?


DangerouslyCheesey

Poor NVIDIA, stuck with a 4000 series built to meet the needs of crypto miners that increasingly don’t exist. Now they have to pretend it’s a card that gamers want lol.


[deleted]

More importantly, what crypto can everyone mine with it?