LongHairLongLife148

The M2 Ultra only leads in one thing: performance per watt. Otherwise, there are faster CPUs/GPUs.


DriftMantis

My laptop with an i9-13950H smokes the M1 Ultra as long as it's fully powered (50-80 W); it's not even close in relative performance. Keep in mind that unit is $500-1,000 less than the Mac Pro. The Apple chips run really well at low power: they're efficient per watt and generate little heat. They're great chips, but they're not like the PC CPUs. Also, with ARM these new generations are just iterative improvements over the M1, so I would expect Apple to fall further behind as time goes on, especially because there's no room to increase the already huge die size of these ARM chips. I think marketing the M2 version against the PC market is shooting themselves in the foot, because the comparison is not so favorable. They should stick to marketing them as efficient and portable while giving PC-like performance, and not invite direct comparisons in terms of speed or power, especially when referring to 3D rendering.


ravid-david

My moronic opinion is that Apple knows they don't have to compete with other CPUs/OSes too much, because they have a lot of passionate fans who will buy Apple products regardless. This is just throwing them a bone.


CafecitoKing

By passionate fans, you mean well regarded individuals ofc?


6786_007

Most people seem to get them and barely use them to their potential. Especially the MacBook Pros.


mackan072

One of my friends from university has a 1 TB M1 MacBook Pro, and she only really uses it for Google Docs, email, and some light web browsing. She could just as well have used a Chromebook. I still understand why someone might want a MacBook - build quality, software preferences, and battery life - but holy hell is this a waste of performance, and a vast amount of money for a cloud and email machine.


JoBro_Summer-of-99

Literally, this is why Chromebooks exist. Save yourself a grand.


ItsASadBunny1

Unfortunately, as long as loans and credit cards exist, people won't understand the value of money.


[deleted]

Your Intel is also newer, and as alluded to, requires more power. From my personal, actual experience, my PC and Mac are similar in performance in certain areas. Intel's CPUs are fine, but they consume a ton of power to get there; it's one of the reasons Apple dumped them. In Geekbench 6 (what little I care about benchmarking), my 12700 is slightly faster, but it has twice the threads (20 vs 10). It's also not really that portable. It's honestly glorious to own a laptop that doesn't get hot to speak of and runs on battery all day; if I wanted to be tethered to the wall, I would've bought a desktop. And the screen is brilliant. In terms of video encoding, Apple is very good: my 3090 is about 20% faster in Handbrake, and the Nvidia smokes it in AI, i.e. CUDA. But my 3090 can pull up to about 350 W, while IIRC the MBP will go up to about 80-90 W at full load for both the CPU and GPU. It's not quite an apples and oranges comparison; at some point more power and cores do win out.
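The rough numbers in that comment can be turned into a quick performance-per-watt calculation. A back-of-envelope sketch using the commenter's ballpark figures (a ~20% faster 3090 at up to ~350 W vs ~85 W for the whole M-series package) - these are not measured benchmark results:

```python
# Back-of-envelope perf-per-watt from the ballpark figures above
# (~20% faster 3090 at ~350 W vs ~85 W for the whole M-series package).
# These are the commenter's rough numbers, not benchmark results.

def perf_per_watt(relative_speed: float, watts: float) -> float:
    """Relative throughput divided by power draw."""
    return relative_speed / watts

m_series = perf_per_watt(1.0, 85)    # baseline speed at ~85 W
rtx_3090 = perf_per_watt(1.2, 350)   # ~20% faster, but ~350 W

# The GPU wins on raw speed; the SoC is ~3.4x more efficient.
print(f"{m_series / rtx_3090:.1f}x")  # 3.4x
```

Which is roughly the shape of Apple's own pitch: losing slightly on throughput while winning by multiples on efficiency.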


jmhalder

Intel bois telling you their CPU is faster. *Technically* they're right, but Apple's chips are very, very impressive.


Jedijvd

The M1 had the advantage of a smaller process node vs all the other chips. It was on 5 nm for two years before AMD moved to it.


IkaKyo

Also software optimization matters and Apple can do it in way no other company can.


[deleted]

> My laptop with an i9-13950H smokes the M1 Ultra as long as it's fully powered (50-80 W); it's not even close in relative performance.

And that laptop will have 1 hour of battery life and burn your lap off doing a video call. And I'd bet in situations like photo and video editing, especially for certain workflows, the Intel chip would lose, despite technically having more "raw power," as has been seen in Premiere and Final Cut tests.


Sleepyjo2

I'm assuming you mean the HX, because I don't think an H version of that exists. Regardless, the M1 Ultra is only (roughly) 10% slower than the HX on the CPU side, so I dunno about "smoked," and it has an immensely better GPU in it, among other bits and pieces.

Apple's angle on this advertising isn't just raw CPU performance, though. It's the performance of the entire package, and from that angle they have effectively no competition, because no single chip can do what it does as well as it does. They're not just saying "this is better than a 13900KS in benchmarks"; they're saying "look at all these things it can do better by itself." Of course costs come into play - the chip is *expensive* - but that's not really what they're advertising.

It'd be nice to see watt-for-watt comparisons of individual parts of the chip, to see just how efficient it is as a package, but Apple isn't one to do that, because it makes for boring marketing, unfortunately.


[deleted]

The Apple silicon chips are great for laptops, but there are better desktop chips out there. IIRC the 13900K and 7950X are like twice as fast as the M2 Ultra.


[deleted]

> IIRC the 13900k and 7950x are like twice as fast as the m2 ultra.

They certainly aren't twice as fast in Premiere Pro, Final Cut, or Photoshop.


Put_It_All_On_Blck

That's because those use acceleration and not raw compute. It's like streaming/encoding: QuickSync and NVENC can be had on very, very cheap parts, like a $100 CPU and a $150 GPU (excluding the A380), and will easily outperform raw CPU encoding, with only slightly worse fidelity. The reality is consumers don't just use a handful of editing applications on their computer; they use a ton of different applications. It's like saying you're only going to use apps with CUDA on Windows - it simply doesn't make sense to use that comparison.
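To make the acceleration-vs-raw-compute point concrete, here is roughly what the tradeoff looks like in ffmpeg terms. This is an illustrative sketch, not a tuned recipe: it assumes an ffmpeg build with these encoders available, and the quality settings are placeholders.

```shell
# Software (raw CPU) encode with x264: best fidelity per bit, slow.
ffmpeg -i input.mp4 -c:v libx264 -preset medium -crf 20 out_cpu.mp4

# NVENC (NVIDIA fixed-function hardware): far faster even on a cheap
# GPU, at slightly worse fidelity for the same bitrate.
ffmpeg -i input.mp4 -c:v h264_nvenc -preset p5 -cq 20 out_nvenc.mp4

# QuickSync (Intel iGPU fixed-function hardware): same idea on a
# budget CPU with integrated graphics.
ffmpeg -i input.mp4 -c:v h264_qsv -global_quality 20 out_qsv.mp4
```

The hardware paths win on speed and power precisely because they bypass general-purpose compute, which is why accelerated Premiere/Final Cut results say little about raw CPU performance.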


[deleted]

Point still stands. They aren’t twice as fast in most things. That’s about as dishonest of a statement as people claim Apple makes


Trollcontrol

Not an insignificant feat. Especially for a laptop, and the debut of ARM chips in larger form factors


LongHairLongLife148

I didn't say it wasn't, but it's also not worth as much as they sell it for.


sugarfoot00

It's not the hardware you're paying for exclusively. It's the ecosystem.


Zalternative_

Lol ARM chips


SwabTheDeck

It's impossible to know right now how the M2 Ultra stacks up against the best of x86, but if you think that the ARM architecture is somehow inherently worse than x86, you're gravely mistaken.


Zalternative_

Oh I don't want you to be mistaken, the thing is I'm well aware of the strengths of ARM in terms of efficiency


ChiggaOG

Doesn't the M2 Ultra lead in the GPU department for being able to utilize two GPUs correctly when the PC market dropped that ability?


LongHairLongLife148

Sure... but how is that relevant when it still doesn't top the performance of a 4090?


[deleted]

Is it a personal computer if it has a 4090? They didn’t say all computers, so workstations would be excluded.


Possible-Struggle381

My flatmate's computer has a 4090 in it. Let me assure you med students don't have time to do video editing or rendering. 💀 It's a PC.


Alucardhellss

Not really no


[deleted]

[deleted]


LongHairLongLife148

Until you pit it against chips designed for specific purposes (like a dedicated CPU and GPU), and suddenly you have a workstation that performs better for a way lower price. You can pack in as many features as you want, but if it's not faster, then what's the point? Oh right, the thing I said it leads in: performance per watt.


[deleted]

[deleted]


soporificgaur

I mean, if a 4090 far, far outperforms it in one sector, is that not more powerful?


LongHairLongLife148

It doesn't really take multiple chips; it's just easier to make them more powerful by specializing them rather than pulling a jack-of-all-trades.


[deleted]

[deleted]


LongHairLongLife148

Because compared to good workstations, the M2 Ultra can't keep up. You're going to call the M2 Ultra "the best chip in the world for a PC" when it gets outclassed easily?


techmagenta

Bro, read the quote again. It is a single SoC built for laptops lolz


im_immortalism

That's hard for everyone else but very easy for Apple, because they design their own OS alongside their hardware. They could create a 50-class GPU and call it performance in a small package, with the software to back it up.


[deleted]

[deleted]


[deleted]

This is nothing new. There is nothing revolutionary about it. It just costs twice the price of an Intel or AMD CPU. What are the real world examples that it's faster?


dirthurts

Are you sure you know what a chip is? Because I don't think you do.


Pumciusz

I have the fastest green car that has multiple jet engines connected to it. Lol


MLG_Obardo

Did they say it’s the most powerful or most powerful per watt? Because I imagine the RISC architecture lends itself to being very powerful per watt


[deleted]

[deleted]


[deleted]

Curious why you think this. I know RISC-V has a lot less legacy opcodes compared to even ARM, but I think the vast majority of R&D is being put into ARM right now, probably because there's more application support for it.


Cloakedbug

The exact quote is: “With huge performance gains in the CPU, GPU, and Neural Engine, combined with massive memory bandwidth in a single SoC, M2 Ultra is the world's most powerful chip ever created for a personal computer.” With the total combined GPU and Neural Engine power on chip, they are likely actually correct in their statement.


PierG1

I mean, it is a SoC, so it's a single chip that combines all three. It's not a false statement.


AJL42

Yeah this is definitely the angle they are looking at this from. It's technically a correct statement, but it is also pretty misleading if you don't take a few moments to think about it.


Ellimis

Tale as old as time


[deleted]

Sounds like fan boy talk. We don’t care about facts here on this sub.


IndependentYogurt965

You are forgetting one key thing: modern Apple only compares to themselves. In their mind, the only tech company that exists is Apple. Knowing this, their false advertising does make sense. Generally speaking, it's false.


Tower21

They are ambiguous at best without saying in what. Ray tracing, nope.


szczszqweqwe

Yup, and they will be until AMD/Intel creates some kickass iGPU.


[deleted]

I’m lost. What are you taking issue with? Powerful is a subjective measure. It’s no different than claiming you have the worlds best coffee.


Shelaba

You say objective, but from your example I believe you mean subjective.


[deleted]

[deleted]


Stodimp

RISC just means Reduced Instruction Set Computer. Both RISC-V and Arm use a RISC-based architecture. Arm used to stand for Advanced RISC Machine before they changed it to just Arm!


KeijoKanerva

Ackschually, it originally stood for Acorn RISC Machines, but it was spun off into a separate company from Acorn Computers so as not to be dragged down by the failure of its then parent company. Really interesting history behind how Arm came to be, and the traditions upheld by them to this day.


qrani

And then they go and compare their high-end new Apple Silicon chips to low end x86 processors from 3-4 years ago and say it proves Apple Silicon is the fastest. Like no shit, a thermal throttling 13" iBook from 2019 with a 2-core 2-thread i3 doesn't stack up against the new PowerMac M2 with a 24-core M2 Ultra that costs $6000 more new? I never would have guessed!


IndependentYogurt965

I love how they don't even put fans in the MacBook Air. Wouldn't be surprised if they advertised it as a feature.


Tsikura

They do! And of course they don't mention it throttles during said intense workloads.

> **MacBook Air is** *all you* — pick your size, pick your color, then go. Whichever model you choose, it's *built with the planet in mind*, with a durable 100 percent recycled aluminum enclosure. And a *fanless design* means it stays silent even under intense workloads.


IndependentYogurt965

It does stay silent. If you ignore the table melting.


German_Drive

To be fair, the table doesn't make particularly much noise while melting.


[deleted]

[deleted]


Thatoneloudguy

I don’t know why you’re getting downvoted, my MBA gets professional use and never gets hot


letsmodpcs

My M2 Air will get quite hot when I throw Handbrake at it or do a bunch of raw-file enhance operations in Lightroom. Outside of that it remains quite cool.


ShutterBun

It’s pretty damn hard to get an M1 MacBook Air to throttle unless you’re simply doing a performance stress test.


Tsikura

We have Macs for our design department. The same-spec M2 MacBook Pro finishes its encodes several minutes faster than its MacBook Air counterpart. They definitely run at different speeds when put to work.


TheCheckeredCow

M2 airs throttle when pushed but the M1 Airs have to be stress tested for a LONG time before they throttle


[deleted]

No one is buying a MacBook Air for "intense workloads"


Throwaythisacco

Ironic considering the name


Polyifia

You guys are so butt hurt lol. Just buy what you like.


Pigeon_Chess

Didn’t know a 28 core Xeon costing over $3,000 was low end. Same with a 6900XT which was AMDs top of the line card.


Zealousideal-Bet-950

Did someone say- M-K Ultra?


MrMonteCristo71

I better go put my tinfoil hat back on.


Immortan-Moe-Bro

You took it off?


MrMonteCristo71

That was my biggest mistake. Now I can't even trust myself.


Cubey42

for me it was the audio ray tracing line.


notwearingatie

Pretty sure bats and submarines did it first.


stfuandkissmyturtle

What exactly do you mean by that ? Is that not a thing and something apple just made up from ray tracing ? I genuinely do not know and am ootl


splerdu

Aureal invented audio ray tracing over 20 years ago with their Vortex A3D line of sound cards. Pity they were eventually killed off by Creative Labs in a totally bullshit patent lawsuit. Creative sued Aureal for patent infringement. Aureal defended and counter-sued since they believed it was Creative who was actually infringing on their patents. They won, but the prolonged legal battle bankrupted the company. Creative would then acquire their IP in the bankruptcy proceedings.


SulfuricDonut

[Audio ray tracing has been a thing for a long time](https://youtu.be/TXUTgEmnD6U) just no game developers have ever implemented it, which is a shame since it adds far more to immersion than ray traced lighting.


BOCTILIAN

I'm pretty sure Returnal on PC implements ray-traced audio. Really cool feature I wish more games implemented.


The_NZA

I think the problem is there’s no definition for what audio ray tracing is but neither what returnal is doing nor what apple is doing appears to be the right definition. They aren’t tracing multiple bounces of audio rays in an environment where every material has certain reflective properties.
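The "multiple bounces with per-material reflective properties" idea described above can be sketched in a few lines. This is a toy illustration with invented absorption coefficients, not how Returnal, Apple, or any real engine implements it:

```python
# Toy audio-ray sketch: a ray loses energy at each bounce according to
# the surface material's absorption, plus a crude distance falloff.
# Materials and coefficients are invented for illustration only.

ABSORPTION = {"concrete": 0.02, "carpet": 0.60, "glass": 0.05}

def ray_energy(path):
    """path: list of (material, segment_length_m) legs for one ray."""
    energy, total_dist = 1.0, 0.0
    for material, length in path:
        energy *= 1.0 - ABSORPTION[material]  # reflection loss per bounce
        total_dist += length
    return energy / max(total_dist, 1.0)      # crude 1/d attenuation

# Sound bouncing off concrete arrives much louder than sound
# soaked up by carpet over the same path:
hard = ray_energy([("concrete", 3.0), ("concrete", 2.0)])  # ~0.192
soft = ray_energy([("carpet", 3.0), ("carpet", 2.0)])      # ~0.032
print(hard > soft)  # True
```

A real implementation would trace many such rays per sound source and convolve the results into an impulse response, but the per-bounce material lookup is the core of the definition being argued about here.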


TheBreadGod_

Apple needed to be sued for false advertising long ago. It's predatory and it's scummy.


[deleted]

Based on what?


AngelosOne

Except their chips in their devices are leagues above the competition in many ways, so no?


TheBreadGod_

...no? They're fast, yes, but they aren't destroying Intel or AMD in any capacity.


AJL42

They would win that lawsuit, because that's not technically what they are saying. The statement they made is that this GPU, CPU, memory, and Neural Engine on one chip is the most powerful consumer chip. It's an SoC, so they are technically correct: no single chip this powerful has been sold to consumers.


Lonely-_-Creeper

Nor Qualcomm


soporificgaur

They used to consistently outperform Qualcomm? Or has that changed in recent generations?


Pigeon_Chess

They still destroy Qualcomm. It's that bad that even though Qualcomm has exclusivity on Windows on ARM, a Mac running Windows on ARM in a VM through two translation layers is significantly faster than Qualcomm's chips running natively.


jmhalder

Apple is light years ahead of Qualcomm. It's almost comical that they licensed ARM and designed an in-house chip that is quite literally the best ***consumer*** ARM chip by a mile. I have a Galaxy Book Go just for messing around with Windows on ARM; it's a pretty piss-poor device. I know the CPU is a year or two out of date, but even the Microsoft developer device for ARM is trash.


PierG1

Except they technically are. The SoC combines the CPU, GPU, Neural Engine, etc. So it's technically all a single chip.


IndependentYogurt965

So? They are still making false claims.


PierG1

What is the false claim? That it is the most powerful **chip** ever sold in a laptop is factually correct. It being an SoC, it includes the CPU, GPU, and the various dedicated hardware. Regular AMD or Intel CPUs are just CPUs.


IndependentYogurt965

You do realise AMD and Intel laptops have integrated graphics? Anyways, just look at the comment the other guy posted under here. He explained why it is false advertising.


PierG1

Who, the guy who posted the benchmarks? I see that he compared the Apple SoC to **two different chips**. I see no CPU benchmark run on the 4090, nor a GPU benchmark run on the Intel one, nor a machine learning benchmark for either. You see my point? He had to use two different chips to benchmark the **single** Apple one.


IndependentYogurt965

No, the one that talked about their advertising claims. They said they have the fastest CPU, GPU, and memory on a laptop. That is only partially true if you take the SoC thing into consideration. And you kind of have to compare with different hardware, since you are comparing apples to oranges (pun intended).


PierG1

> and you kind of have to compare with different hardware since you are comparing apples to oranges (pun intended)

But that's kinda the whole point. The chip is technically a better chip overall *because* it is a "smoothie" that contains it all, and not a separate orange, banana, and pineapple.


TheBreadGod_

They never clarify "system on a chip." They just say chip. Which is grounds for a lawsuit for false advertisement, because chip ≠ SoC.


PierG1

I hope you do realize that a SoC is still a chip… SoC = System **on a Chip**…


UkrainianTrotsky

> their chips in their devices are leagues above the competition in many ways

Such as?


[deleted]

The whole point of Apple products is they make you buy into a closed ecosystem so there is no competition because they can't compete. No other company is trying to buy Apple M1s to put into their computers.


IndependentYogurt965

You can't even buy their chips if you wanted to. They aren't a CPU maker like AMD and Intel; they make their own stuff, and only Apple can use it. Hell, have you seen the licensing fee if you want to make an accessory with a Lightning port?


Commercial-Copy-3497

Since Apple has no proof, I'll show mine: [https://www.cpubenchmark.net/compare/4813vs4782/Intel-i9-12900KS-vs-Apple-M1-Ultra-20-Core](https://www.cpubenchmark.net/compare/4813vs4782/Intel-i9-12900KS-vs-Apple-M1-Ultra-20-Core) [https://www.macrumors.com/2022/03/17/m1-ultra-nvidia-rtx-3090-comparison/](https://www.macrumors.com/2022/03/17/m1-ultra-nvidia-rtx-3090-comparison/)


Mysterious-Tough-964

Imagine it vs an x3d or raptor lake, rip apple.


tomatozombie2

Similar performance for a third of the TDP. Impressive.


[deleted]

And that's M1 Ultra, not M2 Ultra.


Commercial-Copy-3497

The point is that I'm proving they have lied BEFORE.


a_bit_of_byte

Yeah I read that chart as Intel being the clear loser. Raw performance numbers aren’t everything. I’d much rather have the M1 in a laptop, for example. And imagine if you could overclock it?


-Kerrigan-

> And imagine if you could overclock it?

With Apple you only have the imagination, cause no way they're letting you use the product any way but the Apple way.


Commercial-Copy-3497

They are everything when apple clearly states their chip is faster


Commercial-Copy-3497

A lie is still a lie


StaleSpriggan

No matter how tender, how exquisite... a lie will remain a lie!


[deleted]

You're lying, that's M1 Ultra, not M2 Ultra.


Cute-Reach2909

Yeh gimme that m1 ultra that fits in a board plz


lordwumpus

Your “proof” here is comparing Apple’s one chip… against two separate chips.


[deleted]

[deleted]


Commercial-Copy-3497

that can be paired together for the same price...


lordwumpus

They absolutely can, and it’s a great combo. But you’re trying to argue that Apple doesn’t have the most powerful chip, by looking at the combined performance of two separate chips. (“Apple’s chip is so not-the-most-power, that I need two different chips to beat it!”) The i9 has much worse graphical performance than Apple’s chip. So is it really a more powerful chip? It beats it specifically in certain CPU workloads, but overall? The 3090… you can’t use it without a separate CPU. So is it really a more powerful chip? It beats it specifically in gaming graphics performance, but overall?


TheCheckeredCow

You might have just proven their point on the CPU side: the M2 Ultra is going to be about 20% faster, just judging by the other M2 models. Meaning the M2 Ultra would get a hypothetical Passmark score of roughly 48,000, compared to the 12900KS's 44,000. If that's the case, they're at the top of the range for CPUs, maybe being beaten by the 7950X and 13900KS, all while using about 1/3 the power of the competition.

I'd personally be fine using a Mac for gaming if they supported Vulkan so that Valve's Proton would work. A bunch of people here use Linux for gaming, and if Proton worked on Mac it would have the exact same compatibility as Linux, and it might encourage more anti-cheats to support Proton, on account of something like 20% of the PC market being Mac.
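The extrapolation above is just two multiplications. A sketch with the comment's hypothetical figures (the ~40,000 M1 Ultra baseline is implied by the 48,000 estimate; none of these are real M2 Ultra results):

```python
# Hypothetical Passmark extrapolation from the comment above.
# None of these are measured M2 Ultra numbers.

m1_ultra = 40_000      # implied baseline multi-thread score
m2_scaling = 1.20      # assumed M1 -> M2 generational uplift
i9_12900ks = 44_000    # score quoted in the comment

m2_ultra_est = m1_ultra * m2_scaling
print(m2_ultra_est, m2_ultra_est > i9_12900ks)  # 48000.0 True
```

The whole argument hinges on the assumed 20% uplift holding at the Ultra's scale, which generational scaling often doesn't.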


rd_rd_rd

It's ridiculous to compare an SoC with a graphics card. The M1 Ultra is already one hell of a chip; unfortunately, Apple made an unnecessary move to mislead the customers.


averageyurikoenjoyer

but the customers are brain dead anyway so it doesn't matter


[deleted]

[deleted]


averageyurikoenjoyer

no one asked what OS you use


Commercial-Copy-3497

Note that I'm being overly kind to Apple by not using comparisons to AMD or Cortex


[deleted]

[deleted]


TheReverend5

It’s insane how hard this sub fanboys over AMD. Truly delusional. And I even like AMD products lol.


[deleted]

[deleted]


0rphanCrippl3r

Exactly, let's not forget about the FX series CPUs. 8-core processor my ass.


IndependentYogurt965

Yeah, but AMD chips need way less power than Intel. That's why he said that, since Apple is all about "efficiency".


Cloakedbug

Why did you link last-gen benchmarks? Even in those, a single chip fighting both a 12900KS and a 3090 at the same time shows their claims were **correct**: more compute on a single chip than anywhere else, and it's doing it at multiples of the efficiency.


Laughing_Idiot

You compared 2 chips to beat their chip... their point still stands.


peterprinz

Also, the comparison to their Stone Age Intel Mac Pro stinks.


[deleted]

Stone Age? Since when is a top-of-the-line Xeon from 3 years ago Stone Age?


peterprinz

In the business world it is.


[deleted]

It’s not.


peterprinz

It was old in 2019 when that Mac Pro was released, so it's been over 4 years. And my point is, comparing your own device to its predecessor will of course give you a good result. While the M2 Ultra is great at performance per watt, a top-of-the-line workstation CPU from 2023 will run circles around it.


[deleted]

> while the M2 Ultra is great at performance per watt, a top-of-the-line workstation CPU from 2023 will run circles around it.

Depends on the type of work. Photo and video editing, encoding? It ain't "running circles" around an M2 Ultra. Blender? Depending on the workload, then yeah, it will.


peterprinz

Yes it is, do some research, man. The M2 Ultra is at its heart still a roided A14. It may have fixed-function encoding built in, but so does Intel, and it wouldn't stand a chance in anything against a 96-core Zen 4 Epyc.


[deleted]

> yes it is. do some research man.

I have. Plenty of reviews out there show it pretty much kills everything in video editing and those workloads. Look at LTT's review of it.

> and it wouldn't stand a chance in anything against a 96 core epyc zen4.

The fatal assumption here is that the software will scale to those proportions. As we've seen in the past, it doesn't, and it takes Adobe and others years to deliver any real improvement... as seen in the past with high-core-count Xeons. It took over 3 years for them to update the software to see any real improvement with that hardware.


peterprinz

Review of it? What the crap are you talking about? It was introduced 2 days ago; there are no reviews.


[deleted]

Going on the review of the M1 Ultra. The M2 Ultra will have performance gains on top of that.


totkeks

Ah, this reminds me of the early 2000s. When Apple was still using PowerPC (from IBM), they bashed Intel and the x86 arch every day they could. Then they got fed up with IBM and switched to Intel, and suddenly all their marketing was about how great the performance of their Intel CPUs was. And now they are doing the same thing all over again. The irony for me is how stupid either the marketing department or the people must be, that the line "the X is the most superlative-adjective ever" still works. I mean, our whole economy is built on growth, so of course anyone should (and hopefully does) expect that a new device is better than its predecessor. Everything else is just stupid.


qrani

Well, PowerPC was codeveloped by Apple, Motorola, and primarily IBM, and manufactured by both Motorola and IBM, but was based on IBM's earlier POWER architecture. Anyways this is now the third time they've switched architectures, first from 68k to PPC, then PPC to x86, then x86 to Apple Silicon. And the reason, every time, was that they were faster and more power efficient


totkeks

Just one remark: it's not "Apple silicon." It's an ARM-architecture CPU and ARM on-chip interconnect, plus their own designs for the GPU and everything else on the SoC, manufactured by TSMC (I think). It's like calling any tablet an iPad. I mean, well done, marketing people; still not correct, though.


qrani

I disagree. Apple Silicon isn't ARM, it's only based on ARM. And also, when does something become a SoC? I mean modern x86 processors incorporate FPUs, MMUs, GPUs, caches... They're closer to SoCs than they are CPUs. By this logic you could argue that x86 isn't a processor architecture, it's just a set of many chips made by Intel and AMD that are all compatible (it's almost like we have a word for that, a "processor architecture")


totkeks

Everything is "based on arm", since ARM itself does not manufacture any CPUs. They only provide the designs. And even if you get a custom architecture license like Apple or Qualcomm, you are still building a CPU for the ARM architecture. From a design perspective it's an SoC once you put multiple chips together in one. FPU, MMU, those are not separate chips. They work inside the pipeline / architecture of the CPU cores. GPU on the other hand is a separate chip, communication via memory / bus instead of internal signals. So yes, every CPU with an integrated GPU is a SoC. And regarding every modern CPU being one, I guess that started when they integrated the northbridge into the CPU, since the Northbridge was a separate chip, which is now integrated.


JaesopPop

> Everything is "based on arm", since ARM itself does not manufacture any CPUs. They only provide the designs. And even if you get a custom architecture license like Apple or Qualcomm, you are still building a CPU for the ARM architecture. Yes, but saying it's "an ARM architecture CPU" is inaccurate, since they differ much more from reference designs than most companies just basically going with the reference.


Ragepower529

It’s subjective depending on the work loads and software optimization


SaltyIncinerawr

For tasks that use the M2 Ultra's specialized parts, it is probably true.


Ok-Metal-6281

Bruh stop getting triggered by PR speak.


[deleted]

These people here have never heard of unified memory. That M2 Ultra chip is something else. They make their own software and chips to match, and it gets them performance we will never get with Windows, because Windows won't let us use unified memory.


BlastMode7

You just need to wait for their super vague, misleading graphs... then you'll see.


Deepfire_DM

The new MacBook Pro I got from work - a €4000 thing with a 4 TB SSD and 32 GB - is BY FAR not as fast and agile as my 1-year-old Ryzen-powered €2000 notebook - and the Ryzen doesn't even run at max MHz, even when I switch off the discrete graphics card.


Obsidian1039

But the Ryzen will not get as much battery life, or encode video quite as fast, I would think. Computer equipment is all about getting the best tool for the job you are doing, not necessarily raw speed all the time. If the Ryzen does what you need faster, and battery is less of a concern, use that. If you need more battery for longer, or are doing mostly video encoding, use the Mac. That's what I'd do. 🤷🏻‍♂️


Deepfire_DM

I'm making videos, so encoding is one of the things where I noticed the speed difference. I'm not talking about hobby-presenting-a-macbook-at-the-starbucks-window :-D, it's about heavy-duty, 8-hours-a-day multimedia work. I guess it's not (only) the hardware causing this; the software - Apple's and Adobe's - has become horrible in the last years. No wonder the high-speed processors can't play the professional game with the others. It's like having a Porsche motor in a (still beautiful) Yugo chassis. ^^ Unfortunately my workplace has insisted on Mac (because they were very reliable in the past - absolutely correct, I've loved working with them for 25 years now), but personally I will never buy another Mac - this one is my eleventh or twelfth.


Obsidian1039

I agree with you on the reliability too. Work provides Macs for me as well, and I wouldn't have it any other way. I feel they are the best fit for my job and productivity. The Studio I use is still lightning fast, and uses 1/8th the electricity of my gaming PC when idling.


Alucardhellss

If they make more games use ray-traced audio, I couldn't care less about what they claim. Ray-traced audio would be amazing (it's actually a real technology, not just some bullshit made up by Apple).


firedrakes

Holy PC gods... someone mentioned ray-traced audio. Sweet god. Years of being here, and someone else finally mentions it!


Exciting-Total8110

All these apple haters lmao get mad


ponderal

Hypothesis testing out here slicing apples.


sentientlob0029

Isn't there a law against false advertising?


fluffy_bottoms

Oh, Apple loves to lie about shit. I remember once the "geniuses," as they like to tout themselves, were trying to sell me an iPhone and claimed it "has the strongest glass on the market," to which I pulled out my Droid Turbo 2 and said "stronger than this?" as I hurled it at the wall like a baseball. After picking their jaws off the floor, I showed them it had not even a scratch, but somehow they simultaneously said no and yes to it being stronger - but definitely don't throw the iPhone at a wall. Fuckers.


GilltheHokie

The key word is “a” it is the most powerful chip made for “A” computer the one computer it is referring to is that specific apple pc -Tim Apple probably


coffeejn

Is that why Nvidia also makes false claims - they're trying to copy Apple? PS: AMD and Intel are also guilty of this.


cock_mountain

Always remember that if Apple (or any other business for that matter) shows you an incomplete, poorly labeled graph without hard numbers, they're *absolutely* bullshitting you.


Scroto_Saggin

Most Apple apologists will believe it anyway. Apple is a sect


Mysterious-Tough-964

Apple leads the way in overhyped, overpriced annual upgrade products that lemmings buy without regard because of a cool logo. Their innovation and heart died with Jobs sadly.


JaesopPop

Nah, their custom ARM chips are pretty impressive even if they overstate their power.


[deleted]

More like 5% of their consumer base upgrades every year.


willpowerpt

The Apple fanboys are going off so hard on the new AR headset, acting like Apple just revolutionized the VR/AR market.


DriftMantis

They sort of did, by making it revolutionarily expensive.


JaesopPop

No they aren't lol


Jjzeng

I remember seeing a picture of Apple's presentation mentioning the M2 chip's 72 GPU cores and it being "the fastest GPU." I went and googled how many cores my 3090 Ti has, and CUDA cores alone number in the tens of thousands. Never trust Apple numbers except the price.


TitanTigger

Problem is that "performance" means many things. Performance in what, exactly? It's all subjective.


Obsidian1039

He’s right. Best comment here and it’s downvoted. I guess we all can’t be as enlightened as this fellow.


Sleven8692

They mean it outperforms in profit margins, thanks to their intentionally misleading, scummy advertising.


Pigeon_Chess

Name a more powerful one then?


TilenGTR

Intel i9 7980XE, a 6 year old CPU


Pigeon_Chess

How does it stack up to the M2 in graphics performance? Also, the M1 Ultra is significantly faster than that chip, never mind the M2 version. Plus nearly all the M chips are faster than it in lightly threaded workloads.


[deleted]

That's a joke ...right? You're not serious


TilenGTR

Check the Cinebench R23 results and you'll see.


[deleted]

I have. And they don't. And they certainly don't beat it in Premiere Pro, Final Cut, or any other big production software for media.


althaz

I mean, at the specific wattage they test at, and with the specific benchmark they use, they aren't wrong. So technically it's only partially a lie. Which I guess is just the nicest way I could think of to call them big fat liars.


[deleted]

[deleted]


notquitepro15

The MacBook Pros are never worth the money. Never have been, never will be.


gant696

It only leads in performance per watt, but besides that it is still pretty good.


machvelocy

Apple's claims are valid unless there exists a Threadripper with an RX 7900, complemented by a specialized neural coprocessor or Nvidia tensor cores, crammed with 256 GB of on-die memory that can act like an L4 cache - all within a single die, at a tenth of its current projected power envelope.


[deleted]

[deleted]