This is the new Apple Silicon Mac Pro

News


Jan 03, 2024

The Mac Pro might not look different from its predecessor on the outside, but on the inside, Intel's Xeon CPU and AMD's Radeon Pro graphics are gone, and in their place we have a new chip called the M2 Ultra. This is the same chip in the new Mac Studio; it has a 24-core CPU and an up to 76-core GPU, and it starts with twice the memory and SSD storage of the old Mac Pro. Apple promises it will be "3x faster" than the Intel Mac Pro. Memory tops out at 192GB. These stats all match the new Mac Studio—the only thing you get from the bigger chassis is expansion capabilities and more ports.

The whole point of a Mac tower is support for traditional expansion cards, and that normally means discrete GPUs. Apple demoed some expansion cards, but none of them were graphics cards. It sounds like you’ll be using the M2 Ultra's on-board GPU. Making real graphics cards work with an ARM chip would have been a massive undertaking—for starters, no ARM drivers exist. Even for the non-GPU options, compatibility will be an interesting problem. Apple calls out digital signal processing (DSP) cards, serial digital interface (SDI) I/O cards, and additional networking and storage as PCI express card possibilities.

Apple's transition from Intel to ARM is now complete, and there's no denying they've done a fantastic job. The competition is catching up, but for now, the Mac laptop lineup in particular is in the best state it's ever been in.

Follow me on Twitter @thomholwerda

This is very good, but you’d essentially be paying $3,000 for a motherboard and chassis.

Here is the Mac Studio ($4000): https://www.apple.com/shop/buy-mac/mac-studio/24-core-cpu-60-core-gpu-32-core-neural-engine-64gb-memory-1tb

And here is the new Mac Pro ($7000): https://www.apple.com/shop/buy-mac/mac-pro/tower

Same CPU, GPU, RAM, and SSD. Supports the same number of displays. Both have the same Wi-Fi and Ethernet (10GbE) options.

The only difference is in expansion. There are more USB4 (Type-C) ports: 8 vs. 6; more USB-A ports: 3 vs. 2; and more HDMI ports: 2 vs. 1. And of course it comes with a free mouse!

(Okay, there is the option to add seven PCIe cards, which is respectable for a workstation. But again, except for the motherboard, almost everything else is the same).

I don't think they support PCIe GPUs, so this Mac Pro may be even more niche than the previous one. It's basically just a Mac Studio with PCIe expansion, and there may be few use cases that aren't already covered by the Studio.

I was expecting a beefier SoC, especially given the thermal envelope of the chassis.

To be honest, they use the power/cooling headroom for potential upgrades. But then, the machine easily becomes $10,000+

As for the GPUs, yes, unless they make up with nvidia, I don't see drivers arriving to fix the current situation. And while having 7(?) full sized nvidia GPUs for machine learning would be really nice, then it puts the M2's own tensor cores into question.

Even storage would be overkill. Assuming you'd use PCIe x4 NVMe, and maybe port multipliers (bifurcation), one would need 7 x 4 x 4TB (the largest size I know of today) = 112TB of storage to saturate those ports. And that is overkill. (I mean, there are use cases that need such storage, but you're better off going with a dedicated server at that point.)
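The back-of-the-envelope math above can be sketched as a quick calculation. The slot count, drives-per-slot via bifurcation, and 4TB drive size are all the commenter's assumptions, not confirmed Mac Pro specs:

```python
# Rough ceiling for NVMe storage via the PCIe slots (commenter's assumptions).
PCIE_SLOTS = 7       # expansion slots assumed usable for storage cards
DRIVES_PER_SLOT = 4  # one x16 slot bifurcated into four x4 NVMe drives
DRIVE_TB = 4         # largest single NVMe drive assumed available, in TB

total_tb = PCIE_SLOTS * DRIVES_PER_SLOT * DRIVE_TB
print(f"Max storage to fill every slot: {total_tb} TB")  # 7 * 4 * 4 = 112 TB
```

As the comment notes, workloads that actually need 112TB of local flash are usually better served by a dedicated storage server.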

Circling back, this might signal a hope for a future GPU expansion. (I did not include AMD or Intel GPUs, since they are practically unused in machine learning at the moment).

I think there is still plenty of bad blood between Apple and NVIDIA, so I doubt CUDA's AI libraries are going to make it to macOS/ARM.

It's such a low volume product, that I don't think even AMD or Intel want to invest the time and effort to get Apple's business for discrete GPU support on Apple Silicon.

I would be interested in seeing what the topology of those PCIe slots is, since previous Apple Silicon had fairly limited PCIe controllers and PHYs.

Yeah, Intel isn't likely to do Apple any favors, even if it could make a decent discrete video card. AMD is their only hope, absent an Nvidia resolution.

I was expecting a beefier SoC, especially given the thermal envelope of the chassis.

Yes. I was expecting something quite different to this myself. Socketed SoC, user-replaceable RAM (maybe hard with the SoC, but I'm sure you could do it…), multiple sockets for SoCs.

Basically, I was expecting to be able to option it with everything from a bog-standard plain M2, all the way up to multiple (2? 4?) M2 Ultras, with support for terabytes of user-replaceable RAM. Heck, I'd have been happy with replaceable RAM even if it was proprietary; the clone makers would reasonably quickly make compatible products.

The new ARM Mac Pro somehow seems even more limited than the trashcan Mac.

The123king,

Yes. I was expecting something quite different to this myself. Socketed SoC, user-replaceable RAM (maybe hard with the SoC, but I'm sure you could do it…), multiple sockets for SoCs.

Basically, I was expecting to be able to option it with everything from a bog-standard plain M2, all the way up to multiple (2? 4?) M2 Ultras, with support for terabytes of user-replaceable RAM.

I agree. Apple's "mac pro" category, once known for giving power users options and flexibility, has been reinvented and is getting more restricted. IMHO it's become harder to understand who this is for. While someone can make the argument that enterprises and professionals can afford the Mac Pro's $7k entry-level price point, I would think the expectation for high-end workstations is that they are extremely flexible in the ways you describe and can be easily upgraded for years. This is the opposite.

Heck, I'd have been happy with replaceable RAM even if it was proprietary; the clone makers would reasonably quickly make compatible products.

The new ARM Mac Pro somehow seems even more limited than the trashcan Mac.

Personally I wouldn't be happy with proprietary RAM. Consider the related topic of Apple's proprietary storage on the 2019 Mac Pro: were any 3rd parties able to bypass the DRM to produce clones? I searched but couldn't find any, and even if they achieved it, how confident can users be that Apple wouldn't block the clones with a future update?

https://everymac.com/systems/apple/mac_pro/mac-pro-tower-faq/how-to-upgrade-mac-pro-tower-rackmount-ssd-storage.html

Unfortunately, because the SSDs are controlled by Apple's T2 security chip, it seems unlikely that more affordable third-party replacements for these modules will become available.

As far as I know 3rd party NVMe solutions for the mac pro have their own PCI addon cards and need their own macos drivers.

Third-Party Internal SSD Upgrade Options (PCIe)

Thankfully, because the "2019" Mac Pro models have eight PCIe slots, third-parties have been able to release PCIe cards equipped with fast NVMe storage for literally hundreds of dollars less than the default Apple upgrade options.

I had previously read that some users needed to disable macOS's "System Integrity Protection" to load 3rd-party drivers. It makes me wonder whether these PCIe cards are compatible with the M2 Mac Pro. Honestly, as someone who's been accustomed to vendor-independent standards and being able to install off-the-shelf hardware all my life, I would find it extremely annoying and regressive to go to proprietary hardware with a single vendor having so much control over my upgrade options.

And no one will ever make PCIe cards for it (OK, Apple will probably release a handful at the start, and never touch it again for 6 years like the trashcan model). Who would, when a main purpose of their Metal API is to lock out 3rd-party GPUs?

I have a suspicion that inside the black box is basically a small Mac attached to the "motherboard" via a couple of Thunderbolt cables, and the motherboard is just a big hub that accepts PCIe cards.

They don't need any cables, if they already are in charge of the design.

It would be just extended traces on the motherboard.

Thom Holwerda,

Apple's transition from Intel to ARM is now complete, and there's no denying they've done a fantastic job. The competition is catching up, but for now, the Mac laptop lineup in particular is in the best state it's ever been in.

It seems premature to say "they've done a fantastic job" without reviews and verified benchmarks. I feel Apple is too often given way too much leeway in the press before data comes in. Maybe they will have done a fantastic job, but surely that must be contingent on how well it actually does in the field, right?

Tom's Hardware suggests the M2 Ultra could be 20% faster than the M1 Ultra, and if user experience bears this out, then that's good: https://www.tomshardware.com/news/m2-ultra-mac-studio-specs-price-release-date

The M2 Ultra chip will have 24 CPU cores (16 performance cores and 8 efficiency cores) for 20% faster performance than the M1 Ultra, with options for up to 76 GPU cores.

IMHO the inability to add dedicated GPUs is really disappointing though, considering it's one of the main benefits of the Mac Pro's form factor. I get that Apple doesn't want to outsource hardware any more, but it seems like Mac Pro users are going to be disadvantaged if they're limited to the iGPU for AI and render-farm work while non-Apple users have a lot more scalability. I expect Apple will eventually have to offer more in terms of GPU, but who knows how long the wait will be.

It would've been interesting if Apple had offered up an add-in card based on their GPU, and made those extra slots actually worth something, but oh well.

Drumhellar,

It would've been interesting if Apple had offered up an add-in card based on their GPU, and made those extra slots actually worth something, but oh well.

Yeah, we talked about this for the M1. Theoretically they could make some kind of M2 GPU add-in card with dedicated power and memory that behaves and scales a lot more like other dedicated GPUs. AMD and Nvidia devs take this for granted, but Apple kind of painted themselves into a corner with the shared/unified memory model. Of course, in principle, software can be rewritten to support GPUs with dedicated memory. However, considering these Mac Pros have a $7k starting price (plus whatever this hypothetical PCIe add-on GPU would cost), how many mainstream developers would even bother to target such niche hardware? I'm not sure, but it seems to me that the best way to encourage software support for discrete GPUs is to include discrete GPUs in some MBP laptops.

It might not happen in the near term, but I do think Apple is going to have to cross this bridge at some point if they don't want to lose demanding customers to platforms that offer expandable GPUs.

It is such a low-volume product that it is not worth the bother to do a GPU-specific silicon part, much less a board built around it.

javiercero1,

It is such a low-volume product that it is not worth the bother to do a GPU-specific silicon part, much less a board built around it.

Or they could use the same m2 chips they already have and just put them in a cluster with high speed interconnects. I’d agree apple may not be interested in targeting this market, but if apple allowed other OEMs to use their m2 chips there would almost certainly be 3rd parties interested in doing it. Alas, apple is unlikely to allow it, but there would be interesting opportunities for innovation if they did.

@ Alfman

These things don't just sell magically. They require a lot of support, especially in terms of drivers, customer service/engagement, etc. Stuff which is not part of Apple's business model whatsoever, nor do they have the corporate culture for it. It would need too much investment for little return in the big scheme of things.

Apple does AS to sell their services and HW. They seem to have decided that supporting discrete GPUs is not a use case with enough return for the investment. So that market will go elsewhere. There are much better alternatives anyway.

javiercero1,

These things don't just sell magically. They require a lot of support, especially in terms of drivers, customer service/engagement, etc. Stuff which is not part of Apple's business model whatsoever, nor do they have the corporate culture for it. It would need too much investment for little return in the big scheme of things.

I wasn't talking about apple doing it themselves, as I said earlier "I’d agree apple may not be interested in targeting this market". 3rd parties however would be very interested and they already build clusters using other ARM CPUs. I have no doubt there would be interest in OEM applications for m2 if apple would allow them to do it.

Apple does AS to sell their services and HW. They seem to have decided that supporting discrete GPUs is not a use case with enough return for the investment. So that market will go elsewhere.

Users used to be able to plug in eGPUs to expand their rendering capacity. I think the GPU shortcoming is going to be a growing problem for Apple users.

There are much better alternatives anyway.

Are there better ARM processors than apple's m1/m2? If not then I think there is demand. Whether apple chooses to participate is a different matter.

@ Alfman

The "3rd parties" are the "customers" I was talking about. Apple would still have to provide support for their SoCs to them in terms of drivers, engineering/system integration, never mind the sales channels that as of right now Apple has ZERO experience in. So it makes no sense whatsoever for them to waste their time and effort in that market. Especially since they're capacity-limited as it is in terms of their SoC production.

Workstation is not defined by ISA. Right now, if you want to do heavy duty GPU compute it is all about CUDA, and CUDA still works best on a linux environment (or Windows to a slightly lesser extent). So Apple is not really interested in that market either, since NV is too entrenched as it is.

It does not bode well that Apple's own website only compares it to the previous version, which means it probably can't compete with just an RTX 4090. Very problematic when you price it at $6K and expect businesses to expense it.

dark2,

It does not bode well that Apple's own website only compares it to the previous version, which means it probably can't compete with just an RTX 4090. Very problematic when you price it at $6K and expect businesses to expense it.

Yeah totally. Apple are kind of notorious for that. Whenever they do compare themselves to competitors, they often censor their data, procedures, benchmarks and hardware models being compared in order to make it impossible for anyone to independently reproduce their results.

It would be an apples to oranges comparison. They both have very different design goals.

On raw power, I think 4090 would have a significant lead. It also is compatible with most modern gaming (which Apple silicon almost completely lacks), and has more features (like DLSS).

However, the M2, especially the M2 Ultra, offers much more RAM. The unified memory goes up to 192GB, whereas the 4090 is stuck at 24GB: https://www.techpowerup.com/gpu-specs/geforce-rtx-4090.c3889. This is a major difference for some workloads, like machine learning, which is memory-heavy. (At least currently).

Additionally, RAM bandwidth has roughly caught up: 800GB/s vs. 1,008GB/s.

That is why Apple mentioned LLMs. They don't mention this, but deep-embedding-based recommendation systems are also memory-heavy. They would all significantly benefit from the M2's setup.

(As a reference: nvidia's 80GB memory A100 card is about $16,000)
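To illustrate why 192GB of unified memory matters for LLM inference, here is a rough sketch of the memory needed just for fp16 model weights (2 bytes per parameter). The model sizes are illustrative assumptions, and real inference also needs room for the KV cache and activations:

```python
# Approximate memory for fp16 model weights only (2 bytes per parameter).
# Model sizes below are illustrative; KV cache and activations add more on top.
BYTES_PER_PARAM = 2  # fp16

def weights_gb(params_billions: float) -> float:
    """GB of memory needed just to hold the weights in fp16."""
    return params_billions * 1e9 * BYTES_PER_PARAM / 1e9

for name, params in [("7B", 7), ("34B", 34), ("70B", 70)]:
    gb = weights_gb(params)
    fits_4090 = gb <= 24     # RTX 4090 VRAM
    fits_m2_ultra = gb <= 192  # M2 Ultra unified memory
    print(f"{name}: ~{gb:.0f} GB  fits 4090: {fits_4090}  fits M2 Ultra: {fits_m2_ultra}")
```

A 70B-parameter model needs roughly 140GB for fp16 weights alone, which fits the M2 Ultra's 192GB of unified memory but is far beyond the 4090's 24GB, which is the point the comment is making.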

sukru,

On raw power, I think 4090 would have a significant lead. It also is compatible with most modern gaming (which Apple silicon almost completely lacks), and has more features (like DLSS).

However, the M2, especially the M2 Ultra, offers much more RAM. The unified memory goes up to 192GB, whereas the 4090 is stuck at 24GB:

I'd like to see benchmarks comparing the performance at various RAM densities, but assuming the performance doesn't degrade too much, that could be very interesting combined with a top-tier GPU. That's the thing: their otherwise strong platform is let down by rather limited GPU hardware. On top of that, they're not supporting open standards like Vulkan, which hurts Apple users. I feel they should just bury the hatchet and allow owners to use the best GPUs on the market. With a huge amount of low-latency RAM, it could have actually been a very compelling system. I think I'd be happy with their CPUs but disappointed with their GPUs. Who knows, maybe they'll deliver better GPUs in the future, or maybe they won't.

It is unlikely Apple will capture any of the AI market with this. First, the form factor isn't a rack-mount server. Second, the Metal API is known for not supporting double-precision floating point. Third, Nvidia won't be hamstrung by needing to provide the "G" part of GPU in their enterprise offerings; they can just offer cards that are pure AI hardware acceleration and nothing else.

Alfman,

The Apple "excuse" is that Metal predates Vulkan by about two years. In any case, there is an open source effort to bridge the gap: https://github.com/KhronosGroup/MoltenVK. But they frequently run into hard issues.

dark2,

These can still be used for LLM inference, which is practically impossible on current consumer or even workstation GPUs. The top models run around 100GB or so: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard

And then there is also this: https://www.apple.com/shop/buy-mac/mac-pro/rack

But you need to spend another $1,000 for the rack version.

Sukru,

IIRC, Vulkan is based on AMD's Mantle effort, which predates both Apple's Metal and Microsoft's DX12.

Apple's excuse here is no more valid than with regards to Lightning predating USB-C.

Apple's transition from Intel to ARM is now complete, and there's no denying they’ve done a fantastic job.

No, they blew it. Just because they pretend Thunderbolt eGPUs and PCIe GPUs don't exist, it doesn't mean they don't. This has been a problem since the first Apple Silicon chip launched, and Apple has been dancing around the issue ever since; it seems like they've hit a design limit with their iPad-derived ARM64 chips here. They got a pass with the Mac Studio, since it's not meant to be a Mac Pro, but do they really want us to believe that a professional who has been accustomed to a pair of Radeon W6900X GPUs or a quartet of W6800X GPUs (2×2) would be satisfied with a pair of SoC GPUs in the M2 Ultra, no matter how overgrown? I understand why they don't want to spend money to deal with the issue given the limited share of the Mac Pro compared to other Macs, but some humility about the fact that the outgoing Intel-based Mac Pro would be the last true macOS workstation would be appreciated instead of all the boasting.

No "true" Scottish workstation.

I assume the number of Mac Pros Apple moved is ridiculously small, and AMD is probably not willing to invest any more time/effort in doing any more GPUs for the macOS market.

Honestly, it sounds like Apple is just conceding the discrete GPU workstation market to Windows/Linux. The AS NPUs are fine for executing models, not necessarily for training them. It sounds like they are not even bothering to enter the general AI development market, or at least not the part dealing with training huge models. They are not bothering with the high-end 3D pro market either.

Sounds like Apple is just expecting their workstations to be mainly Final Cut Pro or Logic seats, or similar AV type of pro use cases.

In any case. If you want to do high end AI development stuff, just get a decent x86 Linux Workstation with some beefy NV GPUs and call it a day. That market is cornered right now with the CUDA/Linux stack.

Agree. Apple already left the server market, and the actual workstation market is next to go. They did the ARM-based Mac Pro mostly to be able to claim that they had entirely moved to the ARM realm. It is essentially paying a lot of money for the ability to get some niche PCIe cards attached to Mac Studio computing power. The next generation will probably replace it with an external enclosure attached to the Mac Studio – which is, in essence, a "trashcan" Mac Pro done right.

It's probably massively complicated to add the required plumbing for standalone graphics cards, but I’m sure they will get there at some point. Integrated solutions aren't able to handle all workloads but apparently the demand for something more is not all that huge — at least in the Mac bubble.

So that's the end of Intel Macs and therefore also Hackintoshes.

1. Last Intel Macs sold (Mac Pro) in 2023
2. Add 5 years of OS updates (macOS 14 just came out, supporting no devices older than 5.5 years)
3. Add 2 years of security updates to the last supported OS version

So I think the Intel Mac Pro and Hackintoshes will lose support in 2030.
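The timeline above works out as a simple sum. The 5-year OS-update window and the 2-year security tail are the commenter's assumptions, not a published Apple policy:

```python
# End-of-support estimate for Intel Macs (commenter's assumptions, not Apple policy).
LAST_INTEL_SOLD = 2023  # last Intel Mac Pro sold new
OS_UPDATE_YEARS = 5     # assumed years of major macOS updates after purchase
SECURITY_YEARS = 2      # assumed years of security updates for the last major version

end_of_support = LAST_INTEL_SOLD + OS_UPDATE_YEARS + SECURITY_YEARS
print(f"Estimated end of Intel Mac support: {end_of_support}")  # 2030
```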

I think it will be sooner. The PPC to x86 transition was faster. Tiger was the first x86 compatible version, followed by Leopard which was still PPC compatible, followed by Snow Leopard which dropped PPC compatibility. I already expect Sonoma to be the last or second to last x86 version. Sad, but we had a good run.

As far as Hackintoshes are concerned, it should be sooner as getting past the T2 chip requirement is going to be difficult, if at all possible.

Transition to ARM is not complete until Apple starts properly supporting internal peripherals such as dedicated graphics cards.

sj87,

Transition to ARM is not complete until Apple starts properly supporting internal peripherals such as dedicated graphics cards.

Your point is valid for users left without an upgrade path, but apple themselves may not see it that way. They may be intent on dropping intensive GPU use cases altogether such that they consider the ARM transition officially complete without any GPU upgrade path.

The big question I have is how many real apple customers care about this? The answer will determine whether apple can afford to merely shrug them off, or if these users have enough clout within apple to demand better dedicated GPUs in the future. The thing is, a non-insignificant number of apple users are creative types who are fans of macos, but their time is money and apple's lack of GPU upgrade paths is a sore spot.

How much time can a macOS professional afford to lose? Admittedly it's not an easy choice, especially as Microsoft is making Windows less appealing and more tethered with each version, while Linux is harder to support and doesn't have as much software.

I think Nvidia could have sued Apple over not signing their drivers (and probably still could to this day in regards to ARM Macs), but likely assessed, after Apple destroyed their 3D workstation market with the trashcan, that the lawsuit would cost more than future GPU sales to Mac workstation clients (especially with Apple pulling moves like dropping OpenCL and other things that just make it harder for competing GPUs). Apple has relegated themselves to video editing and being a status symbol.
