Mark Gurman posted an update saying the M2 Extreme chip had likely been cancelled. I’m going to assume what Gurman is reporting is accurate, because it generally is.
Workstation chips are expensive to design and produce. CPUs like the Xeon are sustainable because Intel spreads the development cost across a wide range of customers. It didn't seem sustainable for Apple to design a workstation chip just for the Mac market. According to Gurman, that was likely a factor in the cancellation of the M2 Extreme.
If Apple is refocusing the Mac Pro on customization and modularity, I think that's a very good thing. But Apple should go a step further and bring support for third-party GPUs back to Apple Silicon.
In general usage, the 64-core GPU in the M1 Ultra has lagged behind Nvidia's and AMD's previous generation of GPUs. And Nvidia and AMD's latest GPUs are even better. A 72-core GPU in an M2 Ultra will be a relatively lackluster upgrade that won't keep up with AMD and Nvidia. If Apple can't produce the GPUs they need right now, why not support Radeon GPUs on Apple Silicon?
It might be a painful decision for Apple to make. Having a single GPU vendor producing a single GPU driver on the Mac makes writing applications simpler for Mac developers. Both Apple and developers have been looking forward to only needing to target a single GPU with a single set of optimizations. iPad and iPhone apps also rely on Apple SoC GPUs to work properly when running on the Mac.
But adding third-party GPUs back into the mix might be manageable. Every Mac, including the Mac Pro, would still ship with an Apple GPU. Software that requires an Apple GPU would still be able to run. Apps and games that were more flexible could run on a third-party GPU or eGPU when present.
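To make that concrete, here's a minimal Swift sketch of how a flexible app could prefer an external GPU when one is attached and fall back to the built-in Apple GPU otherwise. It only uses Metal APIs that already exist (MTLCopyAllDevices and the isRemovable flag, holdovers from the Intel-era eGPU days); the "prefer the removable device" policy is just an illustrative assumption, not anything Apple ships on Apple Silicon today.

```swift
import Metal

// Pick a rendering device: prefer an attached eGPU / third-party GPU when
// present, otherwise fall back to the built-in Apple GPU.
// (Hypothetical policy for illustration only.)
func preferredRenderingDevice() -> MTLDevice? {
    let devices = MTLCopyAllDevices()

    // An eGPU enclosure shows up as a removable device.
    if let external = devices.first(where: { $0.isRemovable }) {
        return external
    }

    // Otherwise use the system default GPU -- on Apple Silicon,
    // that's the on-die Apple GPU every Mac ships with.
    return MTLCreateSystemDefaultDevice()
}

if let device = preferredRenderingDevice() {
    print("Rendering on: \(device.name)")
}
```

Apps that depend on Apple GPU features would simply skip the enumeration step and keep using the default device, which is exactly how things work today.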
It seems, mostly, like a win for everyone. Consumers could plug a MacBook Air into an eGPU to play the latest games on a big monitor. Pros who demand the best GPU performance could continue to use one or more MPX cards or eGPUs. Because an Apple GPU would still be in every Mac, developers would continue to optimize for it. Developers would still need to test and optimize for Radeon GPUs separately, but that's better than the alternative of users leaving the platform for greener pastures.