Mac ComfyUI Performance: Current Beta Limitations and Future GPU Acceleration Prospects

Hello everyone,

I am a MacBook Pro user (M3 Max, 128 GB RAM, 8 TB storage). I know that the ComfyUI desktop version for Mac is still in beta, and I also know that Apple's ARM-based chips use MPS and Metal to run and accelerate ComfyUI, but the performance is far behind NVIDIA CUDA machines. As I understand it, ComfyUI on Mac can currently only generate and edit images, which is good; for video generation and heavy training such as LoRA and other large models, we still have to switch to WebUI. I've even heard that some people running ComfyUI on Mac have burned out their CPUs. This means that Mac users currently cannot experience every aspect of ComfyUI, and some ComfyUI communities advise me outright to switch to an NVIDIA-based PC. So I want to ask: is it possible that the ComfyUI desktop version on Mac will, in the future, increasingly approach the acceleration of a native NVIDIA GPU?
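For context on the MPS/Metal point above: ComfyUI runs on PyTorch, and on Apple silicon PyTorch exposes the GPU through its MPS (Metal Performance Shaders) backend rather than CUDA. A minimal sketch of the kind of backend selection involved (this is an illustrative snippet, not ComfyUI's actual startup code):

```python
# Sketch of how a PyTorch-based app like ComfyUI picks a compute
# backend on a Mac. Assumes PyTorch >= 1.12, where the MPS backend
# was introduced; falls back to CPU when no GPU backend is available.
try:
    import torch

    if torch.backends.mps.is_available():
        device = "mps"   # Apple-silicon GPU via Metal Performance Shaders
    elif torch.cuda.is_available():
        device = "cuda"  # NVIDIA GPU via CUDA
    else:
        device = "cpu"   # no GPU acceleration available
except ImportError:
    device = "cpu"       # PyTorch not installed; nothing to accelerate

print(f"Selected device: {device}")
```

The practical gap is that many kernels and libraries are written and optimized for CUDA first, so even when the MPS path works, it often lags in coverage and speed.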

If my opinion is incorrect, please correct me.

Thanks,
Cao

Hi. I have the same MacBook Pro! :slight_smile: Honestly, Mac performance with AI is not going to come anywhere near NVIDIA CUDA soon. NVIDIA is far in the lead on AI hardware, with virtually every AI/ML solution being developed for its CUDA architecture first. ComfyUI can give us a tool that works on Macs, but AI performance will always be constrained by the underlying hardware. I'd love to see Apple enter the game on AI, but its investments in this space have been disappointingly minimal. Coupled with its long-running feud with NVIDIA, a partnership between the two any time soon (even just allowing NVIDIA eGPUs to work with Macs) is very unlikely. I bit the bullet and bought a PC for the first time in 25 years, just to be able to engage with this brave new world of AI, and I don't regret it.

If you want a detailed explanation of why AI/ML performs best on CUDA, I'd advise (ironically) asking Gemini or ChatGPT… that will inform you best.