Can I use MATLAB with an NVIDIA GPU on macOS 10.14 Mojave and newer?
Accepted Answer
MathWorks Support Team
on 2 Sep 2022
Edited: MathWorks Support Team
on 24 Mar 2021
MATLAB requires that an NVIDIA-supplied graphics driver be installed on your Mac in order to take full advantage of an NVIDIA GPU. NVIDIA has not released an Apple-approved graphics driver for macOS Mojave. For more information, please see this official statement from NVIDIA on NVIDIA's developer forums.
The impact on MATLAB is as follows:
Graphics
You can use MATLAB with an NVIDIA GPU on macOS Mojave and newer; however, graphics performance is degraded compared to running MATLAB on earlier releases of macOS.
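A quick way to see which graphics implementation MATLAB has ended up using is the standard opengl command (a minimal diagnostic; the exact fields printed vary by release):

opengl info   % prints Version, Vendor, Renderer, and whether software OpenGL is in use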
Computational acceleration
NVIDIA-specific functionality such as CUDA is not available, which means that GPU arrays, provided by Parallel Computing Toolbox and used by many products, will not work.
The following products have features that make use of CUDA functionality, and these features will be impacted by the lack of an NVIDIA-supplied graphics driver:
- Parallel Computing Toolbox
- GPU Coder
- Image Processing Toolbox
- Deep Learning Toolbox
- Statistics and Machine Learning Toolbox
- Computer Vision System Toolbox
- Signal Processing Toolbox
- Communications Toolbox
- Phased Array System Toolbox
- Text Analytics Toolbox
- Reinforcement Learning Toolbox
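A minimal check from within MATLAB for whether CUDA acceleration is available (assuming Parallel Computing Toolbox is installed; gpuDeviceCount and gpuDevice are its standard functions):

if gpuDeviceCount > 0
    d = gpuDevice;   % select and inspect the default CUDA device
    fprintf('Using %s (compute capability %s)\n', d.Name, d.ComputeCapability);
else
    % This branch is taken on macOS Mojave and newer: with no
    % NVIDIA-supplied driver, no CUDA device is visible to MATLAB.
    disp('No supported GPU detected; gpuArray functionality is unavailable.');
end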
32 Comments
Aleksander Tyczynski
on 14 Aug 2019
Hello,
Would downgrading to macOS 10.13 mean that I could make full use of the NVIDIA GPU on my Mac? Use it for the Deep Learning Toolbox?
Thanks
Colin Fraser
on 14 Aug 2019
Hi Aleksander,
Assuming that NVIDIA was able to release Apple-approved drivers for 10.13, you should be able to use the GPU on your Mac. Just make sure the MATLAB release you use is supported on that OS.
-Colin
Jason Ross
on 14 Aug 2019
Yes, the GPU will still work with the CUDA driver on macOS 10.13 for now. There are some things to keep in mind with respect to this support in future versions of MATLAB, though:
- The generations of GPUs supported on macOS are Pascal and Kepler. There is no support for Volta or Turing cards as of this writing. NVIDIA will drop support for Kepler and Pascal cards at some point in the future; there is no hard date for that yet.
- MATLAB has a dependency on the CUDA SDK and Toolkit, which in turn have dependencies on system compilers. The toolkit version and compilers will continue to advance, and at some point there may not be an NVIDIA driver that supports that toolkit release, or that works with a current compiler.
- Apple will at some point stop providing security upgrades for macOS 10.13. We publish a road map of which release of MATLAB supports which release of macOS here. For macOS 10.13, the last supported MATLAB release is R2020a.
Walter Roberson
on 24 Aug 2019
Edited: MathWorks Support Team
on 25 Sep 2022
The discussion from "metacollin" at https://forums.developer.nvidia.com/t/when-will-the-nvidia-web-drivers-be-released-for-macos-mojave-10-14/65895 is interesting. The claim made there is that Mojave itself no longer uses OpenGL and that Apple will not approve any drivers that lack Metal support, which NVIDIA does not yet have.
As Metal is a proprietary API, it is not obvious that it is financially worthwhile for NVIDIA to write such drivers.
Matthew Fitzgerald
on 14 Sep 2019
Edited: Matthew Fitzgerald
on 14 Sep 2019
Sorry to reopen this can of worms, but is Metal support on the development roadmap for MATLAB?
Or would it be possible to develop a "translator" or something similar to be able to use the GPU through Metal, or is it just too much work for an individual or small team?
Apple's developer documentation describes support for parallelisation, matrix operations, etc., and it looks like they should be able to be driven in a Metal context.
Walter Roberson
on 14 Sep 2019
The "translator" is the existing opengl framework from Apple, which they are saying that they will stop supporting soon and which they will certainly not improve. I would expect by two, at most three OS releases from now that MacOS will pretty much not function with opengl.
Apple considers such a translator to be too much work for them, so I would not expect Mathworks to be able to handle it.
Matthew Fitzgerald
on 14 Sep 2019
Thanks Walter.
I guess either Apple and Nvidia will work out their differences, or Apple and AMD will develop Metal to be a real alternative to CUDA.
In the meantime, does anyone know if it's possible to use something like PlaidML for these hardware/software (Apple/Metal, PlaidML/Matlab) combinations?
Walter Roberson
on 14 Sep 2019
Edited: MathWorks Support Team
on 25 Sep 2022
Metal is a graphics protocol, not a GPU compute interface.
In theory Apple could certify a set of GPU compute drivers for CUDA that were distinct from the graphics drivers. I do not know what either NVIDIA or Apple is thinking at this point.
Based on past history, I can speculate, without any inside knowledge at all (so I could be wrong, wildly so):
Apple seems willing to have NVIDIA say "fine, we won't bother porting to Apple then!". It has been 6 years since Apple put an NVIDIA GPU into anything other than the Mac Pro, so except perhaps on the Mac Pro side, their direct revenue doesn't depend much on NVIDIA.
Apple probably has more to lose from the game industry's dependence on OpenGL, and several major high-profile games companies are working on Metal ports and (I gather) getting performance better than DirectX, so Apple can expect to keep some of the games market, the high-performance end.
One reason Apple can afford to tell/let NVIDIA take a hike is that Apple has AMD to rely on: the old trick of playing one company off against the other.
But really Apple hates being dependent on one company, because that gives the company too much leverage. Apple's solution is to go in-house, to build its own graphics and GPU. Indeed it has already been working on that for years (https://www.ft.com/content/8c778fae-18df-11e7-a53d-df09f373be87), and I find 2015 articles about this. They have already put their own GPU into some of their phones.
Apple has also been working on replacing x64 with an in-house CPU (https://www.engadget.com/2018-04-03-apple-macbook-laptop-chips.html), with a possible Mac out next year. If I recall correctly, definitions for the new CPU have already been found inside the OS currently in beta, Catalina (which, by the way, ends 32-bit support).
If I were MathWorks I would probably think hard about holding off on putting effort into Metal until more was known about the new CPUs, because if the new CPUs are not machine-code compatible with x64 then it is not obvious that MathWorks will want to bother: it would be a big effort for a platform estimated by some parties to be roughly 15% of their market.
Oh yes, if Apple goes in-house for GPUs (already known to be well underway) then there is no certainty that they will be AMD- or NVIDIA-compatible; more likely they will not be, supporting OpenCL at most. This is a reason why it would be risky for MathWorks to spend much effort on AMD GPUs for Apple systems.
I can talk about these things because all I know is what is known to the public: I have not discussed this with MathWorks or NVIDIA or Apple or AMD.
Michael Melnychuk
on 29 Nov 2019
Hi Walter
On NVIDIA's site there is a CUDA driver released in May 2019 (after the above comments). I'm wondering if you know whether this will work with MATLAB? Maybe this will help someone.
Thanks,
Mike
Walter Roberson
on 29 Nov 2019
Unfortunately the supported OS for the CUDA 418.163 driver is 10.13 (High Sierra). I do not know for sure whether Mojave will reject it, but I am certain that Catalina will.
Christian Kennedy
on 9 Dec 2019
The take-away on this is that the intersection of GPU support in MATLAB and GPU support post macOS 10.13 is the null set. NVIDIA provided no web drivers for 10.14, and there's no rational reason to expect them to do so for 10.15; meanwhile there's no support for AMD GPUs within MATLAB, and it seems equally unlikely that that will evolve.
Compiler sensitivity has already bitten me in the ass on multiple occasions under Linux, so it's not clear that there's a solution there. In short, if you want meaningful GPU support under MATLAB, it looks like it's going to take a freaking Windows box to provide it.
It's at dark moments like this that Julia running in a Jupyter notebook seems an almost rational solution to life's problems.
Walter Roberson
on 9 Dec 2019
Edited: MathWorks Support Team
on 15 May 2023 at 11:54
It's official for NVIDIA:
LeChat
on 15 Feb 2020
I agree with @Christian Kennedy. I feel more and more the push toward getting into Julia (OpenCL support being one very decisive feature for me, especially with the divorce between NVIDIA and Apple). I believe MATLAB should really evolve toward OpenCL if it wants to survive (in the GPU computing world and on Mac).
Walter Roberson
on 16 Feb 2020
Apple does not support OpenCL either, and will not in the future.
Apparently OpenCL leaves enough parts optional, parts which do in fact differ between manufacturers and models, that MathWorks would not be able to provide a single OpenCL implementation... at least not an efficient one.
LeChat
on 17 Feb 2020
Does this mean that GPU computing in MATLAB will die on Mac? Is there any compatibility with Metal planned?
At least please try to keep future MATLAB versions compatible with macOS High Sierra (10.13.6), so that we still have CUDA drivers for our NVIDIA GPUs...
Walter Roberson
on 17 Feb 2020
Does this mean that GPU computing in MATLAB will die on Mac?
I have been emphasizing to Mathworks that this is something they need to talk about. They have acknowledged reading my concerns, but I do not have a response from them on the topic.
Alejandro Robinson Cortes
on 25 Aug 2020
Is there any solution for using GPU computing in MATLAB with a Mac?
This issue has come up in multiple forum questions for months. Is there any solution to this? I am buying a new computer, and seriously considering dropping MATLAB and/or Mac just for this reason. At this rate, I am inclined to drop both. Please come up with a solution for this.
Jason Ross
on 25 Aug 2020
The current solution is that if you want to do CUDA computing of any sort (which includes MATLAB), you need to do it on Linux or Windows. NVIDIA continues to provide both new hardware and software for these platforms, and MATLAB continues to work with them. We are dependent upon Apple and NVIDIA to provide that support, and it has been non-existent since 10.13, as I commented nearly a year ago. There has been no new information from Apple, and no new information from NVIDIA, despite both vendors developing and delivering major new hardware and software offerings to the market.
I'm sorry that I don't have a better answer at this time, but that is the current state of affairs.
Ted Wong
on 22 Sep 2020
Is there a way to switch from GPU to CPU? I'm OK with the code taking longer to run.
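For Deep Learning Toolbox training, this is a documented option: trainingOptions accepts an 'ExecutionEnvironment' value of 'cpu'. A minimal sketch (XTrain, YTrain, and layers are placeholder names, not from this thread):

opts = trainingOptions('sgdm', ...
    'ExecutionEnvironment', 'cpu', ...   % force training onto the CPU
    'MaxEpochs', 10);
net = trainNetwork(XTrain, YTrain, layers, opts);

For general array code, simply skip the gpuArray conversion; functions such as fft accept ordinary (CPU) arrays with the same syntax.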
Walter Roberson
on 22 Sep 2020
Edited: MathWorks Support Team
on 15 May 2023 at 11:54
Well, in the time since a year ago, there was news from NVIDIA, in November, that they will not be making any further macOS drivers: https://www.cgchannel.com/2019/11/nvidia-drops-macos-support-for-cuda/
Apple moved third-party drivers down one ring in security (to reduce the ability of drivers to affect the security of other processes), and apparently now requires that third-party drivers be included with each different application, instead of one driver being installable for use with all applications. That would have required each application to bundle the NVIDIA drivers (and driver updates would have to go through the Apple App Store for any product purchased through the App Store).
That would have been quite a burden for developers, unless NVIDIA and Apple had been able to come to an agreement for Apple to bundle NVIDIA drivers... which Apple would not have much inclination to do unless NVIDIA paid them a bunch of money. Reminder: along with Apple's new ARM-based CPUs, Apple also has its own custom GPUs, so Apple now sees NVIDIA as a competitor...
Walter Roberson
on 22 Sep 2020
Does this mean that GPU computing in MATLAB will die on Mac?
NVIDIA GPU computing for MATLAB is already gone on Mac; it is not present in R2020b.
I do not have any information about whether MathWorks is working on support through AMD cards -- but considering Apple is moving to its own GPUs, it would not really make sense for MathWorks to pursue AMD GPU support just for the sake of keeping the Mac market. My reading has also suggested that IBM hardware is where the second-largest share of deep learning research happens, so from a research perspective, IBM support might have higher priority than AMD support.
Walter Roberson
on 12 Nov 2020
M J:
No, there are no GPU options for Mac starting with Mojave. NVIDIA gave up and has left the Apple market. (Apple was not cooperating with NVIDIA, and was declining to approve the new driver versions NVIDIA produced; there was a bunch of chatter alleging that top Apple people had ordered the company not to approve the drivers.)
As of today, the new ARM-based Mac was released. MathWorks has indicated that they are working on a patch so that R2020b runs under Rosetta, with native support in the release after that. However, GPU support is not expected.
Walter Roberson
on 12 Nov 2020
Unfortunately the cost of adding support for a different kind of GPU is not small. It is not enough to add OpenCL, as the current support makes extensive use of a vendor-supplied high-performance library of linear algebra and similar routines (such as FFT).
The reading I did earlier this year suggested that AMD is considerably behind in market share for deep learning, and that the major competitor to NVIDIA is IBM -- so if the goal were to support high-performance deep learning techniques specifically, rather than general GPU computing, then IBM might be the wiser move. On the other hand, IBM boards do not seem to be common in the mass market, so for general-purpose work AMD would seem the better choice... but then there is the factor that with Apple's new M1 architecture, Apple systems will be neither IBM nor AMD...
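For context, this vendor-library dependence is mostly invisible at the language level: the gpuArray overloads dispatch to NVIDIA's cuFFT and cuBLAS behind the same MATLAB syntax. A minimal sketch, runnable only on a supported Windows or Linux machine with an NVIDIA GPU:

x = gpuArray(rand(4096, 1, 'single'));   % move data to the device
y = fft(x);                              % executed by cuFFT on the GPU
s = x' * x;                              % dense linear algebra via cuBLAS
result = gather(y);                      % copy the result back to host memory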
M J
on 6 Dec 2020
Edited: M J
on 6 Dec 2020
Very interesting! Thank you for the info. So in any case, I think I will get a desktop/workstation to overcome this problem. Do you think it would be safe to go with something that has, say, a Geforce RTX 2060 (compute capability = 7.5), assuming I intend to work with the trainNetwork function in the long term? Would it be a wise move for the long term (next 5 years)? Thanks again.
Walter Roberson
on 6 Dec 2020
My personal assumption would be that 5 years from now, you would be handing an RTX 2060 down to a young relative or neighbour with an interest in "retro" gaming (early 2020s), with you either having left the deep learning field yourself, or else having upgraded to something newer / faster / buzz-word-ier.
5 years ago was the Maxwell architecture, the GeForce 9xx timeframe. Not a terrible architecture by any means, but if you had one in hand now, you would be dithering over upgrading it immediately or waiting for the next NVIDIA release, hoping for a price drop on the RTX 2xxx series.
Walter Roberson
on 31 May 2021
Deep Learning on GPU, and GPU use in Statistics and Machine Learning, will not be supported on Mac any time soon.
I have no information about whether Mathworks is working on GPU for M1; every time I ask through private channels, I get silence.
Walter Roberson
on 15 May 2023 at 17:12
... and now 2 years later the 2060 is passé and you get a 40x0 if you can afford it, a 30x0 otherwise...
More Answers (3)
SALIOU Fall
on 8 Feb 2021
The MATLAB app cannot be installed on my MacBook Air. How can I fix this?
1 Comment
victor chen
on 1 Aug 2021
Where can I get the newest-generation GPU processor in a computer, like an Apple Mac Pro? Or what would you recommend as the best option?
Many thanks indeed
Victor Chen
4 Comments
Walter Roberson
on 1 Aug 2021
Unfortunately, MATLAB for macOS has already dropped NVIDIA support. macOS Catalina is the last macOS that had the drivers, and Pascal was the newest supported architecture.
Walter Roberson
on 1 Aug 2021
But to answer the question:
For the Mac Pro, the limit is the Gen 2 Mac Pro (2009) with the NVIDIA GT 120, CUDA 1.1. You would have to use it with a quite old version of MATLAB, somewhere around R2013-ish. This was the only NVIDIA card that Apple itself supported for any Mac Pro, as far as I can tell.
Jason Ross
on 2 Aug 2021
Edited: Jason Ross
on 2 Aug 2021
As Walter says, there are no modern Mac systems that support CUDA or GPU processing using CUDA. To use the latest GPUs (at this point, Ampere cards such as the GeForce RTX 30xx series or the A100) you need to run Windows or Linux.
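On those platforms, a small check confirms what MATLAB sees (Name and ComputeCapability are standard properties of the object returned by gpuDevice):

d = gpuDevice;   % selects the default CUDA device
fprintf('%s, compute capability %s\n', d.Name, d.ComputeCapability);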
Walter Roberson
on 2 Aug 2021
You may be wondering about getting an eGPU for the Mac Pro. As far as I can determine at the moment, the eGPUs recommended by Apple never supported NVIDIA. I can find two Thunderbolt 3 eGPU manufacturers that do support NVIDIA, but you cannot get macOS CUDA drivers for anything newer than the Pascal architecture on Catalina, and MathWorks has dropped support for GPUs on macOS (because NVIDIA has dropped support).
Felix-A. Lebel
on 20 Mar 2022
Considering how performant Apple's system-on-a-chip architecture (M1) has become, is MathWorks considering the possibility of using the new capabilities of this hardware for the benefit of its users? It is a shame that we cannot take full advantage of the hardware due to a dispute between NVIDIA and Apple. Politics should not get in the way of science.
3 Comments
Walter Roberson
on 20 Mar 2022
MathWorks considered it, and decided not to proceed for several years (if ever).
MathWorks is not in the business of writing high-performance numeric libraries for operations such as eigenvalue decomposition, QR factorization, and FFT. It relies on third-party libraries. Apple has not created suitable libraries, and Apple does not have the kind of tool chains that NVIDIA has for creating high-performance mathematics. The needs of graphics systems for display are not the same as the needs of science and engineering.
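As an illustration of that reliance, a MATLAB installation will report which third-party libraries back its dense linear algebra (the '-blas' and '-lapack' options to version are standard):

version('-blas')     % e.g. an Intel MKL version string on Intel hardware
version('-lapack')   % the LAPACK version MATLAB was built against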
Walter Roberson
on 30 May 2022
I read a posting from a company that was trying to do some higher-performance computing on the Apple M1 GPUs. They wrote that Apple's documentation about how to achieve performance was very weak, and that they tried a number of approaches but were not able to get anywhere near the rated performance. Apple did not cooperate with them.
This is very different from NVIDIA, which puts a lot of effort into making high-performance computing accessible to developers.
If Apple does not provide the ecosystem and does not provide enough information for developers to create ecosystems themselves, then the task becomes rather difficult.