macOS GPU Support #141
Hi @Gerzer! Thanks for reaching out. Currently, TensorFlow only supports NVIDIA GPUs, and only on Linux. (The reason is that, since we're building on the existing TensorFlow C++ core for now, our GPU support matches it.) Although we're still shaping our roadmap for the next release, I expect it will help significantly with regard to your requests! :-) All the best,
Hi, any updates on macOS GPU support (especially AMD GPU support)?
@GauthamSree This topic may have less to do with TensorFlow, S4TF, and PyTorch, and more to do with software and driver development by AMD. As someone mentioned in TensorFlow's SIG Build meeting more than a year ago, AMD had at that point been trying to add GPU support for over ten months. Why don't Macs use Nvidia? That could come down to driver history as well.
There's a community-supported TensorFlow + AMD build, though (using ROCm; see ROCm/tensorflow-upstream#362).
Issue #341 asks this same question, and I have some slightly newer responses there. As a short recap of those: until the introduction of the X10 backend, we had been reliant on the TensorFlow eager runtime and could only target the platforms it supports. While ROCm has experimental support in TensorFlow on the Linux side, that doesn't help macOS compatibility at present.

With X10 and the more flexible backend system we've been developing, there might be more opportunities to add platform support without requiring as large an investment in development resources as adding this across TensorFlow as a whole would. There are a few different levels at which this support might be integrated, given the way we have things configured now.

Our current focus is still on Linux-based systems with TPUs and Nvidia GPUs as accelerators, because those are the most common systems in use by researchers and practitioners. We want to make sure we make the toolchain and APIs fast, stable, and easy to use first.
According to the installation documentation, Nvidia GPUs are supported when using Linux. Are Nvidia GPUs supported on macOS, too? What about AMD GPUs? (Only AMD cards—not Nvidia ones—are officially supported by Apple on macOS, though projects like PurgeWrangler can seemingly serve as workarounds.)
By the way, I'm personally very excited about Swift for TensorFlow. I can't wait for recurrent layers and model serialization! Then, I'll be able both to develop my model and to ship it using Swift exclusively.