All Questions
8 questions
2 votes, 1 answer, 687 views
How to use an AMD GPU to process data with TensorFlow and Keras on a Windows 11 PC
I'm unable to use my AMD GPU to process data in Python with TensorFlow and Keras on Windows 11 Pro.
I have already tried a few things such as Intel's PlaidML and other third-party software, but ...
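A minimal sketch of the DirectML route that is often suggested for this kind of setup (my own illustration, not code from the question; it assumes tensorflow-cpu plus Microsoft's tensorflow-directml-plugin package are installed on Windows 11):

# pip install tensorflow-cpu tensorflow-directml-plugin
import tensorflow as tf

# With the plugin installed, the AMD GPU should appear in the device list;
# depending on the package version it may show up as "GPU" or "DML".
print(tf.config.list_physical_devices())

# Quick smoke test: a small matmul runs on whatever device TensorFlow picks.
a = tf.random.normal((1024, 1024))
b = tf.random.normal((1024, 1024))
print(tf.reduce_sum(tf.matmul(a, b)).numpy())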
2 votes, 0 answers, 276 views
DirectML InvalidArgumentError: Graph execution error: No OpKernel was registered to support Op 'CudnnRNN'
I was trying to use DirectML so that TensorFlow can use my AMD RX 580 graphics card, but I'm having a hard time getting it to work.
I'm getting this error:
--------------------------------------------...
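A hedged sketch of one common workaround (this assumes the error comes from tf.keras.layers.LSTM/GRU selecting the fused cuDNN kernel, which the DirectML backend has no kernel for; that cause is an assumption, not confirmed for this exact setup). Any parameter that breaks the cuDNN fast path forces the generic implementation; a small recurrent_dropout is one example:

import tensorflow as tf

model = tf.keras.Sequential([
    # recurrent_dropout > 0 disables the cuDNN kernel, so no CudnnRNN op is emitted.
    tf.keras.layers.LSTM(64, recurrent_dropout=0.01, input_shape=(50, 16)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.summary()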
1 vote, 1 answer, 3k views
How to enable AMD Radeon graphics to train deep learning models?
We can train deep learning models on a GPU. I know how to enable NVIDIA graphics for training deep learning models; I just want to ask how we can use AMD Radeon graphics to train deep learning models in ...
4 votes, 0 answers, 549 views
Is there any way to connect Tensorflow Dataset to plaidML-keras for deep learning with GPU?
I have an AMD GPU and I'm running Windows 10 with Python 3.8. TensorFlow doesn't provide any GPU acceleration support for AMD users.
I've found a post on Stack Overflow describing how to ...
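A hedged sketch of one workaround (my own illustration, not from the question): plaidml-keras cannot consume a tf.data.Dataset directly, so the dataset can be materialized into NumPy arrays (or wrapped in a Python generator) and fed to the PlaidML-backed Keras model:

import os
os.environ["KERAS_BACKEND"] = "plaidml.keras.backend"  # must be set before importing keras

import numpy as np
import tensorflow as tf   # used only to build/read the dataset, not to train
import keras              # the standalone Keras package that plaidml-keras targets

# Example dataset; replace with the real pipeline.
ds = tf.data.Dataset.from_tensor_slices(
    (np.random.rand(256, 32).astype("float32"),
     np.random.randint(0, 2, size=(256, 1)))
).batch(256)

# Materialize the whole (small) dataset as NumPy arrays.
x, y = next(iter(ds.as_numpy_iterator()))

model = keras.Sequential([
    keras.layers.Dense(16, activation="relu", input_shape=(32,)),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")
model.fit(x, y, epochs=1)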
0 votes, 0 answers, 2k views
How does Keras with PlaidML backend significantly outperform Keras with Tensorflow backend?
I was surprised to find out that there is a significant performance difference in terms of speed and model accuracy when using different backends for the same Deep Learning problem, namely the famous ...
14 votes, 2 answers, 11k views
Any way to work with Keras on a Mac with an AMD GPU?
I have a MacBook Pro with an AMD GPU and I want to run Keras (TensorFlow backend) on this GPU. I have learned that Keras only works with NVIDIA GPUs out of the box. What is the workaround (if possible)?
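A hedged sketch of the PlaidML workaround commonly suggested for AMD GPUs on macOS (my own illustration; it assumes plaidml-keras is installed and plaidml-setup has been run to select the metal_* device for the AMD GPU):

import os
os.environ["KERAS_BACKEND"] = "plaidml.keras.backend"  # must run before importing keras

import keras
import numpy as np

# Tiny model just to confirm that training runs on the selected device.
model = keras.Sequential([
    keras.layers.Dense(64, activation="relu", input_shape=(100,)),
    keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy")

x = np.random.rand(512, 100).astype("float32")
y = keras.utils.to_categorical(np.random.randint(0, 10, size=512), 10)
model.fit(x, y, epochs=1, batch_size=64)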
1 vote, 1 answer, 586 views
Setting up Keras and TensorFlow to operate with an AMD GPU
I am trying to set up Keras in order to run models using my GPU. I have a Radeon RX 580 and am running Windows 10.
I realized that CUDA only supports NVIDIA GPUs and was having difficulty finding a ...
0 votes, 2 answers, 3k views
Deep Learning with Python using Tensorflow and Keras on AMD GPU with ROCm gives errors when I run the program below
I have a PC with an AMD GPU which I recently started using with Linux Mint. I found a way of installing ROCm on this PC by following this tutorial, but when I tried to write a Python program using ...
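A hedged sketch (not the questioner's program): after installing the ROCm build of TensorFlow (pip install tensorflow-rocm), a quick check that the AMD GPU is actually visible often narrows down where the errors come from:

import tensorflow as tf

print("TensorFlow version:", tf.__version__)
gpus = tf.config.list_physical_devices("GPU")
print("Visible GPUs:", gpus)

if gpus:
    # Run a small op explicitly on the first GPU as a smoke test.
    with tf.device("/GPU:0"):
        x = tf.random.normal((512, 512))
        print(tf.reduce_sum(tf.matmul(x, x)).numpy())
else:
    print("No GPU found - check the ROCm install (e.g. rocminfo) and user group membership.")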