
Friday, September 1, 2017

Make TensorFlow use the GPU


Hi, today I was using the Keras library, which uses TensorFlow as its backend.

I installed TensorFlow using this documentation (don't follow it; the correct method is below) and made sure to install the GPU version, but when I ran my code it seemed that TensorFlow was using the CPU. So after a few minutes of googling, I found a way to check whether TensorFlow is using the CPU or the GPU by executing this code in Python:

import tensorflow as tf

# Log device placement so TensorFlow prints which device (CPU or GPU)
# each operation is assigned to.
hello = tf.constant('Hello, TensorFlow!')
sess = tf.Session(config=tf.ConfigProto(log_device_placement=True))
print(sess.run(hello))


In my case, I got this error:
tensorflow.python.framework.errors_impl.InvalidArgumentError: Cannot assign a device for operation 'MatMul_1': Operation was explicitly assigned to /device:GPU:0 but available devices are [ /job:localhost/replica:0/task:0/cpu:0 ]. Make sure the device specification refers to a valid device. 


As the error shows, the only available device is the CPU; the GPU wasn't detected. I guess the reason is that my NVIDIA graphics driver wasn't installed: I recently formatted my PC, I'm using Parrot OS as my Linux distribution, and I never needed to install the driver before, but now I have to.
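As a side note: if you just want TensorFlow to fall back to the CPU instead of crashing when an operation is pinned to a GPU that isn't there, you can turn on soft placement. This is a minimal sketch (assuming TensorFlow 1.x), not something I used in my original setup:

import tensorflow as tf

# allow_soft_placement lets TensorFlow move an op to the CPU when the
# requested GPU device is not available, instead of raising the error above.
config = tf.ConfigProto(allow_soft_placement=True, log_device_placement=True)

with tf.device('/device:GPU:0'):
    a = tf.constant([[1.0, 2.0], [3.0, 4.0]], name='a')
    b = tf.constant([[1.0, 1.0], [0.0, 1.0]], name='b')
    c = tf.matmul(a, b)

sess = tf.Session(config=config)
print(sess.run(c))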

After some more googling, I found the right way to install TensorFlow.

Make sure the NVIDIA Compute Capability of your GPU is at least 3.0 by checking this link:
https://developer.nvidia.com/cuda-gpus
Otherwise, TensorFlow won't run on your GPU.

First, let's install the NVIDIA driver.

I found this link to install the NVIDIA GPU driver. It was written for Kali Linux, but it works with my distribution too, since both are based on Debian.

Using this command, I was able to see my NVIDIA card in the list:

[hosni@parrot][~]$ sudo lspci -v

To install the driver and the CUDA toolkit, I used this command:

[hosni@parrot][~]$ sudo apt install -y ocl-icd-libopencl1 nvidia-driver nvidia-cuda-toolkit

And we must restart the PC afterward.


Now, to make sure the driver is installed, we use this command:

[hosni@parrot][~]$ sudo nvidia-smi

Now we have to install the CUDA Toolkit and cuDNN. For the CUDA Toolkit, use this link: https://developer.nvidia.com/cuda-downloads

It's a normal .deb package installation.

For cuDNN, use this link to download the file (download version 6 for CUDA 8), then run the following commands:

[hosni@parrot][~/Downloads]$ tar xvzf cudnn-8.0-linux-x64-v6.0.tgz
[hosni@parrot][~/Downloads]$ sudo cp -P cuda/include/cudnn.h /usr/local/cuda/include
[hosni@parrot][~/Downloads]$ sudo cp -P cuda/lib64/libcudnn* /usr/local/cuda/lib64
[hosni@parrot][~/Downloads]$ sudo chmod a+r /usr/local/cuda/include/cudnn.h /usr/local/cuda/lib64/libcudnn*

The files are big so it may take a while.
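If you want to double-check that the header landed where TensorFlow expects it, here is a small Python sketch (it assumes the default /usr/local/cuda paths used above, and cuDNN 6, which defines its version macros directly in cudnn.h):

import re

# Read the cuDNN version macros straight from the installed header.
with open('/usr/local/cuda/include/cudnn.h') as f:
    header = f.read()

version = [re.search(r'#define CUDNN_%s\s+(\d+)' % part, header).group(1)
           for part in ('MAJOR', 'MINOR', 'PATCHLEVEL')]
print('cuDNN version:', '.'.join(version))  # should report 6.0.x for this setup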

A few more steps: update your .bashrc file:


[hosni@parrot][~]$ geany ~/.bashrc

With the text editor open, scroll to the bottom and put in these lines:


export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:/usr/local/cuda/lib64:/usr/local/cuda/extras/CUPTI/lib64"
export CUDA_HOME=/usr/local/cuda

Save and close it, then run this command:


[hosni@parrot][~]$ source ~/.bashrc
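To make sure the loader can actually find cuDNN with the new LD_LIBRARY_PATH, here is a quick check (run it from a fresh shell so the exported variables are picked up; the library name assumes cuDNN 6 as above):

import ctypes
import os

# The dynamic loader should now find libcudnn under /usr/local/cuda/lib64.
print(os.environ.get('LD_LIBRARY_PATH'))
ctypes.CDLL('libcudnn.so.6')  # raises OSError if the library cannot be found
print('cuDNN shared library loaded successfully')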

Now the last step is to install TensorFlow. Use pip or pip3 depending on your Python version and run this command:

 [hosni@parrot][/home/hosni/IDE/anaconda3]$ pip install --upgrade tensorflow-gpu
If you are using Conda like me, make sure to create a TensorFlow environment first using these commands:

[hosni@parrot][/home/hosni/IDE/anaconda3]$ conda create -n tensorflow
[hosni@parrot][/home/hosni/IDE/anaconda3]$ source activate tensorflow 
And that's it.
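Before wiring everything into an IDE, a quick way to verify that the GPU build actually sees the card (a small sketch, assuming TensorFlow 1.x) is to list the devices TensorFlow detects; a working setup shows a /device:GPU:0 entry in addition to the CPU:

from tensorflow.python.client import device_lib

# Prints one entry per device TensorFlow can use, including GPUs if any.
for device in device_lib.list_local_devices():
    print(device.name, device.physical_device_desc)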



Now, in your IDE, make sure to select the right Python interpreter where TensorFlow is installed.

For IntelliJ, I needed to add the environment variables again.


Everything should work fine now. In my case, after spending the whole day trying to make this work, I got this message:

Ignoring visible GPU device (device: 0, name: GeForce GT 620M, PCI bus id: 0000:01:00.0) with Cuda compute capability 2.1. The minimum required Cuda capability is 3.0.

It seems my graphics card is too old and I have to use the CPU 😢, so make sure to check the NVIDIA Compute Capability of your GPU first.
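If you end up in the same boat and want TensorFlow to stop probing the unusable card (and silence that message), one option, which is just a sketch and not part of my original setup, is to hide the GPU before importing TensorFlow:

import os

# Hide all CUDA devices so TensorFlow runs purely on the CPU;
# this must be set before TensorFlow is imported.
os.environ['CUDA_VISIBLE_DEVICES'] = '-1'

import tensorflow as tf
sess = tf.Session()
print(sess.run(tf.constant('Running on the CPU only')))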
